John Radcliffe (physician)
John Radcliffe (1650 – 1 November 1714) was an English physician, academic and politician. A number of landmark buildings in Oxford, including the Radcliffe Camera (in Radcliffe Square), the Radcliffe Infirmary, the Radcliffe Science Library, Radcliffe Primary Care and the Radcliffe Observatory were named after him. The John Radcliffe Hospital, a large tertiary hospital in Headington, was also named after him.
Radcliffe was born the son of George Radcliffe and Anne Loader in Wakefield, Yorkshire, where he was baptised on 23 January 1653. He was educated at Queen Elizabeth Grammar School and Northallerton Grammar School and graduated from the University of Oxford, where he was an exhibitioner at University College, tutored by Obadiah Walker, before becoming a Fellow of Lincoln College. He obtained his MD in 1682 and moved to London shortly afterwards. There he enjoyed great popularity and became royal physician to William III and Mary II.
In 1690 he was elected Member of Parliament for Bramber, Sussex and in 1713 member for Buckingham.
On his death in the following year, his property was bequeathed to various charitable causes, including St Bartholomew's Hospital and University College, Oxford, where the Radcliffe Quad is named after him. The charitable trust founded by his will of 13 September 1714 still operates as a registered charity.
1. Among the many singularities related of Radcliffe, it has been noticed that, when he was in a convivial party, he was unwilling to leave it, even though sent for by persons of the highest distinction. Whilst he was thus deeply engaged at a tavern, he was called on by a grenadier, who desired his immediate attendance on his "colonel"; but no entreaties could prevail on the physician to postpone his revelry.
2. To confer medical authority upon themselves, doctors of the day often published their theories, clinical findings, and pharmacopoeia (collections of "receipts" or prescriptions). Radcliffe, however, not only wrote little but also took a certain iconoclastic pride in having read little, remarking once of some vials of herbs and a skeleton in his study: “This is Radcliffe’s library.” However, he bequeathed a substantial sum of money to Oxford for the founding of the Radcliffe Library, an endowment which, Samuel Garth quipped, was "about as logical as if a eunuch should found a seraglio."
3. He was physician to King William III until 1699, when he offended the King by remarking, "Why truly, I would not have your Majesty's two legs for your three kingdoms."
The John Radcliffe Hospital in Oxford is named after John Radcliffe, as was the former Radcliffe Infirmary, now being redeveloped for academic use by Oxford University as the Radcliffe Observatory Quarter.
https://en.wikipedia.org/wiki?curid=16530
Joual
Joual is an accepted name for the linguistic features of basilectal Quebec French that are associated with the French-speaking working class in Montreal, and it has become a symbol of national identity for some. "Joual" is stigmatized by some and celebrated by others. While "joual" is often considered a sociolect of the Québécois working class, many feel that perception is outdated.
Speakers of Quebec French from outside Montreal usually have other names to identify their speech, such as Magoua in Trois-Rivières, and Chaouin south of Trois-Rivières. Linguists tend to eschew this term, but historically some have reserved the term "joual" for the variant of Quebec French spoken in Montreal.
Both the upward socio-economic mobility among the Québécois, and a cultural renaissance around "joual" connected to the Quiet Revolution in the Montreal East-End have resulted in "joual" being spoken by people across the educational and economic spectrum. Today, many Québécois who were raised in Quebec during the 20th century (command of English notwithstanding) can understand and speak at least some "joual".
The creation of "joual" can be traced back to the "era of silence", the period from the 1840s to the 1960s and the start of the Quiet Revolution. The "era of silence" was marked by stark stigmatization of the common working man. Written documents were not shared with the typical working-class man, and the very strict form of French used by elites excluded a majority of the population. The Quiet Revolution during the 1960s was a time of awakening, in which the Quebec working class demanded more respect in society, including wider use of Québécois in literature and the performing arts. Michel Tremblay is an example of a writer who deliberately used "joual" and Québécois to represent the working-class populations of Quebec. "Joual", a language of the working class, quickly became associated with slang and vulgar language. Despite its continued use in Canada, there are still ideologies present which place a negative connotation on the use of "joual".
Although coinage of the name "joual" is often attributed to French-Canadian journalist André Laurendeau, usage of this term throughout French-speaking Canada predates the 1930s.
The actual word "joual" is the representation of how the word "cheval" (Standard French: [ʃəval], "horse") is pronounced by those who speak "joual". ("Horse" is used in a variation of the phrase "parler français comme une vache" [to speak French like a cow], i.e. to speak French terribly; hence, a put-down of the Québécois dialect.) The weak schwa vowel disappeared, giving [ʃval]. Then the voiceless [ʃ] was voiced to [ʒ], thereby creating [ʒval]. Next, the [v] at the beginning of a syllable, which in some regional dialects of French or even in very rapid speech in general weakens to become the semi-vowel [w] (written ⟨ou⟩), gave [ʒwal]. The end result is the word transcribed as "joual".
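The derivation above can be read as three ordered rewrite rules applied to an IPA transcription. The following minimal Python sketch is illustrative only and not from the source; the IPA forms and the rule ordering are assumptions based on the account summarised above.

    # A minimal sketch (not from the source): the three sound changes described
    # above, modelled as ordered string-rewrite rules over an assumed IPA form.
    RULES = [
        ("ə", ""),     # 1. schwa deletion:              [ʃəval] -> [ʃval]
        ("ʃ", "ʒ"),    # 2. voicing of [ʃ] to [ʒ]:       [ʃval]  -> [ʒval]
        ("ʒv", "ʒw"),  # 3. syllable-initial [v] to [w]: [ʒval]  -> [ʒwal]
    ]

    def derive(form):
        """Apply each rewrite rule in order, recording every stage."""
        stages = [form]
        for old, new in RULES:
            form = form.replace(old, new)
            stages.append(form)
        return stages

    print(" -> ".join(derive("ʃəval")))  # ʃəval -> ʃval -> ʒval -> ʒwal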
Diphthongs are normally present where long vowels would be present in standard French. There is also the usage of "sontaient, sonté" ("ils étaient, ils ont été").
Although "moé" and "toé" are today considered substandard slang pronunciations of "toi" and "moi", these were the original pronunciations of "ancien régime" French used in all provinces of Northern France, by the royalty, aristocracy, and common people. After the 1789 French Revolution, the standard pronunciation in France changed to that of a previously-stigmatized form in the speech of Paris, but Quebec French continued to evolve from the historically older dialects, having become isolated from France following the 1760 British conquest of New France.
"Joual" shares many features with modern Oïl languages, such as Norman, Gallo, Picard, Poitevin and Saintongeais though its affinities are greatest with the 17th century koiné of Paris. Speakers of these languages of France predominated among settlers to New France.
It could be argued that at least some aspects of more modern "joual" are further linguistic contractions of standard French. "D'la" ("de la") is an example where the word "de" has nearly fallen out of usage over time and has become contracted. This argument applies to other words as well, and the phenomenon has become widespread throughout the contemporary French language.
Another significant characteristic of "joual" is the liberal use of profanities called "sacre" in everyday speech.
There are a number of English loanwords in "joual", although their use has been stigmatized since the 1960s in favour of alternative terms promoted by the "Office québécois de la langue française". The usage of deprecated anglicisms varies both regionally and historically.
Some words were also previously thought to be of English origin, although modern research has shown them to be from regional French dialects:
The two-act play "Les Belles-sœurs" by Canadian writer Michel Tremblay premiered in 1968 at the Théâtre du Rideau Vert in Montreal, Canada. Many consider it to have had a profound impact on Canadian culture, as it was one of the first times "joual" was seen on a national stage. The play follows a working-class woman named Germaine in Montréal, Québec, Canada. After winning a million trading stamps, she invites her friends over to help paste them into booklets to redeem them. Germaine, however, does not suspect that her friends are jealous and envious of her winnings. The fact that the play was originally written in "joual" is very important to its socio-linguistic portrayal of the women. The characters all come from the working class and for the most part speak in "joual", which at the time was not seen on the main stage. The play was cited at the time as a "radical element among Quebec critics as the dawn of a new era of liberation, both political and aesthetic".
When "Les Belles-sœurs" premiered in Paris, France in 1973 as it was originally written, in "joual", it was met with some initial criticism. One critic described it as difficult to understand as ancient Greek. Tremblay responded, "a culture should always start with speak to herself. The ancient Greeks spoke to each other". The popularity of the play has since caused it to be translated into multiple languages, raising controversies in the translation community over retaining the authenticity of "Les Belles-sœurs" even when not performed in the original dialect of "joual".
Writing in "joual" gave Tremblay an opportunity to resist cultural and linguistic "imperialism" of France, while signifying the secularization of Québec culture.
https://en.wikipedia.org/wiki?curid=16533
Joseph Lister
Joseph Lister, 1st Baron Lister (5 April 1827 – 10 February 1912), was a British surgeon and a pioneer of antiseptic surgery. His research into bacteriology and infection in wounds raised his skilful operative technique, until then similar to that of his peers, to a new plane, and his observations, deductions and practices revolutionised surgery throughout the world.
Lister promoted the idea of sterile surgery while working at the Glasgow Royal Infirmary. Lister successfully introduced carbolic acid (now known as phenol) to sterilise surgical instruments and to clean wounds.
Applying Louis Pasteur's advances in microbiology, Lister championed the use of carbolic acid as an antiseptic, so that it became the first widely used antiseptic in surgery. He first suspected it would prove an adequate disinfectant because it was used to ease the stench from fields irrigated with sewage waste. He presumed it was safe because fields treated with carbolic acid produced no apparent ill-effects on the livestock that later grazed upon them.
Lister's work led to a reduction in post-operative infections and made surgery safer for patients, distinguishing him as the "father of modern surgery".
Lister was born to a prosperous Quaker family in Upton, West Ham, Essex, then near but now in London, England. He was the second son among six siblings born to gentleman scientist and wine merchant Joseph Jackson Lister and Isabella Harris. His father was a pioneer in the design of achromatic object lenses for use in compound microscopes, and spent 30 years of his life perfecting the microscope, in the process discovering the law of aplanatic foci and building a microscope in which the image point of one lens coincided with the focal point of another. Up until that point, the best higher-magnification lenses had produced an excessive secondary aberration known as coma, which interfered with normal use. His work built a reputation sufficient to enable his election to the Royal Society in 1832.
A young Joseph Lister attended Benjamin Abbott's Isaac Brown Academy, a Quaker school in Hitchin in Hertfordshire.
As a teenager, Lister attended Grove House School in Tottenham, studying mathematics, natural science, and languages. At school, Lister became a fluent reader of French and German.
Lister left school in the spring of 1844, when he was seventeen, to attend University College, London, one of only a few institutions which accepted Quakers at that time. He initially studied botany and obtained a Bachelor of Arts degree in 1847. He then registered as a medical student and graduated with honours as Bachelor of Medicine, subsequently entering the Royal College of Surgeons at the age of 26. In 1854, Lister became both first assistant to and friend of surgeon James Syme at the Edinburgh Royal Infirmary, University of Edinburgh, in Scotland. There he joined the Royal Medical Society and presented two dissertations, in 1855 and 1871, which are still in the possession of the Society today.
Lister subsequently left the Quakers, joined the Scottish Episcopal Church, and eventually married Syme's daughter, Agnes. On their honeymoon, they spent three months visiting leading medical institutes (hospitals and universities) in France and Germany. By this time, Agnes was enamoured of medical research and was Lister's partner in the laboratory for the rest of her life.
Before Lister's studies of surgery, many people believed that chemical damage from exposure to "bad air", or "miasma", was responsible for infections in wounds. Hospital wards were occasionally aired out at midday as a precaution against the spread of infection via miasma, but facilities for washing hands or a patient's wounds were not available. A surgeon was not required to wash his hands before seeing a patient; in the absence of any theory of bacterial infection, such practices were not considered necessary. Despite the work of Ignaz Semmelweis and Oliver Wendell Holmes Sr., hospitals practised surgery under unsanitary conditions. Surgeons of the time referred to the "good old surgical stink" and took pride in the stains on their unwashed operating gowns as a display of their experience.
While he was a professor of surgery at the University of Glasgow, Lister became aware of a paper published by the French chemist Louis Pasteur, showing that food spoilage could occur under anaerobic conditions if micro-organisms were present. Pasteur suggested three methods to eliminate the micro-organisms responsible: filtration, exposure to heat, or exposure to chemical solutions. Lister confirmed Pasteur's conclusions with his own experiments and decided to use his findings to develop antiseptic techniques for wounds. As the first two methods suggested by Pasteur were unsuitable for the treatment of human tissue, Lister experimented with the third.
In 1834, Friedlieb Ferdinand Runge discovered phenol, also known as carbolic acid, which he derived in an impure form from coal tar. At that time, there was confusion between carbolic acid and creosote – a chemical that had been used to treat wood for railway ties and ships, since it protected the wood from rotting. Upon hearing that creosote had been used for treating sewage, Lister began to test the efficacy of carbolic acid when applied directly to wounds.
Therefore, Lister tested the results of spraying instruments, the surgical incisions, and dressings with a solution of carbolic acid. Lister found that the solution swabbed on wounds remarkably reduced the incidence of gangrene. In August 1865, Lister applied a piece of lint dipped in carbolic acid solution onto the wound of a seven-year-old boy at Glasgow Royal Infirmary, who had sustained a compound fracture after a cart wheel had passed over his leg. After four days, he renewed the pad and discovered that no infection had developed, and after a total of six weeks he was amazed to discover that the boy's bones had fused back together, without suppuration. He subsequently published his results in "The Lancet" in a series of six articles, running from March through July 1867.
He instructed surgeons under his responsibility to wear clean gloves and wash their hands before and after operations with 5% carbolic acid solutions. Instruments were also washed in the same solution and assistants sprayed the solution in the operating theatre. One of his additional suggestions was to stop using porous natural materials in manufacturing the handles of medical instruments.
Lister left Glasgow University in 1869, being succeeded by Prof George Husband Baird MacLeod. Lister then returned to Edinburgh as successor to Syme as Professor of Surgery at the University of Edinburgh and continued to develop improved methods of antisepsis and asepsis. Among those who worked with him there and supported his work was the senior apothecary and later MD, Dr Alexander Gunn. Lister's fame had spread by then, and audiences of 400 often came to hear him lecture. As the germ theory of disease became more understood, it was realised that infection could be better avoided by preventing bacteria from getting into wounds in the first place. This led to the rise of aseptic surgery. On the hundredth anniversary of his death, in 2012, Lister was considered by most in the medical field as "The Father of Modern Surgery".
Although Lister was so roundly honoured in later life, his ideas about the transmission of infection and the use of antiseptics were widely criticised in his early career. In 1869, at the meetings of the British Association at Leeds, Lister's ideas were mocked; and again, in 1873, the medical journal "The Lancet" warned the entire medical profession against his progressive ideas. However, Lister did have some supporters including Marcus Beck, a consultant surgeon at University College Hospital, who not only practiced Lister's antiseptic technique, but included it in the next edition of one of the main surgical textbooks of the time.
Lister's use of carbolic acid proved problematic, and he eventually repudiated it for superior methods. The spray irritated eyes and respiratory tracts, and the soaked bandages were suspected of damaging tissue, so his teachings and methods were not always adopted in their entirety. Because his ideas were based on germ theory, which was in its infancy, their adoption was slow. General criticism of his methods was exacerbated by the fact that he found it hard to express himself adequately in writing, so they seemed complicated, unorganised, and impractical.
Lister moved from Scotland to King's College Hospital in London. He was elected President of the Clinical Society of London. He also developed a method of repairing kneecaps with metal wire and improved the technique of mastectomy. He was also known for being the first surgeon to use catgut ligatures, sutures, and rubber drains, and for developing an aortic tourniquet. He also introduced a diluted carbolic acid spray to accompany its surgical use, but abandoned the sprays in the late 1890s after he saw that they provided no beneficial change in the outcomes of the surgeries performed with them. The only reported reactions among his patients exposed to the sprays during surgery were minor symptoms that did not affect the surgical outcome as a whole, such as coughing, irritation of the eye, and minor tissue damage.
Lister's wife had long helped him in research and after her death in Italy in 1893 (during one of the few holidays they allowed themselves) he retired from practice. Studying and writing lost appeal for him and he sank into religious melancholy. Despite suffering a stroke, he still came into the public light from time to time. He had for several years been a Surgeon Extraordinary to Queen Victoria, and from March 1900 was appointed the Serjeant Surgeon to the Queen, thus becoming the senior surgeon in the Medical Household of the Royal Household of the sovereign. After her death the following year, he was re-appointed as such to her successor, King Edward VII.
On 24 June 1902, the King came down with appendicitis two days before his scheduled coronation. Like all internal surgery at the time, the appendectomy needed by the King still posed an extremely high risk of death by post-operational infection, and surgeons did not dare operate without consulting Britain's leading surgical authority. Lister obligingly advised them in the latest antiseptic surgical methods (which they followed to the letter), and the King survived, later telling Lister, "I know that if it had not been for you and your work, I wouldn't be sitting here today."
Lister died on 10 February 1912 at his country home (now known as Coast House) in Walmer, Kent, at the age of 84. After a funeral service at Westminster Abbey, his body was buried at Hampstead Cemetery in London, in a plot to the south-east of the central chapel.
In 1890, Lister was awarded the Cameron Prize for Therapeutics of the University of Edinburgh.
Lister was president of the Royal Society between 1895 and 1900. Following his death, a memorial fund led to the founding of the Lister Medal, seen as the most prestigious prize that could be awarded to a surgeon.
Lister's discoveries were greatly praised and in 1883 Queen Victoria created him a Baronet, of Park Crescent in the Parish of St Marylebone in the County of Middlesex. In 1897 he was further honoured when Her Majesty raised him to the peerage as Baron Lister, of Lyme Regis in the County of Dorset. In the 1902 Coronation Honours list published on 26 June 1902 (the original day of King Edward VII's coronation), Lord Lister was appointed a Privy Counsellor and one of the original members of the new Order of Merit (OM). He received the order from the King on 8 August 1902, and was sworn a member of the council at Buckingham Palace on 11 August 1902.
Among foreign honours, he received the Pour le Mérite, one of Prussia's highest orders of merit. In 1889 he was elected as Foreign member of the Royal Swedish Academy of Sciences. Two postage stamps were issued in September 1965 to honour Lister for his pioneering work in antiseptic surgery.
Lister is one of the two surgeons in the United Kingdom who have the honour of having a public monument in London. Lister's stands in Portland Place; the other surgeon is John Hunter. There is a statue of Lister in Kelvingrove Park, Glasgow, celebrating his links with the city. In 1903, the British Institute of Preventive Medicine was renamed Lister Institute of Preventive Medicine in honour of Lister. The building, along with another adjacent building, forms what is now the Lister Hospital in Chelsea, which opened in 1985. In 2000, it became part of the HCA group of hospitals.
A building at Glasgow Royal Infirmary which houses cytopathology, microbiology and pathology departments was named in Lister's honour to recognise his work at the hospital. Lister Hospital in Stevenage, Hertfordshire is named after him. The "Discovery" Expedition of 1901–04 named the highest point in the Royal Society Range, Antarctica, Mount Lister.
In 1879, Listerine antiseptic (developed as a surgical antiseptic but nowadays best known as a mouthwash) was named after Lister. Microorganisms named in his honour include the pathogenic bacterial genus "Listeria" named by J. H. H. Pirie, typified by the food-borne pathogen "Listeria monocytogenes", as well as the slime mould genus "Listerella", first described by Eduard Adolf Wilhelm Jahn in 1906. Lister is depicted in the Academy Award winning 1936 film, "The Story of Louis Pasteur", by Halliwell Hobbes. In the film, Lister is one of the beleaguered microbiologist's most noted supporters in the otherwise largely hostile medical community, and is the key speaker in the ceremony in his honour.
Lister's name is one of twenty-three names featured on the Frieze of the London School of Hygiene & Tropical Medicine – although the committee which chose the names to include on the frieze did not provide documentation about why certain names were chosen and others were not.
https://en.wikipedia.org/wiki?curid=16535
Johann Homann
Johann Baptist Homann (20 March 1664 – 1 July 1724) was a German geographer and cartographer, who also made maps of the Americas.
Homann was born in Oberkammlach near Kammlach in the Electorate of Bavaria. Although educated at a Jesuit school, and preparing for an ecclesiastical career, he eventually converted to Protestantism and from 1687 worked as a civil law notary in Nuremberg. He soon turned to engraving and cartography; in 1702 he founded his own publishing house.
Homann acquired renown as a leading German cartographer, and in 1715 was appointed Imperial Geographer by Emperor Charles VI; granting such privileges to individuals was a prerogative of the Holy Roman Emperor. In the same year he was also named a member of the Prussian Academy of Sciences in Berlin. Of particular significance to cartography were the imperial printing privileges (Latin: "privilegia impressoria"). These protected, for a time, the works of authors in all scientific fields, as well as those of printers, copper engravers, map makers and publishers. They were also very important as a recommendation for potential customers.
In 1716 Homann published his masterpiece "Grosser Atlas ueber die ganze Welt" (Grand Atlas of all the World). Numerous maps were drawn up in cooperation with the engraver Christoph Weigel the Elder, who also published "Siebmachers Wappenbuch".
Homann died in Nuremberg in 1724. He was succeeded by his son Johann Christoph (1703–1730), after whose death the business carried on as the "Homann heirs" company, managed by Johann Michael Franz and Johann Georg Ebersberger. After subsequent changes in management the company folded in 1852. The company was known as "Homann Erben", "Homanniani Heredes", or "Heritiers de Homann" abroad.
https://en.wikipedia.org/wiki?curid=16537
Jadavpur University
Jadavpur University is a public state university located in Kolkata in the state of West Bengal in India. It was established in 1955.
The University of Calcutta was one of the first three universities of modern India, the other two being the universities of Bombay (now Mumbai) and Madras. It was set up by the British in Calcutta in 1857 as a means of spreading western philosophical thought among the elite in India. It also aimed to create, in the words of Lord Macaulay, "a class of persons Indian in blood and colour, but English in tastes, in opinions, in morals and in intellect."
The nationalists in the freedom struggle of India dubbed the Calcutta University, another pillar of India's education movement, as "Goldighir Ghulamkhana", or the slave house of Goldighi, with reference to the lake adjacent to Calcutta University and the many graduates it churned out who served the British colonial administration as ICS officers. Hence, the need for setting up an institution which would impart education along nationalist lines was strongly felt by the luminaries of the period. The real impetus, though, was provided by the partitioning of Bengal by Lord Curzon, the then Governor-General of India, into East Bengal on the one hand (the area that was eventually to become Bangladesh in 1971) and West Bengal and Odisha on the other. The young men of Bengal were amongst the most active in the "Swadeshi" movement, and the participation of university students drew the ire of the Raj. R.W. Carlyle prohibited the participation of students in political meetings under the threat of withdrawal of funding and grants. The decade preceding these decrees had seen Bengali intellectuals increasingly calling for indigenous schools and colleges to replace British institutions.
Generous sums of money were also donated by Brojendra Kishore Roy Choudhury, Maharaja Suryya Kanto Acharya Choudhury and Rashbihari Ghosh, who was appointed the first president of the university. Aurobindo served as the first principal of the college. The organisation in its early days was intricately associated with the nascent revolutionary nationalism in Bengal at the time. It was during his time as principal that Aurobindo started his nationalist publications "Jugantar", "Karmayogin" and "Bande Mataram".
The students' mess at the college was frequented by students of East Bengal who belonged to the Dhaka branch of the Anushilan Samiti, and was known to be a hotbed of revolutionary nationalism, which went uncontrolled or was even encouraged by the college.
Jadavpur University is a semi-residential university which at present operates out of two urban campuses: one in Jadavpur and another in Salt Lake.
In 2013, Defence Research & Development Organisation (DRDO) announced plans to set up one of the country's biggest state-of-the-art research hubs and Advanced Technology Centre at the NIL campus. In the same year, CSIR announced the setting up of a research centre for big data analytics and an Inverted Innovation Centre alongside the research hub already announced by DRDO.
In addition to being a unitary university, it has other institutes like the J D Birla Institute, Jadavpur Vidyapith as well as the Institute of Business Management, Jadavpur University affiliated to it, which operate out of independent campuses. While these institutes have their own independent curriculum as well as examination systems, the final degree is offered by Jadavpur University.
Internationally, Jadavpur University ranked 651-700 by the "QS World University Rankings" in 2020, 136 in Asia in 2020 and 75 among BRICS nations in 2019. It was ranked 801-1000 in the world by the "Times Higher Education World University Rankings" of 2020, 196 in Asia and 178 among Emerging Economies University Rankings in 2020. It was also ranked 772 in the world by "U.S. News & World Report". The university was ranked 543rd in the world by "CWTS Leiden Ranking" in 2017, for the period 2012–2015.
The "National Institutional Ranking Framework" has ranked it 17 among engineering institutes in India in 2020, 12 overall and 5th among universities.
The university press publishes all documents of record in the university, including PhD theses, question papers and journals. On 26 October 2010 the institution announced plans to launch a publication house, named Jadavpur University Press. The main focus of the publication house is to publish textbooks and theses written by research scholars and authors from all universities. The first two titles of JUP were launched on 1 February 2012 at the Calcutta Book Fair. The two titles were "Rajpurush" (a translation of Niccolò Machiavelli's "Il Principe"), translated by Doyeeta Majumder with an introduction by Swapan Kumar Chakravorty, and "Shilpachinta" (a translation of selections from Leonardo da Vinci's notebooks), translated by Sukanta Chaudhuri. Both books were translated from the original Italian.
To facilitate interdisciplinary learning and research in diverse fields, there are a number of schools and centres for studies. Some of the major research ventures undertaken by these schools include the pioneering work done by the School of Environmental Studies in highlighting the presence of arsenic in groundwater in countries like India and Bangladesh, and the development of the first alcohol-based car by the School of Automobile Engineering.
The centres for studies are usually directly associated with a particular department and the centres in Jadavpur University are:
In March 2011, Indian American scientist Manick Sorcar assisted in the opening of a laser animation lab under the School of Illumination Science, Engineering and Design.
Alumni of this university are known as 'Jadavpurians', or in Bangla as 'যদুবংশ' (Joduclan). The Alumni Association, one of the oldest in the country, was founded in 1921 by the ex-students of the National Council of Education.
In 2014 a series of protests broke out in response to the alleged molestation of a female student and the beating of a male student by 10 other students on 28 August 2014. Her family, and ultimately the student body, were dissatisfied with the response of the Vice Chancellor to the allegations. Protests began on 10 September. On 16 September students gheraoed several officials in their offices, demanding that the Vice Chancellor make a statement on the status of a fair probe. Police were summoned, and later that night the police allegedly attacked and beat the student demonstrators. 30 to 40 students were injured; some had to be hospitalized. Reaction was nationwide, with supportive protests in multiple other cities including New Delhi, Hyderabad and Bangalore. On 20 September, Governor Keshari Nath Tripathi, who is also the chancellor of the university, met with student representatives and promised to conduct an impartial inquiry. However, students said they would continue to boycott classes until the Vice Chancellor resigned.
On 26 September, a State Government inquiry panel submitted its report, confirming that the female student had indeed been sexually abused on 28 August 2014. The same day, police summoned two Jadavpur University students to the Lalbazar Police HQ for questioning at 4 pm on Friday; they were arrested at 6 pm. "The arrests were made after evidence was found, prima facie, against the duo. Further investigation is on," said joint CP-crime Pallab Kanti Ghosh. Mr Ghosh also stated, "(Two names) were arrested because we had enough evidence to prove that they were present at the spot and had carried out the crime as alleged in the victim's complaint." The duo were booked under Sections 354 (assault or use of criminal force on a woman with the intent to outrage her modesty), 342 (wrongful confinement), 323 (voluntarily causing hurt) and 114 (abettor present when offence is committed) of the IPC.
JU has been embroiled in controversies since 4 July 2018, when the executive council announced its decision to scrap entrance tests for six subjects, a move that was met with protests from the Jadavpur University Teachers' Association and the student unions, along with other academics and university students.
https://en.wikipedia.org/wiki?curid=16542
Prince Harry, Duke of Sussex
Prince Harry, Duke of Sussex (Henry Charles Albert David; born 15 September 1984), is a member of the British royal family. He is the younger son of Charles, Prince of Wales, and Diana, Princess of Wales, and is sixth in the line of succession to the British throne.
Harry was educated at Wetherby School, Ludgrove School, and Eton College. He spent parts of his gap year in Australia and Lesotho. He then underwent officer training at the Royal Military Academy Sandhurst. He was commissioned as a cornet (second lieutenant) into the Blues and Royals, serving temporarily with his brother Prince William, and he completed his training as a troop leader. In 2007–08, he served for over ten weeks in Helmand, Afghanistan, but was pulled out after an Australian magazine revealed his presence there. He returned to Afghanistan for a 20-week deployment in 2012–13 with the Army Air Corps. He left the army in June 2015.
Harry launched the Invictus Games in 2014 and remains patron of its foundation. He also gives patronage to several other organisations, including the HALO Trust, the London Marathon Charitable Trust, and Walking With The Wounded. On 19 May 2018, he married American actress Meghan Markle. Hours before the wedding, his grandmother Queen Elizabeth II made him Duke of Sussex. The couple's son, Archie Mountbatten-Windsor, was born on 6 May 2019. In January 2020, the couple announced their intention to step back as senior members of the royal family and split their time between the UK and North America.
Harry was born in the Lindo Wing of St Mary's Hospital in Paddington, London, on 15 September 1984 at 4:20 pm as the second child of Charles, Prince of Wales—heir apparent to Queen Elizabeth II—and Diana, Princess of Wales. He was baptised with the names Henry Charles Albert David, on 21 December 1984, at St George's Chapel, Windsor Castle, by the Archbishop of Canterbury, Robert Runcie.
His parents announced their second son's name would officially be Prince Henry Charles Albert David, but that he would be known as "Harry" to his family and friends. As the prince grew up, he was referred to by Kensington Palace, and therefore the Press and the public at large, as Prince Harry. As a son of the Prince of Wales, he was called Prince Henry of Wales. Diana wanted Harry and his older brother, Prince William, to have a broader range of experiences and a better understanding of ordinary life than previous royal children. She took them to venues that ranged from Walt Disney World and McDonald's to AIDS clinics and homeless shelters. Harry began accompanying his parents on official visits at an early age; his first overseas tour was with his parents to Italy in 1985.
Harry's parents divorced in 1996. His mother died in a car crash in Paris the following year. Harry and William were staying with their father at Balmoral at the time, and the Prince of Wales told his sons about their mother's death. At his mother's funeral, Harry, then 12, accompanied his father, brother, paternal grandfather, and maternal uncle, Charles Spencer, 9th Earl Spencer, in walking behind the funeral cortège from Kensington Palace to Westminster Abbey. In a 2017 interview with "The Daily Telegraph", the prince acknowledged that he sought counselling after two years of "total chaos" while struggling to come to terms with the death of his mother.
Like his father and brother, Harry was educated at independent schools. He started at London's Jane Mynors' nursery school and the pre-preparatory Wetherby School. Following this, he attended Ludgrove School in Berkshire. After passing the entrance exams, he was admitted to Eton College. The decision to place Harry at Eton went against the past practice of the Mountbatten-Windsors to send children to Gordonstoun, which Harry's grandfather, father, two uncles, and two cousins had attended. It did, however, see Harry follow in the Spencer family footsteps, as both Diana's father and brother attended Eton.
In June 2003, Harry completed his education at Eton with two A-Levels, achieving a grade B in art and D in geography, having decided to drop history of art after AS level. He excelled in sports, particularly polo and rugby union. One of Harry's former teachers, Sarah Forsyth, has asserted that Harry was a "weak student" and that staff at Eton conspired to help him cheat on examinations. Both Eton and Harry denied the claims. While a tribunal made no ruling on the cheating claim, it "accepted the prince had received help in preparing his A-level 'expressive' project, which he needed to pass to secure his place at Sandhurst."
After school, Harry took a gap year, during which he spent time in Australia working (as his father had done in his youth) on a cattle station, and participating in the Young England vs Young Australia Polo Test match. He also travelled to Lesotho, where he worked with orphaned children and produced the documentary film "The Forgotten Kingdom".
Harry entered the Royal Military Academy Sandhurst on 8 May 2005, where he was known as Officer Cadet Wales, and joined the Alamein Company. In April 2006, Harry completed his officer training and was commissioned as a Cornet (second lieutenant) in the Blues and Royals, a regiment of the Household Cavalry in the British Army. On 13 April 2008, when he reached two years' seniority, Harry was promoted to lieutenant.
In 2006, it was announced that Harry's unit was scheduled to be deployed in Iraq the following year. A public debate ensued as to whether he should serve there. Defence Secretary John Reid said that he should be allowed to serve on the front line of battle zones. Harry agreed, saying, "If they said 'no, you can't go front line' then I wouldn't drag my sorry ass through Sandhurst and I wouldn't be where I am now." The Ministry of Defence and Clarence House made a joint announcement on 22 February 2007 that Harry would be deployed with his regiment to Iraq, as part of the 1st Mechanised Brigade of the 3rd Mechanised Division – a move supported by Harry, who had stated that he would leave the army if he was told to remain in safety while his regiment went to war. He said: "There's no way I'm going to put myself through Sandhurst and then sit on my arse back home while my boys are out fighting for their country."
The head of the British army at the time, General Sir Richard Dannatt, said on 30 April 2007 that he had personally decided that Harry would serve with his unit in Iraq, and Harry was scheduled for deployment in May or June 2007, to patrol the Maysan Governorate. By 16 May, however, Dannatt announced that Harry would not serve in Iraq; concerns included Harry being a high-value target (as several threats by various groups had already been made against him) and the dangers the soldiers around him would face should any attempt be made on his life or if he was captured. Clarence House made public Harry's disappointment with the decision, though he said he would abide by it.
In early June 2007, it was reported that Harry had arrived in Canada to train alongside soldiers of the Canadian Forces and British Army, at CFB Suffield, near Medicine Hat, Alberta. It was said that this was in preparation for a tour of duty in Afghanistan, where Canadian and British forces were participating in the NATO-led Afghan War.
This was confirmed in February of the following year, when the British Ministry of Defence revealed that Harry had been secretly deployed as a Forward Air Controller to Helmand Province in Afghanistan for the previous ten weeks. The revelation came after the media – notably the German newspaper "Bild" and the Australian magazine "New Idea" – breached the blackout placed over the information by the Canadian and British authorities. It was later reported that Harry helped Gurkha troops repel an attack from Taliban insurgents, and performed patrol duty in hostile areas while in Afghanistan.
His tour made Harry the first member of the British royal family to serve in a war zone since his uncle Prince Andrew flew helicopters during the Falklands War. For his service, his aunt, Princess Anne, presented Harry with an Operational Service Medal for Afghanistan at the Combermere Barracks in May 2008.
In October 2008, it was announced that Harry was to follow his brother, father and uncle in learning to fly military helicopters. After passing the initial aptitude test, he was to undertake a month-long course; if he passed that, he would begin full flight training in early 2009.
Harry had to pass his flying assessment at the Army Air Corps Base (AAC), Middle Wallop, the result of which would determine whether he would continue on to train as a pilot of the Apache, Lynx, or Gazelle helicopter. Having reached the requisite standard, Harry attended the Defence Helicopter Flying School at RAF Shawbury, where he joined his brother.
Prince Charles presented him with his flying brevet (wings) on 7 May 2010 at a ceremony at the Army Air Corps Base (AAC), Middle Wallop. Harry had let it be known he intended to fly Apache attack helicopters if he was successful in passing the rigorous Apache training course. This would allow him to see active military service again on the frontline in Afghanistan.
On 10 March 2011, it was revealed that Harry had passed his Apache flying test and he was awarded his Apache Flying Badge on 14 April 2011. There was speculation he would return to Afghanistan before the withdrawal in 2015. On 16 April 2011, it was announced that Harry had been promoted to captain.
In June 2011, Clarence House announced that on completion of his training conversion course to use Apache helicopters in the war arena, Harry would be available for deployment, including in current operations in Afghanistan, as an Apache helicopter pilot. The final decision rested with the Ministry of Defence's senior commanders, including principally the Chief of the Defence Staff in consultation with the wishes of Harry, the Prince of Wales, and the Queen. In October, he was transferred to a US military base in California to complete his helicopter gunship training. This final phase included live-fire training and "environmental and judgment training" at naval and air force facilities in California and Arizona. Most of those completing the two-month Apache training were deployed to the front lines in Afghanistan. In the same month, it was reported that Harry was said to be a natural pilot who was top of his class in the extensive training he had undertaken at the Naval Air Facility, El Centro, California; while training in Southern California, he spent time in San Diego. In November 2011, Harry returned to England. He went to Wattisham Airfield in Suffolk, in the east of England, to complete his training to fly Apache helicopters.
On 7 September 2012, Harry arrived at Camp Bastion in southern Afghanistan as part of the 100-strong 662 Squadron, 3 Regiment, Army Air Corps, to begin a four-month combat tour as a co-pilot and gunner for an Apache helicopter. On 10 September, within days of arriving in Afghanistan, it was reported that the Taliban had threatened his life. Taliban spokesman Zabiullah Mujahid spoke to Reuters and was quoted as saying: "We are using all our strength to get rid of him, either by killing or kidnapping." He added, "We have informed our commanders in Helmand to do whatever they can to eliminate him."
On 21 January 2013, it was announced that Harry was returning from a 20-week deployment in Afghanistan, where he served as an Apache co-pilot/gunner. On 8 July 2013, the Ministry of Defence announced that Harry had successfully qualified as an Apache aircraft commander. Harry compared operating the Apache's weapons systems in Afghanistan to playing video games.
On 17 January 2014, the Ministry of Defence announced that Harry had completed his attachment to 3 Regiment Army Air Corps, and would take up a staff officer role, SO3 (Defence Engagement), in HQ London District. His responsibilities would include helping to co-ordinate significant projects and commemorative events involving the Army in London. He was based at Horse Guards in central London.
On 6 March 2014, Harry launched Invictus Games, a Paralympic-style sporting event for injured servicemen and women, which was held on 10–14 September 2014. Harry met British hopefuls for the Invictus Games at Tedworth House in Wiltshire for the start of the selection process on 29 April 2014. On 15 May 2014, Harry attended a ticket sale launch for Invictus Games at BT Tower, from where he tweeted on the Invictus Games' official Twitter account as the president of the Games. To promote the Games, he was interviewed by BBC Radio 2's Chris Evans along with two Invictus Games hopefuls. He said: "This (Invictus Games) is basically my full-time job at the moment, making sure that we pull this off." The show aired on 31 July 2014. Harry later wrote an article in "The Sunday Times" about his experiences in Afghanistan: how they had inspired him to help injured personnel and how, after the trip to the Warrior Games, he had vowed to create the Invictus Games. Harry and officials attended the British Armed Forces Team announcement for Invictus Games at Potters Field Park in August 2014. As president of the Invictus Games, he attended all events related to the Games from 8 to 14 September 2014.
In January 2015, it was reported that Harry would take a new role in supporting wounded service personnel by working alongside members of the London District's Personal Recovery Unit for the MOD's Defence Recovery Capability scheme to ensure that wounded personnel have adequate recovery plans. The palace confirmed weeks later that the scheme was established in partnership with Help for Heroes and the Royal British Legion.
In late January 2015, Harry visited The Battle Back Centre set up by the Royal British Legion, and Fisher House UK at the Queen Elizabeth Hospital Birmingham. A partnership between Help for Heroes, the Fisher House Foundation and the Queen Elizabeth Hospital Birmingham (QEHB) Charity created the Centre. Fisher House Foundation is one of the Invictus Games' sponsors.
In February and March 2015, Harry visited Phoenix House in Catterick Garrison, North Yorkshire, a recovery centre run by Help for Heroes. He also visited Merville Barracks in Colchester, where Chavasse VC House Personnel Recovery Centre is located, run by Help for Heroes in partnership with the Ministry of Defence and Royal British Legion.
On 17 March 2015, Kensington Palace announced that Harry would leave the Armed Forces in June. Before then, he would spend four weeks throughout April and May at army barracks in Darwin, Perth and Sydney whilst seconded to the Australian Defence Force (ADF). After leaving the Army, while considering his future, he would return to work in a voluntary capacity with the Ministry of Defence, supporting Case Officers in the Ministry's Recovery Capability Programme. He would be working with both those who administer and receive physical and mental care within the London District area.
On 6 April 2015, Harry reported for duty to Australia's Chief of the Defence Force, Air Chief Marshal Mark Binskin at the Royal Military College, Duntroon in Canberra, Australia. Harry flew to Darwin later that day to begin his month-long secondment to the ADF's 1st Brigade. His visit included detachments to NORFORCE as well as to an aviation unit. While in Perth, he trained with Special Air Service Regiment (SASR), participating in the SASR selection course, including a fitness test and a physical training session with SASR selection candidates. He also joined SASR members in Perth for live-fire shooting exercises with numerous Special Forces weapons at a variety of ranges. Harry completed an insertion training exercise using a rigid-hull inflatable boat. In Sydney, he undertook urban operations training with the 2nd Commando Regiment. Training activities included remotely detonating an Improvised Explosive Device (IED) and rappelling from a building. He also spent time flying over Sydney as co-pilot of an Army Black Hawk helicopter and participated in counter-terrorism training in Sydney Harbour with Royal Australian Navy clearance divers.
Harry's attachment with the ADF ended on 8 May 2015, and on 19 June 2015 he resigned his short service commission.
Since leaving active service with the army, Harry has been closely involved with the armed forces through the Invictus Games, honorary military appointments and other official engagements. On 19 December 2017, he succeeded his grandfather Prince Philip as Captain General of the Royal Marines. In May 2018, he was promoted to the substantive ranks of lieutenant commander in the Royal Navy, major in the British Army and squadron leader in the Royal Air Force.
Harry enjoys playing many sports, including competitive polo, skiing, and motocross. He is a supporter of Arsenal Football Club. He is also a keen rugby football fan: he supported England's bid to host rugby union's 2015 Rugby World Cup and presented the trophy at rugby league's 2019 Challenge Cup final.
In his youth Harry earned a reputation for being rebellious, leading the tabloid press to label him a "wild child". At age 17 he was seen smoking cannabis, drinking underage with friends, and clashing physically with paparazzi outside nightclubs. He was photographed at Highgrove House at a "Colonial and Native" themed costume party wearing a Nazi German Afrika Korps uniform with a swastika armband. He later issued a public statement apologising for his behaviour.
In January 2009, the British tabloid, the "News of the World", revealed a video made by Harry three years earlier in which he referred to a Pakistani fellow officer cadet as "our little Paki friend" and called a soldier wearing a cloth on his head a "raghead". These terms were described by Leader of the Opposition David Cameron as "unacceptable", and by "The Daily Telegraph" as "racist". A British Muslim youth organisation called Harry a "thug". Clarence House immediately issued an apology from Harry, who stated that no malice was intended in his remarks. Former British MP and Royal Marine, Rod Richards, said that such nicknames were common amongst military comrades, stating "in the Armed Forces people often used to call me Taffy. Others were called Yankie, Oz or Kiwi or whatever. I consider Paki as an abbreviation for Pakistani. I don't think on this occasion it was intended to be offensive."
While on holiday in Las Vegas in August 2012, Harry and an unknown young woman were photographed naked in a Wynn Las Vegas hotel room, reportedly during a game of strip billiards. The pictures were leaked by American celebrity website TMZ on 21 August 2012, and reported worldwide by mainstream media on 22 August 2012. The photographs were shown by the American media, but British media were reluctant to publish them. Royal aides suggested Clarence House would contact the Press Complaints Commission (PCC) if British publications used the pictures. St James's Palace confirmed that Harry was in the photographs, saying that he was essentially a victim whose privacy had been invaded and contacted the PCC upon hearing that a number of British newspapers were considering publishing the photographs. On 24 August 2012, "The Sun" newspaper published the photographs.
Polls conducted in the United Kingdom in November 2012 showed Harry to be the third-most popular member of the royal family, after William and the Queen.
Chelsy Davy, the daughter of Zimbabwean, South Africa-based businessman Charles Davy, was referred to as Harry's girlfriend in an interview conducted for his 21st birthday, and Harry said he "would love to tell everyone how amazing she is but once I start talking about that, I have left myself open... There is truth and there is lies and unfortunately I cannot get the truth across." Davy, who is a businesswoman and lawyer, was present when Harry received his Operational Service Medal for Afghanistan and also attended his graduation ceremony when he received his flying wings from his father. In early 2009, it was reported the pair had parted ways after a relationship that had lasted for five years.
In May 2012, Harry was introduced by his cousin Princess Eugenie to Cressida Bonas, an actress and model who is a granddaughter of Edward Curzon, 6th Earl Howe. On 30 April 2014, it was reported that the couple had parted amicably.
On 8 November 2016, Kensington Palace confirmed that Harry was "a few months" into a relationship with American actress Meghan Markle, in a statement from the prince asking for the "abuse and harassment" of Markle and her family to end. In September 2017, they made their first public appearance at an official royal engagement, the opening ceremonies of the Invictus Games in Toronto.
On 27 November 2017, Clarence House and Kensington Palace announced that Harry and Markle were engaged. The engagement announcement prompted much comment about the possible social significance of Meghan Markle becoming a mixed-race royal. The couple married at St George's Chapel, Windsor Castle, on 19 May 2018.
The Duke and Duchess initially lived at Nottingham Cottage in London, on the grounds of Kensington Palace. The couple later moved to the more than two-centuries-old Frogmore Cottage in the Home Park of Windsor Castle. The Crown Estate refurbished the cottage at a cost of £2.4 million, paid out of the Sovereign Grant, with the couple picking up expenses beyond restoration and ordinary maintenance. Their office was moved to Buckingham Palace.
On 6 May 2019, the couple's first child, Archie Mountbatten-Windsor, was born; he is seventh in line to the throne.
At the time of the announcement of Harry and Meghan's decision to "step back" as senior members of the royal family in 2020, 95% of the couple's income derived from the £2.3 million given to them annually by Harry's father, Charles, as part of his income from the Duchy of Cornwall.
Harry and his brother William inherited the "bulk" of the £12.9 million left by their mother on their respective 30th birthdays, a figure that had grown since her 1997 death to £10 million each in 2014. In 2002 "The Times" reported that Harry would also share with his brother a payment of £4.9 million from trust funds established by their great-grandmother, Queen Elizabeth The Queen Mother, on their respective 21st birthdays and would share a payment of £8 million upon their respective 40th birthdays. Harry's personal wealth was estimated at £30 million by "The Daily Telegraph" in 2020.
In 2014 Harry and William inherited their mother's wedding dress along with many of her other personal possessions, including dresses, diamond tiaras, jewels, letters, and paintings. The brothers also received the original lyrics and score of "Candle in the Wind" by Bernie Taupin and Elton John, as performed by John at Diana's funeral.
On 6 January 2009, the Queen granted Harry and William their own royal household. Previously, William and Harry's affairs had been handled by their father's office at Clarence House in central London. The new household released a statement announcing they had established their own office at nearby St James's Palace to look after their public, military and charitable activities.
In March 2012, Harry led an official visit to Belize as part of the Queen's Diamond Jubilee celebrations. He continued to the Bahamas and Jamaica, where the Prime Minister, Portia Simpson-Miller, was considering initiating a process of turning Jamaica into a republic. He then visited Brazil to attend the GREAT Campaign.
Between 9 and 15 May 2013, he made an official visit to the United States. The tour promoted the rehabilitation of injured American and UK troops, publicised his own charities and supported British interests. It included engagements in Washington, DC, Colorado, New York, New Jersey, and Connecticut. He met survivors of Hurricane Sandy in New Jersey. In October 2013, he undertook his first official tour of Australia, attending the International Fleet Review at Sydney Harbour. He also paid a visit to the Australian SAS HQ in Perth.
In May 2014, he visited Estonia and Italy. In Estonia, he visited Freedom Square in the capital Tallinn to honour fallen Estonian soldiers. He also attended a reception at the Estonian Parliament and a NATO military exercise. In Italy, Harry attended commemorations of the 70th anniversary of the Monte Cassino battles, in which Polish, Commonwealth and British troops fought. On 6 November 2014, he opened the Field of Remembrance at Westminster Abbey, a task usually performed by Prince Philip.
Before reporting for duty to the Australian Defence Force (ADF), Harry visited the Australian War Memorial in Canberra on 6 April 2015. On 24–25 April 2015, he joined his father in Turkey to attend commemorations of the centenary of the Gallipoli Campaign. On 7 May 2015, he made a farewell walkabout at the Sydney Opera House and visited Macquarie University Hospital.
From 30 November to 3 December 2015, he made an official visit to South Africa. He visited Cape Town, where he presented the insignia of the Order of the Companions of Honour to the Archbishop on behalf of the Queen.
He visited Nepal from 19 to 23 March 2016. He stayed until the end of the month to help rebuild a secondary school with Team Rubicon UK, and visited a hydropower project in central Nepal.
In April 2018, he was appointed Commonwealth youth ambassador and became a patron of Walk of America, a campaign bringing together a number of veterans for a 1,000-mile expedition across the US in mid-2018. That same month, he was appointed president of the Queen's Commonwealth Trust, which focuses on projects involving children and the welfare of prisoners, and was selected as one of the 100 Most Influential People in the World by "Time" magazine.
In April 2019, it was announced that he was working on a documentary series about mental health together with Oprah Winfrey, which is set to air in 2020 on Apple TV+. He will serve as co-creator and executive producer for the series.
The Sussexes' discontent with their public roles and media scrutiny grew in 2019. In May 2019, Splash News issued a formal apology to Harry and his wife for sending photographers to their Cotswolds residence, which put their privacy at risk. The agency also agreed to pay a "substantial" sum of damages and legal costs associated with the case. At the end of their tour of Southern African countries in September–October 2019, Harry issued a statement mentioning that his wife was "one of the latest victims of a British tabloid press". Later in October, it was announced that Harry had sued "The Sun", the "Daily Mirror" and the now-defunct "News of the World" "in relation to alleged phone-hacking". On 30 January 2020, the Independent Press Standards Organisation (IPSO) sided with the "Mail on Sunday" in a dispute between the Duke and the newspaper over an Instagram photo involving Harry in which, according to the newspaper, elephants were in fact "tranquilized" and "tethered" during a relocation process. The IPSO rejected Harry's claim that the paper's description was "inaccurate" or "misleading".
In January 2020, the Duke and Duchess announced that they were stepping back from their role as senior members of the royal family, and would balance their time between the United Kingdom and North America. The couple also said that they would seek financial independence while continuing to support their charities and the Queen. In the same month, the couple's lawyers issued a legal warning to the press after paparazzi photographs were published in the media. It was later confirmed that security for the Duke and Duchess during their stay in Canada was provided on an "as needed basis" by the Royal Canadian Mounted Police until they formally stepped down on 31 March, "in keeping with their change in status". According to Page Six and the Canadian Taxpayers Federation, security for their stay between November 2019 and January 2020 cost more than CAD$56,384. In March 2020, Harry attended the opening of the Silverstone Experience at Silverstone Circuit together with racing driver Lewis Hamilton; his appearance at the museum was his final solo engagement as a senior royal before he and his wife officially stepped down on 31 March.
At the age of 21, Harry was appointed a Counsellor of State and began his duties in that capacity. In 2006, he was in Lesotho to visit Mants'ase Children's Home near Mohale's Hoek, which he first toured in 2004. Along with Prince Seeiso of Lesotho, he launched Sentebale: The Princes' Fund for Lesotho, a charity to aid children orphaned by HIV/AIDS. He has granted his patronage to organisations including WellChild, Dolen Cymru, and MapAction.
On 26 November 2015, as patron of Sentebale, Harry travelled to Lesotho to attend the opening of the Mamohato Children's Centre. Two days later Harry played the Sentebale Royal Salute Polo Cup, at Val de Vie Estate in Cape Town, South Africa, fundraising for Sentebale.
Sport has been a way for Harry to help charities and other organisations. This has included training as a Rugby Development Officer for the Rugby Football Union in 2004 and coaching students in schools to encourage them to learn the sport. He and former rugby player Brian Moore both argued, in response to Black Lives Matter, that the song "Swing Low, Sweet Chariot" should no longer be sung in a rugby context. He is the patron of the Rugby Football League, rugby league's governing body in England. Like his brother and father, he has participated in polo matches to raise money for charitable causes.
In 2012, together with the Duke and Duchess of Cambridge, Harry launched Coach Core. The program was set up following the 2012 Olympics and provides apprenticeship opportunities for people who desire to pursue a career as a professional coach. As patron of Walk of Britain, he walked with the team on 30 September and 20 October 2015. On 28 October 2015, he carried out one day of engagements in the US. He launched the Invictus Games Orlando 2016 with First Lady Michelle Obama and Dr. Jill Biden at Fort Belvoir. He later attended an Invictus Games board meeting and a reception to celebrate the launch at the British Ambassador's Residence.
In June 2019, the Duke was present at the launch of Made by Sport, a charity coalition set to raise money to boost sport in disadvantaged communities. In his statement, he lent his support to the charity by arguing that its role in bringing sport into the life of disadvantaged people would save "hundreds of millions of pounds" towards treating the issues among young people. In February 2020, Harry recorded a new version of the song "Unbroken" with Jon Bon Jovi. The new version features backing vocals from members of the Invictus Choir. The song was released on 27 March 2020, the proceeds of which were donated to the Invictus Games Foundation.
In September 2009, William and Harry set up The Foundation of Prince William and Prince Harry to enable the princes to take forward their charitable ambitions. In June 2019, it was announced that the Duke and Duchess of Sussex would split from the charity and establish their own charity foundation by the end of 2019. The couples will reportedly collaborate on mutual projects, such as the mental health initiative Heads Together. In July 2019, Harry and Meghan's new charity was registered in England and Wales under the title "Sussex Royal The Foundation of The Duke and Duchess of Sussex". However, on 21 February 2020 it was confirmed that "Sussex Royal" would not be used as a brand name for the couple following their withdrawal from public life. In April 2020, responding to inquiries from "The Telegraph", the couple confirmed their new foundation would be called "Archewell". The name stems from the Greek word "arche", which means "source of action"; it is the same word that inspired the name of the couple's son, Archie.
In April 2020, Harry launched a new initiative named HeadFIT, a platform designed to provide mental support for members of the armed forces. The initiative was developed mutually by the Royal Foundation's Heads Together campaign, the Ministry of Defence, and King's College London.
To raise awareness for HIV testing, Harry took a test live on the royal family's Facebook page on 14 July 2016. He later attended the 21st International AIDS Conference in Durban, South Africa, on 21 July 2016. On World AIDS Day, Harry and Rihanna helped publicise HIV testing by taking the test themselves. Since 2016, Harry has been working with the Terrence Higgins Trust to raise awareness about HIV and sexual health.
In July 2018, the Elton John AIDS Foundation announced that the Duke of Sussex and British singer Elton John were about to launch a global coalition called MenStar that would focus "on treating HIV infections in men". During his trip to Angola in 2019, the Duke visited the Born Free to Shine project in Luanda, an initiative by First Lady Ana Dias Lourenço which aims to "prevent HIV transmission from mothers to babies" through education, medical testing and treatment. He also met HIV+ youth and teenagers during his visit. In November 2019, to mark the National HIV Testing Week, the Duke interviewed HIV+ Rugby player Gareth Thomas on behalf of the Terrence Higgins Trust.
On 27 December 2017, Harry was officially appointed the new president of African Parks, a conservation NGO. He previously spent three weeks in Malawi with African Parks where he joined a team of volunteers and professionals to carry out one of the largest elephant translocations in history. The effort to repopulate areas decimated due to poaching and habitat loss moved 500 elephants from Liwonde and Majete National Parks to Nkhotakota Wildlife Reserve.
In March 2019, Prince Harry gave a speech at WE Day UK, an annual event organised by WE Charity to inspire young people to become more active towards global social and environmental change. He discussed mental health, climate change and the importance of social participation. Harry attended a Google summit in Sicily in August 2019 and gave a speech on the importance of tackling climate change. He explained that he and Meghan plan to have no more than two children to help sustain the environment. Later that month, the Duke and Duchess of Sussex were criticised by environmental campaigners for regularly using private jets for their personal trips abroad, which leave a larger carbon footprint per person than commercial flights. The criticism was in line with the reactions the royal family faced in June 2019, after it was revealed that they "had doubled [their] carbon footprint from business travel." In September, the Duke launched Travalyst during his visit to the Netherlands after two years of development. The initiative is set "to encourage sustainable practices in the travel industry" and "tackle climate change and environmental damage", in collaboration with a number of companies.
During his visit to the Luengue-Luiana National Park in Angola in September 2019, the Duke unveiled an initiative by the Queen's Commonwealth Canopy to help with protecting "an ancient elephant migration route" by providing safe passage for the animals through the forest.
On the morning of his wedding he was granted by the Queen the Dukedom of Sussex, in the Peerage of the United Kingdom, first conferred in 1801 upon Prince Augustus Frederick, the sixth son of King George III, referring to the English county. At the same time he was also granted two subsidiary titles, both also in the Peerage of the United Kingdom: Earl of Dumbarton, a recreated ancient title formerly in the Peerage of Scotland, and Baron Kilkeel, a new title providing the Prince with a nominal territorial link to Northern Ireland, one of the four constituent countries of the United Kingdom. He is usually known as Prince Harry.
Before his marriage, Harry used Wales as his surname for military purposes and was known as "Captain Harry Wales" in such contexts.
On 4 June 2015, as part of the 2015 Special Honours, Harry was knighted by his grandmother, the Queen, for "services to the sovereign", being appointed a Knight Commander of the Royal Victorian Order (KCVO).
On 18 January 2020, Buckingham Palace announced that Harry would step back from royal duties, including official military appointments, in the spring of 2020, and that from 31 March 2020 he and his wife would no longer use their "Royal Highness" styles; as a British prince, however, he would not be stripped of his style and titles.
Harry's charitable efforts have been recognised three times by the international community. In December 2010, the German charity Ein Herz für Kinder ("A Heart for Children") awarded him the Golden Heart Award, in recognition of his "charitable and humanitarian efforts". On 7 May 2012, the Atlantic Council awarded him its Distinguished Humanitarian Leadership Award. In August 2018, the Royal Canadian Legion granted him the 2018 Founders Award for his role in founding the Invictus Games.
Agnatically, Harry is a member of the House of Glücksburg, a cadet branch of the House of Oldenburg, one of Europe's oldest royal houses. Harry's paternal grandmother, Queen Elizabeth II, issued letters patent on 8 February 1960 declaring his father to be a member of the House of Windsor.
Ancestors on Harry's father's side include most of the royal families of Europe, and on his mother's side, the Earls Spencer.
|
https://en.wikipedia.org/wiki?curid=14457
|
Hail
Hail is a form of solid precipitation. It is distinct from ice pellets (American English "sleet"), though the two are often confused. It consists of balls or irregular lumps of ice, each of which is called a hailstone. Ice pellets fall generally in cold weather while hail growth is greatly inhibited during cold surface temperatures.
Unlike other forms of water ice such as graupel, which is made of rime, and ice pellets, which are smaller and translucent, hailstones usually measure between 5 mm (0.2 in) and 15 cm (6 in) in diameter. The METAR reporting code for hail 5 mm (0.2 in) or greater is GR, while smaller hailstones and graupel are coded GS.
Hail is possible within most thunderstorms, as it is produced by cumulonimbus, and within 2 nmi (3.7 km) of the parent storm. Hail formation requires environments of strong, upward motion of air within the parent thunderstorm (similar to tornadoes) and lowered heights of the freezing level. In the mid-latitudes, hail forms near the interiors of continents, while in the tropics, it tends to be confined to high elevations.
There are methods available to detect hail-producing thunderstorms using weather satellites and weather radar imagery. Hailstones generally fall at higher speeds as they grow in size, though complicating factors such as melting, friction with air, wind, and interaction with rain and other hailstones can slow their descent through Earth's atmosphere. Severe weather warnings are issued for hail when the stones reach a damaging size, as it can cause serious damage to human-made structures and, most commonly, farmers' crops.
Any thunderstorm which produces hail that reaches the ground is known as a hailstorm. Hail has a diameter of 5 mm (0.20 in) or more. Hailstones can grow to 15 cm (6 in) and weigh more than 0.5 kg (1.1 lb).
Unlike ice pellets, hailstones are layered and can be irregular and clumped together. Hail is composed of transparent ice or alternating layers of transparent and translucent ice at least 1 mm (0.039 in) thick, which are deposited upon the hailstone as it travels through the cloud, suspended aloft by air with strong upward motion until its weight overcomes the updraft and falls to the ground. Although the diameter of hail is varied, in the United States, the average observation of damaging hail is between 2.5 cm (1 in) and golf ball-sized (1.75 in).
Stones larger than 2 cm (0.80 in) are usually considered large enough to cause damage. The Meteorological Service of Canada issues severe thunderstorm warnings when hail that size or above is expected. The US National Weather Service has a 2.5 cm (1 in) or greater in diameter threshold, effective January 2010, an increase over the previous threshold of ¾-inch hail. Other countries have different thresholds according to local sensitivity to hail; for instance grape growing areas could be adversely impacted by smaller hailstones. Hailstones can be very large or very small, depending on how strong the updraft is: weaker hailstorms produce smaller hailstones than stronger hailstorms (such as supercells).
Hail forms in strong thunderstorm clouds, particularly those with intense updrafts, high liquid water content, great vertical extent, large water droplets, and where a good portion of the cloud layer is below freezing (0 °C or 32 °F). These types of strong updrafts can also indicate the presence of a tornado. The growth rate of hailstones is impacted by factors such as higher elevation, lower freezing zones, and wind shear.
Like other precipitation in cumulonimbus clouds, hail begins as water droplets. As the droplets rise and the temperature goes below freezing, they become supercooled water and will freeze on contact with condensation nuclei. A cross-section through a large hailstone shows an onion-like structure. This means the hailstone is made of thick and translucent layers, alternating with layers that are thin, white and opaque. Former theory suggested that hailstones were subjected to multiple descents and ascents, falling into a zone of humidity and refreezing as they were uplifted. This up and down motion was thought to be responsible for the successive layers of the hailstone. New research, based on theory as well as field study, has shown this is not necessarily true.
The storm's updraft, with upwardly directed wind speeds as high as 110 mph (180 km/h), blows the forming hailstones up the cloud. As the hailstone ascends it passes into areas of the cloud where the concentration of humidity and supercooled water droplets varies. The hailstone's growth rate changes depending on the variation in humidity and supercooled water droplets that it encounters. The accretion rate of these water droplets is another factor in the hailstone's growth. When the hailstone moves into an area with a high concentration of water droplets, it captures the latter and acquires a translucent layer. Should the hailstone move into an area where mostly water vapor is available, it acquires a layer of opaque white ice.
Furthermore, the hailstone's speed depends on its position in the cloud's updraft and its mass. This determines the varying thicknesses of the layers of the hailstone. The accretion rate of supercooled water droplets onto the hailstone depends on the relative velocities between these water droplets and the hailstone itself. This means that generally the larger hailstones will form some distance from the stronger updraft where they can pass more time growing. As the hailstone grows it releases latent heat, which keeps its exterior in a liquid phase. Because it undergoes 'wet growth', the outer layer is "sticky" (i.e. more adhesive), so a single hailstone may grow by collision with other smaller hailstones, forming a larger entity with an irregular shape.
Hail can also undergo 'dry growth' in which the latent heat release through freezing is not enough to keep the outer layer in a liquid state. Hail forming in this manner appears opaque due to small air bubbles that become trapped in the stone during rapid freezing. These bubbles coalesce and escape during the 'wet growth' mode, and the hailstone is more clear. The mode of growth for a hailstone can change throughout its development, and this can result in distinct layers in a hailstone's cross-section.
The hailstone will keep rising in the thunderstorm until its mass can no longer be supported by the updraft. This may take at least 30 minutes based on the force of the updrafts in the hail-producing thunderstorm, whose top is usually greater than 10 km high. It then falls toward the ground while continuing to grow, based on the same processes, until it leaves the cloud. It will later begin to melt as it passes into air above freezing temperature.
Thus, a unique trajectory in the thunderstorm is sufficient to explain the layer-like structure of the hailstone. The only case in which multiple trajectories can be discussed is in a multicellular thunderstorm, where the hailstone may be ejected from the top of the "mother" cell and captured in the updraft of a more intense "daughter" cell. This, however, is an exceptional case.
Hail is most common within continental interiors of the mid-latitudes, as hail formation is considerably more likely when the freezing level is below the altitude of 11,000 ft (3,400 m). Movement of dry air into strong thunderstorms over continents can increase the frequency of hail by promoting evaporational cooling, which lowers the freezing level of thunderstorm clouds, giving hail a larger volume to grow in. Accordingly, hail is less common in the tropics despite a much higher frequency of thunderstorms than in the mid-latitudes, because the atmosphere over the tropics tends to be warmer over a much greater altitude. Hail in the tropics occurs mainly at higher elevations.
Hail growth becomes vanishingly small when air temperatures fall below −30 °C (−22 °F), as supercooled water droplets become rare at these temperatures. Around thunderstorms, hail is most likely within the cloud at elevations above 20,000 ft (6,100 m). Between 10,000 and 20,000 ft (3,000 and 6,100 m), 60 percent of hail is still within the thunderstorm, though 40 percent now lies within the clear air under the anvil. Below 10,000 ft (3,000 m), hail is equally distributed in and around a thunderstorm to a distance of 2 nmi (3.7 km).
Hail occurs most frequently within continental interiors at mid-latitudes and is less common in the tropics, despite a much higher frequency of thunderstorms than in the mid-latitudes. Hail is also much more common along mountain ranges because mountains force horizontal winds upwards (known as orographic lifting), thereby intensifying the updrafts within thunderstorms and making hail more likely. The higher elevations also result in there being less time available for hail to melt before reaching the ground. One of the more common regions for large hail is across mountainous northern India, which reported one of the highest hail-related death tolls on record in 1888. China also experiences significant hailstorms. Central Europe and southern Australia also experience a lot of hailstorms. Regions where hailstorms frequently occur are southern and western Germany, northern and eastern France, and southern and eastern Benelux. In southeastern Europe, Croatia and Serbia experience frequent occurrences of hail.
In North America, hail is most common in the area where Colorado, Nebraska, and Wyoming meet, known as "Hail Alley". Hail in this region occurs between the months of March and October during the afternoon and evening hours, with the bulk of the occurrences from May through September. Cheyenne, Wyoming is North America's most hail-prone city with an average of nine to ten hailstorms per season. To the north of this area and also just downwind of the Rocky Mountains is the Hailstorm Alley region of Alberta, which also experiences an increased incidence of significant hail events.
Weather radar is a very useful tool to detect the presence of hail-producing thunderstorms. However, radar data has to be complemented by a knowledge of current atmospheric conditions which can allow one to determine if the current atmosphere is conducive to hail development.
Modern radar scans many angles around the site. Reflectivity values at multiple angles above ground level in a storm are proportional to the precipitation rate at those levels. Summing reflectivities into the Vertically Integrated Liquid, or VIL, gives the liquid water content in the cloud. Research shows that hail development in the upper levels of the storm is related to the evolution of VIL. VIL divided by the vertical extent of the storm, called VIL density, has a relationship with hail size, although this varies with atmospheric conditions and is therefore not highly accurate. Traditionally, hail size and probability can be estimated from radar data by computer using algorithms based on this research. Some algorithms include the height of the freezing level to estimate the melting of the hailstone and what would be left on the ground.
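As a rough illustration of how such an algorithm can work, the sketch below derives VIL and VIL density from a single vertical profile of reflectivity using the widely cited Greene and Clark (1972) relation $M = 3.44\times10^{-6}\,Z^{4/7}$ between linear reflectivity $Z$ (mm⁶/m³) and liquid water content $M$ (kg/m³); the profile values, layer spacing, and storm depth are illustrative assumptions, not real radar data.

```python
import numpy as np

def vil_from_profile(dbz, heights_m):
    """Vertically Integrated Liquid (kg/m^2) from a reflectivity profile.

    Converts dBZ to linear reflectivity, maps it to liquid water content
    with the Greene & Clark (1972) relation, then integrates over depth
    with a trapezoidal rule.
    """
    z_linear = 10.0 ** (np.asarray(dbz) / 10.0)       # dBZ -> mm^6/m^3
    lwc = 3.44e-6 * z_linear ** (4.0 / 7.0)           # kg/m^3 at each level
    layer_depth = np.diff(heights_m)                  # m
    layer_mean = 0.5 * (lwc[:-1] + lwc[1:])
    return float(np.sum(layer_mean * layer_depth))    # kg/m^2

# Hypothetical profile: reflectivity sampled every 1 km from 1 km to 10 km.
heights = np.arange(1_000, 11_000, 1_000)             # m above ground
dbz = [45, 50, 55, 58, 60, 58, 55, 50, 45, 40]

vil = vil_from_profile(dbz, heights)
depth_km = (heights[-1] - heights[0]) / 1_000.0       # storm depth, km
# VIL (kg/m^2) divided by depth (km) is numerically g/m^3.
print(f"VIL = {vil:.1f} kg/m^2, VIL density = {vil / depth_km:.2f} g/m^3")
```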
Certain patterns of reflectivity are important clues for the meteorologist as well. The three body scatter spike is an example. This is the result of energy from the radar hitting hail and being deflected to the ground, where it deflects back to the hail and then to the radar. The energy takes more time to travel from the hail to the ground and back than the energy that travels directly from the hail to the radar, so the echo appears farther away from the radar than the actual location of the hail on the same radial path, forming a cone of weaker reflectivities.
More recently, the polarization properties of weather radar returns have been analyzed to differentiate between hail and heavy rain. The use of differential reflectivity (formula_1), in combination with horizontal reflectivity (formula_2) has led to a variety of hail classification algorithms. Visible satellite imagery is beginning to be used to detect hail, but false alarm rates remain high using this method.
The size of hailstones is best determined by measuring their diameter with a ruler. In the absence of a ruler, hailstone size is often visually estimated by comparing it to that of known objects, such as coins. Using objects such as hen's eggs, peas, and marbles for comparing hailstone sizes is imprecise, due to their varied dimensions. The UK organisation TORRO also has scales for both hailstones and hailstorms.
When observed at an airport, METAR code is used within a surface weather observation which relates to the size of the hailstone. Within METAR code, GR is used to indicate larger hail, of a diameter of at least 6.4 mm (0.25 in). GR is derived from the French word grêle. Smaller-sized hail, as well as snow pellets, use the coding of GS, which is short for the French word grésil.
Terminal velocity of hail, or the speed at which hail is falling when it strikes the ground, varies. It is estimated that a hailstone of 1 cm (0.39 in) in diameter falls at a rate of 9 m/s (20 mph), while stones the size of 8 cm (3.1 in) in diameter fall at a rate of 48 m/s (110 mph). Hailstone velocity is dependent on the size of the stone, friction with the air it is falling through, the motion of wind it is falling through, collisions with raindrops or other hailstones, and melting as the stones fall through a warmer atmosphere. As hailstones are not perfect spheres, it is difficult to calculate their speed accurately.
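The size dependence can be illustrated with a simple drag-balance estimate: equating the weight of a spherical stone with aerodynamic drag gives $v = \sqrt{4 g d \rho_{\text{ice}} / (3 C_d \rho_{\text{air}})}$. The sketch below assumes a smooth sphere, a drag coefficient of about 0.6, and typical densities, so it ignores the melting, wind, and collision effects noted above and tends to run high for small, low-density stones.

```python
import math

def terminal_velocity(diameter_m, drag_coeff=0.6, rho_ice=900.0, rho_air=1.0):
    """Drag-balance terminal velocity (m/s) of an idealised spherical hailstone.

    Balances weight, rho_ice * g * (pi/6) * d^3, against drag,
    0.5 * C_d * rho_air * v^2 * (pi/4) * d^2, and solves for v.
    """
    g = 9.81  # m/s^2
    return math.sqrt(4.0 * g * diameter_m * rho_ice / (3.0 * drag_coeff * rho_air))

for d_cm in (1, 2.5, 5, 8):
    v = terminal_velocity(d_cm / 100.0)
    print(f"{d_cm:>4} cm stone: ~{v:4.1f} m/s ({v * 3.6:3.0f} km/h)")
```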
Megacryometeors, large rocks of ice that are not associated with thunderstorms, are not officially recognized by the World Meteorological Organization as hail, which is an aggregation of ice associated with thunderstorms, and therefore records of extreme characteristics of megacryometeors are not given as hail records.
Hail can cause serious damage, notably to automobiles, aircraft, skylights, glass-roofed structures, livestock, and most commonly, crops. Hail damage to roofs often goes unnoticed until further structural damage is seen, such as leaks or cracks. It is hardest to recognize hail damage on shingled roofs and flat roofs, but all roofs have their own hail damage detection problems. Metal roofs are fairly resistant to hail damage, but may accumulate cosmetic damage in the form of dents and damaged coatings.
Hail is one of the most significant thunderstorm hazards to aircraft. When hailstones exceed 0.5 in (13 mm) in diameter, planes can be seriously damaged within seconds. The hailstones accumulating on the ground can also be hazardous to landing aircraft. Hail is also a common nuisance to drivers of automobiles, severely denting the vehicle and cracking or even shattering windshields and windows. Wheat, corn, soybeans, and tobacco are the most sensitive crops to hail damage. Hail is one of Canada's most expensive hazards.
Rarely, massive hailstones have been known to cause concussions or fatal head trauma. Hailstorms have been the cause of costly and deadly events throughout history. One of the earliest known incidents occurred around the 9th century in Roopkund, Uttarakhand, India, where 200 to 600 nomads seem to have died of injuries from hail the size of cricket balls.
Narrow zones where hail accumulates on the ground in association with thunderstorm activity are known as hail streaks or hail swaths, which can be detectable by satellite after the storms pass by. Hailstorms normally last from a few minutes up to 15 minutes in duration. Accumulating hailstorms can blanket the ground with over 2 in (5.1 cm) of hail, cause thousands to lose power, and bring down many trees. Flash flooding and mudslides within areas of steep terrain can be a concern with accumulating hail.
Depths of up to have been reported. A landscape covered in accumulated hail generally resembles one covered in accumulated snow and any significant accumulation of hail has the same restrictive effects as snow accumulation, albeit over a smaller area, on transport and infrastructure. Accumulated hail can also cause flooding by blocking drains, and hail can be carried in the floodwater, turning into a snow-like slush which is deposited at lower elevations.
On somewhat rare occasions, a thunderstorm can become stationary or nearly so while prolifically producing hail and significant depths of accumulation do occur; this tends to happen in mountainous areas, such as the July 29, 2010 case of a foot of hail accumulation in Boulder County, Colorado. On June 5, 2015, hail up to four feet deep fell on one city block in Denver, Colorado. The hailstones, described as between the size of bumble bees and ping pong balls, were accompanied by rain and high winds. The hail fell in only the one area, leaving the surrounding area untouched. It fell for one and a half hours between 10 p.m. and 11:30 pm. A meteorologist for the National Weather Service in Boulder said, "It's a very interesting phenomenon. We saw the storm stall. It produced copious amounts of hail in one small area. It's a meteorological thing." Tractors used to clear the area filled more than 30 dump-truck loads of hail.
Research focused on four individual days that accumulated more than 5.9 inches (15 cm) of hail in 30 minutes on the Colorado front range has shown that these events share similar patterns in observed synoptic weather, radar, and lightning characteristics, suggesting the possibility of predicting these events prior to their occurrence. A fundamental problem in continuing research in this area is that, unlike hail diameter, hail depth is not commonly reported. The lack of data leaves researchers and forecasters in the dark when trying to verify operational methods. A cooperative effort between the University of Colorado and the National Weather Service is in progress. The joint project's goal is to enlist the help of the general public to develop a database of hail accumulation depths.
During the Middle Ages, people in Europe used to ring church bells and fire cannons to try to prevent hail, and the subsequent damage to crops. Updated versions of this approach are available as modern hail cannons. Cloud seeding after World War II was done to eliminate the hail threat, particularly across the Soviet Union – where it was claimed a 70–98% reduction in crop damage from hail storms was achieved by deploying silver iodide in clouds using rockets and artillery shells. Hail suppression programs have been undertaken by 15 countries between 1965 and 2005.
|
https://en.wikipedia.org/wiki?curid=14458
|
Hypnotherapy
Hypnotherapy is a type of alternative medicine in which hypnosis is used to create a state of focused attention and increased suggestibility during which positive suggestions and guided imagery are used to help individuals deal with a variety of concerns and issues.
The United States' "Federal Dictionary of Occupational Titles" describes the job of the hypnotherapist:
"Induces hypnotic state in client to increase motivation or alter behavior patterns: Consults with client to determine nature of problem. Prepares client to enter hypnotic state by explaining how hypnosis works and what client will experience. Tests subject to determine degree of physical and emotional suggestibility. Induces hypnotic state in client, using individualized methods and techniques of hypnosis based on interpretation of test results and analysis of client's problem. May train client in self-hypnosis conditioning."
GOE: 10.02.02 STRENGTH: S GED: R4 M3 L4 SVP: 7 DLU: 77" (This definition was created in 1973 by John Kappas, hypnotherapist and founder of the Hypnosis Motivation Institute.)
The form of hypnotherapy practiced by most Victorian hypnotists, including James Braid and Hippolyte Bernheim, mainly employed direct suggestion of symptom removal, with some use of therapeutic relaxation and occasionally aversion to alcohol, drugs, etc.
In the 1950s, Milton H. Erickson developed a radically different approach to hypnotism, which has subsequently become known as "Ericksonian hypnotherapy" or "Neo-Ericksonian hypnotherapy." Erickson made use of an informal conversational approach with many clients and complex language patterns, and therapeutic strategies. This divergence from tradition led some of his colleagues, including Andre Weitzenhoffer, to dispute whether Erickson was right to label his approach "hypnosis" at all.
The founders of neuro-linguistic programming (NLP), a method somewhat similar in some regards to some versions of hypnotherapy, claimed that they had modelled the work of Erickson extensively and assimilated it into their approach. Weitzenhoffer disputed whether NLP bears any genuine resemblance to Erickson's work.
In the 2000s, hypnotherapists began to combine aspects of solution-focused brief therapy (SFBT) with Ericksonian hypnotherapy to produce therapy that was goal-focused (what the client wanted to achieve) rather than the more traditional problem-focused approach (spending time discussing the issues that brought the client to seek help). A solution-focused hypnotherapy session may include techniques from NLP.
Cognitive behavioural hypnotherapy (CBH) is an integrated psychological therapy employing clinical hypnosis and cognitive behavioural therapy (CBT). The use of CBT in conjunction with hypnotherapy may result in greater treatment effectiveness. A meta-analysis of eight different studies revealed "a 70% greater improvement" for patients undergoing an integrated treatment compared to those using CBT only.
In 1974, Theodore X. Barber and his colleagues published a review of the research which argued, following the earlier social psychology of Theodore R. Sarbin, that hypnotism was better understood not as a "special state" but as the result of normal psychological variables, such as active imagination, expectation, appropriate attitudes, and motivation. Barber introduced the term "cognitive-behavioral" to describe the nonstate theory of hypnotism, and discussed its application to behavior therapy.
The growing application of cognitive and behavioral psychological theories and concepts to the explanation of hypnosis paved the way for a closer integration of hypnotherapy with various cognitive and behavioral therapies.
Many cognitive and behavioral therapies were themselves originally influenced by older hypnotherapy techniques, e.g., the systematic desensitisation of Joseph Wolpe, the cardinal technique of early behavior therapy, was originally called "hypnotic desensitisation" and derived from the "Medical Hypnosis" (1948) of Lewis Wolberg.
David Lesser (1928–2001) was the originator of what is today known as "curative hypnotherapy". He first saw the possibility of finding the causes of people's symptoms by using a combination of hypnosis, ideomotor response (IMR) and a method of specific questioning that he began to explore. Rather than try to override the subconscious information as Janet had done, he realised the necessity of correcting the wrong information, and developed a process to do so. Lesser's understanding of the logicality and simplicity of the subconscious led to the creation of the methodical treatment used today; it is his work and understanding that underpins the therapy, and it is why the term "Lesserian" was coined and trademarked. As the understanding of the workings of the subconscious continues to evolve, the application of the therapy continues to change. The three most influential changes have been in Specific Questioning (1992), to gain more accurate subconscious information; a subconscious cause/effect mapping system (SRBC) (1996), to streamline the process of curative hypnotherapy treatment; and the 'LBR Criteria' (2003), to differentiate more easily between causal and trigger events and to help target more accurately the erroneous data which requires reinterpretation.
Hypnotherapy expert Dr. Peter Marshall, former principal of the London School of Hypnotherapy and Psychotherapy Ltd. and author of "A Handbook of Hypnotherapy", devised the Trance Theory of Mental Illness, which holds that people suffering from depression, or certain other kinds of neuroses, are already living in a trance, so the hypnotherapist does not need to induce one, but rather to make them understand this and help lead them out of it.
Mindful hypnotherapy is therapy that incorporates mindfulness and hypnotherapy. A pilot study was made at Baylor University, Texas, and published in the "International Journal of Clinical and Experimental Hypnosis". Dr. Gary Elkins, director of the Mind-Body Medicine Research Laboratory at Baylor University, called it "a valuable option for treating anxiety and stress reduction" and "an innovative mind-body therapy". The study showed a decrease in stress and an increase in mindfulness.
Clinicians choose hypnotherapy to address a wide range of circumstances; however, according to Yeates (2016), people choose to have hypnotherapy for many other reasons:
Smoking
Some studies report that hypnotherapy has a greater effect on six-month quit rates than other interventions; however, Cochrane reviews (discussed below) have found no reliable evidence of such a benefit.

Childbirth

Hypnotherapy is often applied in the birthing process and the post-natal period, but there is insufficient evidence to determine if it alleviates pain during childbirth, and no evidence from randomised controlled trials that hypnosis during pregnancy, childbirth, and the postnatal period is effective for preventing postnatal depression. Until 2012, there was no thorough research on this topic. However, a 2013 study found that "the use of hypnosis in childbirth leads to a decrease in the amount of pharmacological analgesia and oxytocin used, which reduces the duration of the first stage of labor". Also in 2013, studies conducted in Denmark concluded that "the self-hypnosis course improves the experience of childbirth in women and also reduces the level of fear". In 2015, a similar study conducted in the UK by a group of researchers reported that "the positive experience of self-hypnosis gives a sense of calm, confidence and empowerment in childbirth". Hypnobirthing has been used by public figures such as Kate Middleton.
Literature shows that a wide variety of hypnotic interventions have been investigated for the treatment of bulimia nervosa, with inconclusive effect. Similar studies have shown that patients with bulimia nervosa who underwent hypnotherapy improved more than those receiving no treatment, placebos, or other alternative treatments.
Among its many other applications in other medical domains, hypnotism was used therapeutically, by some alienists in the Victorian era, to treat the condition then known as hysteria.
Modern hypnotherapy is widely accepted for the treatment of certain habit disorders, to control irrational fears, as well as in the treatment of conditions such as insomnia and addiction. Hypnosis has also been used to enhance recovery from non-psychological conditions such as after surgical procedures, in breast cancer care and even with gastro-intestinal problems, including IBS.
A 2003 meta-analysis on the efficacy of hypnotherapy concluded that "the efficacy of hypnosis is not verified for a considerable part of the spectrum of psychotherapeutic practice."
In 2005, a meta-analysis by the Cochrane Collaboration found no evidence that hypnotherapy was more successful than other treatments or no treatment in achieving cessation of smoking for at least six months.
In 2007, a meta-analysis from the Cochrane Collaboration found that the therapeutic effect of hypnotherapy was "superior to that of a waiting list control or usual medical management, for abdominal pain and composite primary IBS symptoms, in the short term in patients who fail standard medical therapy", with no harmful side-effects. However the authors noted that the quality of data available was inadequate to draw any firm conclusions.
Two Cochrane reviews in 2012 concluded that there was insufficient evidence to support its efficacy in managing the pain of childbirth or post-natal depression.
In 2016, a literature review published in La Presse Medicale found that there is not sufficient evidence to "support the efficacy of hypnosis in chronic anxiety disorders".
In 2019, a Cochrane review was unable to find evidence of benefit of hypnosis in smoking cessation, and suggested if there is, it is small at best.
The laws regarding hypnosis and hypnotherapy vary by state and municipality. Some states, like Colorado, Connecticut and Washington, have mandatory licensing and registration requirements, while many other states have no specific regulations governing the practice of hypnotherapy.
In 2002, the Department for Education and Skills developed National Occupational Standards for hypnotherapy linked to National Vocational Qualifications based on the then National Qualifications Framework under the Qualifications and Curriculum Authority. NCFE, a national awarding body, issues level four national vocational qualification diploma in hypnotherapy. Currently AIM Awards offers a Level 3 Certificate in Hypnotherapy and Counselling Skills at level 3 of the Regulated Qualifications Framework.
The regulation of the hypnotherapy profession in the UK is at present the main focus of UKCHO, a non-profit umbrella body for hypnotherapy organisations. Founded in 1998 to provide a non-political arena to discuss and implement changes to the profession of hypnotherapy, UKCHO currently represents 9 of the UK's professional hypnotherapy organisations and has developed standards of training for hypnotherapists, along with codes of conduct and practice that all UKCHO registered hypnotherapists are governed by. As a step towards the regulation of the profession, UKCHO's website now includes a National Public Register of Hypnotherapists who have been registered by UKCHO's Member Organisations and are therefore subject to UKCHO's professional standards. Further steps to full regulation of the hypnotherapy profession will be taken in consultation with the Prince's Foundation for Integrated Health.
Professional hypnotherapy and use of the occupational titles "hypnotherapist" or "clinical hypnotherapist" are not government-regulated in Australia.
In 1996, as a result of a three-year research project led by Lindsay B. Yeates, the Australian Hypnotherapists Association (founded in 1949), the oldest hypnotism-oriented professional organization in Australia, instituted a peer-group accreditation system for full-time Australian professional hypnotherapists, the first of its kind in the world, which "accredit[ed] specific individuals on the basis of their actual demonstrated knowledge and clinical performance; instead of approving particular 'courses' or approving particular 'teaching institutions'" (Yeates, 1996, p.iv; 1999, p.xiv). The system was further revised in 1999.
Australian hypnotism/hypnotherapy organizations (including the Australian Hypnotherapists Association) are seeking government regulation similar to other mental health professions. However, currently hypnotherapy is not subject to government regulation through the Australian Health Practitioner Regulation Agency (AHPRA).
|
https://en.wikipedia.org/wiki?curid=14459
|
Hangman (game)
Hangman is a paper and pencil guessing game for two or more players. One player thinks of a word, phrase or sentence and the other(s) tries to guess it by suggesting letters within a certain number of guesses.
If none of the guessing players identifies the word, the player who chose it does not score a point.
Though the origins of the game are unknown, a variant is mentioned in a book of children's games assembled by Alice Gomme in 1894. This version lacks the image of a hanged man, instead relying on keeping score as to the number of attempts it took each player to fill in the blanks.
The word to guess is represented by a row of dashes, representing each letter of the word. In most variants, proper nouns, such as names, places, and brands, are not allowed. Slang words, sometimes referred to as informal or shortened words, are also not allowed. If the guessing player suggests a letter which occurs in the word, the other player writes it in all its correct positions. If the suggested letter does not occur in the word, the other player draws one element of a hanged man stick figure as a tally mark.
The player guessing the word may, at any time, attempt to guess the whole word. If the word is correct, the game is over and the guesser wins. Otherwise, the other player may choose to penalize the guesser by adding an element to the diagram. On the other hand, if the other player makes enough incorrect guesses to allow his opponent to complete the diagram, the game is also over, this time with the guesser losing. However, the guesser can also win by guessing all the letters that appear in the word, thereby completing the word, before the diagram is completed.
As the name of the game suggests, the diagram is designed to look like a hanging man. Although the game's macabre imagery has prompted debate, it is still in use today. A common alternative for teachers is to draw an apple tree with ten apples, erasing or crossing out the apples as the guesses are used up.
The exact nature of the diagram differs; some players draw the gallows before play and draw parts of the man's body (traditionally the head, then the torso, then the arms and legs one by one). Some players begin with no diagram at all and draw the individual elements of the gallows as part of the game, effectively giving the guessing players more chances. The amount of detail on the man can also vary, affecting the number of chances. Some players include a face on the head, either all at once or one feature at a time. Some players add a final element, a line at the neck representing beheading, as the last chance.
Some modifications to game play (house rules) to increase the difficulty level are sometimes implemented, such as limiting guesses on high-frequency consonants and vowels. Another alternative is to give the definition of the word; this can be used to facilitate the learning of a foreign language.
The twelve most commonly occurring letters in the English language are e-t-a-o-i-n-s-h-r-d-l-u (from most to least frequent); this list, along with other letter-frequency lists, is used by the guessing player to increase the odds when it is their turn to guess. On the other hand, the same lists can be used by the puzzle setter to stump their opponent by choosing a word that deliberately avoids common letters (e.g. "rhythm" or "zephyr") or one that contains rare letters (e.g. "jazz").
Another common strategy is to guess vowels first, as English has only five vowels (a, e, i, o, and u; y may sometimes, but rarely, be used as a vowel) and almost every word has at least one.
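A minimal sketch of the frequency-based strategy described above, assuming the guesser can filter a candidate word list against the revealed pattern; the four-word dictionary is only a placeholder.

```python
from collections import Counter

def best_guess(pattern, guessed, words):
    """Return the unguessed letter most frequent among candidate words.

    pattern: e.g. "_a___a_", with '_' marking unknown letters.
    guessed: set of letters already tried.
    """
    def matches(word):
        if len(word) != len(pattern):
            return False
        for w, p in zip(word, pattern):
            if p != "_" and w != p:
                return False              # revealed letters must agree
            if p == "_" and w in guessed:
                return False              # guessed letters can't be hidden
        return True

    candidates = [w for w in words if matches(w)]
    counts = Counter(c for w in candidates for c in set(w) if c not in guessed)
    return counts.most_common(1)[0][0] if counts else None

# Placeholder dictionary; a real solver would load a full word list.
words = ["hangman", "handgun", "mailman", "madman"]
print(best_guess("_a___a_", {"a"}, words))   # 'n' or 'm' (tied top frequency)
```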
According to a 2010 study conducted by Jon McLoone for Wolfram Research, the most difficult words to guess include "jazz", "buzz", "hajj", "faff", "fizz", "fuzz" and variations of these.
The game show "Wheel of Fortune" is based on Hangman but with the addition of multiple-word puzzles, a roulette-styled wheel, and cash awards.
Brazil also had a show in the 1960s and again from 2012 to 2013 called "Let's Play Hangman", hosted by Silvio Santos. Brazil would later get its own version of "Wheel of Fortune", running from 1980 to 1993, again from 2003 to 2012 (during which the new "Let's Play Hangman" aired), and again from 2013 to the present. These shows were all hosted by Santos.
In July 2017, the BBC introduced a game show of its own called "Letterbox", which is also based on Hangman.
The following example game illustrates a player trying to guess the word "hangman" using a strategy based solely on letter frequency. As the player continues, a part of the stick figure on the noose is added after each incorrect guess. Once a full body is drawn, the game is over, and the player has lost.
|
https://en.wikipedia.org/wiki?curid=14462
|
Harmonic mean
In mathematics, the harmonic mean (sometimes called the subcontrary mean) is one of several kinds of average, and in particular, one of the Pythagorean means. Typically, it is appropriate for situations when the average of rates is desired.
The harmonic mean can be expressed as the reciprocal of the arithmetic mean of the reciprocals of the given set of observations. As a simple example, the harmonic mean of 1, 4, and 4 is

$$\left( \frac{1^{-1} + 4^{-1} + 4^{-1}}{3} \right)^{-1} = \frac{3}{\frac{1}{1} + \frac{1}{4} + \frac{1}{4}} = \frac{3}{1.5} = 2.$$
The harmonic mean "H" of the positive real numbers
formula_2 is defined to be
The third formula in the above equation expresses the harmonic mean as the reciprocal of the arithmetic mean of the reciprocals.
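A minimal sketch of this definition in Python, cross-checked against the standard library's statistics.harmonic_mean (available since Python 3.6), using the 1, 4, and 4 example from the introduction:

```python
from statistics import harmonic_mean  # standard library cross-check

def hmean(xs):
    """Reciprocal of the arithmetic mean of the reciprocals."""
    n = len(xs)
    return n / sum(1.0 / x for x in xs)

print(hmean([1, 4, 4]))            # 2.0, matching the worked example
print(harmonic_mean([1, 4, 4]))    # 2.0, via the standard library
```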
From the following formula:

$$H(x_1, \ldots, x_n) = \frac{n \prod_{j=1}^n x_j}{\sum_{i=1}^n \left( \prod_{j \neq i} x_j \right)}$$

it is more apparent that the harmonic mean is related to the arithmetic and geometric means. It is the reciprocal dual of the arithmetic mean for positive inputs:

$$H\!\left( \frac{1}{x_1}, \ldots, \frac{1}{x_n} \right) = \frac{1}{A(x_1, \ldots, x_n)}.$$
The harmonic mean is a Schur-concave function, and is dominated by the minimum of its arguments, in the sense that for any positive set of arguments, $\min(x_1, \ldots, x_n) \le H(x_1, \ldots, x_n) \le n \min(x_1, \ldots, x_n)$. Thus, the harmonic mean cannot be made arbitrarily large by changing some values to bigger ones (while having at least one value unchanged).
The harmonic mean is one of the three Pythagorean means. For all "positive" data sets "containing at least one pair of nonequal values", the harmonic mean is always the least of the three means, while the arithmetic mean is always the greatest of the three and the geometric mean is always in between. (If all values in a nonempty dataset are equal, the three means are always equal to one another; e.g., the harmonic, geometric, and arithmetic means of {2, 2, 2} are all 2.)
It is the special case $M_{-1}$ of the power mean:

$$H(x_1, \ldots, x_n) = M_{-1}(x_1, \ldots, x_n) = \frac{n}{x_1^{-1} + x_2^{-1} + \cdots + x_n^{-1}}.$$
Since the harmonic mean of a list of numbers tends strongly toward the least elements of the list, it tends (compared to the arithmetic mean) to mitigate the impact of large outliers and aggravate the impact of small ones.
The arithmetic mean is often mistakenly used in places calling for the harmonic mean. In the speed example below for instance, the arithmetic mean of 40 is incorrect, and too big.
The harmonic mean is related to the other Pythagorean means, as seen in the equation below. This can be seen by interpreting the denominator to be the arithmetic mean of the product of numbers "n" times but each time omitting the "j"-th term. That is, for the first term, we multiply all "n" numbers except the first; for the second, we multiply all "n" numbers except the second; and so on. The numerator, excluding the "n", which goes with the arithmetic mean, is the geometric mean to the power "n". Thus the "n"-th harmonic mean is related to the "n"-th geometric and arithmetic means. The general formula is

$$H(x_1, \ldots, x_n) = \frac{\left( G(x_1, \ldots, x_n) \right)^n \cdot n}{\sum_{i=1}^n \left( \prod_{j \neq i} x_j \right)} = \frac{\left( G(x_1, \ldots, x_n) \right)^n}{A\!\left( \prod_{j \neq 1} x_j, \ldots, \prod_{j \neq n} x_j \right)}.$$
If a set of non-identical numbers is subjected to a mean-preserving spread — that is, two or more elements of the set are "spread apart" from each other while leaving the arithmetic mean unchanged — then the harmonic mean always decreases.
For the special case of just two numbers, $x_1$ and $x_2$, the harmonic mean can be written

$$H = \frac{2 x_1 x_2}{x_1 + x_2} \qquad \text{or} \qquad \frac{1}{H} = \frac{1}{2} \left( \frac{1}{x_1} + \frac{1}{x_2} \right).$$
In this special case, the harmonic mean is related to the arithmetic mean $A = \frac{x_1 + x_2}{2}$ and the geometric mean $G = \sqrt{x_1 x_2}$ by

$$H = \frac{G^2}{A}.$$
Since $\frac{G}{A} \le 1$ by the inequality of arithmetic and geometric means, this shows for the $n = 2$ case that $H \le G$ (a property that in fact holds for all $n$). It also follows that $G = \sqrt{AH}$, meaning the two numbers' geometric mean equals the geometric mean of their arithmetic and harmonic means.
For the special case of three numbers, $x_1$, $x_2$ and $x_3$, the harmonic mean can be written

$$H = \frac{3 x_1 x_2 x_3}{x_1 x_2 + x_1 x_3 + x_2 x_3}.$$
Three positive numbers "H", "G", and "A" are respectively the harmonic, geometric, and arithmetic means of three positive numbers if and only if the following inequality holds
If a set of weights $w_1, \ldots, w_n$ is associated to the dataset $x_1, \ldots, x_n$, the weighted harmonic mean is defined by

$$H = \frac{\sum_{i=1}^n w_i}{\sum_{i=1}^n \frac{w_i}{x_i}}.$$
The unweighted harmonic mean can be regarded as the special case where all of the weights are equal.
In many situations involving rates and ratios, the harmonic mean provides the truest average. For instance, if a vehicle travels a certain distance "d" outbound at a speed "x" (e.g. 60 km/h) and returns the same distance at a speed "y" (e.g. 20 km/h), then its average speed is the harmonic mean of "x" and "y" (30 km/h) – not the arithmetic mean (40 km/h). The total travel time is the same as if it had traveled the whole distance at that average speed. This can be proven as follows:
Average speed for the entire journey

$$= \frac{\text{total distance}}{\text{total time}} = \frac{2d}{\frac{d}{x} + \frac{d}{y}} = \frac{2}{\frac{1}{x} + \frac{1}{y}} = \frac{2xy}{x + y}.$$
However, if the vehicle travels for a certain amount of "time" at a speed "x" and then the same amount of time at a speed "y", then its average speed is the arithmetic mean of "x" and "y", which in the above example is 40 km/h. The same principle applies to more than two segments: given a series of sub-trips at different speeds, if each sub-trip covers the same "distance", then the average speed is the "harmonic" mean of all the sub-trip speeds; and if each sub-trip takes the same amount of "time", then the average speed is the "arithmetic" mean of all the sub-trip speeds. (If neither is the case, then a weighted harmonic mean or weighted arithmetic mean is needed. For the arithmetic mean, the speed of each portion of the trip is weighted by the duration of that portion, while for the harmonic mean, the corresponding weight is the distance. In both cases, the resulting formula reduces to dividing the total distance by the total time.)
However one may avoid the use of the harmonic mean for the case of "weighting by distance". Pose the problem as finding "slowness" of the trip where "slowness" (in hours per kilometre) is the inverse of speed. When trip slowness is found, invert it so as to find the "true" average trip speed. For each trip segment i, the slowness si = 1/speedi. Then take the weighted arithmetic mean of the si's weighted by their respective distances (optionally with the weights normalized so they sum to 1 by dividing them by trip length). This gives the true average slowness (in time per kilometre). It turns out that this procedure, which can be done with no knowledge of the harmonic mean, amounts to the same mathematical operations as one would use in solving this problem by using the harmonic mean. Thus it illustrates why the harmonic mean works in this case.
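Below is a short sketch of the slowness procedure just described, using the speeds from the example above; the equal distance weights match the round-trip scenario.

```python
def avg_speed_by_distance(legs):
    """Average speed over (speed, distance) legs, weighted by distance.

    Computed via 'slowness' (time per unit distance), as described above:
    total distance divided by total time.
    """
    total_dist = sum(d for _, d in legs)
    total_time = sum(d / v for v, d in legs)   # slowness * distance per leg
    return total_dist / total_time

# Round trip from the example: 60 km/h out, 20 km/h back, equal distances.
print(avg_speed_by_distance([(60, 1), (20, 1)]))   # 30.0, the harmonic mean
print((60 + 20) / 2)                               # 40.0, the (wrong) arithmetic mean
```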
Similarly, if one wishes to estimate the density of an alloy given the densities of its constituent elements and their mass fractions (or, equivalently, percentages by mass), then the predicted density of the alloy (exclusive of typically minor volume changes due to atom packing effects) is the weighted harmonic mean of the individual densities, weighted by mass, rather than the weighted arithmetic mean as one might at first expect. To use the weighted arithmetic mean, the densities would have to be weighted by volume. Applying dimensional analysis to the problem while labelling the mass units by element and making sure that only like element-masses cancel, makes this clear.
If one connects two electrical resistors in parallel, one having resistance "x" (e.g., 60 Ω) and one having resistance "y" (e.g., 40 Ω), then the effect is the same as if one had used two resistors with the same resistance, both equal to the harmonic mean of "x" and "y" (48 Ω): the equivalent resistance, in either case, is 24 Ω (one-half of the harmonic mean). This same principle applies to capacitors in series or to inductors in parallel.
However, if one connects the resistors in series, then the average resistance is the arithmetic mean of "x" and "y" (with total resistance equal to the sum of x and y). As with the previous example, the same principle applies when more than two resistors are connected, provided that all are in parallel or all are in series.
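The resistor example above, checked numerically in Python:

```python
def parallel(*resistances):
    """Equivalent resistance of resistors connected in parallel."""
    return 1 / sum(1/r for r in resistances)

x, y = 60.0, 40.0
h = 2 / (1/x + 1/y)       # harmonic mean: 48.0 ohms
print(parallel(x, y))     # 24.0 ohms
print(parallel(h, h))     # 24.0 ohms: two equal resistors at the harmonic mean
```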
The "conductivity effective mass" of a semiconductor is also defined as the harmonic mean of the effective masses along the three crystallographic directions.
The weighted harmonic mean is the preferable method for averaging multiples, such as the price–earnings ratio (P/E), in which price is in the numerator. If these ratios are averaged using a weighted arithmetic mean (a common error), high data points are given greater weights than low data points. The weighted harmonic mean, on the other hand, correctly weights each data point. The simple weighted arithmetic mean, when applied to non-price-normalized ratios such as the P/E, is biased upwards and cannot be numerically justified, since it is based on equalized earnings; just as vehicle speeds cannot be averaged for a roundtrip journey.
For example, consider two firms, one with a market capitalization of $150 billion and earnings of $5 billion (P/E of 30) and one with a market capitalization of $1 billion and earnings of $1 million (P/E of 1000). Consider an index made of the two stocks, with 30% invested in the first and 70% invested in the second. We want to calculate the P/E ratio of this index.
Using the weighted arithmetic mean (incorrect): P/E = 0.3 × 30 + 0.7 × 1000 = 709.
Using the weighted harmonic mean (correct): P/E = (0.3 + 0.7) / (0.3/30 + 0.7/1000) ≈ 93.46.
Thus, the correct P/E of about 93.46 for this index can only be found using the weighted harmonic mean, while the weighted arithmetic mean significantly overestimates it.
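In Python, the two index calculations come out as follows:

```python
weights = [0.3, 0.7]     # index weights by invested value
pe = [30.0, 1000.0]      # P/E ratios of the two firms

arith = sum(w * r for w, r in zip(weights, pe))                # 709.0 (incorrect)
harm = sum(weights) / sum(w / r for w, r in zip(weights, pe))  # ~93.46 (correct)
print(arith, round(harm, 2))
```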
In any triangle, the radius of the incircle is one-third of the harmonic mean of the altitudes.
For any point P on the minor arc BC of the circumcircle of an equilateral triangle ABC, with distances "q" and "t" from B and C respectively, and with the intersection of PA and BC being at a distance "y" from point P, we have that "y" is half the harmonic mean of "q" and "t".
In a right triangle with legs "a" and "b" and altitude "h" from the hypotenuse to the right angle, "h"2 is half the harmonic mean of "a"2 and "b"2.
Let "t" and "s" ("t" > "s") be the sides of the two inscribed squares in a right triangle with hypotenuse "c". Then equals half the harmonic mean of and .
Let a trapezoid have vertices A, B, C, and D in sequence and have parallel sides AB and CD. Let E be the intersection of the diagonals, and let F be on side DA and G be on side BC such that FEG is parallel to AB and CD. Then FG is the harmonic mean of AB and CD. (This is provable using similar triangles.)
One application of this trapezoid result is in the crossed ladders problem, where two ladders lie oppositely across an alley, each with feet at the base of one sidewall, with one leaning against a wall at height "A" and the other leaning against the opposite wall at height "B". The ladders cross at a height of "h" above the alley floor. Then "h" is half the harmonic mean of "A" and "B". This result still holds if the walls are slanted but still parallel and the "heights" "A", "B", and "h" are measured as distances from the floor along lines parallel to the walls. This can be proved easily using the area formula of a trapezoid and the area addition formula.
In an ellipse, the semi-latus rectum (the distance from a focus to the ellipse along a line parallel to the minor axis) is the harmonic mean of the maximum and minimum distances of the ellipse from a focus.
In computer science, specifically information retrieval and machine learning, the harmonic mean of the precision (true positives per predicted positive) and the recall (true positives per real positive) is often used as an aggregated performance score for the evaluation of algorithms and systems: the F-score (or F-measure). This is used in information retrieval because only the positive class is of relevance, while the number of negatives is generally large and unknown. It is thus a trade-off as to whether the correct positive predictions should be measured in relation to the number of predicted positives or the number of real positives, so the F-score measures the true positives against a putative number of positives that is the arithmetic mean of the two possible denominators.
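A minimal Python expression of the F-score as a harmonic mean; the identity in the comment follows from simple algebra:

```python
def f1_score(tp, fp, fn):
    """F1: the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    # Equivalently: tp over the arithmetic mean of the two denominators.
    assert abs(f1 - tp / (tp + (fp + fn) / 2)) < 1e-12
    return f1

print(round(f1_score(tp=80, fp=20, fn=40), 4))  # precision 0.8, recall 2/3 -> 0.7273
```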
The harmonic mean also arises from basic algebra in problems where people or systems work together. As an example, if a gas-powered pump can drain a pool in 4 hours and a battery-powered pump can drain the same pool in 6 hours, then it will take both pumps 1/(1/4 + 1/6), which is equal to 2.4 hours, to drain the pool together. This is one-half of the harmonic mean of 6 and 4: 2·6·4/(6 + 4) = 4.8. That is, the appropriate average for the two types of pump is the harmonic mean; with one pair of pumps (two pumps), it takes half this harmonic mean time, while with two pairs of pumps (four pumps) it would take a quarter of this harmonic mean time.
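The pump example above, expressed as a small Python helper:

```python
def joint_time(*solo_times):
    """Time for several workers to finish a job together, given
    each worker's solo time for the whole job."""
    return 1 / sum(1/t for t in solo_times)

print(joint_time(4, 6))        # 2.4 hours for the two pumps together
print(joint_time(4, 6, 4, 6))  # 1.2 hours for two pairs of pumps
```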
In hydrology, the harmonic mean is similarly used to average hydraulic conductivity values for flow that is perpendicular to layers (e.g., geologic or soil); flow parallel to layers uses the arithmetic mean. This apparent difference in averaging is explained by the fact that hydrology uses conductivity, which is the inverse of resistivity.
In sabermetrics, a player's Power–speed number is the harmonic mean of their home run and stolen base totals.
In population genetics, the harmonic mean is used when calculating the effects of fluctuations in the census population size on the effective population size. The harmonic mean takes into account the fact that events such as population bottlenecks increase the rate of genetic drift and reduce the amount of genetic variation in the population. This is a result of the fact that following a bottleneck very few individuals contribute to the gene pool, limiting the genetic variation present in the population for many generations to come.
When considering fuel economy in automobiles, two measures are commonly used: miles per gallon (mpg) and litres per 100 km. As the dimensions of these quantities are the inverse of each other (one is distance per volume, the other volume per distance), when taking the mean value of the fuel economy of a range of cars, one measure will produce the harmonic mean of the other; i.e., converting the mean value of fuel economy expressed in litres per 100 km to miles per gallon will produce the harmonic mean of the fuel economy expressed in miles per gallon. For calculating the average fuel consumption of a fleet of vehicles from the individual fuel consumptions, the harmonic mean should be used if the fleet uses miles per gallon, whereas the arithmetic mean should be used if the fleet uses litres per 100 km. In the USA, the CAFE standards (the federal automobile fuel consumption standards) make use of the harmonic mean.
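A Python sketch of this unit effect; the three mpg figures are invented, and the conversion constant is the usual approximate factor between the two units:

```python
mpg = [20.0, 30.0, 40.0]   # hypothetical fleet, miles per US gallon

MPG_L100KM = 235.215       # approximate conversion: L/100km = 235.215 / mpg

# Average in L/100 km (volume per distance): arithmetic mean applies.
l100 = [MPG_L100KM / m for m in mpg]
mean_l100 = sum(l100) / len(l100)

# Converting that average back to mpg yields the harmonic mean of the mpg values.
harmonic_mpg = len(mpg) / sum(1/m for m in mpg)
print(round(MPG_L100KM / mean_l100, 3), round(harmonic_mpg, 3))  # both ~27.692
```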
In chemistry and nuclear physics the average mass per particle of a mixture consisting of different species (e.g., molecules or isotopes) is given by the harmonic mean of the individual species' masses weighted by their respective mass fraction.
The harmonic mean of a beta distribution with shape parameters "α" and "β" is:
"H" = ("α" − 1) / ("α" + "β" − 1) for "α" > 1.
The harmonic mean with "α" < 1 is undefined because its defining expression is not bounded in [0, 1].
Letting "α" = "β"
showing that for "α" = "β" the harmonic mean ranges from 0 for "α" = "β" = 1, to 1/2 for "α" = "β" → ∞.
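A quick Monte Carlo check of the closed form in Python; the parameter choice α = β = 3 is arbitrary:

```python
import random

a = b = 3.0
closed_form = (a - 1) / (a + b - 1)   # 0.4

n = 200_000
draws = (random.betavariate(a, b) for _ in range(n))
mc = n / sum(1/x for x in draws)      # sample harmonic mean
print(closed_form, round(mc, 3))      # ~0.4
```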
The following are the limits with one parameter finite (non-zero) and the other parameter approaching these limits:
Together with the geometric mean, the harmonic mean may be useful in maximum likelihood estimation in the four-parameter case.
A second harmonic mean ("H"1−X) also exists for this distribution:
"H"1−X = ("β" − 1) / ("α" + "β" − 1) for "β" > 1.
This harmonic mean with "β" < 1 is undefined because its defining expression is not bounded in [0, 1].
Letting "α" = "β" in the above expression
showing that for "α" = "β" the harmonic mean ranges from 0, for "α" = "β" = 1, to 1/2, for "α" = "β" → ∞.
The following are the limits with one parameter finite (non-zero) and the other approaching these limits:
Although both harmonic means are asymmetric, when "α" = "β" the two means are equal.
The harmonic mean ( "H" ) of a lognormal distribution is
where "μ" is the arithmetic mean and "σ"2 is the variance of the distribution.
The harmonic and arithmetic means are related by
"H"/"A" = 1 / (1 + "C"v2)
where "C"v is the coefficient of variation.
The geometric ("G"), arithmetic and harmonic means are related by
The harmonic mean of a type 1 Pareto distribution is
"H" = "k"(1 + 1/"α")
where "k" is the scale parameter and "α" is the shape parameter.
For a random sample, the harmonic mean is calculated as above. Both the mean and the variance may be infinite (if the sample includes at least one term of the form 1/0).
The mean of the sample "m" is asymptotically normally distributed with variance "s"2.
The variance of the mean itself is
Var("m") = (E(1/"x"2) − "m"2) / "n"
where "m" is the arithmetic mean of the reciprocals, "x" are the variates, "n" is the population size and E is the expectation operator.
Assuming that the variance is not infinite and that the central limit theorem applies to the sample, then using the delta method the variance is
Var("H") = (1/"m"4)("s"2/"n")
where "H" is the harmonic mean, "m" is the arithmetic mean of the reciprocals
"s"2 is the variance of the reciprocals of the data
and "n" is the number of data points in the sample.
A jackknife method of estimating the variance is possible if the mean is known. This method is the usual 'delete 1' rather than the 'delete m' version.
This method first requires the computation of the mean of the sample ("m"):
"m" = "n" / Σ(1/"x"i)
where "x"i are the sample values.
A series of values "w"i is then computed, where
"w"i = ("n" − 1) / (Σ(1/"x") − 1/"x"i).
The mean ("h") of the "w"i is then taken:
The variance of the mean is
(("n" − 1)/"n") Σ("w"i − "h")2
Significance testing and confidence intervals for the mean can then be estimated with the t test.
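The delete-1 procedure above, sketched in Python under the formulas just given; the sample values are arbitrary:

```python
def jackknife_harmonic(xs):
    """Delete-1 jackknife for the harmonic mean: recompute the mean
    with each observation left out, then form the variance of those
    leave-one-out values."""
    n = len(xs)
    total = sum(1/x for x in xs)
    w = [(n - 1) / (total - 1/x) for x in xs]   # leave-one-out harmonic means
    h = sum(w) / n
    var = (n - 1) / n * sum((wi - h)**2 for wi in w)
    return h, var

print(jackknife_harmonic([1.0, 2.0, 4.0, 4.0]))
```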
Assume a random variate has a distribution "f"("x"). Assume also that the likelihood of a variate being chosen is proportional to its value. This is known as length-based or size-biased sampling.
Let "μ" be the mean of the population. Then the probability density function "f"*( "x" ) of the size biased population is
The expectation of this length-biased distribution E*("x") is
E*("x") = "μ" + "σ"2/"μ"
where "σ"2 is the variance.
The harmonic mean of the length-biased distribution is the same as the mean of the original, non-length-biased version, E("x").
The problem of length-biased sampling arises in a number of areas, including textile manufacture, pedigree analysis, and survival analysis.
Akman "et al" have developed a test for the detection of length based bias in samples.
If "X" is a positive random variable and "q" > 0 then for all "ε" > 0
Assuming that "X" and E("X") are > 0 then
This follows from Jensen's inequality.
Gurland has shown that for a distribution that takes only positive values, for any "n" > 0
Under some conditions
where ~ means approximately.
Assuming that the variates ("x") are drawn from a lognormal distribution there are several possible estimators for "H":
where
Of these "H"3 is probably the best estimator for samples of 25 or more.
A first order approximation to the bias and variance of "H"1 are
where "C"v is the coefficient of variation.
Similarly a first order approximation to the bias and variance of "H"3 are
In numerical experiments "H"3 is generally a better estimator of the harmonic mean than "H"1. "H"2 produces estimates that are largely similar to "H"1.
The Environmental Protection Agency recommends the use of the harmonic mean in setting maximum toxin levels in water.
In geophysical reservoir engineering studies, the harmonic mean is widely used.
|
https://en.wikipedia.org/wiki?curid=14463
|
Hellbender
The hellbender ("Cryptobranchus alleganiensis"), also known as the hellbender salamander, is a species of aquatic giant salamander endemic to the eastern and central United States. A member of the family Cryptobranchidae, the hellbender is the only extant member of the genus "Cryptobranchus". Other closely related salamanders in the same family are in the genus "Andrias", which contains the Japanese and Chinese giant salamanders. The hellbender, which is much larger than all other salamanders in its geographic range, employs an unusual means of respiration (which involves cutaneous gas exchange through capillaries found in its dorsoventral skin folds), and fills a particular niche—both as a predator and prey—in its ecosystem, which either it or its ancestors have occupied for around 65 million years. The species is listed as Near Threatened on the IUCN Red List of Threatened Species.
The origin of the name "hellbender" is unclear. The Missouri Department of Conservation says:
The name 'hellbender' probably comes from the animal's odd look. One theory claims the hellbender was named by settlers who thought "it was a creature from hell where it's bent on returning." Another rendition says the undulating skin of a hellbender reminded observers of "horrible tortures of the infernal regions." In reality, it's a harmless aquatic salamander.
Other vernacular names include "snot otter", "lasagna lizard", "devil dog", "mud-devil", "grampus", "Allegheny alligator", "mud dog", "water dog", "spotted water gecko" and "leverian water newt".
The genus name is derived from the Ancient Greek "kryptos" (hidden) and "branchion" (gill). The subspecific name "bishopi" is in honor of American herpetologist Sherman C. Bishop.
"C. alleganiensis" has a flat body and head, with beady dorsal eyes and slimy skin. Like most salamanders, it has short legs with four toes on the front legs and five on its back limbs, and its tail is keeled for propulsion. The hellbender has working lungs, but gill slits are often retained, although only immature specimens have true gills; the hellbender absorbs oxygen from the water through capillaries of its side frills. It is blotchy brown or red-brown in color, with a paler underbelly.
Both males and females grow to an adult length of from snout to vent, with a total length of , making them the third-largest aquatic salamander species in the world (after the Chinese giant salamander and the Japanese giant salamander, respectively) and the largest amphibian in North America, although this length is rivaled by the reticulated siren of the southeastern United States (although the siren is much leaner in build). An adult weighs , making them the fourth heaviest living amphibian in the world after their Chinese and Japanese cousins and the goliath frog, while the largest cane toads may also weigh as much as a hellbender. Hellbenders reach sexual maturity at about five years of age, and may live 30 years in captivity.
The hellbender has a few characteristics that make it distinguishable from other native salamanders, including a gigantic, dorsoventrally flattened body with thick folds travelling down the sides, a single open gill slit on each side, and hind feet with five toes each. Easily distinguished from most other endemic salamander species simply by their size, hellbenders average up to 60 cm or about 2 ft in length; the only species requiring further distinction (due to an overlap in distribution and size range) is the common mudpuppy ("Necturus maculosus"). This demarcation can be made by noting the presence of external gills in the mudpuppy, which are lacking in the hellbender, as well as the presence of four toes on each hind foot of the mudpuppy (in contrast with the hellbender's five). Furthermore, the average size of "C. a. alleganiensis" has been reported to be 45–60 cm (with some reported as reaching up to 74 cm or 30 in), while "N. m. maculosus" has a reported average size of 28–40 cm in length, which means that hellbender adults will still generally be notably larger than even the biggest mudpuppies.
The genus "Cryptobranchus" has historically only been considered to contain one species, "C. alleganiensis", with two subspecies, "C. a. alleganiensis" and "C. a. bishopi". A recent decline in population size of the Ozark subspecies "C. a. bishopi" has led to further research into populations of this subspecies, including genetic analysis to determine the best method for conservation.
Crowhurst et al., for instance, found that the "Ozark subspecies" denomination is insufficient for describing genetic (and therefore evolutionary) divergence within the genus "Cryptobranchus" in the Ozark region. They found three equally divergent genetic units within the genus: "C. a. alleganiensis", and two distinct eastern and western populations of "C. a. bishopi". These three groups were shown to be isolated, and are considered to most likely be "diverging on different evolutionary paths".
Hellbenders are present in a number of Eastern US states, from southern New York to northern Georgia, including parts of Ohio, Pennsylvania, Maryland, West Virginia, Virginia, Kentucky, Illinois, Indiana, Tennessee, North Carolina, South Carolina, Alabama, Mississippi, Arkansas, Missouri, and even a small bit of Oklahoma and Kansas. The subspecies (or species, depending on the source) "C. a. bishopi" is confined to the Ozarks of northern Arkansas and southern Missouri, while "C. a. alleganiensis" is found in the rest of these states.
Some hellbender populations—namely a few in Missouri, Pennsylvania, and Tennessee—have historically been noted to be quite abundant, but several man-made maladies have converged on the species such that it has seen a serious population decline throughout its range. Hellbender populations were listed in 1981 as already extinct or endangered in Illinois, Indiana, Iowa, and Maryland, decreasing in Arkansas and Kentucky, and generally threatened as a species throughout their range by various human activities and developments.
The hellbender salamander, considered a "habitat specialist", has adapted to fill a specific niche within a very specific environment, and is labeled as such "because its success is dependent on a constancy of dissolved oxygen, temperature and flow found in swift water areas", which in turn limits it to a narrow spectrum of stream/river choices. As a result of this specialization, hellbenders are generally found in areas with large, irregularly shaped, and intermittent rocks and swiftly moving water, while they tend to avoid wider, slow-moving waters with muddy banks and/or slab rock bottoms. This specialization likely contributed to the decline in their populations, as collectors could easily identify their specific habitats. One collector noted, at one time, "one could find a specimen under almost every suitable rock", but after years of collecting, the population had declined significantly. The same collector noted, he "never found two specimens under the same rock", corroborating the account given by other researchers that hellbenders are generally solitary; they are thought to gather only during the mating season.
Both subspecies, "C. a. alleganiensis" and "C. a. bishopi", undergo a metamorphosis after around a year and a half of life. At this point, when they are roughly 13.5 cm long, they lose the gills present during their larval stage. Until then, they are easily confused with mudpuppies, and can often be differentiated only by toe number. After this metamorphosis, hellbenders must be able to absorb oxygen through the folds in their skin, which is largely behind the need for fast-moving, oxygenated water. If a hellbender ends up in an area of slow-moving water, not enough of it will pass over its skin in a given time, making it difficult to garner enough oxygen to support necessary respiratory functions. Water whose dissolved oxygen content is below favorable levels makes life equally difficult.
Hellbenders are preyed upon by diverse predators, including various fish and reptiles (including both snakes and turtles). Cannibalism of eggs is also considered a common occurrence.
Once a hellbender finds a favorable location, it generally does not stray too far from it—except occasionally for breeding and hunting—and will protect it from other hellbenders both in and out of the breeding season. While the range of two hellbenders may overlap, they are noted as rarely being present in the overlapping area when the other salamander is in the area. The species is at least somewhat nocturnal, with peak activity being reported by one source as occurring around "two hours after dark" and again at dawn (although the dawn peak was recorded in the lab and could be misleading as a result). Nocturnal activity has been found to be most prevalent in early summer, perhaps coinciding with highest water depths.
"C. alleganiensis" feeds primarily on crayfish and small fish. One report, written by a commercial collector in the 1940s, noted a trend of more crayfish predation in the summer during times of higher prey activity, whereas fish made up a larger part of the winter diet, when crayfish are less active. There seems to be a specific temperature range in which hellbenders feed, as well: between 45 and 80 °F. Cannibalism—mainly on eggs—has been known to occur within hellbender populations. One researcher claimed perhaps density is maintained, and density dependence in turn created, in part by intraspecific predation.
The hellbenders' breeding season begins in late August or early- to mid-September and can continue as late as the end of November, depending on region. They exhibit no sexual dimorphism, except during the fall mating season, when males have a bulging ring around their cloacal glands. Unlike most salamanders, the hellbender performs external fertilization. Before mating, each male excavates a brood site, a saucer-shaped depression under a rock or log, with its entrance positioned out of the direct current, usually pointing downstream. The male remains in the brood site awaiting a female. When a female approaches, the male guides or drives her into his burrow and prevents her from leaving until she oviposits.
Female hellbenders lay 150–200 eggs over a two- to three-day period; the eggs are 18–20 mm in diameter, connected by five to 10 cords. As the female lays eggs, the male positions himself alongside or slightly above them, spraying the eggs with sperm while swaying his tail and moving his hind limbs, which disperses the sperm uniformly. The male often tempts other females to lay eggs in his nest, and as many as 1,946 eggs have been counted in a single nest. Cannibalism, however, leads to a much lower number of eggs in hellbender nests than would be predicted by egg counts.
After oviposition, the male drives the female away from the nest and guards the eggs. Incubating males rock back and forth and undulate their lateral skin folds, which circulates the water, increasing oxygen supply to both eggs and adult. Incubation lasts from 45 to 75 days, depending on region.
Hatchling hellbenders are 25–33 mm long, have a yolk sac as a source of energy for the first few months of life, and lack functional limbs.
Hellbenders are superbly adapted to the shallow, fast-flowing, rocky streams in which they live. Their flattened shape offers little resistance to the flowing water, allowing them to work their way upstream and also to crawl into narrow spaces under rocks. Although their eyesight is relatively poor, they have light-sensitive cells all over their bodies. Those on their tails are especially finely tuned and may help them position safely under rocks without their tails poking out to give themselves away. They have a good sense of smell and move upstream in search of food such as dead fish, following the trail of scent molecules. Smell is possibly their most important sense when hunting. They also have a lateral line similar to those of fish, with which they can detect vibrations in the water.
Research throughout the range of the hellbender has shown a dramatic decline in populations in the majority of locations. Many different anthropogenic sources have helped to create this decline, including the siltation and sedimentation, blocking of dispersal/migration routes, and destruction of riverine habitats created by dams and other development, as well as pollution, disease and overharvesting for commercial and scientific purposes. As many of these detrimental effects have irreversibly damaged hellbender populations, it is important to conserve the remaining populations through protecting habitats and—perhaps in places where the species was once endemic and has been extirpated—by augmenting numbers through reintroduction.
Due to sharp decreases seen in the Ozark subspecies, researchers have been trying to differentiate "C. a. alleganiensis" and "C. a. bishopi" into two management units. Indeed, researchers found significant genetic divergence between the two groups, as well as between them and another isolated population of "C. a. alleganiensis". This could be reason enough to ensure work is done on both subspecies, as preserving extant genetic diversity is of crucial ecological importance.
The Ozark hellbender has been listed as an endangered species under the Endangered Species Act by the US Fish and Wildlife Service since October 5, 2011. This hellbender subspecies inhabits the White River and Spring River systems in southern Missouri and northern Arkansas, and its population has declined an estimated 75% since the 1980s, with only about 590 individuals remaining in the wild. Degraded water quality, habitat loss resulting from impoundments, ore and gravel mining, sedimentation, and collection for the pet trade are thought to be the main factors resulting in the amphibian's decline. When chytridiomycosis killed 75% of the St. Louis Zoo's captive hellbender population between March 2006 and April 2007, tests began to be conducted on wild populations. The disease has been detected in all Missouri populations of the Ozark hellbender.
The Ozark hellbender was successfully bred in captivity for the first time at the St. Louis Zoo, in a joint project with the Missouri Department of Conservation, hatching on November 15, 2011.
Apart from the Ozark efforts, head-starting programs, in which eggs are collected from the wild and raised in captivity for re-release at a less vulnerable stage, have been initiated in New York and Ohio.
Members of the Pennsylvania State Senate have voted to approve the eastern hellbender as the official state amphibian in an effort to raise awareness about its endangered status. The legislation was mired in controversy due to a dispute by House members who argued that Wehrle's salamander should be given the honor. The legislation did not pass in 2018, but was reintroduced in 2019. On April 23, 2019, Pennsylvania Governor Tom Wolf signed legislation making the eastern hellbender Pennsylvania's official state amphibian. Youth members of the Chesapeake Bay Foundation's Pennsylvania Student Leadership Council were heavily involved in writing and advocating on behalf of this legislation. They hope that the success of the hellbender bill in Pennsylvania will contribute to clean water efforts and raise awareness of the hellbender's struggling population.
|
https://en.wikipedia.org/wiki?curid=14465
|
Harold Eugene Edgerton
Harold Eugene "Doc" Edgerton, also known as Papa Flash (April 6, 1903 – January 4, 1990) was a professor of electrical engineering at the Massachusetts Institute of Technology. He is largely credited with transforming the stroboscope from an obscure laboratory instrument into a common device. He also was deeply involved with the development of sonar and deep-sea photography, and his equipment was used by Jacques Cousteau in searches for shipwrecks and even the Loch Ness Monster.
Edgerton was born in Fremont, Nebraska, on April 6, 1903, the son of Mary Nettie Coe and Frank Eugene Edgerton, a descendant of Richard Edgerton, one of the founders of Norwich, Connecticut and a descendant of Governor William Bradford (1590–1657) of the Plymouth Colony and a passenger on the Mayflower. His father was a lawyer, journalist, author and orator and served as the assistant attorney general of Nebraska from 1911 to 1915. Edgerton grew up in Aurora, Nebraska. He also spent some of his childhood years in Washington, D.C., and Lincoln, Nebraska.
In 1925 Edgerton received a bachelor's degree in electrical engineering from the University of Nebraska-Lincoln where he became a member of Acacia fraternity. He earned an SM in electrical engineering from MIT in 1927. Edgerton used stroboscopes to study synchronous motors for his Sc.D. thesis in electrical engineering at MIT, awarded in 1931. He credited Charles Stark Draper with inspiring him to photograph everyday objects using electronic flash; the first was a stream of water from a faucet.
In 1936 Edgerton visited hummingbird expert May Rogers Webster. With her help, he illustrated that it was possible to photograph the birds beating their wings 60 times a second, using an exposure of one hundred-thousandth of a second. A picture of her with the birds flying around her appeared in National Geographic.
In 1937 Edgerton began a lifelong association with photographer Gjon Mili, who used stroboscopic equipment, in particular, multiple studio electronic flash units, to produce strikingly beautiful photographs, many of which appeared in Life Magazine. When taking multiflash photographs this strobe light equipment could flash up to 120 times a second. Edgerton was a pioneer in using short-duration electronic flash to photograph fast events, subsequently using the technique to capture images of balloons at different stages of their bursting, a bullet during its impact with an apple, or, with multiflash, the motion of a devil stick, for example. He was awarded a bronze medal by the Royal Photographic Society in 1934, the Howard N. Potts Medal from the Franklin Institute in 1941, the Golden Plate Award of the American Academy of Achievement in 1966, the David Richardson Medal by the Optical Society of America in 1968, the Albert A. Michelson Medal from the same Franklin Institute in 1969, and the National Medal of Science in 1973.
Edgerton partnered with Kenneth J. Germeshausen to do consulting for industrial clients. Later Herbert Grier joined them. The company name "Edgerton, Germeshausen, and Grier" was changed to EG&G in 1947. EG&G became a prime contractor for the Atomic Energy Commission and had a major role in photographing and recording nuclear tests for the US through the fifties and sixties. For this role Edgerton and Charles Wykoff and others at EG&G developed and manufactured the Rapatronic camera.
His work was instrumental in the development of side-scan sonar technology, used to scan the sea floor for wrecks. Edgerton worked with undersea explorer Jacques Cousteau, by first providing him with custom-designed underwater photographic equipment featuring electronic flash, and then by developing sonar techniques used to discover the Britannic. Edgerton participated in the discovery of the American Civil War battleship USS Monitor. While working with Cousteau, he acquired the nickname in photographic circles: "Papa Flash". In 1974 Doc Edgerton worked with Paul Kronfield in Greece on a sonar search for the lost city of Helike, believed to be the basis for the legend of Atlantis.
Edgerton co-founded EG&G, Inc., which manufactured advanced electronic equipment including side-scan sonars, subbottom profiling equipment. EG&G also invented and manufactured the Krytron, the detonation device for the hydrogen bomb, and an EG&G division supervised many of America's nuclear testing.
In addition to having the scientific and engineering acumen to perfect strobe lighting commercially, Edgerton is equally recognized for his visual aesthetic: many of the striking images he created in illuminating phenomena that occurred too fast for the naked eye adorn art museums worldwide. In 1940, his high speed stroboscopic short film "Quicker'n a Wink" won an Oscar.
Edgerton was appointed a professor of electrical engineering at the Massachusetts Institute of Technology (MIT) in 1934. In 1956, Edgerton was elected a Fellow of the American Academy of Arts and Sciences. He was especially loved by MIT students for his willingness to teach and his kindness: "The trick to education", he said, "is to teach people in such a way that they don't realize they're learning until it's too late". His last undergraduate class, taught during fall semester 1977, was a freshman seminar titled "Bird and Insect Photography". One of the graduate student dormitories at MIT carries his name.
In 1962, Edgerton appeared on "I've Got a Secret", where he demonstrated strobe flash photography by shooting a bullet into a playing card and photographing the result.
Edgerton's work was featured in an October 1987 "National Geographic Magazine" article entitled "Doc Edgerton: the man who made time stand still".
After graduating from the University of Nebraska-Lincoln, Edgerton married Esther May Garrett in 1928. She was born in Aurora, Nebraska, on September 8, 1903, and died on March 9, 2002, in Charlestown, South Carolina. She received a bachelor's degree in mathematics, music and education from the University of Nebraska-Lincoln. A skilled pianist and singer, she attended the New England Conservatory of Music and taught in public schools in Aurora, Nebraska, and Boston. During their marriage they had three children: Mary Louise (April 21, 1931), William Eugene (8/9/1933), and Robert Frank (5/10/1935). His sister, Mary Ellen Edgerton, was the wife of L. Welch Pogue (1899–2003), a pioneering aviation attorney and Chairman of the old Civil Aeronautics Board. David Pogue, a technology writer, journalist and commentator, is his great-nephew.
Edgerton remained active throughout his later years, and was seen on the MIT campus many times after his official retirement. He died suddenly on January 4, 1990 at the MIT Faculty Club at the age of 86, and is buried in Mount Auburn Cemetery, Cambridge, Massachusetts.
On July 3, 1990, in an effort to memorialize his accomplishments, several community members in Aurora, Nebraska decided to construct a "Hands-On" science center. It was designated as a "teaching museum," that would preserve Doc's work and artifacts, as well as feature the "Explorit Zone" where people of all ages could participate in hands-on exhibits and interact with live science demonstrations. After five years of private and community-wide fund-raising, as well as individual investments by Doc's surviving family members, the Edgerton Explorit Center was officially dedicated on September 9, 1995.
At MIT, the Edgerton Center, founded in 1992, is a hands-on laboratory resource for undergraduate and graduate students, and also conducts educational outreach programs for high school students and teachers.
Some of Edgerton's noted photographs are:
Edgerton's work is held in the following public collection:
|
https://en.wikipedia.org/wiki?curid=14466
|
Harry Kroto
Sir Harold Walter Kroto (born Harold Walter Krotoschiner; 7 October 1939 – 30 April 2016), known as Harry Kroto, was an English chemist. He shared the 1996 Nobel Prize in Chemistry with Robert Curl and Richard Smalley for their discovery of fullerenes. He was the recipient of many other honors and awards.
Kroto held many positions in academia throughout his life, ending his career as the Francis Eppes Professor of Chemistry at Florida State University, which he joined in 2004. Prior to this, he spent approximately 40 years at the University of Sussex.
Kroto promoted science education and was a critic of religious faith.
Kroto was born in Wisbech, Cambridgeshire, England, to Edith and Heinz Krotoschiner, his name being of Silesian origin. His father's family came from Bojanowo, Poland, and his mother's from Berlin. Both of his parents were born in Berlin and fled to Great Britain in the 1930s as refugees from Nazi Germany; his father was Jewish. Harry was raised in Bolton, where he attended Bolton School and was a contemporary of the actor Ian McKellen, while the British authorities interned his father on the Isle of Man as an enemy alien during World War II. In 1955, Harold's father shortened the family name to Kroto.
As a child, he became fascinated by a Meccano set. Kroto credited Meccano, as well as his aiding his father in the latter's balloon factory after World War II – amongst other things – with developing skills useful in scientific research. He developed an interest in chemistry, physics, and mathematics in secondary school, and because his sixth form chemistry teacher (Harry Heaney – who subsequently became a university professor) felt that the University of Sheffield had the best chemistry department in the United Kingdom, he went to Sheffield.
Although raised Jewish, Harry Kroto stated that religion never made any sense to him. He was a humanist who claimed to have three religions: Amnesty Internationalism, atheism, and humour. He was a distinguished supporter of the British Humanist Association. In 2003 he was one of 22 Nobel Laureates who signed the Humanist Manifesto.
In 2015, Kroto signed the Mainau Declaration 2015 on Climate Change on the final day of the 65th Lindau Nobel Laureate Meeting. The declaration was signed by a total of 76 Nobel Laureates and handed to then-President of the French Republic, François Hollande, as part of the successful COP21 climate summit in Paris.
Kroto was educated at Bolton School and went to the University of Sheffield in 1958, where he obtained a first-class honours BSc degree in Chemistry (1961) and a PhD in Molecular Spectroscopy (1964). During his time at Sheffield he also was the art editor of "Arrows" – the University student magazine, played tennis for the University team (reaching the UAU finals twice) and was President of the Student Athletics Council (1963–64). Among other things such as making the first phosphaalkenes (compounds with carbon phosphorus double bonds), his doctoral studies included unpublished research on carbon suboxide, O=C=C=C=O, and this led to a general interest in molecules containing chains of carbon atoms with numerous multiple bonds. He started his work with an interest in organic chemistry, but when he learned about spectroscopy it inclined him towards quantum chemistry; he later developed an interest in astrochemistry.
After obtaining his PhD, Kroto spent two years in a postdoctoral position at the National Research Council in Ottawa, Canada, carrying out further work in molecular spectroscopy, and then spent the subsequent year at Bell Laboratories in New Jersey (1966–1967) carrying out Raman studies of liquid-phase interactions and working on quantum chemistry.
In 1967, Kroto began teaching and research at the University of Sussex in England. During his time at Sussex from 1967 to 1985, he carried out research mainly focused on the spectroscopic studies of new and novel unstable and semi-stable species. This work resulted in the birth of the various fields of new chemistry involving carbon multiply bonded to second and third row elements e.g. S, Se and P. A particularly important breakthrough (with Sussex colleague John Nixon) was the creation of several new phosphorus species detected by microwave spectroscopy. This work resulted in the birth of the field(s) of phosphaalkene and phosphaalkyne chemistry. These species contain carbon double and triple bonded to phosphorus (C=P and C≡P) such as cyanophosphaethyne.
In 1975, he became a full professor of Chemistry. This coincided with laboratory microwave measurements with Sussex colleague David Walton on long linear carbon chain molecules, leading to radio astronomy observations with Canadian astronomers revealing the surprising fact that these unusual carbonaceous species existed in relatively large abundances in interstellar space as well as the outer atmospheres of certain stars – the carbon-rich red giants.
In 1985, on the basis of the Sussex studies and the stellar discoveries, laboratory experiments (with co-workers James R. Heath, Sean C. O'Brien, Yuan Liu, Robert Curl and Richard Smalley at Rice University) which simulated the chemical reactions in the atmospheres of the red giant stars demonstrated that stable C60 molecules could form spontaneously from a condensing carbon vapour. The co-investigators directed lasers at graphite and examined the results. The C60 molecule is a molecule with the same symmetry pattern as a football, consisting of 12 pentagons and 20 hexagons of carbon atoms. Kroto named the molecule buckminsterfullerene, after Buckminster Fuller who had conceived of the geodesic domes, as the dome concept had provided a clue to the likely structure of the new species.
In 1985, the C60 discovery caused Kroto to shift the focus of his research from spectroscopy in order to probe the consequences of the C60 structural concept (and prove it correct) and to exploit the implications for chemistry and material science.
This research is significant for the discovery of a new allotrope of carbon known as a fullerene. Other allotropes of carbon include graphite, diamond and graphene. Harry Kroto's 1985 paper entitled "C60: Buckminsterfullerene", published with colleagues J. R. Heath, S. C. O'Brien, R. F. Curl, and R. E. Smalley, was honored by a Citation for Chemical Breakthrough Award from the Division of History of Chemistry of the American Chemical Society, presented to Rice University in 2015. The discovery of fullerenes was recognized in 2010 by the designation of a National Historic Chemical Landmark by the American Chemical Society at the Richard E. Smalley Institute for Nanoscale Science and Technology at Rice University in Houston, Texas.
In 2004, Kroto left the University of Sussex to take up a new position as Francis Eppes Professor of Chemistry at Florida State University. At FSU he carried out fundamental research on: Carbon vapour with Professor Alan Marshall; Open framework condensed phase systems with strategically important electrical and magnetic behaviour with Professors Naresh Dalal (FSU) and Tony Cheetham (Cambridge); and the mechanism of formation and properties of nano-structured systems. In addition, he participated in research initiatives at FSU that probed the astrochemistry of fullerenes, metallofullerenes, and polycyclic aromatic hydrocarbons in stellar/circumstellar space, as well as their relevance to stardust.
In 1995, he jointly set up the Vega Science Trust, a UK educational charity that created high-quality science films, including lectures and interviews with Nobel Laureates, discussion programmes, and careers and teaching resources for TV and Internet broadcast. Vega produced over 280 programmes that streamed for free from the Vega website, which acted as a TV science channel. The trust closed in 2012.
In 2009, Kroto spearheaded the development of a second science education initiative, Geoset. Short for the Global Educational Outreach for Science, Engineering and Technology, GEOSET is an ever-growing online cache of recorded teaching modules that are freely downloadable to educators and the public. The program aims to increase knowledge of the sciences by creating a global repository of educational videos and presentations from leading universities and institutions.
In 2003, prior to the Blair/Bush invasion of Iraq on the pretext that Iraq had weapons of mass destruction, Kroto initiated and organised a letter signed by a dozen UK Nobel Laureates. It was composed by his friend, the late Nobel Peace Prize Laureate Sir Joseph Rotblat, and published in "The Times" on 15 February 2003.
He wrote a set of articles, mostly opinion pieces, from 2002 to 2003 for the Times Higher Education Supplement, a weekly UK publication.
From 2002 to 2004, Kroto served as President of the Royal Society of Chemistry. In 2004, he was appointed to the Francis Eppes Professorship in the chemistry department at Florida State University, carrying out research in nanoscience and nanotechnology.
He spoke at Auburn University on 29 April 2010, and at the James A. Baker III Institute for Public Policy at Rice University with Robert Curl on 13 October 2010.
In October 2010 Kroto participated in the USA Science and Engineering Festival's Lunch with a Laureate program where middle and high school students had the opportunity to engage in an informal conversation with a Nobel Prize–winning scientist.
He spoke at Mahatma Gandhi University, at Kottayam, in Kerala, India in January 2011, where he was an 'Erudite' special invited lecturer of the Government of Kerala, from 5 to 11 January 2011.
Kroto spoke at CSICon 2011, a convention "dedicated to scientific inquiry and critical thinking" organized by the Committee for Skeptical Inquiry in association with "Skeptical Inquirer" magazine and the Center for Inquiry.
He also delivered the IPhO 2012 lecture at the International Physics Olympiad held in Estonia.
In 2014, Kroto spoke at the Starmus Festival in the Canary Islands, delivering a lecture about his life in science, chemistry, and design.
In 1963, he married Margaret Henrietta Hunter, also a student of the University of Sheffield at the time. The couple had two sons: Stephen and David. Throughout his entire life, Kroto was a lover of film, theatre, art, and music and published his own artwork.
Kroto was a "devout atheist" who thought that beliefs in immortality derive from lack of the courage to accept human mortality. He was a patron of the British Humanist Association. He was a supporter of Amnesty International. He referred to his view that religious dogma causes people to accept unethical or inhumane actions: "The only mistake Bernie Madoff made was to promise returns in "this" life." He held that scientists had a responsibility to work for the benefit of the entire species. On 15 September 2010, Kroto, along with 54 other public figures, signed an open letter published in "The Guardian", stating their opposition to Pope Benedict XVI's state visit to the UK.
Kroto was an early Signatory of Asteroid Day.
In 2008, Kroto was critical of Michael Reiss for directing the teaching of creationism alongside evolution.
Kroto praised the increase of organized online information as an "Educational Revolution" and named it as the "GooYouWiki" world referring to Google, YouTube and Wikipedia.
One of Kroto's favourite quotations, from Albert Einstein, was: "I believe in Spinoza's God who reveals himself in the orderly harmony of what exists, not in a God who concerns himself with the fates and actions of human beings."
The discovery of buckminsterfullerene caused Kroto to postpone his dream of setting up an art and graphic design studio – he had been doing graphics semi-professionally for years. Nonetheless, Kroto's graphic design work resulted in numerous posters, letterheads, logos, book and journal covers, and medal designs, and won awards including the Sunday Times Book Jacket Design competition (1964) and the Moët Hennessy/Louis Vuitton Science pour l'Art Prize (1994). Other notable graphical works include the design of the Nobel UK Stamp for Chemistry (2001) and features at the Royal Academy (London) Summer Exhibition (2004).
Kroto died on 30 April 2016 in Lewes, East Sussex from complications of amyotrophic lateral sclerosis at the age of 76.
Richard Dawkins wrote a memorial for chemist Kroto where he mentioned Kroto's "passionate hatred of religion." The "Wall Street Journal" described him as "(spending much of his later life) jetting around the world to extol scientific education in a world he saw as blinded by religion." Slate's Zack Kopplin related a story about how Kroto gave him advice and support to fight Louisiana's creationism law, a law that allows public school science teachers to attack evolution and how Kroto defended the scientific findings of global warming. In an obituary published in the journal "Nature", Robert Curl and James R. Heath described Kroto as having an "impish sense of humour similar to that of the British comedy group Monty Python".
Kroto won numerous awards, individually and with others:
Kroto was made a Knight Bachelor in the 1996 New Year Honours list.
The University of Sheffield North Campus contains two buildings named after Kroto: The Kroto Innovation Centre and the Kroto Research Institute.
|
https://en.wikipedia.org/wiki?curid=14467
|
Heimskringla
Heimskringla () is the best known of the Old Norse kings' sagas. It was written in Old Norse in Iceland by the poet and historian Snorri Sturluson (1178/79–1241) around 1230. The name "Heimskringla" was first used in the 17th century, derived from the first two words of one of the manuscripts ("kringla heimsins", "the circle of the world").
"Heimskringla" is a collection of sagas about Swedish and Norwegian kings, beginning with the saga of the legendary Swedish dynasty of the Ynglings, followed by accounts of historical Norwegian rulers from Harald Fairhair of the 9th century up to the death of the pretender Eystein Meyla in 1177. The exact sources of his work are disputed, but included earlier kings' sagas, such as Morkinskinna, Fagrskinna and the twelfth century Norwegian synoptic histories and oral traditions, notably many skaldic poems. Snorri had himself visited Norway and Sweden. For events of the mid-12th century, Snorri explicitly names the now-lost work "Hryggjarstykki" as his source. The composition of the sagas is Snorri's.
The name "Heimskringla" comes from the fact that the first words of the first saga in the compilation ("Ynglinga saga") are "Kringla heimsins", "the orb of the Earth".
The earliest parchment copy of the work is referred to as "Kringla", now in the Reykjavík National Library, catalogued as Lbs fragm 82. This is now a single vellum leaf from c. 1260, a part of the Saga of St. Olaf; the rest of the manuscript was lost to fire in 1728.
"Heimskringla" consists of several sagas, often thought of as falling into three groups, giving the overall work the character of a triptych. The saga narrates the contests of the kings, the establishment of the kingdom of Norway, Viking expeditions to various European countries, ranging as far afield as Palestine in the saga of Sigurd the Crusader. The stories are told with a life and freshness, giving a picture of human life in all its reality. The saga is a prose epic, relevant to the history not only of Scandinavia but the regions included in the wider medieval Scandinavian diaspora. The first part of the "Heimskringla" is rooted in Norse mythology; as the collection proceeds, fable and fact intermingle, but the accounts become increasingly historically reliable.
The first section tells of the mythological prehistory of the Norwegian royal dynasty, tracing Odin, described here as a mortal man, and his followers from the East, from Asaland and Asgard, its chief city, to their settlement in Scandinavia (more precisely to east-central Sweden, according to Snorri). The subsequent sagas are (with few exceptions) devoted to individual rulers, starting with Halfdan the Black.
A version of "Óláfs saga helga", about the saint Olaf II of Norway, is the main and central part of the collection: Olaf's 15-year-long reign takes up about one third of the entire work.
Thereafter, the saga of Harald Hardrada narrates Harald's expedition to the East, his brilliant exploits in Constantinople, Syria, and Sicily, his skaldic accomplishments, and his battles in England against Harold Godwinson, the son of Godwin, Earl of Wessex, where he fell at the Battle of Stamford Bridge in 1066 only a few weeks before Harold fell at the Battle of Hastings. After presenting a series of other kings, the saga ends with Magnus V of Norway.
"Heimskringla" contains the following sagas (see also List of Norwegian monarchs):
Snorri explicitly mentions a few prose sources, now mostly lost in the form that he knew them: "Hryggjarstykki" ('spine pieces') by Eiríkr Oddsson (covering events 1130-61), "Skjǫldunga saga", an unidentified saga about Knútr inn gamli, and a text called "Jarlasǫgurnar" ('sagas of the jarls', which seems to correspond to the saga now known as "Orkneyinga saga").
Snorri may have had access to a wide range of the early Scandinavian historical texts known today as the 'synoptic histories', but made most use of:
Snorri also made extensive use of skaldic verse which he believed to have been composed at the time of the events portrayed and transmitted orally from that time onwards, and clearly made use of other oral accounts, though it is uncertain to what extent.
Up until the mid-19th century, historians put great trust in the factual truth of Snorri's narrative, as well as other old Norse sagas. In the early 20th century, this trust was largely abandoned with the advent of "saga criticism", pioneered by Lauritz and Curt Weibull. These historians pointed out that Snorri's work had been written several centuries after most of the events it describes. In Norway, the historian Edvard Bull famously proclaimed that "we have to give up all illusions that Snorri's mighty epic bears any deeper resemblance to what actually happened" in the time it describes. A school of historians has come to believe that the motives Snorri and the other saga writers give to their characters owe more to conditions in the 13th century than in earlier times.
"Heimskringla" has, however, continued to be used as a historical source, though with more caution. It is not common to believe in the detailed accuracy of the historical narrative and historians tend to see little to no historical truth behind the first few sagas, however, they are still seen by many as a valuable source of knowledge about the society and politics of medieval Norway. The factual content of the work tends to be deemed more credible where it discusses more recent times, as the distance in time between the events described and the composition of the saga was shorter, allowing traditions to be retained in a largely accurate form, and because in the twelfth century the first contemporary written sources begin to emerge in Norway.
Whereas prior to "Heimskringla" there seems to have been a diversity of efforts to write histories of kings, Snorri's "Heimskringla" seems thereafter to have been the basis for Icelandic writing about Scandinavian kings, and was expanded by scribes rather than entirely revised. "Flateyjarbók", from the end of the fourteenth century, is the most extreme example of expansion, interweaving Snorri's text with many "þættir" and other whole sagas, prominently "Orkneyinga saga", "Færeyinga saga", and "Fóstbrœðra saga".
The text is also referenced in "Journey to the Center of the Earth" by Jules Verne; the work is the one Professor Liedenbrock finds Arne Saknussem's note in.
By the mid-16th century, the Old Norse language was unintelligible to Norwegian, Swedish or Danish readers. At that time several translations of extracts were made in Norway into the Danish language, which was the literary language of Norway at the time. The first complete translation was made around 1600 by Peder Claussøn Friis, and printed in 1633. This was based on a manuscript known as "Jofraskinna".
Subsequently, the Stockholm manuscript was translated into Swedish and Latin by Johan Peringskiöld (by order of Charles XI) and published in 1697 at Stockholm under the title "Heimskringla", which is the first known use of the name. This edition also included the first printing of the text in Old Norse. A new Danish translation with the text in Old Norse and a Latin translation came out in 1777–83 (by order of Frederick VI as crown prince). An English translation by Samuel Laing was finally published in 1844, with a second edition in 1889. Starting in the 1960s English-language revisions of Laing appeared, as well as fresh English translations.
In the 19th century, as Norway was achieving independence after centuries of union with Denmark and Sweden, the stories of the independent Norwegian medieval kingdom won great popularity in Norway. Heimskringla, although written by an Icelander, became an important national symbol for Norway during the period of romantic nationalism. In 1900, the Norwegian parliament, the Storting, subsidized the publication of new translations of Heimskringla into both Norwegian written forms, landsmål and riksmål, "in order that the work may achieve wide distribution at a low price".
The most recent English translation of "Heimskringla" is by Alison Finlay and Anthony Faulkes and is available open-access.
|
https://en.wikipedia.org/wiki?curid=14468
|
Hamar
Hamar is a town in Hamar Municipality in Innlandet county, Norway. It is part of the traditional region of Hedmarken. The administrative centre of the municipality is the town of Hamar. The municipality of Hamar was separated from Vang as a town and municipality of its own in 1849. Vang was merged back into Hamar on 1 January 1992.
The town is located on the shores of Mjøsa, Norway's largest lake, and was the principal city of the former Hedmark county. It is bordered to the northwest by the municipality of Ringsaker, to the north by Åmot, to the east by Løten, and to the south by Stange.
The municipality (originally the town) is named after the old "Hamar" farm (Old Norse: "Hamarr"); the medieval town was built on its ground. The name is identical with the word "hamarr" which means "rocky hill".
The coat of arms shows a black grouse sitting in the top of a pine tree on a white background. It was first described in the anonymous Hamar Chronicle, written in 1553.
Between 500 and 1000 AD, Aker farm was one of the most important power centres in Norway, located just a few kilometres away from today's Hamar. Three coins found in Ringerike in 1895 have been dated to the time of Harald Hardråde and are inscribed "Olafr a Hamri".
At some point, presumably after 1030 but clearly before 1152, the centre was moved from Aker to the peninsula near Rosenlundvika, the area known today as Domkirkeodden. There are some indications that Harald Hardråde initiated this move because he had property at the new site.
Much of the information about medieval Hamar is derived from the Hamar Chronicles, dated to about 1550. The town is said to have reached its apex in the early 14th century, dominated by the cathedral, the bishop's manor, and the fortress, surrounded by urban development. The town was known for its fragrant apple orchards, but it was also home to merchants, craftsmen, and fishermen.
After the Christianization of Norway in 1030, Hamar began to gain influence as a centre for trade and religion, until the papal legate Nikolaus Breakspear founded Hamar Kaupangen in 1152 as one of five dioceses in medieval Norway. This diocese included Hedemarken and Christians Amt, separated in 1152 from the former diocese of Oslo. The first bishop of Hamar was Arnold, Bishop of Gardar, Greenland (1124–1152). He began to build the now ruined cathedral of Christ Church, which was completed about the time of Bishop Paul (1232–1252). Bishop Thorfinn (1278–1282) was exiled, died at Ter Doest abbey in Flanders, and was later canonised. Bishop Jörund (1285–1286) was transferred to Trondheim. A provincial council was held in 1380. Hamar remained an important religious and political centre in Norway, organized around the cathedral and the bishop's manor, until the Reformation of 1536–1537, when it lost its status as a bishopric. The last Catholic bishop, Mogens Lauritssøn (1513–1537), was taken prisoner in his castle at Hamar by Truid Ulfstand, a Danish noble, and sent to Antvorskov in Denmark, where he was mildly treated until his death in 1542. Hamar had a cathedral chapter with ten canons, a school, a Dominican priory of St. Olaf, and a monastery of the Canons Regular of St. Anthony of Vienne.
Hamar, like most of Norway, was severely diminished by the Black Death in 1349, and by all accounts it continued to decline until the Reformation, after which the town effectively disappeared.
The Reformation in Norway took less than 10 years to complete, from 1526 to 1536. The fortress was made into the residence of the sheriff and renamed Hamarhus fortress. The cathedral was still used but fell into disrepair, culminating in the Swedish army's siege and attempted demolition in 1567, during the Northern Seven Years' War, when the manor was also devastated.
By 1587, merchants in Oslo had succeeded in moving all of Hamar's market activities to Oslo. Though some regional and seasonal trade persisted into the 17th century, Hamar as a town had ceased to exist by then. In its place, the area was used for agriculture under the farm of Storhamar, though the ruins of the cathedral, the fortress, and lesser buildings remained landmarks for centuries.
Hamarhus remained a feudal seat under the King until 1649, when Frederick III transferred the property, known as Hammer, to Hannibal Sehested, making it private property. In 1716, the estate was sold to Jens Grønbech (1666–1734). With this, a series of construction projects started, and the farm became known as Storhamar, passing through several owners until Norwegian nobility was abolished in 1831, when Erik Anker took over the farm.
As early as 1755, the Danish government in Copenhagen expressed an interest in establishing a trading center on Mjøsa. Elverum was considered a frontier town with frequent unrest, and there was even talk of encouraging the dissenter Hans Nielsen Hauge to settle in the area. Bishop Fredrik Julius Bech, one of the most prominent officials of his time, proposed establishing a town at or near Storhamar, at the foot of Furuberget.
In 1812, negotiations started in earnest, when the regional governor of Kristians Amt proposed establishing a market on Mjøsa. A four-person commission was named on 26 July 1814, with the mandate of determining a suitable site for a new town along the shore. On 8 June 1815, the commission recommended establishing such a town at Lillehammer, then also a farm, part of Fåberg.
Acting on objections to this recommendation, the department of the interior asked two professors, Ludvig Stoud Platou and Gregers Fougner Lundh, to survey the area and develop an alternative recommendation. It appears that Lundh in particular put great effort into this assignment, and in 1824 he presented to the Storting a lengthy report that included maps and plans for the new town.
Lundh's premise was that the national economic interest reigned supreme, so he based his recommendation on the proposed town's ability to quickly achieve self-sustaining growth. He proposed that the new town be named "Carlshammer" and that it be built along the shore just north of Storhamar and eastward. His plans were detailed, calling for streets 20 meters broad and rectangular blocks of 12 buildings each, with 2 meters separating each building. He also proposed tax relief for 20 years for the town's first residents, that the state relinquish property taxes in favor of the city, and that the city be given monopoly rights to certain trade. He even proposed that certain types of foreigners, in particular the Quakers, be allowed to settle in the town to promote trade.
His recommendation was accepted in principle by the government, but the parliamentary committee equivocated on the location, leaving the determination of the actual site to the king so as not to slow things down further. Another commission was named in June 1825, consisting of Herman Wedel-Jarlsberg, professor Lundh, and other prominent Norwegians. After surveying the entire lake, it submitted another report that considered eleven different locations, including sites near today's Eidsvoll, Minnesund, Tangen in Stange, Aker, Storhamar, Brumunddal, Nes, Moelven, Lillehammer, Gjøvik, and Toten. Each was presented with pros and cons. The commission itself was split between Lillehammer and Storhamar. The parliament finally decided on Lillehammer, relegating Hamar once more, it seemed, to remain a sleepy agricultural area.
As steamboats were introduced on the lake, the urban elite developed an interest in medieval Hamar, and in 1841, editorials appeared advocating the reestablishment of a town at Storhamar. By then the limitations of Lillehammer's location had also become apparent, in particular those of its shallow harbor. After a few more years of discussions and negotiations both regionally and nationally, member of parliament Frederik Stang put on the table once more the possibility of a town in or near Storhamar. The governor at the time, Frederik Hartvig Johan Heidmann, presented a thorough deliberation of possible specific locations, and ended up proposing the current site, at Gammelhusbukten.
On 26 April 1848, the king signed into law the establishment of Hamar on the grounds of the farms of Storhamar and Holset, along the shores of Mjøsa. The law stated that the town would be founded on the date its borders were settled, which turned out to be 21 March 1849. It was known as the merchant town of Hamar, with a trading zone extending five kilometers beyond its borders.
The area of the new town covered 400 mål (about 40 hectares). An army engineer, Røyem, drafted the initial plan. There would be three thoroughfares, Strandgata, Torggata, and Grønnegate (the last the name of a medieval road), with a grid system of streets between them. The orientation of the town was toward the shore. Røyem set aside space for three parks and a public square, as well as room for a church just outside the town's borders.
There were critics of the plan, pointing out that the terrain was hilly and not suitable for the proposed rigid grid. Some adjustments were made, but the plan was largely accepted and is evident in today's Hamar. There were also lingering concerns about the town's vulnerability to flooding.
No sooner had the ink dried on the new law than building started, in the spring of 1849. The first buildings were much like sheds, but there was great enthusiasm, and by the end of 1849, ten buildings were insured in the new town. None of these are standing today; the last two were adjacent buildings on Skappelsgate. By 1850 there were 31 insured houses; by 1852, 42; and by 1853, 56. Building slowed down for a few years and then picked up again in 1858, and by the end of 1860 there were 100 insured houses in the town. The shore-side properties were obliged to grow gardens, setting the stage for a leafy urban landscape.
Roads quickly became a challenge – in some places, it was necessary to ford creeks in the middle of town. The road inspector found himself under considerable stress, and it was not until 1869 that street names were settled. Highways in and out of the city also caused considerable debate, especially when it came to financing their construction.
The first passenger terminal in Hamar was in fact a crag in the lake, from which travelers were rowed into the city. In 1850, another pier was built with a two-storey terminal building. All this was complicated by the significant seasonal variations in water levels. In 1857 a canal was built around a basin that would allow freight ships to access a large warehouse. Although the canal and basin still were not deep enough to accommodate passenger steamships, the area became one of the busiest areas in the town and the point around which the harbor was further developed.
The Diocese of Hamar was established in 1864, and the Hamar Cathedral was consecrated in 1866 and remains a central point in the city.
A promenade came into being from the harbor area, past the gardens on the shore, and north toward the site of the old town.
The first executive of Hamar was Johannes Bay, who arrived in October 1849 to facilitate the election of a board of supervisors and representatives. The town's royal charter called for the election of 3 supervisors and 9 representatives, and elections were announced in the paper and through the town crier. Of the 10 eligible town citizens, three were elected supervisors and six others were elected by consent as representatives, leaving the board three representatives short. The first mayor of Hamar was Christian Borchgrevink.
The first order of business was the allocation of liquor licenses and the setting of an upper limit on the amount of alcohol that could be sold within the town limits. The board quickly decided to award licenses to both applicants and set the upper limit at 12,000 "pots" of liquor, an amount that was for all intents and purposes limitless.
The electorate increased in 1849 to 26, including merchants and various craftsmen, and the empty representative posts were filled in November. In 1850, the board allowed for unlimited exercise of any craft for which no citizenship had been taken out, which led to much unregulated craftsmanship. Part-time policemen were hired, and the town started setting taxes and a budget by the end of 1849. In 1850, a new election was held for the town board.
The painter Jakobsen had early on offered his house for public meetings and assembly, and upon the purchase of a set of solid locks, his basement also became the town prison. One merchant was designated as the town's firefighter and was given two buckets with equipment, and later a simple hose, but by 1852 a full-time fire chief was named. There was also some controversy around the watchman who loudly reported the time to all the town's inhabitants every half-hour, every night. Hamar also had a scrupulously enforced ordinance against smoking a pipe without a lid, whether in public or in private.
In Hamar's early days, the entire population consisted of young entrepreneurs, and little was needed in the way of social services. After a few years, a small number of indigent people needed support, and a poorhouse was erected.
In 1878, as the firefighting capabilities of the young town were being upgraded, a fire broke out in a bakery but was put out without doing too much damage. In February 1879, at 2:00 in the morning after festivities, another fire broke out, burning down an entire building that housed many historical items from the town's history. This was followed by a series of fires that left entire blocks in ashes; the fires seemed to come to an end in 1881, when a professional fire corps was hired.
In 1860, concerns about flooding were vindicated when a late and sudden spring caused the lake to flood, peaking about 24 June, when the street-level floor of the waterfront properties was completely inundated. This was the worst flood recorded since 1789. By 9 July, the floods had receded, but it was not to prove the end of the calamities. In August, massive rainfall led to flash flooding in the area, putting several streets under water. This was immediately followed by unseasonably cold weather, freezing the potato crops and inconveniencing Hamar's residents. Mild weather then melted all the ice and accumulated snow, leading to another round of flooding. By the time a particularly cold and snow-filled winter set in, there was mostly relief at getting some stability.
In 1876, the town was scandalized by the apprehension of one Kristoffer Svartbækken for the cold-blooded murder of 19-year-old Even Nilsen Dæhlin. Svartbækken was convicted of the murder and executed the following year in the neighboring rural community of Løten, in what must have been a spectacle, with an audience of 3,000 locals, presumably most of Hamar's population at the time.
Then, in 1889, there were riots in Hamar over the arrest of one of the town's own constables, a sergeant Huse, who had been insubordinate while on a military drill at the cavalry camp at Gardermoen. In an act of poor judgment, Huse's superior sent him to Hamar's prison instead of a military stockade. Partly led and partly tolerated by other constables, the town's population engaged in demonstrations, marches, and other unlawful but non-violent acts, which were effectively ended when a company of soldiers arrived from the camp at Terningmoen near Elverum.
The Hedmark Museum, located on Domkirkeodden, is an important historical landmark in Hamar: an outdoor museum with the remains of the medieval church in a protective glass housing, the episcopal fortress, and a collection of old farm houses. The institution is a combined medieval, ethnological and archaeological museum, and it has received architectural prizes for its approach to conservation and exhibition. It also houses a vast photographic archive for the Hedmark region.
Additionally, Hamar is known for its indoor long track speed skating and bandy arena, the Olympia Hall, better known as Vikingskipet ("The Viking Ship") for its shape. It was built to host the speed skating competitions of the 1994 Winter Olympics, which were centred on nearby Lillehammer, and as early as 1993 it hosted the Bandy World Championship. The Vikingskipet Olympic Arena was later used in the winter of 2007 as the service park for Rally Norway, the second round of the 2007 World Rally Championship season. For the last 13 years it has also hosted The Gathering, the world's second-largest computer party, which starts each year on the Wednesday of the Easter holiday.
Also situated in Hamar is the Hamar Olympic Amphitheatre, which hosted the figure skating and short track speed skating events of the 1994 Winter Olympics. The figure skating competition was highly anticipated: it featured Nancy Kerrigan and Tonya Harding, who drew most of the media attention; the gold medal, however, was won by Oksana Baiul of Ukraine.
The centre of Hamar is the pedestrian walkway in the middle of town, with the library, cinema and farmers' market on Stortorget (the big square) on the western side, and Østre Torg (the eastern square), which sits on top of an underground multi-storey car park, on the eastern side.
Hamar is an important railway junction, where the old line to Trondheim, the Rørosbanen, branches off from the mainline Dovre Line. The Norwegian Railway Museum ("Norsk Jernbanemuseum") is also in Hamar. Hamar Airport, Stafsberg, caters to general aviation.
Hamar boasts several teams at the Norwegian top level in various sports.
Hamar is known for its speed skating history, both for its skaters and for the championships the city has hosted: as early as 1894 Hamar hosted its first European Championship, and the first World Championship followed the next year. Since the Vikingskipet was built, Hamar has hosted international championships on a regular basis.
The most notable skaters from Hamar are Dag Fornæss and Even Wetten, former World champions in the allround and 1000 m events respectively. Amund Sjøbrend, Ådne Sønderål and Eskil Ervik have all been members of the local club Hamar IL, although they were not born in Hamar.
In Hamar on 17 July 1993, Scottish cyclist Graeme Obree set a world record for the distance covered in an hour. His 51,596 metres broke the 51,151 metres set at altitude nine years earlier, but the record lasted only six days before Chris Boardman broke it in Bordeaux.
Hamar was the venue of three sports during the 1994 Winter Olympics: figure skating, short track speed skating, and speed skating.
A number of cities, both in Scandinavia and around the world, are twinned with Hamar.
Part of the plot of "The Axe", the first volume of Sigrid Undset's "The Master of Hestviken", is set in medieval Hamar. The book's young lovers, denied the right to marry by malicious relatives, come to the town to seek the help of the kindly and compassionate Bishop Thorfinn of Hamar.
|
https://en.wikipedia.org/wiki?curid=14470
|
Irina Krush
Irina Krush (born December 24, 1983) is an American chess player. She was awarded the title of Grandmaster by FIDE in 2013. Krush is a seven-time U.S. women's champion.
Irina Krush was born in Odessa, USSR (now Ukraine). She learned to play chess at age five, emigrating with her parents to Brooklyn that same year (1989).
At age 14 Krush won the 1998 U.S. Women's Chess Championship to become the youngest U.S. women's champion ever. She has won the championship on six other occasions, in 2007, 2010, 2012, 2013, 2014 and 2015.
Krush took part in the "Kasparov versus the World" chess competition in 1999. Garry Kasparov played the white pieces and the Internet public, via a Microsoft host website, voted on moves for the black pieces, guided by the recommendations of Krush and three of her contemporaries, Étienne Bacrot, Elisabeth Pähtz and Florin Felecan. On the tenth move, the World team voted to play the move Krush had suggested. Kasparov said later that he lost control of the game at that point, and wasn't sure whether he was winning or losing.
Krush played in Group C of the 2008 Corus Chess Tournament, a 14-player round-robin held in Wijk aan Zee, the Netherlands. She finished in joint fifth place, having scored 7/13 points after five wins (including one against the eventual winner, Fabiano Caruana), four draws and four losses.
In 2013, she was awarded the Grandmaster title thanks to her results at the NYC Mayor's Cup International GM Tournament in 2001, Women's World Team Chess Championship 2013 and Baku Open 2013.
Krush has played on the U.S. national team in the Women's Chess Olympiad since 1998. The U.S. team won the silver medal in 2004 and bronze in 2008. She also competed as part of the US team in the Women's World Team Chess Championship in 2009 and 2013.
She played for the team Manhattan Applesauce in the U.S. Chess League in 2015; she previously played for the New York Knights (2005–2011, 2013). Krush and her ex-husband, Canadian Grandmaster Pascal Charbonneau, have played in the United Kingdom league for Guildford-ADC.
Krush is also an author who frequently contributes articles to "Chess Life" magazine and "uschess.org". Her article on earning her grandmaster title in 2013 was honored as the "Best of US Chess" that year.
Krush attended Edward R. Murrow High School in Brooklyn. She graduated in International Relations from New York University in 2006.
In March 2016 she appeared as a guest on "Steve Harvey", along with Hillary Clinton. Along with two other women, who were actresses, she answered questions from host Steve Harvey and Clinton regarding her life and chess career. Krush and the two impostors all gave plausible answers to the questions. Clinton successfully identified the real Irina Krush.
In March 2020, she was hospitalized and treated for a "moderate" COVID-19 infection, then released to recover under quarantine at home. While quarantined, she played in the Isolated Queens Swiss, an online women's blitz chess tournament played by internet streaming. She scored 7.5/10 in the tournament, putting her in joint second place, a half point behind tournament winner GM Alexandra Kosteniuk. In May 2020, Krush played for the USA team in the FIDE Online Nations Cup.
|
https://en.wikipedia.org/wiki?curid=14526
|
Institut des hautes études scientifiques
The Institut des hautes études scientifiques (IHÉS; English: Institute of Advanced Scientific Studies) is a French institute supporting advanced research in mathematics and theoretical physics. It is located in Bures-sur-Yvette just south of Paris. It is an independent research institute in a partnership with the federal University of Paris-Saclay.
The IHÉS was founded in 1958 by businessman and mathematical physicist Léon Motchane with the help of Robert Oppenheimer and Jean Dieudonné as a research centre in France, modeled on the renowned Institute for Advanced Study in Princeton, United States.
The strong personality of Alexander Grothendieck and the broad sweep of his revolutionizing theories were a dominating feature of the first ten years at the IHÉS. René Thom received an invitation from the IHÉS in 1963 and, after his appointment, remained there until his death in 2002. Dennis Sullivan is remembered as one who had a special talent for encouraging fruitful exchanges among visitors and provoking new and deeper insights into their ideas.
The IHÉS runs a highly regarded mathematical journal, "Publications Mathématiques de l'IHÉS".
IHÉS celebrated its 40th anniversary in 1998 and its 50th in 2008.
Alain Connes (Fields Medal 1982) has held the Léon Motchane Chair since 1979. Several CNRS researchers are also based at the IHÉS: Ahmed Abbes, Cédric Deffayet, Ofer Gabber, Fanny Kassel, and Christophe Soulé.
|
https://en.wikipedia.org/wiki?curid=14527
|
Iceland
Iceland is a Nordic island country in the North Atlantic, with a population of 364,134 and an area of about 103,000 km² (40,000 sq mi), making it the most sparsely populated country in Europe. The capital and largest city is Reykjavík. Reykjavík and the surrounding areas in the southwest of the country are home to over two-thirds of the population. Iceland is volcanically and geologically active. The interior consists of a plateau characterised by sand and lava fields, mountains, and glaciers, and many glacial rivers flow to the sea through the lowlands. Iceland is warmed by the Gulf Stream and has a temperate climate, despite a high latitude just outside the Arctic Circle. Its high latitude and marine influence keep summers chilly, with most of the archipelago having a polar climate.
According to the ancient manuscript "Landnámabók", the settlement of Iceland began in 874 AD when the Norwegian chieftain Ingólfr Arnarson became the first permanent settler on the island. In the following centuries, Norwegians, and to a lesser extent other Scandinavians, emigrated to Iceland, bringing with them thralls (i.e., slaves or serfs) of Gaelic origin.
The island was governed as an independent commonwealth under the Althing, one of the world's oldest functioning legislative assemblies. Following a period of civil strife, Iceland acceded to Norwegian rule in the 13th century. The establishment of the Kalmar Union in 1397 united the kingdoms of Norway, Denmark, and Sweden. Iceland thus followed Norway's integration into that union, coming under Danish rule after Sweden's secession from the union in 1523. Although the Danish kingdom introduced Lutheranism forcefully in 1550, Iceland remained a distant semi-colonial territory in which Danish institutions and infrastructures were conspicuous by their absence.
In the wake of the French Revolution and the Napoleonic Wars, Iceland's struggle for independence took form and culminated in independence in 1918 and the founding of a republic in 1944. Although its parliament (Althing) was suspended from 1799 to 1845, the island republic has been credited with sustaining the world's oldest and longest-running parliament.
Until the 20th century, Iceland relied largely on subsistence fishing and agriculture. Industrialisation of the fisheries and Marshall Plan aid following World War II brought prosperity and Iceland became one of the wealthiest and most developed nations in the world. In 1994, it became a part of the European Economic Area, which further diversified the economy into sectors such as finance, biotechnology, and manufacturing.
Iceland has a market economy with relatively low taxes compared to other OECD countries, as well as the highest trade union membership in the world. It maintains a Nordic social welfare system that provides universal health care and tertiary education for its citizens. Iceland ranks high in economic, democratic, and social stability, as well as in equality, currently ranking third in the world by median wealth per adult. In 2018, it was ranked as the sixth most developed country in the world by the United Nations' Human Development Index, and it ranks first on the Global Peace Index. Iceland runs almost completely on renewable energy.
Hit hard by the worldwide financial crisis, the nation's entire banking system failed systemically in October 2008, leading to an economic crisis and the collapse of the country's three largest banks. The crisis prompted substantial political unrest, the Icesave dispute, and the institution of capital controls (imposed in 2008 and lifted in 2017). By 2014, the Icelandic economy had made a significant recovery, in large part due to a surge in tourism.
Icelandic culture is founded upon the nation's Scandinavian heritage. Most Icelanders are descendants of Norse and Gaelic settlers. Icelandic, a North Germanic language, is descended from Old West Norse and is closely related to Faroese. The country's cultural heritage includes traditional Icelandic cuisine, Icelandic literature, and medieval sagas. Iceland has the smallest population of any NATO member and is the only one with no standing army, with a lightly armed coast guard.
The Sagas of Icelanders say that a Norwegian named Naddodd (or Naddador) was the first Norseman to reach Iceland, and in the 9th century he named it Snæland or "snow land" because it was snowing. Following Naddodd, the Swede Garðar Svavarsson arrived, and so the island was then called Garðarshólmur which means "Garðar's Isle".
Then came a Viking named Flóki Vilgerðarson; his daughter drowned en route, and his livestock starved to death. The sagas say that the rather despondent Flóki climbed a mountain and saw a fjord (Arnarfjörður) full of icebergs, which led him to give the island its new and present name. The notion that Iceland's Viking settlers chose that name to discourage oversettlement of their verdant isle is a myth.
According to both Landnámabók and Íslendingabók, monks known as the Papar lived in Iceland before Scandinavian settlers arrived, possibly members of a Hiberno-Scottish mission. Recent archaeological excavations have revealed the ruins of a cabin in Hafnir on the Reykjanes peninsula. Carbon dating indicates that it was abandoned sometime between 770 and 880. In 2016, archeologists uncovered a longhouse in Stöðvarfjörður that has been dated to as early as 800.
Swedish Viking explorer Garðar Svavarsson was the first to circumnavigate Iceland in 870 and establish that it was an island. He stayed over winter and built a house in Húsavík. Garðar departed the following summer but one of his men, Náttfari, decided to stay behind with two slaves. Náttfari settled in what is now known as Náttfaravík and he and his slaves became the first permanent residents of Iceland.
The Norwegian chieftain Ingólfr Arnarson built his homestead in present-day Reykjavík in 874. Ingólfr was followed by many other emigrant settlers, largely Scandinavians and their thralls, many of whom were Irish or Scottish. By 930, most arable land on the island had been claimed; the Althing, a legislative and judicial assembly, was initiated to regulate the Icelandic Commonwealth. Lack of arable land also served as an impetus to the settlement of Greenland starting in 986. The period of these early settlements coincided with the Medieval Warm Period, when temperatures were similar to those of the early 20th century. At this time, about 25% of Iceland was covered with forest, compared to 1% in the present day. Christianity was adopted by consensus around 999–1000, although Norse paganism persisted among segments of the population for some years afterwards.
The Icelandic Commonwealth lasted until the 13th century, when the political system devised by the original settlers proved unable to cope with the increasing power of Icelandic chieftains. The internal struggles and civil strife of the Age of the Sturlungs led to the signing of the Old Covenant in 1262, which ended the Commonwealth and brought Iceland under the Norwegian crown. Possession of Iceland passed from the Kingdom of Norway (872–1397) to the Kalmar Union in 1397, when the kingdoms of Norway, Denmark and Sweden were united. After the break-up of the union in 1523, it remained a Norwegian dependency, as a part of Denmark–Norway.
Infertile soil, volcanic eruptions, deforestation and an unforgiving climate made for harsh life in a society where subsistence depended almost entirely on agriculture. The Black Death swept Iceland twice, first in 1402–1404 and again in 1494–1495. The former outbreak killed 50% to 60% of the population, and the latter 30% to 50%.
Around the middle of the 16th century, as part of the Protestant Reformation, King Christian III of Denmark began to impose Lutheranism on all his subjects. Jón Arason, the last Catholic bishop of Hólar, was beheaded in 1550 along with two of his sons. The country subsequently became officially Lutheran and Lutheranism has since remained the dominant religion.
In the 17th and 18th centuries, Denmark imposed harsh trade restrictions on Iceland. Natural disasters, including volcanic eruption and disease, contributed to a decreasing population. Pirates from several countries, including the Barbary Coast, raided Iceland's coastal settlements and abducted people into slavery. A great smallpox epidemic in the 18th century killed around a third of the population. In 1783 the Laki volcano erupted, with devastating effects. In the years following the eruption, known as the Mist Hardships (Icelandic: "Móðuharðindin"), over half of all livestock in the country died. Around a quarter of the population starved to death in the ensuing famine.
In 1814, following the Napoleonic Wars, Denmark-Norway was broken up into two separate kingdoms via the Treaty of Kiel but Iceland remained a Danish dependency. Throughout the 19th century, the country's climate continued to grow colder, resulting in mass emigration to the New World, particularly to the region of Gimli, Manitoba in Canada, which was sometimes referred to as New Iceland. About 15,000 people emigrated, out of a total population of 70,000.
A national consciousness arose in the first half of the 19th century, inspired by romantic and nationalist ideas from mainland Europe. An Icelandic independence movement took shape in the 1850s under the leadership of Jón Sigurðsson, based on the burgeoning Icelandic nationalism inspired by the "Fjölnismenn" and other Danish-educated Icelandic intellectuals. In 1874, Denmark granted Iceland a constitution and limited home rule. This was expanded in 1904, and Hannes Hafstein served as the first Minister for Iceland in the Danish cabinet.
The Danish–Icelandic Act of Union, an agreement with Denmark signed on 1 December 1918 and valid for 25 years, recognised Iceland as a fully sovereign and independent state in a personal union with Denmark. The Government of Iceland established an embassy in Copenhagen and requested that Denmark carry out on its behalf certain defence and foreign affairs matters, subject to consultation with the Althing. Danish embassies around the world displayed two coats of arms and two flags: those of the Kingdom of Denmark and those of the Kingdom of Iceland. Iceland's legal position became comparable to that of countries belonging to the Commonwealth of Nations, such as Canada, whose sovereign is Queen Elizabeth II.
During World War II, Iceland joined Denmark in asserting neutrality. After the German occupation of Denmark on 9 April 1940, the Althing replaced the King with a regent and declared that the Icelandic government would take control of its own defence and foreign affairs. A month later, British armed forces conducted Operation Fork, the invasion and occupation of the country, violating Icelandic neutrality. In 1941, the Government of Iceland, friendly to Britain, invited the then-neutral United States to take over its defence so that Britain could use its troops elsewhere.
On 31 December 1943, the Danish–Icelandic Act of Union expired after 25 years. Beginning on 20 May 1944, Icelanders voted in a four-day plebiscite on whether to terminate the personal union with Denmark, abolish the monarchy, and establish a republic. The vote was 97% to end the union, and 95% in favour of the new republican constitution. Iceland formally became a republic on 17 June 1944, with Sveinn Björnsson as its first president.
In 1946, the US defence forces left Iceland. The nation formally became a member of NATO on 30 March 1949, amid domestic controversy and riots. On 5 May 1951, a defence agreement was signed with the United States. American troops returned to Iceland as the Iceland Defence Force, and remained throughout the Cold War. The US withdrew the last of its forces on 30 September 2006.
Iceland prospered during the Second World War. The immediate post-war period was followed by substantial economic growth, driven by industrialisation of the fishing industry and the US Marshall Plan programme, through which Icelanders received the most aid per capita of any European country (at US$209, with the war-ravaged Netherlands a distant second at US$109).
The 1970s were marked by the Cod Wars – several disputes with the United Kingdom over Iceland's extension of its fishing limits to 200 nautical miles offshore. Iceland hosted a summit in Reykjavík in 1986 between United States President Ronald Reagan and Soviet Premier Mikhail Gorbachev, during which they took significant steps toward nuclear disarmament. A few years later, Iceland became the first country to recognise the independence of Estonia, Latvia, and Lithuania as they broke away from the USSR. Throughout the 1990s, the country expanded its international role and developed a foreign policy oriented toward humanitarian and peacekeeping causes. To that end, Iceland provided aid and expertise to various NATO-led interventions in Bosnia, Kosovo, and Iraq.
Iceland joined the European Economic Area in 1994, after which the economy was greatly diversified and liberalised. International economic relations increased further after 2001, when Iceland's newly deregulated banks began to raise massive amounts of external debt, contributing to a 32% increase in Iceland's gross national income between 2002 and 2007.
In 2003–2007, following the privatisation of the banking sector under the government of Davíð Oddsson, Iceland moved toward having an economy based on international investment banking and financial services. It was quickly becoming one of the most prosperous countries in the world but was hit hard by a major financial crisis. The crisis resulted in the greatest migration from Iceland since 1887, with a net emigration of 5,000 people in 2009. Iceland's economy stabilised under the government of Jóhanna Sigurðardóttir, and grew by 1.6% in 2012. The centre-right Independence Party was returned to power in coalition with the Progressive Party in the 2013 elections. In the following years, Iceland saw a surge in tourism as the country became a popular holiday destination. In 2016, Prime Minister Sigmundur Davíð Gunnlaugsson resigned after being implicated in the Panama Papers scandal. Early elections in 2016 resulted in a right-wing coalition government of the Independence Party, the Reform Party and Bright Future.
This government fell when Bright Future quit the coalition due to a scandal involving then-Prime Minister Bjarni Benediktsson's father's letter of support for a convicted child sex offender. Snap elections in October 2017 brought to power a new coalition consisting of the Independence Party, the Progressive Party and the Left-Green Movement, headed by Katrín Jakobsdóttir.
Iceland is at the juncture of the North Atlantic and Arctic Oceans. The main island is entirely south of the Arctic Circle, which passes through the small Icelandic island of Grímsey off the main island's northern coast. The country lies between latitudes 63 and 68°N, and longitudes 25 and 13°W.
Iceland is closer to continental Europe than to mainland North America, although it is closest to Greenland, an island of North America. Iceland is generally included in Europe for geographical, historical, political, cultural, linguistic and practical reasons. Geologically, the island includes parts of both continental plates. The closest bodies of land in Europe are the Faroe Islands, Jan Mayen Island, Shetland, and the Outer Hebrides, followed by the Scottish mainland and Orkney. The nearest part of continental Europe is mainland Norway, while the nearest point of mainland North America is the northern tip of Labrador.
Iceland is the world's 18th largest island, and Europe's second-largest island after Great Britain. (The island of Ireland is third.) Of the country's area, 62.7% is tundra. About 30 minor islands are in Iceland, including the lightly populated Grímsey and the Vestmannaeyjar archipelago. Lakes and glaciers cover 14.3% of its surface; only 23% is vegetated. The largest lakes are the Þórisvatn reservoir and Þingvallavatn; other important lakes include Lagarfljót and Mývatn. Jökulsárlón is the deepest lake.
Geologically, Iceland is part of the Mid-Atlantic Ridge, a ridge along which the oceanic crust spreads and forms new oceanic crust. This part of the mid-ocean ridge is located above a mantle plume, causing Iceland to be subaerial (above the surface of the sea). The ridge marks the boundary between the Eurasian and North American Plates, and Iceland was created by rifting and accretion through volcanism along the ridge.
Many fjords punctuate Iceland's 4,970-km-long (3,088-mi) coastline, which is also where most settlements are situated. The island's interior, the Highlands of Iceland, is a cold and uninhabitable combination of sand, mountains, and lava fields. The major towns are the capital city of Reykjavík, along with its outlying towns of Kópavogur, Hafnarfjörður, and Garðabær, nearby Reykjanesbær where the international airport is located, and the town of Akureyri in northern Iceland. The island of Grímsey on the Arctic Circle contains the northernmost habitation of Iceland, whereas Kolbeinsey contains the northernmost point of Iceland. Iceland has three national parks: Vatnajökull National Park, Snæfellsjökull National Park, and Þingvellir National Park. The country is considered a "strong performer" in environmental protection, having been ranked 13th in Yale University's Environmental Performance Index of 2012.
A geologically young land, Iceland is the surface expression of the Iceland Plateau, a large igneous province forming as a result of volcanism from the Iceland hotspot and along the Mid-Atlantic Ridge, the latter of which runs right through it. This means that the island is highly geologically active with many volcanoes including Hekla, Eldgjá, Herðubreið, and Eldfell. The volcanic eruption of Laki in 1783–1784 caused a famine that killed nearly a quarter of the island's population. In addition, the eruption caused dust clouds and haze to appear over most of Europe and parts of Asia and Africa for several months afterward, and affected climates in other areas.
Iceland has many geysers, including Geysir, from which the English word is derived, and the famous Strokkur, which erupts every 8–10 minutes. After a phase of inactivity, Geysir started erupting again after a series of earthquakes in 2000. Geysir has since grown quieter and does not erupt often.
With the widespread availability of geothermal power, and the harnessing of many rivers and waterfalls for hydroelectricity, most residents have access to inexpensive hot water, heating, and electricity. The island is composed primarily of basalt, a low-silica lava associated with effusive volcanism as has occurred also in Hawaii. Iceland, however, has a variety of volcanic types (composite and fissure), many producing more evolved lavas such as rhyolite and andesite. Iceland has hundreds of volcanoes with about 30 active volcanic systems.
Surtsey, one of the youngest islands in the world, is part of Iceland. Named after Surtr, it rose above the ocean in a series of volcanic eruptions between 8 November 1963 and 5 June 1968. Only scientists researching the growth of new life are allowed to visit the island.
On 21 March 2010, a volcano in Eyjafjallajökull in the south of Iceland erupted for the first time since 1821, forcing 600 people to flee their homes. Additional eruptions on 14 April forced hundreds of people to abandon their homes. The resultant cloud of volcanic ash brought major disruption to air travel across Europe.
Another large eruption occurred on 21 May 2011. This time it was the Grímsvötn volcano, located under the thick ice of Europe's largest glacier, Vatnajökull. Grímsvötn is one of Iceland's most active volcanoes, and this eruption was much more powerful than the 2010 Eyjafjallajökull activity, with ash and lava hurled into the atmosphere, creating a large cloud.
The highest elevation for Iceland is listed as 2,110 m (6,923 ft) at Hvannadalshnúkur (64°00′N 16°39′W).
The climate of Iceland's coast is subarctic. The warm North Atlantic Current ensures generally higher annual temperatures than in most places of similar latitude in the world. Regions in the world with similar climates include the Aleutian Islands, the Alaska Peninsula, and Tierra del Fuego, although these regions are closer to the equator. Despite its proximity to the Arctic, the island's coasts remain ice-free through the winter. Ice incursions are rare, the last having occurred on the north coast in 1969.
The climate varies between different parts of the island. Generally speaking, the south coast is warmer, wetter, and windier than the north. The Central Highlands are the coldest part of the country. Low-lying inland areas in the north are the most arid. Snowfall in winter is more common in the north than the south.
The highest air temperature recorded was 30.5 °C (86.9 °F) on 22 June 1939 at Teigarhorn on the southeastern coast. The lowest was −38 °C (−36.4 °F) on 22 January 1918 at Grímsstaðir and Möðrudalur in the northeastern hinterland. The temperature records for Reykjavík are 25.7 °C (78.3 °F) on 30 July 2008, and −24.5 °C (−12.1 °F) on 21 January 1918.
Phytogeographically, Iceland belongs to the Arctic province of the Circumboreal Region within the Boreal Kingdom. Around three-quarters of the island is barren of vegetation; plant life consists mainly of grassland, which is regularly grazed by livestock. The most common tree native to Iceland is the northern birch ("Betula pubescens"), which formerly formed forests over much of Iceland, along with aspens ("Populus tremula"), rowans ("Sorbus aucuparia"), common junipers ("Juniperus communis"), and other smaller trees, mainly willows.
When the island was first settled, it was extensively forested, with around 30% of the land covered in trees. In the late 12th century, Ari the Wise described it in the Íslendingabók as "forested from mountain to sea shore". Permanent human settlement greatly disturbed the isolated ecosystem of thin, volcanic soils and limited species diversity. The forests were heavily exploited over the centuries for firewood and timber. Deforestation, climatic deterioration during the Little Ice Age, and overgrazing by sheep imported by settlers caused a loss of critical topsoil due to erosion. Today, many farms have been abandoned. Three-quarters of Iceland's 100,000 square kilometres is affected by soil erosion, serious enough to make the land useless. Only a few small birch stands now exist in isolated reserves. The planting of new forests has increased the number of trees, but the result does not compare to the original forests. Some of the planted forests include introduced species. The tallest tree in Iceland is a sitka spruce planted in 1949 in Kirkjubæjarklaustur; it was measured at 25.2 m (83 ft) in 2013.
The only native land mammal when humans arrived was the Arctic fox, which came to the island at the end of the ice age, walking over the frozen sea. On rare occasions, bats have been carried to the island with the winds, but they are not able to breed there. Polar bears occasionally come over from Greenland, but they are just visitors, and no Icelandic populations exist. No native or free-living reptiles or amphibians are on the island.
The animals of Iceland include the Icelandic sheep, cattle, chickens, goats, the sturdy Icelandic horse, and the Icelandic Sheepdog, all descendants of animals imported by Europeans. Wild mammals include the Arctic fox, mink, mice, rats, rabbits, and reindeer. Polar bears occasionally visit the island, travelling on icebergs from Greenland; in June 2008 alone, two arrived. Marine mammals include the grey seal ("Halichoerus grypus") and harbor seal ("Phoca vitulina").
Many species of fish live in the ocean waters surrounding Iceland, and the fishing industry is a major part of Iceland's economy, accounting for roughly half of the country's total exports. Birds, especially seabirds, are an important part of Iceland's animal life. Puffins, skuas, and kittiwakes nest on its sea cliffs.
Commercial whaling is practised intermittently along with scientific whale hunts. Whale watching has become an important part of Iceland's economy since 1997.
Around 1,300 species of insects are known in Iceland. This is low compared with other countries (over one million species have been described worldwide). Iceland is essentially free of mosquitoes.
Iceland has a left–right multi-party system. Following the 2017 parliamentary election, the biggest parties are the centre-right Independence Party ("Sjálfstæðisflokkurinn"), the Left-Green Movement ("Vinstrihreyfingin – grænt framboð") and the Progressive Party ("Framsóknarflokkurinn"). These three parties form the current ruling coalition in the cabinet led by leftist Katrín Jakobsdóttir.
Other political parties with seats in the Althing (Parliament) are the Social Democratic Alliance ("Samfylkingin"), the Centre Party ("Miðflokkurinn"), the Pirate Party ("Píratar"), the People's Party ("Flokkur fólksins"), and the Reform Party ("Viðreisn").
Iceland was the first country in the world to have a political party formed and led entirely by women. Known as the Women's List or Women's Alliance ("Kvennalistinn"), it was founded in 1983 to advance the political, economic, and social needs of women. After participating in its first parliamentary elections, the Women's List helped increase the proportion of female parliamentarians by 15%. It disbanded in 1999, formally merging the next year with the Social Democratic Alliance, although about half of its members joined the Left-Green Movement instead. It left a lasting influence on Iceland's politics: every major party has a 40% quota for women, and in 2009 nearly a third of members of parliament were female, compared to the global average of 16%. Following the 2016 elections, 48% of members of parliament were female.
In 2016 Iceland was ranked 2nd in the strength of its democratic institutions and 13th in government transparency. The country has a high level of civic participation, with 81.4% voter turnout during the most recent elections, compared to an OECD average of 72%. However, only 50% of Icelanders say they trust their political institutions, slightly less than the OECD average of 56% (and most probably a consequence of the political scandals in the wake of the Icelandic financial crisis).
Iceland is a representative democracy and a parliamentary republic. The modern parliament, "Alþingi" (English: Althing), was founded in 1845 as an advisory body to the Danish monarch. It was widely seen as a re-establishment of the assembly founded in 930 in the Commonwealth period, which had been temporarily suspended from 1799 to 1845. Consequently, "it is arguably the world's oldest parliamentary democracy." It currently has 63 members, elected for a maximum period of four years.
The head of government is the prime minister who, together with the cabinet, is responsible for executive government.
The president, in contrast, is elected by popular vote for a term of four years with no term limit. The elections for president, the Althing, and local municipal councils are all held separately every four years. The president of Iceland is a largely ceremonial head of state and serves as a diplomat, but may veto laws voted by the parliament and put them to a national referendum. The current president is Guðni Th. Jóhannesson.
The cabinet is appointed by the president after a general election to the Althing; however, the appointment is usually negotiated by the leaders of the political parties, who decide among themselves after discussions which parties can form the cabinet and how to distribute its seats, under the condition that it has a majority support in the Althing. Only when the party leaders are unable to reach a conclusion by themselves within a reasonable time span does the president exercise this power and appoint the cabinet personally. This has not happened since the republic was founded in 1944, but in 1942 regent Sveinn Björnsson, who had been installed in that position by the Althing in 1941, appointed a non-parliamentary government. The regent had, for all practical purposes, the position of a president, and Sveinn would later become the country's first president in 1944.
The governments of Iceland have always been coalition governments, with two or more parties involved, as no single political party has ever received a majority of seats in the Althing throughout the republican period. The extent of the political power possessed by the office of the president is disputed by legal scholars in Iceland; several provisions of the constitution appear to give the president some important powers, but other provisions and traditions suggest differently. In 1980, Icelanders elected Vigdís Finnbogadóttir as president, the world's first directly elected female head of state. She retired from office in 1996. In 2009, Iceland became the first country with an openly gay head of government when Jóhanna Sigurðardóttir became prime minister.
Iceland is divided into regions, constituencies and municipalities. The eight regions are primarily used for statistical purposes. District court jurisdictions also use an older version of this division. Until 2003, the constituencies for parliamentary elections were the same as the regions, but by an amendment to the constitution they were changed to the current six constituencies.
The redistricting change was made to balance the weight of different districts of the country, since previously a vote cast in the sparsely populated areas around the country would count much more than a vote cast in the Reykjavík city area. The imbalance between districts has been reduced by the new system, but still exists.
Iceland's 74 municipalities govern local matters like schools, transport, and zoning. These are the actual second-level subdivisions of Iceland, as the constituencies have no relevance except in elections and for statistical purposes. Reykjavík is by far the most populous municipality, about four times more populous than Kópavogur, the second most populous.
Iceland, which is a member of the UN, NATO, EFTA, Council of Europe and OECD, maintains diplomatic and commercial relations with practically all nations, but its ties with the Nordic countries, Germany, the United States, Canada and the other NATO nations are particularly close. Historically, due to cultural, economic and linguistic similarities, Iceland is a Nordic country, and it participates in intergovernmental cooperation through the Nordic Council.
Iceland is a member of the European Economic Area (EEA), which allows the country access to the single market of the European Union (EU). It is not a member of the EU, but in July 2009 the Icelandic parliament, the Althing, voted in favour of application for EU membership and officially applied on 17 July 2009. However, in 2013, opinion polls showed that many Icelanders were now against joining the EU; following the 2013 elections the two parties that formed the island's new government – the centrist Progressive Party and the right-wing Independence Party – announced they would hold a referendum on EU membership.
Iceland has no standing army, but it has the Icelandic Coast Guard, which also maintains the Iceland Air Defence System, and an Iceland Crisis Response Unit to support peacekeeping missions and perform paramilitary functions.
The Iceland Defense Force (IDF) was a military command of the United States Armed Forces from 1951 to 2006. The IDF, created at the request of NATO, came into existence when the United States signed an agreement to provide for the defense of Iceland. The IDF also consisted of civilian Icelanders and military members of other NATO nations. The IDF was downsized after the end of the Cold War, and the U.S. Air Force maintained four to six interceptor aircraft at the Naval Air Station Keflavik until they were withdrawn on 30 September 2006. Since May 2008, NATO nations have periodically deployed fighters to patrol Icelandic airspace under the Icelandic Air Policing mission. Iceland supported the 2003 invasion of Iraq despite much domestic controversy, deploying a Coast Guard EOD team to Iraq, which was later replaced by members of the Iceland Crisis Response Unit. Iceland has also participated in the ongoing conflict in Afghanistan and the 1999 NATO bombing of Yugoslavia. Despite the ongoing financial crisis, the first new patrol ship in decades was launched on 29 April 2009.
Iceland was the neutral host of the historic 1986 Reagan–Gorbachev summit in Reykjavík, which set the stage for the end of the Cold War. Iceland's principal historical international disputes involved disagreements over fishing rights. Conflict with the United Kingdom led to a series of so-called Cod Wars, which included confrontations between the Icelandic Coast Guard and the Royal Navy over British fishermen: in 1952–1956 due to the extension of Iceland's fishing zone from 3 to 4 nautical miles, in 1958–1961 following a further extension to 12 nautical miles, in 1972–1973 with another extension to 50 nautical miles, and in 1975–1976 after another extension to 200 nautical miles.
According to the 2011 Global Peace Index, Iceland is the most peaceful country in the world, due to its lack of armed forces, low crime rate and high level of socio-political stability. Iceland is listed in the Guinness World Records book as the "country ranked most at peace" and as having the "lowest military spending per capita".
In 2007, Iceland was the seventh most productive country in the world per capita (US$54,858), and the fifth most productive by GDP at purchasing power parity ($40,112). About 85 percent of total primary energy supply in Iceland is derived from domestically produced renewable energy sources. Use of abundant hydroelectric and geothermal power has made Iceland the world's largest electricity producer per capita. As a result of its commitment to renewable energy, the 2016 Global Green Economy Index ranked Iceland among the top 10 greenest economies in the world.
Historically, Iceland's economy depended heavily on fishing, which still provides 40% of export earnings and employs 7% of the work force. The economy is vulnerable to declining fish stocks and to drops in world prices for its main material exports: fish and fish products, aluminium, and ferrosilicon. Whaling in Iceland has been historically significant. Iceland still relies heavily on fishing, but its importance has diminished, from an export share of 90% in the 1960s to 40% in 2006.
Until the 20th century, Iceland was a fairly poor country. Today, it is one of the most developed countries in the world. Strong economic growth had led Iceland to be ranked first in the United Nations' Human Development Index report for 2007/2008, although in 2011 its HDI rating had fallen to 14th place as a result of the economic crisis. Nevertheless, according to the Economist Intelligence Unit's 2011 index, Iceland has the 2nd highest quality of life in the world. Based on the Gini coefficient, Iceland also has one of the lowest rates of income inequality in the world, and when adjusted for inequality, its HDI ranking is 6th. Iceland's unemployment rate has declined consistently since the crisis, with 4.8% of the labour force unemployed, compared to 6% in 2011 and 8.1% in 2010.
Many political parties remain opposed to EU membership, primarily due to Icelanders' concern about losing control over their natural resources (particularly fisheries). The national currency of Iceland is the Icelandic króna (ISK). Iceland is the only country in the world to have a population under two million yet still have a floating exchange rate and an independent monetary policy.
A poll released on 5 March 2010 by Capacent Gallup showed that 31% of respondents were in favour of adopting the euro and 69% opposed. Another Capacent Gallup poll conducted in February 2012 found that 67.4% of Icelanders would reject EU membership in a referendum.
Iceland's economy has been diversifying into manufacturing and service industries in the last decade, including software production, biotechnology, and finance; industry accounts for around a quarter of economic activity, while services comprise close to 70%. The tourism sector is expanding, especially in ecotourism and whale-watching. On average, Iceland receives around 1.1 million visitors annually, more than three times the native population. Some 1.7 million people visited Iceland in 2016, three times the number that came in 2010. Iceland's agriculture industry, accounting for 5.4% of GDP, consists mainly of potatoes, green vegetables (grown in greenhouses), mutton and dairy products. The financial centre is Borgartún in Reykjavík, which hosts a large number of companies and three investment banks. Iceland's stock market, the Iceland Stock Exchange (ISE), was established in 1985.
Iceland is ranked 27th in the 2012 Index of Economic Freedom, lower than in prior years but still among the freest in the world. It ranks 29th in the World Economic Forum's Global Competitiveness Index, one place lower than in 2015. According to INSEAD's Global Innovation Index, Iceland is the 11th most innovative country in the world. Unlike most Western European countries, Iceland has a flat tax system: the main personal income tax rate is a flat 22.75%, and combined with municipal taxes, the total tax rate is no more than 35.7%, not including the many deductions that are available. The corporate tax rate is a flat 18%, one of the lowest in the world. There is also a value added tax, whereas a net wealth tax was eliminated in 2006. Employment regulations are relatively flexible and the labour market is one of the freest in the world. Property rights are strong and Iceland is one of the few countries where they are applied to fishery management. As in other welfare states, taxpayers subsidise one another, but spending is lower than in most European countries.
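As a rough illustration of the flat-tax arithmetic described above, the sketch below applies a single combined rate to all income. The municipal rate of 12.95% is a hypothetical figure chosen only so that the state and municipal rates together reach the stated 35.7% ceiling; actual municipal rates vary, and the deductions mentioned in the text are ignored.

```python
# Illustrative sketch of a flat income tax, using the rates quoted above.
# MUNICIPAL_RATE is an assumed value (municipal rates vary by municipality);
# it is chosen here so that the combined rate equals the stated 35.7% ceiling.
STATE_RATE = 0.2275      # flat national personal income tax
MUNICIPAL_RATE = 0.1295  # assumed municipal rate

def income_tax(gross_income: float) -> float:
    """Tax due under a flat system: one rate on all income, before deductions."""
    return gross_income * (STATE_RATE + MUNICIPAL_RATE)

print(income_tax(100_000))  # ~35700.0, i.e. an effective combined rate of 35.7%
```

Because the rate is flat, the effective rate is the same at every income level, which is the key contrast with the progressive systems of most Western European countries.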
Despite low tax rates, agricultural assistance is the highest among OECD countries and a potential impediment to structural change. Also, health care and education spending have relatively poor returns by OECD measures, though improvements have been made in both areas. The OECD's "Economic Survey of Iceland 2008" highlighted Iceland's challenges in currency and macroeconomic policy. A currency crisis started in the spring of 2008, and on 6 October trading in Iceland's banks was suspended as the government battled to save the economy. A 2011 OECD assessment determined that Iceland has made progress in many areas, particularly in creating a sustainable fiscal policy and restoring the health of the financial sector; however, challenges remain in making the fishing industry more efficient and sustainable, as well as in improving monetary policy to address inflation. Iceland's public debt has decreased since the economic crisis, and is the 31st highest in the world by proportion of national GDP.
Iceland was hit especially hard by the Great Recession that began in December 2007, because of the failure of its banking system and a subsequent economic crisis. Before the crash of the country's three largest banks, Glitnir, Landsbanki and Kaupthing, their combined debt was approximately six times the nation's gross domestic product of €14 billion ($19 billion). In October 2008, the Icelandic parliament passed emergency legislation to minimise the impact of the financial crisis. The Financial Supervisory Authority of Iceland used permission granted by the emergency legislation to take over the domestic operations of the three largest banks. Icelandic officials, including central bank governor Davíð Oddsson, stated that the state did not intend to take over any of the banks' foreign debts or assets. Instead, new banks were established to take on the domestic operations of the banks, and the old banks were run into bankruptcy.
On 28 October 2008, the Icelandic government raised interest rates to 18% (the rate has since been lowered to 3.5%), a move forced in part by the terms of acquiring a loan from the International Monetary Fund (IMF). After the rate hike, trading in the Icelandic króna finally resumed on the open market, with the currency valued at around 250 ISK per euro, less than one-third of its value at the roughly 70-per-euro rate that prevailed during most of 2008, and a significant drop from the 150-per-euro rate of the week before. On 20 November 2008, the Nordic countries agreed to lend Iceland $2.5 billion.
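A quick back-of-the-envelope check of the depreciation described above, using only the exchange rates quoted in the text:

```python
# Rough check of the króna's depreciation, with figures taken from the text.
rate_2008 = 70.0      # ISK per euro during most of 2008
rate_resumed = 250.0  # ISK per euro when open-market trading resumed
print(rate_2008 / rate_resumed)  # 0.28 -> the króna retained less than a third of its value
```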
On 26 January 2009, the coalition government collapsed due to public dissent over the handling of the financial crisis. A new left-wing government was formed a week later and immediately set about removing Central Bank governor Davíð Oddsson and his aides from the bank through changes in law. Davíð was removed on 26 February 2009 in the wake of protests outside the Central Bank.
Thousands of Icelanders moved abroad after the collapse, many of them to Norway. In 2005, 293 people moved from Iceland to Norway; in 2009, the figure was 1,625. In April 2010, the Icelandic Parliament's Special Investigation Commission published the findings of its investigation, revealing the extent of control fraud in this crisis. By June 2012, Landsbanki had managed to repay about half of the Icesave debt.
According to Bloomberg, Iceland is on a trajectory toward 2% unemployment as a result of crisis-management decisions made back in 2008, including allowing the banks to fail.
Iceland has a high level of car ownership per capita, with a car for every 1.5 inhabitants; the car is the main form of transport. A great number of Iceland's administered roads remain unpaved, mostly little-used rural roads. The road speed limits are 30 km/h and 50 km/h in towns, 80 km/h on gravel country roads and 90 km/h on hard-surfaced roads.
Route 1, or the Ring Road (Icelandic: "Þjóðvegur 1" or "Hringvegur"), was completed in 1974, and is a main road that runs around Iceland and connects all the inhabited parts of the island, with the interior of the island being uninhabited. This paved road is 1,332 km (828 mi) long with one lane in each direction, except near larger towns and cities and in the Hvalfjörður Tunnel, where it has more lanes. Many bridges on it, especially in the north and east, are single lane and made of timber and/or steel.
Keflavík International Airport (KEF) is the largest airport and the main aviation hub for international passenger transport. It serves several international and domestic airline companies. KEF is about 50 km (31 mi) west-southwest of Reykjavík's centre, and public bus services are available.
Iceland has no passenger railways.
Reykjavík Airport (RKV) is the second largest airport, located just 1.5 km from the capital's centre. RKV serves general aviation traffic and has daily or regular domestic flights to 12 local townships within Iceland. RKV also serves international flights to Greenland and the Faroe Islands, as well as business and private aircraft and aviation training.
Akureyri Airport (AEY) and Egilsstaðir Airport (EGS) are two other domestic airports with limited international service capacity. There are a total of 103 registered airports and airfields in Iceland; most of them are unpaved and located in rural areas. The second longest runway is at Geitamelur, a four-runway glider field east of Reykjavík.
Six main ferry services provide regular access to various outlying communities or shorten travel distances.
Renewable sources—geothermal and hydropower—provide effectively all of Iceland's electricity and around 85% of the nation's total primary energy consumption, with most of the remainder consisting of imported oil products used in transportation and in the fishing fleet. A 2000 report from the University of Iceland suggested that Iceland could potentially convert from oil to hydrogen power by 2040. Iceland's largest geothermal power plants are Hellisheiði and Nesjavellir, while Kárahnjúkar Hydropower Plant is the country's largest hydroelectric power station. When the Kárahnjúkavirkjun started operating, Iceland became the world's largest electricity producer per capita. Iceland is one of the few countries that have filling stations dispensing hydrogen fuel for cars powered by fuel cells. It is also one of a few countries currently capable of producing hydrogen in adequate quantities at a reasonable cost, because of Iceland's plentiful renewable sources of energy. The ranking of geopolitical gains and losses after energy transition (GeGaLo Index) places Iceland first out of 156 countries, making it the main geopolitical winner in the global energy transition.
Despite this, Icelanders emitted 16.9 tonnes of CO2 per capita in 2016, the highest in the EU and EFTA, mainly resulting from transport and aluminium smelting. Nevertheless, in 2010, Iceland was noted by Guinness World Records as "the Greenest Country", reaching the highest score by the Environmental Sustainability Index, which measures a country's water use, biodiversity and adoption of clean energies, with a score of 93.5/100.
On 22 January 2009, Iceland announced its first round of offshore licences for companies wanting to conduct hydrocarbon exploration and production in a region northeast of Iceland, known as the Dreki area. Three exploration licenses were awarded but all were subsequently relinquished.
The government of Iceland has been in talks with the government of the United Kingdom about the possibility of constructing Icelink, a high-voltage direct-current connector for transmission of electricity between the two countries. Such a cable would give Iceland access to a market where electricity prices have generally been much higher than those in Iceland. Iceland has considerable renewable energy resources, especially geothermal energy and hydropower, and most of this potential remains undeveloped, partly because there is not enough demand for additional generating capacity from Iceland's residents and industry. The United Kingdom is interested in importing inexpensive electricity from renewable sources, which could lead to further development of these resources.
The Ministry of Education, Science and Culture is responsible for the policies and methods that schools must use, and it issues the National Curriculum Guidelines. However, playschools, primary schools, and lower secondary schools are funded and administered by the municipalities. The government does allow citizens to home-educate their children, although under a strict set of requirements: students must adhere closely to the government-mandated curriculum, and the teaching parent must hold a government-approved teaching certificate.
Nursery school, or "leikskóli", is non-compulsory education for children younger than six years, and is the first step in the education system. The current legislation concerning playschools was passed in 1994. Playschools are also responsible for ensuring that the curriculum is suitable so as to make the transition into compulsory education as easy as possible.
Compulsory education, or "grunnskóli", comprises primary and lower secondary education, which often is conducted at the same institution. Education is mandatory by law for children aged from 6 to 16 years. The school year lasts nine months, beginning between 21 August and 1 September, ending between 31 May and 10 June. The minimum number of school days was once 170, but after a new teachers' wage contract, it increased to 180. Lessons take place five days a week. All public schools have mandatory education in Christianity, although an exemption may be considered by the Minister of Education.
Upper secondary education, or "framhaldsskóli", follows lower secondary education. These schools are also known as gymnasia in English. Though not compulsory, everyone who has completed compulsory education has the right to upper secondary education. This stage of education is governed by the Upper Secondary School Act of 1996. All schools in Iceland are mixed-sex schools. The largest seat of higher education is the University of Iceland, which has its main campus in central Reykjavík. Other schools offering university-level instruction include Reykjavík University, the University of Akureyri, the Agricultural University of Iceland and Bifröst University.
An OECD assessment found 64% of Icelanders aged 25–64 have earned the equivalent of a high-school degree, which is lower than the OECD average of 73%. Among 25- to 34-year-olds, only 69% have earned the equivalent of a high-school degree, significantly lower than the OECD average of 80%. Nevertheless, Iceland's education system is considered excellent: the Programme for International Student Assessment currently ranks it as the 16th best performing, above the OECD average. Students were particularly proficient in reading and mathematics.
According to a 2013 Eurostat report by the European Commission, Iceland spends around 3.11% of its GDP on scientific research and development (R&D), over 1 percentage point higher than the EU average of 2.03%, and has set a target of 4% to reach by 2020. A 2010 UNESCO report found that out of 72 countries that spend the most on R&D (100 million US dollars or more), Iceland ranked 9th by proportion of GDP, tied with Taiwan, Switzerland, and Germany and ahead of France, the UK, and Canada.
The original population of Iceland was of Nordic and Gaelic origin. This is evident from literary evidence dating from the settlement period as well as from later scientific studies such as blood type and genetic analyses. One such genetic study indicated that the majority of the male settlers were of Nordic origin while the majority of the women were of Gaelic origin, meaning many settlers of Iceland were Norsemen who brought Gaelic slaves with them.
Iceland has extensive genealogical records dating back to the late 17th century and fragmentary records extending back to the Age of Settlement. The biopharmaceutical company deCODE genetics has funded the creation of a genealogy database that is intended to cover all of Iceland's known inhabitants. It views the database, called "Íslendingabók", as a valuable tool for conducting research on genetic diseases, given the relative isolation of Iceland's population.
The population of the island is believed to have varied from 40,000 to 60,000 in the period ranging from initial settlement until the mid-19th century. During that time, cold winters, ash fall from volcanic eruptions, and bubonic plagues adversely affected the population several times. There were 37 famine years in Iceland between 1500 and 1804. The first census was carried out in 1703 and revealed that the population was then 50,358. After the destructive volcanic eruptions of the Laki volcano during 1783–1784, the population reached a low of about 40,000. Improving living conditions have triggered a rapid increase in population since the mid-19th century—from about 60,000 in 1850 to 320,000 in 2008. Iceland has a relatively young population for a developed country, with one out of five people being 14 years old or younger. With a fertility rate of 2.1, Iceland is one of only a few European countries with a birth rate sufficient for long-term population growth (see table below).
In December 2007, 33,678 people (13.5% of the total population) living in Iceland had been born abroad, including children of Icelandic parents living abroad. Around 19,000 people (6% of the population) held foreign citizenship. Polish people make up the largest minority group by a considerable margin, and still form the bulk of the foreign workforce. About 8,000 Poles now live in Iceland, 1,500 of them in Fjarðabyggð, where they make up 75% of the workforce constructing the Fjarðarál aluminium plant. Large-scale construction projects in the east of Iceland (see Kárahnjúkar Hydropower Plant) have also brought in many people whose stay is expected to be temporary. Many Polish immigrants were also considering leaving in 2008 as a result of the Icelandic financial crisis.
The southwest corner of Iceland is by far the most densely populated region. It is also the location of the capital Reykjavík, the northernmost national capital in the world. More than 70 percent of Iceland's population live in the southwest corner (Greater Reykjavík and the nearby Southern Peninsula), which covers less than two percent of Iceland's land area. The largest town outside Greater Reykjavík is Reykjanesbær, which is located on the Southern Peninsula, less than 50 km (31 mi) from the capital. The largest town outside the southwest corner is Akureyri in northern Iceland.
Some 500 Icelanders under the leadership of Erik the Red settled Greenland in the late 10th century. The total population reached a high point of perhaps 5,000 and developed independent institutions before disappearing by 1500. People from Greenland attempted to set up a settlement at Vinland in North America, but abandoned it in the face of hostility from the indigenous residents.
Emigration of Icelanders to the United States and Canada began in the 1870s. Canada has over 88,000 people of Icelandic descent, while there are more than 40,000 Americans of Icelandic descent, according to the 2000 US census.
Iceland's official written and spoken language is Icelandic, a North Germanic language descended from Old Norse. In grammar and vocabulary, it has changed less from Old Norse than the other Nordic languages; Icelandic has preserved more verb and noun inflection, and has to a considerable extent developed new vocabulary based on native roots rather than borrowings from other languages. The puristic tendency in the development of Icelandic vocabulary is to a large degree a result of conscious language planning, in addition to centuries of isolation. Icelandic is the only living language to retain the use of the runic letter Þ in Latin script. The closest living relative of the Icelandic language is Faroese.
Icelandic Sign Language was officially recognised as a minority language in 2011. In education, its use for Iceland's deaf community is regulated by the "National Curriculum Guide".
English and Danish are compulsory subjects in the school curriculum. English is widely understood and spoken, while basic to moderate knowledge of Danish is common mainly among the older generations. Polish is mostly spoken by the local Polish community (the largest minority of Iceland). Danish is mostly spoken in a way largely comprehensible to Swedes and Norwegians; it is often referred to as "skandinavíska" (i.e. "Scandinavian") in Iceland.
Rather than using family names, as is the usual custom in most Western nations, Icelanders carry patronymic or matronymic surnames, patronyms being far more common. Patronymic last names are based on the first name of the father, while matronymic names are based on the first name of the mother. These follow the person's given name, e.g. "Elísabet Jónsdóttir" ("Elísabet, Jón's daughter" (Jón being the father)) or "Ólafur Katrínarson" ("Ólafur, Katrín's son" (Katrín being the mother)). Consequently, Icelanders refer to one another by their given name, and the Icelandic telephone directory is listed alphabetically by first name rather than by surname. All new names must be approved by the Icelandic Naming Committee.
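The rule lends itself to a small illustration. The sketch below is a toy, not real Icelandic morphology: forming the genitive of a given name is irregular in Icelandic, so the example hard-codes the two genitive forms (Jóns, Katrínar) that appear in the text.

```python
# Toy sketch of Icelandic patronymic/matronymic surname formation.
# Icelandic genitives are irregular, so this uses a hand-written lookup
# table for the two parent names mentioned in the text, rather than
# attempting real morphology.
GENITIVES = {"Jón": "Jóns", "Katrín": "Katrínar"}  # given name -> genitive

def surname(parent_given_name: str, child_is_daughter: bool) -> str:
    """Genitive of the parent's given name plus -dóttir (daughter) or -son."""
    stem = GENITIVES[parent_given_name]
    return stem + ("dóttir" if child_is_daughter else "son")

print(surname("Jón", child_is_daughter=True))      # Jónsdóttir, as in Elísabet Jónsdóttir
print(surname("Katrín", child_is_daughter=False))  # Katrínarson, as in Ólafur Katrínarson
```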
Iceland has a universal health care system that is administered by its Ministry of Welfare and paid for mostly by taxes (85%) and to a lesser extent by service fees (15%). Unlike most countries, there are no private hospitals, and private insurance is practically nonexistent.
A considerable portion of the government budget is assigned to health care, and Iceland ranks 11th in health care expenditures as a percentage of GDP and 14th in spending per capita. Overall, the country's health care system is one of the best performing in the world, ranked 15th by the World Health Organization. According to an OECD report, Iceland devotes far more resources to healthcare than most industrialised nations. Iceland has 3.7 doctors per 1,000 people (compared with an average of 3.1 in OECD countries) and 15.3 nurses per 1,000 people (compared with an OECD average of 8.4).
Icelanders are among the world's healthiest people, with 81% reporting they are in good health, according to an OECD survey. Although it is a growing problem, obesity is not as prevalent as in other developed countries. Iceland has many campaigns for health and wellbeing, including the famous television show "Lazytown", starring and created by former gymnastics champion Magnus Scheving. Infant mortality is one of the lowest in the world, and the proportion of the population that smokes is lower than the OECD average. Nearly all pregnancies in which Down syndrome is diagnosed prenatally are terminated in Iceland. The average life expectancy is 81.8 years (compared to an OECD average of 79.5), the 4th highest in the world.
Iceland has a very low level of pollution, thanks to an overwhelming reliance on cleaner geothermal energy, a low population density, and a high level of environmental consciousness among citizens. According to an OECD assessment, the amount of toxic materials in the atmosphere is far lower than in any other industrialised country measured.
Icelanders have freedom of religion guaranteed under the Constitution, although the Church of Iceland, a Lutheran body, is the state church.
Registers Iceland keeps account of the religious affiliation of every Icelandic citizen.
Iceland is a very secular country; as with other Nordic nations, church attendance is relatively low. Official statistics represent administrative membership of religious organisations, which does not necessarily reflect the belief demographics of the population. According to a study published in 2001, 23% of the inhabitants were either atheist or agnostic. A Gallup poll conducted in 2012 found that 57% of Icelanders considered themselves "religious", 31% considered themselves "non-religious", while 10% defined themselves as "convinced atheists", placing Iceland among the ten countries with the highest proportions of atheists in the world. The number of Icelanders registered in the state church, the Church of Iceland, is declining at a rate of more than 1% per year.
Icelandic culture has its roots in North Germanic traditions. Icelandic literature is popular, in particular the sagas and eddas that were written during the High and Late Middle Ages. Centuries of isolation have helped to insulate the country's Nordic culture from external influence; a prominent example is the preservation of the Icelandic language, which remains the closest to Old Norse of all modern Nordic languages.
In contrast to other Nordic countries, Icelanders place relatively great importance on independence and self-sufficiency; in a public opinion analysis conducted by the European Commission, over 85% of Icelanders believe independence is "very important," compared to 47% of Norwegians, 49% of Danes, and an average of 53% for the EU25. Icelanders also have a very strong work ethic, working some of the longest hours of any industrialised nation.
According to a poll conducted by the OECD, 66% of Icelanders were satisfied with their lives, while 70% believed that their lives would be satisfying in the future. Similarly, 83% reported having more positive experiences in an average day than negative ones, compared to an OECD average of 72%, which makes Iceland one of the happiest countries in the OECD. A more recent 2012 survey found that around three-quarters of respondents stated they were satisfied with their lives, compared to a global average of about 53%.
Iceland is liberal with regard to LGBT rights issues. In 1996, the Icelandic parliament passed legislation to create registered partnerships for same-sex couples, conferring nearly all the rights and benefits of marriage. In 2006, parliament voted unanimously to grant same-sex couples the same rights as heterosexual couples in adoption, parenting and assisted insemination treatment. In 2010, the Icelandic parliament amended the marriage law, making it gender neutral and defining marriage as between two individuals, making Iceland one of the first countries in the world to legalise same-sex marriages. The law took effect on 27 June 2010. The amendment to the law also means registered partnerships for same-sex couples are now no longer possible, and marriage is their only option—identical to the existing situation for opposite-sex couples.
Icelanders are known for their strong sense of community and lack of social isolation: An OECD survey found that 98% believe they know someone they could rely on in a time of need, higher than in any other industrialised country. Similarly, only 6% reported "rarely" or "never" socialising with others. This high level of social cohesion is attributed to the small size and homogeneity of the population, as well as to a long history of harsh survival in an isolated environment, which reinforced the importance of unity and cooperation.
Egalitarianism is highly valued among the people of Iceland, with income inequality being among the lowest in the world. The constitution explicitly prohibits the enactment of noble privileges, titles, and ranks. Everyone is addressed by their first name. As in other Nordic countries, equality between the sexes is very high; Iceland is consistently ranked among the top three countries in the world for women to live in.
In 2011, Reykjavík was designated a UNESCO City of Literature.
Iceland's best-known classical works of literature are the Icelanders' sagas, prose epics set in Iceland's age of settlement. The most famous of these include "Njáls saga", about an epic blood feud, and "Grænlendinga saga" and "Eiríks saga", describing the discovery and settlement of Greenland and Vinland (modern Newfoundland). "Egils saga", "Laxdæla saga", "Grettis saga", "Gísla saga" and "Gunnlaugs saga ormstungu" are also notable and popular Icelanders' sagas.
A translation of the Bible was published in the 16th century. Important compositions from the 15th to the 19th century include sacred verse, most famously the Passion Hymns of Hallgrímur Pétursson, and "rímur", rhyming epic poems. Originating in the 14th century, "rímur" were popular into the 19th century, when the development of new literary forms was provoked by the influential, National-Romantic writer Jónas Hallgrímsson. In recent times, Iceland has produced many great writers, the best-known of whom is arguably Halldór Laxness, who received the Nobel Prize in Literature in 1955 (the only Icelander to win a Nobel Prize thus far). Steinn Steinarr was an influential modernist poet during the early 20th century who remains popular.
Icelanders are avid consumers of literature, with the highest number of bookstores per capita in the world. For its size, Iceland imports and translates more international literature than any other nation. Iceland also has the highest per capita publication of books and magazines, and around 10% of the population will publish a book in their lifetimes.
Most books in Iceland are sold between late September and early November. This time period is known as "Jolabokaflod", the Christmas Book Flood. The Flood begins with the Iceland Publishers' Association distributing "Bokatidindi", a catalog of all new publications, free to each Icelandic home.
The distinctive rendition of the Icelandic landscape by its painters can be linked to nationalism and the movement for home rule and independence, which was very active in the mid-19th century.
Contemporary Icelandic painting is typically traced to the work of Þórarinn Þorláksson, who, following formal training in art in the 1890s in Copenhagen, returned to Iceland to paint and exhibit works from 1900 to his death in 1924, almost exclusively portraying the Icelandic landscape. Several other Icelandic men and women artists studied at the Royal Danish Academy of Fine Arts at that time, including Ásgrímur Jónsson, who together with Þórarinn created a distinctive portrayal of Iceland's landscape in a romantic naturalistic style. Other landscape artists quickly followed in the footsteps of Þórarinn and Ásgrímur. These included Jóhannes Kjarval and Júlíana Sveinsdóttir. Kjarval in particular is noted for the distinct techniques in the application of paint that he developed in a concerted effort to render the characteristic volcanic rock that dominates the Icelandic environment. Einar Hákonarson is an expressionistic and figurative painter who is considered by some to have brought the figure back into Icelandic painting. In the 1980s, many Icelandic artists engaged with the subject of the new painting in their work.
In recent years, artistic practice has multiplied, and the Icelandic art scene has become a setting for many large-scale projects and exhibitions. The artist-run gallery space Kling og Bang, members of which later ran the studio complex and exhibition venue Klink og Bank, has been a significant part of the trend of self-organised spaces, exhibitions and projects. The Living Art Museum, Reykjavík Art Museum and the National Gallery of Iceland are the larger, more established institutions, curating shows and festivals.
Much Icelandic music is related to Nordic music, and includes folk and pop traditions. Notable Icelandic music acts include medieval music group Voces Thules, alternative and indie rock acts such as The Sugarcubes, Sóley and Of Monsters and Men, jazz fusion band Mezzoforte, pop singers such as Hafdís Huld, Emilíana Torrini and Björk, solo ballad singers like Bubbi Morthens, and post-rock bands such as Amiina and Sigur Rós. Independent music is strong in Iceland, with bands such as múm and solo artists.
Traditional Icelandic music is strongly religious. Hymns, both religious and secular, are a particularly well-developed form of music, due to the scarcity of musical instruments throughout much of Iceland's history. Hallgrímur Pétursson wrote many Protestant hymns in the 17th century. Icelandic music was modernised in the 19th century, when Magnús Stephensen brought pipe organs, which were followed by harmoniums. Other vital traditions of Icelandic music are epic alliterative and rhyming ballads called rímur. Rímur are epic tales, usually a cappella, which can be traced back to skaldic poetry, using complex metaphors and elaborate rhyme schemes. The best known rímur poet of the 19th century was Sigurður Breiðfjörð (1798–1846). A modern revitalisation of the tradition began in 1929 with the formation of Iðunn.
Among Iceland's best-known classical composers are Daníel Bjarnason and Anna S. Þorvaldsdóttir (Anna Thorvaldsdottir), who in 2012 received the Nordic Council Music Prize and in 2015 was chosen as the New York Philharmonic's Kravis Emerging Composer, an honor that includes a $50,000 cash prize and a commission to write a composition for the orchestra; she is the second composer to receive the honor.
The national anthem of Iceland is "Lofsöngur", written by Matthías Jochumsson, with music by Sveinbjörn Sveinbjörnsson.
Iceland's largest television stations are the state-run Sjónvarpið and the privately owned Stöð 2 and SkjárEinn. Smaller stations exist, many of them local. Radio is broadcast throughout the country, including some parts of the interior. The main radio stations are Rás 1, Rás 2, X-ið 977, Bylgjan and FM957. The daily newspapers are Morgunblaðið and Fréttablaðið. The most popular websites are the news sites Vísir and Mbl.is.
Iceland is home to "LazyTown" (Icelandic: "Latibær"), a children's educational musical comedy program created by Magnús Scheving. It has become a very popular programme for children and adults and is shown in over 100 countries, including the Americas, the UK and Sweden. The "LazyTown" studios are located in Garðabær. The 2015 television crime series "Trapped" aired in the UK on BBC4 in February and March 2016 to critical acclaim; the Guardian called it "the unlikeliest TV hit of the year".
In 1992, the Icelandic film industry achieved its greatest recognition to date, when Friðrik Þór Friðriksson was nominated for the Academy Award for Best Foreign Language Film for his "Children of Nature". It features the story of an old man who is unable to continue running his farm. After finding himself unwelcome in his daughter's and son-in-law's house in town, he is put in a home for the elderly. There, he meets an old girlfriend of his youth and they both begin a journey through the wilds of Iceland to die together. This is the only Icelandic movie to have ever been nominated for an Academy Award.
Singer-songwriter Björk received international acclaim for her starring role in the Danish musical drama "Dancer in the Dark", directed by Lars von Trier, in which she plays Selma Ježková, a factory worker who struggles to pay for her son's eye operation. The film premiered at the 2000 Cannes Film Festival, where she won the Best Actress Award. The film also earned Björk a nomination for Best Original Song at the 73rd Academy Awards, for the song "I've Seen It All", and a nomination for the Golden Globe Award for Best Actress in a Motion Picture – Drama.
Guðrún S. Gísladóttir, who is Icelandic, played one of the major roles in Russian filmmaker Andrei Tarkovsky's 1986 film "The Sacrifice". Anita Briem, known for her performance in Showtime's "The Tudors", is also Icelandic. Briem starred in the 2008 film "Journey to the Center of the Earth", which shot scenes in Iceland. The 2002 James Bond movie "Die Another Day" is set in large part in Iceland. Christopher Nolan's 2014 film "Interstellar" was also partly filmed in Iceland, as was Ridley Scott's "Prometheus".
On 17 June 2010, the parliament passed the Icelandic Modern Media Initiative, proposing greater protection of free speech rights and the identity of journalists and whistle-blowers—the strongest journalist protection law in the world. According to a 2011 report by Freedom House, Iceland is one of the highest ranked countries in press freedom.
CCP Games, developer of the critically acclaimed EVE Online and Dust 514, is headquartered in Reykjavík. CCP Games hosts the third most populated MMO in the world, which also has the largest total game area for an online game.
Iceland has a highly developed internet culture, with around 95% of the population having internet access, the highest proportion in the world. Iceland ranked 12th in the World Economic Forum's 2009–2010 Network Readiness Index, which measures a country's ability to competitively exploit communications technology. The United Nations International Telecommunication Union ranks the country 3rd in its development of information and communications technology, having moved up four places between 2008 and 2010. In February 2013, the country's Ministry of the Interior was researching possible methods to protect children with regard to Internet pornography, claiming that pornography online is a threat to children as it supports child slavery and abuse. Strong voices within the community expressed concern about this, stating that it is impossible to block access to pornography without compromising freedom of speech.
Much of Iceland's cuisine is based on fish, lamb, and dairy products, with little to no use of herbs or spices. Due to the island's climate, fruits and vegetables are not generally a component of traditional dishes, although the use of greenhouses has made them more common in contemporary food. Þorramatur is a selection of traditional cuisine consisting of many dishes, and is usually consumed around the month of Þorri, which begins on the first Friday after 19 January. Traditional dishes also include skyr (a yoghurt-like cheese), hákarl (cured shark), cured ram, singed sheep heads, and black pudding, Flatkaka (flat bread), dried fish and dark rye bread traditionally baked in the ground in geothermal areas. Puffin is considered a local delicacy that is often prepared through broiling.
Breakfast usually consists of pancakes, cereal, fruit, and coffee, while lunch may take the form of a smörgåsbord. The main meal of the day for most Icelanders is dinner, which usually involves fish or lamb as the main course. Seafood is central to most Icelandic cooking, particularly cod and haddock but also salmon, herring, and halibut. It is often prepared in a wide variety of ways, either smoked, pickled, boiled, or dried. Lamb is by far the most common meat, and it tends to be either smoke-cured (known as "hangikjöt") or salt-preserved ("saltkjöt"). Many older dishes make use of every part of the sheep, such as "slátur", which consists of offal (internal organs and entrails) minced together with blood and served in sheep stomach. Additionally, boiled or mashed potatoes, pickled cabbage, green beans, and rye bread are prevalent side dishes.
Coffee is a popular beverage in Iceland, the country ranking third worldwide in per capita consumption in 2016; it is drunk at breakfast, after meals, and with a light snack in mid-afternoon. Coca-Cola is also widely consumed, to the extent that the country is said to have one of the highest per capita consumption rates in the world.
Iceland's signature alcoholic beverage is "brennivín" (literally "burnt [i.e., distilled] wine"), which is similar in flavouring to the akvavit variant of Scandinavian brännvin. It is a type of schnapps made from distilled potatoes and flavoured with either caraway seeds or angelica. Its potency has earned it the nickname "svarti dauði" ("Black Death"). Modern distilleries in Iceland produce vodka (Reyka), gin (Ísafold), moss schnapps (Fjallagrasa), and a birch-flavoured schnapps and liqueur (Foss Distillery's Birkir and Björk). Martin Miller blends Icelandic water with its England-distilled gin on the island. Strong beer was banned until 1989, so "bjórlíki", a mixture of legal, low-alcohol pilsner beer and vodka, became popular. Several strong beers are now made by Icelandic breweries.
Sport is an important part of Icelandic culture, as the population is generally quite active. The main traditional sport in Iceland is "Glíma", a form of wrestling thought to have originated in medieval times.
Popular sports include football, track and field, handball and basketball. Handball is often referred to as the national sport. The Icelandic national football team qualified for the 2016 UEFA European football championship for the first time. They recorded a draw against later winners Portugal in the group stage, and defeated England 2–1 in the round of 16, with goals from Ragnar Sigurðsson and Kolbeinn Sigþórsson. They then lost to hosts and later finalists France in the quarter finals. Following up on this, Iceland made its debut at the 2018 FIFA World Cup. For both the European and the world championship, Iceland is to date the smallest nation in terms of population to qualify.
Iceland is also the smallest country ever to qualify for Eurobasket, which it did in both 2015 and 2017. Despite this success in qualifying, Iceland has not managed to win a single game in the final stages of the European championship.
Iceland has excellent conditions for skiing, fishing, snowboarding, ice climbing and rock climbing, although mountain climbing and hiking are preferred by the general public. Iceland is also a world-class destination for alpine ski touring and Telemark skiing, with the Troll Peninsula in Northern Iceland being the main centre of activity. Although the country's environment is generally ill-suited for golf, there are nevertheless many golf courses throughout the island, and Iceland has a greater proportion of its population playing golf than Scotland, with over 17,000 registered golfers out of a population of approximately 300,000. Iceland hosts an annual international golf tournament known as the Arctic Open, played through the night during the summer solstice at Akureyri Golf Club. Iceland has also won the second most World's Strongest Man competitions of any country, with nine titles: four each by Magnús Ver Magnússon and Jón Páll Sigmarsson, and most recently one by Hafþór Júlíus Björnsson in 2018.
Iceland is also one of the leading countries in ocean rowing. Icelandic explorer and endurance athlete Fiann Paul holds the highest number of performance-based Guinness World Records within a single athletic discipline. As of 2020, he is the first and only person to achieve the Ocean Explorers Grand Slam (performing open-water crossings on each of the five oceans using human-powered vessels) and has claimed overall speed Guinness World Records for the fastest rowing of all four oceans (Atlantic, Indian, Pacific and Arctic) in a human-powered row boat. By 2020, he had achieved a total of 41 Guinness World Records, 33 of them performance-based.
Swimming is popular in Iceland. Geothermally heated outdoor pools are widespread, and swimming courses are a mandatory part of the national curriculum. Horseback riding, which was historically the most prevalent form of transportation on the island, remains a common pursuit for many Icelanders.
The oldest sport association in Iceland is the Reykjavík Shooting Association, founded in 1867. Rifle shooting became very popular in the 19th century with the encouragement of politicians and nationalists who were pushing for Icelandic independence. To this day, it remains a significant pastime.
Iceland has also produced many chess masters and hosted the historic World Chess Championship 1972 in Reykjavík during the height of the Cold War. There have been nine Icelandic chess grandmasters, a considerable number given the small size of the population. Bridge is also popular, with Iceland participating in a number of international tournaments. Iceland won the world bridge championship (the Bermuda Bowl) in Yokohama, Japan, in 1991 and took second place (with Sweden) in Hamilton, Bermuda, in 1950.
|
https://en.wikipedia.org/wiki?curid=14531
|
Italy
Italy, officially the Italian Republic, is a country consisting of a peninsula delimited by the Alps and surrounded by several islands. Italy is located in south-central Europe, and it is also considered a part of western Europe. A unitary parliamentary republic with its capital in Rome, the country covers a total area of 301,340 km2 (116,350 sq mi) and shares land borders with France, Switzerland, Austria, Slovenia, and the enclaved microstates of Vatican City and San Marino. Italy has a territorial exclave in Switzerland (Campione) and a maritime exclave in Tunisian waters (Lampedusa). With around 60 million inhabitants, Italy is the third-most populous member state of the European Union.
Due to its central geographic location in Southern Europe and the Mediterranean, Italy has historically been home to myriad peoples and cultures. In addition to the various ancient peoples dispersed throughout what is now modern-day Italy, the most predominant being the Indo-European Italic peoples who gave the peninsula its name, beginning from the classical era, Phoenicians and Carthaginians founded colonies mostly in insular Italy, Greeks established settlements in the so-called "Magna Graecia" of Southern Italy, while Etruscans and Celts inhabited central and northern Italy respectively. An Italic tribe known as the Latins formed the Roman Kingdom in the 8th century BC, which eventually became a republic with a government of the Senate and the People. The Roman Republic initially conquered and assimilated its neighbours on the Italian peninsula, eventually expanding and conquering parts of Europe, North Africa and Asia. By the first century BC, the Roman Empire emerged as the dominant power in the Mediterranean Basin and became a leading cultural, political and religious centre, inaugurating the Pax Romana, a period of more than 200 years during which Italy's law, technology, economy, art, and literature developed. Italy remained the homeland of the Romans and the metropole of the empire, whose legacy can also be observed in the global distribution of culture, governments, Christianity and the Latin script.
During the Early Middle Ages, Italy endured the fall of the Western Roman Empire and barbarian invasions, but by the 11th century numerous rival city-states and maritime republics, mainly in the northern and central regions of Italy, rose to great prosperity through trade, commerce and banking, laying the groundwork for modern capitalism. These mostly independent statelets served as Europe's main trading hubs with Asia and the Near East, often enjoying a greater degree of democracy than the larger feudal monarchies that were consolidating throughout Europe; however, part of central Italy was under the control of the theocratic Papal States, while Southern Italy remained largely feudal until the 19th century, partially as a result of a succession of Byzantine, Arab, Norman, Angevin, Aragonese and other foreign conquests of the region. The Renaissance began in Italy and spread to the rest of Europe, bringing a renewed interest in humanism, science, exploration and art. Italian culture flourished, producing famous scholars, artists and polymaths. During the Middle Ages, Italian explorers discovered new routes to the Far East and the New World, helping to usher in the European Age of Discovery. Nevertheless, Italy's commercial and political power significantly waned with the opening of trade routes that bypassed the Mediterranean. Centuries of rivalry and infighting between the Italian city-states, such as the Italian Wars of the 15th and 16th centuries, left Italy fragmented and several Italian states were conquered and further divided by multiple European powers over the centuries.
By the mid-19th century, rising Italian nationalism and calls for independence from foreign control led to a period of revolutionary political upheaval. After centuries of foreign domination and political division, Italy was almost entirely unified in 1861, establishing the Kingdom of Italy as a great power. From the late 19th century to the early 20th century, Italy rapidly industrialised, mainly in the north, and acquired a colonial empire, while the south remained largely impoverished and excluded from industrialisation, fuelling a large and influential diaspora. Despite being one of the four main allied powers in World War I, Italy entered a period of economic crisis and social turmoil, leading to the rise of the Italian fascist dictatorship in 1922. Participation in World War II on the Axis side ended in military defeat, economic destruction and the Italian Civil War. Following the liberation of Italy, the country abolished the monarchy, established a democratic republic, and enjoyed a prolonged economic boom, becoming a highly developed country.
Today, Italy is considered to be one of the world's most culturally and economically advanced countries, with the world's eighth-largest economy by nominal GDP (third in the European Union), sixth-largest national wealth and third-largest central bank gold reserve. It ranks very highly in life expectancy, quality of life, healthcare, and education. The country plays a prominent role in regional and global economic, military, cultural and diplomatic affairs; it is both a regional power and a great power, and is ranked the world's eighth most-powerful military. Italy is a founding and leading member of the European Union and a member of numerous international institutions, including the UN, NATO, the OECD, the OSCE, the WTO, the G7, the G20, the Union for the Mediterranean, the Council of Europe, Uniting for Consensus, the Schengen Area and many more. The country has long been a global centre of art, music, literature, philosophy, science and technology, and fashion, and has greatly influenced and contributed to diverse fields including cinema, cuisine, sports, jurisprudence, banking and business. As a reflection of its cultural wealth, Italy is home to the world's largest number of World Heritage Sites (55), and is the fifth-most visited country.
Hypotheses for the etymology of the name "Italia" are numerous. One is that it was borrowed via Greek from the Oscan "Víteliú" 'land of calves' ("cf." Lat "vitulus" "calf", Umb "vitlo" "calf"). Greek historian Dionysius of Halicarnassus states this account together with the legend that Italy was named after Italus, mentioned also by Aristotle and Thucydides.
According to Antiochus of Syracuse, the term Italy was used by the Greeks to initially refer only to the southern portion of the Bruttium peninsula corresponding to the modern province of Reggio and part of the provinces of Catanzaro and Vibo Valentia in southern Italy. Nevertheless, by his time the larger concept of Oenotria and "Italy" had become synonymous and the name also applied to most of Lucania as well. According to Strabo's "Geographica", before the expansion of the Roman Republic, the name was used by Greeks to indicate the land between the strait of Messina and the line connecting the gulf of Salerno and gulf of Taranto, corresponding roughly to the current region of Calabria. The Greeks gradually came to apply the name "Italia" to a larger region. In addition to the "Greek Italy" in the south, historians have suggested the existence of an "Etruscan Italy" covering variable areas of central Italy.
The borders of Roman Italy, "Italia", are better established. Cato's "Origines", the first work of history composed in Latin, described Italy as the entire peninsula south of the Alps. According to Cato and several Roman authors, the Alps formed the "walls of Italy". In 264 BC, Roman Italy extended from the Arno and Rubicon rivers of the centre-north to the entire south. The northern area of Cisalpine Gaul was occupied by Rome in the 220s BC and became considered geographically and "de facto" part of Italy, but remained politically and "de jure" separated. It was legally merged into the administrative unit of Italy in 42 BC by the triumvir Octavian as a ratification of Caesar's unpublished acts ("Acta Caesaris"). The islands of Sardinia, Corsica, Sicily and Malta were added to Italy by Diocletian in 292 AD.
Thousands of Paleolithic-era artifacts have been recovered from Monte Poggiolo and dated to around 850,000 years before the present, making them the oldest evidence of hominin habitation in the peninsula.
Excavations throughout Italy have revealed a Neanderthal presence dating back to the Palaeolithic period some 200,000 years ago, while modern humans appeared about 40,000 years ago at Riparo Mochi. Archaeological sites from this period include the Addaura cave, Altamura, Ceprano, and Gravina in Puglia.
The ancient peoples of pre-Roman Italy – such as the Umbrians, the Latins (from whom the Romans emerged), Volsci, Oscans, Samnites, Sabines, the Celts, the Ligures, the Veneti, the Iapygians and many others – were Indo-European peoples, most of them specifically of the Italic group. The main historic peoples of possible non-Indo-European or pre-Indo-European heritage include the Etruscans of central and northern Italy, the Elymians and the Sicani in Sicily, and the prehistoric Sardinians, who gave birth to the Nuragic civilisation. Other ancient populations of undetermined language family and possible non-Indo-European origin include the Rhaetian people and the Camunni, known for their rock carvings in Valcamonica, the largest collections of prehistoric petroglyphs in the world. A well-preserved natural mummy known as Ötzi the Iceman, determined to be 5,000 years old (between 3400 and 3100 BCE, Copper Age), was discovered in the Similaun glacier of South Tyrol in 1991.
The first foreign colonizers were the Phoenicians, who initially established colonies and founded various emporiums on the coasts of Sicily and Sardinia. Some of these soon became small urban centres and were developed parallel to the Greek colonies; among the main centres there were the cities of Motya, Zyz (modern Palermo), Soluntum in Sicily and Nora, Sulci, and Tharros in Sardinia.
Between the 17th and the 11th centuries BC Mycenaean Greeks established contacts with Italy and in the 8th and 7th centuries BC a number of Greek colonies were established all along the coast of Sicily and the southern part of the Italian Peninsula, that became known as Magna Graecia. The Greek colonization placed the Italic peoples in contact with democratic government forms and with elevated artistic and cultural expressions.
Rome, a settlement around a ford on the river Tiber in central Italy conventionally founded in 753 BC, was ruled for a period of 244 years by a monarchical system, initially with sovereigns of Latin and Sabine origin, later by Etruscan kings. The tradition handed down seven kings: Romulus, Numa Pompilius, Tullus Hostilius, Ancus Marcius, Tarquinius Priscus, Servius Tullius and Tarquinius Superbus. In 509 BC, the Romans expelled the last king from their city, favouring a government of the Senate and the People (SPQR) and establishing an oligarchic republic.
The Italian Peninsula, named Italia, was consolidated into a single entity during the Roman expansion and conquest of new lands at the expense of the other Italic tribes, Etruscans, Celts, and Greeks. A permanent association with most of the local tribes and cities was formed, and Rome began the conquest of Western Europe, Northern Africa and the Middle East. In the wake of Julius Caesar's rise and death in the first century BC, Rome grew over the course of centuries into a massive empire stretching from Britain to the borders of Persia, and engulfing the whole Mediterranean basin, in which Greek and Roman and many other cultures merged into a unique civilisation. The long and triumphant reign of the first emperor, Augustus, began a golden age of peace and prosperity. Italy remained the metropole of the empire, and as the homeland of the Romans and the territory of the capital, maintained a special status which made it "not a province, but the "Domina" (ruler) of the provinces". More than two centuries of stability followed, during which Italy was referred to as the "rectrix mundi" (queen of the world) and "omnium terrarum parens" (motherland of all lands).
The Roman Empire was among the most powerful economic, cultural, political and military forces in the world of its time, and it was one of the largest empires in world history. At its height under Trajan, it covered 5 million square kilometres. The Roman legacy has deeply influenced Western civilisation, shaping most of the modern world; among the many legacies of Roman dominance are the widespread use of the Romance languages derived from Latin, the numerical system, the modern Western alphabet and calendar, and the emergence of Christianity as a major world religion. Indo-Roman trade relations, beginning around the 1st century BC, testify to extensive Roman trade in faraway regions; many reminders of the commercial exchange between the Indian subcontinent and Italy have been found, such as the ivory statuette Pompeii Lakshmi from the ruins of Pompeii.
In a slow decline since the third century AD, the Empire split in two in 395 AD. The Western Empire, under the pressure of the barbarian invasions, eventually dissolved in 476 AD when its last emperor, Romulus Augustulus, was deposed by the Germanic chief Odoacer. The Eastern half of the Empire survived for another thousand years.
After the fall of the Western Roman Empire, Italy fell under the power of Odoacer's kingdom and, later, was seized by the Ostrogoths, followed in the 6th century by a brief reconquest under the Byzantine Emperor Justinian. The invasion of another Germanic tribe, the Lombards, late in the same century, reduced the Byzantine presence to the rump realm of the Exarchate of Ravenna and began the end of the political unity of the peninsula for the next 1,300 years. Invasions of the peninsula caused a chaotic succession of barbarian kingdoms and the so-called "dark ages". The Lombard kingdom was subsequently absorbed into the Frankish Empire by Charlemagne in the late 8th century. The Franks also helped the formation of the Papal States in central Italy. Until the 13th century, Italian politics was dominated by the relations between the Holy Roman Emperors and the Papacy, with most of the Italian city-states siding with the former (Ghibellines) or with the latter (Guelphs) out of momentary convenience.
The Germanic Emperor and the Roman Pontiff became the universal powers of medieval Europe. However, the Investiture Controversy (a conflict over two radically different views of whether secular authorities such as kings, counts, or dukes had any legitimate role in appointments to ecclesiastical offices) and the clash between Guelphs and Ghibellines led to the end of the imperial-feudal system in the north of Italy, where city-states gained independence. It was during this chaotic era that Italian towns saw the rise of a peculiar institution, the medieval commune. Given the power vacuum caused by extreme territorial fragmentation and the struggle between the Empire and the Holy See, local communities sought autonomous ways to maintain law and order. The Investiture Controversy was finally resolved by the Concordat of Worms. In 1176 a league of city-states, the Lombard League, defeated the German emperor Frederick Barbarossa at the Battle of Legnano, thus ensuring effective independence for most of the northern and central Italian cities.
Italian city-states such as Milan, Florence and Venice played a crucial innovative role in financial development, devising the main instruments and practices of banking and fostering the emergence of new forms of social and economic organization. In coastal and southern areas, the maritime republics grew to eventually dominate the Mediterranean and monopolise trade routes to the Orient. They were independent thalassocratic city-states, though most of them originated from territories once belonging to the Byzantine Empire. All these cities during the time of their independence had similar systems of government in which the merchant class had considerable power. Although in practice these were oligarchical and bore little resemblance to a modern democracy, the relative political freedom they afforded was conducive to academic and artistic advancement. The four best-known maritime republics were Venice, Genoa, Pisa and Amalfi; the others were Ancona, Gaeta, Noli, and Ragusa. Each of the maritime republics had dominion over different overseas lands, including many Mediterranean islands (especially Sardinia and Corsica), lands on the Adriatic, Aegean, and Black Sea (Crimea), and commercial colonies in the Near East and in North Africa. Venice maintained enormous tracts of land in Greece, Cyprus, Istria and Dalmatia until as late as the mid-17th century.
Venice and Genoa were Europe's main gateways to trade with the East, and Venice was also a producer of fine glass, while Florence was a capital of silk, wool, banking and jewellery. The wealth such business brought to Italy meant that large public and private artistic projects could be commissioned. The republics were heavily involved in the Crusades, providing support and transport, but most especially taking advantage of the political and trading opportunities resulting from these wars. Italy was among the first regions to feel the huge economic changes in Europe that led to the commercial revolution: the Republic of Venice was able to defeat the Byzantine Empire and finance the voyages of Marco Polo to Asia; the first universities were formed in Italian cities, and scholars such as Thomas Aquinas obtained international fame; Frederick of Sicily made Italy the political-cultural centre of a reign that temporarily included the Holy Roman Empire and the Kingdom of Jerusalem; capitalism and banking families emerged in Florence, where Dante and Giotto were active around 1300.
In the south, Sicily had become an Islamic emirate in the 9th century, thriving until the Italo-Normans conquered it in the late 11th century together with most of the Lombard and Byzantine principalities of southern Italy. Through a complex series of events, southern Italy developed as a unified kingdom, first under the House of Hohenstaufen, then under the Capetian House of Anjou and, from the 15th century, the House of Aragon. In Sardinia, the former Byzantine provinces became independent states known in Italian as Judicates, although some parts of the island fell under Genoese or Pisan rule until the eventual Aragonese annexation in the 15th century. The Black Death pandemic of 1348 left its mark on Italy by killing perhaps one third of the population. However, the recovery from the plague led to a resurgence of cities, trade and the economy, which allowed the blooming of humanism and the Renaissance that later spread to Europe.
Italy was the birthplace and heart of the Renaissance during the 1400s and 1500s. The Italian Renaissance marked the transition from the medieval period to the modern age as Europe recovered, economically and culturally, from the crises of the Late Middle Ages and entered the Early Modern Period. The Italian polities were now regional states effectively ruled by princes, "de facto" monarchs in control of trade and administration, whose courts became major centres of the arts and sciences. The Italian princedoms represented a first form of modern states, as opposed to feudal monarchies and multinational empires. The princedoms were led by political dynasties and merchant families such as the Medici in Florence, the Visconti and Sforza in the Duchy of Milan, the Doria in the Republic of Genoa, the Mocenigo and Barbarigo in the Republic of Venice, the Este in Ferrara, and the Gonzaga in Mantua. The Renaissance was therefore a result of the great wealth accumulated by Italian merchant cities combined with the patronage of their dominant families. The Italian Renaissance exercised a dominant influence on subsequent European painting and sculpture for centuries afterwards, with artists such as Leonardo da Vinci, Brunelleschi, Botticelli, Michelangelo, Raphael, Giotto, Donatello, and Titian, and architects such as Filippo Brunelleschi, Leon Battista Alberti, Andrea Palladio, and Donato Bramante.
Following the conclusion of the Western Schism in favour of Rome at the Council of Constance (1415–1417), the new Pope Martin V returned to the Papal States after a three-year journey that touched many Italian cities and restored Italy as the sole centre of Western Christianity. During the course of this voyage, the Medici Bank was made the official credit institution of the Papacy, and several significant ties were established between the Church and the new political dynasties of the peninsula. The Popes' status as elective monarchs turned the conclaves and consistories of the Renaissance into political battles between the courts of Italy for primacy in the peninsula and access to the immense resources of the Catholic Church. In 1439, Pope Eugenius IV and the Byzantine Emperor John VIII Palaiologos signed a reconciliation agreement between the Catholic Church and the Orthodox Church at the Council of Florence, hosted by Cosimo the Elder de' Medici. In 1453, Italian forces under Giovanni Giustiniani were sent by Pope Nicholas V to defend the walls of Constantinople, but the decisive battle was lost to the more advanced Turkish army equipped with cannon, and Byzantium fell to Sultan Mehmed II.
The fall of Constantinople led to the migration of Greek scholars and texts to Italy, fueling the rediscovery of Greco-Roman Humanism. Humanist rulers such as Federico da Montefeltro and Pope Pius II worked to establish ideal cities where "man is the measure of all things", and therefore founded Urbino and Pienza respectively. Pico della Mirandola wrote the "Oration on the Dignity of Man", considered the manifesto of Renaissance Humanism, in which he stressed the importance of free will in human beings. The humanist historian Leonardo Bruni was the first to divide human history into three periods: Antiquity, the Middle Ages and Modernity. The second consequence of the fall of Constantinople was the beginning of the Age of Discovery.
Italian explorers and navigators from the dominant maritime republics, eager to find an alternative route to the Indies in order to bypass the Ottoman Empire, offered their services to monarchs of Atlantic countries and played a key role in ushering in the Age of Discovery and the European colonization of the Americas. The most notable among them were: Christopher Columbus, colonizer in the name of Spain, who is credited with discovering the New World and the opening of the Americas for conquest and settlement by Europeans; John Cabot, sailing for England, who was the first European to set foot in "New Found Land" and explore parts of the North American continent in 1497; Amerigo Vespucci, sailing for Portugal, who first demonstrated in about 1501 that the New World (in particular Brazil) was not Asia as initially conjectured, but a fourth continent previously unknown to people of the Old World (America is named after him); and Giovanni da Verrazzano, in the service of France, renowned as the first European to explore the Atlantic coast of North America between Florida and New Brunswick in 1524.
Following the fall of Constantinople, the wars in Lombardy came to an end and a defensive alliance known as the Italic League was formed between Venice, Naples, Florence, Milan, and the Papacy. Lorenzo "the Magnificent" de' Medici was the greatest Florentine patron of the Renaissance and supporter of the Italic League. He notably averted the collapse of the League in the aftermath of the Pazzi Conspiracy and during the aborted Turkish invasion of Italy. However, the military campaign of Charles VIII of France in Italy caused the end of the Italic League and initiated the Italian Wars between the Valois and the Habsburgs. During the High Renaissance of the 1500s, Italy was therefore both the main European battleground and the cultural-economic centre of the continent. Popes such as Julius II (1503–1513) fought for the control of Italy against foreign monarchs, while others such as Paul III (1534–1549) preferred to mediate between the European powers in order to secure peace in Italy. In the middle of this conflict, the Medici popes Leo X (1513–1521) and Clement VII (1523–1534) opposed the Protestant Reformation and advanced the interests of their family. The end of the wars ultimately left northern Italy indirectly subject to the Austrian Habsburgs and southern Italy under direct Spanish Habsburg rule.
The Papacy remained independent and launched the Counter-Reformation. Key events of the period include: the Council of Trent (1545–1563); the excommunication of Elizabeth I (1570) and the Battle of Lepanto (1571), both occurring during the pontificate of Pius V; the construction of the Gregorian observatory, the adoption of the Gregorian calendar, and the Jesuit China mission of Matteo Ricci under Pope Gregory XIII; the French Wars of Religion; the Long Turkish War and the execution of Giordano Bruno in 1600, under Pope Clement VIII; the birth of the Lyncean Academy of the Papal States, of which the main figure was Galileo Galilei (later put on trial); the final phases of the Thirty Years' War (1618–1648) during the pontificates of Urban VIII and Innocent X; and the formation of the last Holy League by Innocent XI during the Great Turkish War.
The Italian economy declined during the 1600s and 1700s, as the peninsula was excluded from the rising Atlantic slave trade. Following the European wars of succession of the 18th century, the south passed to a cadet branch of the Spanish Bourbons and the north fell under the influence of the Habsburg-Lorraine of Austria. During the Coalition Wars, northern-central Italy was reorganised by Napoleon into a number of Sister Republics of France, and later into a Kingdom of Italy in personal union with the French Empire. The southern half of the peninsula was administered by Joachim Murat, Napoleon's brother-in-law, who was crowned King of Naples. The 1814 Congress of Vienna restored the situation of the late 18th century, but the ideals of the French Revolution could not be eradicated, and soon resurfaced during the political upheavals that characterised the first part of the 19th century.
The birth of the Kingdom of Italy was the result of efforts by Italian nationalists and monarchists loyal to the House of Savoy to establish a united kingdom encompassing the entire Italian Peninsula. Following the Congress of Vienna in 1815, the political and social Italian unification movement, or "Risorgimento", emerged to unite Italy by consolidating the different states of the peninsula and liberating it from foreign control. A prominent radical figure was the patriotic journalist Giuseppe Mazzini, a member of the secret revolutionary society "Carbonari" and founder of the influential political movement Young Italy in the early 1830s, who favoured a unitary republic and advocated a broad nationalist movement. His prolific output of propaganda helped the unification movement stay active.
The most famous member of Young Italy was the revolutionary and general Giuseppe Garibaldi, renowned for his extremely loyal followers, who led the Italian republican drive for unification in Southern Italy. However, the northern Italian monarchy of the House of Savoy in the Kingdom of Sardinia, whose government was led by Camillo Benso, Count of Cavour, also had ambitions of establishing a united Italian state. In the context of the 1848 liberal revolutions that swept through Europe, an unsuccessful first war of independence was declared against Austria. In 1855, the Kingdom of Sardinia became an ally of Britain and France in the Crimean War, giving Cavour's diplomacy legitimacy in the eyes of the great powers. The Kingdom of Sardinia again attacked the Austrian Empire in the Second Italian War of Independence of 1859, with the aid of France, resulting in the liberation of Lombardy.
In 1860–1861, Garibaldi led the drive for unification in Naples and Sicily (the Expedition of the Thousand), while the House of Savoy troops occupied the central territories of the Italian peninsula, except Rome and part of the Papal States. Teano was the site of the famous meeting of 26 October 1860 between Giuseppe Garibaldi and Victor Emmanuel II, last King of Sardinia, in which Garibaldi shook Victor Emmanuel's hand and hailed him as King of Italy; thus, Garibaldi sacrificed republican hopes for the sake of Italian unity under a monarchy. Cavour agreed to incorporate Garibaldi's Southern Italy, allowing it to join the union with the Kingdom of Sardinia in 1860. This allowed the Sardinian government to declare a united Italian kingdom on 17 March 1861. Victor Emmanuel II then became the first king of a united Italy, and the capital was moved from Turin to Florence.
In 1866, Victor Emmanuel II allied with Prussia during the Austro-Prussian War, waging the Third Italian War of Independence, which allowed Italy to annexe Venetia. Finally, in 1870, as France abandoned its garrisons in Rome during the disastrous Franco-Prussian War to keep the large Prussian Army at bay, the Italians rushed to fill the power gap by taking over the Papal States. Italian unification was completed, and shortly afterwards Italy's capital was moved to Rome. Victor Emmanuel, Garibaldi, Cavour and Mazzini have been referred to as Italy's "Four Fathers of the Fatherland".
The new Kingdom of Italy obtained Great Power status. The constitutional law of the Kingdom of Sardinia, the Albertine Statute of 1848, was extended to the whole Kingdom of Italy in 1861 and provided for the basic freedoms of the new state, but electoral laws excluded the non-propertied and uneducated classes from voting. The government of the new kingdom took place in a framework of parliamentary constitutional monarchy dominated by liberal forces. As Northern Italy quickly industrialised, the South and rural areas of the North remained underdeveloped and overpopulated, forcing millions of people to migrate abroad and fuelling a large and influential diaspora. The Italian Socialist Party constantly increased in strength, challenging the traditional liberal and conservative establishment.
Starting from the last two decades of the 19th century, Italy developed into a colonial power by forcing under its rule Eritrea and Somalia in East Africa, Tripolitania and Cyrenaica in North Africa (later unified in the colony of Libya) and the Dodecanese islands. From 2 November 1899 to 7 September 1901, Italy also participated as part of the Eight-Nation Alliance forces during the Boxer Rebellion in China; on 7 September 1901, a concession in Tientsin was ceded to the country, and on 7 June 1902, the concession was taken into Italian possession and administered by a consul. In 1913, universal male suffrage was adopted. The pre-war period, dominated by Giovanni Giolitti, Prime Minister five times between 1892 and 1921, was characterized by the economic, industrial and political-cultural modernization of Italian society.
Italy, nominally allied with the German Empire and the Empire of Austria-Hungary in the Triple Alliance, joined the Allies in 1915 and entered World War I with a promise of substantial territorial gains that included western Inner Carniola, the former Austrian Littoral, Dalmatia and parts of the Ottoman Empire. The country made a fundamental contribution to the Allied victory as one of the "Big Four" top Allied powers. The war was initially inconclusive, as the Italian army got stuck in a long war of attrition in the Alps, making little progress and suffering very heavy losses. However, the reorganization of the army and the conscription of the so-called "'99 Boys" ("Ragazzi del '99", all males born in 1899 who were turning 18) led to more effective Italian victories in major battles, such as on Monte Grappa and in a series of battles on the Piave river. Eventually, in October 1918, the Italians launched a massive offensive culminating in the victory of Vittorio Veneto. The Italian victory marked the end of the war on the Italian Front, secured the dissolution of the Austro-Hungarian Empire and was chiefly instrumental in ending the First World War less than two weeks later.
During the war, more than 650,000 Italian soldiers and as many civilians died, and the kingdom was brought to the brink of bankruptcy. Under the Peace Treaties of Saint-Germain, Rapallo and Rome, Italy gained a permanent seat in the League of Nations' executive council and obtained most of the promised territories, but not Dalmatia (except Zara), allowing nationalists to define the victory as "mutilated". Moreover, Italy annexed the Hungarian harbour of Fiume, which was not among the territories promised at London but had been occupied after the end of the war by Gabriele D'Annunzio.
The socialist agitations that followed the devastation of the Great War, inspired by the Russian Revolution, led to counter-revolution and repression throughout Italy. The liberal establishment, fearing a Soviet-style revolution, started to endorse the small National Fascist Party, led by Benito Mussolini. In October 1922 the Blackshirts of the National Fascist Party attempted a coup known as the "March on Rome"; the coup itself failed, but at the last minute King Victor Emmanuel III refused to proclaim a state of siege and appointed Mussolini prime minister. Over the next few years, Mussolini banned all political parties and curtailed personal liberties, thus forming a dictatorship. These actions attracted international attention and eventually inspired similar dictatorships such as Nazi Germany and Francoist Spain.
In 1935, Mussolini invaded Ethiopia and founded Italian East Africa, resulting in international alienation and leading to Italy's withdrawal from the League of Nations; Italy allied with Nazi Germany and the Empire of Japan and strongly supported Francisco Franco in the Spanish Civil War. In 1939, Italy annexed Albania, which had been a "de facto" protectorate for decades. Italy entered World War II on 10 June 1940. After initially advancing in British Somaliland, Egypt, the Balkans and on the eastern front, the Italians were defeated in East Africa, the Soviet Union and North Africa.
The Armistice of Villa Giusti, which ended fighting between Italy and Austria-Hungary at the end of World War I, resulted in the Italian annexation of neighbouring parts of Yugoslavia. During the interwar period, the fascist Italian government undertook a campaign of Italianisation in the areas it annexed, suppressing Slavic languages, schools, political parties, and cultural institutions. During World War II, Italian war crimes included extrajudicial killings and the ethnic cleansing, by deportation, of about 25,000 people, mainly Jews, Croats, and Slovenians, to Italian concentration camps such as Rab, Gonars, Monigo, Renicci di Anghiari, and others.
In Italy and Yugoslavia, unlike in Germany, few war crimes were prosecuted. Yugoslav Partisans perpetrated their own crimes during and after the war, including the foibe killings. Meanwhile, about 250,000 Italians and anti-communist Slavs fled to Italy in the Istrian exodus.
An Allied invasion of Sicily began in July 1943, leading to the collapse of the Fascist regime and the fall of Mussolini on 25 July. Mussolini was deposed and arrested by order of King Victor Emmanuel III in co-operation with the majority of the members of the Grand Council of Fascism, which passed a motion of no confidence. On 8 September, Italy signed the Armistice of Cassibile, ending its war with the Allies. The Germans, helped by Italian fascists, soon succeeded in taking control of northern and central Italy. The country remained a battlefield for the rest of the war, as the Allies slowly moved up from the south.
In the north, the Germans set up the Italian Social Republic (RSI), a Nazi puppet state with Mussolini installed as leader after he was rescued by German paratroopers. Some Italian troops in the south were organized into the Italian Co-belligerent Army, which fought alongside the Allies for the rest of the war, while other troops, loyal to Mussolini and his RSI, continued to fight alongside the Germans in the National Republican Army. As a result, the country descended into civil war. The post-armistice period also saw the rise of a large anti-fascist resistance movement, the "Resistenza", which fought a guerrilla war against the German and RSI forces. In late April 1945, with total defeat looming, Mussolini attempted to escape north, but was captured and summarily executed near Lake Como by Italian partisans. His body was then taken to Milan, where it was hung upside down at a service station for public viewing and to provide confirmation of his demise. Hostilities ended on 29 April 1945, when the German forces in Italy surrendered. Nearly half a million Italians (including civilians) died in the conflict, and the Italian economy had been all but destroyed; per capita income in 1944 was at its lowest point since the beginning of the 20th century.
Italy became a republic after a referendum held on 2 June 1946, a day celebrated since as Republic Day. This was also the first time that Italian women were entitled to vote. Victor Emmanuel III's son, Umberto II, was forced to abdicate and was exiled. The Republican Constitution came into force on 1 January 1948. Under the Treaty of Peace with Italy of 1947, most of the Julian March was lost to Yugoslavia and, later, the Free Territory of Trieste was divided between the two states. Italy also lost all of its colonial possessions, formally ending the Italian Empire. In 1950, Italian Somaliland was made a United Nations Trust Territory under Italian administration until 1 July 1960.
Fears of a possible Communist takeover (especially in the United States) proved crucial for the outcome of the first elections under universal suffrage on 18 April 1948, when the Christian Democrats, under the leadership of Alcide De Gasperi, obtained a landslide victory. Consequently, in 1949 Italy became a member of NATO. The Marshall Plan helped to revive the Italian economy which, until the late 1960s, enjoyed a period of sustained economic growth commonly called the "Economic Miracle". In 1957, Italy was a founding member of the European Economic Community (EEC), which became the European Union (EU) in 1993.
From the late 1960s until the early 1980s, the country experienced the Years of Lead, a period characterised by economic crisis (especially after the 1973 oil crisis), widespread social conflicts and terrorist massacres carried out by opposing extremist groups, with the alleged involvement of US and Soviet intelligence. The Years of Lead culminated in the assassination of the Christian Democrat leader Aldo Moro in 1978 and the Bologna railway station massacre in 1980, where 85 people died.
In the 1980s, for the first time since 1945, two governments were led by non-Christian-Democrat premiers: one republican (Giovanni Spadolini) and one socialist (Bettino Craxi); the Christian Democrats remained, however, the main government party. During Craxi's government, the economy recovered and Italy became the world's fifth largest industrial nation, after it gained entry into the Group of Seven in the 1970s. However, as a result of his spending policies, the Italian national debt skyrocketed during the Craxi era, soon passing 100% of the country's GDP.
Italy faced several terror attacks between 1992 and 1993, perpetrated by the Sicilian Mafia as a consequence of several life sentences pronounced during the "Maxi Trial" and of the new anti-mafia measures launched by the government. In 1992, two major dynamite attacks killed the judges Giovanni Falcone (23 May, in the Capaci bombing) and Paolo Borsellino (19 July, in the Via D'Amelio bombing). One year later (May–July 1993), tourist spots were attacked, such as the Via dei Georgofili in Florence, Via Palestro in Milan, and the Piazza San Giovanni in Laterano and Via San Teodoro in Rome, leaving 10 dead and 93 injured and causing severe damage to cultural heritage such as the Uffizi Gallery. The Catholic Church openly condemned the Mafia, and two churches were bombed and an anti-Mafia priest shot dead in Rome. Also in the early 1990s, Italy faced significant challenges, as voters – disenchanted with political paralysis, massive public debt and the extensive corruption system (known as "Tangentopoli") uncovered by the Clean Hands ("Mani Pulite") investigation – demanded radical reforms. The scandals involved all major parties, but especially those in the government coalition: the Christian Democrats, who had ruled for almost 50 years, underwent a severe crisis and eventually disbanded, splitting into several factions. The Communists reorganised as a social-democratic force. During the 1990s and the 2000s, centre-right coalitions (dominated by media magnate Silvio Berlusconi) and centre-left coalitions (led by university professor Romano Prodi) alternately governed the country.
Amidst the Great Recession, Berlusconi resigned in 2011, and his conservative government was replaced by the technocratic cabinet of Mario Monti. Following the 2013 general election, the Vice-Secretary of the Democratic Party, Enrico Letta, formed a new government at the head of a right-left grand coalition. In 2014, challenged by the new Secretary of the PD, Matteo Renzi, Letta resigned and was replaced by Renzi. The new government started important constitutional reforms such as the abolition of the Senate and a new electoral law. On 4 December 2016, the constitutional reform was rejected in a referendum and Renzi resigned; the Foreign Affairs Minister, Paolo Gentiloni, was appointed the new Prime Minister.
In the European migrant crisis of the 2010s, Italy was the entry point and leading destination for most asylum seekers entering the EU. From 2013 to 2018, the country took in over 700,000 migrants and refugees, mainly from sub-Saharan Africa, which put great strain on the public purse and caused a surge in support for far-right or eurosceptic political parties. The 2018 general election was characterized by a strong showing of the Five Star Movement and the League, and the university professor Giuseppe Conte became Prime Minister at the head of a populist coalition between these two parties. However, after only fourteen months the League withdrew its support from Conte, who then formed a new, unprecedented coalition government between the Five Star Movement and the centre-left.
In 2020, Italy was severely hit by the COVID-19 pandemic. From March to May, Conte's government imposed a national quarantine as a measure to limit the spread of the virus. The measures, despite being widely approved by public opinion, were also described as the largest suppression of constitutional rights in the history of the republic. Italy had one of the highest total numbers of deaths in the worldwide pandemic, which also caused severe economic disruption, with Italy among the most affected countries.
Italy is located in Southern Europe (it is also considered a part of Western Europe) between latitudes 35° and 47° N, and longitudes 6° and 19° E. To the north, Italy borders France, Switzerland, Austria, and Slovenia and is roughly delimited by the Alpine watershed, enclosing the Po Valley and the Venetian Plain. To the south, it consists of the entirety of the Italian Peninsula and the two Mediterranean islands of Sicily and Sardinia (the two biggest islands of the Mediterranean), in addition to many smaller islands. The sovereign states of San Marino and the Vatican City are enclaves within Italy, while Campione d'Italia is an Italian exclave in Switzerland.
The country's total area is 301,340 km², of which 294,140 km² is land and 7,200 km² is water. Including the islands, Italy has a coastline and border of 7,600 km on the Adriatic, Ionian and Tyrrhenian seas, and borders shared with France, Austria, Slovenia and Switzerland; San Marino and Vatican City, both enclaves, account for the remainder.
Over 35% of the Italian territory is mountainous. The Apennine Mountains form the peninsula's backbone, and the Alps form most of its northern boundary, where Italy's highest point is located on Mont Blanc (Monte Bianco) at 4,810 m. Other well-known mountains in Italy include the Matterhorn (Monte Cervino), Monte Rosa and Gran Paradiso in the western Alps, and Bernina, Stelvio and the Dolomites along the eastern side.
The Po, Italy's longest river at 652 km, flows from the Alps on the western border with France and crosses the Padan plain on its way to the Adriatic Sea. The Po Valley is the largest plain in Italy and represents over 70% of the total plain area in the country.
Many elements of the Italian territory are of volcanic origin. Most of the small islands and archipelagos in the south, like Capraia, Ponza, Ischia, the Aeolian Islands (Eolie), Ustica and Pantelleria, are volcanic islands.
There are also active volcanoes: Mount Etna in Sicily (the largest active volcano in Europe), Vulcano, Stromboli, and Vesuvius (the only active volcano on mainland Europe).
The five largest lakes are, in order of diminishing size: Garda, Maggiore (whose minor northern part is in Switzerland), Como, Trasimeno and Bolsena.
Although the country includes the Italian peninsula, adjacent islands, and most of the southern Alpine basin, some of Italy's territory extends beyond the Alpine basin and some islands are located outside the Eurasian continental shelf. These territories are the "comuni" of: Livigno, Sexten, Innichen, Toblach (in part), Chiusaforte, Tarvisio, Graun im Vinschgau (in part), which are all part of the Danube's drainage basin, while the Val di Lei constitutes part of the Rhine's basin and the islands of Lampedusa and Lampione are on the African continental shelf.
Four different seas of the Mediterranean surround the Italian Peninsula on three sides: the Adriatic Sea to the east, the Ionian Sea to the south, and the Ligurian and Tyrrhenian Seas to the west.
Most of the rivers of Italy drain either into the Adriatic Sea, such as the Po, Piave, Adige, Brenta, Tagliamento, and Reno, or into the Tyrrhenian, like the Arno, Tiber and Volturno. The waters from some border municipalities (Livigno in Lombardy, Innichen and Sexten in Trentino-Alto Adige/Südtirol) drain into the Black Sea through the basin of the Drava, a tributary of the Danube, and the waters from the Lago di Lei in Lombardy drain into the North Sea through the basin of the Rhine.
In the north of the country are a number of subalpine moraine-dammed lakes, the largest of which is Garda. Other well-known subalpine lakes are Lake Maggiore, whose most northerly section is part of Switzerland, Como, one of the deepest lakes in Europe, Orta, Lugano, Iseo, and Idro. Other notable lakes in the Italian peninsula are Trasimeno, Bolsena, Bracciano, Vico, Varano and Lesina in Gargano, and Omodeo in Sardinia.
The country is situated at the meeting point of the Eurasian Plate and the African Plate, leading to considerable seismic and volcanic activity. There are 14 volcanoes in Italy, four of which are active: Etna, Stromboli, Vulcano and Vesuvius. The last is the only active volcano in mainland Europe and is most famous for the destruction of Pompeii and Herculaneum in the eruption of 79 AD. Several islands and hills have been created by volcanic activity, and there is still a large active caldera, the Campi Flegrei, north-west of Naples.
Italy's high Neogene volcanic and magmatic activity is subdivided into several volcanic provinces.
Italy was the first country to exploit geothermal energy to produce electricity. The high geothermal gradient of part of the peninsula makes other provinces potentially exploitable as well: research carried out in the 1960s and 1970s identified potential geothermal fields in Lazio and Tuscany, as well as on most volcanic islands.
After its quick industrial growth, Italy took a long time to confront its environmental problems. After several improvements, it now ranks 84th in the world for ecological sustainability. National parks cover about 5% of the country. In the last decade, Italy has become one of the world's leading producers of renewable energy, ranking as the world's fourth largest holder of installed solar energy capacity and the sixth largest holder of wind power capacity in 2010. Renewable energies now make up about 12% of the total primary and final energy consumption in Italy, with a future target share set at 17% for the year 2020.
However, air pollution remains a severe problem, especially in the industrialised north; in the 1990s Italy reached the tenth highest level of industrial carbon dioxide emissions in the world, and it is the twelfth largest producer of carbon dioxide.
Extensive traffic and congestion in the largest metropolitan areas continue to cause severe environmental and health issues, even though smog levels have decreased dramatically since the 1970s and 1980s; smog is becoming an increasingly rare phenomenon and sulphur dioxide levels are declining.
Many watercourses and coastal stretches have also been contaminated by industrial and agricultural activity, while, because of rising water levels, Venice has been regularly flooded in recent years. Waste from industrial activity is not always disposed of by legal means and has led to permanent health effects on inhabitants of affected areas, as in the case of the Seveso disaster. The country also operated several nuclear reactors between 1963 and 1990, but after the Chernobyl disaster and a referendum on the issue, the nuclear programme was terminated. The government overturned that decision in 2008, planning to build up to four nuclear power plants with French technology; this plan was in turn struck down by a referendum following the Fukushima nuclear accident.
Deforestation, illegal building developments and poor land-management policies have led to significant erosion all over Italy's mountainous regions, causing major ecological disasters such as the 1963 Vajont Dam flood and the 1998 Sarno and 2009 Messina mudslides.
Italy has the highest level of faunal biodiversity in Europe, with over 57,000 species recorded, representing more than a third of all European fauna. The Italian peninsula is in the centre of the Mediterranean Sea, forming a corridor between central Europe and North Africa, and has a long coastline; Italy also receives species from the Balkans, Eurasia and the Middle East. Its varied geological structure, including the Alps and the Apennines, the central Italian woodlands, and the southern Italian garigue and maquis shrubland, contributes to high climate and habitat diversity.
Italian fauna includes 4,777 endemic animal species, among them the Sardinian long-eared bat, Sardinian red deer, spectacled salamander, brown cave salamander, Italian newt, Italian frog, Apennine yellow-bellied toad, Aeolian wall lizard, Sicilian wall lizard, Italian Aesculapian snake, and Sicilian pond turtle. There are 102 mammal species (most notably the Italian wolf, Marsican brown bear, Pyrenean chamois, Alpine ibex, crested porcupine, Mediterranean monk seal, Alpine marmot, Etruscan shrew, and European snow vole), 516 bird species and 56,213 invertebrate species.
The flora of Italy was traditionally estimated to comprise about 5,500 vascular plant species. However, 6,759 species are recorded in the "Data bank of Italian vascular flora". Italy is a signatory to the Berne Convention on the Conservation of European Wildlife and Natural Habitats and the Habitats Directive, both affording protection to Italian fauna and flora.
Because of the great longitudinal extension of the peninsula and the mostly mountainous internal conformation, the climate of Italy is highly diverse. In most of the inland northern and central regions, the climate ranges from humid subtropical to humid continental and oceanic. In particular, the climate of the Po valley geographical region is mostly continental, with harsh winters and hot summers.
The coastal areas of Liguria, Tuscany and most of the South generally fit the Mediterranean climate stereotype (Köppen climate classification Csa). Conditions on peninsular coastal areas can be very different from the interior's higher ground and valleys, particularly during the winter months, when the higher altitudes tend to be cold, wet, and often snowy. The coastal regions have mild winters and warm and generally dry summers, although lowland valleys can be quite hot in summer. Average winter temperatures vary from 0 °C in the Alps to 12 °C in Sicily, while average summer temperatures range from 20 °C to over 25 °C.
Italy has been a unitary parliamentary republic since 2 June 1946, when the monarchy was abolished by a constitutional referendum. The President of Italy ("Presidente della Repubblica"), currently Sergio Mattarella since 2015, is Italy's head of state. The President is elected for a seven-year mandate by the Parliament of Italy and a number of regional delegates in joint session. Italy has a written democratic constitution, resulting from the work of a Constituent Assembly formed by the representatives of all the anti-fascist forces that contributed to the defeat of Nazi and Fascist forces during the Italian Civil War.
Italy has a parliamentary government based on a mixed proportional and majoritarian voting system. The parliament is perfectly bicameral: the two houses, the Chamber of Deputies that meets in Palazzo Montecitorio and the Senate of the Republic that meets in Palazzo Madama, have the same powers. The Prime Minister, officially President of the Council of Ministers ("Presidente del Consiglio dei Ministri"), is Italy's head of government. The Prime Minister and the cabinet are appointed by the President of the Republic and must pass a vote of confidence in Parliament to come into office. To remain in office, the Prime Minister must also win any subsequent votes of confidence or survive motions of no confidence in Parliament.
The prime minister is the President of the Council of Ministers – which holds effective executive power – and must receive a vote of approval from it to execute most political activities. The office is similar to those in most other parliamentary systems, but the leader of the Italian government is not authorised to request the dissolution of the Parliament of Italy.
Another difference from similar offices is that overall political responsibility for intelligence is vested in the President of the Council of Ministers. By virtue of that, the Prime Minister has exclusive power to co-ordinate intelligence policies, determining the financial resources and strengthening national cyber security; to apply and protect state secrets; and to authorise agents to carry out operations, in Italy or abroad, in violation of the law.
A peculiarity of the Italian Parliament is the representation given to Italian citizens permanently living abroad: 12 Deputies and 6 Senators elected in four distinct overseas constituencies. In addition, the Italian Senate is also characterised by a small number of senators for life, appointed by the President "for outstanding patriotic merits in the social, scientific, artistic or literary field". Former Presidents of the Republic are "ex officio" life senators.
Italy's three major political parties are the Five Star Movement, the Democratic Party and the Lega. During the 2018 general election, these three parties won 614 of the 630 seats available in the Chamber of Deputies and 309 of the 315 in the Senate. Berlusconi's Forza Italia, which formed a centre-right coalition with Matteo Salvini's Northern League and Giorgia Meloni's Brothers of Italy, won most of the seats without getting a majority in parliament. The rest of the seats were taken by the Five Star Movement; by Matteo Renzi's Democratic Party, along with Achammer and Panizza's South Tyrolean People's Party & Trentino Tyrolean Autonomist Party, in a centre-left coalition; and by the independent Free and Equal party.
The Italian judicial system is based on Roman law modified by the Napoleonic code and later statutes. The Supreme Court of Cassation is the highest court in Italy for both criminal and civil appeal cases. The Constitutional Court of Italy ("Corte Costituzionale") rules on the conformity of laws with the constitution and is a post–World War II innovation. Since their appearance in the middle of the 19th century, Italian organised crime and criminal organisations have infiltrated the social and economic life of many regions in Southern Italy, the most notorious being the Sicilian Mafia, which would later expand into some foreign countries, including the United States. Mafia receipts may reach 9% of Italy's GDP.
A 2009 report identified 610 "comuni" which have a strong Mafia presence, where 13 million Italians live and 14.6% of the Italian GDP is produced. The Calabrian 'Ndrangheta, nowadays probably the most powerful crime syndicate in Italy, accounts alone for 3% of the country's GDP. However, at 0.013 per 1,000 people, Italy has only the 47th highest murder rate compared to 61 countries, and the 43rd highest number of rapes per 1,000 people compared to 64 countries in the world. These are relatively low figures among developed countries.
The Italian law enforcement system is complex, with multiple police forces. The national policing agencies are the Polizia di Stato (State Police), the Arma dei Carabinieri, the Guardia di Finanza (Financial Guard), and the Polizia Penitenziaria (Prison Police), as well as the Guardia Costiera (coast guard police).
The "Polizia di Stato" are a civil police supervised by the Interior Ministry, while the "Carabinieri" is a gendarmerie supervised by the Defense Ministry; both share duties in law enforcement and the maintenance of public order. Within the Carabinieri is a unit devoted to combating environmental crime. The "Guardia di Finanza" is responsible for combating financial crime and white-collar crime, as well as customs. The "Polizia Penitenziaria" are responsible for guarding the prison system. The Corpo Forestale dello Stato (State Forestry Corps) formerly existed as a separate national park ranger agency, but was merged into the Carabinieri in 2016. Although policing in Italy is primarily provided on a national basis, there also exists "Polizia Provinciale" (provincial police) and "Polizia Municipale" (municipal police).
Italy is a founding member of the European Economic Community (EEC), now the European Union (EU), and of NATO. Italy was admitted to the United Nations in 1955, and it is a member and strong supporter of a wide range of international organisations, such as the Organisation for Economic Co-operation and Development (OECD), the General Agreement on Tariffs and Trade/World Trade Organization (GATT/WTO), the Organization for Security and Co-operation in Europe (OSCE), the Council of Europe, and the Central European Initiative. Its recent or upcoming turns in the rotating presidency of international organisations include the Organization for Security and Co-operation in Europe in 2018, the G7 in 2017 and the EU Council from July to December 2014. Italy is also a recurrent non-permanent member of the UN Security Council, most recently in 2017.
Italy strongly supports multilateral international politics, endorsing the United Nations and its international security activities. In recent years, Italy has deployed 5,296 troops abroad, engaged in 33 UN and NATO missions in 25 countries of the world. Italy deployed troops in support of UN peacekeeping missions in Somalia, Mozambique, and East Timor, and provides support for NATO and UN operations in Bosnia, Kosovo and Albania. Italy deployed over 2,000 troops in Afghanistan in support of Operation Enduring Freedom (OEF) from February 2003.
Italy supported international efforts to reconstruct and stabilise Iraq, but it had withdrawn its military contingent of some 3,200 troops by 2006, maintaining only humanitarian operators and other civilian personnel.
In August 2006 Italy deployed about 2,450 troops in Lebanon for the United Nations' peacekeeping mission UNIFIL. Italy is one of the largest financiers of the Palestinian National Authority, contributing €60 million in 2013 alone.
The Italian Army, Navy, Air Force and Carabinieri collectively form the Italian Armed Forces, under the command of the Supreme Defence Council, presided over by the President of Italy. Since 2005, military service has been voluntary. In 2010, the Italian military had 293,202 personnel on active duty, of which 114,778 were Carabinieri. Total Italian military spending in 2010 ranked tenth in the world, standing at $35.8 billion, equal to 1.7% of national GDP. As part of NATO's nuclear sharing strategy, Italy also hosts 90 United States B61 nuclear bombs, located at the Ghedi and Aviano air bases.
The Italian Army is the national ground defence force, numbering 109,703 in 2008. Its best-known combat vehicles are the Dardo infantry fighting vehicle, the Centauro tank destroyer and the Ariete tank, and among its aircraft, the Mangusta attack helicopter, which in recent years has been deployed in EU, NATO and UN missions. It also has at its disposal many Leopard 1 and M113 armoured vehicles.
The Italian Navy in 2008 had 35,200 active personnel with 85 commissioned ships and 123 aircraft. It is a blue-water navy. In modern times the Italian Navy, being a member of the EU and NATO, has taken part in many coalition peacekeeping operations around the world.
The Italian Air Force in 2008 had a strength of 43,882 and operated 585 aircraft, including 219 combat jets and 114 helicopters. A transport capability is guaranteed by a fleet of 27 C-130Js and C-27J Spartans.
An autonomous corps of the military, the Carabinieri are the gendarmerie and military police of Italy, policing the military and civilian population alongside Italy's other police forces. While the different branches of the Carabinieri report to separate ministries for each of their individual functions, the corps reports to the Ministry of Internal Affairs when maintaining public order and security.
Italy is constituted by 20 regions ("regioni"), five of which have a special autonomous status that enables them to enact legislation on additional matters; 107 provinces ("province") or metropolitan cities ("città metropolitane"); and 7,960 municipalities ("comuni").
Italy has a major advanced capitalist mixed economy, ranking as the third-largest in the Eurozone and the eighth-largest in the world. A founding member of the G7, the Eurozone and the OECD, it is regarded as one of the world's most industrialised nations and a leading country in world trade and exports. It is a highly developed country, with the world's 8th highest quality of life in 2005 and ranking 26th in the Human Development Index. The country is well known for its creative and innovative business, a large and competitive agricultural sector (with the world's largest wine production), and its influential and high-quality automobile, machinery, food, design and fashion industries.
Italy is the world's sixth largest manufacturing country, characterised by a smaller number of global multinational corporations than other economies of comparable size and by many dynamic small and medium-sized enterprises, famously clustered in several industrial districts, which are the backbone of Italian industry. This has produced a manufacturing sector often focused on the export of niche market and luxury products which, while less able to compete on quantity, is better able to face competition from China and other emerging Asian economies based on lower labour costs, thanks to higher-quality products. Italy was the world's 7th largest exporter in 2016. Its closest trade ties are with the other countries of the European Union, with whom it conducts about 59% of its total trade. Its largest EU trade partners, in order of market share, are Germany (12.9%), France (11.4%), and Spain (7.4%).
The automotive industry is a significant part of the Italian manufacturing sector, with over 144,000 firms and almost 485,000 employed people in 2015, and a contribution of 8.5% to Italian GDP. Fiat Chrysler Automobiles (FCA) is currently the world's seventh-largest auto maker. The country boasts a wide range of acclaimed products, from very compact city cars to luxury supercars produced by Maserati, Lamborghini, and Ferrari, which was rated the world's most powerful brand by Brand Finance.
Italy is part of the European single market, which represents more than 500 million consumers. Several domestic commercial policies are determined by agreements among European Union (EU) members and by EU legislation. Italy introduced the common European currency, the Euro, in 2002. It is a member of the Eurozone, which represents around 330 million citizens. Its monetary policy is set by the European Central Bank.
Italy was hit hard by the financial crisis of 2007–08, which exacerbated the country's structural problems. After strong GDP growth of 5–6% per year from the 1950s to the early 1970s and a progressive slowdown in the 1980s and 1990s, the country virtually stagnated in the 2000s. The political efforts to revive growth with massive government spending eventually produced a severe rise in public debt, which stood at over 131.8% of GDP in 2017, the second highest ratio in the EU after Greece's. Nevertheless, the largest share of Italian public debt is held by national subjects, a major difference between Italy and Greece, and the level of household debt is much lower than the OECD average.
A gaping North–South divide is a major factor of socio-economic weakness, evident in the huge difference in statistical income between the northern and southern regions and municipalities. The richest province, Alto Adige-South Tyrol, earns 152% of the national GDP per capita, while the poorest region, Calabria, earns 61%. The unemployment rate (11.1%) stands slightly above the Eurozone average, but the disaggregated figure is 6.6% in the North and 19.2% in the South. The youth unemployment rate (31.7% in March 2018) is extremely high compared to EU standards.
Italy has a strong cooperative sector, with the largest share of the population (4.5%) employed by a cooperative in the EU.
According to the last national agricultural census, there were 1.6 million farms in 2010 (−32.4% since 2000) covering 12.7 million hectares (63% of which are located in Southern Italy). The vast majority (99%) are family-operated and small, averaging only 8 hectares in size. Of the total surface area in agricultural use (forestry excluded), grain fields take up 31%, olive tree orchards 8.2%, vineyards 5.4%, citrus orchards 3.8%, sugar beets 1.7%, and horticulture 2.4%. The remainder is primarily dedicated to pastures (25.9%) and feed grains (11.6%).
Italy is the world's largest wine producer, and one of the leading producers of olive oil, fruits (apples, olives, grapes, oranges, lemons, pears, apricots, hazelnuts, peaches, cherries, plums, strawberries and kiwifruits) and vegetables (especially artichokes and tomatoes). The most famous Italian wines are probably the Tuscan Chianti and the Piedmontese Barolo. Other famous wines are Barbaresco, Barbera d'Asti, Brunello di Montalcino, Frascati, Montepulciano d'Abruzzo, Morellino di Scansano, and the sparkling wines Franciacorta and Prosecco.
Quality goods in which Italy specialises, particularly the already mentioned wines and regional cheeses, are often protected under the quality assurance labels DOC/DOP. This geographical indication certificate, which is attributed by the European Union, is considered important in order to avoid confusion with low-quality mass-produced ersatz products.
In 2004 the transport sector in Italy generated a turnover of about 119.4 billion euros, employing 935,700 persons in 153,700 enterprises. Regarding the national road network, in 2002 there were 668,721 km of serviceable roads in Italy, including 6,487 km of motorways, state-owned but privately operated mainly by Atlantia. In 2005, about 34,667,000 passenger cars (590 cars per 1,000 people) and 4,015,000 goods vehicles circulated on the national road network.
The national railway network, state-owned and operated by Rete Ferroviaria Italiana (RFI), part of Ferrovie dello Stato Italiane (FSI), totalled 16,529 km in 2008, of which 11,727 km is electrified, and on which 4,802 locomotives and railcars run. The main public operator of high-speed trains is Trenitalia, also part of FSI. Higher-speed trains are divided into three categories: Frecciarossa ("red arrow") trains operate at a maximum speed of 300 km/h on dedicated high-speed tracks; Frecciargento ("silver arrow") trains operate at a maximum speed of 250 km/h on both high-speed and mainline tracks; and Frecciabianca ("white arrow") trains operate on high-speed regional lines at a maximum speed of 200 km/h. Italy has 11 rail border crossings over the Alpine mountains with its neighbouring countries.
Italy is one of the countries with the most vehicles per capita, with 690 per 1,000 people in 2010. The national inland waterways network comprised navigable rivers and channels for various types of commercial traffic in 2012.
Italy's largest airline is Alitalia, which serves 97 destinations (as of October 2019) and also operates a regional subsidiary under the Alitalia CityLiner brand. The country also has regional airlines (such as Air Dolomiti), low-cost carriers, and charter and leisure carriers (including Neos, Blue Panorama Airlines and Poste Air Cargo). Major Italian cargo operators are Alitalia Cargo and Cargolux Italia.
Italy is fifth in Europe by number of air transport passengers, with about 148 million passengers or about 10% of the European total in 2011. In 2012 there were 130 airports in Italy, including the two hubs of Malpensa International in Milan and Leonardo da Vinci International in Rome. In 2004 there were 43 major seaports, including the seaport of Genoa, the country's largest and the second-largest in the Mediterranean Sea. In 2005 Italy maintained a civilian air fleet of about 389,000 units and a merchant fleet of 581 ships.
Italy does not invest enough to maintain its drinking water supply. The Galli Law, passed in 1993, aimed to raise the level of investment and improve service quality by consolidating service providers, making them more efficient and increasing the level of cost recovery through tariff revenues. Despite these reforms, investment levels have declined and remain far from sufficient.
Eni, with operations in 79 countries, is one of the seven "Supermajor" oil companies in the world, and one of the world's largest industrial companies.
Moderate natural gas reserves, mainly in the Po Valley and offshore in the Adriatic Sea, have been discovered in recent years and constitute the country's most important mineral resource. Italy is one of the world's leading producers of pumice, pozzolana, and feldspar. Another notable mineral resource is marble, especially the world-famous white Carrara marble from the Massa and Carrara quarries in Tuscany. Italy imports about 80% of its energy requirements.
In the last decade, Italy has become one of the world's largest producers of renewable energy, ranking as the second-largest producer in the European Union and the ninth in the world. Wind power, hydroelectricity, and geothermal power are also important sources of electricity in the country. Renewable sources account for 27.5% of all electricity produced in Italy, with hydro alone reaching 12.6%, followed by solar at 5.7%, wind at 4.1%, bioenergy at 3.5%, and geothermal at 1.6%. The rest of the national demand is covered by fossil fuels (38.2% natural gas, 13% coal, 8.4% oil) and by imports.
Solar energy production alone accounted for almost 9% of the total electric production in the country in 2014, making Italy the country with the highest contribution from solar energy in the world. The Montalto di Castro Photovoltaic Power Station, completed in 2010, is the largest photovoltaic power station in Italy at 85 MW. Other examples of large PV plants in Italy are San Bellino (70.6 MW), Cellino San Marco (42.7 MW) and Sant'Alberto (34.6 MW). Italy was also the first country to exploit geothermal energy to produce electricity.
Italy operated four nuclear reactors until the 1980s, but nuclear power was abandoned following a 1987 referendum held in the wake of the 1986 Chernobyl disaster in Soviet Ukraine. The national power company Enel operates several nuclear reactors in Spain, Slovakia and France, giving it access to nuclear power and direct involvement in the design, construction, and operation of the plants without placing reactors on Italian territory.
Through the centuries, Italy has fostered a scientific community that produced many major discoveries in physics and the other sciences. During the Renaissance, Italian polymaths such as Leonardo da Vinci (1452–1519), Michelangelo (1475–1564) and Leon Battista Alberti (1404–1472) made important contributions to a variety of fields, including biology, architecture, and engineering. Galileo Galilei (1564–1642), a physicist, mathematician and astronomer, played a major role in the Scientific Revolution. His achievements include key improvements to the telescope and consequent astronomical observations, and ultimately the triumph of Copernicanism over the Ptolemaic model.
Other astronomers such as Giovanni Domenico Cassini (1625–1712) and Giovanni Schiaparelli (1835–1910) made many important discoveries about the Solar System. In mathematics, Joseph Louis Lagrange (born Giuseppe Lodovico Lagrangia, 1736–1813) was active before leaving Italy, while Fibonacci (c. 1170 – c. 1250) and Gerolamo Cardano (1501–1576) made fundamental advances in mathematics, and Luca Pacioli introduced modern accounting to the world. Physicist Enrico Fermi (1901–1954), a Nobel prize laureate, led the team in Chicago that developed the first nuclear reactor; he is also noted for his many other contributions to physics, including the co-development of quantum theory, and was one of the key figures in the creation of nuclear weapons. He, Emilio G. Segrè (1905–1989, who discovered the elements technetium and astatine, and the antiproton), Bruno Rossi (1905–1993, a pioneer in cosmic rays and X-ray astronomy) and a number of other Italian physicists were forced to leave Italy in the 1930s by Fascist laws against Jews.
Other prominent physicists and inventors include Amedeo Avogadro (most noted for his contributions to molecular theory, in particular Avogadro's law and the Avogadro constant), Evangelista Torricelli (inventor of the barometer), Alessandro Volta (inventor of the electric battery), Guglielmo Marconi (inventor of the radio), Galileo Ferraris and Antonio Pacinotti (pioneers of the induction motor), Alessandro Cruto (a pioneer of the light bulb), Innocenzo Manzetti (an eclectic pioneer of the automobile and robotics), Ettore Majorana (who theorised the Majorana fermions), and Carlo Rubbia (1984 Nobel Prize in Physics for work leading to the discovery of the W and Z particles at CERN). Antonio Meucci is known for developing a voice-communication device which is often credited as the first telephone. Pier Giorgio Perotto in 1964 designed one of the first desktop programmable calculators, the Programma 101. In biology, Francesco Redi was the first to challenge the theory of spontaneous generation by demonstrating that maggots come from the eggs of flies, and he described 180 parasites in detail; Marcello Malpighi founded microscopic anatomy; Lazzaro Spallanzani conducted important research into bodily functions, animal reproduction, and cellular theory; Camillo Golgi, whose many achievements include the discovery of the Golgi complex, paved the way for the acceptance of the neuron doctrine; and Rita Levi-Montalcini discovered the nerve growth factor (awarded the 1986 Nobel Prize in Physiology or Medicine). In chemistry, Giulio Natta received the Nobel Prize in Chemistry in 1963 for his work on high polymers. Giuseppe Occhialini received the Wolf Prize in Physics for the discovery of the pion or pi-meson decay in 1947. Ennio de Giorgi, a Wolf Prize in Mathematics recipient in 1990, solved Bernstein's problem about minimal surfaces and the 19th Hilbert problem on the regularity of solutions of elliptic partial differential equations.
Italy is the fifth most visited country in the world, with a total of 52.3 million international arrivals in 2016. The total contribution of travel and tourism to GDP (including wider effects from investment, the supply chain and induced income impacts) was €162.7 billion in 2014 (10.1% of GDP), and the sector directly generated 1,082,000 jobs in 2014 (4.8% of total employment).
Italy is well known for its cultural and environmental tourist routes and is home to 55 UNESCO World Heritage Sites, the most in the world. Rome is the 3rd most visited city in Europe and the 12th in the world, with 9.4 million arrivals in 2017, while Milan is the 27th worldwide with 6.8 million tourists. Venice and Florence are also among the world's top 100 destinations.
At the beginning of 2020, Italy had 60,317,116 inhabitants, giving a population density higher than that of most Western European countries. However, the distribution of the population is widely uneven. The most densely populated areas are the Po Valley (which accounts for almost half of the national population) and the metropolitan areas of Rome and Naples, while vast regions such as the Alps and Apennines highlands, the plateaus of Basilicata and the island of Sardinia are very sparsely populated.
The population of Italy almost doubled during the 20th century, but the pattern of growth was extremely uneven because of large-scale internal migration from the rural South to the industrial cities of the North, a phenomenon which happened as a consequence of the Italian economic miracle of the 1950s and 1960s. High fertility and birth rates persisted until the 1970s, after which they started to decline. The population rapidly aged; by 2010, one in five Italians was over 65 years old, and the country currently has the fifth-oldest population in the world, with a median age of 45.8 years. In recent years, however, Italy has experienced a modest recovery in birth rates: the total fertility rate climbed from an all-time low of 1.18 children per woman in 1995 to 1.41 in 2008, albeit still below the replacement rate of 2.1 and considerably below the high of 5.06 children born per woman in 1883. The total fertility rate is nevertheless expected to reach 1.6–1.8 by 2030.
From the late 19th century until the 1960s, Italy was a country of mass emigration. Between 1898 and 1914, the peak years of the Italian diaspora, approximately 750,000 Italians emigrated each year. The diaspora involved more than 25 million Italians and is considered the biggest mass migration of contemporary times. As a result, today more than 4.1 million Italian citizens live abroad, while at least 60 million people of full or part Italian ancestry live outside Italy, most notably in Argentina, Brazil, Uruguay, Venezuela, the United States, Canada, Australia and France.
In 2016, Italy had about 5.05 million foreign residents, making up 8.3% of the total population. The figures include more than half a million children born in Italy to foreign nationals (second-generation immigrants) but exclude foreign nationals who have subsequently acquired Italian citizenship; in 2016, about 201,000 people became Italian citizens, compared to 130,000 in 2014. The official figures also exclude illegal immigrants, who were estimated to number at least 670,000 as of 2008.
Starting in the early 1980s, Italy, until then a linguistically and culturally homogeneous society, began to attract substantial flows of foreign immigrants. After the fall of the Berlin Wall and, more recently, the 2004 and 2007 enlargements of the European Union, large waves of migration originated from the former socialist countries of Eastern Europe (especially Romania, Albania, Ukraine and Poland). An equally important source of immigration is neighbouring North Africa (in particular, Morocco, Egypt and Tunisia), with arrivals soaring as a consequence of the Arab Spring. Furthermore, in recent years, growing migration flows from the Asia-Pacific region (notably China and the Philippines) and Latin America have been recorded.
Currently, about one million Romanian citizens (around 10% of them ethnic Romani people) are officially registered as living in Italy, making Romania the most important single country of origin, followed by Albania and Morocco with about 500,000 people each. The number of unregistered Romanians is difficult to estimate, but the Balkan Investigative Reporting Network suggested in 2007 that there might have been half a million or more.
As of 2010, the foreign-born population of Italy came from the following regions: Europe (54%), Africa (22%), Asia (16%), the Americas (8%) and Oceania (0.06%). The distribution of immigrants is largely uneven: 87% live in the northern and central parts of the country (the most economically developed areas), while only 13% live in the southern half.
Italy's official language is Italian, as stated by framework law no. 482/1999 and by Trentino-Alto Adige's special statute, which was adopted with a constitutional law. There are an estimated 64 million native Italian speakers and another 21 million who use it as a second language. Italian is often natively spoken in a regional variety, not to be confused with Italy's regional and minority languages; however, the establishment of a national education system led to a decrease in variation in the languages spoken across the country during the 20th century. Standardisation was further expanded in the 1950s and 1960s thanks to economic growth and the rise of mass media and television (the state broadcaster RAI helped set a standard Italian).
Twelve "historical minority languages" ("minoranze linguistiche storiche") are formally recognised: Albanian, Catalan, German, Greek, Slovene, Croatian, French, Franco-Provençal, Friulian, Ladin, Occitan and Sardinian. Four of these also enjoy a co-official status in their respective region: French in the Aosta Valley; German in South Tyrol, and Ladin as well in some parts of the same province and in parts of the neighbouring Trentino; and Slovene in the provinces of Trieste, Gorizia and Udine. A number of other Ethnologue, ISO and UNESCO languages are not recognised by Italian law. Like France, Italy has signed the European Charter for Regional or Minority Languages, but has not ratified it.
Because of recent immigration, Italy has sizeable populations whose native language is neither Italian nor a regional language. According to the Italian National Institute of Statistics, Romanian is the most common mother tongue among foreign residents in Italy: almost 800,000 people speak Romanian as their first language (21.9% of foreign residents aged 6 and over). Other prevalent mother tongues are Arabic (spoken by over 475,000 people; 13.1% of foreign residents), Albanian (380,000 people) and Spanish (255,000 people).
In 2017, the proportion of Italians who identified as Roman Catholic Christians was 74.4%. Since 1985, Roman Catholicism has no longer officially been the state religion.
The Holy See, the episcopal jurisdiction of Rome, contains the central government of the Roman Catholic Church. It is recognised by other subjects of international law as a sovereign entity with which diplomatic relations can be maintained, headed by the Pope, who is also the Bishop of Rome. Often incorrectly referred to as "the Vatican", the Holy See is not the same entity as the Vatican City State, which came into existence only in 1929.
In 2011, minority Christian faiths in Italy included an estimated 1.5 million Orthodox Christians (2.5% of the population); 500,000 Pentecostals and Evangelicals (of whom 400,000 are members of the Assemblies of God); 251,192 Jehovah's Witnesses; 30,000 Waldensians; 25,000 Seventh-day Adventists; 26,925 Latter-day Saints; 15,000 Baptists (plus some 5,000 Free Baptists); 7,000 Lutherans; and 4,000 Methodists (affiliated with the Waldensian Church).
One of the longest-established minority religious faiths in Italy is Judaism, Jews having been present in Ancient Rome since before the birth of Christ. Italy has for centuries welcomed Jews expelled from other countries, notably Spain. However, about 20% of Italian Jews were killed during the Holocaust. This, together with the emigration that preceded and followed World War II, has left only around 28,400 Jews in Italy.
Soaring immigration in the last two decades has been accompanied by an increase in non-Christian faiths. There are more than 800,000 followers of faiths originating in the Indian subcontinent, including some 70,000 Sikhs with 22 gurdwaras across the country.
The Italian state, as a measure to protect religious freedom, devolves shares of income tax to recognised religious communities, under a regime known as Eight per thousand. Donations are allowed to Christian, Jewish, Buddhist and Hindu communities; however, Islam remains excluded, since no Muslim communities have yet signed a concordat with the Italian state. Taxpayers who do not wish to fund a religion contribute their share to the state welfare system.
Education in Italy is free and mandatory from ages six to sixteen, and consists of five stages: kindergarten ("scuola dell'infanzia"), primary school ("scuola primaria"), lower secondary school ("scuola secondaria di primo grado"), upper secondary school ("scuola secondaria di secondo grado") and university ("università").
Primary education lasts eight years. Students are given a basic education in Italian, English, mathematics, natural sciences, history, geography, social studies, physical education and visual and musical arts. Secondary education lasts for five years and includes three traditional types of schools focused on different academic levels: the "liceo" prepares students for university studies with a classical or scientific curriculum, while the "istituto tecnico" and the "istituto professionale" prepare pupils for vocational education. In 2012, Italian secondary education was evaluated as slightly below the OECD average, with a strong and steady improvement in science and mathematics results since 2003; however, a wide gap exists between northern schools, which performed significantly better than the national average (among the best in the world in some subjects), and schools in the South, which had much poorer results.
Tertiary education in Italy is divided between public universities, private universities and the prestigious and selective superior graduate schools, such as the Scuola Normale Superiore di Pisa. 33 Italian universities were ranked among the world's top 500 in 2019, the third-largest number in Europe after the United Kingdom and Germany. Bologna University, founded in 1088, is the oldest university in continuous operation, as well as one of the leading academic institutions in Italy and Europe. The Bocconi University, Università Cattolica del Sacro Cuore, LUISS, Polytechnic University of Turin, Polytechnic University of Milan, Sapienza University of Rome, and University of Milan are also ranked among the best in the world.
The Italian state has run a universal public healthcare system since 1978. However, healthcare is provided to all citizens and residents by a mixed public-private system. The public part is the "Servizio Sanitario Nazionale", which is organised under the Ministry of Health and administered on a devolved regional basis. Healthcare spending in Italy accounted for 9.2% of the national GDP in 2012, very close to the OECD countries' average of 9.3%. In 2000, Italy was ranked as having the world's 2nd best healthcare system and the world's 2nd best healthcare performance.
Life expectancy in Italy is 80 years for males and 85 for females, placing the country 5th in the world for life expectancy. In comparison to other Western countries, Italy has a relatively low rate of adult obesity (below 10%), owing in part to the health benefits of the Mediterranean diet. The proportion of daily smokers was 22% in 2012, down from 24.4% in 2000 but still slightly above the OECD average. Smoking in public places including bars, restaurants, night clubs and offices has been restricted to specially ventilated rooms since 2005. In 2013, UNESCO added the Mediterranean diet of Italy (as promoter), Morocco, Spain, Portugal, Greece, Cyprus and Croatia to the Representative List of the Intangible Cultural Heritage of Humanity.
Divided by politics and geography for centuries until its eventual unification in 1861, Italy's culture has been shaped by a multitude of regional customs and local centres of power and patronage. Italy has had a central role in Western culture for centuries and is still recognised for its cultural traditions and artists. During the Middle Ages and the Renaissance, a number of magnificent courts competed to attract the best architects, artists and scholars, thus producing a great legacy of monuments, paintings, music and literature. Despite the political and social isolation of these courts, Italy's contribution to the cultural and historical heritage of Europe and the world remains immense.
Italy has more UNESCO World Heritage Sites (55) than any other country in the world, and has rich collections of art, culture and literature from many periods. The country has had a broad cultural influence worldwide, also because numerous Italians emigrated to other places during the Italian diaspora. Furthermore, Italy has, overall, an estimated 100,000 monuments of any sort (museums, palaces, buildings, statues, churches, art galleries, villas, fountains, historic houses and archaeological remains), and according to some estimates the nation is home to half the world's great art treasures.
Italy is known for its considerable architectural achievements, such as the construction of arches, domes and similar structures during ancient Rome, the founding of the Renaissance architectural movement in the late 14th to 16th centuries, and being the homeland of Palladianism, a style of construction which inspired movements such as Neoclassical architecture and influenced the designs of the country houses that noblemen built all over the world, notably in the UK, Australia and the US, during the late 17th to early 20th centuries.
Along with prehistoric architecture, the first people in Italy to truly begin a sequence of designs were the Greeks and the Etruscans, progressing to classical Roman, then to the revival of the classical Roman era during the Renaissance and evolving into the Baroque era. The Christian concept of the basilica, a style of church architecture that came to dominate the early Middle Ages, was invented in Rome. Basilicas were long, rectangular buildings, built in an almost ancient Roman style, often rich in mosaics and decorations. The early Christians' art and architecture was also widely inspired by that of the pagan Romans; statues, mosaics and paintings decorated all their churches. The first significant buildings in the medieval Romanesque style were churches built in Italy during the 800s. Byzantine architecture was also widely diffused in Italy. The Byzantines kept Roman principles of architecture and art alive, and the most famous structure from this period is the Basilica of St. Mark in Venice.
The Romanesque movement, which lasted from approximately 800 AD to 1100 AD, was one of the most fruitful and creative periods in Italian architecture, when several masterpieces, such as the Leaning Tower of Pisa in the Piazza dei Miracoli and the Basilica of Sant'Ambrogio in Milan, were built. It was known for its use of Roman arches, stained glass windows, and the curved columns that commonly featured in cloisters. The main innovation of Italian Romanesque architecture was the vault, which had never been seen before in the history of Western architecture.
The greatest flowering of Italian architecture took place during the Renaissance. Filippo Brunelleschi made great contributions to architectural design with his dome for the Cathedral of Florence, a feat of engineering that had not been accomplished since antiquity. A celebrated achievement of Italian Renaissance architecture was St. Peter's Basilica, originally designed by Donato Bramante in the early 16th century. Andrea Palladio influenced architects throughout western Europe with the villas and palaces he designed in the middle and late 16th century; the city of Vicenza, with its twenty-three buildings designed by Palladio, and twenty-four Palladian villas of the Veneto are listed by UNESCO as part of a World Heritage Site named City of Vicenza and the Palladian Villas of the Veneto.
The Baroque period produced several outstanding Italian architects in the 17th century, especially known for their churches. The most original work of all late Baroque and Rococo architecture is the Palazzina di caccia di Stupinigi, dating back to the 18th century. Luigi Vanvitelli began the construction of the Royal Palace of Caserta in 1752; in this large complex, the grandiose Baroque-style interiors and gardens contrast with a more sober building envelope. In the late 18th and early 19th centuries Italy was affected by the Neoclassical architectural movement, when everything from villas, palaces and gardens to interiors and art began to be based on Roman and Greek themes.
During the Fascist period, the so-called "Novecento movement" flourished, based on the rediscovery of imperial Rome, with figures such as Gio Ponti and Giovanni Muzio. Marcello Piacentini, responsible for the urban transformations of several cities in Italy and remembered for the disputed Via della Conciliazione in Rome, devised a form of simplified Neoclassicism.
The history of Italian visual arts is significant to the history of Western painting. Roman art was influenced by Greece and can in part be taken as a descendant of ancient Greek painting, though Roman painting does have its own unique characteristics. The only surviving Roman paintings are wall paintings, many from villas in Campania, in Southern Italy. Such paintings can be grouped into four main "styles" or periods and may contain the first examples of trompe-l'œil, pseudo-perspective, and pure landscape.
Panel painting became more common during the Romanesque period, under the heavy influence of Byzantine icons. Towards the middle of the 13th century, Medieval art and Gothic painting became more realistic, with the beginnings of interest in the depiction of volume and perspective in Italy with Cimabue and then his pupil Giotto. From Giotto onwards, the treatment of composition by the best painters also became much more free and innovative. Cimabue and Giotto are considered the two great medieval masters of painting in Western culture.
The Italian Renaissance is said by many to be the golden age of painting, roughly spanning the 14th through the mid-17th centuries, with a significant influence beyond the borders of modern Italy. In Italy artists like Paolo Uccello, Fra Angelico, Masaccio, Piero della Francesca, Andrea Mantegna, Filippo Lippi, Giorgione, Tintoretto, Sandro Botticelli, Leonardo da Vinci, Michelangelo Buonarroti, Raphael, Giovanni Bellini, and Titian took painting to a higher level through the use of perspective, the study of human anatomy and proportion, and through their development of an unprecedented refinement in drawing and painting techniques. Michelangelo was an active sculptor from about 1500 to 1520; his great masterpieces include "David", the "Pietà" and "Moses". Other prominent Renaissance sculptors include Lorenzo Ghiberti, Luca Della Robbia, Donatello, Filippo Brunelleschi and Andrea del Verrocchio.
In the 15th and 16th centuries, the High Renaissance gave rise to a stylised art known as Mannerism. In place of the balanced compositions and rational approach to perspective that characterised art at the dawn of the 16th century, the Mannerists sought instability, artifice, and doubt. The unperturbed faces and gestures of Piero della Francesca and the calm Virgins of Raphael are replaced by the troubled expressions of Pontormo and the emotional intensity of El Greco.
In the 17th century, the greatest painters of the Italian Baroque included Caravaggio, Annibale Carracci, Artemisia Gentileschi, Mattia Preti, Carlo Saraceni and Bartolomeo Manfredi. Subsequently, in the 18th century, Italian Rococo was mainly inspired by French Rococo, since France was the founding nation of that particular style, with artists such as Giovanni Battista Tiepolo and Canaletto. Italian Neoclassical sculpture focused, with Antonio Canova's nudes, on the idealist aspect of the movement.
In the 19th century, major Italian Romantic painters were Francesco Hayez, Giuseppe Bezzuoli and Francesco Podesti. Impressionism was brought from France to Italy by the "Macchiaioli", led by Giovanni Fattori, and by Giovanni Boldini; Realism by Gioacchino Toma and Giuseppe Pellizza da Volpedo. In the 20th century, with Futurism, primarily through the works of Umberto Boccioni and Giacomo Balla, Italy rose again as a seminal country for artistic evolution in painting and sculpture. Futurism was succeeded by the metaphysical paintings of Giorgio de Chirico, who exerted a strong influence on the Surrealists and on generations of artists to follow, like Bruno Caruso and Renato Guttuso.
Formal Latin literature began in 240 BC, when the first stage play was performed in Rome. Latin literature was, and still is, highly influential in the world, with numerous writers, poets, philosophers, and historians, such as Pliny the Elder, Pliny the Younger, Virgil, Horace, Propertius, Ovid and Livy. The Romans were also famous for their oral tradition, poetry, drama and epigrams. In the early years of the 13th century, St. Francis of Assisi was considered the first Italian poet by literary critics, with his religious song "Canticle of the Sun".
Another Italian voice originated in Sicily. At the court of Emperor Frederick II, who ruled the Sicilian kingdom during the first half of the 13th century, lyrics modelled on Provençal forms and themes were written in a refined version of the local vernacular. The most important of these poets was the notary Giacomo da Lentini, inventor of the sonnet form, though the most famous early sonneteer was Petrarch.
Guido Guinizelli is considered the founder of the "Dolce Stil Novo", a school that added a philosophical dimension to traditional love poetry. This new understanding of love, expressed in a smooth, pure style, influenced Guido Cavalcanti and the Florentine poet Dante Alighieri, who established the basis of the modern Italian language; his greatest work, the "Divine Comedy", is considered among the foremost literary statements produced in Europe during the Middle Ages; furthermore, the poet invented the difficult "terza rima". The two great writers of the 14th century, Petrarch and Giovanni Boccaccio, sought out and imitated the works of antiquity and cultivated their own artistic personalities. Petrarch achieved fame through his collection of poems, "Il Canzoniere". Petrarch's love poetry served as a model for centuries. Equally influential was Boccaccio's "The Decameron", one of the most popular collections of short stories ever written.
Italian Renaissance authors produced a number of important works. Niccolò Machiavelli's "The Prince" is one of the world's most famous essays on political science and modern philosophy, in which the "effectual truth" is taken to be more important than any abstract ideal. Another important work of the period, Ludovico Ariosto's "Orlando Furioso", a continuation of Matteo Maria Boiardo's unfinished romance "Orlando Innamorato", is perhaps the greatest chivalric poem ever written. Baldassare Castiglione's dialogue "The Book of the Courtier" describes the ideal of the perfect court gentleman and of spiritual beauty. The lyric poet Torquato Tasso in "Jerusalem Delivered" wrote a Christian epic, making use of the "ottava rima", with attention to the Aristotelian canons of unity.
Giovanni Francesco Straparola and Giambattista Basile, who wrote "The Facetious Nights of Straparola" (1550–1555) and the "Pentamerone" (1634) respectively, printed some of the first known versions of fairy tales in Europe. In the early 17th century, some literary masterpieces were created, such as Giambattista Marino's long mythological poem "L'Adone". The Baroque period also produced the clear scientific prose of Galileo as well as Tommaso Campanella's "The City of the Sun", a description of a perfect society ruled by a philosopher-priest. At the end of the 17th century, the Arcadians began a movement to restore simplicity and classical restraint to poetry, as in Metastasio's heroic melodramas. In the 18th century, playwright Carlo Goldoni created fully written plays, many portraying the middle class of his day.
Romanticism coincided with some ideas of the "Risorgimento", the patriotic movement that brought Italy political unity and freedom from foreign domination. Italian writers embraced Romanticism in the early 19th century. The time of Italy's rebirth was heralded by the poets Vittorio Alfieri, Ugo Foscolo, and Giacomo Leopardi. The works of Alessandro Manzoni, the leading Italian Romantic, are a symbol of the Italian unification for their patriotic message and because of his efforts in the development of the modern, unified Italian language; his novel "The Betrothed" was the first Italian historical novel to glorify Christian values of justice and Providence, and it has been called the most famous and widely read novel in the Italian language.
In the late 19th century, a realistic literary movement called "Verismo" played a major role in Italian literature; Giovanni Verga and Luigi Capuana were its main exponents. In the same period, Emilio Salgari, writer of action-adventure swashbucklers and a pioneer of science fiction, published his "Sandokan" series. In 1883, Carlo Collodi also published the novel "The Adventures of Pinocchio", the most celebrated children's classic by an Italian author and the most translated non-religious book in the world. A movement called Futurism influenced Italian literature in the early 20th century: Filippo Tommaso Marinetti wrote the "Manifesto of Futurism", which called for the use of language and metaphors that glorified the speed, dynamism, and violence of the machine age.
Modern literary figures include Gabriele D'Annunzio, whose major works span 1889 to 1910, short-story writer Italo Calvino and novelist Umberto Eco, as well as the Nobel laureates in Literature: nationalist poet Giosuè Carducci (1906), realist writer Grazia Deledda (1926), modern theatre author Luigi Pirandello (1934), poets Salvatore Quasimodo (1959) and Eugenio Montale (1975), and satirist and theatre author Dario Fo (1997).
Over the ages, Italian philosophy and literature have had a vast influence on Western philosophy, beginning with the Greeks and Romans and going on to Renaissance humanism, the Age of Enlightenment and modern philosophy. Philosophy was brought to Italy by Pythagoras, founder of the Italian school of philosophy in Crotone. Major philosophers of the Greek period in Italy include Xenophanes, Parmenides, Zeno, Empedocles and Gorgias. Roman philosophers include Cicero, Lucretius, Seneca the Younger, Musonius Rufus, Plutarch, Epictetus, Marcus Aurelius, Clement of Alexandria, Sextus Empiricus, Alexander of Aphrodisias, Plotinus, Porphyry, Iamblichus, Augustine of Hippo, Philoponus of Alexandria and Boethius.
Italian Medieval philosophy was mainly Christian, and included several important philosophers and theologians such as St Thomas Aquinas, the foremost classical proponent of natural theology and the father of Thomism, who reintroduced Aristotelian philosophy to Christianity. Notable Renaissance philosophers include: Giordano Bruno, one of the major scientific figures of the western world; Marsilio Ficino, one of the most influential humanist philosophers of the period; and Niccolò Machiavelli, one of the main founders of modern political science. Machiavelli's most famous work was "The Prince", whose contribution to the history of political thought is the fundamental break between political realism and political idealism. Italy was also affected by the Enlightenment, a movement which was a consequence of the Renaissance. Cities with important universities such as Padua, Bologna and Naples remained great centres of scholarship and the intellect, with several philosophers such as Giambattista Vico (who is widely regarded as being the founder of modern Italian philosophy) and Antonio Genovesi. Cesare Beccaria was also one of the greatest Italian Enlightenment writers and is now considered one of the fathers of classical criminal theory as well as modern penology. Beccaria is famous for his "On Crimes and Punishments" (1764), a treatise that served as one of the earliest prominent condemnations of torture and the death penalty and thus a landmark work in anti-death penalty philosophy.
Italy also had renowned philosophical movements in the 1800s, with Idealism, Sensism and Empiricism. The main Sensist Italian philosophers were Melchiorre Gioja and Gian Domenico Romagnosi. Criticism of the Sensist movement came from other philosophers such as Pasquale Galluppi (1770–1846), who affirmed that "a priori" relationships were synthetic. Antonio Rosmini, by contrast, was the founder of Italian Idealism. During the late 19th and 20th centuries, there were also several other movements which gained some form of popularity in Italy, such as Ontologism (whose main philosopher was Vincenzo Gioberti), anarchism, communism, socialism, futurism, fascism and Christian democracy. Giovanni Gentile and Benedetto Croce were two of the most significant 20th-century Idealist philosophers. Anarcho-communism first fully formed into its modern strain within the Italian section of the First International. Antonio Gramsci remains an important philosopher within Marxist and communist theory, credited with creating the theory of cultural hegemony. Italian philosophers were also influential in the development of the non-Marxist liberal socialism philosophy, including Carlo Rosselli, Norberto Bobbio, Piero Gobetti and Aldo Capitini. In the 1960s, many Italian left-wing activists adopted the anti-authoritarian, pro-working-class leftist theories that would become known as autonomism and "operaismo".
Early and important Italian feminists include Sibilla Aleramo, Alaide Gualberta Beccari, and Anna Maria Mozzoni, though proto-feminist philosophies had previously been touched upon by earlier Italian writers such as Christine de Pizan, Moderata Fonte, and Lucrezia Marinella. Italian physician and educator Maria Montessori is credited with the creation of the philosophy of education that bears her name, an educational philosophy now practiced throughout the world. Giuseppe Peano was one of the founders of analytic philosophy and contemporary philosophy of mathematics. Recent analytic philosophers include Carlo Penco, Gloria Origgi, Pieranna Garavaso and Luciano Floridi.
Italian theatre can be traced back to the Roman tradition. The theatre of ancient Rome was a thriving and diverse art form, ranging from festival performances of street theatre, nude dancing, and acrobatics, to the staging of Plautus's broadly appealing situation comedies, to the high-style, verbally elaborate tragedies of Seneca. Although Rome had a native tradition of performance, the Hellenization of Roman culture in the 3rd century BCE had a profound and energising effect on Roman theatre and encouraged the development of Latin literature of the highest quality for the stage. As with many other literary genres, Roman dramatists were heavily influenced by the Greeks and tended to adapt from them; for example, Seneca's "Phaedra" was based on that of Euripides, and many of the comedies of Plautus were direct translations of works by Menander.
During the 16th century and on into the 18th century, Commedia dell'arte was a form of improvisational theatre, and it is still performed today. Travelling troupes of players would set up an outdoor stage and provide amusement in the form of juggling, acrobatics and, more typically, humorous plays based on a repertoire of established characters and a rough storyline, called "canovaccio". Plays did not originate from written drama but from these scenarios, loose frameworks that provided the situations, complications, and outcome of the action, around which the actors would improvise, often interspersing stock comic routines known as "lazzi". The characters of the "commedia" usually represent fixed social types and stock characters, each of which has a distinct costume, such as foolish old men, devious servants, or military officers full of false bravado. The main categories of these characters include servants, old men, lovers, and captains.
Carlo Goldoni, who wrote a few scenarios starting in 1734, superseded the comedy of masks and the comedy of intrigue with representations of actual life and manners through his characters and their behaviour. He rightly maintained that Italian life and manners were susceptible to artistic treatment such as had not been given them before.
The Teatro di San Carlo in Naples is the oldest continuously active venue for public opera in the world, having opened in 1737, decades before both Milan's La Scala and Venice's La Fenice theatres.
From folk music to classical, music has always played an important role in Italian culture. Instruments associated with classical music, including the piano and violin, were invented in Italy, and many of the prevailing classical music forms, such as the symphony, concerto, and sonata, can trace their roots back to innovations of 16th- and 17th-century Italian music.
Italy's most famous composers include the Renaissance composers Palestrina, Monteverdi and Gesualdo, the Baroque composers Scarlatti, Corelli and Vivaldi, the Classical composers Paisiello, Paganini and Rossini, and the Romantic composers Verdi and Puccini. Modern Italian composers such as Berio and Nono proved significant in the development of experimental and electronic music. While the classical music tradition still holds strong in Italy, as evidenced by the fame of its innumerable opera houses, such as "La Scala" of Milan and "San Carlo" of Naples (the oldest continuously active venue for public opera in the world), and performers such as the pianist Maurizio Pollini and tenor Luciano Pavarotti, Italians have been no less appreciative of their thriving contemporary music scene.
Italy is widely known as the birthplace of opera. Italian opera is believed to have been founded in the early 17th century, in cities such as Mantua and Venice. Later, works and pieces composed by native Italian composers of the 19th and early 20th centuries, such as Rossini, Bellini, Donizetti, Verdi and Puccini, became some of the most famous operas ever written, and today they are performed in opera houses across the world. La Scala opera house in Milan is also renowned as one of the best in the world. Famous Italian opera singers include Enrico Caruso and Alessandro Bonci.
Introduced in the early 1920s, jazz gained a particularly strong foothold in Italy and remained popular despite the xenophobic cultural policies of the Fascist regime. Today, the most notable centres of jazz music in Italy include Milan, Rome, and Sicily. Later, Italy was at the forefront of the progressive rock and pop movement of the 1970s, with bands like PFM, Banco del Mutuo Soccorso, Le Orme, Goblin, and Pooh. The same period saw diversification in the cinema of Italy, and Cinecittà films included complex scores by composers including Ennio Morricone, Armando Trovaioli, Piero Piccioni and Piero Umiliani. In the early 1980s, the first star to emerge from the Italian hip hop scene was singer Jovanotti. Popular Italian metal bands such as Rhapsody of Fire, Lacuna Coil, Elvenking, Forgotten Tomb, and Fleshgod Apocalypse are also seen as pioneers of various heavy metal subgenres.
Italy was also an important country in the development of disco and electronic music. Italo disco, known for its futuristic sound and prominent use of synthesisers and drum machines, was one of the earliest electronic dance genres and a distinctly European form of disco aside from Euro disco; it later went on to influence several genres such as Eurodance and Nu-disco. By around 1988, the genre had merged into other forms of European dance and electronic music, such as Italo house, which blended elements of Italo disco with traditional house music; its sound was generally uplifting and made strong use of piano melodies. Some bands of this genre are Black Box, East Side Beat, and 49ers. By the latter half of the 1990s, a subgenre of Eurodance known as Italo dance emerged. Taking influences from Italo disco and Italo house, Italo dance generally included synthesizer riffs, a melodic sound, and the use of vocoders. Notable Italian DJs and remixers include Gabry Ponte (a member of the group Eiffel 65), Benny Benassi, Gigi D'Agostino, and the trio Tacabro.
Producers such as Giorgio Moroder, who won three Academy Awards and four Golden Globes for his music, were highly influential in the development of electronic dance music. Today, Italian pop music is represented annually by the Sanremo Music Festival, which served as inspiration for the Eurovision Song Contest, and the Festival of Two Worlds in Spoleto. Singers such as Mina, Andrea Bocelli, Grammy winner Laura Pausini, Zucchero, Eros Ramazzotti and Tiziano Ferro have attained international acclaim.
The history of Italian cinema began a few months after the Lumière brothers began motion picture exhibitions. The first Italian film, a few seconds long, showed Pope Leo XIII giving a blessing to the camera. The Italian film industry was born between 1903 and 1908 with three companies: the Società Italiana Cines, the Ambrosio Film and the Itala Film. Other companies soon followed in Milan and in Naples. In a short time these first companies reached a fair production quality, and films were soon sold outside Italy. Cinema was later used by Benito Mussolini, who founded Rome's renowned Cinecittà studio for the production of Fascist propaganda until World War II.
After the war, Italian film was widely recognised and exported until an artistic decline around the 1980s. Notable Italian film directors from this period include Vittorio De Sica, Federico Fellini, Sergio Leone, Pier Paolo Pasolini, Luchino Visconti, Michelangelo Antonioni and Roberto Rossellini; some of these are recognised among the greatest and most influential filmmakers of all time. Movies include world cinema treasures such as "Bicycle Thieves", "La dolce vita", "8½", "The Good, the Bad and the Ugly" and "Once Upon a Time in the West". The mid-1940s to the early 1950s was the heyday of neorealist films, reflecting the poor condition of post-war Italy.
As the country grew wealthier in the 1950s, a form of neorealism known as pink neorealism succeeded it, and other film genres, such as sword-and-sandal films and later spaghetti westerns, were popular in the 1960s and 1970s. Actresses such as Sophia Loren, Giulietta Masina and Gina Lollobrigida achieved international stardom during this period. Erotic Italian thrillers, or "giallos", produced by directors such as Mario Bava and Dario Argento in the 1970s, also influenced the horror genre worldwide. In recent years, the Italian scene has received only occasional international attention, with movies like "Life Is Beautiful" directed by Roberto Benigni, "Il Postino: The Postman" with Massimo Troisi and "The Great Beauty" directed by Paolo Sorrentino.
The aforementioned Cinecittà studio is today the largest film and television production facility in continental Europe and the centre of Italian cinema, where many of the biggest box office hits are filmed; it hosts one of the biggest production communities in the world. In the 1950s, the number of international productions being made there led to Rome being dubbed "Hollywood on the Tiber". More than 3,000 productions have been made on its lot, of which 90 received an Academy Award nomination and 47 won it, from cinema classics to recent rewarded features (such as "Roman Holiday", "Ben-Hur", "Cleopatra", "Romeo and Juliet", "The English Patient", "The Passion of the Christ", and "Gangs of New York").
Italy is the most awarded country at the Academy Awards for Best Foreign Language Film, with 14 awards won, 3 Special Awards and 31 nominations. Italian films have also won 12 Palmes d'Or (the second-most of any country), 11 Golden Lions and 7 Golden Bears.
The most popular sport in Italy is football. Italy's national football team is one of the world's most successful teams with four FIFA World Cup victories (1934, 1938, 1982 and 2006). Italian clubs have won 48 major European trophies, making Italy the second most successful country in European football. Italy's top-flight club football league is named Serie A and is followed by millions of fans around the world.
Other popular team sports in Italy include volleyball, basketball and rugby. Italy's male and female national volleyball teams are often featured among the world's best; the men's team won three consecutive World Championships (in 1990, 1994, and 1998) and earned Olympic silver medals in 1996, 2004, and 2016. The Italian national basketball team's best results were gold at EuroBasket 1983 and EuroBasket 1999, as well as silver at the Olympics in 2004, and Lega Basket Serie A is widely considered one of the most competitive leagues in Europe. Rugby union enjoys a good level of popularity, especially in the north of the country: Italy's national team competes in the Six Nations Championship, is a regular at the Rugby World Cup, and is ranked a tier-one nation by World Rugby.
Italy has a long and successful tradition in individual sports as well. Bicycle racing is a very popular sport in the country: Italians have won the UCI World Championships more often than any other country except Belgium. The Giro d'Italia is a cycling race held every May and constitutes one of the three Grand Tours, along with the Tour de France and the Vuelta a España, each of which lasts approximately three weeks. Alpine skiing is also a very widespread sport in Italy, and the country is a popular international skiing destination, known for its ski resorts; Italian skiers have achieved good results in the Winter Olympic Games, the Alpine Ski World Cup and the World Championships. Tennis has a significant following in Italy, ranking as the fourth most practised sport in the country. The Rome Masters, founded in 1930, is one of the most prestigious tennis tournaments in the world, and Italian professional tennis players won the Davis Cup in 1976 and the Fed Cup in 2006, 2009, 2010 and 2013. Motorsports are also extremely popular: Italy has won, by far, the most MotoGP World Championships, and the Italian Scuderia Ferrari is the oldest surviving team in Grand Prix racing, having competed since 1948, and statistically the most successful Formula One team in history with a record 232 wins.
Historically, Italy has been successful in the Olympic Games, taking part from the first Olympiad and in 47 Games out of 48. Italian sportspeople have won 522 medals at the Summer Olympic Games and another 106 at the Winter Olympic Games, for a combined total of 628 medals with 235 golds, making Italy the fifth most successful nation in Olympic history by total medals. The country hosted two Winter Olympics (in 1956 and 2006), will host a third in 2026, and hosted one Summer Games (in 1960).
Italian fashion has a long tradition and is regarded as one of the most important in the world. Milan, Florence and Rome are Italy's main fashion capitals; according to the 2013 "Top Global Fashion Capital Rankings" by Global Language Monitor, Rome ranked sixth worldwide while Milan was twelfth. Major Italian fashion labels, such as Gucci, Armani, Prada, Versace, Valentino, Dolce & Gabbana, Missoni, Fendi, Moschino, Max Mara, Trussardi, and Ferragamo, to name a few, are regarded as among the finest fashion houses in the world. The fashion magazine Vogue Italia is also considered one of the most prestigious fashion magazines in the world.
Italy is also prominent in the field of design, notably interior design, architectural design, industrial design and urban design. The country has produced some well-known furniture designers, such as Gio Ponti and Ettore Sottsass, and Italian phrases such as "Bel Disegno" and "Linea Italiana" have entered the vocabulary of furniture design. Examples of classic pieces of Italian white goods and furniture include Zanussi's washing machines and fridges, the "New Tone" sofas by Atrium, and the post-modern bookcase by Ettore Sottsass, inspired by Bob Dylan's song "Stuck Inside of Mobile with the Memphis Blues Again". Today, Milan and Turin are the nation's leaders in architectural design and industrial design. The city of Milan hosts Fiera Milano, Europe's largest design fair. Milan also hosts major design- and architecture-related events and venues, such as the "Fuori Salone" and the Salone del Mobile, and has been home to the designers Bruno Munari, Lucio Fontana, Enrico Castellani and Piero Manzoni.
Italian cuisine has developed through centuries of social and political change, with roots as far back as the 4th century BC. It bears heavy influences, including Etruscan, ancient Greek, ancient Roman, Byzantine, and Jewish. Significant changes occurred with the discovery of the New World and the introduction of items such as potatoes, tomatoes, bell peppers and maize, now central to the cuisine but not introduced in quantity until the 18th century. Italian cuisine is noted for its regional diversity and abundance of differences in taste, and it is known to be one of the most popular cuisines in the world, wielding strong influence abroad.
The Mediterranean diet forms the basis of Italian cuisine, rich in pasta, fish, fruits and vegetables, and characterised by its extreme simplicity and variety, with many dishes having only four to eight ingredients. Italian cooks rely chiefly on the quality of the ingredients rather than on elaborate preparation. Dishes and recipes often derive from local and familial tradition rather than being created by chefs, so many recipes are ideally suited for home cooking; this is one of the main reasons behind the ever-increasing worldwide popularity of Italian cuisine, from America to Asia. Ingredients and dishes vary widely by region.
A key factor in the success of Italian cuisine is its heavy reliance on traditional products; Italy has the most traditional specialities protected under EU law. Cheese, cold cuts and wine are a major part of Italian cuisine, with many regional variations and Protected Designation of Origin or Protected Geographical Indication labels, and, along with coffee (especially espresso), they make up a very important part of Italian gastronomic culture. Desserts have a long tradition of merging local flavours such as citrus fruits, pistachio and almonds with sweet cheeses like mascarpone and ricotta, or with exotic tastes such as cocoa, vanilla and cinnamon. Gelato, tiramisù and cassata are among the most famous examples of Italian desserts, cakes and patisserie.
Public holidays celebrated in Italy include religious, national and regional observances. Italy's National Day, the "Festa della Repubblica" ("Republic Day") is celebrated on 2 June each year, and commemorates the birth of the Italian Republic in 1946.
Saint Lucy's Day, which takes place on 13 December, is very popular among children in some Italian regions, where she plays a role similar to Santa Claus. In addition, the Epiphany in Italy is associated with the folkloric figure of the Befana, a broomstick-riding old woman who, in the night between 5 and 6 January, brings good children gifts and sweets, and bad ones charcoal or bags of ashes. The Assumption of Mary coincides with "Ferragosto" on 15 August, the summer vacation period, which may be a long weekend or most of the month. Each city or town also celebrates a public holiday on the occasion of the festival of the local patron saint: for example, Rome on 29 June (Saints Peter and Paul) and Milan on 7 December (Saint Ambrose).
There are many festivals and festivities in Italy, including the Palio di Siena horse race, Holy Week rites, the Saracen Joust of Arezzo, Saint Ubaldo Day in Gubbio, the Giostra della Quintana in Foligno, and the Calcio Fiorentino. In 2013, UNESCO included among the intangible cultural heritage several Italian festivals featuring large processional structures (in Italian "macchine a spalla"), such as the Varia di Palmi, the Macchina di Santa Rosa in Viterbo, the Festa dei Gigli in Nola, and the "faradda di li candareri" in Sassari.
Other festivals include the carnivals in Venice, Viareggio, Satriano di Lucania, Mamoiada, and Ivrea, the last mostly known for its Battle of the Oranges. The prestigious Venice International Film Festival, awarding the "Golden Lion" and held annually since 1932, is the oldest film festival in the world.
|
https://en.wikipedia.org/wiki?curid=14532
|
India
India, officially the Republic of India (Hindi: ), is a country in South Asia. It is the second-most populous country, the seventh-largest country by area, and the most populous democracy in the world. Bounded by the Indian Ocean on the south, the Arabian Sea on the southwest, and the Bay of Bengal on the southeast, it shares land borders with Pakistan to the west; China, Nepal, and Bhutan to the north; and Bangladesh and Myanmar to the east. In the Indian Ocean, India is in the vicinity of Sri Lanka and the Maldives; its Andaman and Nicobar Islands share a maritime border with Thailand and Indonesia.
Modern humans arrived on the Indian subcontinent from Africa no later than 55,000 years ago. Their long occupation, initially in varying forms of isolation as hunter-gatherers, has made the region highly diverse, second only to Africa in human genetic diversity. Settled life emerged on the subcontinent in the western margins of the Indus river basin 9,000 years ago, evolving gradually into the Indus Valley Civilisation of the third millennium BCE. By 1200 BCE, an archaic form of Sanskrit, an Indo-European language, had diffused into India from the northwest, unfolding as the language of the "Rigveda" and recording the dawning of Hinduism in India. The Dravidian languages of India were supplanted in the northern regions. By 400 BCE, stratification and exclusion by caste had emerged within Hinduism, and Buddhism and Jainism had arisen, proclaiming social orders unlinked to heredity. Early political consolidations gave rise to the loose-knit Maurya and Gupta Empires based in the Ganges Basin. Their collective era was suffused with wide-ranging creativity, but also marked by the declining status of women and the incorporation of untouchability into an organised system of belief. In South India, the Middle kingdoms exported Dravidian-language scripts and religious cultures to the kingdoms of Southeast Asia.
In the early medieval era, Christianity, Islam, Judaism, and Zoroastrianism put down roots on India's southern and western coasts. Muslim armies from Central Asia intermittently overran India's northern plains, eventually establishing the Delhi Sultanate and drawing northern India into the cosmopolitan networks of medieval Islam. In the 15th century, the Vijayanagara Empire created a long-lasting composite Hindu culture in south India. In the Punjab, Sikhism emerged, rejecting institutionalised religion. The Mughal Empire, in 1526, ushered in two centuries of relative peace, leaving a legacy of luminous architecture. Gradually expanding rule of the British East India Company followed, turning India into a colonial economy but also consolidating its sovereignty. British Crown rule began in 1858. The rights promised to Indians were granted slowly, but technological changes were introduced, and ideas of education, modernity and public life took root. A pioneering and influential nationalist movement emerged, which was noted for nonviolent resistance and became the major factor in ending British rule. In 1947 the British Indian Empire was partitioned into two independent dominions, a Hindu-majority Dominion of India and a Muslim-majority Dominion of Pakistan, amid large-scale loss of life and an unprecedented migration.
India has been a secular federal republic since 1950, governed in a democratic parliamentary system. It is a pluralistic, multilingual and multi-ethnic society. India's population grew from 361 million in 1951 to 1,211 million in 2011. During the same time, its nominal per capita income increased from US$64 annually to US$1,498, and its literacy rate from 16.6% to 74%. From being a comparatively destitute country in 1951, India has become a fast-growing major economy, a hub for information technology services, with an expanding middle class. It has a space programme which includes several planned or completed extraterrestrial missions. Indian movies, music, and spiritual teachings play an increasing role in global culture. India has substantially reduced its rate of poverty, though at the cost of increasing economic inequality. India is a nuclear weapons state, which ranks high in military expenditure. It has disputes over Kashmir with its neighbours, Pakistan and China, unresolved since the mid-20th century. Among the socio-economic challenges India faces are gender inequality, child malnutrition, and rising levels of air pollution.
India's land is megadiverse, with four biodiversity hotspots. Its forest cover comprises 21.4% of its area. India's wildlife, which has traditionally been viewed with tolerance in India's culture, is supported among these forests, and elsewhere, in protected habitats.
According to the "Oxford English Dictionary" (Third Edition 2009), the name "India" is derived from the Classical Latin "India", a reference to South Asia and an uncertain region to its east; and in turn derived successively from: Hellenistic Greek "India" ("Ἰνδία"); ancient Greek "Indos" ("Ἰνδός"); Old Persian "Hindush", an eastern province of the Achaemenid empire; and ultimately its cognate, the Sanskrit "Sindhu", or "river", specifically the Indus river and, by implication, its well-settled southern basin. The ancient Greeks referred to the Indians as "Indoi" ("Ἰνδοί"), which translates as "the people of the Indus".
The term "Bharat", mentioned in both Indian epic poetry and the Constitution of India, is used in its variations by many Indian languages. A modern rendering of the historical name "Bharatavarsha", which applied originally to a region of the Gangetic Valley, "Bharat" gained increased currency from the mid-19th century as a native name for India.
"Hindustan" () is a Middle Persian name for India, introduced during the Mughal Empire and used widely since. Its meaning has varied, referring to a region encompassing present-day northern India and Pakistan or to India in its near entirety.
By 55,000 years ago, the first modern humans, or "Homo sapiens", had arrived on the Indian subcontinent from Africa, where they had earlier evolved.
The earliest known modern human remains in South Asia date to about 30,000 years ago. After 6500 BCE, evidence for domestication of food crops and animals, construction of permanent structures, and storage of agricultural surplus appeared in Mehrgarh and other sites in what is now Balochistan. These gradually developed into the Indus Valley Civilisation, the first urban culture in South Asia, which flourished during 2500–1900 BCE in what is now Pakistan and western India. Centred around cities such as Mohenjo-daro, Harappa, Dholavira, and Kalibangan, and relying on varied forms of subsistence, the civilisation engaged robustly in crafts production and wide-ranging trade.
During the period 2000–500 BCE, many regions of the subcontinent transitioned from the Chalcolithic cultures to the Iron Age ones. The Vedas, the oldest scriptures associated with Hinduism, were composed during this period, and historians have analysed these to posit a Vedic culture in the Punjab region and the upper Gangetic Plain. Most historians also consider this period to have encompassed several waves of Indo-Aryan migration into the subcontinent from the north-west. The caste system, which created a hierarchy of priests, warriors, and free peasants, but which excluded indigenous peoples by labelling their occupations impure, arose during this period. On the Deccan Plateau, archaeological evidence from this period suggests the existence of a chiefdom stage of political organisation. In South India, a progression to sedentary life is indicated by the large number of megalithic monuments dating from this period, as well as by nearby traces of agriculture, irrigation tanks, and craft traditions.
In the late Vedic period, around the 6th century BCE, the small states and chiefdoms of the Ganges Plain and the north-western regions had consolidated into 16 major oligarchies and monarchies that were known as the "mahajanapadas". The emerging urbanisation gave rise to non-Vedic religious movements, two of which became independent religions. Jainism came into prominence during the life of its exemplar, Mahavira. Buddhism, based on the teachings of Gautama Buddha, attracted followers from all social classes excepting the middle class; chronicling the life of the Buddha was central to the beginnings of recorded history in India. In an age of increasing urban wealth, both religions held up renunciation as an ideal, and both established long-lasting monastic traditions. Politically, by the 3rd century BCE, the kingdom of Magadha had annexed or reduced other states to emerge as the Mauryan Empire. The empire was once thought to have controlled most of the subcontinent except the far south, but its core regions are now thought to have been separated by large autonomous areas. The Mauryan kings are known as much for their empire-building and determined management of public life as for Ashoka's renunciation of militarism and far-flung advocacy of the Buddhist "dhamma".
The Sangam literature of the Tamil language reveals that, between 200 BCE and 200 CE, the southern peninsula was ruled by the Cheras, the Cholas, and the Pandyas, dynasties that traded extensively with the Roman Empire and with West and South-East Asia. In North India, Hinduism asserted patriarchal control within the family, leading to increased subordination of women. By the 4th and 5th centuries, the Gupta Empire had created a complex system of administration and taxation in the greater Ganges Plain; this system became a model for later Indian kingdoms. Under the Guptas, a renewed Hinduism based on devotion, rather than the management of ritual, began to assert itself. This renewal was reflected in a flowering of sculpture and architecture, which found patrons among an urban elite. Classical Sanskrit literature flowered as well, and Indian science, astronomy, medicine, and mathematics made significant advances.
The Indian early medieval age, 600 CE to 1200 CE, is defined by regional kingdoms and cultural diversity. When Harsha of Kannauj, who ruled much of the Indo-Gangetic Plain from 606 to 647 CE, attempted to expand southwards, he was defeated by the Chalukya ruler of the Deccan. When his successor attempted to expand eastwards, he was defeated by the Pala king of Bengal. When the Chalukyas attempted to expand southwards, they were defeated by the Pallavas from farther south, who in turn were opposed by the Pandyas and the Cholas from still farther south. No ruler of this period was able to create an empire and consistently control lands much beyond his core region. During this time, pastoral peoples, whose land had been cleared to make way for the growing agricultural economy, were accommodated within caste society, as were new non-traditional ruling classes. The caste system consequently began to show regional differences.
In the 6th and 7th centuries, the first devotional hymns were created in the Tamil language. They were imitated all over India and led to both the resurgence of Hinduism and the development of all modern languages of the subcontinent. Indian royalty, big and small, and the temples they patronised drew citizens in great numbers to the capital cities, which became economic hubs as well. Temple towns of various sizes began to appear everywhere as India underwent another urbanisation. By the 8th and 9th centuries, the effects were felt in South-East Asia, as South Indian culture and political systems were exported to lands that became part of modern-day Myanmar, Thailand, Laos, Cambodia, Vietnam, Philippines, Malaysia, and Java. Indian merchants, scholars, and sometimes armies were involved in this transmission; South-East Asians took the initiative as well, with many sojourning in Indian seminaries and translating Buddhist and Hindu texts into their languages.
After the 10th century, Muslim Central Asian nomadic clans, using swift-horse cavalry and raising vast armies united by ethnicity and religion, repeatedly overran South Asia's north-western plains, leading eventually to the establishment of the Islamic Delhi Sultanate in 1206. The sultanate was to control much of North India and to make many forays into South India. Although at first disruptive for the Indian elites, the sultanate largely left its vast non-Muslim subject population to its own laws and customs. By repeatedly repulsing Mongol raiders in the 13th century, the sultanate saved India from the devastation visited on West and Central Asia, setting the scene for centuries of migration of fleeing soldiers, learned men, mystics, traders, artists, and artisans from that region into the subcontinent, thereby creating a syncretic Indo-Islamic culture in the north. The sultanate's raiding and weakening of the regional kingdoms of South India paved the way for the indigenous Vijayanagara Empire. Embracing a strong Shaivite tradition and building upon the military technology of the sultanate, the empire came to control much of peninsular India, and was to influence South Indian society for long afterwards.
In the early 16th century, northern India, then under mainly Muslim rulers, fell again to the superior mobility and firepower of a new generation of Central Asian warriors. The resulting Mughal Empire did not stamp out the local societies it came to rule. Instead, it balanced and pacified them through new administrative practices and diverse and inclusive ruling elites, leading to more systematic, centralised, and uniform rule. Eschewing tribal bonds and Islamic identity, especially under Akbar, the Mughals united their far-flung realms through loyalty, expressed through a Persianised culture, to an emperor who had near-divine status. The Mughal state's economic policies, deriving most revenues from agriculture and mandating that taxes be paid in the well-regulated silver currency, caused peasants and artisans to enter larger markets. The relative peace maintained by the empire during much of the 17th century was a factor in India's economic expansion, resulting in greater patronage of painting, literary forms, textiles, and architecture. Newly coherent social groups in northern and western India, such as the Marathas, the Rajputs, and the Sikhs, gained military and governing ambitions during Mughal rule, which, through collaboration or adversity, gave them both recognition and military experience. Expanding commerce during Mughal rule gave rise to new Indian commercial and political elites along the coasts of southern and eastern India. As the empire disintegrated, many among these elites were able to seek and control their own affairs.
By the early 18th century, with the lines between commercial and political dominance being increasingly blurred, a number of European trading companies, including the English East India Company, had established coastal outposts. The East India Company's control of the seas, greater resources, and more advanced military training and technology led it to increasingly flex its military muscle and caused it to become attractive to a portion of the Indian elite; these factors were crucial in allowing the company to gain control over the Bengal region by 1765 and sideline the other European companies. Its further access to the riches of Bengal and the subsequent increased strength and size of its army enabled it to annex or subdue most of India by the 1820s. India was then no longer exporting manufactured goods as it long had, but was instead supplying the British Empire with raw materials. Many historians consider this to be the onset of India's colonial period. By this time, with its economic power severely curtailed by the British parliament and having effectively been made an arm of British administration, the company began more consciously to enter non-economic arenas like education, social reform, and culture.
Historians consider India's modern age to have begun sometime between 1848 and 1885. The appointment in 1848 of Lord Dalhousie as Governor General of the East India Company set the stage for changes essential to a modern state. These included the consolidation and demarcation of sovereignty, the surveillance of the population, and the education of citizens. Technological changes—among them, railways, canals, and the telegraph—were introduced not long after their introduction in Europe. However, disaffection with the company also grew during this time and set off the Indian Rebellion of 1857. Fed by diverse resentments and perceptions, including invasive British-style social reforms, harsh land taxes, and summary treatment of some rich landowners and princes, the rebellion rocked many regions of northern and central India and shook the foundations of Company rule. Although the rebellion was suppressed by 1858, it led to the dissolution of the East India Company and the direct administration of India by the British government. Proclaiming a unitary state and a gradual but limited British-style parliamentary system, the new rulers also protected princes and landed gentry as a feudal safeguard against future unrest. In the decades following, public life gradually emerged all over India, leading eventually to the founding of the Indian National Congress in 1885.
The rush of technology and the commercialisation of agriculture in the second half of the 19th century was marked by economic setbacks, and many small farmers became dependent on the whims of far-away markets. There was an increase in the number of large-scale famines, and, despite the risks of infrastructure development borne by Indian taxpayers, little industrial employment was generated for Indians. There were also salutary effects: commercial cropping, especially in the newly canalled Punjab, led to increased food production for internal consumption. The railway network provided critical famine relief, notably reduced the cost of moving goods, and helped nascent Indian-owned industry.
After World War I, in which approximately one million Indians served, a new period began. It was marked by British reforms but also repressive legislation, by more strident Indian calls for self-rule, and by the beginnings of a nonviolent movement of non-co-operation, of which Mohandas Karamchand Gandhi would become the leader and enduring symbol. During the 1930s, slow legislative reform was enacted by the British; the Indian National Congress won victories in the resulting elections. The next decade was beset with crises: Indian participation in World War II, the Congress's final push for non-co-operation, and an upsurge of Muslim nationalism. All were capped by the advent of independence in 1947, but tempered by the partition of India into two states: India and Pakistan.
Vital to India's self-image as an independent nation was its constitution, completed in 1950, which put in place a secular and democratic republic. It has remained a democracy with civil liberties, an active Supreme Court, and a largely independent press. Economic liberalisation, which began in the 1990s, has created a large urban middle class, transformed India into one of the world's fastest-growing economies, and increased its geopolitical clout. Indian movies, music, and spiritual teachings play an increasing role in global culture. Yet, India is also shaped by seemingly unyielding poverty, both rural and urban; by religious and caste-related violence; by Maoist-inspired Naxalite insurgencies; and by separatism in Jammu and Kashmir and in Northeast India. It has unresolved territorial disputes with China and with Pakistan. The India–Pakistan nuclear rivalry came to a head in 1998. India's sustained democratic freedoms are unique among the world's newer nations; however, in spite of its recent economic successes, freedom from want for its disadvantaged population remains a goal yet to be achieved.
India accounts for the bulk of the Indian subcontinent, lying atop the Indian tectonic plate, a part of the Indo-Australian Plate. India's defining geological processes began 75 million years ago when the Indian Plate, then part of the southern supercontinent Gondwana, began a north-eastward drift caused by seafloor spreading to its south-west, and later, south and south-east. Simultaneously, the vast Tethyan oceanic crust, to its northeast, began to subduct under the Eurasian Plate. These dual processes, driven by convection in the Earth's mantle, both created the Indian Ocean and caused the Indian continental crust eventually to under-thrust Eurasia and to uplift the Himalayas. Immediately south of the emerging Himalayas, plate movement created a vast trough that rapidly filled with river-borne sediment and now constitutes the Indo-Gangetic Plain. Cut off from the plain by the ancient Aravalli Range lies the Thar Desert.
The original Indian Plate survives as peninsular India, the oldest and geologically most stable part of India. It extends as far north as the Satpura and Vindhya ranges in central India. These parallel chains run from the Arabian Sea coast in Gujarat in the west to the coal-rich Chota Nagpur Plateau in Jharkhand in the east. To the south, the remaining peninsular landmass, the Deccan Plateau, is flanked on the west and east by coastal ranges known as the Western and Eastern Ghats; the plateau contains the country's oldest rock formations, some over one billion years old. Constituted in such fashion, India lies to the north of the equator between 6° 44′ and 35° 30′ north latitude and 68° 7′ and 97° 25′ east longitude.
India's coastline measures 7,517 kilometres in length; of this distance, 5,423 kilometres belong to peninsular India and 2,094 kilometres to the Andaman, Nicobar, and Lakshadweep island chains. According to the Indian naval hydrographic charts, the mainland coastline consists of the following: 43% sandy beaches; 11% rocky shores, including cliffs; and 46% mudflats or marshy shores.
https://en.wikipedia.org/wiki?curid=14533
Mark Antony
Marcus Antonius (14 January 83 BC – 1 August 30 BC), commonly known in English as Mark Antony or Anthony, was a Roman politician and general who played a critical role in the transformation of the Roman Republic from an oligarchy into the autocratic Roman Empire.
Antony was a supporter of Julius Caesar, and served as one of his generals during the conquest of Gaul and the Civil War. Antony was appointed administrator of Italy while Caesar eliminated political opponents in Greece, North Africa, and Spain. After Caesar's death in 44 BC, Antony joined forces with Marcus Aemilius Lepidus, another of Caesar's generals, and Octavian, Caesar's great-nephew and adopted son, forming a three-man dictatorship known to historians as the Second Triumvirate. The Triumvirs defeated Caesar's murderers, the Liberatores, at the Battle of Philippi in 42 BC, and divided the government of the Republic between themselves. Antony was assigned Rome's eastern provinces, including the client kingdom of Egypt, then ruled by Cleopatra VII Philopator, and was given the command in Rome's war against Parthia.
Relations among the triumvirs were strained as the various members sought greater political power. Civil war between Antony and Octavian was averted in 40 BC, when Antony married Octavian's sister, Octavia. Despite this marriage, Antony carried on a love affair with Cleopatra, who bore him three children, further straining Antony's relations with Octavian. Lepidus was expelled from the association in 36 BC, and in 33 BC disagreements between Antony and Octavian caused a split between the remaining Triumvirs. Their ongoing hostility erupted into civil war in 31 BC, as the Roman Senate, at Octavian's direction, declared war on Cleopatra and proclaimed Antony a traitor. Later that year, Antony was defeated by Octavian's forces at the Battle of Actium. Antony and Cleopatra fled to Egypt, where after a minor victory at the Battle of Alexandria they committed suicide.
With Antony dead, Octavian became the undisputed master of the Roman world. In 27 BC, Octavian was granted the title of "Augustus," marking the final stage in the transformation of the Roman Republic into an empire, with himself as the first Roman emperor.
A member of the plebeian Antonia gens, Antony was born in Rome on 14 January 83 BC. His father and namesake was Marcus Antonius Creticus, son of the noted orator by the same name who had been murdered during the Marian Terror of the winter of 87–86 BC. His mother was Julia, a third cousin of Julius Caesar. Antony was an infant at the time of Lucius Cornelius Sulla's march on Rome in 82 BC.
According to the Roman orator Marcus Tullius Cicero, Antony's father was incompetent and corrupt, and was only given power because he was incapable of using or abusing it effectively. In 74 BC he was given military command to defeat the pirates of the Mediterranean, but he died in Crete in 71 BC without making any significant progress. The elder Antony's death left Antony and his brothers, Lucius and Gaius, in the care of their mother, Julia, who later married Publius Cornelius Lentulus Sura, an eminent member of the old Patrician nobility. Lentulus, despite exploiting his political success for financial gain, was constantly in debt due to the extravagance of his lifestyle. He was a major figure in the Second Catilinarian Conspiracy and was summarily executed on the orders of the consul Cicero in 63 BC for his involvement.
Antony's early life was characterized by a lack of proper parental guidance. According to the historian Plutarch, he spent his teenage years wandering through Rome with his brothers and friends gambling, drinking, and becoming involved in scandalous love affairs. Antony's contemporary and enemy, Cicero, charged that he had a homosexual relationship with Gaius Scribonius Curio. This form of slander was popular during this time in the Roman Republic to demean and discredit political opponents by accusing them of having an inappropriate sexual affair. There is little reliable information on his political activity as a young man, although it is known that he was an associate of Publius Clodius Pulcher and his street gang. He may also have been involved in the Lupercal cult as he was referred to as a priest of this order later in life. By age twenty, Antony had amassed an enormous debt. Hoping to escape his creditors, Antony fled to Greece in 58 BC, where he studied philosophy and rhetoric at Athens.
In 57 BC, Antony joined the military staff of Aulus Gabinius, the Proconsul of Syria, as chief of the cavalry. This appointment marks the beginning of his military career. As consul the previous year, Gabinius had consented to the exile of Cicero by Antony's mentor, Publius Clodius Pulcher.
Hyrcanus II, the Roman-supported Hasmonean High Priest of Judea, fled Jerusalem to Gabinius to seek protection against his rival and son-in-law Alexander. Years earlier in 63 BC, the Roman general Pompey had captured him and his father, King Aristobulus II, during his war against the remnant of the Seleucid Empire. Pompey had deposed Aristobulus and installed Hyrcanus as Rome's client ruler over Judea. Antony achieved his first military distinctions after securing important victories at Alexandrium and Machaerus. With the rebellion defeated by 56 BC, Gabinius restored Hyrcanus to his position as High Priest in Judea.
The following year, in 55 BC, Gabinius intervened in the political affairs of Ptolemaic Egypt. Pharaoh Ptolemy XII Auletes had been deposed in a rebellion led by his daughter Berenice IV in 58 BC, forcing him to seek asylum in Rome. During Pompey's conquests years earlier, Ptolemy had received the support of Pompey, who named him an ally of Rome. Gabinius' invasion sought to restore Ptolemy to his throne. This was done against the orders of the senate but with the approval of Pompey, then Rome's leading politician, and only after the deposed king provided a 10,000 talent bribe. The Greek historian Plutarch records it was Antony who convinced Gabinius to finally act. After defeating the frontier forces of the Egyptian kingdom, Gabinius' army proceeded to attack the palace guards but they surrendered before a battle commenced. With Ptolemy XII restored as Rome's client king, Gabinius garrisoned two thousand Roman soldiers, later known as the "Gabiniani", in Alexandria to ensure Ptolemy's authority. In return for its support, Rome exercised considerable power over the kingdom's affairs, particularly control of the kingdom's revenues and crop yields.
During the campaign in Egypt, Antony first met Cleopatra, the 14-year-old daughter of Ptolemy XII.
While Antony was serving Gabinius in the East, the domestic political situation had changed in Rome. In 60 BC, a secret agreement (known as the "First Triumvirate") was entered into between three men to control the Republic: Marcus Licinius Crassus, Gnaeus Pompey Magnus, and Gaius Julius Caesar. Crassus, Rome's wealthiest man, had defeated the slave rebellion of Spartacus in 70 BC; Pompey conquered much of the Eastern Mediterranean in the 60s BC; Caesar was Rome's Pontifex Maximus and a former general in Spain. In 59 BC, Caesar, with funding from Crassus, was elected consul to pursue legislation favourable to Crassus and Pompey's interests. In return, Caesar was assigned the governorship of Illyricum, Cisalpine Gaul, and Transalpine Gaul for five years beginning in 58 BC. Caesar used his governorship as a launching point for his conquest of free Gaul. In 55 BC, Crassus and Pompey served as consuls while Caesar's command was extended for another five years. Rome was effectively under the absolute power of these three men. The Triumvirate used Publius Clodius Pulcher, Antony's patron, to exile their political rivals, notably Cicero and Cato the Younger.
During his early military service, Antony married his cousin Antonia Hybrida Minor, the daughter of Gaius Antonius Hybrida. Sometime between 54 and 47 BC, the union produced a single daughter, Antonia Prima. It is unclear if this was Antony's first marriage.
Antony's association with Publius Clodius Pulcher allowed him to achieve greater prominence. Clodius, through the influence of his benefactor Marcus Licinius Crassus, had developed a positive political relationship with Julius Caesar. Clodius secured Antony a position on Caesar's military staff in 54 BC, joining his conquest of Gaul. Serving under Caesar, Antony demonstrated excellent military leadership. Despite a temporary alienation later in life, Antony and Caesar developed friendly relations which would continue until Caesar's assassination in 44 BC. Caesar's influence secured greater political advancement for Antony. After a year of service in Gaul, Caesar dispatched Antony to Rome to formally begin his political career, receiving election as quaestor for 52 BC as a member of the Populares faction. Assigned to assist Caesar, Antony returned to Gaul and commanded Caesar's cavalry during his victory at the Battle of Alesia against the Gallic chieftain Vercingetorix. Following his year in office, Antony was promoted by Caesar to the rank of Legate and assigned command of two legions (approximately 7,500 total soldiers).
Meanwhile, the alliance among Caesar, Pompey and Crassus had effectively ended. Caesar's daughter Julia, who had married Pompey to secure the alliance, died in 54 BC while Crassus was killed at the Battle of Carrhae in 53 BC. Without the stability they provided, the divide between Caesar and Pompey grew ever larger. Caesar's glory in conquering Gaul had served to further strain his alliance with Pompey, who, having grown jealous of his former ally, had drifted away from Caesar's democratic Populares party towards the oligarchic Optimates faction led by Cato. The supporters of Caesar, led by Clodius, and the supporters of Pompey, led by Titus Annius Milo, routinely clashed. In 52 BC, Milo succeeded in assassinating Clodius, resulting in widespread riots and the burning of the senate meeting house, the Curia Hostilia, by Clodius' street gang. Anarchy resulted, causing the senate to look to Pompey. Fearing the persecutions of Lucius Cornelius Sulla only thirty years earlier, they avoided granting Pompey the dictatorship by instead naming him sole consul for the year, giving him extraordinary but limited powers. Pompey ordered armed soldiers into the city to restore order and to eliminate the remnants of Clodius' gang.
Antony remained on Caesar's military staff until 50 BC, assisting in mopping-up actions across Gaul to secure Caesar's conquest. With the war over, Antony was sent back to Rome to act as Caesar's protector against Pompey and the other Optimates. With the support of Caesar, who as Pontifex Maximus was head of the Roman religion, Antony was appointed to the College of Augurs, an important priestly office responsible for interpreting the will of the gods by studying the flight of birds. All public actions required favorable auspices, granting the college considerable influence. Antony was then elected as one of the ten plebeian tribunes for 49 BC. In this position, Antony could protect Caesar from his political enemies by vetoing any actions unfavorable to his patron.
The feud between Caesar and Pompey erupted into open confrontation by early 49 BC. The consuls for the year, Gaius Claudius Marcellus Maior and Lucius Cornelius Lentulus Crus, were firm Optimates opposed to Caesar. Pompey, though remaining in Rome, was then serving as the governor of Spain and commanded several legions. Upon assuming office in January, Antony immediately summoned a meeting of the senate to resolve the conflict: he proposed both Caesar and Pompey lay down their commands and return to the status of mere private citizens. His proposal was well received by most of the senators, but the consuls and Cato vehemently opposed it. Antony then made a new proposal: Caesar would retain only two of his eight legions, and the governorship of Illyricum, if he was allowed to stand for the consulship "in absentia". This arrangement ensured his immunity from suit would continue: he had needed the consulship to protect himself from prosecution by Pompey. Though Pompey found the concession satisfactory, Cato and Lentulus refused to back down, with Lentulus even expelling Antony from the senate meeting by force. Antony fled Rome, fearing for his life, and returned to Caesar's camp on the banks of the Rubicon, the southern limit of Caesar's lawful command.
Within days of Antony's expulsion, on 7 January 49 BC, the senate reconvened. Under the leadership of Cato and with the tacit support of Pompey, the senate passed a "senatus consultum ultimum", a decree stripping Caesar of his command and ordering him to return to Rome and stand trial for war crimes. The senate further declared Caesar a traitor and a public enemy if he did not immediately disband his army. With all hopes of finding a peaceful solution gone after Antony's expulsion, Caesar used Antony as a pretext for marching on Rome. As tribune, Antony's person was sacrosanct, so it was unlawful to harm him or to refuse to recognize his veto. Three days later, on 10 January, Caesar crossed the Rubicon, initiating the Civil War. During the southern march, Caesar placed Antony as his second in command.
Caesar's rapid advance surprised Pompey, who, along with the other chief members of the Optimates, fled Italy for Greece. After entering Rome, instead of pursuing Pompey, Caesar marched to Spain to defeat the Pompeian loyalists there. Meanwhile, Antony, with the rank of propraetor—despite never having served as praetor—was installed as governor of Italy and commander of the army, stationed there while Marcus Aemilius Lepidus, one of Caesar's staff officers, ran the provisional administration of Rome itself. Though Antony was well liked by his soldiers, most other citizens despised him for his lack of interest in the hardships they faced from the civil war.
By the end of the year 49 BC, Caesar, already the ruler of Gaul, had wrested Italy, Spain, Sicily, and Sardinia from Optimates control. In early 48 BC, he prepared to sail with seven legions to Greece to face Pompey. Caesar had entrusted the defense of Illyricum to Gaius Antonius, Antony's younger brother, and Publius Cornelius Dolabella. Pompey's forces, however, defeated them and assumed control of the Adriatic Sea as well. Additionally, the two legions they commanded defected to Pompey. Without their fleet, Caesar lacked the necessary transport ships to cross into Greece with his seven legions. Instead, he sailed with only two and placed Antony in command of the remaining five at Brundisium with instructions to join him as soon as he was able. In early 48 BC, Lucius Scribonius Libo was given command of Pompey's fleet, comprising some fifty galleys. Moving to Brundisium, he blockaded Antony. Antony, however, managed to trick Libo into pursuing some decoy ships, causing Libo's squadron to be trapped and attacked. Most of Libo's fleet managed to escape, but several of his troops were trapped and captured. With Libo gone, Antony joined Caesar in Greece by March 48 BC.
During the Greek campaign, Plutarch records that Antony was Caesar's top general, second only to him in reputation. Antony joined Caesar at the western Balkan Peninsula and besieged Pompey's larger army at Dyrrhachium. With food sources running low, Caesar, in July, ordered a nocturnal assault on Pompey's camp, but Pompey's larger forces pushed back the assault. Though indecisive, the engagement was a tactical win for Pompey. Pompey, however, did not order a counterassault on Caesar's camp, allowing Caesar to retreat unhindered. Caesar would later remark the civil war would have ended that day if only Pompey had attacked him. Caesar managed to retreat to Thessaly, with Pompey in pursuit.
Assuming a defensive position at the plain of Pharsalus, Caesar's army prepared for pitched battle with Pompey's, which outnumbered his own two to one. At the Battle of Pharsalus on 9 August 48 BC, Caesar commanded the right wing opposite Pompey while Antony commanded the left, indicating Antony's status as Caesar's top general. The resulting battle was a decisive victory for Caesar. Though the civil war had not ended at Pharsalus, the battle marked the pinnacle of Caesar's power and effectively ended the Republic. The battle gave Caesar a much needed boost in legitimacy, as prior to the battle much of the Roman world outside Italy supported Pompey and the Optimates as the legitimate government of Rome. After Pompey's defeat, most of the senate defected to Caesar, including many of the soldiers who had fought under Pompey. Pompey himself fled to Ptolemaic Egypt, but Pharaoh Ptolemy XIII Theos Philopator feared retribution from Caesar and had Pompey assassinated upon his arrival.
Instead of immediately pursuing Pompey and the remaining Optimates, Caesar returned to Rome and was appointed Dictator with Antony as his Master of the Horse and second in command. Caesar presided over his own election to a second consulship for 47 BC and then, after eleven days in office, resigned this dictatorship. Caesar then sailed to Egypt, where he deposed Ptolemy XIII in favor of his sister Cleopatra in 47 BC. The young Cleopatra became Caesar's mistress and bore him a son, Caesarion. Caesar's actions further strengthened Roman control over the already Roman-dominated kingdom.
While Caesar was away in Egypt, Antony remained in Rome to govern Italy and restore order. Without Caesar to guide him, however, Antony quickly faced political difficulties and proved himself unpopular. The chief cause of his political challenges concerned debt forgiveness. One of the tribunes for 47 BC, Publius Cornelius Dolabella, a former general under Pompey, proposed a law which would have canceled all outstanding debts. Antony opposed the law for political and personal reasons: he believed Caesar would not support such massive relief and suspected Dolabella had seduced his wife Antonia Hybrida Minor. When Dolabella sought to enact the law by force and seized the Roman Forum, Antony responded by unleashing his soldiers upon the assembled masses. The resulting instability, especially among Caesar's veterans who would have benefited from the law, forced Caesar to return to Italy by October 47 BC.
Antony's handling of the affair with Dolabella caused a cooling of his relationship with Caesar. Antony's violent reaction had caused Rome to fall into a state of anarchy. Caesar sought to mend relations with the populist leader; Caesar was elected to a third term as consul for 46 BC but proposed that the senate transfer the consulship to Dolabella. When Antony protested, Caesar was forced to withdraw the motion out of shame. Later, Caesar sought to exercise his prerogatives as Dictator and directly proclaim Dolabella as consul instead. Antony again protested and, in his capacity as an Augur, declared the omens were unfavorable, and Caesar again backed down. Seeing the expediency of removing Dolabella from Rome, Caesar ultimately pardoned him for his role in the riots and took him as one of his generals in his campaigns against the remaining Optimates resistance. Antony, however, was stripped of all official positions and received no appointments for the year 46 BC or 45 BC. Instead of Antony, Caesar appointed Marcus Aemilius Lepidus to be his consular colleague for 46 BC. While Caesar campaigned in North Africa, Antony remained in Rome as a mere private citizen. After returning victorious from North Africa, Caesar was appointed Dictator for ten years and brought Cleopatra and their son to Rome. Antony again remained in Rome while Caesar, in 45 BC, sailed to Spain to defeat the final opposition to his rule. When Caesar returned in late 45 BC, the civil war was over.
During this time Antony married his third wife, Fulvia. Following the scandal with Dolabella, Antony had divorced his second wife and quickly married Fulvia. Fulvia had previously been married to both Publius Clodius Pulcher and Gaius Scribonius Curio, having been a widow since Curio's death in the Battle of the Bagradas in 49 BC. Though Antony and Fulvia were formally married in 47 BC, Cicero suggests the two had been in a relationship since at least 58 BC. The union produced two children: Marcus Antonius Antyllus (born 47 BC) and Iullus Antonius (born 45 BC).
Whatever conflicts existed between himself and Caesar, Antony remained faithful to Caesar, ensuring their estrangement did not last long. Antony reunited with Caesar at Narbo in 45 BC with full reconciliation coming in 44 BC when Antony was elected consul alongside Caesar. Caesar planned a new invasion of Parthia and desired to leave Antony in Italy to govern Rome in his name. The reconciliation came soon after Antony rejected an offer by Gaius Trebonius, one of Caesar's generals, to join a conspiracy to assassinate Caesar.
Soon after they assumed office together, the Lupercalia festival was held on 15 February 44 BC. The festival was held in honor of Lupa, the she-wolf who suckled the infant orphans Romulus and Remus, the founders of Rome. The political atmosphere of Rome at the time of the festival was deeply divided. Caesar had enacted a number of constitutional reforms which centralized effectively all political powers within his own hands. He was granted further honors, including a form of semi-official cult, with Antony as his high priest. Additionally, the day before the festival, Caesar had been named Dictator for Life, effectively granting unlimited power. Caesar's political rivals feared these reforms were his attempts at transforming the Republic into an open monarchy. During the festival's activities, Antony publicly offered Caesar a diadem, which Caesar refused. The event presented a powerful message: a diadem was a symbol of a king. By refusing it, Caesar demonstrated he had no intention of making himself King of Rome. Antony's motive for such actions is not clear and it is unknown if he acted with Caesar's prior approval or on his own.
A group of senators resolved to kill Caesar to prevent him from seizing the throne. Chief among them were Marcus Junius Brutus and Gaius Cassius Longinus. Although Cassius was "the moving spirit" in the plot, winning over the chief assassins to the cause of tyrannicide, Brutus, with his family's history of deposing Rome's kings, became their leader. Cicero, though not personally involved in the conspiracy, later claimed Antony's actions sealed Caesar's fate as such an obvious display of Caesar's preeminence motivated them to act. Originally, the conspirators had planned to eliminate not only Caesar but also many of his supporters, including Antony, but Brutus rejected the proposal, limiting the conspiracy to Caesar alone. With Caesar preparing to depart for Parthia in late March, the conspirators prepared to act when Caesar appeared for the senate meeting on the Ides of March (15 March).
Antony was supposed to attend with Caesar, but was waylaid at the door by one of the plotters and prevented from intervening. According to the Greek historian Plutarch, as Caesar arrived at the senate, Lucius Tillius Cimber presented him with a petition to recall his exiled brother. The other conspirators crowded round to offer their support. Within moments, the entire group, including Brutus, was striking out at the dictator. Caesar attempted to get away, but, blinded by blood, he tripped and fell; the men continued stabbing him as he lay defenseless on the lower steps of the portico. According to the Roman historian Eutropius, 60 or more men participated in the assassination. Caesar was stabbed 23 times and died from blood loss attributable to the multiple stab wounds.
In the turmoil surrounding the assassination, Antony escaped Rome dressed as a slave, fearing Caesar's death would be the start of a bloodbath among his supporters. When this did not occur, he soon returned to Rome. The conspirators, who styled themselves the "Liberatores" ("The Liberators"), had barricaded themselves on the Capitoline Hill for their own safety. Though they believed Caesar's death would restore the Republic, Caesar had been immensely popular with the Roman middle and lower classes, who became enraged upon learning a small group of aristocrats had killed their champion.
Antony, as the sole consul, soon took the initiative and seized the state treasury. Calpurnia, Caesar's widow, presented him with Caesar's personal papers and custody of his extensive property, clearly marking him as Caesar's heir and leader of the Caesarian faction. Caesar's Master of the Horse Marcus Aemilius Lepidus marched over 6,000 troops into Rome on 16 March to restore order and to act as the bodyguards of the Caesarian faction. Lepidus wanted to storm the Capitol, but Antony preferred a peaceful solution as a majority of both the Liberators and Caesar's own supporters preferred a settlement over civil war. On 17 March, at Antony's arrangement, the senate met to discuss a compromise, which, due to the presence of Caesar's veterans in the city, was quickly reached. Caesar's assassins would be pardoned of their crimes and, in return, all of Caesar's actions would be ratified. In particular, the offices assigned to both Brutus and Cassius by Caesar were likewise ratified. Antony also agreed to accept the appointment of his rival Dolabella as his consular colleague to replace Caesar. Having neither troops, money, nor popular support, the Liberatores were forced to accept Antony's proposal. This compromise was a great success for Antony, who managed to simultaneously appease Caesar's veterans, reconcile the senate majority, and appear to the Liberatores as their partner and protector.
On 19 March, Caesar's will was opened and read. In it, Caesar posthumously adopted his great-nephew Gaius Octavius and named him his principal heir. Then only nineteen years old and stationed with Caesar's army in Macedonia, the youth became a member of Caesar's Julian clan, changing his name to "Gaius Julius Caesar Octavianus" (Octavian) in accordance with the conventions of Roman adoption. Though not the chief beneficiary, Antony did receive some bequests.
Shortly after the compromise was reached, as a sign of good faith, Brutus, against the advice of Cassius and Cicero, agreed Caesar would be given a public funeral and his will would be validated. Caesar's funeral was held on 20 March. Antony, as Caesar's faithful lieutenant and incumbent consul, was chosen to preside over the ceremony and to recite the elegy. During the demagogic speech, he enumerated the deeds of Caesar and, publicly reading his will, detailed the donations Caesar had left to the Roman people. Antony then seized the blood-stained toga from Caesar's body and presented it to the crowd. Worked into a fury by the bloody spectacle, the assembly rioted. Several buildings in the Forum and some houses of the conspirators were burned to the ground. Panicked, many of the conspirators fled Italy. Under the pretext of not being able to guarantee their safety, Antony relieved Brutus and Cassius of their judicial duties in Rome and instead assigned them responsibility for procuring wheat for Rome from Sicily and Asia. Such an assignment, in addition to being unworthy of their rank, would have kept them far from Rome and shifted the balance towards Antony. Refusing such secondary duties, the two traveled to Greece instead. Additionally, Cleopatra left Rome to return to Egypt.
Despite the provisions of Caesar's will, Antony proceeded to act as leader of the Caesarian faction, including appropriating for himself a portion of Caesar's fortune rightfully belonging to Octavian. Antony enacted the Lex Antonia, which formally abolished the Dictatorship, in an attempt to consolidate his power by gaining the support of the senatorial class. He also enacted a number of laws he claimed to have found in Caesar's papers to ensure his popularity with Caesar's veterans, particularly by providing land grants to them. Lepidus, with Antony's support, was named Pontifex Maximus to succeed Caesar. To solidify the alliance between Antony and Lepidus, Antony's daughter Antonia Prima was engaged to Lepidus' son, also named Lepidus. Surrounding himself with a bodyguard of over six thousand of Caesar's veterans, Antony presented himself as Caesar's true successor, largely ignoring Octavian.
Octavian arrived in Rome in May to claim his inheritance. Although Antony had amassed political support, Octavian still had opportunity to rival him as the leading member of the Caesarian faction. The senatorial Republicans increasingly viewed Antony as a new tyrant. Antony had lost the support of many Romans and supporters of Caesar when he opposed the motion to elevate Caesar to divine status. When Antony refused to relinquish Caesar's vast fortune to him, Octavian borrowed heavily to fulfill the bequests in Caesar's will to the Roman people and to his veterans, as well as to establish his own bodyguard of veterans. This earned him the support of Caesarian sympathizers who hoped to use him as a means of eliminating Antony. The senate, and Cicero in particular, viewed Antony as the greater danger of the two. By summer 44 BC, Antony was in a difficult position due to his actions regarding his compromise with the Liberatores following Caesar's assassination. He could either denounce the Liberatores as murderers and alienate the senate or he could maintain his support for the compromise and risk betraying the legacy of Caesar, strengthening Octavian's position. In either case, his situation as ruler of Rome would be weakened. Roman historian Cassius Dio later recorded that while Antony, as consul, maintained the advantage in the relationship, the general affection of the Roman people was shifting to Octavian due to his status as Caesar's son.
Supporting the senatorial faction against Antony, Octavian, in September 44 BC, encouraged the leading senator Marcus Tullius Cicero to attack Antony in a series of speeches portraying him as a threat to the Republican order. Risk of civil war between Antony and Octavian grew. Octavian continued to recruit Caesar's veterans to his side, away from Antony, with two of Antony's legions defecting in November 44 BC. At that time, Octavian, only a private citizen, lacked legal authority to command the Republic's armies, making his command illegal. With popular opinion in Rome turning against him and his consular term nearing its end, Antony attempted to secure a favorable military assignment to secure an army to protect himself. The senate, as was custom, assigned Antony and Dolabella the provinces of Macedonia and Syria, respectively, to govern in 43 BC after their consular terms expired. Antony, however, objected to the assignment, preferring to govern Cisalpine Gaul which had been assigned to Decimus Junius Brutus Albinus, one of Caesar's assassins. When Decimus refused to surrender his province, Antony marched north in December 44 BC with his remaining soldiers to take the province by force, besieging Decimus at Mutina. The senate, led by a fiery Cicero, denounced Antony's actions and declared him an outlaw.
Ratifying Octavian's extraordinary command on 1 January 43 BC, the senate dispatched him along with consuls Hirtius and Pansa to defeat Antony and his five legions. Antony's forces were defeated at the Battle of Mutina in April 43 BC, forcing Antony to retreat to Transalpine Gaul. Both consuls were killed, however, leaving Octavian in sole command of their armies, some eight legions.
With Antony defeated, the senate, hoping to eliminate Octavian and the remainder of the Caesarian party, assigned command of the Republic's legions to Decimus. Sextus Pompey, son of Caesar's old rival Pompey Magnus, was given command of the Republic's fleet from his base in Sicily while Brutus and Cassius were granted the governorships of Macedonia and Syria respectively. These appointments attempted to renew the "Republican" cause. However, the eight legions serving under Octavian, composed largely of Caesar's veterans, refused to follow one of Caesar's murderers, allowing Octavian to retain his command. Meanwhile, Antony recovered his position by joining forces with Marcus Aemilius Lepidus, who had been assigned the governorship of Transalpine Gaul and Nearer Spain. Antony sent Lepidus to Rome to broker a conciliation. Though he was an ardent Caesarian, Lepidus had maintained friendly relations with the senate and with Sextus Pompey. His legions, however, quickly joined Antony, giving him control over seventeen legions, the largest army in the West.
By mid-May, Octavian began secret negotiations to form an alliance with Antony to provide a united Caesarian party against the Liberators. Remaining in Cisalpine Gaul, Octavian dispatched emissaries to Rome in July 43 BC demanding he be appointed consul to replace Hirtius and Pansa and that the decree declaring Antony a public enemy be rescinded. When the senate refused, Octavian marched on Rome with his eight legions and assumed control of the city in August 43 BC. Octavian proclaimed himself consul, rewarded his soldiers, and then set about prosecuting Caesar's murderers. By the lex Pedia, all of the conspirators and Sextus Pompey were convicted "in absentia" and declared public enemies. Then, at the instigation of Lepidus, Octavian went to Cisalpine Gaul to meet Antony.
In November 43 BC, Octavian, Lepidus, and Antony met near Bononia. After two days of discussions, the group agreed to establish a three man dictatorship to govern the Republic for five years, known as the "Three Men for the Restoration of the Republic" (Latin: "Triumviri Rei publicae Constituendae"), known to modern historians as the Second Triumvirate. They shared military command of the Republic's armies and provinces among themselves: Antony received Gaul, Lepidus Spain, and Octavian (as the junior partner) Africa. They jointly governed Italy. The Triumvirate would have to conquer the rest of Rome's holdings; Brutus and Cassius held the Eastern Mediterranean, and Sextus Pompey held the Mediterranean islands. On 27 November 43 BC, the Triumvirate was formally established by a new law, the lex Titia. Octavian and Antony reinforced their alliance through Octavian's marriage to Antony's stepdaughter, Clodia Pulchra.
The primary objective of the Triumvirate was to avenge Caesar's death and to make war upon his murderers. Before marching against Brutus and Cassius in the East, the Triumvirs issued proscriptions against their enemies in Rome. The Dictator Lucius Cornelius Sulla had taken similar action to purge Rome of his opponents in 82 BC. The proscribed were named on public lists, stripped of citizenship, and outlawed. Their wealth and property were confiscated by the state, and rewards were offered to anyone who secured their arrest or death. With such encouragements, the proscription produced deadly results; two thousand Roman knights and one third of the senate were executed, among them Cicero, who was killed on 7 December. The confiscations helped replenish the State Treasury, which had been depleted by Caesar's civil war the decade before; when this seemed insufficient to fund the imminent war against Brutus and Cassius, the Triumvirs imposed new taxes, especially on the wealthy. By January 42 BC the proscription had ended; it had lasted two months, and though less bloody than Sulla's, it traumatized Roman society. A number of those named and outlawed had fled to either Sextus Pompey in Sicily or to the Liberators in the East. Senators who swore loyalty to the Triumvirate were allowed to keep their positions; on 1 January 42 BC, the senate officially deified Caesar as "The Divine Julius", and confirmed Antony's position as his high priest.
Due to the infighting within the Triumvirate during 43 BC, Brutus and Cassius had assumed control of much of Rome's eastern territories and amassed a large army. Before the Triumvirate could cross the Adriatic Sea into Greece, where the Liberators had stationed their army, it had to address the threat posed by Sextus Pompey and his fleet. From his base in Sicily, Sextus raided the Italian coast and blockaded the Triumvirs. Octavian's friend and admiral Quintus Salvidienus Rufus thwarted an attack by Sextus against the southern Italian mainland at Rhegium, but Salvidienus was then defeated in the resulting naval battle because of the inexperience of his crews. Only when Antony arrived with his fleet was the blockade broken. Though the blockade was lifted, control of Sicily remained in Sextus' hands; the defeat of the Liberators, however, was the Triumvirate's first priority.
In the summer of 42 BC, Octavian and Antony sailed for Macedonia to face the Liberators with nineteen legions, the vast majority of their army (approximately 100,000 regular infantry plus supporting cavalry and irregular auxiliary units), leaving Rome under the administration of Lepidus. The Liberators likewise commanded nineteen legions; theirs, however, were not at full strength, while the legions of Antony and Octavian were. Thus, while the Triumvirs commanded a larger number of infantry, the Liberators commanded a larger cavalry contingent. The Liberators, who controlled Macedonia, did not wish to engage in a decisive battle, but rather to attain a good defensive position and then use their naval superiority to block the Triumvirs' communications with their supply base in Italy. They had spent the previous months plundering Greek cities to swell their war-chest and had gathered in Thrace with the Roman legions from the Eastern provinces and levies from Rome's client kingdoms.
Brutus and Cassius held a position on the high ground along both sides of the via Egnatia west of the city of Philippi. Their southern flank was anchored to a supposedly impassable marsh, while their northern flank was bordered by impassable hills. They had plenty of time to fortify their position with a rampart and a ditch. Brutus placed his camp to the north of the via Egnatia while Cassius occupied the south. Antony arrived shortly thereafter and positioned his army south of the via Egnatia, while Octavian put his legions north of the road. Antony offered battle several times, but the Liberators could not be lured from their defensive positions. Antony therefore tried to secretly outflank the Liberators' position through the marshes in the south. This provoked a pitched battle on 3 October 42 BC. Antony commanded the Triumvirate's army due to Octavian's sickness on the day, with Antony directly controlling the right flank opposite Cassius. Because of his health, Octavian remained in camp while his lieutenants assumed a position on the left flank opposite Brutus. In the resulting first battle of Philippi, Antony defeated Cassius and captured his camp, while Brutus overran Octavian's troops and penetrated into the Triumvirs' camp but was unable to capture the sick Octavian. The battle was a tactical draw, but due to poor communications Cassius believed it had been a complete defeat and committed suicide to avoid capture.
Brutus assumed sole command of the Liberator army and preferred a war of attrition over open conflict. His officers, however, were dissatisfied with these defensive tactics and his Caesarian veterans threatened to defect, forcing Brutus to give battle at the second battle of Philippi on 23 October. While the battle was initially evenly matched, Antony's leadership routed Brutus' forces. Brutus committed suicide the day after the defeat and the remainder of his army swore allegiance to the Triumvirate. Over fifty thousand Romans died in the two battles. While Antony treated the losers mildly, Octavian dealt cruelly with his prisoners and even beheaded Brutus' corpse.
The battles of Philippi ended the civil war in favor of the Caesarian faction. With the defeat of the Liberators, only Sextus Pompey and his fleet remained to challenge the Triumvirate's control over the Republic.
The victory at Philippi left the members of the Triumvirate as masters of the Republic, save Sextus Pompey in Sicily. Upon returning to Rome, the Triumvirate repartitioned rule of Rome's provinces among themselves, with Antony as the clear senior partner. He received the largest share, governing all of the Eastern provinces while retaining Gaul in the West. Octavian's position improved, as he received Spain, which was taken from Lepidus. Lepidus was then reduced to holding only Africa, and he assumed a clearly tertiary role in the Triumvirate. Rule over Italy remained undivided, but Octavian was assigned the difficult and unpopular task of demobilizing their veterans and providing them with land distributions in Italy. Antony assumed direct control of the East while installing one of his lieutenants as the ruler of Gaul. During his absence, several of his supporters held key positions in Rome to protect his interests there.
The East was in need of reorganization after the rule of the Liberators in the previous years. In addition, Rome contended with the Parthian Empire for dominance of the Near East. The Parthian threat to the Triumvirate's rule was urgent because the Parthians had supported the Liberators in the recent civil war, aid which included the supply of troops at Philippi. As ruler of the East, Antony also assumed responsibility for overseeing Caesar's planned invasion of Parthia to avenge the defeat of Marcus Licinius Crassus at the Battle of Carrhae in 53 BC.
In 42 BC, the Roman East was composed of several directly controlled provinces and client kingdoms. The provinces included Macedonia, Asia, Bithynia, Cilicia, Cyprus, Syria, and Cyrenaica. Approximately half of the eastern territory was controlled by Rome's client kingdoms: nominally independent kingdoms subject to Roman direction.
Antony spent the winter of 42 BC in Athens, where he governed generously towards the Greek cities. A proclaimed "philhellene" ("friend of all things Greek"), Antony supported Greek culture to win the loyalty of the inhabitants of the Greek East. He attended religious festivals and ceremonies, including initiation into the Eleusinian Mysteries, a secret cult dedicated to the worship of the goddesses Demeter and Persephone. Beginning in 41 BC, he traveled across the Aegean Sea to Anatolia, leaving his friend Lucius Marcius Censorinus as governor of Macedonia and Achaea. Upon his arrival in Ephesus in Asia, Antony was worshiped as the god Dionysus born anew. He demanded heavy taxes from the Hellenic cities in return for his pro-Greek policies, but exempted those cities which had remained loyal to Caesar during the civil war and compensated those which had suffered under Caesar's assassins, including Rhodes, Lycia, and Tarsus. He granted pardons to all Roman nobles living in the East who had supported the Optimate cause, except for Caesar's assassins.
Ruling from Ephesus, Antony consolidated Rome's hegemony in the East, receiving envoys from Rome's client kingdoms and intervening in their dynastic affairs, extracting enormous financial "gifts" from them in the process. Though King Deiotarus of Galatia had supported Brutus and Cassius following Caesar's assassination, Antony allowed him to retain his position. He also confirmed Ariarathes X as king of Cappadocia after the execution of his brother Ariobarzanes III of Cappadocia by Cassius before the Battle of Philippi. In Hasmonean Judea, several Jewish delegations complained to Antony of the harsh rule of Phasael and Herod, the sons of Rome's assassinated chief Jewish minister Antipater the Idumaean. After Herod offered him a large financial gift, Antony confirmed the brothers in their positions. Subsequently, influenced by the beauty and charms of Glaphyra, the widow of Archelaüs (formerly the high priest of Comana), Antony deposed Ariarathes X and appointed Glaphyra's son, Archelaüs, to rule Cappadocia.
In October 41 BC, Antony requested that Rome's chief eastern vassal, Cleopatra, queen of Ptolemaic Egypt, meet him at Tarsus in Cilicia. Antony had first met a young Cleopatra while campaigning in Egypt in 55 BC, and again in 48 BC when Caesar backed her as queen of Egypt over the claims of her half-sister Arsinoe. Cleopatra bore Caesar a son, Caesarion, in 47 BC, and the two lived in Rome as Caesar's guests until his assassination in 44 BC. After Caesar's assassination, Cleopatra and Caesarion returned to Egypt, where she named the child as her co-ruler. In 42 BC, the Triumvirate, in recognition of Cleopatra's aid to Publius Cornelius Dolabella against the Liberators, granted official recognition to Caesarion's position as king of Egypt. Arriving in Tarsus aboard her magnificent ship, Cleopatra invited Antony to a grand banquet to solidify their alliance. As the most powerful of Rome's eastern vassals, Egypt was indispensable in Rome's planned military invasion of the Parthian Empire. At Cleopatra's request, Antony ordered the execution of Arsinoe, who, though she had been marched in Caesar's triumphal parade in 46 BC, had been granted sanctuary at the temple of Artemis in Ephesus. Antony and Cleopatra then spent the winter of 41 BC together in Alexandria. Cleopatra bore Antony twin children, Alexander Helios and Cleopatra Selene II, in 40 BC, and a third, Ptolemy Philadelphus, in 36 BC. In 40 BC, Antony also granted Cleopatra formal control over Cyprus, which had been under Egyptian administration since 47 BC during the turmoil of Caesar's civil war, as a gift for her loyalty to Rome.
Antony, in his first months in the East, raised money, reorganized his troops, and secured the alliance of Rome's client kingdoms. He also promoted himself as a Hellenistic ruler, which won him the affection of the Greek peoples of the East but also made him the target of Octavian's propaganda in Rome. According to some ancient authors, Antony led a carefree life of luxury in Alexandria. Upon learning that the Parthian Empire had invaded Rome's territory in early 40 BC, Antony left Egypt for Syria to confront the invasion. However, after a short stay in Tyre, he was forced to sail with his army to Italy to confront Octavian due to Octavian's war against Antony's wife and brother.
Following the defeat of Brutus and Cassius, while Antony was stationed in the East, Octavian had authority over the West. Octavian's chief responsibility was distributing land to the tens of thousands of Caesar's veterans who had fought for the Triumvirate. Additionally, tens of thousands of veterans who had fought for the Republican cause in the war also required land grants; this was necessary to ensure they would not support a political opponent of the Triumvirate. However, the Triumvirs did not possess sufficient state-controlled land to allot to the veterans, which left Octavian with two choices: alienating many Roman citizens by confiscating their land, or alienating many Roman soldiers who might back a military rebellion against the Triumvirate's rule. Octavian chose the former. As many as eighteen Roman towns throughout Italy were affected by the confiscations of 41 BC, with entire populations driven out.
Led by Fulvia, the wife of Antony, the senators grew hostile towards Octavian over the issue of the land confiscations. According to the ancient historian Cassius Dio, Fulvia was the most powerful woman in Rome at the time. According to Dio, while Publius Servilius Vatia and Lucius Antonius were the consuls for the year 41 BC, real power was vested in Fulvia. As she was the mother-in-law of Octavian and the wife of Antony, the senate took no action without her support. Fearing Octavian's land grants would cause the loyalty of the Caesarian veterans to shift away from Antony, Fulvia traveled constantly with her children to the new veteran settlements in order to remind the veterans of their debt to Antony. Fulvia also attempted to delay the land settlements until Antony returned to Rome, so that he could share credit for the settlements. With the help of Antony's brother, the consul of 41 BC Lucius Antonius, Fulvia encouraged the senate to oppose Octavian's land policies.
The conflict between Octavian and Fulvia caused great political and social unrest throughout Italy. Tensions escalated into open war, however, when Octavian divorced Clodia Pulchra, Fulvia's daughter from her first husband Publius Clodius Pulcher. Outraged, Fulvia, supported by Lucius, raised an army to fight for Antony's rights against Octavian. According to the ancient historian Appian, Fulvia's chief reason for the war was her jealousy of Antony's affair with Cleopatra in Egypt and her desire to draw Antony back to Rome. Lucius and Fulvia took a political and military gamble in opposing Octavian and Lepidus, however, as the Roman army still depended on the Triumvirs for its salaries. Lucius and Fulvia, supported by their army, marched on Rome and promised the people an end to the Triumvirate in favor of Antony's sole rule. However, when Octavian returned to the city with his army, the pair was forced to retreat to Perusia in Etruria. Octavian placed the city under siege while Lucius waited for Antony's legions in Gaul to come to his aid. Away in the East and embarrassed by Fulvia's actions, Antony gave no instructions to his legions. Without reinforcements, Lucius and Fulvia were forced to surrender in February 40 BC. While Octavian pardoned Lucius for his role in the war and even granted him command in Spain as his chief lieutenant there, Fulvia was forced to flee to Greece with her children. With the war over, Octavian was left in sole control of Italy. When Antony's governor of Gaul died, Octavian took over his legions there, further strengthening his control over the West.
Despite the Parthian Empire's invasion of Rome's eastern territories, Fulvia's civil war forced Antony to leave the East and return to Rome in order to secure his position. Meeting her in Athens, Antony rebuked Fulvia for her actions before sailing on to Italy with his army, where he laid siege to Brundisium to confront Octavian. This new conflict proved untenable for both Octavian and Antony, however. Their centurions, who had become important figures politically, refused to fight due to their shared service under Caesar, and the legions under their command followed suit. Meanwhile, in Sicyon, Fulvia died of a sudden and unknown illness. Fulvia's death and the mutiny of their soldiers allowed the triumvirs to effect a reconciliation through a new power-sharing agreement in September 40 BC. The Roman world was redivided, with Antony receiving the Eastern provinces, Octavian the Western provinces, and Lepidus relegated to a clearly junior position as governor of Africa. This agreement, known as the Treaty of Brundisium, reinforced the Triumvirate and allowed Antony to begin preparing for Caesar's long-awaited campaign against the Parthian Empire. As a symbol of their renewed alliance, Antony married Octavia, Octavian's sister, in October 40 BC.
The rise of the Parthian Empire in the 3rd century BC and Rome's expansion into the Eastern Mediterranean during the 2nd century BC brought the two powers into direct contact, causing centuries of tumultuous and strained relations. Though periods of peace fostered cultural and commercial exchanges, war was a constant threat. Influence over the buffer state of the Kingdom of Armenia, located to the north-east of Roman Syria, was often a central issue in the Roman–Parthian conflict. In 95 BC, the Parthian Shah Mithridates II installed Tigranes the Great as Parthia's client-king over Armenia. Tigranes waged a series of three wars against Rome before being ultimately defeated by Pompey in 66 BC. Thereafter, with his son Artavasdes II in Rome as a hostage, Tigranes ruled Armenia as an ally of Rome until his death in 55 BC. Rome then installed Artavasdes II as king and continued its influence over Armenia.
In 53 BC, Rome's governor of Syria, Marcus Licinius Crassus, led an expedition across the Euphrates River into Parthian territory to confront the Parthian Shah Orodes II. Artavasdes II offered Crassus the aid of nearly forty thousand troops to assist his Parthian expedition on the condition that Crassus invade through Armenia as the safer route. Crassus refused, choosing instead the more direct route by crossing the Euphrates directly into desert Parthian territory. Crassus' actions proved disastrous as his army was defeated at the Battle of Carrhae by a numerically inferior Parthian force. Crassus' defeat forced Armenia to shift its loyalty to Parthia, with Artavasdes II's sister marrying Orodes' son and heir Pacorus.
In early 44 BC, Julius Caesar announced his intention to invade Parthia and restore Roman power in the East. His reasons were to punish the Parthians for assisting Pompey in the recent civil war, to avenge Crassus' defeat at Carrhae, and, above all, to match the glory of Alexander the Great. Before Caesar could launch his campaign, however, he was assassinated. As part of the compromise between Antony and the Republicans to restore order following Caesar's murder, Publius Cornelius Dolabella was assigned the governorship of Syria and command over Caesar's planned Parthian campaign. The compromise did not hold, however, and the Republicans were forced to flee to the East. The Republicans directed Quintus Labienus to attract the Parthians to their side in the resulting war against Antony and Octavian. After the Republicans were defeated at the Battle of Philippi, Labienus joined the Parthians. Despite Rome's internal turmoil at the time, the Parthians did not immediately benefit from the power vacuum in the East: despite Labienus' urgings, Orodes II was reluctant to act.
In the summer of 41 BC, Antony, to reassert Roman power in the East, conquered Palmyra on the Roman-Parthian border. Antony then spent the winter of 41 BC in Alexandria with Cleopatra, leaving only two legions to defend the Syrian border against Parthian incursions. The legions, however, were composed of former Republican troops and Labienus convinced Orodes II to invade.
A Parthian army, led by Orodes II's eldest son Pacorus, invaded Syria in early 40 BC. Labienus, the Republican ally of Brutus and Cassius, accompanied him as an adviser and rallied the former Republican soldiers stationed in Syria to the Parthian cause, recruiting many of them for the campaign against Antony. The joint Parthian–Roman force, after initial success in Syria, separated to lead the offensive in two directions: Pacorus marched south toward Hasmonean Judea while Labienus crossed the Taurus Mountains to the north into Cilicia. Labienus conquered southern Anatolia with little resistance. The Roman governor of Asia, Lucius Munatius Plancus, a partisan of Antony, was forced to flee his province, allowing Labienus to recruit the Roman soldiers stationed there. For his part, Pacorus advanced south to Phoenicia and Palestine. In Hasmonean Judea, the exiled prince Antigonus allied himself with the Parthians. When his uncle, Rome's client king Hyrcanus II, refused to accept Parthian domination, he was deposed in favor of Antigonus as Parthia's client king in Judea. Pacorus' campaign captured much of the Syrian and Palestinian interior, along with much of the Phoenician coast; the city of Tyre remained the last major Roman outpost in the region.
Antony, then in Egypt with Cleopatra, did not respond immediately to the Parthian invasion. Though he left Alexandria for Tyre in early 40 BC, news of the civil war between his wife and Octavian forced him to return to Italy with his army to secure his position in Rome rather than confront the Parthians. Instead, Antony dispatched Publius Ventidius Bassus to check the Parthian advance. Arriving in the East in spring 39 BC, Ventidius surprised Labienus near the Taurus Mountains, claiming victory at the Cilician Gates. Ventidius ordered Labienus executed as a traitor, and the formerly rebellious Roman soldiers under his command were reincorporated under Antony's control. He then met a Parthian army at the border between Cilicia and Syria, defeating it and killing a large portion of the Parthian soldiers at the Amanus Pass. Ventidius' actions temporarily halted the Parthian advance and restored Roman authority in the East, forcing Pacorus to abandon his conquests and return to Parthia.
In the spring of 38 BC, the Parthians resumed their offensive, with Pacorus leading an army across the Euphrates. In order to gain time, Ventidius leaked disinformation to Pacorus implying that he should cross the Euphrates River at the usual ford. Pacorus did not trust this information and decided to cross the river much farther downstream; this was what Ventidius had hoped would occur, and it gave him time to ready his forces. The Parthians faced no opposition and proceeded to the town of Gindarus in Cyrrhestica, where Ventidius' army was waiting. At the Battle of Cyrrhestica, Ventidius inflicted an overwhelming defeat on the Parthians, which resulted in the death of Pacorus. Overall, the Roman army had achieved a complete success: Ventidius' three successive victories forced the Parthians back across the Euphrates. Pacorus' death threw the Parthian Empire into chaos. Shah Orodes II, overwhelmed by grief at his son's death, appointed his younger son Phraates IV as his successor. However, Phraates IV assassinated Orodes II in late 38 BC, succeeding him on the throne.
Ventidius feared Antony's wrath if he invaded Parthian territory, thereby stealing Antony's glory; so instead he attacked and subdued the eastern kingdoms which had revolted against Roman control following the disastrous defeat of Crassus at Carrhae. One such rebel was King Antiochus of Commagene, whom he besieged in Samosata. Antiochus tried to make peace with Ventidius, but Ventidius told him to approach Antony directly. After peace was concluded, Antony sent Ventidius back to Rome, where he celebrated a triumph, the first Roman to triumph over the Parthians.
While Antony and the other Triumvirs ratified the Treaty of Brundisium to redivide the Roman world among themselves, the rebel general Sextus Pompey, the son of Caesar's rival Pompey the Great, was largely ignored. From his stronghold in Sicily, he continued his piratical activities along the Italian coast and blocked the shipment of grain to Rome. The lack of food in Rome caused the public to blame the Triumvirate and shift its sympathies towards Pompey. This pressure forced the Triumvirs to meet with Sextus in early 39 BC.
While Octavian wanted an end to the ongoing blockade of Italy, Antony sought peace in the West in order to make the Triumvirate's legions available for his planned campaign against the Parthians. Though the Triumvirs rejected Sextus' initial request to replace Lepidus as the third man within the Triumvirate, they did grant other concessions. Under the terms of the Treaty of Misenum, Sextus was allowed to retain control over Sicily and Sardinia, with the provinces of Corsica and Greece being added to his territory. He was also promised a future position with the Priestly College of Augurs and the consulship for 35 BC. In exchange, Sextus agreed to end his naval blockade of Italy, supply Rome with grain, and halt his piracy of Roman merchant ships. However, the most important provision of the Treaty was the end of the proscription the Triumvirate had begun in late 43 BC. Many of the proscribed senators, rather than face death, had fled to Sicily seeking Sextus' protection. With the exception of those responsible for Caesar's assassination, all those proscribed were allowed to return to Rome and promised compensation. This caused Sextus to lose many valuable allies as the formerly exiled senators gradually aligned themselves with either Octavian or Antony. To secure the peace, Octavian betrothed his three-year-old nephew and Antony's stepson Marcus Claudius Marcellus to Sextus' daughter Pompeia. With peace in the West secured, Antony planned to retaliate against Parthia by invading their territory. Under an agreement with Octavian, Antony would be supplied with extra troops for his campaign. With this military purpose on his mind, Antony sailed to Greece with Octavia, where he behaved in a most extravagant manner, assuming the attributes of the Greek god Dionysus in 39 BC.
The peace with Sextus was short-lived, however. When Sextus demanded control over Greece as the agreement provided, Antony demanded that the province's tax revenues be used to fund the Parthian campaign. Sextus refused. Meanwhile, Sextus' admiral Menas betrayed him, shifting his loyalty to Octavian and thereby granting Octavian control of Corsica, Sardinia, three of Sextus' legions, and a larger naval force. In response, Sextus renewed his blockade of Italy, preventing Octavian from sending the promised troops to Antony for the Parthian campaign. This new delay caused Antony to quarrel with Octavian, forcing Octavia to mediate a truce between them. Under the Treaty of Tarentum, Antony provided a large naval force for Octavian's use against Sextus while Octavian promised to raise new legions for Antony to support his invasion of Parthia. As the term of the Triumvirate was set to expire at the end of 38 BC, the two unilaterally extended their term of office another five years, until 33 BC, without seeking approval of the senate or the popular assemblies. To seal the Treaty, Antony's elder son Marcus Antonius Antyllus, then only six years old, was betrothed to Octavian's only daughter Julia, then only an infant. With the Treaty signed, Antony returned to the East, leaving Octavia in Italy.
After Publius Ventidius Bassus returned to Rome in triumph for his defensive campaign against the Parthians, Antony appointed Gaius Sosius as the new governor of Syria and Cilicia in early 38 BC. Antony, still in the West negotiating with Octavian, ordered Sosius to depose Antigonus, who had been installed as ruler of Hasmonean Judea during the recent Parthian invasion, and to make Herod the new Roman client king in the region. Years before, in 40 BC, the Roman senate had proclaimed Herod "King of the Jews" because Herod had been a loyal supporter of Hyrcanus II, Rome's previous client king before the Parthian invasion, and was from a family with long-standing connections to Rome. The Romans hoped to use Herod as a bulwark against the Parthians in the coming campaign.
Advancing south, Sosius captured the island-city of Aradus on the coast of Phoenicia by the end of 38 BC. The following year, the Romans besieged Jerusalem. After a forty-day siege, the Roman soldiers stormed the city and, despite Herod's pleas for restraint, acted without mercy, pillaging and killing all in their path, prompting Herod to complain to Antony. Herod finally resorted to bribing Sosius and his troops in order that they would not leave him "king of a desert". Antigonus was forced to surrender to Sosius, and was sent to Antony for the triumphal procession in Rome. Herod, however, fearing that Antigonus would win backing in Rome, bribed Antony to execute Antigonus. Antony, who recognized that Antigonus would remain a permanent threat to Herod, ordered him beheaded in Antioch. Now secure on his throne, Herod would rule the Herodian Kingdom until his death in 4 BC, and would be an ever-faithful client king of Rome.
With the Triumvirate renewed in 38 BC, Antony returned to Athens for the winter with his new wife Octavia, the sister of Octavian. After Phraates IV assassinated his father Orodes II and seized the Parthian throne in late 38 BC, Antony prepared to invade Parthia himself.
Antony, however, realized Octavian had no intention of sending him the additional legions promised under the Treaty of Tarentum. To supplement his own armies, Antony instead looked to Rome's principal vassal in the East: his lover Cleopatra. In addition to significant financial resources, Cleopatra's backing of his Parthian campaign allowed Antony to amass the largest army Rome had ever assembled in the East. Wintering in Antioch during 37 BC, Antony commanded a combined Roman–Egyptian army numbering some 200,000 men, including sixteen legions (approximately 160,000 soldiers) plus an additional 40,000 auxiliaries. Such a force was twice the size of Marcus Licinius Crassus' army from his failed Parthian invasion of 53 BC and three times the size of the armies of Lucius Licinius Lucullus and Lucius Cornelius Sulla during the Mithridatic Wars. The size of his army indicated Antony's intention to conquer Parthia, or at least to receive its submission by capturing the Parthian capital of Ecbatana. Antony's rear was protected by Rome's client kingdoms in Anatolia, Syria, and Judea, while the client kingdoms of Cappadocia, Pontus, and Commagene would provide supplies along the march.
Antony's first target for his invasion was the Kingdom of Armenia. Ruled by King Artavasdes II, Armenia had been an ally of Rome since the defeat of Tigranes the Great by Pompey in 66 BC during the Third Mithridatic War. However, following Marcus Licinius Crassus' defeat at the Battle of Carrhae in 53 BC, Armenia had been forced into an alliance with Parthia due to Rome's weakened position in the East. Antony dispatched Publius Canidius Crassus to Armenia, receiving Artavasdes II's surrender without opposition. Canidius then led an invasion into Transcaucasia, subduing Iberia. There, Canidius forced the Iberian King Pharnavaz II into an alliance against Zober, king of neighboring Albania, subduing that kingdom and reducing it to a Roman protectorate.
With Armenia and the Caucasus secured, Antony marched south, crossing into the Parthian province of Media Atropatene. Though Antony desired a pitched battle, the Parthians would not engage, allowing him to march deep into Parthian territory by mid-August of 36 BC. The pace of the advance forced Antony to leave his logistics train in the care of two legions (approximately 10,000 soldiers), which the Parthian army then attacked and completely destroyed before Antony could come to its relief. Though the Armenian King Artavasdes II and his cavalry were present during the massacre, they did not intervene. Despite the ambush, Antony continued the campaign. However, he was soon forced to retreat in mid-October after a failed two-month siege of the provincial capital.
The retreat soon proved a disaster as Antony's demoralized army faced increasing supply difficulties in the mountainous terrain during winter while constantly being harassed by the Parthian army. According to the Greek historian Plutarch, eighteen battles were fought between the retreating Romans and the Parthians during the month-long march back to Armenia, with approximately 20,000 infantry and 4,000 cavalry dying during the retreat alone. Once in Armenia, Antony quickly marched back to Syria to protect his interests there by late 36 BC, losing an additional 8,000 soldiers along the way. In all, two-fifths of his original army (some 80,000 men) had died during his failed campaign.
Meanwhile, in Rome, the triumvirate was no more. Octavian forced Lepidus to resign after the older triumvir attempted to take control of Sicily following the defeat of Sextus. Now in sole power in the West, Octavian was occupied in wooing the traditional Republican aristocracy to his side. He married Livia and began attacking Antony in order to strengthen his own position. He argued that Antony was a man of low morals for having abandoned his faithful wife and children in Rome to be with the promiscuous queen of Egypt. Antony was accused of everything, but most of all of "going native", an unforgivable crime to the proud Romans. Several times Antony was summoned to Rome, but he remained in Alexandria with Cleopatra.
Again with Egyptian money, Antony invaded Armenia, this time successfully. On his return, a mock Roman triumph was celebrated in the streets of Alexandria. The parade through the city was a pastiche of Rome's most important military celebration. For the finale, the whole city was summoned to hear a very important political statement: surrounded by Cleopatra and her children, Antony ended his alliance with Octavian.
He distributed kingdoms among his children: Alexander Helios was named king of Armenia, Media and Parthia (territories which were, for the most part, not under Roman control), his twin Cleopatra Selene received Cyrenaica and Libya, and the young Ptolemy Philadelphus was awarded Syria and Cilicia. As for Cleopatra, she was proclaimed Queen of Kings and Queen of Egypt, to rule with Caesarion (Ptolemy XV Caesar, son of Cleopatra by Julius Caesar), King of Kings and King of Egypt. Most important of all, Caesarion was declared the legitimate son and heir of Caesar. These proclamations were known as the "Donations of Alexandria" and caused a fatal breach in Antony's relations with Rome.
While the distribution of nations among Cleopatra's children was hardly a conciliatory gesture, it did not pose an immediate threat to Octavian's political position. Far more dangerous was the acknowledgment of Caesarion as legitimate and heir to Caesar's name. Octavian's base of power was his link with Caesar through adoption, which granted him much-needed popularity and loyalty of the legions. To see this convenient situation attacked by a child borne by the richest woman in the world was something Octavian could not accept. The triumvirate expired on the last day of 33 BC and was not renewed. Another civil war was beginning.
During 33 and 32 BC, a propaganda war was fought in the political arena of Rome, with accusations flying between the two sides. Antony (in Egypt) divorced Octavia and accused Octavian of being a social upstart, of usurping power, and of forging the documents of his adoption by Caesar. Octavian responded with charges of treason: that Antony was illegally keeping provinces that should be given to other men by lot, as was Rome's tradition, and that he had started wars against foreign nations (Armenia and Parthia) without the consent of the senate.
Antony was also held responsible for Sextus Pompey's execution without a trial. In 32 BC, the senate deprived him of his powers and declared war against Cleopatra – not Antony, because Octavian had no wish to advertise his role in perpetuating Rome's internecine bloodshed. Both consuls, Gnaeus Domitius Ahenobarbus and Gaius Sosius, and a third of the senate abandoned Rome to meet Antony and Cleopatra in Greece.
In 31 BC, the war started. Octavian's general Marcus Vipsanius Agrippa captured the Greek city and naval port of Methone, loyal to Antony. The enormous popularity of Octavian with the legions secured the defection of the provinces of Cyrenaica and Greece to his side. On 2 September, the naval Battle of Actium took place. Antony and Cleopatra's navy was overwhelmed, and they were forced to escape to Egypt with 60 ships.
Octavian, now close to absolute power, did not intend to give Antony and Cleopatra any rest. In August 30 BC, assisted by Agrippa, he invaded Egypt. With no other refuge to escape to, Antony committed suicide by stabbing himself with his sword, in the mistaken belief that Cleopatra had already done so. When he found out that Cleopatra was still alive, his friends brought him to her monument, in which she was hiding, and he died in her arms.
Cleopatra was allowed to conduct Antony's burial rites after she had been captured by Octavian. Realising that she was destined for Octavian's triumph in Rome, she made several attempts to take her life and finally succeeded in mid-August. Octavian had Caesarion murdered, but he spared Antony's children by Cleopatra, who were paraded through the streets of Rome. Antony's daughters by Octavia were spared, as was his son, Iullus Antonius. But his elder son, Marcus Antonius Antyllus, was killed by Octavian's men while pleading for his life in the Caesareum.
Cicero's son, Cicero Minor, announced Antony's death to the senate. Antony's honours were revoked and his statues removed, but he was not subject to a complete damnatio memoriae. Cicero Minor also had a decree passed that no member of the Antonii would ever bear the name Marcus again. "In this way Heaven entrusted to the family of Cicero the final acts in the punishment of Antony."
When Antony died, Octavian became uncontested ruler of Rome. In the following years, Octavian, who was known as Augustus after 27 BC, managed to accumulate in his person all administrative, political, and military offices. When Augustus died in AD 14, his political powers passed to his adopted son Tiberius; the Roman Empire had begun.
The rise of Caesar and the subsequent civil war between his two most powerful adherents effectively ended the credibility of the Roman oligarchy as a governing power. It ensured that all future power struggles would centre upon which one individual would achieve supreme control of the government, eliminating the senate and the former magisterial structure as important foci of power in these conflicts. Thus, in history, Antony appears as one of Caesar's main adherents, as one of the two men around whom power coalesced following the assassination of Caesar (the other being Octavian), and finally as one of the three men chiefly responsible for the demise of the Roman Republic.
Antony was known for his obsession with women and sex. He had many mistresses (including Cytheris) and was married in succession to Fadia, Antonia, Fulvia, Octavia and Cleopatra, leaving behind a number of children.
Through his daughters by Octavia, he would become the paternal great-grandfather of the Roman Emperor Caligula, the maternal grandfather of the Emperor Claudius, and both maternal great-great-grandfather and paternal great-great-uncle of the Emperor Nero of the Julio-Claudian dynasty – the very family, as represented by Octavian Augustus, that he had fought to defeat. Through his eldest daughter, he would become ancestor to the long line of kings and co-rulers of the Bosporan Kingdom, the longest-lived Roman client kingdom, as well as to the rulers and royalty of several other Roman client states. Through his daughter by Cleopatra, Antony would become ancestor to the royal family of Mauretania, another Roman client kingdom, while through his sole surviving son Iullus, he would be ancestor to several famous Roman statesmen.
The character of Mark Antony plays a central role in numerous later works, most famously Shakespeare's "Julius Caesar" and "Antony and Cleopatra".
|
https://en.wikipedia.org/wiki?curid=19960
|
Manchester United F.C.
Manchester United Football Club is a professional football club based in Old Trafford, Greater Manchester, England, that competes in the Premier League, the top flight of English football. Nicknamed "the Red Devils", the club was founded as Newton Heath LYR Football Club in 1878, changed its name to Manchester United in 1902 and moved to its current stadium, Old Trafford, in 1910.
Manchester United have won more trophies than any other club in English football, with a record 20 League titles, 12 FA Cups, five League Cups and a record 21 FA Community Shields. United have also won three UEFA Champions Leagues, one UEFA Europa League, one UEFA Cup Winners' Cup, one UEFA Super Cup, one Intercontinental Cup and one FIFA Club World Cup. In 1998–99, the club became the first in the history of English football to achieve the continental European treble. By winning the UEFA Europa League in 2016–17, they became one of five clubs to have won all three main UEFA club competitions.
The 1958 Munich air disaster claimed the lives of eight players. In 1968, under the management of Matt Busby, Manchester United became the first English football club to win the European Cup. Between 1986 and 2013, when he announced his retirement, Alex Ferguson won 38 trophies as manager, including 13 Premier League titles, five FA Cups and two UEFA Champions League titles.
Manchester United was the highest-earning football club in the world for 2016–17, with an annual revenue of €676.3 million, and the world's third most valuable football club in 2019, valued at £3.15 billion ($3.81 billion). As of June 2015, it was the world's most valuable football brand, estimated to be worth $1.2 billion. After being floated on the London Stock Exchange in 1991, the club was purchased by Malcolm Glazer in May 2005 in a deal valuing it at almost £800 million; the company was then taken private again before going public once more in August 2012, when they made an initial public offering on the New York Stock Exchange. Manchester United is one of the most widely supported football clubs in the world, and has rivalries with Liverpool, Manchester City, Arsenal and Leeds United.
Manchester United was formed in 1878 as Newton Heath LYR Football Club by the Carriage and Wagon department of the Lancashire and Yorkshire Railway (LYR) depot at Newton Heath. The team initially played games against other departments and railway companies, but on 20 November 1880, they competed in their first recorded match; wearing the colours of the railway company – green and gold – they were defeated 6–0 by Bolton Wanderers' reserve team. By 1888, the club had become a founding member of The Combination, a regional football league. Following the league's dissolution after only one season, Newton Heath joined the newly formed Football Alliance, which ran for three seasons before being merged with The Football League. This resulted in the club starting the 1892–93 season in the First Division, by which time it had become independent of the railway company and dropped the "LYR" from its name. After two seasons, the club was relegated to the Second Division.
In January 1902, with debts of £2,670, the club was served with a winding-up order. Captain Harry Stafford found four local businessmen, including John Henry Davies (who became club president), each willing to invest £500 in return for a direct interest in running the club; they subsequently changed its name, and on 24 April 1902, Manchester United was officially born. Under Ernest Mangnall, who assumed managerial duties in 1903, the team finished as Second Division runners-up in 1906 and secured promotion to the First Division, which they won in 1908 – the club's first league title. The following season began with victory in the first ever Charity Shield and ended with the club's first FA Cup title. Manchester United won the First Division for the second time in 1911, but at the end of the following season, Mangnall left the club to join Manchester City.
In 1922, three years after the resumption of football following the First World War, the club was relegated to the Second Division, where it remained until regaining promotion in 1925. Relegated again in 1931, Manchester United became a yo-yo club, achieving its all-time lowest position of 20th place in the Second Division in 1934. Following the death of principal benefactor John Henry Davies in October 1927, the club's finances deteriorated to the extent that Manchester United would likely have gone bankrupt had it not been for James W. Gibson, who, in December 1931, invested £2,000 and assumed control of the club. In the 1938–39 season, the last year of football before the Second World War, the club finished 14th in the First Division.
In October 1945, the impending resumption of football led to the managerial appointment of Matt Busby, who demanded an unprecedented level of control over team selection, player transfers and training sessions. Busby led the team to second-place league finishes in 1947, 1948 and 1949, and to FA Cup victory in 1948. In 1952, the club won the First Division, its first league title for 41 years. They then won back-to-back league titles in 1956 and 1957; the squad, who had an average age of 22, were nicknamed "the Busby Babes" by the media, a testament to Busby's faith in his youth players. In 1957, Manchester United became the first English team to compete in the European Cup, despite objections from The Football League, who had denied Chelsea the same opportunity the previous season. En route to the semi-final, which they lost to Real Madrid, the team recorded a 10–0 victory over Belgian champions Anderlecht, which remains the club's biggest victory on record.
The following season, on the way home from a European Cup quarter-final victory against Red Star Belgrade, the aircraft carrying the Manchester United players, officials and journalists crashed while attempting to take off after refuelling in Munich, Germany. The Munich air disaster of 6 February 1958 claimed 23 lives, including those of eight players – Geoff Bent, Roger Byrne, Eddie Colman, Duncan Edwards, Mark Jones, David Pegg, Tommy Taylor and Billy Whelan – and injured several more.
Assistant manager Jimmy Murphy took over as manager while Busby recovered from his injuries and the club's makeshift side reached the FA Cup final, which they lost to Bolton Wanderers. In recognition of the team's tragedy, UEFA invited the club to compete in the 1958–59 European Cup alongside eventual League champions Wolverhampton Wanderers. Despite approval from The Football Association, The Football League determined that the club should not enter the competition, since it had not qualified. Busby rebuilt the team through the 1960s by signing players such as Denis Law and Pat Crerand, who combined with the next generation of youth players – including George Best – to win the FA Cup in 1963. The following season, they finished second in the league, then won the title in 1965 and 1967. In 1968, Manchester United became the first English (and second British) club to win the European Cup, beating Benfica 4–1 in the final with a team that contained three European Footballers of the Year: Bobby Charlton, Denis Law and George Best. They then represented Europe in the 1968 Intercontinental Cup against Estudiantes of Argentina, but lost the tie after losing the first leg in Buenos Aires, before a 1–1 draw at Old Trafford three weeks later. Busby resigned as manager in 1969 before being replaced by the reserve team coach, former Manchester United player Wilf McGuinness.
Following an eighth-place finish in the 1969–70 season and a poor start to the 1970–71 season, Busby was persuaded to temporarily resume managerial duties, and McGuinness returned to his position as reserve team coach. In June 1971, Frank O'Farrell was appointed as manager, but lasted less than 18 months before being replaced by Tommy Docherty in December 1972. Docherty saved Manchester United from relegation that season, only to see them relegated in 1974; by that time the trio of Best, Law, and Charlton had left the club. The team won promotion at the first attempt and reached the FA Cup final in 1976, but were beaten by Southampton. They reached the final again in 1977, beating Liverpool 2–1. Docherty was dismissed shortly afterwards, following the revelation of his affair with the club physiotherapist's wife.
Dave Sexton replaced Docherty as manager in the summer of 1977. Despite major signings, including Joe Jordan, Gordon McQueen, Gary Bailey, and Ray Wilkins, the team failed to achieve any significant results; they finished as runners-up in 1979–80 and lost to Arsenal in the 1979 FA Cup Final. Sexton was dismissed in 1981, even though the team won the last seven games under his direction. He was replaced by Ron Atkinson, who immediately broke the British record transfer fee to sign Bryan Robson from West Bromwich Albion. Under Atkinson, Manchester United won the FA Cup twice in three years – in 1983 and 1985. In 1985–86, after 13 wins and two draws in its first 15 matches, the club was favourite to win the league, but finished in fourth place. The following season, with the club in danger of relegation by November, Atkinson was dismissed.
Alex Ferguson and his assistant Archie Knox arrived from Aberdeen on the day of Atkinson's dismissal, and guided the club to an 11th-place finish in the league. Despite a second-place finish in 1987–88, the club was back in 11th place the following season. Ferguson was reportedly on the verge of dismissal when victory over Crystal Palace in the 1990 FA Cup Final replay (after a 3–3 draw) saved his career. The following season, Manchester United claimed its first Cup Winners' Cup title and competed in the 1991 UEFA Super Cup, beating European Cup holders Red Star Belgrade 1–0 in the final at Old Trafford. A second consecutive League Cup final appearance followed in 1992, in which the team beat Nottingham Forest 1–0 at Wembley. In 1993, the club won its first league title since 1967, and a year later, for the first time since 1957, it won a second consecutive title – alongside the FA Cup – to complete the first "Double" in the club's history. United then became the first English club to do the Double twice when they won both competitions again in 1995–96, before retaining the league title once more in 1996–97 with a game to spare.
In the 1998–99 season, Manchester United became the first team to win the Premier League, FA Cup and UEFA Champions League – "The Treble" – in the same season. Losing 1–0 going into injury time in the 1999 UEFA Champions League Final, Teddy Sheringham and Ole Gunnar Solskjær scored late goals to claim a dramatic victory over Bayern Munich, in what is considered one of the greatest comebacks of all time. The club also won the Intercontinental Cup after beating Palmeiras 1–0 in Tokyo. Ferguson was subsequently knighted for his services to football.
Manchester United won the league again in the 1999–2000 and 2000–01 seasons. The team finished third in 2001–02, before regaining the title in 2002–03. They won the 2003–04 FA Cup, beating Millwall 3–0 in the final at the Millennium Stadium in Cardiff to lift the trophy for a record 11th time. In the 2005–06 season, Manchester United failed to qualify for the knockout phase of the UEFA Champions League for the first time in over a decade, but recovered to secure a second-place league finish and victory over Wigan Athletic in the 2006 Football League Cup Final. The club regained the Premier League in the 2006–07 season, before completing the European double in 2007–08 with a 6–5 penalty shoot-out victory over Chelsea in the 2008 UEFA Champions League Final in Moscow to go with their 17th English league title. Ryan Giggs made a record 759th appearance for the club in that game, overtaking previous record holder Bobby Charlton. In December 2008, the club won the 2008 FIFA Club World Cup and followed this with the 2008–09 Football League Cup, and its third successive Premier League title. That summer, Cristiano Ronaldo was sold to Real Madrid for a world record £80 million. In 2010, Manchester United defeated Aston Villa 2–1 at Wembley to retain the League Cup, its first successful defence of a knockout cup competition.
After finishing as runner-up to Chelsea in the 2009–10 season, United achieved a record 19th league title in 2010–11, securing the championship with a 1–1 away draw against Blackburn Rovers on 14 May 2011. This was extended to 20 league titles in 2012–13, securing the championship with a 3–0 home win against Aston Villa on 22 April 2013.
On 8 May 2013, Ferguson announced that he was to retire as manager at the end of the football season, but would remain at the club as a director and club ambassador. The club announced the next day that Everton manager David Moyes would replace him from 1 July, having signed a six-year contract. Ryan Giggs took over as interim player-manager 10 months later, on 22 April 2014, when Moyes was sacked after a poor season in which the club failed to defend their Premier League title and failed to qualify for the UEFA Champions League for the first time since 1995–96. They also failed to qualify for the Europa League, meaning that it was the first time Manchester United had not qualified for a European competition since 1990. On 19 May 2014, it was confirmed that Louis van Gaal would replace Moyes as Manchester United manager on a three-year deal, with Giggs as his assistant. Malcolm Glazer, the patriarch of the Glazer family that owns the club, died on 28 May 2014.
Although Van Gaal's first season saw United once again qualify for the Champions League through a fourth-place finish in the Premier League, his second season saw United go out of the same tournament in the group stage. United also fell behind in the title race for the third consecutive season, finishing in fifth place, in spite of several expensive signings during Van Gaal's tenure. However, that same season, Manchester United won the FA Cup for a 12th time, their first trophy since 2013. Despite this victory, Van Gaal was sacked as manager just two days later, with José Mourinho appointed in his place on 27 May, signing a three-year contract. That season, United finished in sixth place while winning the EFL Cup for the fifth time and the Europa League for the first time, as well as the FA Community Shield for a record 21st time in Mourinho's first competitive match in charge. Despite not finishing in the top four, United qualified for the Champions League through their Europa League win. Wayne Rooney scored his 250th goal with United, surpassing Sir Bobby Charlton as United's all-time top scorer, before leaving the club at the end of the season to return to Everton. Mourinho was sacked on 18 December 2018 with United in sixth place, 19 points behind league leaders Liverpool and 11 points outside the Champions League places. Ole Gunnar Solskjær, a former United player and manager of Norwegian side Molde, was appointed caretaker manager the next day. On 28 March 2019, following a run of 14 wins in his 19 matches in charge, including knocking Paris Saint-Germain out of the Champions League in the round of 16 after losing the first leg 2–0, Solskjær was appointed permanently on a three-year deal.
The club crest is derived from the Manchester City Council coat of arms, although all that remains of it on the current crest is the ship in full sail. The devil stems from the club's nickname "The Red Devils"; it was included on club programmes and scarves in the 1960s, and incorporated into the club crest in 1970, although the crest was not included on the chest of the shirt until 1971.
Newton Heath's uniform in 1879, four years before the club played its first competitive match, has been documented as 'white with blue cord'. A photograph of the Newton Heath team, taken in 1892, is believed to show the players wearing red-and-white quartered jerseys and navy blue knickerbockers. Between 1894 and 1896, the players wore green and gold jerseys which were replaced in 1896 by white shirts, which were worn with navy blue shorts.
After the name change in 1902, the club colours were changed to red shirts, white shorts, and black socks, which has become the standard Manchester United home kit. Very few changes were made to the kit until 1922, when the club adopted white shirts bearing a deep red "V" around the neck, similar to the shirt worn in the 1909 FA Cup Final; these remained part of the home kit until 1927. For a period in 1934, the cherry-and-white hooped change shirt became the home colours, but the following season the red shirt was recalled after the club's lowest-ever league placing of 20th in the Second Division, and the hooped shirt dropped back to being the change shirt. The black socks were changed to white from 1959 to 1965, when they were replaced with red socks, worn until 1971 (with white used on occasion), when the club reverted to black. Black shorts and white socks are sometimes worn with the home strip, most often in away games, if there is a clash with the opponent's kit. For 2018–19, black shorts and red socks became the primary choice for the home kit. Since 1997–98, white socks have been the preferred choice for European games, which are typically played on weeknights, to aid with player visibility. The current home kit is a red shirt with the trademark Adidas three stripes in red on the shoulders, white shorts, and black socks.
The Manchester United away strip has often been a white shirt, black shorts and white socks, but there have been several exceptions. These include an all-black strip with blue and gold trimmings between 1993 and 1995, the navy blue shirt with silver horizontal pinstripes worn during the 1999–2000 season, and the 2011–12 away kit, which had a royal blue body and sleeves with hoops made of small midnight navy blue and black stripes, with black shorts and blue socks. An all-grey away kit worn during the 1995–96 season was dropped after just five games: the players claimed to have trouble picking out their teammates against the crowd, and United failed to win a competitive game while wearing it. In its final outing against Southampton, Alex Ferguson instructed the team to change into the third kit during half-time. In 2001, to celebrate 100 years as "Manchester United", a reversible white-and-gold away kit was released, although the actual match-day shirts were not reversible.
The club's third kit is often all-blue; this was most recently the case during the 2014–15 season. Exceptions include a green-and-gold halved shirt worn between 1992 and 1994, a blue-and-white striped shirt worn during the 1994–95 and 1995–96 seasons and once in 1996–97, an all-black kit worn during the Treble-winning 1998–99 season, and a white shirt with black-and-red horizontal pinstripes worn between 2003–04 and 2005–06. From 2006–07 to 2013–14, the third kit was the previous season's away kit, albeit updated with the new club sponsor in 2006–07 and 2010–11, apart from 2008–09 when an all-blue kit was launched to mark the 40th anniversary of the 1967–68 European Cup success.
Newton Heath initially played on a field on North Road, close to the railway yard; the original capacity was about 12,000, but club officials deemed the facilities inadequate for a club hoping to join The Football League. Some expansion took place in 1887, and in 1891, Newton Heath used its minimal financial reserves to purchase two grandstands, each able to hold 1,000 spectators. Although attendances were not recorded for many of the earliest matches at North Road, the highest documented attendance was approximately 15,000 for a First Division match against Sunderland on 4 March 1893. A similar attendance was also recorded for a friendly match against Gorton Villa on 5 September 1889.
In June 1893, after the club was evicted from North Road by its owners, Manchester Deans and Canons, who felt it was inappropriate for the club to charge an entry fee to the ground, secretary A. H. Albut procured the use of the Bank Street ground in Clayton. It initially had no stands, but by the start of the 1893–94 season two had been built: one spanning the full length of the pitch on one side and the other behind the goal at the "Bradford end". At the opposite end, the "Clayton end", the ground had been "built up, thousands thus being provided for". Newton Heath's first league match at Bank Street was played against Burnley on 1 September 1893, when 10,000 people saw Alf Farman score a hat-trick, Newton Heath's only goals in a 3–2 win. The remaining stands were completed for the following league game against Nottingham Forest three weeks later. In October 1895, before the visit of Manchester City, the club purchased a 2,000-capacity stand from the Broughton Rangers rugby league club, and put up another stand on the "reserved side" (as distinct from the "popular side"). However, weather restricted the attendance for the Manchester City match to just 12,000.
When the Bank Street ground was temporarily closed by bailiffs in 1902, club captain Harry Stafford raised enough money to pay for the club's next away game at Bristol City and found a temporary ground at Harpurhey for the next reserves game against Padiham. Following financial investment, new club president John Henry Davies paid £500 for the erection of a new 1,000-seat stand at Bank Street. Within four years, the stadium had cover on all four sides, as well as the ability to hold approximately 50,000 spectators, some of whom could watch from the viewing gallery atop the Main Stand.
Following Manchester United's first league title in 1908 and the FA Cup a year later, it was decided that Bank Street was too restrictive for Davies' ambition; in February 1909, six weeks before the club's first FA Cup title, Old Trafford was named as the home of Manchester United, following the purchase of land for around £60,000. Architect Archibald Leitch was given a budget of £30,000 for construction; original plans called for seating capacity of 100,000, though budget constraints forced a revision to 77,000. The building was constructed by Messrs Brameld and Smith of Manchester. The stadium's record attendance was registered on 25 March 1939, when an FA Cup semi-final between Wolverhampton Wanderers and Grimsby Town drew 76,962 spectators.
Bombing in the Second World War destroyed much of the stadium; the central tunnel in the South Stand was all that remained of that quarter. After the war, the club received compensation from the War Damage Commission in the amount of £22,278. While reconstruction took place, the team played its "home" games at Manchester City's Maine Road ground; Manchester United was charged £5,000 per year, plus a nominal percentage of gate receipts. Later improvements included the addition of roofs, first to the Stretford End and then to the North and East Stands. The roofs were supported by pillars that obstructed many fans' views, and they were eventually replaced with a cantilevered structure. The Stretford End was the last stand to receive a cantilevered roof, completed in time for the 1993–94 season. Four pylons, each housing 54 individual floodlights, were erected at a cost of £40,000 and first used on 25 March 1957. These were dismantled in 1987 and replaced by a lighting system embedded in the roof of each stand, which remains in use today.
The Taylor Report's requirement for an all-seater stadium lowered capacity at Old Trafford to around 44,000 by 1993. In 1995, the North Stand was redeveloped into three tiers, restoring capacity to approximately 55,000. At the end of the 1998–99 season, second tiers were added to the East and West Stands, raising capacity to around 67,000, and between July 2005 and May 2006, 8,000 more seats were added via second tiers in the north-west and north-east quadrants. Part of the new seating was used for the first time on 26 March 2006, when an attendance of 69,070 became a new Premier League record. The record was pushed steadily upwards before reaching its peak on 31 March 2007, when 76,098 spectators saw Manchester United beat Blackburn Rovers 4–1, with just 114 seats (0.15 per cent of the total capacity of 76,212) unoccupied. In 2009, reorganisation of the seating resulted in a reduction of capacity by 255 to 75,957. Manchester United has the second-highest average attendance of any European football club, behind only Borussia Dortmund.
Manchester United is one of the most popular football clubs in the world, with one of the highest average home attendances in Europe. The club states that its worldwide fan base includes more than 200 officially recognised branches of the Manchester United Supporters Club (MUSC), in at least 24 countries. The club takes advantage of this support through its worldwide summer tours. Accountancy firm and sports industry consultants Deloitte estimate that Manchester United has 75 million fans worldwide. The club has the third highest social media following in the world among sports teams (after Barcelona and Real Madrid), with over 73 million Facebook fans as of December 2019. A 2014 study showed that Manchester United had the loudest fans in the Premier League.
Supporters are represented by two independent bodies; the Independent Manchester United Supporters' Association (IMUSA), which maintains close links to the club through the MUFC Fans Forum, and the Manchester United Supporters' Trust (MUST). After the Glazer family's takeover in 2005, a group of fans formed a splinter club, F.C. United of Manchester. The West Stand of Old Trafford – the "Stretford End" – is the home end and the traditional source of the club's most vocal support.
Manchester United has rivalries with Arsenal, Leeds United, Liverpool, and Manchester City, against whom they contest the Manchester derby.
The rivalry with Liverpool is rooted in competition between the cities during the Industrial Revolution, when Manchester was famous for its textile industry while Liverpool was a major port. The two clubs are the most successful English teams in both domestic and international competitions; between them they have won 39 league titles, 9 European Cups, 4 UEFA Cups, 5 UEFA Super Cups, 19 FA Cups, 13 League Cups, 2 FIFA Club World Cups, 1 Intercontinental Cup and 36 FA Community Shields. The fixture is considered one of the biggest rivalries in world football and the most famous in English football.
The "Roses Rivalry" with Leeds stems from the Wars of the Roses, fought between the House of Lancaster and the House of York, with Manchester United representing Lancashire and Leeds representing Yorkshire.
The rivalry with Arsenal arises from the numerous times the two teams, as well as managers Alex Ferguson and Arsène Wenger, have battled for the Premier League title. With 33 titles between them (20 for Manchester United, 13 for Arsenal) this fixture has become known as one of the finest Premier League match-ups in history.
Manchester United has been described as a global brand; a 2011 report by Brand Finance valued the club's trademarks and associated intellectual property at £412 million – an increase of £39 million on the previous year, and £11 million more than the second-best brand, Real Madrid – and gave the brand a strength rating of AAA (Extremely Strong). In July 2012, Manchester United was ranked first by "Forbes" magazine in its list of the ten most valuable sports team brands, valuing the Manchester United brand at $2.23 billion. The club is ranked third in the Deloitte Football Money League (behind Real Madrid and Barcelona). In January 2013, the club became the first sports team in the world to be valued at $3 billion. "Forbes" magazine valued the club at $3.3 billion – $1.2 billion higher than the next most valuable sports team. They were overtaken by Real Madrid for the next four years, but Manchester United returned to the top of the "Forbes" list in June 2017, with a valuation of $3.689 billion.
The core strength of Manchester United's global brand is often attributed to Matt Busby's rebuilding of the team and subsequent success following the Munich air disaster, which drew worldwide acclaim. The "iconic" team included Bobby Charlton and Nobby Stiles (members of England's World Cup winning team), Denis Law and George Best. The attacking style of play adopted by this team (in contrast to the defensive-minded "catenaccio" approach favoured by the leading Italian teams of the era) "captured the imagination of the English footballing public". Busby's team also became associated with the liberalisation of Western society during the 1960s; George Best, known as the "Fifth Beatle" for his iconic haircut, was the first footballer to significantly develop an off-the-field media profile.
As the second English football club to float on the London Stock Exchange in 1991, the club raised significant capital, with which it further developed its commercial strategy. The club's focus on commercial and sporting success brought significant profits in an industry often characterised by chronic losses. The strength of the Manchester United brand was bolstered by intense off-the-field media attention to individual players, most notably David Beckham (who quickly developed his own global brand). This attention often generates greater interest in on-the-field activities, and hence generates sponsorship opportunities – the value of which is driven by television exposure. During his time with the club, Beckham's popularity across Asia was integral to the club's commercial success in that part of the world.
Because higher league placement results in a greater share of television rights, success on the field generates greater income for the club. Since the inception of the Premier League, Manchester United has received the largest share of the revenue generated from the BSkyB broadcasting deal. Manchester United has also consistently enjoyed the highest commercial income of any English club; in 2005–06, the club's commercial arm generated £51 million, compared to £42.5 million at Chelsea, £39.3 million at Liverpool, £34 million at Arsenal and £27.9 million at Newcastle United. A key sponsorship relationship was with sportswear company Nike, who managed the club's merchandising operation as part of a £303 million 13-year partnership between 2002 and 2015. Through Manchester United Finance and the club's membership scheme, One United, those with an affinity for the club can purchase a range of branded goods and services. Additionally, Manchester United-branded media services – such as the club's dedicated television channel, MUTV – have allowed the club to expand its fan base to those beyond the reach of its Old Trafford stadium.
In an initial five-year deal worth £500,000, Sharp Electronics became the club's first shirt sponsor at the beginning of the 1982–83 season, a relationship that lasted until the end of the 1999–2000 season, when Vodafone agreed a four-year, £30 million deal. Vodafone agreed to pay £36 million to extend the deal by four years, but after two seasons triggered a break clause in order to concentrate on its sponsorship of the Champions League.
To commence at the start of the 2006–07 season, American insurance corporation AIG agreed a four-year £56.5 million deal which in September 2006 became the most valuable in the world. At the beginning of the 2010–11 season, American reinsurance company Aon became the club's principal sponsor in a four-year deal reputed to be worth approximately £80 million, making it the most lucrative shirt sponsorship deal in football history. Manchester United announced their first training kit sponsor in August 2011, agreeing a four-year deal with DHL reported to be worth £40 million; it is believed to be the first instance of training kit sponsorship in English football. The DHL contract lasted for over a year before the club bought back the contract in October 2012, although they remained the club's official logistics partner. The contract for the training kit sponsorship was then sold to Aon in April 2013 for a deal worth £180 million over eight years, which also included purchasing the naming rights for the Trafford Training Centre.
The club's first kit manufacturer was Umbro, until a five-year deal was agreed with Admiral Sportswear in 1975. Adidas received the contract in 1980, before Umbro started a second spell in 1992. Umbro's sponsorship lasted for ten years, followed by Nike's record-breaking £302.9 million deal that lasted until 2015; 3.8 million replica shirts were sold in the first 22 months with the company. In addition to Nike and Chevrolet, the club also has several lower-level "platinum" sponsors, including Aon and Budweiser.
On 30 July 2012, United signed a seven-year deal with American automotive corporation General Motors, which replaced Aon as the shirt sponsor from the 2014–15 season. The new $80m-a-year shirt deal is worth $559m over seven years and features the logo of General Motors brand Chevrolet. Nike announced that they would not renew their kit supply deal with Manchester United after the 2014–15 season, citing rising costs. Since the start of the 2015–16 season, Adidas has manufactured Manchester United's kit as part of a world-record 10-year deal worth a minimum of £750 million. Plumbing products manufacturer Kohler became the club's first sleeve sponsor ahead of the 2018–19 season.
Originally funded by the Lancashire and Yorkshire Railway Company, the club became a limited company in 1892 and sold shares to local supporters for £1 via an application form. In 1902, majority ownership passed to the four local businessmen who invested £500 to save the club from bankruptcy, including future club president John Henry Davies. After his death in 1927, the club faced bankruptcy yet again, but was saved in December 1931 by James W. Gibson, who assumed control of the club after an investment of £2,000. Gibson promoted his son, Alan, to the board in 1948, but died three years later; the Gibson family retained ownership of the club through James' wife, Lillian, but the position of chairman passed to former player Harold Hardman.
Promoted to the board a few days after the Munich air disaster, Louis Edwards, a friend of Matt Busby, began acquiring shares in the club; for an investment of approximately £40,000, he accumulated a 54 per cent shareholding and took control in January 1964. When Lillian Gibson died in January 1971, her shares passed to Alan Gibson who sold a percentage of his shares to Louis Edwards' son, Martin, in 1978; Martin Edwards went on to become chairman upon his father's death in 1980. Media tycoon Robert Maxwell attempted to buy the club in 1984, but did not meet Edwards' asking price. In 1989, chairman Martin Edwards attempted to sell the club to Michael Knighton for £20 million, but the sale fell through and Knighton joined the board of directors instead.
Manchester United was floated on the stock market in June 1991 (raising £6.7 million), and received yet another takeover bid in 1998, this time from Rupert Murdoch's British Sky Broadcasting Corporation. This resulted in the formation of "Shareholders United Against Murdoch" – now the "Manchester United Supporters' Trust" – who encouraged supporters to buy shares in the club in an attempt to block any hostile takeover. The Manchester United board accepted a £623 million offer, but the takeover was blocked by the Monopolies and Mergers Commission at the final hurdle in April 1999. A few years later, a power struggle emerged between the club's manager, Alex Ferguson, and his horse-racing partners, John Magnier and J. P. McManus, who had gradually become the majority shareholders. In a dispute that stemmed from contested ownership of the horse Rock of Gibraltar, Magnier and McManus attempted to have Ferguson removed from his position as manager, and the board responded by approaching investors to attempt to reduce the Irishmen's majority.
In May 2005, Malcolm Glazer purchased the 28.7 per cent stake held by McManus and Magnier, thus acquiring a controlling interest through his investment vehicle Red Football Ltd in a highly leveraged takeover valuing the club at approximately £800 million (then approx. $1.5 billion). Once the purchase was complete, the club was taken off the stock exchange. In July 2006, the club announced a £660 million debt refinancing package, resulting in a 30 per cent reduction in annual interest payments to £62 million a year. In January 2010, with debts of £716.5 million ($1.17 billion), Manchester United further refinanced through a bond issue worth £504 million, enabling them to pay off most of the £509 million owed to international banks. The annual interest payable on the bonds – which were to mature on 1 February 2017 – was approximately £45 million per annum. Despite restructuring, the club's debt prompted protests from fans on 23 January 2010, at Old Trafford and the club's Trafford Training Centre. Supporter groups encouraged match-going fans to wear green and gold, the colours of Newton Heath. On 30 January, reports emerged that the Manchester United Supporters' Trust had held meetings with a group of wealthy fans, dubbed the "Red Knights", with plans to buy out the Glazers' controlling interest.
In August 2011, the Glazers were believed to have approached Credit Suisse in preparation for a $1 billion (approx. £600 million) initial public offering (IPO) on the Singapore stock exchange that would value the club at more than £2 billion. However, in July 2012, the club announced plans to list its IPO on the New York Stock Exchange instead. Shares were originally set to go on sale for between $16 and $20 each, but the price was cut to $14 by the launch of the IPO on 10 August, following negative comments from Wall Street analysts and Facebook's disappointing stock market debut in May. Even after the cut, Manchester United was valued at $2.3 billion, making it the most valuable football club in the world.
Manchester United are one of the most successful clubs in Europe in terms of trophies won. The club's first trophy was the Manchester Cup, which they won as Newton Heath LYR in 1886. In 1908, the club won their first league title, and won the FA Cup for the first time the following year. Since then, they have gone on to win a record 20 top-division titles – including a record 13 Premier League titles – and their total of 12 FA Cups is second only to Arsenal (13). Those titles have meant they have also appeared a record 30 times in the FA Community Shield (formerly the FA Charity Shield), which is played at the start of each season between the winners of the league and FA Cup from the previous season; of those 30 appearances, Manchester United have won 21, including four times when the match was drawn and the trophy shared by the two clubs.
The club had a successful period under the management of Matt Busby, starting with the FA Cup in 1948 and culminating with becoming the first English club to win the European Cup in 1968, winning five league titles in the intervening years. However, the club's most successful decade came in the 1990s under Alex Ferguson: five league titles, four FA Cups, one League Cup, five Charity Shields (one shared), one UEFA Champions League, one UEFA Cup Winners' Cup, one UEFA Super Cup and one Intercontinental Cup. They also won the Double (winning the Premier League and FA Cup in the same season) three times during the 1990s; before their first in 1993–94, it had been done only five times in English football, so when they won a second in 1995–96 – the first club to do it twice – it was referred to as the "Double Double". When they won the European Cup (now the UEFA Champions League) for a second time in 1999, along with the Premier League and the FA Cup, they became the first English club to win the Treble. That Champions League title gave them entry to the now-defunct Intercontinental Cup, which they also won, making them the only British team to do so. Another Champions League title in 2008 meant they qualified for the 2008 FIFA Club World Cup, which they also won; until 2019, they were the only British team to win that competition.
The club's most recent trophy was the UEFA Europa League, which they won in 2016–17. In winning that title, United became the fifth club to have won the "European Treble" of European Cup/UEFA Champions League, Cup Winners' Cup, and UEFA Cup/Europa League after Juventus, Ajax, Bayern Munich and Chelsea.
Short competitions such as the Charity/Community Shield, the Intercontinental Cup (now defunct), the FIFA Club World Cup and the UEFA Super Cup are not generally considered to contribute towards a Double or Treble.
A team called Manchester United Supporters Club Ladies began operations in the late 1970s and was unofficially recognised as the club's senior women's team. They became founding members of the North West Women's Regional Football League in 1989. The team made an official partnership with Manchester United in 2001, becoming the club's official women's team; however, in 2005, following Malcolm Glazer's takeover, the team was disbanded as it was seen to be "unprofitable". In 2018, Manchester United formed a new women's football team, which entered the second division of women's football in England for their debut season.
|
https://en.wikipedia.org/wiki?curid=19961
|
Mesa (programming language)
Mesa is a programming language developed in the late 1970s at the Xerox Palo Alto Research Center in Palo Alto, California, United States. The language name was a pun on the programming-language catchphrases of the time: a mesa is an elevated plateau, and Mesa was a "high-level" programming language.
Mesa is an ALGOL-like language with strong support for modular programming. Every library module has at least two source files: a "definitions" file specifying the library's interface plus one or more "program" files specifying the implementation of the procedures in the interface. To use a library, a program or higher-level library must "import" the definitions. The Mesa compiler type-checks all uses of imported entities; this combination of separate compilation with type-checking was unusual at the time.
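As a rough illustration of this split, the following is a minimal, schematic sketch of a Mesa-style library; the identifiers (StackDefs, StackImpl, Push, Pop) are invented for illustration, and the module syntax is simplified rather than taken verbatim from the Mesa manual:

```mesa
-- StackDefs: the "definitions" file declaring the library's interface.
StackDefs: DEFINITIONS =
  BEGIN
  Push: PROCEDURE [v: INTEGER];
  Pop: PROCEDURE RETURNS [INTEGER];
  END.

-- StackImpl: a "program" file implementing the procedures in the interface.
DIRECTORY StackDefs;
StackImpl: PROGRAM EXPORTS StackDefs =
  BEGIN
  items: ARRAY [0..100) OF INTEGER;
  top: CARDINAL ← 0;
  Push: PUBLIC PROCEDURE [v: INTEGER] =
    BEGIN items[top] ← v; top ← top + 1; END;
  Pop: PUBLIC PROCEDURE RETURNS [INTEGER] =
    BEGIN top ← top - 1; RETURN[items[top]]; END;
  END.
```

A client module would name StackDefs in its own DIRECTORY clause and import it, after which the compiler type-checks every call against the declared interface; the ← assignment operator reflects the PARC character set discussed below.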
Mesa introduced several other innovations in language design and implementation, notably in the handling of software exceptions, thread synchronization, and incremental compilation.
Mesa was developed on the Xerox Alto, one of the first personal computers with a graphical user interface; however, most of the Alto's system software was written in BCPL. Mesa was the system programming language of the later Xerox Star workstations and of the GlobalView desktop environment. Xerox PARC later developed Cedar, a superset of Mesa.
Mesa and Cedar had a major influence on the design of other important languages, such as Modula-2 and Java, and were an important vehicle for the development and dissemination of the fundamentals of GUIs, networked environments, and the other advances Xerox contributed to the field of computer science.
Mesa was originally designed in the Computer Systems Laboratory (CSL), a branch of the Xerox Palo Alto Research Center, for the Alto, an experimental micro-coded workstation. Initially, its spread was confined to PARC and a few universities to which Xerox had donated some Altos.
Mesa was later adopted as the systems programming language for Xerox's commercial workstations such as the Xerox 8010 (Xerox Star, Dandelion) and Xerox 6085 (Daybreak), in particular for the Pilot operating system.
A secondary development environment, called the Xerox Development Environment (XDE), allowed developers to debug both the Pilot operating system and ViewPoint GUI applications using a world-swap mechanism. This allowed the entire "state" of the world to be swapped out, and let low-level system crashes that paralyzed the whole system be debugged. This technique did not scale very well to large application images (several megabytes), and so the Pilot/Mesa world in later releases moved away from the world-swap view when the micro-coded machines were phased out in favor of SPARC workstations and Intel PCs running a Mesa PrincOps emulator for the basic hardware instruction set.
Mesa was compiled into a stack-machine language, purportedly with the highest code density ever achieved (roughly 4 bytes per high-level language statement). This was touted in a 1981 paper in which implementors from the Xerox Systems Development Department (then the development arm of PARC) described how they tuned up the instruction set and measured the resulting code density.
Mesa was taught via the Mesa Programming Course, which took people through the wide range of technology Xerox had available at the time and ended with the programmer writing a "hack", a workable program designed to be useful. An actual example of such a hack is the BWSMagnifier, which was written in 1988 and allowed people to magnify sections of the workstation screen as defined by a resizable window and a changeable magnification factor. Trained Mesa programmers from Xerox were well versed in the fundamentals of GUIs, networking, exceptions, and multi-threaded programming, almost a decade before they became standard tools of the trade.
Within Xerox, Mesa was eventually superseded by the Cedar programming language. Many Mesa programmers and developers left Xerox in 1985; some of them went to DEC Systems Research Center where they used their experience with Mesa in the design of Modula-2+, and later of Modula-3.
Mesa was a strongly typed programming language with type-checking across module boundaries, but with enough flexibility in its type system that heap allocators could be written in Mesa.
Because of its strict separation between interface and implementation, Mesa allowed true incremental compilation and encouraged architecture- and platform-independent programming. The separate interface modules also simplified source-level debugging, including remote debugging via the Ethernet.
Mesa had rich exception handling facilities, with four types of exceptions. It had support for thread synchronization via monitors. Mesa was the first language to implement monitor BROADCAST, a concept introduced by the Pilot operating system.
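As a hedged sketch of what the monitor facility looked like (in the spirit of the Mesa monitors described by Lampson and Redell; the Buffer example and all identifiers are invented, and details are simplified):

```mesa
-- A schematic Mesa-style monitor guarding a one-slot buffer.
-- ENTRY procedures acquire the monitor lock; WAIT releases it while blocked.
Buffer: MONITOR =
  BEGIN
  nonEmpty: CONDITION;
  full: BOOLEAN ← FALSE;
  item: INTEGER;

  Put: ENTRY PROCEDURE [v: INTEGER] =
    BEGIN
    item ← v; full ← TRUE;
    BROADCAST nonEmpty;     -- wake every waiting process (cf. NOTIFY, which wakes one)
    END;

  Get: ENTRY PROCEDURE RETURNS [INTEGER] =
    BEGIN
    WHILE ~full DO WAIT nonEmpty ENDLOOP;   -- re-test the condition on every wakeup
    full ← FALSE;
    RETURN[item];
    END;
  END.
```

The requirement that a waiter re-test its condition in a loop, rather than assume it holds on wakeup, is the "Mesa semantics" of monitors that most modern thread libraries still follow.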
Mesa has an "imperative" and "algebraic" syntax, based on ALGOL and Pascal rather than on BCPL or C; for instance, compound commands are indicated by the and keywords rather than braces. In Mesa, all keywords are written in uppercase.
Due to a peculiarity of the ASCII variant used at PARC, the Alto's character set included a left-pointing arrow (←) rather than an underscore. The result of this is that Alto programmers (including those using Mesa, Smalltalk etc.) conventionally used CamelCase for compound identifiers, a practice which was incorporated in PARC's standard programming style. On the other hand, the availability of the left-pointing arrow allowed them to use it for the assignment operator, as it originally had been in ALGOL.
When the Mesa designers wanted to implement an exception facility, they hired a recent M.Sc. graduate from Colorado who had written his thesis on exception handling facilities in algorithmic languages. This led to the richest exception facility for its time, with primitives SIGNAL, ERROR, ABORT, RETRY, CONTINUE, and RESUME. Because the language did not have type-safe checks to verify full coverage for signal handling, uncaught exceptions were a common cause of bugs in released software.
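A hedged sketch of how a signal might be declared, raised, and caught (invented identifiers; the catch-phrase notation, a handler attached after "!" inside the call brackets, is simplified):

```mesa
Overflow: SIGNAL = CODE;                    -- declare a signal

Push: PROCEDURE [v: INTEGER] =
  BEGIN
  IF top = maxSize THEN SIGNAL Overflow;    -- raise it when the stack is full
  items[top] ← v; top ← top + 1;
  END;

TryPush: PROCEDURE [v: INTEGER] =
  BEGIN
  -- The catch phrase after "!" handles Overflow at this call site;
  -- CONTINUE abandons the statement that raised the signal and resumes
  -- with the statement following it.
  Push[v ! Overflow => CONTINUE];
  END;
```

Because nothing forced a caller to supply such a catch phrase, a signal with no handler anywhere on the call stack went uncaught, which is the failure mode described above.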
Mesa was the precursor to the programming language Cedar. Cedar's main additions were garbage collection, dynamic types, better string support through ropes, a limited form of type parameterization, and special syntax for identifying the type-safe parts of multi-module software packages, to ensure deterministic execution and prevent memory leaks.
|
https://en.wikipedia.org/wiki?curid=19962
|
Morphogenesis
Morphogenesis (from the Greek "morphê" shape and "genesis" creation, literally, "beginning of the shape") is the biological process that causes an organism to develop its shape. It is one of three fundamental aspects of developmental biology along with the control of cell growth and cellular differentiation, unified in evolutionary developmental biology (evo-devo).
The process controls the organized spatial distribution of cells during the embryonic development of an organism. Morphogenesis can also take place in a mature organism, in cell culture, or inside tumor cell masses. The term additionally describes the development of unicellular life forms that do not have an embryonic stage in their life cycle, as well as the evolution of a body structure within a taxonomic group.
Morphogenetic responses may be induced in organisms by hormones, by environmental chemicals ranging from substances produced by other organisms and plants to toxic chemicals or radionuclides released as pollutants, or by mechanical stresses induced by spatial patterning of the cells.
Some of the earliest ideas and mathematical descriptions on how physical processes and constraints affect biological growth, and hence natural patterns such as the spirals of phyllotaxis, were written by D'Arcy Wentworth Thompson in his 1917 book "On Growth and Form" and Alan Turing in his "The Chemical Basis of Morphogenesis" (1952). Where Thompson explained animal body shapes as being created by varying rates of growth in different directions, for instance to create the spiral shell of a snail, Turing correctly predicted a mechanism of morphogenesis, the diffusion of two different chemical signals, one activating and one deactivating growth, to set up patterns of development, decades before the formation of such patterns was observed. The fuller understanding of the mechanisms involved in actual organisms required the discovery of the structure of DNA in 1953, and the development of molecular biology and biochemistry.
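In modern notation, the kind of two-chemical mechanism Turing analysed is usually written as a pair of coupled reaction-diffusion equations; this is a generic textbook form rather than Turing's original notation, with u the activating and v the inhibiting signal:

```latex
\frac{\partial u}{\partial t} = D_u \nabla^2 u + f(u,v), \qquad
\frac{\partial v}{\partial t} = D_v \nabla^2 v + g(u,v)
```

Stable spatial patterns can emerge from a nearly uniform starting state when the inhibitor diffuses much faster than the activator (D_v ≫ D_u), so that short-range activation is balanced by long-range inhibition.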
Several types of molecules are important in morphogenesis. Morphogens are soluble molecules that can diffuse and carry signals that control cell differentiation via concentration gradients. Morphogens typically act through binding to specific protein receptors. An important class of molecules involved in morphogenesis are transcription factor proteins that determine the fate of cells by interacting with DNA. These can be coded for by master regulatory genes, and either activate or deactivate the transcription of other genes; in turn, these secondary gene products can regulate the expression of still other genes in a regulatory cascade of gene regulatory networks. At the end of this cascade are classes of molecules that control cellular behaviors such as cell migration, or, more generally, their properties, such as cell adhesion or cell contractility. For example, during gastrulation, clumps of stem cells switch off their cell-to-cell adhesion, become migratory, and take up new positions within an embryo where they again activate specific cell adhesion proteins and form new tissues and organs. Developmental signaling pathways implicated in morphogenesis include Wnt, Hedgehog, and ephrins.
At a tissue level, ignoring the means of control, morphogenesis arises because of cellular proliferation and motility. Morphogenesis also involves changes in the cellular structure or in how cells interact in tissues. These changes can result in tissue elongation, thinning, folding, invasion or separation of one tissue into distinct layers. The latter case is often referred to as cell sorting. Cell "sorting out" consists of cells moving so as to sort into clusters that maximize contact between cells of the same type. The ability of cells to do this has been proposed by Malcolm Steinberg to arise from differential cell adhesion, through his differential adhesion hypothesis. Tissue separation can also occur via more dramatic cellular differentiation events during which epithelial cells become mesenchymal (see Epithelial–mesenchymal transition). Mesenchymal cells typically leave the epithelial tissue as a consequence of changes in cell adhesive and contractile properties. Following epithelial–mesenchymal transition, cells can migrate away from an epithelium and then associate with other similar cells in a new location. In plants, cellular morphogenesis is tightly linked to the chemical composition and the mechanical properties of the cell wall.
During embryonic development, cells are restricted to different layers due to differential affinities. One of the ways this can occur is when cells share the same cell-to-cell adhesion molecules. For instance, homotypic cell adhesion can maintain boundaries between groups of cells that have different adhesion molecules. Furthermore, cells can sort based upon differences in adhesion between the cells, so even two populations of cells with different levels of the same adhesion molecule can sort out. In cell culture, cells that have the strongest adhesion move to the center of a mixed aggregate of cells. Moreover, cell–cell adhesion is often modulated by cell contractility, which can exert forces on the cell–cell contacts so that two cell populations with equal levels of the same adhesion molecule can sort out. The molecules responsible for adhesion are called cell adhesion molecules (CAMs). Several types of cell adhesion molecules are known and one major class of these molecules is the cadherins. There are dozens of different cadherins that are expressed on different cell types. Cadherins bind to other cadherins in a like-to-like manner: E-cadherin (found on many epithelial cells) binds preferentially to other E-cadherin molecules. Mesenchymal cells usually express other cadherin types such as N-cadherin.
The extracellular matrix (ECM) is involved in keeping tissues separated, providing structural support or providing a structure for cells to migrate on. Collagen, laminin, and fibronectin are major ECM molecules that are secreted and assembled into sheets, fibers, and gels. Multisubunit transmembrane receptors called integrins are used to bind to the ECM. Integrins bind extracellularly to fibronectin, laminin, or other ECM components, and intracellularly to microfilament-binding proteins α-actinin and talin to link the cytoskeleton with the outside. Integrins also serve as receptors to trigger signal transduction cascades when binding to the ECM. A well-studied example of morphogenesis that involves ECM is mammary gland ductal branching.
Tissues can change their shape and separate into distinct layers via cell contractility. Just as in muscle cells, myosin can contract different parts of the cytoplasm to change its shape or structure. Myosin-driven contractility in embryonic tissue morphogenesis is seen during the separation of germ layers in the model organisms "Caenorhabditis elegans", "Drosophila" and zebrafish. There are often periodic pulses of contraction in embryonic morphogenesis. A model called the cell state splitter involves alternating cell contraction and expansion, initiated by a bistable organelle at the apical end of each cell. The organelle consists of microtubules and microfilaments in mechanical opposition. It responds to local mechanical perturbations caused by morphogenetic movements. These then trigger traveling embryonic differentiation waves of contraction or expansion over presumptive tissues that determine cell type and are followed by cell differentiation. The cell state splitter was first proposed to explain neural plate morphogenesis during gastrulation of the axolotl, and the model was later generalized to all of morphogenesis.
Cancer can result from disruption of normal morphogenesis, including both tumor formation and tumor metastasis. Mitochondrial dysfunction can result in increased cancer risk due to disturbed morphogen signaling.
|
https://en.wikipedia.org/wiki?curid=19965
|
Muhammad ibn Abd al-Wahhab
Muhammad ibn Abd al-Wahhab (; ; 1703 – 22 June 1792) was a religious leader, Islamic scholar and theologian from Najd in central Arabia, founder of the Islamic doctrine and movement known as Wahhabism. Born to a family of jurists, Ibn 'Abd al-Wahhab's early education consisted of learning a fairly standard curriculum of orthodox jurisprudence according to the Hanbali school of law, which was the school of law most prevalent in his area of birth. Despite his initial rudimentary training in classical Sunni Muslim tradition, Ibn 'Abd al-Wahhab gradually became opposed to many of the most popular Sunni practices such as the visitation to and the veneration of the tombs of saints, which he felt amounted to heretical religious innovation or even idolatry. Despite his teachings being rejected and opposed by many of the most notable Sunni Muslim scholars of the period, including his own father and brother, Ibn 'Abd al-Wahhab charted a religio-political pact with Muhammad bin Saud to help him to establish the Emirate of Diriyah, the first Saudi state, and began a dynastic alliance and power-sharing arrangement between their families which continues to the present day in the Kingdom of Saudi Arabia. The Al ash-Sheikh, Saudi Arabia's leading religious family, are the descendants of Ibn ʿAbd al-Wahhab, and have historically led the "ulama" in the Saudi state, dominating the state's clerical institutions.
Ibn 'Abd al-Wahhab is generally acknowledged to have been born in 1703 into the sedentary and impoverished Arab clan of Banu Tamim in 'Uyayna, a village in the Najd region of central Arabia. Before the emergence of Wahhabism there was a very limited history of Islamic education in the area. For this reason, Ibn 'Abd al-Wahhab had modest access to Islamic education during his youth. Despite this, the area had nevertheless produced several notable jurists of the Hanbali school of orthodox Sunni jurisprudence, which was the school of law most prominently practiced in the area. In fact, Ibn 'Abd-al-Wahhab's own family "had produced several doctors of the school," with his father, Sulaymān b. Muḥammad, having been the Hanbali jurisconsult of the Najd and his grandfather, ʿAbd al-Wahhāb, having been a judge of Hanbali law.
Ibn 'Abd-al-Wahhab's early education consisted of learning the Quran by heart and studying a rudimentary level of Hanbali jurisprudence and theology as outlined in the works of Ibn Qudamah (d. 1223), one of the most influential medieval representatives of the Hanbali school whose works were regarded "as having great authority" in the Najd. As the veneration of saints and the belief in their ability to perform miracles by the grace of God had become one of the most omnipresent and established aspects of Sunni Muslim practice throughout the Islamic world, being an agreed-upon tenet of the faith by the vast majority of the classical scholars, it was not long before Ibn 'Abd-al-Wahhab began to encounter the omnipresence of saint-veneration in his area as well; and he probably chose to leave Najd and look elsewhere for studies to see if the honoring of saints was as popular in the neighboring places of the Muslim world.
After leaving 'Uyayna, Ibn 'Abd al-Wahhab performed the Greater Pilgrimage in Mecca, where the scholars appear to have held opinions and espoused teachings that were unpalatable to him. After this, he went to Medina, the stay at which seems to have been "decisive in shaping the later direction of his thought." In Medina, he met a Hanbali theologian from Najd named ʿAbd Allāh b. Ibrāhīm al-Najdī, who had been a supporter of the neo-Hanbali works of Ibn Taymiyyah (d. 1328), the controversial medieval scholar whose teachings had been considered heterodox and misguided on several important points by the vast majority of Sunni Muslim scholars up to that point in history.
Ibn 'Abd al-Wahhab's teacher Abdallah ibn Ibrahim ibn Sayf introduced the relatively young man to Mohammad Hayya Al-Sindhi in Medina who belonged to the Naqshbandi order (tariqa) of Sufism and recommended him as a student. Mohammad Ibn Abd-al-Wahhab and al-Sindhi became very close and Mohammad Ibn Abd-al-Wahhab stayed with him for some time. Muhammad Hayya also taught Mohammad Ibn ʿAbd-al-Wahhab to reject popular religious practices associated with walis and their tombs that resemble later Wahhabi teachings.
Following his early education in Medina, Ibn Abdul Wahhab traveled outside of the peninsula, venturing first to Basra.
After his return home, Ibn ʿAbd al-Wahhab began to attract followers, including the ruler of 'Uyayna, Uthman ibn Mu'ammar. With Ibn Mu'ammar, Ibn ʿAbd al-Wahhab agreed to support Ibn Mu'ammar's political ambitions to expand his rule "over Najd and possibly beyond", in exchange for the ruler's support for Ibn ʿAbd al-Wahhab's religious teachings. Ibn ʿAbd al-Wahhab began to implement some of his ideas for reform. First, he persuaded Ibn Mu'ammar to help him level the grave of Zayd ibn al-Khattab, a companion of Muhammad, whose grave was revered by locals. Second, he ordered the cutting down of trees considered sacred by locals, cutting down "the most glorified of all of the trees" himself. Third, he organized the stoning of a woman who confessed to having committed adultery.
These actions gained the attention of Sulaiman ibn Muhammad ibn Ghurayr of the tribe of Bani Khalid, the chief of Al-Hasa and Qatif, who held substantial influence in Najd. Ibn Ghurayr threatened Ibn Mu'ammar with the loss of his right to collect a land tax on properties that Ibn Mu'ammar owned in al-Hasa if he did not kill or drive away Ibn ʿAbd al-Wahhab. Consequently, Ibn Mu'ammar forced Ibn ʿAbd al-Wahhab to leave.
Upon his expulsion from 'Uyayna, Ibn ʿAbd al-Wahhab was invited to settle in neighboring Diriyah by its ruler Muhammad bin Saud. After some time in Diriyah, Muhammad ibn ʿAbd al-Wahhab concluded his second and more successful agreement with a ruler: Ibn ʿAbd al-Wahhab and Muhammad bin Saud agreed that, together, they would bring the Arabs of the peninsula back to the "true" principles of Islam as they saw it. According to one source, when they first met, bin Saud and Ibn ʿAbd al-Wahhab exchanged declarations of mutual support.
The agreement was confirmed with a mutual oath of loyalty ("bay'ah") in 1744.
Ibn Abd al-Wahhab would be responsible for religious matters and Ibn Saud for political and military issues. This agreement became a "mutual support pact" and power-sharing arrangement between the Al Saud family on one side and the Al ash-Sheikh and followers of Ibn ʿAbd al-Wahhab on the other, which has remained in place for nearly 300 years, providing the ideological impetus to Saudi expansion.
The 1744 pact between Muhammad bin Saud and Muhammad ibn ʿAbd al-Wahhab marked the emergence of the first Saudi state, the Emirate of Diriyah. By offering the Al Saud a clearly defined religious mission, the alliance provided the ideological impetus to Saudi expansion. First conquering Najd, Saud's forces expanded the Salafi influence to most of the present-day territory of Saudi Arabia, eradicating various popular practices they viewed as akin to polytheism and propagating the doctrines of ʿAbd al-Wahhab.
According to academic publications such as the Encyclopædia Britannica, while in Baghdad Ibn ʿAbd al-Wahhab married an affluent woman. When she died, he inherited her property and wealth. Muhammad ibn 'Abd Al-Wahhab had six sons: Hussain, Abdullah, Hassan, Ali, Ibrahim, and Abdul-Aziz, who died in his youth. All his surviving sons established religious schools close to their homes and taught the young students from Diriyah and other places.
The descendants of Ibn ʿAbd al-Wahhab, the Al ash-Sheikh, have historically led the ulama in the Saudi state, dominating the state's religious institutions. Within Saudi Arabia, the family is held in prestige similar to the Saudi royal family, with whom they share power, and has included several religious scholars and officials. The arrangement between the two families is based on the Al Saud maintaining the Al ash-Sheikh's authority in religious matters and upholding and propagating Salafi doctrine. In return, the Al ash-Sheikh support the Al Saud's political authority thereby using its religious-moral authority to legitimize the royal family's rule.
Ibn ʿAbd al-Wahhab considered his movement an effort to purify Islam by returning Muslims to what, he believed, were the original principles of that religion. He taught that the primary doctrine of Islam was the uniqueness and unity of God ("Tawhid"). He also denounced popular beliefs as polytheism (shirk), rejected much of the medieval law of the scholars (ulema) and called for a new interpretation of Islam.
The "core" of Ibn ʿAbd al-Wahhab's teaching is found in "Kitab al-Tawhid", a short essay which draws from material in the Quran and the recorded doings and sayings ("hadith") of the Islamic prophet Muhammad. It preaches that worship in Islam includes conventional acts of worship such as the five daily prayers ("salat"); fasting ("sawm"); supplication ("Dua"); seeking protection or refuge ("Istia'dha"); seeking help ("Ist'ana" and "Istighatha") of Allah.
Ibn ʿAbd al-Wahhab was keen on emphasizing that other acts, such as making "dua" or calling upon/supplication to or seeking help, protection or intercession from anyone or anything other than Allah, are acts of "shirk" and contradict the tenets of tawhid and that those who tried would never be forgiven.
Traditionally, most Muslims throughout history have held the view that declaring the testimony of faith is sufficient in becoming a Muslim. Ibn 'Abd al-Wahhab did not agree with this. He held the view that an individual who believed that there could be intercessors with God was actually performing shirk. This was the major difference between him and his opponents and led him to declare Muslims outside of his group to be apostates (takfir) and idolators (mushrikin).
Ibn 'Abd al-Wahhab's movement is today often known as Wahhabism, although many adherents see this as a derogatory term coined by his opponents, and prefer it to be known as the Salafi movement. Scholars point out that Salafism is a term applied to several forms of puritanical Islam in various parts of the world, while Wahhabism refers to the specific Saudi school, which is seen as a more strict form of Salafism. According to Ahmad Moussalli, professor of political science at the American University of Beirut, "As a rule, all Wahhabis are Salafists, but not all Salafists are Wahhabis". Yet others say that while Wahhabism and Salafism originally were two different things, they became practically indistinguishable in the 1970s.
At the end of his treatise, "Al-Hadiyyah al-Suniyyah", Ibn ʿAbd al-Wahhab's son 'Abd Allah speaks positively on the practice of tazkiah (purification of the inner self).
According to author Dore Gold, in "Kitab al-Tawhid", Ibn Abd al-Wahhab described followers of both the Christian and Jewish faiths as sorcerers who believed in devil worship, and cited a hadith of Muhammad stating that the punishment for the sorcerer is "that he be struck with the sword". Wahhab asserted that both religions had improperly made the graves of their prophets into places of worship and warned Muslims not to imitate this practice. Wahhab concluded that "the ways of the people of the book are condemned as those of polytheists".
However, author Natana J. DeLong-Bas defends Abdul Wahhab, stating that "despite his at times vehement denunciations of other religious groups for their supposedly heretical beliefs, Ibn Abd al Wahhab never called for their destruction or death … he assumed that these people would be punished in the Afterlife".
Historical accounts of Wahhab also state that "Muhammad ibn ʿAbd al-Wahhab saw it as his mission to restore a purer and more original form of the faith of Islam. … Anyone who did not adhere to this interpretation was considered a polytheist worthy of death, including fellow Muslims (especially Shi'ites, who venerate the family of Muhammad), Christians and others. He also advocated for a literalist interpretation of the Quran and its laws."
Despite his great aversion to venerating the saints after their earthly passing and seeking their intercession, it should nevertheless be noted that Ibn 'Abd-al-Wahhab did not deny the existence of saints as such; on the contrary, he acknowledged that "the miracles of saints ("karāmāt al-awliyāʾ") are not to be denied, and their right guidance by God is acknowledged" when they acted properly during their life.
Ibn ʿAbd al-Wahhab's teachings were criticized by a number of Islamic scholars during his life for disregarding Islamic history, monuments, traditions and the sanctity of Muslim life. One scholar named Ibn Muhammad compared Ibn 'Abd al-Wahhab with Musaylimah. He also accused Ibn 'Abd al-Wahhab of wrongly declaring the Muslims to be infidels based on a misguided reading of Qur'anic passages and Prophetic traditions and of wrongly declaring all scholars as infidels who did not agree with his "deviant innovation".
The traditional Hanbali scholar Ibn Fayruz al-Tamimi (d. 1801/1802) publicly repudiated Ibn 'Abd al-Wahhab's teachings when he sent an envoy to him and referred to the Wahhabis as the "seditious Kharijites" of Najd. In response, the Wahhabis considered Ibn Fayruz an idolater (mushrik) and one of their worst enemies.
According to the historian Ibn Humayd, Ibn 'Abd al-Wahhab's father criticized his son for his unwillingness to specialize in jurisprudence, disagreed with his doctrine, and declared that he would be the cause of wickedness. Similarly, his brother, Suleyman ibn 'Abd al-Wahhab, wrote one of the first treatises refuting Wahhabi doctrine, claiming that he was ill-educated and intolerant, and classing Ibn ʿAbd al-Wahhab's views as fringe and fanatical.
The Shafi'i mufti of Mecca, Ahmed ibn Zayni Dehlan, wrote an anti-Wahhabi treatise, the bulk of which consists of arguments and proof from the sunna to uphold the validity of practices the Wahhabis considered idolatrous: Visiting the tombs of Muhammad, seeking the intercession of saints, venerating Muhammad and obtaining the blessings of saints. He also accused Ibn 'Abd al-Wahhab of not adhering to the Hanbali school and that he was deficient in learning.
Ibn ʿAbd al-Wahhab is accepted by Salafi scholars as an authority and source of reference. 20th century Albanian scholar Nasiruddin Albani refers to Ibn Abdul Wahhab's activism as "Najdi da'wah."
A list of scholars with opposing views, along with names of their books and related information, was compiled by the Islamic scholar Muhammad Hisham.
In 2010, Prince Salman bin Abdul-Aziz at the time serving as the governor of Riyadh said that the teaching of Muhammad Ibn Abdul-Wahab was pure Islam, and said regarding his works, "I dare anyone to bring a single alphabetical letter from the Sheikh's books that goes against the book of Allah ... and the teachings of his prophet, Mohammed."
The national mosque of Qatar is named after him. The "Imam Muhammad ibn Abd al-Wahhab Mosque" was opened in 2011, with the Emir of Qatar presiding over the occasion. The mosque can hold a congregation of 30,000 people. In 2017, a request signed by 200 descendants of Ibn Abd al-Wahhab was published in the Saudi Arabian newspaper "Okaz", asking that the name of the mosque be changed because, according to their statement, "it does not carry its true Salafi path", even though most Qataris practice Wahhabism.
Despite Wahhabi destruction of many Islamic, non-Islamic, cultural and historical sites associated with the early history of Islam and the first generation of Muslims (Muhammad's family and his companions), the Saudi government undertook a large-scale development of Muhammad ibn Abd al-Wahhab's domain, Diriyah, turning it into a major tourist attraction. Other features in the area include the "Sheikh Muhammad bin Abdul Wahab Foundation", which is planned to include a light and sound presentation located near the "Mosque of Sheikh Mohammad bin Abdulwahab".
There are two contemporary histories of Muhammed ibn ʿAbd al-Wahhab and his religious movement from the point of view of his supporters: Ibn Ghannam's "Rawdhat al-Afkar wal-Afham" or "Tarikh Najd" (History of Najd) and Ibn Bishr's "Unwan al-Majd fi Tarikh Najd". Husain ibn Ghannam (d. 1811), an alim from al-Hasa was the only historian to have observed the beginnings of Ibn ʿAbd al-Wahhab's movement first-hand. His chronicle ends at the year 1797. Ibn Bishr's chronicle, which stops at the year 1854, was written a generation later than Ibn Ghannam's but is considered valuable partly because Ibn Bishr was a native of Najd and because he adds many details to Ibn Ghannam's account.
A third account, dating from around 1817 is "Lam' al-Shihab", written by an anonymous Sunni author who respectfully disapproved of Ibn ʿAbd al-Wahhab's movement, regarding it as a "bid'ah". It is also commonly cited because it is considered to be a relatively objective contemporary treatment of the subject. However, unlike Ibn Ghannam and Ibn Bishr, its author did not live in Najd and his work is believed to contain some apocryphal and legendary material concerning the details of Ibn ʿAbd al-Wahhab's life.
|
https://en.wikipedia.org/wiki?curid=19975
|
Maine
Maine () is the northernmost state in the northeastern United States. Maine is the 12th smallest by area, the 9th least populous, and the 13th least densely populated of the 50 U.S. states. Located in New England, it is bordered by New Hampshire to the west, the Atlantic Ocean to the southeast, and the Canadian provinces of New Brunswick and Québec to the northeast and northwest, respectively. Maine is the only state to border exactly one other state, is the easternmost among the contiguous United States, and is the northernmost state east of the Great Lakes.
Maine is known for its jagged, rocky coastline; low, rolling mountains; heavily forested interior; and picturesque waterways, as well as its seafood cuisine, especially lobster and clams. There is a humid continental climate throughout most of the state, including coastal areas. Maine's most populous city is Portland and its capital is Augusta.
For thousands of years, indigenous peoples were the only inhabitants of the territory that is now Maine. At the time of European arrival in what is now Maine, several Algonquian-speaking peoples inhabited the area. The first European settlement in the area was by the French in 1604 on Saint Croix Island, by Pierre Dugua, Sieur de Mons. The first English settlement was the short-lived Popham Colony, established by the Plymouth Company in 1607. A number of English settlements were established along the coast of Maine in the 1620s, although the rugged climate, deprivations, and conflict with the local peoples caused many to fail over the years.
As Maine entered the 18th century, only a half dozen European settlements had survived. Loyalist and Patriot forces contended for Maine's territory during the American Revolution. During the War of 1812, the largely undefended eastern region of Maine was occupied by British forces, but it was returned to the United States following failed British offensives on the northern border, the mid-Atlantic and the South, which produced a peace treaty that was to include dedicated land on the Michigan peninsula for Native American peoples. Maine was part of the Commonwealth of Massachusetts until 1820, when it voted to secede from Massachusetts to become a separate state. On March 15, 1820, under the Missouri Compromise, it was admitted to the Union as the 23rd state.
There is no definitive explanation for the origin of the name "Maine", but the most likely origin is that the name was given by early explorers after the former province of Maine in France. Whatever the origin, the name was fixed for English settlers in 1665 when the English King's Commissioners ordered that the "Province of Maine" be entered from then on in official records. The state legislature in 2001 adopted a resolution establishing Franco-American Day, which stated that the state was named after the former French province of Maine.
Other theories mention earlier places with similar names, or claim it is a nautical reference to the mainland. Captain John Smith, in his "Description of New England" (1614) bemoans the lack of exploration: "Thus you may see, of this 2000. miles more then halfe is yet vnknowne to any purpose: no not so much as the borders of the Sea are yet certainly discouered. As for the goodnes and true substances of the Land, wee are for most part yet altogether ignorant of them, vnlesse it bee those parts about the Bay of Chisapeack and Sagadahock: but onely here and there wee touched or haue seene a little the edges of those large dominions, which doe stretch themselues into the Maine, God doth know how many thousand miles;"
Note that his description of the mainland of North America is "the Maine". The word "main" was a frequent shorthand for the word "mainland" (as in "The Spanish Main").
Attempts to uncover the history of the name of Maine began with James Sullivan's 1795 "History of the District of Maine." He made the unsubstantiated claim that the Province of Maine was a compliment to the queen of Charles I, Henrietta Maria, who once "owned" the Province of Maine in France. This was quoted by Maine historians until the 1845 biography of that queen by Agnes Strickland established that she had no connection to the province; further, King Charles I married Henrietta Maria in 1625, three years after the name Maine first appeared on the charter. A new theory, put forward by Carol B. Smith Fisher in 2002, is that Sir Ferdinando Gorges chose the name in 1622 to honor the village where his ancestors first lived in England, rather than the province in France. "MAINE" appears in the Domesday Book of 1086 in reference to the county of Dorset, which is today Broadmayne, just southeast of Dorchester.
The view generally held among British place name scholars is that Mayne in Dorset is Brythonic, corresponding to modern Welsh "maen", plural "main" or "meini". Some early spellings are: MAINE 1086, MEINE 1200, MEINES 1204, MAYNE 1236. Today the village is known as Broadmayne, which is primitive Welsh or Brythonic, "main" meaning rock or stone, considered a reference to the many large sarsen stones still present around Little Mayne farm, half a mile northeast of Broadmayne village.
The first known record of the name appears in an August 10, 1622 land charter to Sir Ferdinando Gorges and Captain John Mason, English Royal Navy veterans, who were granted a large tract in present-day Maine that Mason and Gorges "intend to name the Province of Maine". Mason had served with the Royal Navy in the Orkney Islands, where the chief island is called Mainland, a possible name derivation for these English sailors. In 1623, the English naval captain Christopher Levett, exploring the New England coast, wrote: "The first place I set my foote upon in New England was the Isle of Shoals, being Ilands in the sea, above two Leagues from the Mayne." Initially, several tracts along the coast of New England were referred to as "Main" or "Maine" (cf. the Spanish Main). A reconfirmed and enhanced April 3, 1639, charter from England's King Charles I gave Sir Ferdinando Gorges increased powers over his new province and stated that it "shall forever hereafter, be called and named the PROVINCE OR COUNTIE OF MAINE, and not by any other name or names whatsoever..." Maine is the only U.S. state whose name has exactly one syllable.
The original inhabitants of the territory that is now Maine were Algonquian-speaking Wabanaki peoples, including the Passamaquoddy, Maliseet, Penobscot, Androscoggin, and Kennebec. During the later King Philip's War, many of these peoples would merge in one form or another to become the Wabanaki Confederacy, aiding the Wampanoag of Massachusetts and the Mahican of New York. Afterwards, many of these people were driven from their natural territories, but most of the tribes of Maine continued, unchanged, until the American Revolution. Before this point, however, most of these people were considered separate nations. Many had adapted to living in permanent, Iroquois-inspired settlements, while those along the coast tended to be semi-nomadic, traveling from settlement to settlement on a yearly cycle. They would usually winter inland and head to the coasts by summer.
European contact with what is now called Maine started around 1200 CE when Norwegians interacted with the native Penobscot in present-day Hancock County, most likely through trade. About 200 years earlier, from the settlements in Iceland and Greenland, Norwegians had first identified America and attempted to settle areas such as Newfoundland, but failed to establish a permanent settlement there. Archeological evidence suggests that Norwegians in Greenland returned to North America for several centuries after the initial discovery to collect timber and to trade, with the most relevant evidence being the Maine Penny, an 11th-century Norwegian coin found at a Native American dig site in 1954.
The first European settlement in Maine was in 1604 on Saint Croix Island, led by French explorer Pierre Dugua, Sieur de Mons. His party included the noted explorer Samuel de Champlain. The French named the entire area Acadia, including the portion that later became the state of Maine. The first English settlement in Maine was established by the Plymouth Company at the Popham Colony in 1607, the same year as the settlement at Jamestown, Virginia. The Popham colonists returned to Britain after 14 months.
The French established two Jesuit missions: one on Penobscot Bay in 1609, and the other on Mount Desert Island in 1613. The same year, Castine was established by Claude de La Tour. In 1625, Charles de Saint-Étienne de la Tour erected Fort Pentagouet to protect Castine. The coastal areas of eastern Maine first became the Province of Maine in a 1622 land patent. The part of western Maine north of the Kennebec River was more sparsely settled and was known in the 17th century as the Territory of Sagadahock. A second settlement was attempted in 1623 by English explorer and naval captain Christopher Levett at a place called York, where he had been granted land by King Charles I of England. It also failed.
Central Maine was formerly inhabited by people of the Androscoggin tribe of the Abenaki nation, also known as Arosaguntacook. They were driven out of the area in 1690 during King William's War. They were relocated to St. Francis, Canada, which was destroyed by Rogers' Rangers in 1759 and is now Odanak. The other Abenaki tribes suffered several severe defeats, particularly during Dummer's War, with the capture of Norridgewock in 1724 and the defeat of the Pequawket in 1725, which greatly reduced their numbers. They finally withdrew to Canada, where they were settled at Bécancour and Sillery, and later at St. Francis, along with other refugee tribes from the south.
The province within its current boundaries became part of the Massachusetts Bay Colony in 1652. Maine was much fought over by the French, English, and allied natives during the 17th and early 18th centuries, who conducted raids against each other, taking captives for ransom or, in some cases, adoption by Native American tribes. A notable example was the early 1692 Abenaki raid on York, where about 100 English settlers were killed and an estimated 80 more taken hostage. Captives taken by the Abenaki during raids of Massachusetts in Queen Anne's War of the early 1700s were brought to Kahnawake, a Catholic Mohawk village near Montreal, where some were adopted and others ransomed.
After the British defeated the French in Acadia in the 1740s, the territory from the Penobscot River east fell under the nominal authority of the Province of Nova Scotia, and together with present-day New Brunswick formed the Nova Scotia county of Sunbury, with its court of general sessions at Campobello. American and British forces contended for Maine's territory during the American Revolution and the War of 1812, with the British occupying eastern Maine in both conflicts. The territory of Maine was confirmed as part of Massachusetts when the United States was formed following the Treaty of Paris ending the revolution, although the final border with British North America was not established until the Webster–Ashburton Treaty of 1842.
Maine was physically separate from the rest of Massachusetts. Long-standing disagreements over land speculation and settlements led to Maine residents and their allies in Massachusetts proper forcing an 1807 vote in the Massachusetts Assembly on permitting Maine to secede; the vote failed. Secessionist sentiment in Maine was stoked during the War of 1812 when Massachusetts pro-British merchants opposed the war and refused to defend Maine from British invaders. In 1819, Massachusetts agreed to permit secession, sanctioned by voters of the rapidly growing region the following year. Formal secession and formation of the state of Maine as the 23rd state occurred on March 15, 1820, as part of the Missouri Compromise, which geographically limited the spread of slavery and enabled the admission to statehood of Missouri the following year, keeping a balance between slave and free states.
Maine's original state capital was Portland, Maine's largest city, until it was moved to the more central Augusta in 1832. The principal office of the Maine Supreme Judicial Court remains in Portland.
The 20th Maine Volunteer Infantry Regiment, under the command of Colonel Joshua Lawrence Chamberlain, prevented the Union Army from being flanked at Little Round Top by the Confederate Army during the Battle of Gettysburg.
Four U.S. Navy ships have been named USS "Maine", most famously the armored cruiser whose sinking by an explosion on February 15, 1898, precipitated the Spanish–American War.
To the south and east is the Atlantic Ocean, and to the north and northeast is New Brunswick, a province of Canada. The Canadian province of Québec is to the northwest. Maine is both the northernmost state in New England and the largest, accounting for almost half of the region's entire land area. Maine is the only state to border exactly one other American state (New Hampshire).
Maine is the easternmost state in the United States both in its extreme points and in its geographic center. The town of Lubec is the easternmost organized settlement in the United States. Its Quoddy Head Lighthouse is also the closest place in the United States to Africa and Europe. Estcourt Station is Maine's northernmost point, as well as the northernmost point in New England. (For more information see extreme points of the United States.)
Maine's Moosehead Lake is the largest lake wholly in New England, since Lake Champlain is located between Vermont, New York and Québec. A number of other Maine lakes, such as South Twin Lake, are described by Thoreau in "The Maine Woods" (1864). Mount Katahdin is both the northern terminus of the Appalachian Trail, which extends southerly to Springer Mountain, Georgia, and the southern terminus of the new International Appalachian Trail which, when complete, will run to Belle Isle, Newfoundland and Labrador.
Machias Seal Island and North Rock, off the state's Downeast coast, are claimed by both Canada and the American town of Cutler, and are within one of four areas between the two countries whose sovereignty is still in dispute, but it is the only one of the disputed areas containing land. Also in this easternmost area in the Bay of Fundy is the Old Sow, the largest tidal whirlpool in the Western Hemisphere.
Maine is the least densely populated U.S. state east of the Mississippi River. It is called the Pine Tree State; over 80% of its total land is forested and/or unclaimed, the most forest cover of any U.S. state. In the forested areas of the interior lies much uninhabited land, some of which does not have formal political organization into local units (a rarity in New England). The Northwest Aroostook, Maine unorganized territory in the northern part of the state, for example, has an area of and a population of 10, or one person for every .
Maine is in the temperate broadleaf and mixed forests biome. The land near the southern and central Atlantic coast is covered by the mixed oaks of the Northeastern coastal forests. The remainder of the state, including the North Woods, is covered by the New England-Acadian forests.
Maine has almost of coastline (and of tidal coastline). West Quoddy Head, in Lubec, is the easternmost point of land in the 48 contiguous states. Along the famous rock-bound coast of Maine are lighthouses, beaches, fishing villages, and thousands of offshore islands, including the Isles of Shoals, which straddle the New Hampshire border. There are jagged rocks and cliffs and many bays and inlets. Inland are lakes, rivers, forests, and mountains. This visual contrast of forested slopes sweeping down to the sea has been summed up by American poet Edna St. Vincent Millay of Rockland and Camden, Maine, in her poem "Renascence".
Geologists describe this type of landscape as a "drowned coast", where a rising sea level has invaded former land features, creating bays out of valleys and islands out of mountain tops. A rise in the elevation of the land due to the melting of heavy glacier ice caused a slight rebounding effect of underlying rock; this land rise, however, was not enough to eliminate all the effect of the rising sea level and its invasion of former land features.
Much of Maine's geomorphology was created by extended glacial activity at the end of the last ice age. Prominent glacial features include Somes Sound and Bubble Rock, both part of Acadia National Park on Mount Desert Island. Carved by glaciers, Somes Sound is considered to be the only fjord on the eastern seaboard. Its depth and steep drop-off allow large ships to navigate almost the entire length of the sound. These features have also made it attractive for boat builders, such as the prestigious Hinckley Yachts.
Bubble Rock, a glacial erratic, is a large boulder perched on the edge of Bubble Mountain in Acadia National Park. By analyzing the type of granite, geologists were able to determine that glaciers carried Bubble Rock to its present location from near Lucerne. The Iapetus Suture runs through the north and west of the state; the area to its north and west is underlain by the ancient Laurentian terrane, and the south and east by the Avalonian terrane.
Acadia National Park is the only national park in New England. Areas under the protection and management of the National Park Service include:
Maine has a humid continental climate (Köppen climate classification "Dfb"), with warm and sometimes humid summers and long, cold, and very snowy winters. Winters are especially severe in the northern and western parts of Maine, while coastal areas are moderated slightly by the Atlantic Ocean, resulting in slightly milder winters and cooler summers than inland areas. Daytime highs are generally in the 70s °F (low-to-mid 20s °C) throughout the state in July, with overnight lows in the high 50s °F (around 15 °C). January temperatures range from highs near freezing on the southern coast to overnight lows averaging below 0 °F (−18 °C) in the far north.
The state's record high temperature is 105 °F (41 °C), set in July 1911 at North Bridgton.
Precipitation in Maine is evenly distributed year-round, but with a slight summer maximum in northern and northwestern Maine and a slight late-fall or early-winter maximum along the coast due to "nor'easters", intense cold-season rain and snow storms. In coastal Maine, the late spring and summer months are usually driest, a rarity across the Eastern United States. Maine has fewer days of thunderstorms than any other state east of the Rockies, with most of the state averaging fewer than 20 days of thunderstorms a year. Tornadoes are rare in Maine, with the state averaging fewer than four per year, although this number is increasing. Most severe thunderstorms and tornadoes occur in the Sebago Lakes and Foothills region of the state. Maine rarely sees the effects of tropical cyclones, as they tend to pass well east and south or are greatly weakened by the time they reach Maine.
In January 2009, a new record low temperature for the state, −50 °F (−46 °C), was set at Big Black River, tying the New England record.
Annual precipitation varies across the state, from the lowest totals in Presque Isle to the highest in Acadia National Park.
The United States Census Bureau estimates that the population of Maine was 1,344,212 on July 1, 2019, a 1.19% increase since the 2010 United States Census. The population density of the state is 41.3 people per square mile, making it the least densely populated state east of the Mississippi River. As of 2010 Maine was also the most rural state in the Union, with only 38.7% of the state's population living within urban areas. As explained in detail under "Geography", there are large tracts of uninhabited land in some remote parts of the interior of the state, particularly in the North Maine Woods.
The mean population center of Maine is located in Kennebec County, just east of Augusta. The Greater Portland metropolitan area is the most densely populated with nearly 40% of Maine's population. This area spans three counties and includes many farms and wooded areas; the 2016 population of Portland proper was 66,937.
Maine has experienced a very slow rate of population growth since the 1990 census; its rate of growth (0.57%) since the 2010 census ranks 45th of the 50 states. The modest population growth in the state has been concentrated in the southern coastal counties; with more diverse populations slowly moving into these areas of the state. However, the northern, more rural areas of the state have experienced a slight decline in population in recent years.
According to the 2010 Census, Maine has the highest percentage of non-Hispanic whites of any state, at 94.4% of the total population. In 2011, 89.0% of all births in the state were to non-Hispanic white parents. Maine also has the second-highest residential senior population.
According to the 2016 American Community Survey, 1.5% of Maine's population were of Hispanic or Latino origin (of any race): Mexican (0.4%), Puerto Rican (0.4%), Cuban (0.1%), and other Hispanic or Latino origin (0.6%). The five largest ancestry groups were: English (20.7%), Irish (17.3%), French (15.7%), German (8.1%), and American (7.8%).
People citing American ancestry are of overwhelmingly English descent, but have ancestry that has been in the region for so long (often since the 1600s) that they choose to identify simply as Americans.
Maine has the highest percentage of French Americans of any state. Most are of Canadian origin, but in some cases families have lived in the area since before the American Revolutionary War. There are particularly high concentrations in the northern part of Maine in Aroostook County, which is part of a cultural region known as Acadia that goes over the border into New Brunswick. Along with the Acadian population in the north, many French came from Quebec as immigrants between 1840 and 1930.
The upper Saint John River valley area was once part of the so-called Republic of Madawaska, before the frontier was decided in the Webster-Ashburton Treaty of 1842. Over a quarter of the population of Lewiston, Waterville, and Biddeford are Franco-American. Most of the residents of the Mid Coast and Down East sections are chiefly of British heritage. Smaller numbers of various other groups, including Irish, Italian and Polish, have settled throughout the state since the late 19th and early 20th century immigration waves.
"Note: Births in table don't add up, because Hispanics are counted both by their ethnicity and by their race, giving a higher overall number."
Maine does not have an official language, but the most widely spoken language in the state is English. The 2000 Census reported 92.25% of Maine residents aged five and older spoke only English at home. French-speakers are the state's chief linguistic minority; census figures show that Maine has the highest percentage of people speaking French at home of any state: 5.28% of Maine households are French-speaking, compared with 4.68% in Louisiana, which is the second highest state. Although rarely spoken, Spanish is the third-most-common language in Maine, after English and French.
According to the Association of Religion Data Archives (ARDA), the religious affiliations of Maine in 2010 were:
The Catholic Church was the largest religious institution, with 202,106 members; the United Methodist Church had 28,329 members; and the United Church of Christ had 22,747 members.
In 2010, a study named Maine as the least religious state in the United States.
The Bureau of Economic Analysis estimates that Maine's total gross state product for 2010 was $52 billion. Its per capita personal income for 2007 was $33,991, 34th in the nation. Maine's unemployment rate is 3.0%.
Maine's agricultural outputs include poultry, eggs, dairy products, cattle, wild blueberries, apples, maple syrup, and maple sugar. Aroostook County is known for its potato crops. Commercial fishing, once a mainstay of the state's economy, maintains a presence, particularly lobstering and groundfishing. While lobster is the main seafood focus for Maine, the harvests of both oysters and seaweed are on the rise. In 2015, 14% of the Northeast's total oyster supply came from Maine. In 2017, the production of Maine's seaweed industry was estimated at $20 million per year. The shrimp industry of Maine is on a government-mandated hold: with an ever-decreasing northern shrimp population, Maine fishermen are no longer allowed to catch and sell shrimp. The hold began in 2014 and is expected to continue until 2021. Western Maine aquifers and springs are a major source of bottled water.
Maine's industrial outputs consist chiefly of paper, lumber and wood products, electronic equipment, leather products, food products, textiles, and bio-technology. Naval shipbuilding and construction remain key as well, with Bath Iron Works in Bath and Portsmouth Naval Shipyard in Kittery.
Brunswick Landing, formerly Naval Air Station Brunswick, is also in Maine. Formerly a large support base for the U.S. Navy, the BRAC campaign initiated the Naval Air Station's closing, despite a government-funded effort to upgrade its facilities. The former base has since been changed into a civilian business park, as well as a new satellite campus for Southern Maine Community College.
Maine is the number one U.S. producer of low-bush blueberries (Vaccinium angustifolium). Preliminary data from the USDA for 2012 also indicate Maine was the largest blueberry producer among the major blueberry-producing states, with 91,100,000 lbs. This figure includes both low-bush (wild) and high-bush (cultivated, Vaccinium corymbosum) blueberries. The largest toothpick manufacturing plant in the United States used to be located in Strong, Maine. The Strong Wood Products plant produced 20 million toothpicks a day. It closed in May 2003.
Tourism and outdoor recreation play a major and increasingly important role in Maine's economy. The state is a popular destination for sport hunting (particularly deer, moose and bear), sport fishing, snowmobiling, skiing, boating, camping and hiking, among other activities.
Historically, Maine ports played a key role in national transportation. Beginning around 1880, Portland's rail link and ice-free port made it Canada's principal winter port, until the aggressive development of Halifax, Nova Scotia, in the mid-1900s. In 2013, 12,039,600 short tons passed into and out of Portland by sea, ranking it 45th among U.S. water ports. The Portland International Jetport was recently expanded, providing the state with increased air traffic from carriers such as JetBlue and Southwest Airlines.
Maine has very few large companies that maintain headquarters in the state, and that number has fallen due to consolidations and mergers, particularly in the pulp and paper industry. Some of the larger companies that do maintain headquarters in Maine include Fairchild Semiconductor in South Portland; IDEXX Laboratories, in Westbrook; Hannaford Bros. Co. in Scarborough; TD Bank in Portland; L.L.Bean in Freeport; and Cole Haan in Yarmouth. Maine is also the home of the Jackson Laboratory, the world's largest non-profit mammalian genetic research facility and the world's largest supplier of genetically purebred mice.
Maine has an income tax structure containing two brackets, 6.5 and 7.95 percent of personal income. Before July 2013, Maine had four brackets: 2, 4.5, 7, and 8.5 percent. Maine's general sales tax rate is 5.5 percent. The state also levies charges of 9 percent on lodging and prepared food and 10 percent on short-term auto rentals. Commercial sellers of blueberries, a Maine staple, must keep records of their transactions and pay the state 1.5 cents per pound ($1.50 per 100 pounds) of the fruit sold each season. All real and tangible personal property located in the state of Maine is taxable unless specifically exempted by statute. The administration of property taxes is handled by the local assessor in incorporated cities and towns, while property taxes in the unorganized territories are handled by the State Tax Assessor.
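For illustration only, the two-bracket computation described above can be sketched in a few lines of Python. The 6.5 and 7.95 percent rates come from the paragraph above; the dollar threshold at which the brackets split is not stated there, so the figure below is a hypothetical placeholder.

# Minimal sketch of Maine's two-bracket income tax as described above.
# The 6.5% and 7.95% rates come from the text; BRACKET_THRESHOLD is a
# hypothetical placeholder, not a figure from the article.
BRACKET_THRESHOLD = 20_000  # assumed split point, for illustration only

def maine_income_tax(taxable_income: float) -> float:
    """Tax the first BRACKET_THRESHOLD dollars at 6.5%, the remainder at 7.95%."""
    lower = min(taxable_income, BRACKET_THRESHOLD)
    upper = max(taxable_income - BRACKET_THRESHOLD, 0)
    return 0.065 * lower + 0.0795 * upper

print(maine_income_tax(30_000))  # 0.065*20000 + 0.0795*10000 = 2095.0

The blueberry levy works the same way: at 1.5 cents per pound, a 100-pound sale owes $1.50, as the paragraph notes.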
Maine has a long-standing shipbuilding tradition. In the 18th and 19th centuries, many Maine shipyards produced wooden sailing ships, whose main function was to transport cargo or passengers overseas. One of these yards was located in the Pennellville Historic District in what is now Brunswick, Maine. This yard, owned by the Pennell family, was typical of the many family-owned shipbuilding companies of the period. Other such shipbuilding families were the Skolfields and the Morses. During the 18th and 19th centuries, wooden shipbuilding of this sort made up a sizable portion of the economy.
Maine receives passenger jet service at its two largest airports, the Portland International Jetport in Portland, and the Bangor International Airport in Bangor. Both are served daily by many major airlines to destinations such as New York, Atlanta, and Orlando. Essential Air Service also subsidizes service to a number of smaller airports in Maine, bringing small turboprop aircraft to regional airports such as the Augusta State Airport, Hancock County-Bar Harbor Airport, Knox County Regional Airport, and the Northern Maine Regional Airport at Presque Isle. These airports are served by regional providers such as Cape Air with Cessna 402s, PenAir with Saab 340s, and CommutAir with Embraer ERJ 145 aircraft.
Many smaller airports are scattered throughout Maine, serving only general aviation traffic. The Eastport Municipal Airport, for example, is a city-owned public-use airport with 1,200 general aviation aircraft operations each year from single-engine and ultralight aircraft.
Interstate 95 (I-95) travels through Maine, as do its easterly branch I-295 and its spurs I-195, I-395, and the unsigned I-495 (the Falmouth Spur). In addition, U.S. Route 1 (US 1) starts in Fort Kent and travels to Florida. The eastern section of US 2 begins in Houlton, near the New Brunswick border, and runs to Rouses Point, New York, at US 11. US 2A connects Old Town and Orono, primarily serving the University of Maine campus. US 201 and US 202 also pass through the state. US 2, Maine State Route 6 (SR 6), and SR 9 are often used by truckers and other motorists from the Maritime Provinces en route to other destinations in the United States or as a shortcut to Central Canada.
The "Downeaster" passenger train, operated by Amtrak, provides passenger service between Brunswick and Boston's North Station, with stops in Freeport, Portland, Old Orchard Beach, Saco, and Wells. The "Downeaster" makes five daily trips.
Freight service throughout the state is provided by a handful of regional and shortline carriers: Pan Am Railways (formerly known as Guilford Rail System), which operates the former Boston & Maine and Maine Central railroads; St. Lawrence and Atlantic Railroad; Maine Eastern Railroad; Central Maine and Quebec Railway; and New Brunswick Southern Railway.
The Maine Constitution structures Maine's state government, composed of three co-equal branches—the executive, legislative, and judicial branches. The state of Maine also has three Constitutional Officers (the Secretary of State, the State Treasurer, and the State Attorney General) and one Statutory Officer (the State Auditor).
The legislative branch is the Maine Legislature, a bicameral body composed of the Maine House of Representatives, with 151 members, and the Maine Senate, with 35 members. The Legislature is charged with introducing and passing laws.
The executive branch is responsible for the execution of the laws created by the Legislature and is headed by the Governor of Maine (currently Janet Mills). The Governor is elected every four years; no individual may serve more than two consecutive terms in this office. The current attorney general of Maine is Aaron Frey. As with other state legislatures, the Maine Legislature can by a two-thirds majority vote from both the House and Senate override a gubernatorial veto. Maine is one of seven states that do not have a lieutenant governor.
The judicial branch is responsible for interpreting state laws. The highest court of the state is the Maine Supreme Judicial Court. The lower courts are the District Court, Superior Court and Probate Court. All judges except for probate judges serve full-time, are nominated by the Governor and confirmed by the Legislature for terms of seven years. Probate judges serve part-time and are elected by the voters of each county for four-year terms.
Maine is divided into political jurisdictions designated as counties. Since 1860 there have been 16 counties in the state.
In state general elections, Maine voters tend to accept independent and third-party candidates more frequently than most states. Maine has had two independent governors recently (James B. Longley, 1975–1979 and current U.S. Senator Angus King, 1995–2003). Maine state politicians, Democrats and Republicans alike, are noted for having more moderate views than many in the national wings of their respective parties.
Maine is an alcoholic beverage control state.
On May 6, 2009, Maine became the fifth state to legalize same-sex marriage; however, the law was repealed by voters on November 3, 2009. On November 6, 2012, Maine, along with Maryland and Washington, became the first states to legalize same-sex marriage at the ballot box.
In the 1930s, Maine was one of very few states which retained Republican sentiments. In the 1936 presidential election, Franklin D. Roosevelt received the electoral votes of every state other than Maine and Vermont; these were the only two states in the nation that never voted for Roosevelt in any of his presidential campaigns, though Maine was closely fought in 1940 and 1944. In the 1960s, Maine began to lean toward the Democrats, especially in presidential elections. In 1968, Hubert Humphrey became just the second Democrat in half a century to carry Maine, perhaps because of the presence of his running mate, Maine Senator Edmund Muskie, although the state voted Republican in every presidential election in the 1970s and 1980s.
Since 1969, two of Maine's four electoral votes have been awarded based on the winner of the statewide election; the other two go to the highest vote-getter in each of the state's two congressional districts. Every other state except Nebraska gives all its electoral votes to the candidate who wins the popular vote in the state at large, without regard to performance within districts. Maine split its electoral vote for the first time in 2016, with Donald Trump's strong showing in the more rural central and northern Maine allowing him to capture one of the state's four votes in the Electoral College.
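The district-based allocation described above can be made concrete with a short, hedged Python sketch. The candidate names and vote totals below are invented for illustration; the rule encoded (one elector to each congressional-district winner, two to the statewide winner) is only the scheme summarized in the preceding paragraph.

# Sketch of Maine's electoral-vote allocation: one elector to the winner of
# each congressional district, plus two at-large electors to the statewide
# popular-vote winner. District names and tallies below are invented.
from collections import Counter

def allocate_maine_electors(district_votes: dict[str, dict[str, int]]) -> Counter:
    electors: Counter = Counter()
    statewide: Counter = Counter()
    for tallies in district_votes.values():
        electors[max(tallies, key=tallies.get)] += 1  # one elector per district
        statewide.update(tallies)                     # accumulate statewide totals
    electors[max(statewide, key=statewide.get)] += 2  # two at-large electors
    return electors

votes = {
    "CD-1": {"A": 240_000, "B": 160_000},
    "CD-2": {"A": 140_000, "B": 180_000},
}
print(allocate_maine_electors(votes))  # Counter({'A': 3, 'B': 1})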
Ross Perot achieved a great deal of success in Maine in the presidential elections of 1992 and 1996. In 1992, as an independent candidate, Perot came in second to Democrat Bill Clinton, despite the long-time presence of the Bush family summer home in Kennebunkport. In 1996, as the nominee of the Reform Party, Perot did better in Maine than in any other state.
Maine voted for Democrat Bill Clinton twice, Al Gore in 2000, John Kerry in 2004, and Barack Obama in 2008 and 2012. In 2016, Republican Donald Trump won one of Maine's electoral votes, with Democratic opponent Hillary Clinton winning the other three. Although Democrats have mostly carried the state in presidential elections in recent years, Republicans have largely maintained their control of the state's U.S. Senate seats, with Edmund Muskie, William Hathaway, and George J. Mitchell being the only Maine Democrats to serve in the U.S. Senate in the past fifty years.
In the 2010 midterm elections, Republicans made major gains in Maine. They captured the governor's office as well as majorities in both chambers of the state legislature for the first time since the early 1970s. However, in the 2012 elections Democrats managed to recapture both houses of Maine Legislature.
Maine's U.S. senators are Republican Susan Collins and Independent Angus King. The governor is Democrat Janet Mills. The state's two members of the United States House of Representatives are Democrats Chellie Pingree and Jared Golden.
Maine is the first state to have introduced ranked-choice voting in federal elections.
An organized municipality has a form of elected local government which administers and provides local services, keeps records, collects licensing fees, and can pass locally binding ordinances, among other responsibilities of self-government. The governmental format of most organized towns and plantations is the town meeting, while the format of most cities is the council-manager form. The organized municipalities of Maine consist of 23 cities, 431 towns, and 34 plantations. Collectively these 488 organized municipalities cover less than half of the state's territory. Maine also has three reservations: Indian Island, Indian Township Reservation, and Pleasant Point Indian Reservation.
Unorganized territory (UT) has no local government. Administration, services, licensing, and ordinances are handled by the state government, as well as by the respective county governments that have townships within their bounds. The unorganized territory of Maine consists of more than 400 townships (towns are incorporated, townships are unincorporated), plus many coastal islands that do not lie within any municipal bounds. The UT land area is slightly over half the entire area of the State of Maine. Year-round residents in the UT number approximately 9,000 (about 1.3% of the state's total population), with many more people staying there only seasonally. Only four of Maine's sixteen counties (Androscoggin, Cumberland, Waldo, and York) are entirely incorporated, although a few others are nearly so, and most of the unincorporated area is in the vast and sparsely populated Great North Woods of Maine.
Throughout Maine, many municipalities, although each is a separate governmental entity, nevertheless form portions of a much larger population base. Many such population clusters exist throughout Maine.
There are thirty institutions of higher learning in Maine. These institutions include the University of Maine, which is the oldest, largest, and only research university in the state. UMaine was founded in 1865 and is the state's only land grant and sea grant college. The University of Maine is located in the town of Orono and is the state's flagship university. There are also branch campuses in Augusta, Farmington, Fort Kent, Machias, and Presque Isle.
Bowdoin College is a liberal arts college founded in 1794 in Brunswick, making it the oldest institution of higher learning in the state. Colby College in Waterville was founded in 1813 making it the second oldest college in Maine. Bates College in Lewiston was founded in 1855 making it the third oldest institution in the state and the oldest coeducational college in New England. The three colleges collectively form the Colby-Bates-Bowdoin Consortium and are ranked among the best colleges in the United States; often placing in the top 10% of all liberal arts colleges.
Maine's per-student public expenditure for elementary and secondary schools was 21st in the nation in 2012, at $12,344.
The collegiate system of Maine also includes numerous baccalaureate colleges such as the Maine Maritime Academy (MMA), Unity College, and Thomas College. There is only one medical school in the state (the University of New England's College of Osteopathic Medicine) and only one law school (the University of Maine School of Law).
Private schools in Maine are funded independently of the state and are less common than public schools. A large number of private elementary schools with under 20 students exist, but most private high schools in Maine can be described as "semi-private".
Maine was a center of agriculture before it achieved statehood. Prior to colonization, Wabanaki nations farmed large crops of corn and other produce in southern Maine. The state is a major producer of potatoes, wild blueberries, apples, maple syrup and sweet corn. Dairy products and chicken's eggs are other major industries.
Maine has many vegetable farms and other small, diversified farms. In the 1960s and 1970s, the book "Living the Good Life" by Helen Nearing and Scott Nearing caused many young people to move to Maine and engage in small-scale farming and homesteading. These back-to-the-land migrants increased the population of some counties.
Maine has a smaller number of commodity farms and confined animal feeding operations.
Maine is home to the Maine Organic Farmers and Gardeners Association and had 535 certified organic farms in 2019.
Since the 1980s, the state has gained a reputation for its local food and restaurant meals. Portland was named "Bon Appétit" magazine's Restaurant City of the Year in 2018. That same year, HealthIQ.com named Maine the third most vegan state.
A citizen of Maine is known as a "Mainer", though the term is often reserved for those whose roots in Maine go back at least three generations. The term "Downeaster" may be applied to residents of the northeast coast of the state. The term "Mainiac" is considered by some to be derogatory, but embraced with pride by others, and is used for a variety of organizations and for events such as the YMCA Mainiac Sprint Triathlon & Duathlon.
|
https://en.wikipedia.org/wiki?curid=19977
|
Montana
Montana is a state in the Northwestern United States. Montana has several nicknames, although none are official, including "Big Sky Country" and "The Treasure State", and slogans that include "Land of the Shining Mountains" and, more recently, "The Last Best Place".
Montana is the fourth-largest state by area, the eighth-least populous, and the third-least densely populated of the 50 U.S. states. The western half of Montana contains numerous mountain ranges, and smaller mountain ranges are found throughout the state; in all, 77 named ranges are part of the Rocky Mountains. The eastern half of Montana is characterized by western prairie terrain and badlands. Montana is bordered by Idaho to the west, Wyoming to the south, North Dakota and South Dakota to the east, and the Canadian provinces of British Columbia, Alberta, and Saskatchewan to the north.
The economy is primarily based on agriculture, including ranching and cereal grain farming. Other significant economic resources include oil, gas, coal, hard rock mining, and lumber. The health care, service, and government sectors also are significant to the state's economy.
The state's fastest-growing sector is tourism. Nearly 13 million tourists annually visit Glacier National Park, Yellowstone National Park, Beartooth Highway, Flathead Lake, Big Sky Resort, and other attractions.
The name Montana comes from the Spanish word "montaña," which in turn comes from the Latin word "montanea", meaning "mountain", or more broadly, "mountainous country". "Montaña del Norte" was the name given by early Spanish explorers to the entire mountainous region of the west. The name Montana was added to a bill by the United States House Committee on Territories (chaired at the time by James Ashley of Ohio) for the territory that would become Idaho Territory. The name was changed by representatives Henry Wilson (Massachusetts) and Benjamin F. Harding (Oregon), who complained Montana had "no meaning". When Ashley presented a bill to establish a temporary government in 1864 for a new territory to be carved out of Idaho, he again chose Montana Territory. This time, Rep. Samuel Cox, also of Ohio, objected to the name. Cox complained the name was a misnomer given most of the territory was not mountainous and a Native American name would be more appropriate than a Spanish one. Other names such as Shoshone were suggested, but the Committee on Territories decided they could name it whatever they wanted, so the original name of Montana was adopted.
Montana is one of the eight Mountain States, located in the north of the region known as the Western United States. It borders North Dakota and South Dakota to the east. Wyoming is to the south, Idaho is to the west and southwest, and three Canadian provinces, British Columbia, Alberta, and Saskatchewan, are to the north.
With an area of 147,040 square miles (380,800 km2), Montana is slightly larger than Japan. It is the fourth-largest state in the United States after Alaska, Texas, and California, and the largest landlocked U.S. state.
The state's topography is roughly defined by the Continental Divide, which splits much of the state into distinct eastern and western regions. Most of Montana's hundred or more named mountain ranges are in the state's western half, most of which is geologically and geographically part of the northern Rocky Mountains. The Absaroka and Beartooth ranges in the state's south-central part are technically part of the Central Rocky Mountains. The Rocky Mountain Front is a significant feature in the state's north-central portion, and isolated island ranges that interrupt the prairie landscape are common in the central and eastern parts of the state. About 60 percent of the state is prairie, part of the northern Great Plains.
The Bitterroot Mountains—one of the longest continuous ranges in the Rocky Mountain chain from Alaska to Mexico—along with smaller ranges, including the Coeur d'Alene Mountains and the Cabinet Mountains, divide the state from Idaho. The southern third of the Bitterroot range blends into the Continental Divide. Other major mountain ranges west of the divide include the Cabinet Mountains, the Anaconda Range, the Missions, the Garnet Range, the Sapphire Mountains, and the Flint Creek Range.
The divide's northern section, where the mountains rapidly give way to prairie, is part of the Rocky Mountain Front. The front is most pronounced in the Lewis Range, located primarily in Glacier National Park. Due to the configuration of mountain ranges in Glacier National Park, the Northern Divide (which begins in Alaska's Seward Peninsula) crosses this region and turns east in Montana at Triple Divide Peak. It causes the Waterton, Belly, and Saint Mary rivers to flow north into Alberta, Canada. There they join the Saskatchewan River, which ultimately empties into Hudson Bay.
East of the divide, several roughly parallel ranges cover the state's southern part, including the Gravelly Range, Madison Range, Gallatin Range, Absaroka Mountains, and Beartooth Mountains. The Beartooth Plateau is the largest continuous land mass over 10,000 feet (3,000 m) high in the continental United States. It contains the state's highest point, Granite Peak, 12,799 feet (3,901 m) high. North of these ranges are the Big Belt Mountains, Bridger Mountains, Tobacco Roots, and several island ranges, including the Crazy Mountains and Little Belt Mountains.
Between many mountain ranges are rich river valleys. The Big Hole Valley, Bitterroot Valley, Gallatin Valley, Flathead Valley, and Paradise Valley have extensive agricultural resources and multiple opportunities for tourism and recreation.
East and north of this transition zone are the expansive and sparsely populated Northern Plains, with tableland prairies, smaller island mountain ranges, and badlands. The isolated island ranges east of the Divide include the Bear Paw Mountains, Bull Mountains, Castle Mountains, Crazy Mountains, Highwood Mountains, Judith Mountains, Little Belt Mountains, Little Rocky Mountains, the Pryor Mountains, Little Snowy Mountains, Big Snowy Mountains, Sweet Grass Hills, and—in the state's southeastern corner near Ekalaka—the Long Pines. Many of these isolated eastern ranges were created about 120 to 66 million years ago when magma welling up from the interior cracked and bowed the earth's surface here.
The area east of the divide in the state's north-central portion is known for the Missouri Breaks and other significant rock formations. Buttes south of Great Falls, including Cascade, Crown, Square, and Shaw, are major landmarks. Known as laccoliths, they formed when igneous rock protruded through cracks in the sedimentary rock. The underlying surface consists of sandstone and shale. Surface soils in the area are highly diverse and greatly affected by the local geology, whether glaciated plain, intermountain basin, mountain foothills, or tableland. Foothill regions are often covered in weathered stone or broken slate, or consist of uncovered bare rock (usually igneous, quartzite, sandstone, or shale). The soil of intermountain basins usually consists of clay, gravel, sand, silt, and volcanic ash, much of it laid down by lakes which covered the region during the Oligocene, 33 to 23 million years ago. Tablelands are often topped with argillite gravel and weathered quartzite, occasionally underlain by shale. The glaciated plains are generally covered in clay, gravel, sand, and silt left by the proglacial Lake Great Falls or by moraines or gravel-covered former lake basins left by the Wisconsin glaciation, 85,000 to 11,000 years ago. Farther east, areas such as Makoshika State Park near Glendive and Medicine Rocks State Park near Ekalaka contain some of the most scenic badlands regions in the state.
The Hell Creek Formation in Northeast Montana is a major source of dinosaur fossils. Paleontologist Jack Horner of the Museum of the Rockies in Bozeman brought this formation to the world's attention with several major finds.
Montana has thousands of named rivers and creeks, many of which are known for "blue-ribbon" trout fishing. Montana's water resources provide for recreation, hydropower, crop and forage irrigation, mining, and water for human consumption. Montana is one of the few geographic areas in the world whose rivers form parts of three major watersheds (i.e., where two continental divides intersect): its rivers feed the Pacific Ocean, the Gulf of Mexico, and Hudson Bay. The watersheds divide at Triple Divide Peak in Glacier National Park.
West of the divide, the Clark Fork of the Columbia (not to be confused with the Clarks Fork of the Yellowstone River) rises near Butte and flows northwest to Missoula, where it is joined by the Blackfoot River and Bitterroot River. Farther downstream, it is joined by the Flathead River before entering Idaho near Lake Pend Oreille. The Pend Oreille River forms the outflow of Lake Pend Oreille and joins the Columbia River, which flows to the Pacific Ocean, making the Clark Fork/Pend Oreille (considered a single river system) the longest river in the Rocky Mountains. The Clark Fork discharges the greatest volume of water of any river exiting the state. The Kootenai River in northwest Montana is another major tributary of the Columbia.
East of the divide the Missouri River, which is formed by the confluence of the Jefferson, Madison, and Gallatin Rivers near Three Forks, flows due north through the west-central part of the state to Great Falls. From this point, it then flows generally east through fairly flat agricultural land and the Missouri Breaks to Fort Peck reservoir. The stretch of river between Fort Benton and the Fred Robinson Bridge at the western boundary of Fort Peck Reservoir was designated a National Wild and Scenic River in 1976. The Missouri enters North Dakota near Fort Union, having drained more than half the land area of Montana (). Nearly one-third of the Missouri River in Montana lies behind 10 dams: Toston, Canyon Ferry, Hauser, Holter, Black Eagle, Rainbow, Cochrane, Ryan, Morony, and Fort Peck.
The Yellowstone River rises on the Continental Divide near Younts Peak in Wyoming's Teton Wilderness. It flows north through Yellowstone National Park, enters Montana near Gardiner, and passes through the Paradise Valley to Livingston. It then flows northeasterly across the state through Billings, Miles City, Glendive, and Sidney. The Yellowstone joins the Missouri in North Dakota just east of Fort Union. It is the longest undammed, free-flowing river in the contiguous United States, and drains about a quarter of Montana ().
Other major Montana tributaries of the Missouri include the Smith, Milk, Marias, Judith, and Musselshell Rivers. Montana also claims the disputed title of possessing the world's shortest river, the Roe River, just outside Great Falls. Through the Missouri, these rivers ultimately join the Mississippi River and flow into the Gulf of Mexico.
Major tributaries of the Yellowstone include the Boulder, Stillwater, Clarks Fork, Bighorn, Tongue, and Powder Rivers.
The Northern Divide turns east in Montana at Triple Divide Peak, causing the Waterton, Belly, and Saint Mary Rivers to flow north into Alberta. There they join the Saskatchewan River, which ultimately empties into Hudson Bay.
Montana has some 3,000 named lakes and reservoirs, including Flathead Lake, the largest natural freshwater lake in the western United States. Other major lakes include Whitefish Lake in the Flathead Valley and Lake McDonald and St. Mary Lake in Glacier National Park. The largest reservoir in the state is Fort Peck Reservoir on the Missouri River, which is contained by the second-largest earthen dam and largest hydraulically filled dam in the world. Other major reservoirs include Hungry Horse on the Flathead River; Lake Koocanusa on the Kootenai River; Lake Elwell on the Marias River; Clark Canyon on the Beaverhead River; Yellowtail on the Bighorn River; and Canyon Ferry, Hauser, Holter, Rainbow, and Black Eagle on the Missouri River.
Vegetation of the state includes lodgepole pine, ponderosa pine, Douglas fir, larch, spruce, aspen, birch, red cedar, hemlock, ash, alder, Rocky Mountain maple, and cottonwood trees. Forests cover about 25% of the state. Flowers native to Montana include asters, bitterroots, daisies, lupins, poppies, primroses, columbine, lilies, orchids, and dryads. Several species of sagebrush and cactus and many species of grasses are common. Many species of mushrooms and lichens are also found in the state.
Montana is home to diverse fauna, including 14 amphibian, 90 fish, 117 mammal, 20 reptile, and 427 bird species. Additionally, more than 10,000 invertebrate species are present, including 180 mollusks and 30 crustaceans. Montana has the largest grizzly bear population in the lower 48 states. Montana hosts five federally endangered species (black-footed ferret, whooping crane, least tern, pallid sturgeon, and white sturgeon) and seven threatened species, including the grizzly bear, Canada lynx, and bull trout. Since reintroduction, the gray wolf population has stabilized at about 900 animals, and they have been delisted as endangered. The Montana Department of Fish, Wildlife and Parks manages fishing and hunting seasons for at least 17 species of game fish, including seven species of trout, walleye, and smallmouth bass, and at least 29 species of game birds and animals, including ring-necked pheasant, grey partridge, elk, pronghorn antelope, mule deer, white-tailed deer, gray wolf, and bighorn sheep.
Montana contains Glacier National Park, "The Crown of the Continent", and parts of Yellowstone National Park, including three of the park's five entrances. Other federally recognized sites include the Little Bighorn Battlefield National Monument, Bighorn Canyon National Recreation Area, Big Hole National Battlefield, and the National Bison Range.
Federal and state agencies administer approximately 35 percent of Montana's land. The U.S. Department of Agriculture Forest Service administers forest land in ten National Forests. There are 12 separate wilderness areas that are part of the National Wilderness Preservation System established by the Wilderness Act of 1964. The U.S. Department of the Interior Bureau of Land Management controls additional federal land. The U.S. Department of the Interior Fish and Wildlife Service administers 1.1 million acres of National Wildlife Refuges and waterfowl production areas in Montana. The U.S. Department of the Interior Bureau of Reclamation administers land and water surface in the state. The Montana Department of Fish, Wildlife and Parks operates state parks and access points on the state's rivers and lakes. The Montana Department of Natural Resources and Conservation manages School Trust Land ceded by the federal government under the Land Ordinance of 1785 to the state in 1889, when Montana was granted statehood. These lands are managed by the state for the benefit of public schools and institutions in the state.
Areas managed by the National Park Service include:
Montana is a large state with considerable variation in geography, topography, and altitude, and the climate is, therefore, equally varied. The state spans from below the 45th parallel (the line equidistant between the equator and North Pole) to the 49th parallel, and elevations range from under 2,000 feet (610 m) to nearly 13,000 feet (3,960 m) above sea level. The western half is mountainous, interrupted by numerous large valleys. Eastern Montana comprises plains and badlands, broken by hills and isolated mountain ranges, and has a semiarid, continental climate (Köppen climate classification "BSk"). The Continental Divide has a considerable effect on the climate, as it restricts the flow of warmer Pacific air from moving east and of drier continental air from moving west. The area west of the divide has a modified northern Pacific Coast climate, with milder winters, cooler summers, less wind, and a longer growing season. Low clouds and fog often form in the valleys west of the divide in winter, but this is rarely seen in the east.
Average daytime temperatures vary considerably between January and July, and the variation in geography leads to great variation in temperature. The highest observed summer temperature was 117 °F (47 °C), at Glendive on July 20, 1893, and Medicine Lake on July 5, 1937. Throughout the state, summer nights are generally cool and pleasant. Extreme hot weather is less common at higher elevations. Snowfall has been recorded in all months of the year in the more mountainous areas of central and western Montana, though it is rare in July and August.
The coldest temperature on record for Montana is also the coldest temperature for the contiguous United States. On January 20, 1954, −70 °F (−57 °C) was recorded at a gold mining camp near Rogers Pass. Temperatures vary greatly on such cold nights; Helena, to the southeast, recorded a far milder low on the same date. Winter cold spells are usually the result of cold continental air coming south from Canada. The front is often well defined, causing a large temperature drop in a 24-hour period. Conversely, air flow from the southwest results in "chinooks". These steady winds can suddenly warm parts of Montana, especially areas just to the east of the mountains, where temperatures sometimes stay unseasonably warm for 10 days or longer.
Loma is the site of the most extreme recorded temperature change in a 24-hour period in the United States. On January 15, 1972, a chinook wind blew in and the temperature rose from −54 °F (−48 °C) to 49 °F (9 °C), a swing of 103 °F (57 °C).
Average annual precipitation is about 15 inches (380 mm), but great variations are seen. The mountain ranges block the moist Pacific air, holding moisture in the western valleys and creating rain shadows to the east. Heron, in the west, receives the most precipitation of any station in the state. On the eastern (leeward) side of a mountain range, the valleys are much drier; Lonepine and Deer Lodge average far less precipitation. The mountains can receive far more; the Grinnell Glacier in Glacier National Park is among the wettest spots in the state, while an area southwest of Belfry recorded the lowest average in the state over a 16-year period. Most of the larger cities receive moderate snowfall each year, and mountain ranges can accumulate many times that amount during a winter. Heavy snowstorms may occur from September through May, though most snow falls from November to March.
The climate has become warmer in Montana and continues to do so. The glaciers in Glacier National Park have receded and are predicted to melt away completely in a few decades. Many Montana cities set heat records during July 2007, the hottest month ever recorded in Montana. Winters are warmer, too, and have fewer cold spells. Previously, these cold spells had killed off bark beetles, but the beetles are now attacking the forests of western Montana. The combination of warmer weather, beetle attack, and mismanagement has led to a substantial increase in the severity of forest fires in Montana. According to a study done for the U.S. Environmental Protection Agency by the Harvard School of Engineering and Applied Sciences, parts of Montana will experience a 200% increase in area burned by wildfires and an 80% increase in related air pollution.
Average temperatures for the warmest and coldest months vary across Montana's seven largest cities. The coldest month varies between December and January depending on location, although figures are similar throughout.
Montana is one of only two contiguous states (along with Colorado) that are antipodal to land. The Kerguelen Islands are antipodal to the Montana–Saskatchewan–Alberta border. No towns are precisely antipodal to Kerguelen, though Chester and Rudyard are close.
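The antipode of a point is obtained by negating its latitude and shifting its longitude by 180 degrees. The following minimal Python sketch (an illustration added here, not part of the original article) shows the computation; the Kerguelen coordinates used are approximate values assumed for the example:

```python
# Illustrative sketch: computing an antipodal point.
# The antipode of (lat, lon) is (-lat, lon shifted by 180 degrees).

def antipode(lat: float, lon: float) -> tuple[float, float]:
    """Return the point diametrically opposite (lat, lon) on the globe."""
    anti_lat = -lat
    anti_lon = lon - 180.0 if lon > 0 else lon + 180.0
    return anti_lat, anti_lon

# Approximate center of the Kerguelen Islands (assumed value): 49.35 S, 70.22 E
print(antipode(-49.35, 70.22))  # -> (49.35, -109.78), near the Montana border
```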
Various indigenous peoples lived in the territory of the present-day state of Montana for thousands of years. Historic tribes encountered by Europeans and settlers from the United States included the Crow in the south-central area, the Cheyenne in the very southeast, the Blackfeet, Assiniboine, and Gros Ventres in the central and north-central area, and the Kootenai and Salish in the west. The smaller Pend d'Oreille and Kalispel tribes lived near Flathead Lake and the western mountains, respectively. A part of southeastern Montana was used as a corridor between the Crows and the related Hidatsas in North Dakota.
The land in Montana east of the continental divide was part of the Louisiana Purchase in 1803. Subsequent to and particularly in the decades following the Lewis and Clark Expedition, American, British, and French traders operated a fur trade, typically working with indigenous peoples, in both eastern and western portions of what would become Montana. These dealings were not always peaceful, and though the fur trade brought some material gain for indigenous tribal groups, it also brought exposure to European diseases and altered their economic and cultural traditions. The trading post Fort Raymond (1807–1811) was constructed in Crow Indian country. Until the Oregon Treaty (1846), land west of the continental divide was disputed between the British and U.S. and was known as the Oregon Country. The first permanent settlement by Euro-Americans in what today is Montana was St. Mary's (1841) near present-day Stevensville. In 1847, Fort Benton was established as the uppermost fur-trading post on the Missouri River. In the 1850s, settlers began moving into the Beaverhead and Big Hole valleys from the Oregon Trail and into the Clark's Fork valley.
The first gold discovered in Montana was at Gold Creek near present-day Garrison in 1852. A series of major mining discoveries in the western third of the state starting in 1862 found gold, silver, copper, lead, and coal (and later oil) which attracted tens of thousands of miners to the area. The richest of all gold placer diggings was discovered at Alder Gulch, where the town of Virginia City was established. Other rich placer deposits were found at Last Chance Gulch, where the city of Helena now stands, Confederate Gulch, Silver Bow, Emigrant Gulch, and Cooke City. Gold output from 1862 through 1876 reached $144 million; silver then became even more important. The largest mining operations were in the city of Butte, which had important silver deposits and gigantic copper deposits.
Before the creation of Montana Territory (1864–1889), areas within present-day Montana were part of the Oregon Territory (1848–1859), Washington Territory (1853–1863), Idaho Territory (1863–1864), and Dakota Territory (1861–1864). Montana became a United States territory (Montana Territory) on May 26, 1864. The first territorial capital was at Bannack. The first territorial governor was Sidney Edgerton. The capital moved to Virginia City in 1865 and to Helena in 1875. In 1870, the non-Indian population of Montana Territory was 20,595. The Montana Historical Society, founded on February 2, 1865, in Virginia City, is the oldest such institution west of the Mississippi (excluding Louisiana). In 1869 and 1870 respectively, the Cook–Folsom–Peterson and the Washburn–Langford–Doane Expeditions were launched from Helena into the Upper Yellowstone region and directly led to the creation of Yellowstone National Park in 1872.
As settlers began populating Montana from the 1850s through the 1870s, disputes with Native Americans ensued, primarily over land ownership and control. In 1855, Washington Territorial Governor Isaac Stevens negotiated the Hellgate treaty between the United States government and the Salish, Pend d'Oreille, and Kootenai people of western Montana, which established boundaries for the tribal nations. The treaty was ratified in 1859. While the treaty established what later became the Flathead Indian Reservation, trouble with interpreters and confusion over the terms of the treaty led Whites to believe the Bitterroot Valley was opened to settlement, but the tribal nations disputed those provisions. The Salish remained in the Bitterroot Valley until 1891.
The first U.S. Army post established in Montana was Camp Cooke in 1866, on the Missouri River, to protect steamboat traffic going to Fort Benton. More than a dozen additional military outposts were established in the state. Pressure over land ownership and control increased due to discoveries of gold in various parts of Montana and surrounding states. Major battles occurred in Montana during Red Cloud's War, the Great Sioux War of 1876, and the Nez Perce War and in conflicts with Piegan Blackfeet. The most notable were the Marias Massacre (1870), Battle of the Little Bighorn (1876), Battle of the Big Hole (1877), and Battle of Bear Paw (1877). The last recorded conflict in Montana between the U.S. Army and Native Americans occurred in 1887 during the Battle of Crow Agency in the Big Horn country. Indian survivors who had signed treaties were generally required to move onto reservations.
Simultaneously with these conflicts, bison, a keystone species and the primary protein source that Native people had survived on for centuries, were being destroyed. Some estimates say more than 13 million bison were in Montana in 1870. In 1875, General Philip Sheridan pleaded to a joint session of Congress to authorize the slaughtering of herds to deprive the Indians of their source of food. By 1884, commercial hunting had brought bison to the verge of extinction; only about 325 bison remained in the entire United States.
Cattle ranching has been central to Montana's history and economy since Johnny Grant began wintering cattle in the Deer Lodge Valley in the 1850s and traded cattle fattened in fertile Montana valleys with emigrants on the Oregon Trail. Nelson Story brought the first Texas Longhorn cattle into the territory in 1866. Granville Stuart, Samuel Hauser, and Andrew J. Davis started a major open-range cattle operation in Fergus County in 1879. The Grant-Kohrs Ranch National Historic Site in Deer Lodge is maintained today as a link to the ranching style of the late 19th century. Operated by the National Park Service, it is a working ranch.
Tracks of the Northern Pacific Railroad (NPR) reached Montana from the west in 1881 and from the east in 1882. However, the railroad played a major role in sparking tensions with Native American tribes in the 1870s. Jay Cooke, the NPR president, launched major surveys into the Yellowstone valley in 1871, 1872, and 1873, which were challenged forcefully by the Sioux under chief Sitting Bull. These clashes, in part, contributed to the Panic of 1873, a financial crisis that delayed construction of the railroad into Montana. Surveys in 1874, 1875, and 1876 helped spark the Great Sioux War of 1876. The transcontinental NPR was completed on September 8, 1883, at Gold Creek.
Tracks of the Great Northern Railroad (GNR) reached eastern Montana in 1887, and when they reached the northern Rocky Mountains in 1890, the GNR became a significant promoter of tourism to the Glacier National Park region. The transcontinental GNR was completed on January 6, 1893, at Scenic, Washington.
In 1881, the Utah and Northern Railway, a branch line of the Union Pacific, completed a narrow-gauge line from northern Utah to Butte. A number of smaller spur lines operated in Montana from 1881 into the 20th century, including the Oregon Short Line, Montana Railroad, and Milwaukee Road.
Under Territorial Governor Thomas Meagher, Montanans held a constitutional convention in 1866 in a failed bid for statehood. A second constitutional convention held in Helena in 1884 produced a constitution ratified 3:1 by Montana citizens in November 1884. For political reasons, Congress did not approve Montana statehood until February 1889 and President Grover Cleveland signed an omnibus bill granting statehood to Montana, North Dakota, South Dakota, and Washington once the appropriate state constitutions were crafted. In July 1889, Montanans convened their third constitutional convention and produced a constitution accepted by the people and the federal government. On November 8, 1889, President Benjamin Harrison proclaimed Montana the union's 41st state. The first state governor was Joseph K. Toole. In the 1880s, Helena (the state capital) had more millionaires per capita than any other United States city.
The Homestead Act of 1862 provided free land to settlers who could claim and "prove up" 160 acres (65 ha) of federal land in the Midwest and western United States. Montana did not see a large influx of immigrants from this act, because 160 acres were usually insufficient to support a family in the arid territory. The first homestead claim under the act in Montana was made by David Carpenter near Helena in 1868. The first claim by a woman was made near Warm Springs Creek by Gwenllian Evans, the daughter of Deer Lodge, Montana, pioneer Morgan Evans. By 1880, farms were in the more verdant valleys of central and western Montana, but few were on the eastern plains.
The Desert Land Act of 1877 was passed to allow settlement of arid lands in the west; it allotted 640 acres (260 ha) to settlers for a fee of $0.25 per acre and a promise to irrigate the land. After three years, a fee of one dollar per acre would be paid and the land would be owned by the settler. This act brought mostly cattle and sheep ranchers into Montana, many of whom grazed their herds on the Montana prairie for three years, did little to irrigate the land, and then abandoned it without paying the final fees. Some farmers came with the arrival of the Great Northern and Northern Pacific Railroads throughout the 1880s and 1890s, though in relatively small numbers.
In the early 1900s, James J. Hill of the Great Northern began to promote settlement in the Montana prairie to fill his trains with settlers and goods. Other railroads followed suit. In 1902, the Reclamation Act was passed, allowing irrigation projects to be built in Montana's eastern river valleys. In 1909, Congress passed the Enlarged Homestead Act, which expanded the amount of free land from 160 to 320 acres (65 to 130 ha) per family, and in 1912 reduced the time to "prove up" on a claim to three years. In 1916, the Stock-Raising Homestead Act allowed homesteads of 640 acres in areas unsuitable for irrigation. This combination of advertising and changes in the Homestead Act drew tens of thousands of homesteaders, lured by free land, with World War I bringing particularly high wheat prices. In addition, Montana was going through a temporary period of higher-than-average precipitation. Homesteaders arriving in this period were known as "Honyockers", or "scissorbills". Though the word "honyocker", possibly derived from the ethnic slur "hunyak", was applied derisively to homesteaders as "greenhorns", "new at his business", or "unprepared", many of these new settlers had farming experience, though many others did not.
However, farmers faced a number of problems. Massive debt was one. Also, most settlers were from wetter regions and were unprepared for the dry climate, lack of trees, and scarce water resources. In addition, the small homesteads were unsuited to the environment; weather and agricultural conditions are much harsher and drier west of the 100th meridian. Then, the droughts of 1917–1921 proved devastating. Many people left, and half the banks in the state went bankrupt as a result of providing mortgages that could not be repaid. As a result, farm sizes increased while the number of farms decreased.
By 1910, homesteaders filed claims on over five million acres, and by 1923, over 93 million acres were farmed. In 1910, the Great Falls land office alone had more than a thousand homestead filings per month, and at the peak of 1917–1918 it had 14,000 new homesteads each year. Significant drops occurred following the drought in 1919.
As World War I broke out, Jeannette Rankin, the first woman in the United States to be a member of Congress, voted against the United States' declaration of war. Her actions were widely criticized in Montana, where support for the war and patriotism were strong. In 1917–18, due to a miscalculation of Montana's population, about 40,000 Montanans, 10% of the state's population, volunteered or were drafted into the armed forces. This represented a manpower contribution to the war that was 25% higher than any other state's on a per capita basis. Around 1,500 Montanans died as a result of the war and 2,437 were wounded, also higher than any other state on a per capita basis. Montana's Remount station in Miles City provided 10,000 cavalry horses for the war, more than any other Army post in the country. The war created a boom for Montana mining, lumber, and farming interests, as demand for war materials and food increased.
In June 1917, the U.S. Congress passed the Espionage Act of 1917, which was extended by the Sedition Act of 1918. In February 1918, the Montana legislature had passed the Montana Sedition Act, which was a model for the federal version. In combination, these laws criminalized criticism of the U.S. government, military, or symbols through speech or other means. The Montana Act led to the arrest of more than 200 individuals and the conviction of 78, mostly of German or Austrian descent. More than 40 spent time in prison. In May 2006, then-Governor Brian Schweitzer posthumously issued full pardons for all those convicted of violating the Montana Sedition Act.
The Montanans who opposed U.S. entry into the war included immigrant groups of German and Irish heritage, as well as pacifist Anabaptist people such as the Hutterites and Mennonites, many of whom were also of Germanic heritage. In turn, pro-war groups formed, such as the Montana Council of Defense, created by Governor Samuel V. Stewart, and local "loyalty committees".
War sentiment was complicated by labor issues. The Anaconda Copper Company, then at its historic peak of copper production, was an extremely powerful force in Montana, but it also faced criticism and opposition from socialist newspapers and unions struggling to make gains for their members. In Butte, a multiethnic community with a significant European immigrant population, labor unions, particularly the newly formed Metal Mine Workers' Union, opposed the war on the grounds that it mostly profited large lumber and mining interests. In the wake of ramped-up mine production and the Speculator Mine disaster in June 1917, Industrial Workers of the World organizer Frank Little arrived in Butte to organize miners. He gave some speeches with inflammatory antiwar rhetoric. On August 1, 1917, he was dragged from his boarding house by masked vigilantes and hanged from a railroad trestle, in what was widely considered a lynching. Little's murder and the strikes that followed resulted in the National Guard being sent to Butte to restore order. Overall, anti-German and antilabor sentiment increased and created a movement that led to the passage of the Montana Sedition Act the following February. In addition, the Council of Defense was made a state agency with the power to prosecute and punish individuals deemed in violation of the Act. The Council also passed rules limiting public gatherings and prohibiting the speaking of German in public.
In the wake of the legislative action in 1918, emotions rose. U.S. Attorney Burton K. Wheeler and several district court judges who hesitated to prosecute or convict people brought up on charges were strongly criticized. Wheeler was brought before the Council of Defense, though he avoided formal proceedings, and a district court judge from Forsyth was impeached. Burnings of German-language books and several near-hangings occurred. The prohibition on speaking German remained in effect into the early 1920s. Complicating the wartime struggles, the 1918 influenza epidemic claimed the lives of more than 5,000 Montanans. The suppression of civil liberties that occurred led some historians to dub this period "Montana's Agony".
An economic depression began in Montana after World War I and lasted through the Great Depression until the beginning of World War II. This caused great hardship for farmers, ranchers, and miners. The wheat farms in eastern Montana make the state a major producer; the wheat has a relatively high protein content and thus commands premium prices.
By the time the U.S. entered World War II on December 8, 1941, many Montanans had enlisted in the military to escape the poor national economy of the previous decade. Another 40,000-plus Montanans entered the armed forces in the first year following the declaration of war, and more than 57,000 joined up before the war ended. These numbers constituted about ten percent of the state's population, and Montana again contributed one of the highest numbers of soldiers per capita of any state. Many Native Americans were among those who served, including soldiers from the Crow Nation who became Code Talkers. At least 1,500 Montanans died in the war. Montana was also the training ground for the First Special Service Force, or "Devil's Brigade", a joint U.S.–Canadian commando-style force that trained at Fort William Henry Harrison for experience in mountainous and winter conditions before deployment. Air bases were built in Great Falls, Lewistown, Cut Bank, and Glasgow, some of which were used as staging areas to prepare planes to be sent to allied forces in the Soviet Union. During the war, about 30 Japanese Fu-Go balloon bombs were documented to have landed in Montana, though no casualties or major forest fires were attributed to them.
In 1940, Jeannette Rankin was again elected to Congress. In 1941, as she had in 1917, she voted against the United States' declaration of war after the Japanese attack on Pearl Harbor. Hers was the only vote against the war, and in the wake of public outcry over her vote, Rankin required police protection for a time. Other pacifists tended to be those from "peace churches" who generally opposed war. Many individuals claiming conscientious objector status from throughout the U.S. were sent to Montana during the war as smokejumpers and for other forest fire-fighting duties.
During World War II, the planned battleship USS "Montana" was named in honor of the state, but it was never completed. Montana is thus the only one of the first 48 states never to have had a completed battleship named for it. Alaska and Hawaii have both had nuclear submarines named after them, leaving Montana the only state in the union without a modern naval ship named in its honor. However, in August 2007, Senator Jon Tester asked that a submarine be christened USS "Montana". Secretary of the Navy Ray Mabus announced on September 3, 2015, that the "Virginia"-class attack submarine SSN-794 would become the second commissioned warship to bear the name.
In the post-World War II Cold War era, Montana became host to U.S. Air Force Military Air Transport Service (1947) for airlift training in C-54 Skymasters, and eventually, in 1953, Strategic Air Command air and missile forces were based at Malmstrom Air Force Base in Great Falls. The base also hosted the 29th Fighter Interceptor Squadron, Air Defense Command, from 1953 to 1968. In December 1959, Malmstrom AFB was selected as the home of the new Minuteman I intercontinental ballistic missile. The first operational missiles were in place and ready in early 1962. In late 1962, missiles assigned to the 341st Strategic Missile Wing played a major role in the Cuban Missile Crisis. When the Soviets removed their missiles from Cuba, President John F. Kennedy said the Soviets backed down because they knew he had an "ace in the hole", referring directly to the Minuteman missiles in Montana. Montana eventually became home to the largest ICBM field in the U.S.
The United States Census Bureau estimated the population of Montana was 1,068,778 on July 1, 2019, an 8.02% increase since the 2010 United States Census. The 2010 Census put Montana's population at 989,415. During the first decade of the new century, growth was mainly concentrated in Montana's seven largest counties, with the highest percentage growth in Gallatin County, which saw a 32% increase in its population from 2000 to 2010. The city with the largest percentage growth was Kalispell, at 40.1%, and the city with the largest increase in actual residents was Billings, whose population grew by 14,323 from 2000 to 2010.
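As a quick arithmetic check of the growth figure above, the percentage increase is the change between the two counts divided by the 2010 count:

$$\frac{1{,}068{,}778 - 989{,}415}{989{,}415} = \frac{79{,}363}{989{,}415} \approx 0.0802 = 8.02\%$$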
On January 3, 2012, the Census and Economic Information Center (CEIC) at the Montana Department of Commerce estimated Montana had hit the one million population mark sometime between November and December 2011.
According to the 2010 Census, 89.4% of the population was White (87.8% non-Hispanic White), 6.3% American Indian and Alaska Native, 2.9% Hispanics and Latinos of any race, 0.6% Asian, 0.4% Black or African American, 0.1% Native Hawaiian and other Pacific Islander, 0.6% from some other race, and 2.5% from two or more races. The largest European ancestry groups in Montana as of 2010 are: German (27.0%), Irish (14.8%), English (12.6%), Norwegian (10.9%), French (4.7%), and Italian (3.4%).
Montana has a larger Native American population, both numerically and as a percentage, than most U.S. states. Though ranked only 45th in total population (by the 2010 Census), Montana is 19th in the number of Native American residents, who make up 6.5% of the state's population—the sixth-highest percentage of all fifty states. Montana has three counties in which Native Americans are a majority: Big Horn, Glacier, and Roosevelt. Other counties with large Native American populations include Blaine, Cascade, Hill, Missoula, and Yellowstone Counties. The state's Native American population grew by 27.9% between 1980 and 1990 (at a time when Montana's entire population rose 1.6%), and by 18.5 percent between 2000 and 2010.
As of 2009, almost two-thirds of Native Americans in the state live in urban areas. Of Montana's 20 largest cities, Polson (15.7%), Havre (13.0%), Great Falls (5.0%), Billings (4.4%), and Anaconda (3.1%) had the greatest percentages of Native American residents in 2010. Billings (4,619), Great Falls (2,942), Missoula (1,838), Havre (1,210), and Polson (706) have the most Native Americans living there. The state's seven reservations include more than 12 distinct Native American ethnolinguistic groups.
While the largest European-American population in Montana overall is German, pockets of significant Scandinavian ancestry are prevalent in some of the farming-dominated northern and eastern prairie regions, parallel to nearby regions of North Dakota and Minnesota. Farmers of Irish, Scots, and English roots also settled in Montana. The historically mining-oriented communities of western Montana such as Butte have a wider range of European-American ethnicity; Finns, Eastern Europeans and especially Irish settlers left an indelible mark on the area, as well as people originally from British mining regions such as Cornwall, Devon, and Wales. The nearby city of Helena, also founded as a mining camp, had a similar mix in addition to a small Chinatown. Many of Montana's historic logging communities originally attracted people of Scottish, Scandinavian, Slavic, English, and Scots-Irish descent.
The Hutterites, an Anabaptist sect originally from Switzerland, settled here, and today Montana is second only to South Dakota in U.S. Hutterite population, with several colonies spread across the state. Beginning in the mid-1990s, the state also had an influx of Amish, who moved to Montana from the increasingly urbanized areas of Ohio and Pennsylvania.
Montana's Hispanic population is concentrated in the Billings area in south-central Montana, where many of Montana's Mexican-Americans have been in the state for generations. Great Falls has the highest percentage of African-Americans in its population, although Billings has more African-American residents than Great Falls.
The Chinese in Montana, while a low percentage of the population today, have been an important presence. About 2,000–3,000 Chinese miners were in the mining areas of Montana by 1870, and 2,500 in 1890. However, public opinion grew increasingly negative toward them in the 1890s, and nearly half of the state's Asian population had left by 1900. Today, the Missoula area has a large Hmong population, and the nearly 3,000 Montanans who claim Filipino ancestry are the largest Asian-American group in the state.
English is the official language in the state of Montana, as it is in many U.S. states. According to the 2000 U.S. Census, 94.8% of the population aged five and older speak English at home. Spanish is the language most commonly spoken at home other than English; about 13,040 Spanish-language speakers were in the state (1.4% of the population) in 2011. Also, 15,438 (1.7% of the state population) were speakers of Indo-European languages other than English or Spanish, 10,154 (1.1%) were speakers of a Native American language, and 4,052 (0.4%) were speakers of an Asian or Pacific Islander language. Other languages spoken in Montana (as of 2013) include Assiniboine (about 150 speakers in Montana and Canada), Blackfoot (about 100 speakers), Cheyenne (about 1,700 speakers), Plains Cree (about 100 speakers), Crow (about 3,000 speakers), Dakota (about 18,800 speakers in Minnesota, Montana, Nebraska, North Dakota, and South Dakota), German Hutterite (about 5,600 speakers), Gros Ventre (about 10 speakers), Kalispel-Pend d'Oreille (about 64 speakers), Kutenai (about six speakers), and Lakota (about 6,000 speakers in Minnesota, Montana, Nebraska, North Dakota, and South Dakota). The United States Department of Education estimated in 2009 that 5,274 students in Montana spoke a language at home other than English. These included a Native American language (64%), German (4%), Spanish (3%), Russian (1%), and Chinese (less than 0.5%).
According to the Pew Forum, the religious affiliations of the people of Montana are: Protestant 47%, Catholic 23%, LDS (Mormon) 5%, Jehovah's Witness 2%, Buddhist 1%, Jewish 0.5%, Muslim 0.5%, Hindu 0.5% and nonreligious at 20%.
The largest denominations in Montana as of 2010 were the Catholic Church with 127,612 adherents, the Church of Jesus Christ of Latter-day Saints with 46,484 adherents, Evangelical Lutheran Church in America with 38,665 adherents, and nondenominational Evangelical Protestant with 27,370 adherents.
About 66,000 people of Native American heritage live in Montana. Stemming from multiple treaties and federal legislation, including the Indian Appropriations Act (1851), the Dawes Act (1887), and the Indian Reorganization Act (1934), seven Indian reservations, encompassing 11 federally recognized tribal nations, were created in Montana. A 12th nation, the Little Shell Chippewa is a "landless" people headquartered in Great Falls; it is recognized by the state of Montana, but not by the U.S. government. The Blackfeet nation is headquartered on the Blackfeet Indian Reservation (1851) in Browning, Crow on the Crow Indian Reservation (1868) in Crow Agency, Confederated Salish and Kootenai and Pend d'Oreille on the Flathead Indian Reservation (1855) in Pablo, Northern Cheyenne on the Northern Cheyenne Indian Reservation (1884) at Lame Deer, Assiniboine and Gros Ventre on the Fort Belknap Indian Reservation (1888) in Fort Belknap Agency, Assiniboine and Sioux on the Fort Peck Indian Reservation (1888) at Poplar, and Chippewa-Cree on the Rocky Boy's Indian Reservation (1916) near Box Elder. Approximately 63% of all Native people live off the reservations, concentrated in the larger Montana cities, with the largest concentration of urban Indians in Great Falls. The state also has a small Métis population, and 1990 census data indicated that people from as many as 275 different tribes lived in Montana.
Montana's Constitution specifically reads, "the state recognizes the distinct and unique cultural heritage of the American Indians and is committed in its educational goals to the preservation of their cultural integrity." It is the only state in the U.S. with such a constitutional mandate. The Indian Education for All Act was passed in 1999 to provide funding for this mandate and ensure implementation. It mandates that all schools teach American Indian history, culture, and heritage from preschool through college. For kindergarten through 12th-grade students, an "Indian Education for All" curriculum from the Montana Office of Public Instruction is available free to all schools. The state was sued in 2004 because of lack of funding, and the state has increased its support of the program. South Dakota passed similar legislation in 2007, and Wisconsin was working to strengthen its own program based on this model—and the current practices of Montana's schools. Each Indian reservation in the state has a fully accredited tribal college. The University of Montana "was the first to establish dual admission agreements with all of the tribal colleges and as such it was the first institution in the nation to actively facilitate student transfer from the tribal colleges."
"Note: Births in table do not add up, because Hispanics are counted both by their ethnicity and by their race, giving a higher overall number."
The Bureau of Economic Analysis estimates Montana's state product in 2014 was $44.3 billion. Per capita personal income in 2014 was $40,601, 35th in the nation.
Montana is a relative hub of beer microbrewing, ranking third in the nation in number of craft breweries per capita in 2011. Significant industries exist for lumber and mineral extraction; the state's resources include gold, coal, silver, talc, and vermiculite. Ecotaxes on resource extraction are numerous. A 1974 state severance tax on coal (which varied from 20 to 30%) was upheld by the Supreme Court of the United States in "Commonwealth Edison Co. v. Montana", 453 U.S. 609 (1981).
Tourism is also important to the economy, with more than ten million visitors a year to Glacier National Park, Flathead Lake, the Missouri River headwaters, the site of the Battle of Little Bighorn, and three of the five entrances to Yellowstone National Park.
Montana's personal income tax contains seven brackets, with rates ranging from 1.0 to 6.9 percent. Montana has no sales tax*, and household goods are exempt from property taxes. However, property taxes are assessed on livestock, farm machinery, heavy equipment, automobiles, trucks, and business equipment. The amount of property tax owed is not determined solely by the property's value. The property's value is multiplied by a tax rate, set by the Montana Legislature, to determine its taxable value. The taxable value is then multiplied by the mill levy established by various taxing jurisdictions—city and county government, school districts, and others.
*In the 1980s, the absence of a sales tax became economically deleterious to communities bound to the state's tourism industry, as the revenue from income and property taxes paid by residents fell far short of covering the impact of non-resident travel—especially road repair. In 1985, the Montana Legislature passed a law allowing towns with fewer than 5,500 residents and unincorporated communities with fewer than 2,500 to levy a resort tax if more than half the community's income came from tourism. The resort tax is a sales tax that applies to hotels, motels, and other lodging and camping facilities; restaurants, fast-food stores, and other food service establishments; taverns, bars, night clubs, lounges, and other public establishments that serve alcohol; as well as destination ski resorts and other destination recreational facilities. It also applies to "luxuries", defined by law as any item normally sold to the public or to transient visitors or tourists, excluding unprepared or unserved food, medicine, medical supplies and services, appliances, hardware supplies and tools, and other necessities of life. Approximately 12.2 million non-residents visited Montana in 2018, while the resident population was estimated at 1.06 million. This highly disproportionate ratio of resident taxpayers to non-resident users of state-funded services and infrastructure makes Montana's resort tax crucial to maintaining heavily used roads and highways and to protecting and preserving state parks.
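A minimal Python sketch of the two-step property tax computation described above; the market value, tax rate, and mill levy used here are hypothetical example figures, not actual Montana values:

```python
# Hypothetical illustration of the two-step property tax computation:
# taxable value = market value x legislative tax rate;
# tax owed = taxable value x mill levy (1 mill = $1 per $1,000 of taxable value).

def property_tax(market_value: float, tax_rate: float, mill_levy: float) -> float:
    """Return the property tax owed under the two-step calculation."""
    taxable_value = market_value * tax_rate
    return taxable_value * (mill_levy / 1000.0)

# Example (made-up figures): a $250,000 home, a 1.35% tax rate,
# and a combined 650-mill levy across all taxing jurisdictions.
print(round(property_tax(250_000, 0.0135, 650), 2))  # -> 2193.75
```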
As of January 2019, the state's unemployment rate is 3.8%.
The Montana Territory was formed on April 26, 1864, when the U.S. passed the Organic Act. Schools started forming in the area before it was officially a territory, as families settled there. The first schools were subscription schools that typically met in the teacher's home. The first formal school on record was at Fort Owen in the Bitterroot valley in 1862. The students were Indian children and the children of Fort Owen employees. The first school term started in early winter and lasted only until February 28. Classes were taught by Mr. Robinson. Another early subscription school was started by Thomas Dimsdale in Virginia City in 1863. In this school, students were charged $1.75 per week. The Montana Territorial Legislative Assembly had its inaugural meeting in 1864. The first legislature authorized counties to levy taxes for schools, which set the foundations for public schooling. Madison County was the first to take advantage of the newly authorized taxes, and it formed the first public school in Virginia City in 1866. The first school year was scheduled to begin in January 1866, but severe weather postponed its opening until March. The first school year ran through the summer and did not end until August 17. One of the first teachers at the school was Sarah Raymond, a 25-year-old woman who had traveled to Virginia City via wagon train in 1865. To become a certified teacher, Raymond took a test in her home and paid a $6 fee in gold dust to obtain a teaching certificate. With the help of an assistant teacher, Mrs. Farley, Raymond was responsible for teaching 50 to 60 students each day out of the 81 students enrolled at the school. Sarah Raymond was paid $125 per month, and Mrs. Farley was paid $75 per month. No textbooks were used in the school; in their place was an assortment of books brought by various emigrants. Sarah quit teaching the following year, but she later became the Madison County superintendent of schools.
Many well-known artists, photographers and authors have documented the land, culture and people of Montana in the last 130 years. Painter and sculptor Charles Marion Russell, known as "the cowboy artist", created more than 2,000 paintings of cowboys, Native Americans, and landscapes set in the Western United States and in Alberta, Canada. The C. M. Russell Museum Complex in Great Falls, Montana, houses more than 2,000 Russell artworks, personal objects, and artifacts.
Pioneering feminist author, film-maker, and media personality Mary MacLane attained international fame in 1902 with her memoir of three months in her life in Butte, "The Story of Mary MacLane". She referred to Butte throughout the rest of her career and remains a controversial figure there for her mixture of criticism and love for Butte and its people.
Evelyn Cameron, a naturalist and photographer from Terry, documented early 20th-century life on the Montana prairie, taking startlingly clear pictures of everything around her: cowboys, sheepherders, weddings, river crossings, freight wagons, people working, badlands, eagles, coyotes, and wolves.
Many notable Montana authors have documented or been inspired by life in Montana in both fiction and non-fiction works. Pulitzer Prize winner Wallace Earle Stegner from Great Falls was often called "The Dean of Western Writers". James Willard Schultz ("Apikuni") from Browning is most noted for his prolific stories about Blackfeet life and his contributions to the naming of prominent features in Glacier National Park.
Montana hosts numerous arts and cultural festivals and events every year.
There are no major league sports franchises in Montana due to the state's relatively small and dispersed population, but a number of minor league teams play in the state. Baseball is the minor-league sport with the longest heritage in the state, and Montana is home to three Minor League Baseball teams, all members of the Pioneer League: the Billings Mustangs, Great Falls Voyagers, and Missoula Osprey.
All of Montana's four-year colleges and universities field intercollegiate sports teams. The two largest schools, the University of Montana and Montana State University, are members of the Big Sky Conference and have enjoyed a strong athletic rivalry since the early twentieth century. Six of Montana's smaller four-year schools are members of the Frontier Conference. One is a member of the Great Northwest Athletic Conference.
A variety of sports are offered at Montana high schools. Montana allows the smallest—"Class C"—high schools to utilize six-man football teams, dramatized in the independent 2002 film "The Slaughter Rule".
There are junior ice hockey teams in Montana, four of which are affiliated with the North American 3 Hockey League: the Bozeman Icedogs, Great Falls Americans, Helena Bighorns, and Missoula Jr. Bruins.
Montana provides year-round outdoor recreation opportunities for residents and visitors. Hiking, fishing, hunting, watercraft recreation, camping, golf, cycling, horseback riding, and skiing are popular activities.
Montana has been a destination for its world-class trout fisheries since the 1930s. Fly fishing for several species of native and introduced trout in rivers and lakes is popular with both residents and tourists throughout the state. Montana is the home of the Federation of Fly Fishers and hosts many of the organization's annual conclaves. The state has robust recreational lake trout and kokanee salmon fisheries in the west; walleye can be found in many parts of the state, while northern pike, smallmouth and largemouth bass, catfish, and paddlefish can be found in the waters of eastern Montana. Robert Redford's 1992 film of Norman Maclean's novel, "A River Runs Through It", was filmed in Montana and brought national attention to fly fishing and the state.
Montana is home to the Rocky Mountain Elk Foundation and has a historic big game hunting tradition. There are fall bow and general hunting seasons for elk, pronghorn antelope, whitetail deer and mule deer. A random draw grants a limited number of permits for moose, mountain goats and bighorn sheep. There is a spring hunting season for black bear and in most years, limited hunting of bison that leave Yellowstone National Park is allowed. Current law allows both hunters and trappers specified numbers ("limits") of wolves and mountain lions. Trapping of assorted fur-bearing animals is allowed in certain seasons and many opportunities exist for migratory waterfowl and upland bird hunting.
Both downhill skiing and cross-country skiing are popular in Montana, which has 15 developed downhill ski areas open to the public.
Big Sky Resort and Whitefish Mountain Resort are destination resorts, while the remaining areas do not have overnight lodging at the ski area, though several host restaurants and other amenities.
Montana also has millions of acres open to cross-country skiing on nine of its national forests and in Glacier National Park. In addition to cross-country trails at most of the downhill ski areas, there are also 13 private cross-country skiing resorts. Yellowstone National Park also allows cross-country skiing.
Snowmobiling is popular in Montana, which boasts over 4,000 miles of trails and frozen lakes available in winter. There are 24 areas where snowmobile trails are maintained, most also offering ungroomed trails. West Yellowstone offers a large selection of trails and is the primary starting point for snowmobile trips into Yellowstone National Park, where "oversnow" vehicle use is strictly limited, usually to guided tours, and regulations are in considerable flux.
Snow coach tours are offered at Big Sky, Whitefish, West Yellowstone and into Yellowstone National Park. Equestrian skijoring has a niche in Montana, which hosts the World Skijoring Championships in Whitefish as part of the annual Whitefish Winter Carnival.
Montana does not have a Level I trauma center, but has Level II trauma centers in Missoula, Billings, and Great Falls. In 2013, "AARP The Magazine" named the Billings Clinic one of the safest hospitals in the United States.
Montana is ranked as the least obese state in the U.S., at 19.6%, according to the 2014 Gallup Poll.
Montana has the highest suicide rate of any state in the U.S. as of 2017.
As of 2010, Missoula is the 166th-largest media market in the United States as ranked by Nielsen Media Research, while Billings is 170th, Great Falls 190th, the Butte-Bozeman area 191st, and Helena 206th. There are 25 television stations in Montana, representing each major U.S. network. As of August 2013, 527 FCC-licensed FM radio stations and 114 AM stations broadcast in Montana.
During the age of the Copper Kings, each Montana copper company had its own newspaper. This changed in 1959 when Lee Enterprises bought several Montana newspapers. Montana's largest circulating daily city newspapers are the "Billings Gazette" (circulation 39,405), "Great Falls Tribune" (26,733), and "Missoulian" (25,439).
Railroads have been an important method of transportation in Montana since the 1880s. Historically, the state was traversed by the main lines of three east–west transcontinental routes: the Milwaukee Road, the Great Northern, and the Northern Pacific. Today, the BNSF Railway is the state's largest railroad, its main transcontinental route incorporating the former Great Northern main line across the state. Montana RailLink, a privately held Class II railroad, operates former Northern Pacific trackage in western Montana.
In addition, Amtrak's "Empire Builder" train runs through the north of the state, stopping in Libby, Whitefish, West Glacier, Essex, East Glacier Park, Browning, Cut Bank, Shelby, Havre, Malta, Glasgow, and Wolf Point.
Bozeman Yellowstone International Airport is the busiest airport in the state of Montana, surpassing Billings Logan International Airport in the spring of 2013. Montana's other major airports include Missoula International Airport, Great Falls International Airport, Glacier Park International Airport, Helena Regional Airport, Bert Mooney Airport and Yellowstone Airport. Eight smaller communities have airports designated for commercial service under the Essential Air Service program.
Historically, U.S. Route 10 was the primary east–west highway route across Montana, connecting the major cities in the southern half of the state. Still the state's most important east–west travel corridor, the route is today served by Interstate 90 and Interstate 94 which roughly follow the same route as the Northern Pacific. U.S. Routes 2 and 12 and Montana Highway 200 also traverse the entire state from east to west.
Montana's only north–south Interstate Highway is Interstate 15. Other major north–south highways include U.S. Routes 87, 89, 93 and 191.
Montana and South Dakota are the only states to share a land border which is not traversed by a paved road. Highway 212, the primary paved route between the two, passes through the northeast corner of Wyoming between Montana and South Dakota.
Montana is governed by a constitution. The first constitution was drafted by a constitutional convention in 1889, in preparation for statehood. Ninety percent of its language came from an 1884 constitution which was never acted upon by Congress for national political reasons. The 1889 constitution mimicked the structure of the United States Constitution, as well as outlining almost the same civil and political rights for citizens. However, the 1889 Montana constitution significantly restricted the power of state government: the legislature was much more powerful than the executive branch, and the jurisdiction of the District Courts was very specifically described. Montana voters amended the 1889 constitution 37 times between 1889 and 1972. In 1914, Montana granted women the vote. In 1916, Montana became the first state to elect a woman, Progressive Republican Jeannette Rankin, to Congress.
In 1971, Montana voters approved the call for a state constitutional convention. A new constitution was drafted, which made the legislative and executive branches much more equal in power and which was much less prescriptive in outlining powers, duties, and jurisdictions. The draft included an expanded, more progressive list of civil and political rights, extended these rights to children for the first time, transferred administration of property taxes to the counties from the state, implemented new water rights, eliminated sovereign immunity, and gave the legislature greater power to spend tax revenues. The constitution was narrowly approved, 116,415 to 113,883, and declared ratified on June 20, 1972. Three issues which the constitutional convention was unable to resolve were submitted to voters simultaneously with the proposed constitution. Voters approved the legalization of gambling, a bicameral legislature, and retention of the death penalty.
The 1972 constitution has been amended 31 times as of 2015. Major amendments include establishment of a reclamation trust (funded by taxes on natural resource extraction) to restore mined land (1974); restoration of sovereign immunity, when such immunity has been approved by a two-thirds vote in each house (1974); establishment of a 90-day biennial (rather than annual) legislative session (1974); establishment of a coal tax trust fund, funded by a tax on coal extraction (1976); conversion of the mandatory decennial review of county government into a voluntary one, to be approved or disallowed by residents in each county (1978); conversion of the provision of public assistance from a mandatory civil right to a non-fundamental legislative prerogative (1988); a new constitutional right to hunt and fish (2004); a prohibition on gay marriage (2004); and a prohibition on new taxes on the sale or transfer of real property (2010). In 1992, voters approved a constitutional amendment implementing term limits for certain statewide elected executive branch offices (governor, lieutenant governor, secretary of state, state auditor, attorney general, superintendent of public instruction) and for members of the Montana Legislature. Extensive new constitutional rights for victims of crime were approved in 2016.
The 1972 constitution requires that voters determine every 20 years whether to hold a new constitutional convention. Voters turned down a new convention in 1990 (84 percent no) and again in 2010 (58.6 percent no).
Montana has three branches of state government: legislative, executive, and judicial. The executive branch is headed by an elected governor. The governor is Steve Bullock, a Democrat elected in 2012. There are also nine other statewide elected offices in the executive branch: Lieutenant Governor, Attorney General, Secretary of State, State Auditor (who also serves as Commissioner of Securities and Insurance), and Superintendent of Public Instruction. There are five Public Service Commissioners, who are elected on a regional basis. (The Public Service Commission's jurisdiction is statewide.)
There are 18 departments and offices which make up the executive branch: Administration; Agriculture; Auditor (securities and insurance); Commerce; Corrections; Environmental Quality; Fish, Wildlife & Parks; Justice; Labor and Industry; Livestock; Military Affairs; Natural Resources and Conservation; Public Health and Human Services; Revenue; State; and Transportation. Elementary and secondary education are overseen by the Office of Public Instruction (led by the elected Superintendent of Public Instruction), in cooperation with the governor-appointed Board of Public Education. Higher education is overseen by a governor-appointed Board of Regents, which in turn appoints a Commissioner of Higher Education. The Office of the Commissioner of Higher Education acts in an executive capacity on behalf of the regents, and oversees the state-run Montana University System.
Independent state agencies not within a department or office include the Montana Arts Council, Montana Board of Crime Control, Montana Historical Society, Montana Public Employees Retirement Administration, Commissioner of Political Practices, the Montana Lottery, Office of the State Public Defender, Public Service Commission, the Montana School for the Deaf and Blind, the Montana State Fund (which operates the state's unemployment insurance, worker compensation, and self-insurance operations), the Montana State Library, and the Montana Teachers Retirement System.
Montana is an alcoholic beverage control state. It is an equitable distribution and no-fault divorce state. It is one of five states with no sales tax.
The Montana Legislature is bicameral, and consists of the 50-member Montana Senate and the 100-member Montana House of Representatives. The legislature meets in the Montana State Capitol in Helena in odd-numbered years for 90 days, beginning the first weekday of the year. The deadline for a legislator to introduce a general bill is the 40th legislative day. The deadline for a legislator to introduce an appropriations, revenue, or referenda bill is the 62nd legislative day. Senators serve four-year terms, while Representatives serve two-year terms. All members are limited to serving no more than eight years in a single 16-year period.
The Courts of Montana are established by the Constitution of Montana. The constitution requires the establishment of a Montana Supreme Court and Montana District Courts, and permits the legislature to establish Justice Courts, City Courts, Municipal Courts, and other inferior courts as it sees fit.
The Montana Supreme Court is the court of last resort in the Montana court system. The constitution of 1889 provided for the election of no fewer than three Supreme Court justices, and one Chief Justice. Each court member served a six-year term. The legislature increased the number of justices to five in 1919. The 1972 constitution lengthened the term of office to eight years, and established the minimum number of justices at five. It allowed the legislature to increase the number of justices by two, which the legislature did in 1979. The Montana Supreme Court has the authority to declare acts of the legislature and executive unconstitutional under either the Montana or U.S. constitutions. Its decisions may be appealed directly to the U.S. Supreme Court. The Clerk of the Supreme Court is also an elected position, and serves a six-year term. Neither justices nor the clerk are term limited.
Montana District Courts are the courts of general jurisdiction in Montana. There are no intermediate appellate courts. District Courts have jurisdiction primarily over most civil cases, cases involving a monetary claim against the state, felony criminal cases, probate, and cases at law and in equity. When so authorized by the legislature, actions of executive branch agencies may be appealed directly to a District Court. The District Courts also have "de novo" appellate jurisdiction from inferior courts (city courts, justice courts, and municipal courts), and oversee naturalization proceedings. District Court judges are elected, and serve six-year terms. They are not term limited. There are 22 judicial districts in Montana, served by 56 District Courts and 46 District Court judges. The District Courts suffer from excessive workload, and the legislature has struggled to find a solution to the problem.
Montana Youth Courts were established by the Montana Youth Court Act of 1974. They are overseen by District Court judges. They consist of a chief probation officer, one or more juvenile probation officers, and support staff. Youth Courts have jurisdiction over misdemeanor and felony acts committed by those charged as a juvenile under the law. There is a Youth Court in every judicial district, and decisions of the Youth Court are appealable directly to the Montana Supreme Court.
The Montana Workers' Compensation Court was established by the Montana Workers' Compensation Act in 1975. There is a single Workers' Compensation Court. It has a single judge, appointed by the governor. The Workers' Compensation Court has statewide jurisdiction and holds trials in Billings, Great Falls, Helena, Kalispell, and Missoula. The court hears cases arising under the Montana Workers' Compensation Act, and is the court of original jurisdiction for reviews of orders and regulations issued by the Montana Department of Labor and Industry. Decisions of the court are appealable directly to the Montana Supreme Court.
The Montana Water Court was established by the Montana Water Court Act of 1979. The Water Court consists of a Chief Water Judge and four District Water Judges (Lower Missouri River Basin, Upper Missouri River Basin, Yellowstone River Basin, and Clark Fork River Basin). The court employs 12 permanent special masters. The Montana Judicial Nomination Commission develops short lists of nominees for all five Water Judges, who are then appointed by the Chief Justice of the Montana Supreme Court (subject to confirmation by the Montana Senate). The Water Court adjudicates water rights claims under the Montana Water Use Act of 1973, and has statewide jurisdiction. District Courts have the authority to enforce decisions of the Water Court, but only the Montana Supreme Court has the authority to review decisions of the Water Court.
From 1889 to 1909, elections for judicial office in Montana were partisan. Beginning in 1909, these elections became nonpartisan. The Montana Supreme Court struck down the nonpartisan law in 1911 on technical grounds, but a new law was enacted in 1935 which barred political parties from endorsing, making contributions to, or making expenditures on behalf of or against judicial candidates. In 2012, the U.S. Supreme Court struck down Montana's judicial nonpartisan election law. Although candidates must remain nonpartisan, spending by partisan entities is now permitted. Spending on state supreme court races subsequently increased sharply, to $1.6 million in 2014 and to more than $1.6 million in 2016 (both new records).
The U.S. Constitution provides each state with two Senators. Montana's two U.S. senators are Jon Tester (Democrat), who was reelected in 2018, and Steve Daines (Republican), first elected in 2014. The Constitution also guarantees each state at least one Representative, with additional representatives apportioned based on population. From statehood in 1889 until 1913, Montana was represented in the United States House of Representatives by a single representative, elected at-large. Montana received a second representative in 1913, following the 1910 census and reapportionment. Both members, however, were still elected at-large. Beginning in 1919, Montana moved to district, rather than at-large, elections for its two House members. This created Montana's 1st congressional district in the west and Montana's 2nd congressional district in the east. In the reapportionment following the 1990 census, Montana lost one of its House seats. The remaining seat was again elected at-large. Greg Gianforte is the current officeholder.
Montana's Senate district is the fourth largest by area, behind Alaska, Texas, and California. The most notorious of Montana's early senators was William A. Clark, a "Copper King" and one of the 50 richest Americans ever, who is well known for having bribed his way into the U.S. Senate. Among Montana's most historically prominent senators are Thomas J. Walsh (serving from 1913 to 1933), who died shortly after being chosen as Attorney General by President-elect Franklin D. Roosevelt; Burton K. Wheeler (serving from 1923 to 1947), an oft-mentioned presidential candidate and strong supporter of isolationism; Mike Mansfield, the longest-serving Senate Majority Leader in U.S. history; Max Baucus (serving from 1978 to 2014), the longest-serving U.S. senator in Montana history and the senator who shepherded the Patient Protection and Affordable Care Act through the Senate in 2010; and Lee Metcalf (serving from 1961 to 1978), a pioneer of the environmental movement.
Montana's House district is the largest congressional district in the United States by population, with just over 1,023,000 constituents. It is the second largest House district by area, after Alaska's at-large congressional district. Of Montana's House delegates, Jeannette Rankin was the first woman to hold national office in the United States when she was elected to the U.S. House of Representatives in 1916. Also notable is Representative (later Senator) Thomas H. Carter, the first Catholic to serve as chairman of the Republican National Committee (from 1892 to 1896).
Federal courts in Montana include the United States District Court for the District of Montana and the United States Bankruptcy Court for the District of Montana. Three former Montana politicians have been named judges on the U.S. District Court: Charles Nelson Pray (who served in the U.S. House of Representatives from 1907 to 1913), James Franklin Battin (who served in the U.S. House of Representatives from 1961 to 1969), and Paul G. Hatfield (who served as an appointed U.S. Senator in 1978). Brian Morris, who served as an Associate Justice of the Montana Supreme Court from 2005 to 2013, currently serves as a judge on the court.
Elections in the state have been competitive, with the Democrats usually holding an edge, thanks to the support among unionized miners and railroad workers. Large-scale battles revolved around the giant Anaconda Copper company, based in Butte and controlled by Rockefeller interests, until it closed in the 1970s. Until 1959, the company owned five of the state's six largest newspapers.
Historically, Montana is a swing state of cross-ticket voters who tend to fill elected offices with individuals from both parties. Through the mid-20th century, the state had a tradition of "sending the liberals to Washington and the conservatives to Helena". Between 1988 and 2006, the pattern flipped, with voters more likely to elect conservatives to federal offices. There have also been long-term shifts of party control. From 1968 through 1988, the state was dominated by the Democratic Party, with Democratic governors for a 20-year period, a Democratic majority in the national congressional delegation, and Democratic control during many sessions of the state legislature. This pattern shifted, beginning with the 1988 election, when Montana elected a Republican governor for the first time since 1964 and sent a Republican to the U.S. Senate for the first time since 1948. This shift continued with the reapportionment of the state's legislative districts that took effect in 1994, when the Republican Party took control of both chambers of the state legislature, consolidating a Republican dominance that lasted until the 2004 reapportionment produced more swing districts and a brief period of Democratic legislative majorities in the mid-2000s.
In more recent presidential elections, Montana has voted for the Republican candidate in all but two elections from 1952 to the present. The state last supported a Democrat for president in 1992, when Bill Clinton won a plurality victory. Overall, since 1889 the state has voted for Democratic governors 60 percent of the time and Republican governors 40 percent of the time. In the 2008 presidential election, Montana was considered a swing state and was ultimately won by Republican John McCain, albeit by a narrow margin of two percent.
At the state level, the pattern of split-ticket voting and divided government holds. Democrats hold one of the state's two U.S. Senate seats with Jon Tester, as well as the governorship with Steve Bullock. The lone congressional district has been Republican since 1996, and in 2014 Steve Daines won one of the state's U.S. Senate seats for the GOP. The legislature had split party control between the House and Senate most years between 2004 and 2010, when the mid-term elections returned both chambers to Republican control. As of 2019, Republicans control the state Senate 32 to 18 and the state House of Representatives 59 to 41. Historically, Republicans are strongest in the east, while Democrats are strongest in the west.
Montana has only one representative in the U.S. House, having lost its second district in the 1990 census reapportionment. Montana's single congressional district holds the largest population of any district in the country, which means its one member in the House of Representatives represents more people than any other member of the U.S. House (see List of U.S. states by population). Montana's population grew at about the national average during the 2000s, but it failed to regain its second seat in 2010.
Montana has 56 counties, and according to the United States Census Bureau the state contains 364 "places", broken down into 129 incorporated places and 235 census-designated places. Incorporated places consist of 52 cities, 75 towns, and two consolidated city-counties. Montana has one city, Billings, with a population over 100,000, and two cities with populations over 50,000, Missoula and Great Falls. These three communities are considered the centers of Montana's three Metropolitan Statistical Areas.
The state also has five Micropolitan Statistical Areas, centered on Bozeman, Butte, Helena, Kalispell and Havre. These communities, excluding Havre, are colloquially known as the "big 7" Montana cities, as they are consistently the seven largest communities in Montana, with a significant population gap between them and the communities ranked eighth or lower. According to the 2010 U.S. Census, Montana's seven most populous cities, in rank order, are Billings, Missoula, Great Falls, Bozeman, Butte, Helena and Kalispell. Based on 2013 census numbers, they collectively contain 35 percent of Montana's population, and the counties containing these communities hold 62 percent of the state's population. The geographic center of population of Montana is in sparsely populated Meagher County, in the town of White Sulphur Springs.
Montana's motto, "Oro y Plata" (Spanish for "Gold and Silver"), recognizes the significant role of mining in the state and was first adopted in 1865, when Montana was still a territory. A state seal with a miner's pick and shovel above the motto, surrounded by the mountains and the Great Falls of the Missouri River, was adopted during the first meeting of the territorial legislature in 1864–65. The design was only slightly modified after Montana became a state and adopted it as the Great Seal in 1893. The state flower, the Bitterroot, was adopted in 1895 with the support of a group called the Floral Emblem Association, which formed after Montana's Women's Christian Temperance Union adopted the bitterroot as the organization's state flower. All other symbols were adopted throughout the 20th century, save for Montana's newest symbols: the state butterfly, the mourning cloak, adopted in 2001, and the state lullaby, "Montana Lullaby", adopted in 2007.
The state song was not composed until 21 years after statehood, when a musical troupe led by Joseph E. Howard stopped in Butte in September 1910. A former member of the troupe who lived in Butte buttonholed Howard at an after-show party, asking him to compose a song about Montana, and enlisted another partygoer, Charles C. Cohan, city editor of the "Butte Miner" newspaper, to help. The two men worked up a basic melody and lyrics in about a half-hour for the entertainment of party guests, then finished the song later that evening, with an arrangement worked up the following day. Upon arriving in Helena, Howard's troupe performed 12 encores of the new song to an enthusiastic audience, and the governor proclaimed it the state song on the spot, though formal legislative recognition did not occur until 1945. Montana is one of only three states to have a "state ballad", "Montana Melody", chosen by the legislature in 1983. Montana was also the first state to adopt a state lullaby.
Montana schoolchildren played a significant role in selecting several state symbols. The state tree, the ponderosa pine, was selected by Montana schoolchildren as the preferred state tree by an overwhelming majority in a referendum held in 1908. However, the legislature did not designate a state tree until 1949, when the Montana Federation of Garden Clubs, with the support of the state forester, lobbied for formal recognition. Schoolchildren also chose the western meadowlark as the state bird, in a 1930 vote, and the legislature acted to endorse this decision in 1931. Similarly, the secretary of state sponsored a children's vote in 1981 to choose a state animal, and after 74 animals were nominated, the grizzly bear won over the elk by a 2–1 margin. The students of Livingston started a statewide school petition drive and lobbied the governor and the state legislature to name the "Maiasaura" as the state fossil in 1985.
Various community civic groups also played a role in selecting the state grass and the state gemstones. When broadcaster Norma Ashby discovered there was no state fish, she initiated a drive via her television show, "Today in Montana", and an informal citizen's election to select a state fish resulted in a win for the blackspotted cutthroat trout after hot competition from the Arctic grayling. The legislature in turn adopted this recommendation by a wide margin.
|
https://en.wikipedia.org/wiki?curid=19978
|
Machine translation
Machine translation, sometimes referred to by the abbreviation MT (not to be confused with computer-aided translation, machine-aided human translation (MAHT) or interactive translation), is a sub-field of computational linguistics that investigates the use of software to translate text or speech from one language to another.
On a basic level, MT performs mechanical substitution of words in one language for words in another, but that alone rarely produces a good translation because recognition of whole phrases and their closest counterparts in the target language is needed. Not all words in one language have equivalent words in another language, and many words have more than one meaning. In addition, two given languages may have completely different structures.
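For illustration, the following is a minimal Python sketch of such mechanical word substitution; the two-language lexicon and the example sentence are invented for this purpose and do not come from any real MT system:

```python
# Minimal sketch of word-for-word substitution (hypothetical toy lexicon,
# not any real MT system). It ignores word order, agreement and ambiguity.
LEXICON = {
    "the": "le", "cat": "chat", "black": "noir",
    "sleeps": "dort", "bank": "banque",  # "bank" is ambiguous: banque/rive
}

def substitute(sentence: str) -> str:
    # Replace each word independently; unknown words pass through unchanged.
    return " ".join(LEXICON.get(w, w) for w in sentence.lower().split())

print(substitute("The black cat sleeps"))
# -> "le noir chat dort": every word is "right", but French requires
#    "le chat noir dort" (noun before adjective), so the output is wrong.
```

Every word is replaced by a plausible equivalent, yet the output violates target-language word order, which is exactly the failure described above.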
Solving this problem with corpus statistical and neural techniques is a rapidly growing field that is leading to better translations, handling of differences in linguistic typology, translation of idioms, and the isolation of anomalies.
Current machine translation software often allows for customization by domain or profession (such as weather reports), improving output by limiting the scope of allowable substitutions. This technique is particularly effective in domains where formal or formulaic language is used. It follows that machine translation of government and legal documents more readily produces usable output than conversation or less standardised text.
Improved output quality can also be achieved by human intervention: for example, some systems are able to translate more accurately if the user has unambiguously identified which words in the text are proper names. With the assistance of these techniques, MT has proven useful as a tool to assist human translators and, in a very limited number of cases, can even produce output that can be used as is (e.g., weather reports).
The progress and potential of machine translation have been much debated through its history. Since the 1950s, a number of scholars, first and most notably Yehoshua Bar-Hillel, have questioned the possibility of achieving fully automatic machine translation of high quality.
Some critics claim that there are in-principle obstacles to automating the translation process.
The origins of machine translation can be traced back to the work of Al-Kindi, a 9th-century Arabic cryptographer who developed techniques for systematic language translation, including cryptanalysis, frequency analysis, and probability and statistics, which are used in modern machine translation. The idea of machine translation later appeared in the 17th century. In 1629, René Descartes proposed a universal language, with equivalent ideas in different tongues sharing one symbol.
The field of machine translation was founded with Warren Weaver's Memorandum on Translation (1949). The first researcher in the field, Yehoshua Bar-Hillel, began his research at MIT (1951). A Georgetown University MT research team followed (1951) with a public demonstration of its Georgetown-IBM experiment system in 1954. MT research programs were launched in Japan and Russia (1955), and the first MT conference was held in London (1956). Researchers continued to join the field as the Association for Machine Translation and Computational Linguistics was formed in the U.S. (1962) and the National Academy of Sciences formed the Automatic Language Processing Advisory Committee (ALPAC) to study MT (1964). Real progress was much slower, however, and after the ALPAC report (1966), which found that the ten-year-long research effort had failed to fulfill expectations, funding was greatly reduced. According to a 1972 report by the Director of Defense Research and Engineering (DDR&E), the feasibility of large-scale MT was reestablished by the success of the Logos MT system in translating military manuals into Vietnamese during that conflict.
The French Textile Institute also used MT to translate abstracts from and into French, English, German and Spanish (1970); Brigham Young University started a project to translate Mormon texts by automated translation (1971); and Xerox used SYSTRAN to translate technical manuals (1978). MT became more popular with the spread of computers, and beginning in the late 1980s, as computational power increased and became less expensive, more interest was shown in statistical models for machine translation. SYSTRAN's system was first deployed commercially in 1988 by Minitel, the online service of the French postal service. Various MT companies were also launched, including Trados (1984), which was the first to develop and market translation memory technology (1989). The first commercial MT system for Russian / English / German-Ukrainian was developed at Kharkov State University (1991).
MT on the web started with SYSTRAN offering free translation of small texts (1996), followed by AltaVista Babelfish, which handled 500,000 requests a day (1997). Franz Josef Och (the future head of Translation Development at Google) won DARPA's speed MT competition (2003). More innovations during this time included MOSES, the open-source statistical MT engine (2007), a text/SMS translation service for mobiles in Japan (2008), and a mobile phone with built-in speech-to-speech translation functionality for English, Japanese and Chinese (2009). In 2012, Google announced that Google Translate was translating roughly enough text to fill 1 million books per day.
The idea of using digital computers for translation of natural languages was proposed as early as 1946 by A. D. Booth and possibly others. Warren Weaver wrote an important memorandum "Translation" in 1949. The Georgetown experiment was by no means the first such application, and a demonstration was made in 1954 on the APEXC machine at Birkbeck College (University of London) of a rudimentary translation of English into French. Several papers on the topic were published at the time, and even articles in popular journals (for example an article by Cleave and Zacharov in the September 1955 issue of "Wireless World"). A similar application, also pioneered at Birkbeck College at the time, was reading and composing Braille texts by computer.
The human translation process may be described as decoding the meaning of the source text and then re-encoding that meaning in the target language.
Behind this ostensibly simple procedure lies a complex cognitive operation. To decode the meaning of the source text in its entirety, the translator must interpret and analyse all the features of the text, a process that requires in-depth knowledge of the grammar, semantics, syntax, idioms, etc., of the source language, as well as the culture of its speakers. The translator needs the same in-depth knowledge to re-encode the meaning in the target language.
Therein lies the challenge in machine translation: how to program a computer that will "understand" a text as a person does, and that will "create" a new text in the target language that sounds as if it has been written by a person.
In its most general application, this is beyond current technology. Though it works much faster, no automated translation program or procedure, with no human participation, can produce output even close to the quality a human translator can produce. What it can do, however, is provide a general, though imperfect, approximation of the original text, getting the "gist" of it (a process called "gisting"). This is sufficient for many purposes, including making best use of the finite and expensive time of a human translator, reserved for those cases in which total accuracy is indispensable.
This problem may be approached in a number of ways, through the evolution of which accuracy has improved.
Machine translation can use a method based on linguistic rules, which means that words are translated in a linguistic way: the most suitable words in the target language replace the ones in the source language.
It is often argued that the success of machine translation requires the problem of natural language understanding to be solved first.
Generally, rule-based methods parse a text, usually creating an intermediary, symbolic representation, from which the text in the target language is generated. According to the nature of the intermediary representation, an approach is described as interlingual machine translation or transfer-based machine translation. These methods require extensive lexicons with morphological, syntactic, and semantic information, and large sets of rules.
Given enough data, machine translation programs often work well enough for a native speaker of one language to get the approximate meaning of what is written by the other native speaker. The difficulty is getting enough data of the right kind to support the particular method. For example, the large multilingual corpus of data needed for statistical methods to work is not necessary for the grammar-based methods. But then, the grammar methods need a skilled linguist to carefully design the grammar that they use.
To translate between closely related languages, the technique referred to as rule-based machine translation may be used.
The rule-based machine translation paradigm includes transfer-based machine translation, interlingual machine translation and dictionary-based machine translation paradigms. This type of translation is used mostly in the creation of dictionaries and grammar programs. Unlike other methods, RBMT involves more information about the linguistics of the source and target languages, using the morphological and syntactic rules and semantic analysis of both languages. The basic approach involves linking the structure of the input sentence with the structure of the output sentence using a parser and an analyzer for the source language, a generator for the target language, and a transfer lexicon for the actual translation. RBMT's biggest downfall is that everything must be made explicit: orthographical variation and erroneous input must be made part of the source language analyser in order to cope with it, and lexical selection rules must be written for all instances of ambiguity. Adapting to new domains in itself is not that hard, as the core grammar is the same across domains, and the domain-specific adjustment is limited to lexical selection adjustment.
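A toy sketch of the analysis–transfer–generation pipeline described above, assuming a four-word lexicon and a single reordering rule (both invented for illustration; a real RBMT system uses full morphological and syntactic analysis):

```python
# Minimal sketch of the transfer idea (analysis -> transfer -> generation)
# for the single English->French rule that "adjective + noun" becomes
# "noun + adjective". Toy lexicon and POS tags are assumptions.
LEXICON = {"the": "le", "black": "noir", "cat": "chat", "sleeps": "dort"}
POS = {"the": "DET", "black": "ADJ", "cat": "NOUN", "sleeps": "VERB"}

def analyse(sentence):
    # Source-language analysis: tag each token (a real parser does far more).
    return [(w, POS[w]) for w in sentence.lower().split()]

def transfer(tagged):
    # Structural transfer: swap ADJ+NOUN to NOUN+ADJ.
    out, i = [], 0
    while i < len(tagged):
        if i + 1 < len(tagged) and tagged[i][1] == "ADJ" and tagged[i + 1][1] == "NOUN":
            out += [tagged[i + 1], tagged[i]]
            i += 2
        else:
            out.append(tagged[i])
            i += 1
    return out

def generate(tagged):
    # Target-language generation via the transfer lexicon.
    return " ".join(LEXICON[w] for w, _ in tagged)

print(generate(transfer(analyse("the black cat sleeps"))))  # le chat noir dort
```

Note how even this single rule repairs the word-order failure of bare substitution, at the cost of writing such rules explicitly for every construction.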
Transfer-based machine translation is similar to interlingual machine translation in that it creates a translation from an intermediate representation that simulates the meaning of the original sentence. Unlike interlingual MT, it depends partially on the language pair involved in the translation.
Interlingual machine translation is one instance of rule-based machine-translation approaches. In this approach, the source language, i.e. the text to be translated, is transformed into an interlingual language, i.e. a "language neutral" representation that is independent of any language. The target language is then generated out of the interlingua. One of the major advantages of this system is that the interlingua becomes more valuable as the number of target languages it can be turned into increases. However, the only interlingual machine translation system that has been made operational at the commercial level is the KANT system (Nyberg and Mitamura, 1992), which is designed to translate Caterpillar Technical English (CTE) into other languages.
Machine translation can use a method based on dictionary entries, which means that the words will be translated as they are by a dictionary.
Statistical machine translation tries to generate translations using statistical methods based on bilingual text corpora, such as the Canadian Hansard corpus, the English-French record of the Canadian parliament, and EUROPARL, the record of the European Parliament. Where such corpora are available, good results can be achieved translating similar texts, but such corpora are still rare for many language pairs. The first statistical machine translation software was CANDIDE from IBM. Google used SYSTRAN for several years, but switched to a statistical translation method in October 2007. In 2005, Google improved its internal translation capabilities by using approximately 200 billion words from United Nations materials to train its system, and translation accuracy improved. Google Translate and similar statistical translation programs work by detecting patterns in hundreds of millions of documents that have previously been translated by humans and making intelligent guesses based on the findings. Generally, the more human-translated documents available in a given language, the more likely it is that the translation will be of good quality. Newer approaches to statistical machine translation, such as METIS II and PRESEMT, use minimal corpus size and instead focus on derivation of syntactic structure through pattern recognition. With further development, this may allow statistical machine translation to operate off of a monolingual text corpus. SMT's biggest downfalls include its dependence on huge amounts of parallel text, its problems with morphology-rich languages (especially with translating "into" such languages), and its inability to correct singleton errors.
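The core of statistical decoding can be sketched as picking the candidate that maximizes the product of a language-model probability and a translation-model probability (the noisy-channel formulation). The probabilities below are invented toy numbers, not estimates from the Hansard or EUROPARL corpora:

```python
import math

# Minimal sketch of noisy-channel scoring: pick the target word e that
# maximizes P(e) * P(f | e) for an ambiguous French source word f.
translation_model = {  # toy P(f | e) for the French source f = "avocat"
    "lawyer": 0.6, "avocado": 0.4,
}
language_model = {     # toy P(e) in the target context "the ... argued"
    "lawyer": 0.02, "avocado": 0.0001,
}

def best_translation(candidates):
    # Work in log space, as real decoders do, to avoid numerical underflow.
    return max(candidates,
               key=lambda e: math.log(language_model[e]) + math.log(translation_model[e]))

print(best_translation(["lawyer", "avocado"]))  # "lawyer"
```

Real systems apply the same principle over phrase tables with millions of entries and search over whole sentences rather than single words.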
Example-based machine translation (EBMT) approach was proposed by Makoto Nagao in 1984. Example-based machine translation is based on the idea of analogy. In this approach, the corpus that is used is one that contains texts that have already been translated. Given a sentence that is to be translated, sentences from this corpus are selected that contain similar sub-sentential components. The similar sentences are then used to translate the sub-sentential components of the original sentence into the target language, and these phrases are put together to form a complete translation.
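A minimal sketch of the retrieval step, loosely modeled on Nagao's classic umbrella/camera illustration; the two-pair "corpus" (with romanized Japanese) and the word-overlap similarity measure are assumptions for illustration only:

```python
# Minimal sketch of the example-based idea: retrieve the stored translation
# pair most similar to the input and reuse it.
corpus = [
    ("how much is that red umbrella", "ano akai kasa wa ikura desu ka"),
    ("how much is that small camera", "ano chiisai kamera wa ikura desu ka"),
]

def similarity(a: str, b: str) -> float:
    # Crude sub-sentential match: Jaccard overlap of word sets.
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

def closest_example(sentence: str):
    return max(corpus, key=lambda pair: similarity(sentence, pair[0]))

# A real EBMT system would now splice the translation of the differing
# fragment ("black" in place of "red") into the retrieved target sentence.
print(closest_example("how much is that black umbrella"))
```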
Hybrid machine translation (HMT) leverages the strengths of statistical and rule-based translation methodologies. Several MT organizations claim a hybrid approach that uses both rules and statistics, though the approaches differ in a number of ways.
More recently, with the advent of neural MT, a new version of hybrid machine translation is emerging that combines the benefits of rules, statistical and neural machine translation. The approach benefits from pre- and post-processing in a rule-guided workflow as well as from NMT and SMT. The downside is the inherent complexity, which makes the approach suitable only for specific use cases. One of the proponents of this approach for complex use cases is Omniscien Technologies.
A deep learning-based approach to MT, neural machine translation has made rapid progress in recent years, and Google has announced that its translation services are now using this technology in preference to its previous statistical methods. In 2018, a Microsoft team reported reaching human parity on the WMT-2017 news translation task, a historic milestone.
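In practice, neural translation is usually accessed through pretrained models rather than built from scratch. The sketch below assumes the Hugging Face "transformers" library and the publicly released "t5-small" checkpoint are installed and downloadable; neither is mentioned in this article, and other toolkits would work equally well:

```python
# Hedged sketch: load an off-the-shelf neural translation model.
# Assumes `pip install transformers` (plus a backend such as PyTorch)
# and network access to download the "t5-small" checkpoint.
from transformers import pipeline

translator = pipeline("translation_en_to_fr", model="t5-small")
result = translator("Machine translation has made rapid progress.")
print(result[0]["translation_text"])
```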
Word-sense disambiguation concerns finding a suitable translation when a word can have more than one meaning. The problem was first raised in the 1950s by Yehoshua Bar-Hillel. He pointed out that without a "universal encyclopedia", a machine would never be able to distinguish between the two meanings of a word. Today there are numerous approaches designed to overcome this problem. They can be approximately divided into "shallow" approaches and "deep" approaches.
Shallow approaches assume no knowledge of the text. They simply apply statistical methods to the words surrounding the ambiguous word. Deep approaches presume a comprehensive knowledge of the word. So far, shallow approaches have been more successful.
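A minimal sketch of a shallow approach follows: each candidate sense of an ambiguous word is scored by its overlap with the surrounding words. The sense inventory and context-word lists are invented for illustration, not a trained model:

```python
# Minimal sketch of "shallow" word-sense disambiguation: choose the
# translation of an ambiguous word from the words around it.
SENSES = {
    "bank": [
        ("banque", {"money", "deposit", "loan", "account"}),
        ("rive",   {"river", "water", "fishing", "shore"}),
    ]
}

def disambiguate(word: str, sentence: str) -> str:
    context = set(sentence.lower().split())
    # Score each sense by overlap with the surrounding words; ties favour
    # the first-listed (assumed most frequent) sense, a common fallback.
    best, _ = max(SENSES[word], key=lambda s: len(s[1] & context))
    return best

print(disambiguate("bank", "he sat on the bank fishing in the river"))   # rive
print(disambiguate("bank", "she opened a deposit account at the bank"))  # banque
```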
Claude Piron, a long-time translator for the United Nations and the World Health Organization, wrote that machine translation, at its best, automates the easier part of a translator's job; the harder and more time-consuming part usually involves doing extensive research to resolve ambiguities in the source text, which the grammatical and lexical exigencies of the target language require to be resolved.
The ideal deep approach would require the translation software to do all the research necessary for this kind of disambiguation on its own; but this would require a higher degree of AI than has yet been attained. A shallow approach which simply guessed at the sense of the ambiguous English phrase that Piron mentions (based, perhaps, on which kind of prisoner-of-war camp is more often mentioned in a given corpus) would have a reasonable chance of guessing wrong fairly often. A shallow approach that involves "ask the user about each ambiguity" would, by Piron's estimate, only automate about 25% of a professional translator's job, leaving the harder 75% still to be done by a human.
One of the major pitfalls of MT is its inability to translate non-standard language with the same accuracy as standard language. Heuristic or statistical based MT takes input from various sources in standard form of a language. Rule-based translation, by nature, does not include common non-standard usages. This causes errors in translation from a vernacular source or into colloquial language. Limitations on translation from casual speech present issues in the use of machine translation in mobile devices.
Named entities, in a narrow sense, refer to concrete or abstract entities in the real world, including people, organizations, companies, and places. The term also covers expressions of time, space, and quantity, such as "1 July 2011" and "$79.99".
Named entities occur in the text being analyzed in statistical machine translation. The initial difficulty that arises in dealing with named entities is simply identifying them in the text. Consider the list of names common in a particular language to illustrate this – the most common names are different for each language and also are constantly changing. If named entities cannot be recognized by the machine translator, they may be erroneously translated as common nouns, which would most likely not affect the BLEU rating of the translation but would change the text's human readability. It is also possible that, when not identified, named entities will be omitted from the output translation, which would also have implications for the text's readability and message.
Another way to deal with named entities is to use transliteration instead of translation, meaning that you find the letters in the target language that most closely correspond to the name in the source language. There have been attempts to incorporate this into machine translation by adding a transliteration step into the translation procedure. However, these attempts still have their problems and have even been cited as worsening the quality of translation. Named entities were still identified incorrectly, with words not being transliterated when they should or being transliterated when they shouldn't. For example, for "Southern California" the first word should be translated directly, while the second word should be transliterated. However, machines would often transliterate both because they treated them as one entity. Words like these are hard for machine translators, even those with a transliteration component, to process.
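A minimal sketch of letter-level transliteration, assuming a small, illustrative Russian-to-Latin table (not a complete or standard romanization scheme):

```python
# Minimal sketch of transliteration: map source letters to their closest
# target-alphabet equivalents. The table is a small illustrative subset.
TABLE = {
    "м": "m", "о": "o", "с": "s", "к": "k", "в": "v", "а": "a",
    "л": "l", "д": "d", "и": "i", "р": "r",
}

def transliterate(name: str) -> str:
    # Letters without a mapping pass through unchanged.
    return "".join(TABLE.get(ch, ch) for ch in name.lower()).title()

print(transliterate("Москва"))  # Moskva

# The "Southern California" problem is the converse decision: a system must
# choose, token by token, whether to translate ("Southern") or
# transliterate ("California").
```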
The lack of attention to the issue of named entity translation has been recognized as potentially stemming from a lack of resources to devote to the task in addition to the complexity of creating a good system for named entity translation. One approach to named entity translation has been to transliterate, and not translate, those words. A second is to create a "do-not-translate" list, which has the same end goal – transliteration as opposed to translation. Both of these approaches still rely on the correct identification of named entities, however.
A third approach to successful named entity translation is a class-based model. In this method, named entities are replaced with a token to represent the class they belong to. For example, "Ted" and "Erica" would both be replaced with "person" class token. In this way the statistical distribution and use of person names in general can be analyzed instead of looking at the distributions of "Ted" and "Erica" individually. A problem that the class based model solves is that the probability of a given name in a specific language will not affect the assigned probability of a translation. A study by Stanford on improving this area of translation gives the examples that different probabilities will be assigned to "David is going for a walk" and "Ankit is going for a walk" for English as a target language due to the different number of occurrences for each name in the training data. A frustrating outcome of the same study by Stanford (and other attempts to improve named recognition translation) is that many times, a decrease in the BLEU scores for translation will result from the inclusion of methods for named entity translation.
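The replacement step of such a class-based model can be sketched as follows; the name list is an illustrative assumption standing in for a real named-entity recognizer:

```python
# Minimal sketch of the class-based idea: replace named entities with a
# class token before collecting statistics, so "Ted" and "Ankit" share
# the same model probability.
PERSON_NAMES = {"ted", "erica", "david", "ankit"}

def classify_tokens(sentence: str) -> str:
    return " ".join("<PERSON>" if w.lower() in PERSON_NAMES else w
                    for w in sentence.split())

# Both sentences now map to the same statistical event, so the rarity of a
# particular name no longer affects the assigned translation probability.
print(classify_tokens("David is going for a walk"))  # <PERSON> is going for a walk
print(classify_tokens("Ankit is going for a walk"))  # <PERSON> is going for a walk
```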
Some work has been done in the utilization of multiparallel corpora, that is, bodies of text that have been translated into three or more languages. Using these methods, a text that has been translated into two or more languages may be utilized in combination to provide a more accurate translation into a third language than if just one of those source languages were used alone.
An ontology is a formal representation of knowledge which includes the concepts (such as objects, processes etc.) in a domain and some relations between them. If the stored information is of linguistic nature, one can speak of a lexicon.
In NLP, ontologies can be used as a source of knowledge for machine translation systems. With access to a large knowledge base, systems can be enabled to resolve many (especially lexical) ambiguities on their own.
In classic examples of this kind, humans are able to interpret a prepositional phrase according to the context because we use our world knowledge, stored in our lexicons.
A machine translation system initially would not be able to differentiate between the meanings because syntax does not change. With a large enough ontology as a source of knowledge however, the possible interpretations of ambiguous words in a specific context can be reduced.
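A minimal sketch (with an invented three-entry ontology, not the PANGLOSS resource discussed below) of how such knowledge can narrow an ambiguous prepositional-phrase attachment of the "saw the man with the telescope" kind:

```python
# Minimal sketch of ontology-assisted disambiguation for prepositional-
# phrase attachment. The tiny ontology and the rule are assumptions.
ONTOLOGY = {
    "telescope": {"is_a": "instrument"},
    "dog": {"is_a": "animal"},
    "see": {"instrument_allowed": True},  # seeing can use an instrument
}

def attach_pp(verb: str, pp_object: str) -> str:
    # If the PP object is an instrument and the verb licenses instruments,
    # attach the phrase to the verb; otherwise attach it to the noun.
    obj = ONTOLOGY.get(pp_object, {})
    if obj.get("is_a") == "instrument" and ONTOLOGY[verb].get("instrument_allowed"):
        return "verb attachment: the seeing is done with the " + pp_object
    return "noun attachment: the man has the " + pp_object

print(attach_pp("see", "telescope"))  # verb attachment
print(attach_pp("see", "dog"))        # noun attachment
```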
Other areas of usage for ontologies within NLP include information retrieval, information extraction and text summarization.
The ontology generated for the PANGLOSS knowledge-based machine translation system in 1993 may serve as an example of how an ontology for NLP purposes can be compiled.
While no system provides the holy grail of fully automatic high-quality machine translation of unrestricted text, many fully automated systems produce reasonable output. The quality of machine translation is substantially improved if the domain is restricted and controlled.
Despite their inherent limitations, MT programs are used around the world. Probably the largest institutional user is the European Commission. One project coordinated by the University of Gothenburg, for example, received more than 2.375 million euros in EU project support to create a reliable translation tool covering a majority of the EU languages. The further development of MT systems comes at a time when budget cuts in human translation may increase the EU's dependency on reliable MT programs. The European Commission contributed 3.072 million euros (via its ISA programme) for the creation of MT@EC, a statistical machine translation program tailored to the administrative needs of the EU, to replace a previous rule-based machine translation system.
In 2005, Google claimed that promising results were obtained using a proprietary statistical machine translation engine. In tests conducted by the National Institute of Standards and Technology, the statistical translation engine used in the Google language tools for Arabic–English and Chinese–English had an overall BLEU-4 score of 0.4281, ahead of runner-up IBM's score of 0.3954 (Summer 2006).
With the recent focus on terrorism, military sources in the United States have been investing significant amounts of money in natural language engineering. In-Q-Tel (a venture capital fund, largely funded by the US Intelligence Community, to stimulate new technologies through private sector entrepreneurs) backed companies such as Language Weaver. Currently the military community is interested in the translation and processing of languages like Arabic, Pashto, and Dari. Within these languages, the focus is on key phrases and quick communication between military members and civilians through the use of mobile phone apps. The Information Processing Technology Office in DARPA hosts programs like TIDES and the Babylon translator. The U.S. Air Force has awarded a $1 million contract to develop a language translation technology.
The notable rise of social networking on the web in recent years has created yet another niche for the application of machine translation software – in utilities such as Facebook, or instant messaging clients such as Skype, GoogleTalk, MSN Messenger, etc. – allowing users speaking different languages to communicate with each other. Machine translation applications have also been released for most mobile devices, including mobile telephones, pocket PCs, PDAs, etc. Due to their portability, such instruments have come to be designated as mobile translation tools enabling mobile business networking between partners speaking different languages, or facilitating both foreign language learning and unaccompanied traveling to foreign countries without the need of the intermediation of a human translator.
Despite being labelled an unworthy competitor to human translation in 1966 by the Automated Language Processing Advisory Committee put together by the United States government, the quality of machine translation has now improved to such levels that its application in online collaboration and in the medical field is being investigated. The application of this technology in medical settings where human translators are absent is another topic of research, but difficulties arise due to the importance of accurate translations in medical diagnoses.
There are many factors that affect how machine translation systems are evaluated. These factors include the intended use of the translation, the nature of the machine translation software, and the nature of the translation process.
Different programs may work well for different purposes. For example, statistical machine translation (SMT) typically outperforms example-based machine translation (EBMT), but researchers found that when evaluating English to French translation, EBMT performs better. The same concept applies for technical documents, which can be more easily translated by SMT because of their formal language.
In certain applications, however, e.g., product descriptions written in a controlled language, a dictionary-based machine-translation system has produced satisfactory translations that require no human intervention save for quality inspection.
There are various means for evaluating the output quality of machine translation systems. The oldest is the use of human judges to assess a translation's quality. Even though human evaluation is time-consuming, it is still the most reliable method to compare different systems such as rule-based and statistical systems. Automated means of evaluation include BLEU, NIST, METEOR, and LEPOR.
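A minimal sketch of BLEU for a single reference, with clipped n-gram counts and the brevity penalty; real implementations add smoothing and careful tokenization omitted here:

```python
from collections import Counter
import math

# Minimal single-reference BLEU (n-grams up to 4, clipped counts,
# brevity penalty). A sketch, not a replacement for standard tooling.
def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        c_counts, r_counts = Counter(ngrams(cand, n)), Counter(ngrams(ref, n))
        # Clip each candidate n-gram count by its count in the reference.
        clipped = sum(min(c, r_counts[g]) for g, c in c_counts.items())
        total = max(1, len(cand) - n + 1)
        if clipped == 0:
            return 0.0  # unsmoothed BLEU is zero if any precision is zero
        log_precisions.append(math.log(clipped / total))
    # The brevity penalty punishes candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(1, len(cand)))
    return bp * math.exp(sum(log_precisions) / max_n)

print(round(bleu("the cat sat on the mat", "the cat sat on the mat"), 3))  # 1.0
print(round(bleu("the cat sat on mat", "the cat sat on the mat"), 3))      # ~0.579
```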
Relying exclusively on unedited machine translation ignores the fact that communication in human language is context-embedded and that it takes a person to comprehend the context of the original text with a reasonable degree of probability. It is certainly true that even purely human-generated translations are prone to error. Therefore, to ensure that a machine-generated translation will be useful to a human being and that publishable-quality translation is achieved, such translations must be reviewed and edited by a human. The late Claude Piron wrote that machine translation, at its best, automates the easier part of a translator's job; the harder and more time-consuming part usually involves doing extensive research to resolve ambiguities in the source text, which the grammatical and lexical exigencies of the target language require to be resolved. Such research is a necessary prelude to the pre-editing necessary in order to provide input for machine-translation software such that the output will not be meaningless.
In addition to disambiguation problems, decreased accuracy can occur due to varying levels of training data for machine translating programs. Both example-based and statistical machine translation rely on a vast array of real example sentences as a base for translation, and when too many or too few sentences are analyzed accuracy is jeopardized. Researchers found that when a program is trained on 203,529 sentence pairings, accuracy actually decreases. The optimal level of training data seems to be just over 100,000 sentences, possibly because as training data increases, the number of possible sentences increases, making it harder to find an exact translation match.
Although there have been concerns about machine translation's accuracy, Dr. Ana Nino of the University of Manchester has researched some of the advantages of utilizing machine translation in the classroom. One such pedagogical method is called using "MT as a Bad Model." MT as a Bad Model forces the language learner to identify inconsistencies or incorrect aspects of a translation; in turn, the individual will (hopefully) possess a better grasp of the language. Dr. Nino notes that this teaching tool was implemented in the late 1980s. At the end of various semesters, Dr. Nino was able to obtain survey results from students who had used MT as a Bad Model (as well as other models). Overwhelmingly, students felt that they had observed improved comprehension, lexical retrieval, and increased confidence in their target language.
In the early 2000s, options for machine translation between spoken and signed languages were severely limited. It was a common belief that deaf individuals could use traditional translators. However, stress, intonation, pitch, and timing are conveyed much differently in spoken languages compared to signed languages. Therefore, a deaf individual may misinterpret or become confused about the meaning of written text that is based on a spoken language.
Researchers Zhao et al. (2000) developed a prototype called TEAM (translation from English to ASL by machine) that completed English to American Sign Language (ASL) translations. The program would first analyze the syntactic, grammatical, and morphological aspects of the English text. Following this step, the program accessed a sign synthesizer, which acted as a dictionary for ASL. This synthesizer housed the process one must follow to complete ASL signs, as well as the meanings of these signs. Once the entire text was analyzed and the signs necessary to complete the translation were located in the synthesizer, a computer-generated human appeared and used ASL to sign the English text to the user.
Only works that are original are subject to copyright protection, so some scholars claim that machine translation results are not entitled to copyright protection because MT does not involve creativity. The copyright at issue is for a derivative work; the author of the original work in the original language does not lose his rights when a work is translated: a translator must have permission to publish a translation.
|
https://en.wikipedia.org/wiki?curid=19980
|
Central moment
In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, it is the expected value of a specified integer power of the deviation of the random variable from the mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterized. Central moments are used in preference to ordinary moments, computed in terms of deviations from the mean instead of from zero, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location.
Sets of central moments can be defined for both univariate and multivariate distributions.
The "n"th moment about the mean (or "n"th central moment) of a real-valued random variable "X" is the quantity "μ""n" := E[("X" − E["X"])"n"], where E is the expectation operator. For a continuous univariate probability distribution with probability density function "f"("x"), the "n"th moment about the mean "μ" is
For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.
The first few central moments have intuitive interpretations: the zeroth central moment μ_0 is 1; the first central moment μ_1 is 0 (not to be confused with the first raw moment, the mean); the second central moment μ_2 is the variance; and the third and fourth central moments are used to define the standardized moments known as skewness and kurtosis.
The "n"th central moment is translation-invariant, i.e. for any random variable "X" and any constant "c", we have
For all "n", the "n"th central moment is homogeneous of degree "n":
"Only" for "n" such that n equals 1, 2, or 3 do we have an additivity property for random variables "X" and "Y" that are independent:
A related functional that shares the translation-invariance and homogeneity properties with the "n"th central moment, but continues to have this additivity property even when "n" ≥ 4 is the "n"th cumulant κ"n"("X"). For "n" = 1, the "n"th cumulant is just the expected value; for "n" = either 2 or 3, the "n"th cumulant is just the "n"th central moment; for "n" ≥ 4, the "n"th cumulant is an "n"th-degree monic polynomial in the first "n" moments (about zero), and is also a (simpler) "n"th-degree polynomial in the first "n" central moments.
Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the "n"th-order moment about the origin to the moment about the mean is

μ_n = E[(X − E[X])^n] = Σ_{j=0}^{n} C(n, j) (−1)^{n−j} μ'_j μ^{n−j},

where C(n, j) denotes the binomial coefficient, "μ" is the mean of the distribution, and the moment about the origin is given by

μ'_j = E[X^j] = ∫ x^j f(x) dx.

For the cases "n" = 2, 3, 4 — which are of most interest because of the relations to variance, skewness, and kurtosis, respectively — this formula becomes (noting that μ'_0 = 1 and μ'_1 = μ):

μ_2 = μ'_2 − μ^2
μ_3 = μ'_3 − 3μ μ'_2 + 2μ^3
μ_4 = μ'_4 − 4μ μ'_3 + 6μ^2 μ'_2 − 3μ^4

... and so on, following Pascal's triangle; for "n" = 5,

μ_5 = μ'_5 − 5μ μ'_4 + 10μ^2 μ'_3 − 10μ^3 μ'_2 + 4μ^5,

because the last two terms of the binomial expansion combine: 5μ^4 μ'_1 − μ^5 = 5μ^5 − μ^5 = 4μ^5.
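These identities are easy to verify numerically. The following sketch (an addition for illustration, using NumPy) checks the "n" = 2, 3, 4 conversions and the translation-invariance property on a small sample:

```python
import numpy as np

# Sanity check of the moment conversions on a small sample: central
# moments computed directly must match the formulas built from raw moments.
x = np.array([1.0, 2.0, 2.0, 3.0, 7.0])
mu = x.mean()
raw = lambda n: np.mean(x**n)             # n-th moment about the origin
central = lambda n: np.mean((x - mu)**n)  # n-th moment about the mean

assert np.isclose(central(2), raw(2) - mu**2)
assert np.isclose(central(3), raw(3) - 3*mu*raw(2) + 2*mu**3)
assert np.isclose(central(4), raw(4) - 4*mu*raw(3) + 6*mu**2*raw(2) - 3*mu**4)

# Translation invariance: shifting the data leaves central moments unchanged.
assert np.isclose(central(3), np.mean(((x + 10) - (mu + 10))**3))
print("conversion formulas verified")
```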
The following sum is a stochastic variable having a compound distribution

W = Y_1 + Y_2 + … + Y_N,

where the Y_i are mutually independent random variables sharing the same common distribution and N is a random integer variable independent of the Y_i with its own distribution. The moments of W are obtained as

E[W^n] = Σ_{i=0}^{n} E[C(N, i)] Σ_{j=0}^{i} C(i, j) (−1)^{i−j} E[(Y_1 + … + Y_j)^n],

where E[(Y_1 + … + Y_j)^n] is defined as zero for j = 0.
In a symmetric distribution (one that is unaffected by being reflected about its mean), all odd central moments equal zero, because in the formula for the "n"th moment, each term involving a value of "X" less than the mean by a certain amount exactly cancels out the term involving a value of "X" greater than the mean by the same amount.
For a continuous bivariate probability distribution with probability density function "f"("x","y"), the ("j","k") moment about the mean "μ" = ("μ""X", "μ""Y") is

μ_{j,k} = E[(X − μ_X)^j (Y − μ_Y)^k] = ∫∫ (x − μ_X)^j (y − μ_Y)^k f(x, y) dx dy.
|
https://en.wikipedia.org/wiki?curid=19983
|
Murad I
Murad I (nicknamed Hüdavendigâr, from Persian خداوندگار, "Khodāvandgār", "the devotee of God", but meaning "sovereign" in this context; 29 June 1326 – 15 June 1389) was the Ottoman Sultan from 1362 to 1389. He was a son of Orhan Gazi and Nilüfer Hatun.
Murad I conquered Adrianople, renamed it to Edirne, and in 1363 made it the new capital of the Ottoman Sultanate. Then he further expanded the Ottoman realm in Southeast Europe by bringing most of the Balkans under Ottoman rule, and forced the princes of northern Serbia and Bulgaria as well as the Byzantine emperor John V Palaiologos to pay him tribute. Murad I administratively divided his sultanate into the two provinces of Anatolia (Asia Minor) and Rumelia (the Balkans).
Murad fought against the powerful beylik of Karaman in Anatolia and against the Serbs, Albanians, Bulgarians and Hungarians in Europe. In particular, a Serb expedition to expel the Turks from Adrianople, led by the Serbian brothers King Vukašin and Despot Uglješa, was defeated on September 26, 1371, by Murad's capable second lieutenant Lala Şâhin Paşa, the first governor ("beylerbey") of Rumeli. In 1385, Sofia fell to the Ottomans. In 1386, Prince Lazar Hrebeljanović defeated an Ottoman force at the Battle of Pločnik. The Ottoman army suffered heavy casualties and was unable to capture Niš on the way back.
While Murad I was away from his capital, his son Savcı Bey, whom he had left behind as deputy to the throne, rose in revolt. Savcı Bey acted in concert with Andronikos IV Palaiologos, the eldest son of the Byzantine emperor John V Palaiologos and one of the rival claimants in the Byzantine civil war of 1373–1379; Murad I supported John V, while Andronikos conspired against his father and declared himself emperor, and John V's younger son Manuel (later Manuel II) remained loyal. Taking advantage of his father's absence, Savcı Bey, then reportedly only fourteen years old, declared himself ruler in Rumelia in his father's place and had the Friday sermon read in his own name. Murad I crossed into Rumelia with the Ottoman forces under his command and met the combined troops of Savcı Bey and the Byzantine usurper Andronikos at a place near Constantinople called Apikridion, where he routed them. Savcı Bey fled to Dimetoka (Didymoteicho) and was arrested there. Deeply affected by his son's rebellion, Murad I had him blinded; Feridun Bey's "Münşeat" records the punishment as "nur-ı basır mechur" (deprived of the light of sight). The same punishment was demanded of the Byzantine emperor John V for his own rebellious son, but historians report that the emperor applied it more lightly, pouring hot vinegar into Andronikos's eyes and leaving him only half-blind. Unable to master his anger, Murad I later had Savcı Bey strangled and executed in Bursa, and so the tragic affair ended in the city where it had begun.
In 1389, Murad's army defeated the Serbian Army and its allies under the leadership of Lazar at the Battle of Kosovo.
There are different accounts from different sources about when and how Murad I was assassinated. The contemporary sources mainly noted that the battle took place and that both Prince Lazar and the Sultan lost their lives in it. The additional stories and speculations about how Murad I died were disseminated and recorded in the 15th century and later, decades after the actual event. One Western source states that during the first hours of the battle, Murad I was stabbed to death by the Serbian nobleman and knight Miloš Obilić. Most Ottoman chroniclers (including Dimitrie Cantemir) state that he was assassinated after the battle had ended, while touring the battlefield. His older son Bayezid, who was in charge of the left wing of the Ottoman forces, took charge after that. His other son, Yakub Bey, who was in charge of the other wing, was summoned to the Sultan's command tent by Bayezid, but when Yakub Bey arrived he was strangled, leaving Bayezid as the sole claimant to the throne.
In a letter from the Florentine senate (written by Coluccio Salutati) to King Tvrtko I of Bosnia, dated 20 October 1389, the killing of Murad I (and of Yakub Bey) was described: a party of twelve Serbian lords slashed their way through the Ottoman lines defending Murad I, and one of them, allegedly Miloš Obilić, managed to get through to the Sultan's tent and kill him with sword stabs to the throat and belly.
Sultan Murad's internal organs were buried in Kosovo field, where they remain to this day in a corner of the battlefield at a location called Meshed-i Hudavendigar, which has acquired religious significance for local Muslims. The site was vandalized between 1999 and 2006 and has since been renovated. His other remains were carried to Bursa, his Anatolian capital city, and were buried in a tomb at the complex built in his name.
He established the sultanate by building up a society and government in the newly conquered city of Adrianople (Edirne in Turkish) and by expanding the realm in Europe, bringing most of the Balkans under Ottoman rule and forcing the Byzantine emperor to pay him tribute. It was Murad who transformed the emirate of the Osmanlı into a sultanate: he assumed the title of sultan in 1383 and established the corps of the "janissaries" and the "devşirme" recruiting system. He also organised the government of the "Divan", the system of timars and timar-holders (timariots), and the office of the military judge, the "kazasker", and established the two provinces of Anadolu (Anatolia) and Rumeli (Europe).
He was the son of Orhan and the Valide Hatun Nilüfer Hatun, daughter of the Prince of Yarhisar, who was of ethnic Greek descent.
|
https://en.wikipedia.org/wiki?curid=19986
|
Mehmed I
Mehmed I (1389 – 26 May 1421), also known as Mehmed Çelebi (, "the noble-born") or Kirişçi (from Greek "Kyritzes", "lord's son"), was the Ottoman Sultan from 1413 to 1421. The fourth son of Sultan Bayezid I and Devlet Hatun, he fought with his brothers over control of the Ottoman realm in the Ottoman Interregnum (1402–1413). Starting from the province of Rûm he managed to bring first Anatolia and then the European territories (Rumelia) under his control, reuniting the Ottoman state by 1413, and ruling it until his death in 1421.
Mehmed was born in 1389 as the fourth son of Sultan Bayezid I () and one of his consorts, the slave girl Devlet Hatun. Following Ottoman custom, when he reached adolescence in 1399, he was sent to gain experience as provincial governor over the Rûm Eyalet (central northern Anatolia), recently conquered from its Eretnid rulers.
On 20 July 1402, his father Bayezid was defeated in the Battle of Ankara by the Turko-Mongol conqueror and ruler Timur. The brothers (with the exception of Mustafa, who was captured and taken along with Bayezid to Samarkand) were rescued from the battlefield, Mehmed being saved by Bayezid Pasha, who took him to his hometown of Amasya. Mehmed later made Bayezid Pasha his grand vizier (1413–1421).
The early Ottoman Empire had no regulated succession, and according to Turkish tradition, every son could succeed his father. Of Mehmed's brothers, the eldest, Ertuğrul, had died in 1400, while the next in line, Mustafa, was a prisoner of Timur. Leaving aside the underage siblings, this left four princes—Mehmed, Süleyman, İsa, and Musa—to contend over control of the remaining Ottoman territories in the civil war known as the "Ottoman Interregnum". In modern historiography, these princes are usually called by the title "Çelebi", but in contemporary sources, the title is reserved for Mehmed and Musa. The Byzantine sources translated the title as "Kyritzes" (Κυριτζής), which was in turn adopted into Turkish as "kirişçi", sometimes misinterpreted as "güreşçi", "the wrestler".
After winning the Interregnum, Mehmed crowned himself sultan in the Thracian city of Edirne that lay in the European part of the empire (the area dividing the Anatolian and European sides of the empire, Constantinople and the surrounding region, was still held by the Byzantine Empire), becoming Mehmed I. He consolidated his power, made Edirne the most important of the dual capitals, and conquered parts of Albania, the Jandarid emirate, and the Armenian Kingdom of Cilicia from the Mamelukes. Taking his many achievements into consideration, Mehmed is widely known as the "second founder" of the Ottoman Sultanate.
Soon after Mehmed began his reign, his brother Mustafa Çelebi, who had originally been captured along with their father Bayezid I during the Battle of Ankara and held captive in Samarkand, hiding in Anatolia during the Interregnum, reemerged and asked Mehmed to partition the empire with him. Mehmed refused and met Mustafa's forces in battle, easily defeating them. Mustafa escaped to the Byzantine city of Thessaloniki, but after an agreement with Mehmed, the Byzantine emperor Manuel II Palaiologos exiled Mustafa to the island of Lemnos.
However, Mehmed still faced problems, the first being that of his nephew Orhan, whom Mehmed perceived as a threat to his rule, much as his late brothers had been. There was allegedly a plot involving him by Manuel II Palaiologos, who tried to use Orhan against Sultan Mehmed; however, the sultan found out about the plot and had Orhan blinded for betrayal, according to a common Byzantine practice.
Furthermore, as a result of the Battle of Ankara and other civil wars, the population of the empire had become unstable and traumatized. A very powerful social and religious movement arose in the empire and became disruptive. The movement was led by Sheikh Bedreddin (1359–1420), a famous Muslim Sufi and charismatic theologian. He was an eminent member of the ulema, born of a Greek mother and a Muslim father in Simavna (Kyprinos), southwest of Edirne (formerly Adrianople). Mehmed's brother Musa had made Bedreddin his "qadi of the army", or supreme judge. Bedreddin created a populist religious movement in the Ottoman Sultanate with "subversive conclusions promoting the suppression of social differences between rich and poor as well as the barriers between different forms of monotheism." Successfully developing a popular social revolution and a syncretism of the various religions and sects of the empire, Bedreddin's movement began on the European side of the empire and expanded into western Anatolia.
In 1416, Sheikh Bedreddin started his rebellion against the throne. After a four-year struggle, he was finally captured by Mehmed's grand vizier Bayezid Pasha and hanged in Serres, a city in modern-day Greece, in 1420.
The reign of Mehmed I as sultan of the reunited empire lasted only eight years before his death. But for nearly the whole preceding eleven-year period of the Ottoman Interregnum, between his father's captivity at Ankara and his own final victory over his brother Musa Çelebi at the Battle of Çamurlu, he had been the most powerful of the brothers contending for the throne and the de facto ruler of most of the empire.
He was buried in Bursa, in a mausoleum he had erected near the celebrated mosque he built there, which, because of its decorations of green glazed tiles, is called the Green Mosque. Mehmed I also completed another mosque in Bursa, which his grandfather Murad I had commenced but which had been neglected during the reign of Bayezid. In the vicinity of his own Green Mosque and mausoleum, Mehmed founded two other characteristic institutions, a school and a refectory for the poor, both of which he endowed with royal munificence.
|
https://en.wikipedia.org/wiki?curid=19987
|
Murad II
Murad II (June 1404 – 1451) was the Sultan of the Ottoman Empire from 1421 to 1444 and again from 1446 to 1451. His reign was marked by the long war he fought against the Christian feudal lords of the Balkans and the Turkish beyliks in Anatolia, a conflict that lasted 25 years. He was brought up in Amasya, and ascended the throne on the death of his father Mehmed I.
Murad was born in June 1404 (or 1403) to Sultan Mehmed I. The identity of his mother is disputed. According to the 15th-century historian Şükrullah, Murad's mother was a concubine. Hüseyin Hüsâmeddin Yasar, an early 20th-century historian, wrote in his work "Amasya Tarihi" that his mother was Şehzade Hatun, daughter of Divitdar Ahmed Pasha. According to the historians İsmail Hami Danişmend and Heath W. Lowry, his mother was Emine Hatun, daughter of Şaban Suli Bey, ruler of the Dulkadirids.
He spent his early childhood in Amasya. In 1410, Murad came along with his father to the Ottoman capital, Edirne. After his father ascended to the Ottoman throne, he made Murad governor of the Amasya Sanjak. Murad remained at Amasya until the death of Mehmed I in 1421. He was solemnly recognized as sultan of the Ottoman Sultanate at sixteen years of age, girded with the sabre of Osman at Bursa, and the troops and officers of the state willingly paid homage to him as their sovereign.
Murad's reign was troubled by insurrection early on. The Byzantine Emperor, Manuel II, released the 'pretender' Mustafa Çelebi (known as Düzmece Mustafa) from confinement and acknowledged him as the legitimate heir to the throne of Bayezid I (1389–1402). The Byzantine Emperor had first secured a stipulation that Mustafa should, if successful, repay him for his liberation by giving up a large number of important cities. The pretender was landed by Byzantine galleys in the European dominion of the sultan and for a time made rapid progress. Many Turkish soldiers joined him, and he defeated and killed the veteran general Beyazid Pasha, whom Murad had sent to fight him. Mustafa defeated Murad's army and declared himself sultan at Adrianople (modern Edirne). He then crossed the Dardanelles to Asia with a large army, but Murad outmanoeuvred him, and Mustafa's force passed over in large numbers to Murad II. Mustafa took refuge in the city of Gallipoli, but the sultan, who was greatly aided by a Genoese commander named Adorno, besieged him there and stormed the place. Mustafa was taken and put to death by the sultan, who then turned his arms against the Byzantine emperor and declared his resolution to punish the Palaiologos for their unprovoked enmity by the capture of Constantinople.
Murad II then formed a new army called Azap in 1421 and marched through the Byzantine Empire and laid siege to Constantinople. While Murad was besieging the city, the Byzantines, in league with some independent Turkish Anatolian states, sent the sultan's younger brother Küçük Mustafa (who was only 13 years old) to rebel against the sultan and besiege Bursa. Murad had to abandon the siege of Constantinople in order to deal with his rebellious brother. He caught Prince Mustafa and executed him. The Anatolian states that had been constantly plotting against him — Aydinids, Germiyanids, Menteshe and Teke — were annexed and henceforth became part of the Ottoman Sultanate.
Murad II then declared war against Venice, the Karamanid Emirate, Serbia and Hungary. The Karamanids were defeated in 1428 and Venice withdrew in 1432 following the defeat at the second Siege of Thessalonica in 1430. In the 1430s Murad captured vast territories in the Balkans and succeeded in annexing Serbia in 1439. In 1441 the Holy Roman Empire and Poland joined the Serbian-Hungarian coalition. Murad II won the Battle of Varna in 1444 against John Hunyadi.
Murad II relinquished his throne in 1444 to his son Mehmed II, but a Janissary revolt in the Empire forced him to return.
In 1448 he defeated the Christian coalition at the Second Battle of Kosovo (the first one took place in 1389). When the Balkan front was secured, Murad II turned east to defeat Timur's son, Shah Rokh, and the emirates of Karamanid and Çorum-Amasya. In 1450 Murad II led his army into Albania and unsuccessfully besieged the Castle of Kruje in an effort to defeat the resistance led by Skanderbeg. In the winter of 1450–1451, Murad II fell ill, and died in Edirne. He was succeeded by his son Mehmed II (1451–81).
When Murad ascended to the throne, he sought to regain the lost Ottoman territories that had reverted to autonomy following his grandfather Bayezid I's defeat at the Battle of Ankara in 1402 at the hands of Timur Lang. He needed the support of both the public and the nobles "who would enable him to exercise his rule", and utilized the old and potent Islamic trope of the Ghazi King.
In order to gain popular, international support for his conquests, Murad II modeled himself after the legendary Ghazi kings of old. The Ottomans already presented themselves as ghazis, painting their origins as rising from the ghazas of Osman, the founder of the dynasty. For them, ghaza was the noble championing of Islam and justice against non-Muslims and Muslims alike, if they were cruel; for example, Bayezid I labeled Timur Lang, also a Muslim, an apostate prior to the Battle of Ankara because of the violence his troops had committed upon innocent civilians and because “all you do is to break promises and vows, shed blood, and violate the honor of women.” Murad II only had to capitalize on this dynastic inheritance of doing ghaza, which he did by actively crafting the public image of Ghazi Sultan.
After his accession, there was a flurry of translating and compiling activity where old Persian, Arab, and Anatolian epics were translated into Turkish so Murad II could uncover the ghazi king legends. He drew from the noble behavior of the nameless Caliphs in the "Battalname", an epic about a fictional Arab warrior who fought against the Byzantines, and modelled his actions on theirs. He was careful to embody the simplicity, piety, and noble sense of justice that was part of the Ghazi King persona.
For example, when the Caliph in "Battalname" saw the battle turning in his enemy's favor, he got down from his horse and prayed, after which the battle ended in a victory for him. At the Battle of Varna in 1444, Murad II saw the Hungarians gaining the upper hand, and he got down from his horse and prayed just like the Caliph; soon after, the tide turned in the Ottomans' favor and the Hungarian king Wladyslaw was killed. Similarly, the Caliph in the epic roused his warriors by saying "Those of you who die will be martyrs. Those of you who kill will be ghazis"; before the Battle of Varna, Murad II repeated these words to his army, saying "Those of us who kill will be ghazis; those of us who die will be martyrs." In another instance, since the Ghazi King is meant to be just and fair, when Murad took Thessalonica in the Balkans, he took care to keep the troops in check and prevented widespread looting. Finally, just as the fictional Caliph's ghazas were immortalized in "Battalname", Murad II's battles and victories were also compiled and given the title "The Ghazas of Sultan Murad" ("Gazavat-ı Sultan Murad").
Murad II successfully painted himself as a simple soldier who did not partake in royal excesses, and as a noble ghazi sultan who sought to consolidate Muslim power against non-Muslims such as the Venetians and Hungarians. Through this self-presentation, he won support, for both himself and his extensive, expensive campaigns, not only from the Muslim population of the Ottoman territories but also from the greater Muslim populations of the Dar al-Islam, such as the Mamluks and the Muslim Delhi Sultanates of India. Murad II was, in essence, presenting himself not only as "a ghazi king who fights caffres [nonmuslims], but also serves as protector and master of lesser ghazis."
Murad II had four known wives.
Murad II is portrayed by İlker Kurt in the 2012 film "Fetih 1453". He was also portrayed by Vahram Papazian in the 1953 Albanian movie "The Great Warrior Skanderbeg".
|
https://en.wikipedia.org/wiki?curid=19988
|
Mustafa I
Mustafa I (1591 – 20 January 1639), called Mustafa the Saint (Veli Mustafa) during his second reign and often called Mustafa the Mad (Deli Mustafa) by modern historians, was the son of Mehmed III and was the Sultan of the Ottoman Empire from 1617 to 1618 and from 1622 to 1623.
Mustafa was born in the Manisa Palace, as the younger brother of Ahmed I (1603–1617). His mother was Halime Sultan, an Abkhazian lady.
Before 1603 it was customary for an Ottoman Sultan to have his brothers executed shortly after he gained the throne (Mustafa's father Mehmed III had executed 19 of his own brothers). But when the thirteen-year-old Ahmed I was enthroned in 1603, he spared the life of the twelve-year-old Mustafa.
A factor in Mustafa's survival was the influence of Kösem Sultan (Ahmed's favorite consort), who may have wished to preempt the succession of Osman, Ahmed's first-born son from another concubine. If Osman became Sultan, he would likely try to execute his half-brothers, the sons of Ahmed and Kösem. (This scenario later became a reality when Osman II executed his brother Mehmed in 1621.) However, the reports of foreign ambassadors suggest that Ahmed actually liked his brother.
Until Ahmed's death in 1617, Mustafa lived in the Old Palace, along with his mother and his grandmother Safiye Sultan.
Ahmed's death created a dilemma never before experienced by the Ottoman Empire. Multiple princes were now eligible for the Sultanate, and all of them lived in Topkapı Palace. A court faction headed by the Şeyhülislam Esad Efendi and Sofu Mehmed Pasha (who represented the Grand Vizier when he was away from Constantinople) decided to enthrone Mustafa instead of Ahmed's son Osman. Sofu Mehmed argued that Osman was too young to be enthroned without causing adverse comment among the populace. The Chief Black Eunuch Mustafa Agha objected, citing Mustafa's mental problems, but he was overruled. Mustafa's rise created a new succession principle of seniority that would last until the end of the Empire. It was the first time an Ottoman Sultan was succeeded by his brother instead of his son. His mother Halime Sultan became Valide Sultan and, owing to Mustafa's mental condition, acted as regent, exercising power directly and wielding great influence.
It was hoped that regular social contact would improve Mustafa's mental health, but his behavior remained eccentric. He pulled off the turbans of his viziers and yanked their beards. Others observed him throwing coins to birds and fish. The Ottoman historian İbrahim Peçevi wrote "this situation was seen by all men of state and the people, and they understood that he was psychologically disturbed."
Mustafa was never more than a tool of court cliques at the Topkapı Palace. In 1618, after a short rule, another palace faction deposed him in favour of his young nephew Osman II (1618–1622), and Mustafa was sent back to the Old Palace. The conflict between the Janissaries and Osman II presented him with a second chance. After a Janissary rebellion led to the deposition and assassination of Osman II in 1622, Mustafa was restored to the throne and held it for another year.
Nevertheless, according to Baki Tezcan, there is not enough evidence to properly establish that Mustafa was mentally imbalanced when he came to the throne. Mustafa "made a number of excursions to the arsenal and the navy docks, examining various sorts of arms and taking an active interest in the munitions supply of the army and the navy." One of the dispatches of Baron de Sancy, the French ambassador, "suggested that Mustafa was interested in leading the Safavid campaign himself and was entertaining the idea of wintering in Konya for that purpose."
Moreover, one contemporary observer provides an explanation of the coup which does not mention the incapacity of Mustafa. Baron de Sancy ascribes the deposition to a political conspiracy between the grand admiral Ali Pasha and Chief Black Eunuch Mustafa Agha, who were angered by the former's removal from office upon Sultan Mustafa's accession. They may have circulated rumors of the sultan's mental instability subsequent to the coup in order to legitimize it.
Mustafa commenced his second reign by executing all those who had taken any share in the murder of Sultan Osman. Hoca Ömer Efendi, the chief of the rebels, the kızlar agha Suleiman Agha, the vizier Dilaver Pasha, the kaim-makam Ahmed Pasha, the defterdar Baki Pasha, the segban-bashi Nasuh Agha, and the general of the janissaries Ali Agha were cut into pieces.
The epithet "Veli" (meaning "saint") was used in reference to him during his reign.
His mental condition unimproved, Mustafa was a puppet controlled by his mother and brother-in-law, the grand vizier Kara Davud Pasha. He believed that Osman II was still alive and was seen searching for him throughout the palace, knocking on doors and crying out to his nephew to relieve him from the burden of sovereignty. "The present emperor being a fool" (according to English Ambassador Sir Thomas Roe), he was compared unfavorably with his predecessor. In fact, it was his mother Halime Sultan who, as Valide Sultan, acted as de facto co-ruler of the Ottoman Empire.
Political instability was generated by conflict between the Janissaries and the sipahis (Ottoman cavalry), followed by the Abaza rebellion, which occurred when the governor-general of Erzurum, Abaza Mehmed Pasha, decided to march to Istanbul to avenge the murder of Osman II. The regime tried to end the conflict by executing Kara Davud Pasha, but Abaza Mehmed continued his advance. Clerics and the new Grand Vizier (Kemankeş Kara Ali Pasha) prevailed upon Mustafa's mother to allow the deposition of her son. She agreed, on condition that Mustafa's life would be spared.
The 11-year-old Murad IV, son of Ahmed I and Kösem, was enthroned on 10 September 1623. In return for her consent to his deposition, the request of Mustafa's mother that he be spared execution was granted. Mustafa was sent along with his mother to the Eski (old) Palace.
One source states that Mustafa was executed on the orders of his nephew, Sultan Murad IV, on 20 January 1639, to prevent power from being given through him to Murad's mother Kösem Sultan. Another source states that he died of epilepsy brought on by having been imprisoned for 34 of his 48 years. He is buried in the courtyard of the Hagia Sophia.
|
https://en.wikipedia.org/wiki?curid=19991
|
Murad IV
Murad IV or Amurath IV ("Murād-ı Rābiʿ"; 27 July 1612 – 8 February 1640) was the Sultan of the Ottoman Empire from 1623 to 1640, known both for restoring the authority of the state and for the brutality of his methods. Murad IV was born in Constantinople, the son of Sultan Ahmed I (r. 1603–17) and Kösem Sultan. He was brought to power by a palace conspiracy in 1623, and he succeeded his uncle Mustafa I (r. 1617–18, 1622–23). He was only 11 when he ascended the throne. His reign is most notable for the Ottoman–Safavid War (1623–39), whose outcome permanently divided the Caucasus between the two imperial powers for around two centuries and roughly laid the foundation for the current Turkey–Iran–Iraq borders.
Murad IV was born on 27 July 1612 to Ahmed I (reigned 1603–1617) and his consort and later wife Kösem Sultan. After his father's death, when he was six years old, he was confined in the Kafes with his brothers Suleiman, Kasim, Bayezid, and Ibrahim.
Grand Vizier Kemankeş Ali Pasha and Şeyhülislam Yahya Efendi were removed from their posts. The next day the young sultan was taken to the Eyüp Sultan Mausoleum, where the swords of Muhammad and Yavuz Sultan Selim were girded on him. Five days later he was circumcised.
Murad IV was for a long time under the control of his relatives and during his early years as Sultan, his mother, Kösem Sultan, essentially ruled through him. The Empire fell into anarchy; the Safavid Empire invaded Iraq almost immediately, Northern Anatolia erupted in revolts, and in 1631 the Janissaries stormed the palace and killed the Grand Vizier, among others. Murad IV feared suffering the fate of his elder brother, Osman II (1618–22), and decided to assert his power.
At the age of 16 in 1628, he had his brother-in-law (his sister Fatma Sultan's husband, who was also the former governor of Egypt), Kara Mustafa Pasha, executed for a claimed action "against the law of God".
After Grand Vizier Çerkes Mehmed Pasha died at Tokat during the winter, the beylerbey of Diyarbekir, Hafız Ahmed Pasha, became grand vizier on 8 February 1625.
An epidemic that began in the summer of 1625, known as the plague of Bayrampaşa, grew into a serious threat to the population of Istanbul; on average, a thousand people died every day. People went out to the Okmeydanı to escape the plague. The situation may have been even worse in the countryside, though little was recorded of conditions outside Istanbul.
Murad IV tried to quell the corruption that had grown during the reigns of previous Sultans, and that had not been checked while his mother was ruling through proxy.
Execution orders were issued to the provinces, and rebels brought to Istanbul were executed as Jelali, the Jelali being followers of Celali, the leader of a 1519 revolt in Tokat who led peasants against feudal exploitation. Murad IV's reputation as a fearsome and brutal sultan began with this crackdown.
Ilyas Pasha, who had taken advantage of the confusion in Istanbul to dominate the region around Manisa and Balıkesir, and who had the "Şehname" and "Timurname" read to him at night while harboring dreams of the sultanate, was finally captured, brought to Istanbul, and executed in front of the sultan.
Murad IV banned alcohol, tobacco, and coffee in Constantinople. He ordered execution for breaking this ban. He would reportedly patrol the streets and the lowest taverns of Constantinople in civilian clothes at night, policing the enforcement of his command by casting off his disguise on the spot and beheading the offender with his own hands. Rivaling the exploits of Selim the Grim, he would sit in a kiosk by the water near his Seraglio Palace and shoot arrows at any passerby or boatman who rowed too close to his imperial compound, seemingly for sport. He restored the judicial regulations by very strict punishments, including execution; he once had a grand vizier strangled because the official had beaten his mother-in-law.
On 2 September 1633, the great Cibali fire broke out, burning a fifth of the city. It started when a caulker's fire, lit to caulk a ship near the walls, spread out of control, and it advanced on the city in three branches. One arm descended toward the sea; another turned back from Zeyrek and moved toward Atpazarı; other branches ruined the Büyükkaraman, Küçükkaraman, Sultanmehmet (Fatih), Saraçhane, and Sarıgüzel districts. The sultan, his viziers, the Bostancı, and the Janissaries could do little but watch. Some of the most beautiful districts of Istanbul were ruined: the Yeniodalar and Mollagürani districts; the area from the Fener gate to the Sultanselim, Mesihpaşa, Bali Pasha and Lutfi Pasha mosques and the Şah-ı Huban Palace; and, from Unkapanı to Atpazarı, the Bostanzade houses and the Sofular Bazaar. The fire lasted for 30 hours and could be extinguished only after the wind died down.
Murad IV's reign is most notable for the Ottoman–Safavid War (1623–39) against Persia (today Iran), in which Ottoman forces managed to conquer Azerbaijan, occupying Tabriz and Hamadan, and capturing Baghdad in 1638. The Treaty of Zuhab that followed the war generally reconfirmed the borders as agreed by the Peace of Amasya, with Eastern Armenia, Eastern Georgia, Azerbaijan, and Dagestan staying Persian, while Western Armenia and Western Georgia stayed Ottoman. Mesopotamia was irrevocably lost for the Persians. The borders fixed as a result of the war are more or less the same as the present border line between Turkey, Iraq and Iran.
During the siege of Baghdad in 1638, the city held out for forty days but was compelled to surrender.
Murad IV himself commanded the Ottoman army in the last years of the war.
While he was encamped in Baghdad, Murad IV is known to have met ambassadors of the Mughal Emperor Shah Jahan, Mir Zarif and Mir Baraka, who presented 1000 pieces of finely embroidered cloth and even armor. Murad IV gave them the finest weapons, saddles and Kaftans and ordered his forces to accompany the Mughals to the port of Basra, where they set sail to Thatta and finally Surat.
Murad IV put emphasis on architecture and in his period many monuments were erected. The Baghdad Kiosk, built in 1635, and the Revan Kiosk, built in 1638 in Yerevan, were both built in the local styles. Some of the others include the Kavak Sarayı pavilion; the Meydanı Mosque; the Bayram Pasha Dervish Lodge, Tomb, Fountain, and Primary School; and the Şerafettin Mosque in Konya.
Murad IV wrote many poems, using the pen name "Muradi". He also liked testing people with riddles. Once he wrote a riddle in verse and announced that whoever came up with the correct answer would get a generous reward. Cihadi Bey, a poet from the Enderun School, gave the correct answer and was promoted.
Murad IV was also a composer; one of his compositions is called "Uzzal Peshrev".
Very little is known about the concubines of Murad IV, principally because he did not leave sons who survived his death to reach the throne, but many historians consider Ayşe Sultan as his only consort until the very end of Murad's seventeen-year reign, when a second Haseki appeared in the records. It is possible that Murad had only a single concubine until the advent of the second, or that he had a number of concubines but singled out only two as Haseki.
Murad had several daughters.
Murad IV died from cirrhosis in Constantinople at the age of 27 in 1640.
Rumours had circulated that on his deathbed, Murad IV ordered the execution of his mentally disabled brother, Ibrahim (reigned 1640–48), which would have meant the end of the Ottoman line. However, the order was not carried out.
In the TV series "Muhteşem Yüzyıl: Kösem", Murad IV is portrayed by Cağan Efe Ak as a child, and by Metin Akdülger as Sultan.
|
https://en.wikipedia.org/wiki?curid=19992
|
Masamune Shirow
Masanori Ota (born 23 November 1961), better known by his pen name Masamune Shirow, is a Japanese manga artist. Shirow is best known for the manga "Ghost in the Shell", which has since been turned into three theatrical anime movies, two anime television series, an anime television movie, an anime OVA series, a theatrical live action movie, and several video games.
Born in the Hyōgo Prefecture capital city of Kobe, he studied oil painting at Osaka University of Arts. While in college, he developed an interest in manga, which led him to create his own complete work, "Black Magic", which was published in the manga dōjinshi "Atlas". His work caught the eye of Seishinsha President Harumichi Aoki, who offered to publish him.
The result was best-selling manga "Appleseed", a full volume of densely plotted drama taking place in an ambiguous future. The story was a sensation, and won the 1986 Seiun Award for Best Manga. After a professional reprint of "Black Magic" and a second volume of "Appleseed", he released "Dominion" in 1986. Two more volumes of "Appleseed" followed before he began work on "Ghost in the Shell".
In 2007, he collaborated again with Production I.G to co-create the original concept for the anime television series "Shinreigari/Ghost Hound", Production I.G's 20th anniversary project. A further original collaboration with Production I.G began airing in April 2008, titled "Real Drive".
A substantial amount of Shirow's work has been released in art book or poster book format. The following is an incomplete list.
"Galgrease" (published in "Uppers Magazine", 2002) is the collected name of several erotic manga and poster books by Shirow. The name comes from the fact that the women depicted often look "greased".
The first series of "Galgrease" booklets included four issues in each of several settings.
The second series included another run of 12 booklets set in a further group of worlds.
After each regular series, there were one or more bonus poster books that revisited the existing characters and settings.
|
https://en.wikipedia.org/wiki?curid=19994
|
Musical saw
A musical saw, also called a singing saw, is a hand saw used as a musical instrument. Capable of continuous glissando (portamento), the saw produces an ethereal tone, very similar to the theremin. The musical saw is classified as a plaque friction idiophone with direct friction (132.22) under the Hornbostel-Sachs system of musical instrument classification.
The saw is generally played seated, with the handle squeezed between the legs and the far end held with one hand. Some sawists play standing, with the handle between the knees and the blade sticking out in front of them. The saw is usually played with the serrated edge, or "teeth", facing the body, though some players face them away. Some saw players file down the teeth, which makes no discernible difference to the sound. Many players, especially professionals, use a handle called a cheat at the tip of the saw for easier bending and higher virtuosity.
To sound a note, a sawist first bends the blade into an S-curve. The parts of the blade that are curved are damped from vibration, and do not sound. At the center of the S-curve a section of the blade remains relatively flat. This section, the "sweet spot", can vibrate across the width of the blade, producing a distinct pitch: the wider the section of blade, the lower the sound. Sound is usually created by drawing a bow across the back edge of the saw at the sweet spot, or sometimes by striking the sweet spot with a mallet.
The sawist controls the pitch by adjusting the S-curve, making the sweet spot travel up the blade (toward a thinner width) for a higher pitch, or toward the handle for a lower pitch. Harmonics can be created by playing at varying distances on either side of the sweet spot. Sawists can add vibrato by shaking one of their legs or by wobbling the hand that holds the tip of the blade. Once a sound is produced, it will sustain for quite a while, and can be carried through several notes of a phrase.
On occasion the musical saw is called for in orchestral music, but orchestral percussionists are seldom also sawists. If a note outside of the saw's range is called for, an electric guitar with a slide can be substituted.
Sawists often use standard wood-cutting saws, although special musical saws are also made. As compared with wood-cutting saws, the blades of musical saws are generally wider, for range, and longer, for finer control. They do not have set or sharpened teeth, and may have grain running parallel to the back edge of the saw, rather than parallel to the teeth. Some musical saws are made with thinner metal, to increase flexibility, while others are made thicker, for a richer tone, longer sustain, and stronger harmonics.
A typical musical saw is wider at the handle end than at the tip. Such a saw will generally produce about two octaves, regardless of length. A bass saw is wider at the handle and produces about two-and-a-half octaves. There are also musical saws with a range of 3–4 octaves, and new improvements have extended the range to as much as 5 octaves. Two-person saws, also called "misery whips", can also be played, though with less virtuosity, and they produce an octave or less of range.
Most sawists use cello or violin bows, using violin rosin, but some may use improvised home-made bows, such as a wooden dowel.
Musical saws have been produced for over a century, primarily in the United States, but also in Scandinavia, Germany, France (Lame sonore) and Asia.
In the early 1900s, there were at least ten companies in the United States manufacturing musical saws. These saws ranged from the familiar steel variety to gold-plated masterpieces worth hundreds of dollars. However, with the start of World War II the demand for metals made the manufacture of saws too expensive and many of these companies went out of business. By the year 2000, only three companies in the United States—Mussehl & Westphal, Charlie Blacklock, and Wentworth—were making saws. In 2012, a company called Index Drums started producing a saw that had a built-in transducer in the handle, called the "JackSaw".
Outside the United States, makers of musical saws include Bahco, makers of the limited edition Stradivarius, Alexis in France, Feldmann and Stövesandt in Germany, Music Blade in Greece and Thomas Flinn & Company in the United Kingdom, based in Sheffield, who produce three different sized musical saws, as well as accessories.
The International Musical Saw Association (IMSA) produces an annual International Musical Saw Festival (including a "Saw-Off" competition) every August in Santa Cruz and Felton, California. An International Musical Saw Festival is held every other summer in New York City, produced by Natalia Paruz. Paruz also produced a musical saw festival in Israel. There are also annual saw festivals in Japan and China.
A Guinness World Record for the largest musical-saw ensemble was established July 18, 2009, at the annual NYC Musical Saw Festival. Organized by Paruz, 53 musical saw players performed together.
In 2011 a world championship took place in Jelenia Góra, Poland. Gladys Hulot (France) won, with Katharina Micada (Germany) second and Tom Fink (Germany) third.
This is a list of people notable for playing the musical saw.
German actress and singer Marlene Dietrich, who lived and worked in the United States for a long time, is probably the best-known musical saw player. When she studied the violin for one year in Weimar in her early twenties, her musical skills were already evident. Some years later she learned to play the musical saw while she was shooting the film "Café Elektric" in Vienna in 1927. Her colleague, the Bavarian actor and musician Igo Sym, taught her how to play. In the shooting breaks and at weekends both performed romantic duets, he at the piano and she at the musical saw.
Sym gave his saw to her as a farewell gift. The following words are engraved on the saw: "Now Suidy is gone / the sun d’ont [sic!] / shine… / Igo / Vienna 1927"
She took the saw with her when she left for Hollywood in 1929, and played it there in the following years on film sets and at Hollywood parties.
When she participated in the United Service Organizations (USO) shows for the US troops in 1944, she also played the saw. Some of these shows were broadcast on radio, so there exist two rare recordings of her saw playing, embedded in entertaining interviews; one of the songs was "Aloha Oe".
Beginning from the early 1920s composers of both contemporary and popular music wrote for the musical saw.
Probably the first was Dmitri Shostakovich. He included the musical saw, e.g., in the film music for "The New Babylon" (1929), in "The Nose" (1928), and in "Lady Macbeth of the Mtsensk District" (1934).
Shostakovich and other composers of his time used the term "Flexaton" to denote the musical saw. "Flexaton" just means "to flex a tone": the saw is flexed to change the pitch. Unfortunately, there exists another instrument called the Flexatone, so there has been confusion for a long time. Aram Khachaturian, who knew Shostakovich's music, included the musical saw in the second movement of his Piano Concerto (1936). Another composer was the Swiss Arthur Honegger, who included the saw in his opera "Antigone" in 1924.
The Romanian composer George Enescu used the musical saw at the end of the second act of his opera "Œdipe" (1931) to show in an extensive glissando—which begins with the mezzo-soprano and is continued by the saw—the death and ascension of the sphinx killed by Oedipus.
The Italian composer Giacinto Scelsi wrote a part for the saw in his quarter-tone piece "Quattro pezzi per orchestra" (1959). The German composer Hans Werner Henze used the saw to characterize the cruel hero of his tragic opera "Elegy for Young Lovers" (1961).
Other composers were Krzysztof Penderecki with "Fluorescences" (1961), "De natura sonoris Nr. 2" (1971) and the opera "Ubu Rex" (1990); Bernd Alois Zimmermann with "Stille und Umkehr" (1970); George Crumb with "Ancient Voices of Children" (1970); and John Corigliano with "The Mannheim Rocket" (2001).
Composer Scott Munson wrote "Clover Hill" (2007) for saw and orchestra, a Quintet for saw and strings (2009), "The World Is Too Much with Us" for soprano, saw and strings (2009), "Ars longa vitas brevis" for saw and string quartet (2010), "Bend" for saw and string quartet (2011), many pieces for jazz band and saw (2010–2013), "Lullaby for the Forgotten" for saw and piano (2015), and many movie and theater scores containing the saw.
Chaya Czernowin used the saw in her opera "PNIMA...Ins Innere" (2000) to represent the character of the grandfather, who is traumatized by the Holocaust.
Further composers include Leif Segerstam, Hans Zender (orchestration of "5 préludes" by Claude Debussy), Franz Schreker (opera "Christophorus"), and Oscar Strasnoy (opera "Le bal").
Russian composer Lera Auerbach wrote for the saw in her ballet "The Little Mermaid" (2005), in her symphonic poem "Dreams and Whispers of Poseidon" (2005), in her oratorio "Requiem Dresden – Ode to Peace" (2012), in her Piano Concerto No.1 (2015), in her comic oratorio "The Infant Minstrel and His Peculiar Menagerie" (2016) and in her violin concerto Nr.4 "NyX – Fractured dreams" (2017).
Canadian composer Robert Minden has written extensively for the musical saw. Michael A. Levine composed "Divination By Mirrors" for musical saw soloist and two string ensembles tuned a quarter tone apart, taking advantage of the saw's ability to play in both tunings.
Other composers of chamber music with musical saw are Jonathan Rutherford ("An Intake of Breath"), Dana Wilson ("Whispers from Another Time"), Heinrich Gattermeyer ("Elegie für Singende Säge, Cembalo (oder Klavier)"), Vito Zuraj ("Musica di camera" (2001)), and Britta-Maria Bernhard ("Tranquillo").
|
https://en.wikipedia.org/wiki?curid=19995
|
MIDI
MIDI (an acronym for Musical Instrument Digital Interface) is a technical standard that describes a communications protocol, digital interface, and electrical connectors that connect a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing and recording music. The specification originates in a paper titled "Universal Synthesizer Interface", published by Dave Smith and Chet Wood, then of Sequential Circuits, at the October 1981 Audio Engineering Society conference in New York City.
A single MIDI link through a MIDI cable can carry up to sixteen channels of information, each of which can be routed to a separate device or instrument. This could be sixteen different digital instruments, for example. MIDI carries event messages: data that specify the instructions for music, including a note's notation, pitch, and velocity (typically heard as the loudness or softness of a note); vibrato; panning to the right or left of stereo; and clock signals (which set tempo). When a musician plays a MIDI instrument, all of the key presses, button presses, knob turns and slider changes are converted into MIDI data. One common MIDI application is to play a MIDI keyboard or other controller and use it to trigger a digital sound module (which contains synthesized musical sounds) to generate sounds, which the audience hears produced by a keyboard amplifier. MIDI data can be transferred via MIDI or USB cable, or recorded to a sequencer or digital audio workstation to be edited or played back.
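To make the wire format concrete, the following is a minimal sketch in Python that builds two such event messages by hand, based on the published MIDI 1.0 byte layout; the helper names and example note values are illustrative, not part of any standard library.

    # Minimal sketch of raw MIDI channel messages (MIDI 1.0 byte layout).
    # Helper names are illustrative, not from a standard library.

    def note_on(channel: int, note: int, velocity: int) -> bytes:
        """Status byte 0x90 | channel, then note number and velocity (0-127 each)."""
        assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
        return bytes([0x90 | channel, note, velocity])

    def note_off(channel: int, note: int, velocity: int = 0) -> bytes:
        """Status byte 0x80 | channel releases a sounding note."""
        return bytes([0x80 | channel, note, velocity])

    # Middle C (note 60) on channel 1 (encoded as 0), struck fairly hard:
    print(note_on(0, 60, 100).hex())   # "903c64"
    print(note_off(0, 60).hex())       # "803c00"

The three-byte message is the entire "event": which channel, which key, and how hard, which is why sixteen channels fit comfortably on one serial link.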
A file format that stores and exchanges the data is also defined. Advantages of MIDI include small file size, ease of modification and manipulation, and a wide choice of electronic instruments and synthesized or digitally sampled sounds. A MIDI recording of a performance on a keyboard could sound like a piano or other keyboard instrument; however, since MIDI records the messages and information about notes rather than the specific sounds, the recording can be changed to many other sounds, ranging from synthesized or sampled guitar or flute to full orchestra. A MIDI recording is not an audio signal, as a sound recording made with a microphone is.
Prior to the development of MIDI, electronic musical instruments from different manufacturers could generally not communicate with each other. This meant that a musician could not, for example, plug a Roland keyboard into a Yamaha synthesizer module. With MIDI, any MIDI-compatible keyboard (or other controller device) can be connected to any other MIDI-compatible sequencer, sound module, drum machine, synthesizer, or computer, even if they are made by different manufacturers.
MIDI technology was standardized in 1983 by a panel of music industry representatives, and is maintained by the MIDI Manufacturers Association (MMA). All official MIDI standards are jointly developed and published by the MMA in Los Angeles, and the MIDI Committee of the Association of Musical Electronics Industry (AMEI) in Tokyo. In 2016, the MMA established the MIDI Association (TMA) to support a global community of people who work, play, or create with MIDI.
In the early 1980s, there was no standardized means of synchronizing electronic musical instruments manufactured by different companies. Manufacturers had their own proprietary standards to synchronize instruments, such as CV/gate and Digital Control Bus (DCB). Roland founder Ikutaro Kakehashi felt the lack of standardization was limiting the growth of the electronic music industry. In June 1981, he proposed developing a standard to Oberheim Electronics founder Tom Oberheim, who had developed his own proprietary interface, the Oberheim System.
Kakehashi felt the Oberheim System was too cumbersome, and spoke to Sequential Circuits president Dave Smith about creating a simpler, cheaper alternative. While Smith discussed the concept with American companies, Kakehashi discussed it with Japanese companies Yamaha, Korg and Kawai. Representatives from all companies met to discuss the idea in October. Initially, only Sequential Circuits and the Japanese companies were interested. Using Roland's DCB as a basis, Smith and Sequential Circuits engineer Chet Wood devised a universal synthesizer interface to allow communication between equipment from different manufacturers. Smith and Wood proposed this standard in a paper, "Universal Synthesizer Interface," at the Audio Engineering Society show in October 1981. The standard was discussed and modified by representatives of Roland, Yamaha, Korg, Kawai, and Sequential Circuits. Kakehashi favored the name Universal Musical Interface (UMI), pronounced "you-me", but Smith felt this was "a little corny". However, he liked the use of "instrument" instead of "synthesizer", and proposed the name Musical Instrument Digital Interface (MIDI). Moog Music founder Robert Moog announced MIDI in the October 1982 issue of "Keyboard".
At the 1983 Winter NAMM Show, Smith demonstrated a MIDI connection between Prophet 600 and Roland JP-6 synthesizers. The MIDI specification was published in August 1983. The MIDI standard was unveiled by Kakehashi and Smith, who received Technical Grammy Awards in 2013 for their work. In 1982, the first instruments with MIDI were released, the Roland Jupiter-6 and the Prophet 600. In 1983, the first MIDI drum machine, the Roland TR-909, and the first MIDI sequencer, the Roland MSQ-700, were released. The first computers to support MIDI, the NEC PC-88 and PC-98, were released in 1982.
The MIDI Manufacturers Association (MMA) was formed following a meeting of "all interested companies" at the 1984 Summer NAMM Show in Chicago. The MIDI 1.0 Detailed Specification was published at the MMA's second meeting at the 1985 Summer NAMM show. The standard continued to evolve, adding standardized song files in 1991 (General MIDI) and adapted to new connection standards such as USB and FireWire. In 2016, the MIDI Association was formed to continue overseeing the standard. An initiative to create a 2.0 standard was announced in January 2019. The MIDI 2.0 standard was introduced at the 2020 Winter NAMM show.
MIDI's appeal was originally limited to professional musicians and record producers who wanted to use electronic instruments in the production of popular music. The standard allowed different instruments to communicate with each other and with computers, and this spurred a rapid expansion of the sales and production of electronic instruments and music software. This interoperability allowed one device to be controlled from another, which reduced the amount of hardware musicians needed. MIDI's introduction coincided with the dawn of the personal computer era and the introduction of samplers and digital synthesizers. The creative possibilities brought about by MIDI technology are credited for helping revive the music industry in the 1980s.
MIDI introduced capabilities that transformed the way many musicians work. MIDI sequencing makes it possible for a user with no notation skills to build complex arrangements. A musical act with as few as one or two members, each operating multiple MIDI-enabled devices, can deliver a performance similar to that of a larger group of musicians. The expense of hiring outside musicians for a project can be reduced or eliminated, and complex productions can be realized on a system as small as a synthesizer with integrated keyboard and sequencer.
MIDI also helped establish home recording. By performing preproduction in a home environment, an artist can reduce recording costs by arriving at a recording studio with a partially completed song.
MIDI was invented so that electronic or digital musical instruments could communicate with each other and so that one instrument can control another. For example, a MIDI-compatible sequencer can trigger beats produced by a drum sound module. Analog synthesizers that have no digital component and were built prior to MIDI's development can be retrofitted with kits that convert MIDI messages into analog control voltages. When a note is played on a MIDI instrument, it generates a digital MIDI message that can be used to trigger a note on another instrument. The capability for remote control allows full-sized instruments to be replaced with smaller sound modules, and allows musicians to combine instruments to achieve a fuller sound, or to create combinations of synthesized instrument sounds, such as acoustic piano and strings. MIDI also enables other instrument parameters (volume, effects, etc.) to be controlled remotely.
Synthesizers and samplers contain various tools for shaping an electronic or digital sound. Filters adjust timbre, and envelopes automate the way a sound evolves over time after a note is triggered. The frequency of a filter and the envelope attack (the time it takes for a sound to reach its maximum level), are examples of synthesizer parameters, and can be controlled remotely through MIDI. Effects devices have different parameters, such as delay feedback or reverb time. When a MIDI continuous controller number (CCN) is assigned to one of these parameters, the device responds to any messages it receives that are identified by that number. Controls such as knobs, switches, and pedals can be used to send these messages. A set of adjusted parameters can be saved to a device's internal memory as a "patch", and these patches can be remotely selected by MIDI program changes.
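As an illustration of this mechanism, the sketch below assembles a Control Change and a Program Change message as raw bytes. It is a sketch under assumptions: controller number 74 is used because General MIDI convention maps it to filter brightness/cutoff, but a given device may assign its parameters differently.

    # Sketch of parameter control via MIDI channel messages.
    # CC 74 as "filter cutoff" is a common General MIDI mapping, not a guarantee.

    def control_change(channel: int, controller: int, value: int) -> bytes:
        """Status byte 0xB0 | channel, then controller number and value (0-127)."""
        return bytes([0xB0 | channel, controller, value])

    def program_change(channel: int, program: int) -> bytes:
        """Status byte 0xC0 | channel recalls a stored patch by number."""
        return bytes([0xC0 | channel, program])

    # Open a filter halfway on channel 4 (encoded 3), then recall patch 12:
    print(control_change(3, 74, 64).hex())  # "b34a40"
    print(program_change(3, 12).hex())      # "c30c"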
MIDI events can be sequenced with computer software, or in specialized hardware music workstations. Many digital audio workstations (DAWs) are specifically designed to work with MIDI as an integral component. MIDI piano rolls have been developed in many DAWs so that the recorded MIDI messages can be easily modified. These tools allow composers to audition and edit their work much more quickly and efficiently than did older solutions, such as multitrack recording.
Because MIDI is a set of commands that create sound, MIDI sequences can be manipulated in ways that prerecorded audio cannot. It is possible to change the key, instrumentation or tempo of a MIDI arrangement, and to reorder its individual sections. The ability to compose ideas and quickly hear them played back enables composers to experiment. Algorithmic composition programs provide computer-generated performances that can be used as song ideas or accompaniment.
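Because the data is symbolic, an edit like transposition reduces to arithmetic on note numbers, something that cannot be done losslessly on recorded audio. The sketch below shows the idea on a toy event list; the tuple representation is a simplification for illustration, not a file format.

    # Transposing MIDI data is arithmetic on note numbers.
    # Events are simplified (status, data1, data2) tuples.

    def transpose(events, semitones):
        out = []
        for status, data1, data2 in events:
            if status & 0xF0 in (0x80, 0x90):          # note-off / note-on messages
                data1 = min(127, max(0, data1 + semitones))
            out.append((status, data1, data2))
        return out

    c_major = [(0x90, 60, 100), (0x90, 64, 100), (0x90, 67, 100)]  # C-E-G
    print(transpose(c_major, 3))  # raised a minor third: 63, 67, 70 (E-flat major)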
Some composers may take advantage of the standard, portable set of commands and parameters in MIDI 1.0 and General MIDI (GM) to share musical data files among various electronic instruments. The data composed via sequenced MIDI recordings can be saved as a standard MIDI file (SMF), digitally distributed, and reproduced by any computer or electronic instrument that also adheres to the same MIDI, GM, and SMF standards. MIDI data files are much smaller than corresponding recorded audio files.
The personal computer market stabilized at the same time that MIDI appeared, and computers became a viable option for music production. In 1983 computers started to play a role in mainstream music production. In the years immediately after the 1983 ratification of the MIDI specification, MIDI features were adapted to several early computer platforms. NEC's PC-88 and PC-98 began supporting MIDI as early as 1982. The Yamaha CX5M introduced MIDI support and sequencing in an MSX system in 1984.
The spread of MIDI on personal computers was largely facilitated by Roland Corporation's MPU-401, released in 1984, as the first MIDI-equipped PC sound card, capable of MIDI sound processing and sequencing. After Roland sold MPU sound chips to other sound card manufacturers, it established a universal standard MIDI-to-PC interface. The widespread adoption of MIDI led to computer-based MIDI software being developed. Soon after, a number of platforms began supporting MIDI, including the Apple II Plus, IIe and Macintosh, Commodore 64 and Amiga, Atari ST, Acorn Archimedes, and PC DOS.
The Macintosh was a favorite among US musicians, as it was marketed at a competitive price, and it took several years for PC systems to catch up with its efficiency and graphical interface. The Atari ST was preferred in Europe, where Macintoshes were more expensive. The Atari ST had the advantage of MIDI ports that were built directly into the computer. Most music software in MIDI's first decade was published for either the Apple or the Atari. By the time of Windows 3.0's 1990 release, PCs had caught up in processing power and had acquired a graphical interface and software titles began to see release on multiple platforms.
The Standard MIDI File (SMF) is a file format that provides a standardized way for music sequences to be saved, transported, and opened in other systems. The standard was developed and is maintained by the MMA, and usually uses a .mid extension. The compact size of these files led to their widespread use in computers, mobile phone ringtones, webpage authoring and musical greeting cards. These files are intended for universal use and include such information as note values, timing and track names. Lyrics may be included as metadata, and can be displayed by karaoke machines.
SMFs are created as an export format of software sequencers or hardware workstations. They organize MIDI messages into one or more parallel tracks and timestamp the events so that they can be played back in sequence. A header contains the arrangement's track count, tempo and which of three SMF formats the file is in. A type 0 file contains the entire performance, merged onto a single track, while type 1 files may contain any number of tracks that are performed in synchrony. Type 2 files are rarely used and store multiple arrangements, with each arrangement having its own track and intended to be played in sequence. Microsoft Windows bundles SMFs together with Downloadable Sounds (DLS) in a Resource Interchange File Format (RIFF) wrapper, as RMID files with a .rmi extension. RIFF-RMID has been deprecated in favor of Extensible Music Files (XMF).
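A minimal sketch of reading that header follows, assuming a well-formed file that starts with the standard 14-byte "MThd" chunk; the function name is illustrative.

    # Sketch: read the SMF header chunk ("MThd", 6 data bytes, big-endian fields).
    import struct

    def read_smf_header(path):
        with open(path, "rb") as f:
            chunk_id, length = struct.unpack(">4sI", f.read(8))
            if chunk_id != b"MThd" or length != 6:
                raise ValueError("not a Standard MIDI File")
            fmt, ntracks, division = struct.unpack(">HHH", f.read(6))
        # fmt: 0 = single merged track, 1 = parallel tracks, 2 = independent arrangements
        # division: timing resolution (commonly ticks per quarter note)
        return fmt, ntracks, division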
A MIDI file is not an audio recording. Rather, it is a set of instructions (for example, for pitch or tempo) and can use a thousand times less disk space than the equivalent recorded audio. This made MIDI file arrangements an attractive way to share music, before the advent of broadband internet access and multi-gigabyte hard drives. Licensed MIDI files on floppy disks were commonly available in stores in Europe and Japan during the 1990s. The major drawback to this is the wide variation in quality of users' audio cards, and in the actual audio contained as samples or synthesized sound in the card that the MIDI data only refers to symbolically. There is no standardization of how symbols are expressed. Even a sound card that contains high-quality sampled sounds can have inconsistent quality from one sampled instrument to another, while different model cards have no guarantee of consistent sound of the same instrument. Early budget-priced cards, such as the AdLib and the Sound Blaster and its compatibles, used a stripped-down version of Yamaha's frequency modulation synthesis (FM synthesis) technology played back through low-quality digital-to-analog converters. The low-fidelity reproduction of these ubiquitous cards was often assumed to somehow be a property of MIDI itself. This created a perception of MIDI as low-quality audio, while in reality MIDI itself contains no sound, and the quality of its playback depends entirely on the quality of the sound-producing device (and of samples in the device).
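The rough arithmetic behind that thousand-fold figure, with an assumed, purely illustrative SMF size:

    # Back-of-the-envelope comparison of audio versus MIDI storage.
    seconds = 180                             # a three-minute song
    audio_bytes = seconds * 44100 * 2 * 2     # 16-bit stereo PCM at 44.1 kHz
    midi_bytes = 30_000                       # a typical multi-track SMF (assumed)
    print(audio_bytes)                        # 31,752,000 bytes of audio
    print(audio_bytes // midi_bytes)          # on the order of a thousand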
The main advantage of the personal computer in a MIDI system is that it can serve a number of different purposes, depending on the software that is loaded. Multitasking allows simultaneous operation of programs that may be able to share data with each other.
Sequencing software provides a number of benefits to a composer or arranger. It allows recorded MIDI to be manipulated using standard computer editing features such as cut, copy and paste and drag and drop. Keyboard shortcuts can be used to streamline workflow, and editing functions are often selectable via MIDI commands. The sequencer allows each channel to be set to play a different sound, and gives a graphical overview of the arrangement. A variety of editing tools are made available, including a notation display that can be used to create printed parts for musicians. Tools such as looping, quantization, randomization, and transposition simplify the arranging process.
Beat creation is simplified, and groove templates can be used to duplicate another track's rhythmic feel. Realistic expression can be added through the manipulation of real-time controllers. Mixing can be performed, and MIDI can be synchronized with recorded audio and video tracks. Work can be saved, and transported between different computers or studios.
Sequencers may take alternate forms, such as drum pattern editors that allow users to create beats by clicking on pattern grids, and loop sequencers such as ACID Pro, which allow MIDI to be combined with prerecorded audio loops whose tempos and keys are matched to each other. Cue list sequencing is used to trigger dialogue, sound effect, and music cues in stage and broadcast production.
With MIDI, notes played on a keyboard can automatically be transcribed to sheet music. Scorewriting software typically lacks advanced sequencing tools, and is optimized for the creation of a neat, professional printout designed for live instrumentalists. These programs provide support for dynamics and expression markings, chord and lyric display, and complex score styles. Software is available that can print scores in braille.
SmartScore software can produce MIDI files from scanned sheet music. Other notation programs include Finale, Encore, Sibelius, MuseScore and Dorico.
Patch editors allow users to program their equipment through the computer interface. These became essential with the appearance of complex synthesizers such as the Yamaha FS1R, which contained several thousand programmable parameters, but had an interface that consisted of fifteen tiny buttons, four knobs and a small LCD. Digital instruments typically discourage users from experimentation, due to their lack of the feedback and direct control that switches and knobs would provide, but patch editors give owners of hardware instruments and effects devices the same editing functionality that is available to users of software synthesizers. Some editors are designed for a specific instrument or effects device, while other, "universal" editors support a variety of equipment, and ideally can control the parameters of every device in a setup through the use of System Exclusive commands.
Patch librarians have the specialized function of organizing the sounds in a collection of equipment, and allow transmission of entire banks of sounds between an instrument and a computer. This allows the user to augment the device's limited patch storage with a computer's much greater disk capacity, and to share custom patches with other owners of the same instrument. Universal editor/librarians that combine the two functions were once common, and included Opcode Systems' Galaxy and eMagic's SoundDiver. These programs have been largely abandoned with the trend toward computer-based synthesis, although Mark of the Unicorn's (MOTU) Unisyn and Sound Quest's Midi Quest remain available. Native Instruments' Kore was an effort to bring the editor/librarian concept into the age of software instruments.
Programs that can dynamically generate accompaniment tracks are called "auto-accompaniment" programs. These create a full band arrangement in a style that the user selects, and send the result to a MIDI sound generating device for playback. The generated tracks can be used as educational or practice tools, as accompaniment for live performances, or as a songwriting aid.
Computers can use software to generate sounds, which are then passed through a digital-to-analog converter (DAC) to a power amplifier and loudspeaker system. The number of sounds that can be played simultaneously (the polyphony) is dependent on the power of the computer's CPU, as are the sample rate and bit depth of playback, which directly affect the quality of the sound. Synthesizers implemented in software are subject to timing issues that are not present with hardware instruments, whose dedicated operating systems are not subject to interruption from background tasks as desktop operating systems are. These timing issues can cause synchronization problems, and clicks and pops when sample playback is interrupted. Software synthesizers also exhibit a noticeable delay known as latency in their sound generation, because computers use an audio buffer that delays playback and disrupts MIDI timing.
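The buffer-induced latency mentioned above follows directly from the buffer length and the sample rate; a quick calculation, with the buffer size assumed for illustration:

    # Minimum added delay of a software synthesizer's audio buffer.
    buffer_frames = 256                        # a common, assumed buffer size
    sample_rate = 44100                        # samples per second
    latency_ms = 1000 * buffer_frames / sample_rate
    print(f"{latency_ms:.1f} ms")              # about 5.8 ms per buffered block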
Software synthesis' roots go back as far as the 1950s, when Max Mathews of Bell Labs wrote the MUSIC-N programming language, which was capable of non-real-time sound generation. The first synthesizer to run directly on a host computer's CPU was Reality, by Dave Smith's Seer Systems, which achieved a low latency through tight driver integration, and therefore could run only on Creative Labs soundcards. Some systems use dedicated hardware to reduce the load on the host CPU, as with Symbolic Sound Corporation's Kyma System, and the Creamware/Sonic Core Pulsar/SCOPE systems, which power an entire recording studio's worth of instruments, effect units, and mixers.
The ability to construct full MIDI arrangements entirely in computer software allows a composer to render a finalized result directly as an audio file.
Early PC games were distributed on floppy disks, and the small size of MIDI files made them a viable means of providing soundtracks. Games of the DOS and early Windows eras typically required compatibility with either Ad Lib or Sound Blaster audio cards. These cards used FM synthesis, which generates sound through modulation of sine waves. John Chowning, the technique's pioneer, theorized that the technology would be capable of accurate recreation of any sound if enough sine waves were used, but budget computer audio cards performed FM synthesis with only two sine waves. Combined with the cards' 8-bit audio, this resulted in a sound described as "artificial" and "primitive".
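The two-operator FM used by these budget cards can be sketched in a few lines. This is a simplified model; the carrier and modulator frequencies and the modulation index below are arbitrary illustrative values:

    import math

    # One sine wave (the modulator) varies the phase of another (the carrier).
    def fm_sample(t, carrier_hz=440.0, modulator_hz=220.0, mod_index=2.0):
        modulator = math.sin(2 * math.pi * modulator_hz * t)
        return math.sin(2 * math.pi * carrier_hz * t + mod_index * modulator)

    # Render one second of 8-bit audio, as the early cards produced.
    rate = 22050
    samples = [int(127 * fm_sample(n / rate)) + 128 for n in range(rate)]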
Wavetable daughterboards that were later available provided audio samples that could be used in place of the FM sound. These were expensive, but often used the sounds from respected MIDI instruments such as the E-mu Proteus. The computer industry moved in the mid-1990s toward wavetable-based soundcards with 16-bit playback, but standardized on 2 MB of ROM, too small a space to fit good-quality samples of 128 instruments plus drum kits. To make the most of the limited space, some manufacturers stored 12-bit samples and padded those to 16 bits.
MIDI has been adopted as a control protocol in a number of non-musical applications. MIDI Show Control uses MIDI commands to direct stage lighting systems and to trigger cued events in theatrical productions. VJs and turntablists use it to cue clips, and to synchronize equipment, and recording systems use it for synchronization and automation. Apple Motion allows control of animation parameters through MIDI. The 1987 first-person shooter game "MIDI Maze" and the 1990 Atari ST computer puzzle game "Oxyd" used MIDI to network computers together, and kits are available that allow MIDI control over home lighting and appliances.
Despite its association with music devices, MIDI can control any electronic or digital device that can read and process a MIDI command. The receiving device or object would require a General MIDI processor; in this instance, however, program changes would trigger a function on that device rather than notes from a MIDI instrument's controller. Each function can be set to a timer (also controlled by MIDI) or other condition or trigger determined by the device's creator.
The cables terminate in a 180° five-pin DIN connector. Standard applications use only three of the five conductors: a ground wire, and a balanced pair of conductors that carry a +5 volt signal. This connector configuration can only carry messages in one direction, so a second cable is necessary for two-way communication. Some proprietary applications, such as phantom-powered footswitch controllers, use the spare pins for direct current (DC) power transmission.
Opto-isolators keep MIDI devices electrically separated from their connectors, which prevents the occurrence of ground loops and protects equipment from voltage spikes. There is no error detection capability in MIDI, so the maximum cable length is set at 15 meters (50 feet) to limit interference.
Most devices do not copy messages from their input to their output port. A third type of port, the "thru" port, emits a copy of everything received at the input port, allowing data to be forwarded to another instrument in a "daisy chain" arrangement. Not all devices contain thru ports, and devices that lack the ability to generate MIDI data, such as effects units and sound modules, may not include out ports.
Each device in a daisy chain adds delay to the system. This is avoided with a MIDI thru box, which contains several outputs that provide an exact copy of the box's input signal. A MIDI merger is able to combine the input from multiple devices into a single stream, and allows multiple controllers to be connected to a single device. A MIDI switcher allows switching between multiple devices, and eliminates the need to physically repatch cables. MIDI patch bays combine all of these functions. They contain multiple inputs and outputs, and allow any combination of input channels to be routed to any combination of output channels. Routing setups can be created using computer software, stored in memory, and selected by MIDI program change commands. This enables the devices to function as standalone MIDI routers in situations where no computer is present. MIDI patch bays also clean up any skewing of MIDI data bits that occurs at the input stage.
MIDI data processors are used for utility tasks and special effects. These include MIDI filters, which remove unwanted MIDI data from the stream, and MIDI delays, effects that send a repeated copy of the input data at a set time.
A computer MIDI interface's main function is to match clock speeds between the MIDI device and the computer. Some computer sound cards include a standard MIDI connector, whereas others connect by any of various means that include the D-subminiature DA-15 game port, USB, FireWire, Ethernet or a proprietary connection. The increasing use of USB connectors in the 2000s has led to the availability of MIDI-to-USB data interfaces that can transfer MIDI channels to USB-equipped computers. Some MIDI keyboard controllers are equipped with USB jacks, and can be plugged into computers that run music software.
MIDI's serial transmission leads to timing problems. A three-byte MIDI message requires nearly 1 millisecond for transmission. Because MIDI is serial, it can only send one event at a time. If an event is sent on two channels at once, the event on the second channel cannot transmit until the first one is finished, and so is delayed by 1 ms. If an event is sent on all channels at the same time, the last channel's transmission is delayed by as much as 16 ms. This contributed to the rise of MIDI interfaces with multiple in- and out-ports, because timing improves when events are spread between multiple ports as opposed to multiple channels on the same port. The term "MIDI slop" refers to audible timing errors that result when MIDI transmission is delayed.
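These figures follow directly from the line rate; a quick computation (using the 10-bit-per-byte framing described later in this article) reproduces them:

    BITS_PER_BYTE = 10      # 8 data bits plus start and stop bits
    LINE_RATE = 31250       # bits per second

    def message_ms(num_bytes: int) -> float:
        return 1000.0 * num_bytes * BITS_PER_BYTE / LINE_RATE

    print(message_ms(3))        # 0.96 ms for one three-byte message
    print(16 * message_ms(3))   # ~15.4 ms if all 16 channels fire at once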
There are two types of MIDI controllers: performance controllers that generate notes and are used to perform music, and controllers that may not send notes, but transmit other types of real-time events. Many devices are some combination of the two types.
Keyboards are by far the most common type of MIDI controller. MIDI was designed with keyboards in mind, and any controller that is not a keyboard is considered an "alternative" controller. This was seen as a limitation by composers who were not interested in keyboard-based music, but the standard proved flexible, and MIDI compatibility was introduced to other types of controllers, including guitars, stringed and wind instruments, drums and specialized and experimental controllers. Other controllers include drum controllers and wind controllers, which can emulate the playing of drum kit and wind instruments, respectively. Nevertheless, MIDI was designed around keyboard playing, and some of its features do not fully capture other instruments' capabilities; Jaron Lanier cites the standard as an example of technological "lock-in" that unexpectedly limited what was possible to express. Some of these shortcomings, such as the lack of per-note pitch bend, are to be addressed in MIDI 2.0, described below.
Software synthesizers offer great power and versatility, but some players feel that division of attention between a MIDI keyboard and a computer keyboard and mouse robs some of the immediacy from the playing experience. Devices dedicated to real-time MIDI control provide an ergonomic benefit, and can provide a greater sense of connection with the instrument than an interface that is accessed through a mouse or a push-button digital menu. Controllers may be general-purpose devices that are designed to work with a variety of equipment, or they may be designed to work with a specific piece of software. Examples of the latter include Akai's APC40 controller for Ableton Live, and Korg's MS-20ic controller that is a reproduction of their MS-20 analog synthesizer. The MS-20ic controller includes patch cables that can be used to control signal routing in their virtual reproduction of the MS-20 synthesizer, and can also control third-party devices.
A MIDI instrument contains ports to send and receive MIDI signals, a CPU to process those signals, an interface that allows user programming, audio circuitry to generate sound, and controllers. The operating system and factory sounds are often stored in a Read-only memory (ROM) unit.
A MIDI instrument can also be a stand-alone module (without a piano-style keyboard) consisting of a General MIDI sound board (GM, GS or XG) and onboard editing features, including transposition and pitch changes, MIDI instrument changes, and adjustment of volume, pan, reverb levels and other MIDI controllers. Typically, the MIDI module includes a large screen, so the user can view information for the currently selected function. Features can include scrolling lyrics, usually embedded in a MIDI file or karaoke MIDI, playlists, a song library and editing screens. Some MIDI modules include a harmonizer and the ability to play back and transpose MP3 audio files.
Synthesizers may employ any of a variety of sound generation techniques. They may include an integrated keyboard, or may exist as "sound modules" or "expanders" that generate sounds when triggered by an external controller, such as a MIDI keyboard. Sound modules are typically designed to be mounted in a 19-inch rack. Manufacturers commonly produce a synthesizer in both standalone and rack-mounted versions, and often offer the keyboard version in a variety of sizes.
A sampler can record and digitize audio, store it in random-access memory (RAM), and play it back. Samplers typically allow a user to edit a sample and save it to a hard disk, apply effects to it, and shape it with the same tools that synthesizers use. They also may be available in either keyboard or rack-mounted form. Instruments that generate sounds through sample playback, but have no recording capabilities, are known as "ROMplers".
Samplers did not become established as viable MIDI instruments as quickly as synthesizers did, due to the expense of memory and processing power at the time. The first low-cost MIDI sampler was the Ensoniq Mirage, introduced in 1984. MIDI samplers are typically limited by displays that are too small to use to edit sampled waveforms, although some can be connected to a computer monitor.
Drum machines typically are sample playback devices that specialize in drum and percussion sounds. They commonly contain a sequencer that allows the creation of drum patterns, and allows them to be arranged into a song. There often are multiple audio outputs, so that each sound or group of sounds can be routed to a separate output. The individual drum voices may be playable from another MIDI instrument, or from a sequencer.
Sequencer technology predates MIDI. Analog sequencers use CV/Gate signals to control pre-MIDI analog synthesizers. MIDI sequencers typically are operated by transport features modeled after those of tape decks. They are capable of recording MIDI performances and arranging them into individual tracks, following a multitrack recording paradigm. Music workstations combine controller keyboards with an internal sound generator and a sequencer. These can be used to build complete arrangements and play them back using their own internal sounds, and function as self-contained music production studios. They commonly include file storage and transfer capabilities.
Some effects units can be remotely controlled via MIDI. For example, the Eventide H3000 Ultra-harmonizer allows such extensive MIDI control that it is playable as a synthesizer.
MIDI messages are made up of 8-bit "words" (commonly called "bytes") that are transmitted serially at a rate of 31.25 kbit/s. This rate was chosen because it is an exact division of 1 MHz (1 MHz divided by 32), the operational speed of many early microprocessors. The first bit of each word identifies whether the word is a status byte or a data byte, and is followed by seven bits of information. A start bit and a stop bit are added to each byte for framing purposes, so a MIDI byte requires ten bits for transmission.
A MIDI link can carry sixteen independent channels of information. The channels are numbered 1–16, but their actual corresponding binary encoding is 0–15. A device can be configured to only listen to specific channels and to ignore the messages sent on other channels ("Omni Off" mode), or it can listen to all channels, effectively ignoring the channel address ("Omni On"). An individual device may be monophonic (the start of a new "note-on" MIDI command implies the termination of the previous note), or polyphonic (multiple notes may be sounding at once, until the polyphony limit of the instrument is reached, or the notes reach the end of their decay envelope, or explicit "note-off" MIDI commands are received). Receiving devices can typically be set to all four combinations of "omni off/on" versus "mono/poly" modes.
A MIDI message is an instruction that controls some aspect of the receiving device. A MIDI message consists of a status byte, which indicates the type of the message, followed by up to two data bytes that contain the parameters. MIDI messages can be "channel messages" sent on only one of the 16 channels and monitored only by devices on that channel, or "system messages" that all devices receive. Each receiving device ignores data not relevant to its function. There are five types of message: Channel Voice, Channel Mode, System Common, System Real-Time, and System Exclusive.
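Because the high bit distinguishes status bytes from data bytes, a stream can be split into messages with a simple scan. The following sketch ignores running status and system real-time bytes for brevity:

    def split_messages(data: bytes):
        messages, current = [], []
        for b in data:
            if b & 0x80:            # high bit set: a status byte starts a message
                if current:
                    messages.append(current)
                current = [b]
            else:                   # high bit clear: a data byte
                current.append(b)
        if current:
            messages.append(current)
        return messages

    # A note-on (0x90) followed by a note-off (0x80), both on channel 1.
    print(split_messages(bytes([0x90, 60, 100, 0x80, 60, 0])))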
Channel Voice messages transmit real-time performance data over a single channel. Examples include "note-on" messages which contain a MIDI note number that specifies the note's pitch, a velocity value that indicates how forcefully the note was played, and the channel number; "note-off" messages that end a note; program change messages that change a device's patch; and control changes that allow adjustment of an instrument's parameters. MIDI notes are numbered from 0 to 127, assigned to the notes C−1 through G9. This corresponds to a range of 8.175798916 Hz to 12543.85395 Hz (assuming equal temperament and 440 Hz A4) and extends beyond the 88-note piano range of A0 to C8.
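The frequency range quoted above follows from the equal-temperament relationship between note numbers and pitch, with note 69 defined as A4 = 440 Hz:

    def note_to_hz(note: int, a4_hz: float = 440.0) -> float:
        # Each semitone is a factor of 2**(1/12); note 69 is A4.
        return a4_hz * 2 ** ((note - 69) / 12)

    print(note_to_hz(0))    # 8.1757989156...  (C-1)
    print(note_to_hz(60))   # 261.6255653...   (middle C)
    print(note_to_hz(127))  # 12543.8539514... (G9)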
System Exclusive (SysEx) messages are a major reason for the flexibility and longevity of the MIDI standard. Manufacturers use them to create proprietary messages that control their equipment more thoroughly than standard MIDI messages could. SysEx messages are addressed to a specific device in a system, and are ignored by all other devices. Each manufacturer has a unique identifier that is included in its SysEx messages, which helps ensure that only the targeted device responds to the message. Many instruments also include a SysEx ID setting, so a controller can address two devices of the same model independently. SysEx messages can include functionality beyond what the MIDI standard provides.
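The framing of a SysEx message is simple: a start byte, a manufacturer identifier, payload bytes with the high bit clear, and an end byte. The sketch below uses a single-byte identifier for simplicity (real identifiers may be one or three bytes):

    SYSEX_START, SYSEX_END = 0xF0, 0xF7

    def build_sysex(manufacturer_id: int, payload: bytes) -> bytes:
        # Payload bytes must stay in the 0-127 range (high bit clear).
        assert all(b < 0x80 for b in payload)
        return bytes([SYSEX_START, manufacturer_id]) + payload + bytes([SYSEX_END])

    # 0x7D is the identifier reserved for non-commercial/educational use.
    message = build_sysex(0x7D, bytes([0x01, 0x02, 0x03]))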
Devices typically do not respond to every type of message defined by the MIDI specification. The MIDI implementation chart was standardized by the MMA as a way for users to see what specific capabilities an instrument has, and how it responds to messages. A specific MIDI Implementation Chart is usually published for each MIDI device within the device documentation.
The MIDI 1.0 specification for the electrical interface is based on a fully isolated current loop. The MIDI out port nominally drives +5 volts through a 220 ohm resistor out through pin 4 on the MIDI out DIN connector, in on pin 4 of the receiving device's MIDI in DIN connector, through a 220 ohm protection resistor and the LED of an opto-isolator. The current then returns via pin 5 on the MIDI in port to the originating device's MIDI out port pin 5, again with a 220 ohm resistor in the path, giving a nominal current of about 5 milliamperes. Despite the cable's appearance, there is no conductive path between the two MIDI devices, only an optically isolated one. Properly designed MIDI devices are relatively immune to ground loops and similar interference. The data rate on this system is 31,250 bits per second, with logic 0 represented by current flowing.
The MIDI specification provides for a ground "wire" and a braid or foil shield, connected on pin 2, protecting the two signal-carrying conductors on pins 4 and 5. Although the MIDI cable is supposed to connect pin 2 and the braid or foil shield to chassis ground, it should do so only at the MIDI out port; the MIDI in port should leave pin 2 unconnected and isolated. Some large manufacturers of MIDI devices use modified MIDI in-only DIN 5-pin sockets with the metallic conductors intentionally omitted at pin positions 1, 2, and 3 so that the maximum voltage isolation is obtained.
MIDI's flexibility and widespread adoption have led to many refinements of the standard, and have enabled its application to purposes beyond those for which it was originally intended.
MIDI allows selection of an instrument's sounds through program change messages, but there is no guarantee that any two instruments have the same sound at a given program location. Program #0 may be a piano on one instrument, or a flute on another. The General MIDI (GM) standard was established in 1991, and provides a standardized sound bank that allows a Standard MIDI File created on one device to sound similar when played back on another. GM specifies a bank of 128 sounds arranged into 16 families of eight related instruments, and assigns a specific program number to each instrument. Percussion instruments are placed on channel 10, and a specific MIDI note value is mapped to each percussion sound. GM-compliant devices must offer 24-note polyphony. Any given program change selects the same instrument sound on any GM-compatible instrument.
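A program change is among the simplest MIDI messages: a status byte carrying the channel, and one data byte selecting the patch. A minimal sketch of constructing the two-byte message:

    def program_change(channel: int, program: int) -> bytes:
        # Status bytes 0xC0-0xCF select the channel; the data byte selects the patch.
        return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

    # On any GM-compatible device, program 0 selects Acoustic Grand Piano.
    message = program_change(0, 0)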
General MIDI is defined by a standard layout of instrument sounds called "patches", each identified by a patch number (program number, PC#) and triggered by pressing a key on a MIDI keyboard. This layout ensures that MIDI sound modules and other MIDI devices faithfully reproduce the designated sounds expected by the user, and maintains reliable and consistent sound palettes across different manufacturers' MIDI devices.
The GM standard eliminates variation in note mapping. Some manufacturers had disagreed over what note number should represent middle C, but GM specifies that note number 69 plays A440, which in turn fixes middle C as note number 60. GM-compatible devices are required to respond to velocity, aftertouch, and pitch bend, to be set to specified default values at startup, and to support certain controller numbers such as for sustain pedal, and Registered Parameter Numbers. A simplified version of GM, called "GM Lite", is used in mobile phones and other devices with limited processing power.
A general opinion quickly formed that GM's 128-instrument sound set was not large enough. Roland's General Standard, or GS, system included additional sounds, drumkits and effects, provided a "bank select" command that could be used to access them, and used MIDI Non-Registered Parameter Numbers (NRPNs) to access its new features. Yamaha's Extended General MIDI, or XG, followed in 1994. XG similarly offered extra sounds, drumkits and effects, but used standard controllers instead of NRPNs for editing, and increased polyphony to 32 voices. Both standards feature backward compatibility with the GM specification, but are not compatible with each other. Neither standard has been adopted beyond its creator, but both are commonly supported by music software titles.
Member companies of Japan's AMEI developed the General MIDI Level 2 specification in 1999. GM2 maintains backward compatibility with GM, but increases polyphony to 32 voices, standardizes several controller numbers such as for sostenuto and soft pedal ("una corda"), RPNs and Universal System Exclusive Messages, and incorporates the MIDI Tuning Standard. GM2 is the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for low power devices that allows the device's polyphony to scale according to its processing power.
Most MIDI synthesizers use equal temperament tuning. The MIDI tuning standard (MTS), ratified in 1992, allows alternate tunings. MTS allows microtunings that can be loaded from a bank of up to 128 patches, and allows real-time adjustment of note pitches. Manufacturers are not required to support the standard. Those who do are not required to implement all of its features.
A sequencer can drive a MIDI system with its internal clock, but when a system contains multiple sequencers, they must synchronize to a common clock. MIDI Time Code (MTC), developed by Digidesign, implements SysEx messages that have been developed specifically for timing purposes, and is able to translate to and from the SMPTE time code standard. MIDI Clock is based on tempo, but SMPTE time code is based on frames per second, and is independent of tempo. MTC, like SMPTE code, includes position information, and can adjust itself if a timing pulse is lost. MIDI interfaces such as Mark of the Unicorn's MIDI Timepiece can convert SMPTE code to MTC.
MIDI Machine Control (MMC) consists of a set of SysEx commands that operate the transport controls of hardware recording devices. MMC lets a sequencer send "Start", "Stop", and "Record" commands to a connected tape deck or hard disk recording system, and to fast-forward or rewind the device so that it starts playback at the same point as the sequencer. No synchronization data is involved, although the devices may synchronize through MTC.
MIDI Show Control (MSC) is a set of SysEx commands for sequencing and remotely cueing show control devices such as lighting, music and sound playback, and motion control systems. Applications include stage productions, museum exhibits, recording studio control systems, and amusement park attractions.
One solution to MIDI timing problems is to mark MIDI events with the times they are to be played, and store them in a buffer in the MIDI interface ahead of time. Sending data beforehand reduces the likelihood that a busy passage can send a large amount of information that overwhelms the transmission link. Once stored in the interface, the information is no longer subject to timing issues associated with USB jitter and computer operating system interrupts, and can be transmitted with a high degree of accuracy. MIDI timestamping only works when both hardware and software support it. MOTU's MTS, eMagic's AMT, and Steinberg's Midex 8 had implementations that were incompatible with each other, and required users to own software and hardware manufactured by the same company to work. Timestamping is built into FireWire MIDI interfaces, Mac OS X Core Audio, and Linux ALSA Sequencer.
An unforeseen capability of SysEx messages was their use for transporting audio samples between instruments. This led to the development of the sample dump standard (SDS), which established a new SysEx format for sample transmission. The SDS was later augmented with a pair of commands that allow the transmission of information about sample loop points, without requiring that the entire sample be transmitted.
The Downloadable Sounds (DLS) specification, ratified in 1997, allows mobile devices and computer sound cards to expand their wave tables with downloadable sound sets. The DLS Level 2 Specification followed in 2006, and defined a standardized synthesizer architecture. The Mobile DLS standard calls for DLS banks to be combined with SP-MIDI, as self-contained Mobile XMF files.
MIDI Polyphonic Expression (MPE) is a method of using MIDI that enables pitch bend, and other dimensions of expressive control, to be adjusted continuously for individual notes. MPE works by assigning each note to its own MIDI channel so that particular messages can be applied to each note individually. The specifications were released in November 2017 by AMEI and in January 2018 by the MMA. Instruments like the Continuum Fingerboard, Linnstrument, ROLI Seaboard, and Eigenharp let users control pitch, timbre, and other nuances for individual notes within chords. A growing number of soft synths and effects are also compatible with MPE (such as Equator, UVI Falcon, and Sandman Pro), as well as a few hardware synths (such as Modal Electronics 002 and ARGON8, Futuresonus Parva, and Modor NF-1).
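The channel-per-note idea behind MPE can be sketched as a simple allocator. This illustration assumes an MPE lower zone in which channel 1 is the master channel and channels 2–16 carry notes, and it does not handle more than 15 simultaneous notes:

    class MpeAllocator:
        def __init__(self):
            self.free = list(range(2, 17))   # member channels, human-numbered
            self.held = {}                   # note number -> channel

        def note_on(self, note: int) -> int:
            channel = self.free.pop(0)       # each note gets its own channel, so
            self.held[note] = channel        # pitch bend sent on that channel
            return channel                   # affects only this note

        def note_off(self, note: int) -> int:
            channel = self.held.pop(note)
            self.free.append(channel)
            return channel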
In addition to the original 31.25 kbit/s current-loop transported on 5-pin DIN, other connectors have been used for the same electrical data, and transmission of MIDI streams in different forms over USB, IEEE 1394 a.k.a. FireWire, and Ethernet is now common. Some samplers and hard drive recorders can also pass MIDI data between each other over SCSI.
Members of the USB-IF in 1999 developed a standard for MIDI over USB, the "Universal Serial Bus Device Class Definition for MIDI Devices". MIDI over USB has become increasingly common as other interfaces that had been used for MIDI connections (serial, joystick, etc.) disappeared from personal computers. Linux, Microsoft Windows, Macintosh OS X, and Apple iOS operating systems include standard class drivers to support devices that use the "Universal Serial Bus Device Class Definition for MIDI Devices". Some manufacturers choose to implement a MIDI interface over USB that is designed to operate differently from the class specification, using custom drivers.
Apple Computer developed the FireWire interface during the 1990s. It began to appear on digital video cameras toward the end of the decade, and on G3 Macintosh models in 1999. It was created for use with multimedia applications. Unlike USB, FireWire uses intelligent controllers that can manage their own transmission without attention from the main CPU. As with standard MIDI devices, FireWire devices can communicate with each other with no computer present.
The Octave-Plateau Voyetra-8 synthesizer was an early MIDI implementation using XLR3 connectors in place of the 5-pin DIN. It was released in the pre-MIDI years and later retrofitted with a MIDI interface, but kept its XLR connector.
As computer-based studio setups became common, MIDI devices that could connect directly to a computer became available. These typically used the 8-pin mini-DIN connector that was used by Apple for serial and printer ports prior to the introduction of the Blue & White G3 models. MIDI interfaces intended for use as the centerpiece of a studio, such as the Mark of the Unicorn MIDI Time Piece, were made possible by a "fast" transmission mode that could take advantage of these serial ports' ability to operate at 20 times the standard MIDI speed. Mini-DIN ports were built into some late-1990s MIDI instruments, and enabled such devices to be connected directly to a computer. Some devices connected via PCs' DB-25 parallel port, or through the joystick port found in many PC sound cards.
Yamaha introduced the mLAN protocol in 1999. It was conceived as a local area network for musical instruments using FireWire as the transport, and was designed to carry multiple MIDI channels together with multichannel digital audio, data file transfers, and time code. mLAN was used in a number of Yamaha products, notably digital mixing consoles and the Motif synthesizer, and in third-party products such as the PreSonus FIREstation and the Korg Triton Studio. No new mLAN products have been released since 2007.
Computer network implementations of MIDI provide network routing capabilities, and the high-bandwidth channel that earlier alternatives to MIDI, such as ZIPI, were intended to bring. Proprietary implementations have existed since the 1980s, some of which use fiber optic cables for transmission. The Internet Engineering Task Force's RTP-MIDI open specification has gained industry support. Apple has supported this protocol from Mac OS X 10.4 onwards, and a Windows driver based on Apple's implementation exists for Windows XP and newer versions.
Systems for wireless MIDI transmission have been available since the 1980s. Several commercially available transmitters allow wireless transmission of MIDI and OSC signals over Wi-Fi and Bluetooth. iOS devices are able to function as MIDI control surfaces, using Wi-Fi and OSC. An XBee radio can be used to build a wireless MIDI transceiver as a do-it-yourself project. Android devices are able to function as full MIDI control surfaces using several different protocols over Wi-Fi and Bluetooth.
Some devices use standard 3.5 mm TRS audio minijack connectors for MIDI data, including the Korg Electribe 2 and the Arturia Beatstep Pro. Both come with adaptors that break out to standard 5-pin DIN connectors. This usage became widespread enough that the MIDI Manufacturers Association standardized the wiring. The MIDI-over-minijack standards document also recommends the use of 2.5 mm connectors over 3.5 mm ones to avoid confusion with audio connectors.
The MIDI 2.0 standard was presented on 17 January 2020 at the Winter NAMM Show in Anaheim, California, at a session titled "Strategic Overview and Introduction to MIDI 2.0" by representatives of Yamaha, Roli, Microsoft, Google, and the MIDI Association. This significant update adds bidirectional communication while maintaining backwards compatibility.
The new protocol had been researched since 2005. Prototype devices were shown privately at NAMM using wired and wireless connections, and licensing and product certification policies were developed; however, no projected release date was announced at the time. Proposed physical and transport layers included Ethernet-based protocols such as RTP MIDI and Audio Video Bridging/Time-Sensitive Networking, as well as User Datagram Protocol (UDP)-based transport.
AMEI and the MMA announced that complete specifications would be published following interoperability testing of prototype implementations from major manufacturers such as Google, Yamaha, Steinberg, Roland, Ableton, Native Instruments, and ROLI, among others. In January 2020, Roland announced the A-88mkII controller keyboard that supports MIDI 2.0.
MIDI 2.0 includes the MIDI Capability Inquiry specification for property exchange and profiles, and the new Universal MIDI Packet format for high-speed transports, which supports both MIDI 1.0 and MIDI 2.0 voice messages.
MIDI Capability Inquiry (MIDI-CI) specifies Universal SysEx messages to implement device profiles, parameter exchange, and MIDI protocol negotiation. The specifications were released in November 2017 by AMEI and in January 2018 by the MMA.
Property exchange defines methods to query device capabilities, such as supported controllers, patch names, instrument profiles, device configuration and other metadata, and to get or set device configuration settings; it uses System Exclusive messages that carry JSON-format data. Profiles define common sets of MIDI controllers for various instrument types, such as drawbar organs and analog synths, or for particular tasks, improving interoperability between instruments from different manufacturers. Protocol negotiation allows devices to employ the Next Generation protocol or manufacturer-specific protocols.
MIDI 2.0 defines a new Universal MIDI Packet format, which contains messages of varying length (32, 64, 96 or 128 bits) depending on the payload type. This new packet format supports a total of 256 MIDI channels, organized in 16 groups of 16 channels; each group can carry either a MIDI 1.0 Protocol stream or a new MIDI 2.0 Protocol stream, and can also include system messages, system exclusive data, and timestamps for precise rendering of several simultaneous notes. To simplify initial adoption, existing products are explicitly allowed to only implement MIDI 1.0 messages. The Universal MIDI Packet is intended for high-speed transport such as USB and Ethernet and is not supported on the existing 5-pin DIN connections. System Real-Time and System Common messages are the same as defined in MIDI 1.0.
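The packet length can be read from the first four bits of the packet. The sketch below summarizes the correspondence for the basic message types; the type codes are abbreviated from the published format, and the full list is defined by the specification:

    # The top nibble of a Universal MIDI Packet's first 32-bit word is the
    # message type, which determines the packet length in 32-bit words.
    UMP_WORDS = {
        0x0: 1,  # utility messages             (32 bits)
        0x1: 1,  # system real-time and common  (32 bits)
        0x2: 1,  # MIDI 1.0 channel voice       (32 bits)
        0x3: 2,  # 7-bit data / SysEx           (64 bits)
        0x4: 2,  # MIDI 2.0 channel voice       (64 bits)
        0x5: 4,  # 8-bit data                   (128 bits)
    }

    def packet_length_words(first_word: int) -> int:
        return UMP_WORDS[first_word >> 28]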
As of January 2019, the draft specification of the new protocol supports all core messages that also exist in MIDI 1.0, but extends their precision and resolution; it also defines many new high-precision controller messages. The specification defines default translation rules to convert between MIDI 2.0 Channel Voice and MIDI 1.0 Channel Voice messages that use different data resolution, and to map 256 MIDI 2.0 streams to 16 MIDI 1.0 streams.
System Exclusive 8 messages use a new 8-bit data format, based on Universal System Exclusive messages. Mixed Data Set messages are intended to transfer large sets of data. System Exclusive 7 messages use the previous 7-bit data format.
|
https://en.wikipedia.org/wiki?curid=19996
|
Microcode
Microcode is a computer hardware technique that interposes a layer of organisation between the CPU hardware and the programmer-visible instruction set architecture of the computer. As such, the microcode is a layer of hardware-level instructions that implement higher-level machine code instructions or internal state machine sequencing in many digital processing elements. Microcode is used in general-purpose central processing units, although in current desktop CPUs, it is only a fallback path for cases that the faster hardwired control unit cannot handle.
Microcode typically resides in special high-speed memory and translates machine instructions, state machine data or other input into sequences of detailed circuit-level operations. It separates the machine instructions from the underlying electronics so that instructions can be designed and altered more freely. It also facilitates the building of complex multi-step instructions, while reducing the complexity of computer circuits. Writing microcode is often called microprogramming and the microcode in a particular processor implementation is sometimes called a microprogram.
More extensive microcoding allows small and simple microarchitectures to emulate more powerful architectures with wider word length, more execution units and so on, which is a relatively simple way to achieve software compatibility between different products in a processor family.
Some hardware vendors, especially IBM, use the term "microcode" as a synonym for "firmware". In that way, all code within a device is termed "microcode" regardless of it being microcode or machine code; for example, hard disk drives are said to have their microcode updated, though they typically contain both microcode and firmware.
The lowest layer in a computer's software stack is traditionally raw binary machine code instructions for the processor. Microcode sits one level below this. To avoid confusion, each microprogram-related element is differentiated by the "micro" prefix: microinstruction, microassembler, microprogrammer, microarchitecture, etc.
Engineers normally write the microcode during the design phase of a processor, storing it in a read-only memory (ROM) or programmable logic array (PLA) structure, or in a combination of both. However, machines also exist in which some or all of the microcode is stored in SRAM or flash memory; such a store is traditionally denoted a "writable control store" in the context of computers. In that case, the CPU initialization process loads the microcode into the control store from another storage medium, with the possibility of altering the microcode to correct bugs in the instruction set, or to implement new machine instructions.
Complex digital processors may also employ more than one (possibly microcode-based) control unit in order to delegate sub-tasks that must be performed essentially asynchronously in parallel. A high-level programmer, or even an assembly programmer, does not normally see or change microcode. Unlike machine code, which often retains some backward compatibility among different processors in a family, microcode only runs on the exact electronic circuitry for which it is designed, as it constitutes an inherent part of the particular processor design itself.
Microprograms consist of series of microinstructions, which control the CPU at a very fundamental level of hardware circuitry. For example, a single typical "horizontal" microinstruction might simultaneously specify several operations: connecting chosen registers to the two inputs of the ALU, selecting the arithmetic operation to perform, storing the result in a destination register, updating the condition codes, and jumping to the address of the next microinstruction.
To simultaneously control all of the processor's features in one cycle, the microinstruction is often wider than 50 bits; e.g., 128 bits on a 360/85 with an emulator feature. Microprograms are carefully designed and optimized for the fastest possible execution, as a slow microprogram would result in a slow machine instruction and degraded performance for related application programs that use such instructions.
Microcode was originally developed as a simpler method of developing the control logic for a computer. Initially, CPU instruction sets were hardwired. Each step needed to fetch, decode, and execute the machine instructions (including any operand address calculations, reads, and writes) was controlled directly by combinational logic and rather minimal sequential state machine circuitry. While such hard-wired processors were very efficient, the need for powerful instruction sets with multi-step addressing and complex operations ("see below") made them difficult to design and debug; highly encoded and varied-length instructions can contribute to this as well, especially when very irregular encodings are used.
Microcode simplified the job by allowing much of the processor's behaviour and programming model to be defined via microprogram routines rather than by dedicated circuitry. Even late in the design process, microcode could easily be changed, whereas hard-wired CPU designs were very cumbersome to change; this greatly facilitated CPU design.
From the 1940s to the late 1970s, a large portion of programming was done in assembly language; higher-level instructions mean greater programmer productivity, so an important advantage of microcode was the relative ease by which powerful machine instructions can be defined. The ultimate extension of this idea is the "Directly Executable High Level Language" design, in which each statement of a high-level language such as PL/I is entirely and directly executed by microcode, without compilation. The IBM Future Systems project and Data General Fountainhead Processor are examples of this. During the 1970s, CPU speeds grew more quickly than memory speeds and numerous techniques such as memory block transfer, memory pre-fetch and multi-level caches were used to alleviate this. High-level machine instructions, made possible by microcode, helped further, as fewer, more complex machine instructions require less memory bandwidth. For example, an operation on a character string can be done as a single machine instruction, thus avoiding multiple instruction fetches.
Architectures with instruction sets implemented by complex microprograms included the IBM System/360 and Digital Equipment Corporation VAX. The approach of increasingly complex microcode-implemented instruction sets was later called CISC. An alternate approach, used in many microprocessors, is to use PLAs or ROMs (instead of combinational logic) mainly for instruction decoding, and let a simple state machine (without much, or any, microcode) do most of the sequencing. The MOS Technology 6502 is an example of a microprocessor using a PLA for instruction decode and sequencing. The PLA is visible in photomicrographs of the chip, and its operation can be seen in the transistor-level simulation.
Microprogramming is still used in modern CPU designs. In some cases, after the microcode is debugged in simulation, logic functions are substituted for the control store. Logic functions are often faster and less expensive than the equivalent microprogram memory.
A processor's microprograms operate on a more primitive, totally different, and much more hardware-oriented architecture than the assembly instructions visible to normal programmers. In coordination with the hardware, the microcode implements the programmer-visible architecture. The underlying hardware need not have a fixed relationship to the visible architecture. This makes it easier to implement a given instruction set architecture on a wide variety of underlying hardware micro-architectures.
The IBM System/360 has a 32-bit architecture with 16 general-purpose registers, but most System/360 implementations use hardware that implements a much simpler underlying microarchitecture. The System/360 Model 30 has 8-bit data paths to the arithmetic logic unit (ALU) and main memory and implements the general-purpose registers in a special unit of higher-speed core memory; the System/360 Model 40 has 8-bit data paths to the ALU and 16-bit data paths to main memory, and also implements the general-purpose registers in a special unit of higher-speed core memory. The Model 50 has full 32-bit data paths and implements the general-purpose registers in a special unit of higher-speed core memory. The Model 65 through the Model 195 have larger data paths and implement the general-purpose registers in faster transistor circuits. In this way, microprogramming enabled IBM to design many System/360 models with substantially different hardware and spanning a wide range of cost and performance, while making them all architecturally compatible. This dramatically reduced the number of unique system software programs that had to be written for each model.
A similar approach was used by Digital Equipment Corporation (DEC) in their VAX family of computers. As a result, different VAX processors use different microarchitectures, yet the programmer-visible architecture does not change.
Microprogramming also reduces the cost of field changes to correct defects (bugs) in the processor; a bug can often be fixed by replacing a portion of the microprogram rather than by changes being made to hardware logic and wiring.
In 1947, the design of the MIT Whirlwind introduced the concept of a control store as a way to simplify computer design and move beyond "ad hoc" methods. The control store is a diode matrix: a two-dimensional lattice, where one dimension accepts "control time pulses" from the CPU's internal clock, and the other connects to control signals on gates and other circuits. A "pulse distributor" takes the pulses generated by the CPU clock and breaks them up into eight separate time pulses, each of which activates a different row of the lattice. When the row is activated, it activates the control signals connected to it.
Described another way, the signals transmitted by the control store are being played much like a player piano roll. That is, they are controlled by a sequence of very wide words constructed of bits, and they are "played" sequentially. In a control store, however, the "song" is short and repeated continuously.
In 1951, Maurice Wilkes enhanced this concept by adding "conditional execution", a concept akin to a conditional in computer software. His initial implementation consisted of a pair of matrices: the first one generated signals in the manner of the Whirlwind control store, while the second matrix selected which row of signals (the microprogram instruction word, so to speak) to invoke on the next cycle. Conditionals were implemented by providing a way that a single line in the control store could choose from alternatives in the second matrix. This made the control signals conditional on the detected internal signal. Wilkes coined the term microprogramming to describe this feature and distinguish it from a simple control store.
Each microinstruction in a microprogram provides the bits that control the functional elements that internally compose a CPU. The advantage over a hard-wired CPU is that internal CPU control becomes a specialized form of a computer program. Microcode thus transforms a complex electronic design challenge (the control of a CPU) into a less complex programming challenge. To take advantage of this, a CPU is divided into several parts: a microsequencer that picks the next word of the control store, a register set that holds the data the CPU is working on, and an arithmetic and logic unit that performs calculations.
There may also be a memory address register and a memory data register, used to access the main computer storage. Together, these elements form an "execution unit". Most modern CPUs have several execution units. Even simple computers usually have one unit to read and write memory, and another to execute user code. These elements could often be brought together as a single chip. This chip comes in a fixed width that would form a "slice" through the execution unit. These are known as "bit slice" chips. The AMD Am2900 family is one of the best known examples of bit slice elements. The parts of the execution units and the execution units themselves are interconnected by a bundle of wires called a bus.
Programmers develop microprograms, using basic software tools. A microassembler allows a programmer to define the table of bits symbolically. Because of its close relationship to the underlying architecture, "microcode has several properties that make it difficult to generate using a compiler." A simulator program is intended to execute the bits in the same way as the electronics, and allows much more freedom to debug the microprogram. After the microprogram is finalized, and extensively tested, it is sometimes used as the input to a computer program that constructs logic to produce the same data. This program is similar to those used to optimize a programmable logic array. Even without fully optimal logic, heuristically optimized logic can vastly reduce the number of transistors from the number required for a ROM control store. This reduces the cost of producing, and the electricity consumed by, a CPU.
Microcode can be characterized as "horizontal" or "vertical", referring primarily to whether each microinstruction controls CPU elements with little or no decoding (horizontal microcode) or requires extensive decoding by combinatorial logic before doing so (vertical microcode). Consequently, each horizontal microinstruction is wider (contains more bits) and occupies more storage space than a vertical microinstruction.
"Horizontal microcode has several discrete micro-operations that are combined in a single microinstruction for simultaneous operation." Horizontal microcode is typically contained in a fairly wide control store; it is not uncommon for each word to be 108 bits or more. On each tick of a sequencer clock a microcode word is read, decoded, and used to control the functional elements that make up the CPU.
In a typical implementation a horizontal microprogram word comprises fairly tightly defined groups of bits. For example, one simple arrangement might be: register source A, register source B, destination register, arithmetic and logic unit operation, type of jump, and jump address.
For this type of micromachine to implement a JUMP instruction with the address following the opcode, the microcode might require two clock ticks, with the engineer writing one microinstruction of microassembler source for each tick, as in the sketch below.
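Rendered as data rather than as assembler text, such a two-tick JUMP microprogram might look like the following sketch, in which the field names and sequencer commands are invented for illustration:

    # Each microinstruction fills the six hypothetical fields named above:
    # (source A, source B, destination, ALU operation, jump type, jump target)
    JUMP_MICROPROGRAM = [
        # Tick 1: copy the jump target from the memory data register (MDR)
        # to the memory address register (MAR); fall through to the next word.
        ("MDR", "NONE", "MAR", "COPY", "NEXT", None),
        # Tick 2: load the fetched address into the program counter (PC) and
        # micro-jump back to the shared instruction-decode routine.
        ("MAR", "NONE", "PC", "COPY", "JUMP", "instruction_decode"),
    ]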
For each tick it is common to find that only some portions of the CPU are used, with the remaining groups of bits in the microinstruction being no-ops. With careful design of hardware and microcode, this property can be exploited to parallelise operations that use different areas of the CPU; for example, in the case above, the ALU is not required during the first tick, so it could potentially be used to complete an earlier arithmetic instruction.
In vertical microcode, each microinstruction is significantly encoded; that is, the bit fields generally pass through intermediate combinatory logic that, in turn, generates the actual control and sequencing signals for internal CPU elements (ALU, registers, etc.). This is in contrast with horizontal microcode, in which the bit fields themselves either directly produce the control and sequencing signals or are only minimally encoded. Consequently, vertical microcode requires smaller instruction lengths and less storage, but requires more time to decode, resulting in a slower CPU clock.
Some vertical microcode is just the assembly language of a simple conventional computer that is emulating a more complex computer. Some processors, such as DEC Alpha processors and the CMOS microprocessors on later IBM System/390 mainframes and z/Architecture mainframes, use machine code, running in a special mode that gives it access to special instructions, special registers, and other hardware resources not available to regular machine code, to implement some instructions and other functions, such as page table walks on Alpha processors. This is called PALcode on Alpha processors and millicode on IBM mainframe processors.
Another form of vertical microcode has two fields: a "field select" and a "field value".
The "field select" selects which part of the CPU will be controlled by this word of the control store. The "field value" actually controls that part of the CPU. With this type of microcode, a designer explicitly chooses to make a slower CPU to save money by reducing the unused bits in the control store; however, the reduced complexity may increase the CPU's clock frequency, which lessens the effect of an increased number of cycles per instruction.
As transistors became cheaper, horizontal microcode came to dominate the design of CPUs using microcode, with vertical microcode being used less often.
When both vertical and horizontal microcode are used, the horizontal microcode may be referred to as "nanocode" or "picocode".
A few computers were built using "writable microcode". In this design, rather than storing the microcode in ROM or hard-wired logic, the microcode is stored in a RAM called a "writable control store" or "WCS". Such a computer is sometimes called a "writable instruction set computer" or "WISC".
Many experimental prototype computers use writable control stores; there are also commercial machines that use writable microcode, such as the Burroughs Small Systems, early Xerox workstations, the DEC VAX 8800 ("Nautilus") family, the Symbolics L- and G-machines, a number of IBM System/360 and System/370 implementations, some DEC PDP-10 machines, and the Data General Eclipse MV/8000.
Many more machines offer user-programmable writable control stores as an option, including the HP 2100, DEC PDP-11/60 and Varian Data Machines V-70 series minicomputers. The IBM System/370 includes a facility called "Initial-Microprogram Load" ("IML" or "IMPL") that can be invoked from the console, as part of "power-on reset" ("POR") or from another processor in a tightly coupled multiprocessor complex.
Some commercial machines, for example IBM 360/85, have both a read-only storage and a writable control store for microcode.
WCS offers several advantages including the ease of patching the microprogram and, for certain hardware generations, faster access than ROMs can provide. User-programmable WCS allows the user to optimize the machine for specific purposes.
Starting with the Pentium Pro in 1995, several x86 CPUs have writable Intel microcode. This, for example, has allowed bugs in the Intel Core 2 and Intel Xeon microcodes to be fixed by patching their microprograms, rather than requiring the entire chips to be replaced. A second prominent example is the set of microcode patches that Intel offered for some of its processor architectures up to 10 years old, in a bid to counter the security vulnerabilities discovered in its designs, Spectre and Meltdown, which went public at the start of 2018. A microcode update can be installed by Linux, FreeBSD, Microsoft Windows, or the motherboard BIOS.
The design trend toward heavily microcoded processors with complex instructions began in the early 1960s and continued until roughly the mid-1980s. At that point the RISC design philosophy started becoming more prominent.
A CPU that uses microcode generally takes several clock cycles to execute a single instruction, one clock cycle for each step in the microprogram for that instruction. Some CISC processors include instructions that can take a very long time to execute. Such variations interfere with both interrupt latency and, what is far more important in modern systems, pipelining.
When designing a new processor, a hardwired control RISC design has advantages over microcoded CISC, such as simpler and faster instruction decoding and the removal of the extra level of interpretation that microcode imposes; there are counterpoints as well, such as the greater number of instructions, and thus the greater memory bandwidth, that a RISC design needs to express a complex operation.
Many RISC and VLIW processors are designed to execute every instruction (as long as it is in the cache) in a single cycle. This is very similar to the way CPUs with microcode execute one microinstruction per cycle. VLIW processors have instructions that behave similarly to very wide horizontal microcode, although typically without such fine-grained control over the hardware as provided by microcode. RISC instructions are sometimes similar to the narrow vertical microcode.
Microcoding has been popular in application-specific processors such as network processors, microcontrollers, digital signal processors, channel controllers, disk controllers, network interface controllers, graphics processing units, and in other hardware.
Modern CISC implementations, such as the x86 family, decode instructions into dynamically buffered micro-operations ("μops") with an instruction encoding similar to RISC or traditional microcode. A hardwired instruction decode unit directly emits μops for common x86 instructions, but falls back to a more traditional microcode ROM for more complex or rarely used instructions.
For example, an x86 might look up μops from microcode to handle complex multistep operations such as loop or string instructions, floating point unit transcendental functions or unusual values such as denormal numbers, and special purpose instructions such as CPUID.
|
https://en.wikipedia.org/wiki?curid=19999
|
Multitier architecture
In software engineering, multitier architecture (often referred to as "n"-tier architecture) or multilayered architecture is a client–server architecture in which presentation, application processing and data management functions are physically separated. The most widespread use of multitier architecture is the three-tier architecture.
"N"-tier application architecture provides a model by which developers can create flexible and reusable applications. By segregating an application into tiers, developers acquire the option of modifying or adding a specific layer, instead of reworking the entire application. A three-tier architecture is typically composed of a "presentation" tier, a "domain logic" tier, and a "data storage" tier.
While the concepts of layer and tier are often used interchangeably, one fairly common point of view is that there is indeed a difference. This view holds that a "layer" is a logical structuring mechanism for the elements that make up the software solution, while a "tier" is a physical structuring mechanism for the system infrastructure. For example, a three-layer solution could easily be deployed on a single tier, such as a personal workstation.
The "Layers" architectural pattern has been described in various publications.
In a logical multilayered architecture for an information system with an object-oriented design, the following four are the most common: a presentation layer (user interface), an application layer (service layer), a business or domain layer, and a data access layer (also called the persistence or infrastructure layer).
The book "Domain Driven Design" describes some common uses for the above four layers, although its primary focus is the domain layer.
If the application architecture has no explicit distinction between the business layer and the presentation layer (i.e., the presentation layer is considered part of the business layer), then a traditional client-server (two-tier) model has been implemented.
The more usual convention is that the application layer (or service layer) is considered a sublayer of the business layer, typically encapsulating the API definition surfacing the supported business functionality. The application/business layers can, in fact, be further subdivided to emphasize additional sublayers of distinct responsibility. For example, if the model–view–presenter pattern is used, the presenter sublayer might be used as an additional layer between the user interface layer and the business/application layer (as represented by the model sublayer).
Some also identify a separate layer called the business infrastructure layer (BI), located between the business layer(s) and the infrastructure layer(s). It is also sometimes called the "low-level business layer" or the "business services layer". This layer is very general and can be used in several application tiers (e.g. a CurrencyConverter).
The infrastructure layer can be partitioned into different levels (high-level or low-level technical services). Developers often focus on the persistence (data access) capabilities of the infrastructure layer and therefore only talk about the persistence layer or the data access layer (instead of an infrastructure layer or technical services layer). In other words, the other kind of technical services are not always explicitly thought of as part of any particular layer.
A layer sits on top of another because it depends on it. Every layer can exist without the layers above it, and requires the layers below it to function. Another common view is that layers do not always strictly depend only on the adjacent layer below: in a relaxed layered system (as opposed to a strict layered system) a layer can also depend on all the layers below it.
Three-tier architecture is a client–server software architecture pattern in which the user interface (presentation), functional process logic ("business rules"), computer data storage and data access are developed and maintained as independent modules, most often on separate platforms. It was developed by John J. Donovan at Open Environment Corporation (OEC), a tools company he founded in Cambridge, Massachusetts.
Apart from the usual advantages of modular software with well-defined interfaces, the three-tier architecture is intended to allow any of the three tiers to be upgraded or replaced independently in response to changes in requirements or technology. For example, a change of operating system in the "presentation tier" would only affect the user interface code.
Typically, the user interface runs on a desktop PC or workstation and uses a standard graphical user interface, functional process logic that may consist of one or more separate modules running on a workstation or application server, and an RDBMS on a database server or mainframe that contains the computer data storage logic. The middle tier may be multitiered itself (in which case the overall architecture is called an ""n"-tier architecture").
In the web development field, three-tier is often used to refer to websites, commonly electronic commerce websites, which are built using three tiers: a front-end web server serving static content (the presentation tier), an application server generating dynamic content and enforcing business logic (the logic tier), and a back-end database holding the data (the data tier).
Data transfer between tiers is part of the architecture. Protocols involved may include one or more of SNMP, CORBA, Java RMI, .NET Remoting, Windows Communication Foundation, sockets, UDP, web services or other standard or proprietary protocols. Often middleware is used to connect the separate tiers. Separate tiers often (but not necessarily) run on separate physical servers, and each tier may itself run on a cluster.
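As a minimal sketch of tiers separated by a network boundary (standard library only; the port number, message format, and business rule are invented for the example), a "logic tier" can be run as a TCP server and queried by a "presentation tier" client; a real system would use one of the protocols named above rather than a hand-rolled socket exchange:

```python
# Two tiers talking over TCP: a "logic tier" server answers requests
# from a "presentation tier" client. Illustrative only.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # arbitrary example address

# Bind and listen before the client connects, to avoid a startup race.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def logic_tier():
    """Middle tier: applies an invented business rule to each request."""
    conn, _ = srv.accept()
    with conn:
        amount = int(conn.recv(1024).decode())
        conn.sendall(b"approved" if amount <= 100 else b"denied")

worker = threading.Thread(target=logic_tier)
worker.start()

# Presentation tier: submits a request and renders the reply.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"42")
    print("logic tier said:", cli.recv(1024).decode())

worker.join()
srv.close()
```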
The end-to-end traceability of data flows through "n"-tier systems is a challenging task which becomes more important when systems increase in complexity. The Application Response Measurement defines concepts and APIs for measuring performance and correlating transactions between tiers.
Generally, the term "tiers" is used to describe physical distribution of components of a system on separate servers, computers, or networks (processing nodes). A three-tier architecture then will have three processing nodes. The term "layers" refers to a logical grouping of components which may or may not be physically located on one processing node.
|
https://en.wikipedia.org/wiki?curid=20003
|
Myrinet
Myrinet, ANSI/VITA 26-1998, is a high-speed local area networking system designed by the company Myricom to be used as an interconnect between multiple machines to form computer clusters.
Myrinet was promoted as having lower protocol overhead than standards such as Ethernet, and therefore better throughput, less interference, and lower latency, while placing less load on the host CPU. Although it can be used as a traditional networking system, Myrinet is often used directly by programs that "know" about it, thereby bypassing a call into the operating system.
Myrinet physically consists of two fibre optic cables, upstream and downstream, connected to the host computers with a single connector. Machines are connected via low-overhead routers and switches, as opposed to connecting one machine directly to another. Myrinet includes a number of fault-tolerance features, mostly backed by the switches. These include flow control, error control, and "heartbeat" monitoring on every link. The "fourth-generation" Myrinet, called Myri-10G, supports a 10 Gbit/s data rate and can use 10 Gigabit Ethernet at the physical layer (PHY: cables, connectors, distances, signaling). Myri-10G started shipping at the end of 2005.
Myrinet was approved in 1998 by the American National Standards Institute for use on the VMEbus as ANSI/VITA 26-1998. One of the earliest publications on Myrinet is a 1995 IEEE article.
Myrinet is a lightweight protocol with little overhead that allows it to operate with throughput close to the basic signaling speed of the physical layer. For supercomputing, the low latency of Myrinet is even more important than its throughput performance, since, according to Amdahl's law, a high-performance parallel system tends to be bottlenecked by its slowest sequential process, which in all but the most embarrassingly parallel supercomputer workloads is often the latency of message transmission across the network.
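The point can be made quantitative with Amdahl's law. In the worked example below the figures are invented for illustration; the formula itself is standard:

```latex
% Amdahl's law: speedup from parallelising a fraction p of the work
% across N processors. The serial fraction (1 - p), which includes
% time spent waiting on message latency, bounds the speedup.
\[
  S(N) \;=\; \frac{1}{(1 - p) + \dfrac{p}{N}}
\]
% Example with invented figures: if communication latency leaves 5%
% of the run serial (p = 0.95), then even with N -> infinity,
\[
  \lim_{N \to \infty} S(N) \;=\; \frac{1}{1 - p} \;=\; \frac{1}{0.05} \;=\; 20,
\]
% so reducing latency (raising p) can matter more than adding nodes.
```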
According to Myricom, 141 (28.2%) of the June 2005 TOP500 supercomputers used Myrinet technology. In the November 2005 TOP500 the number of supercomputers using Myrinet was down to 101 (20.2%); by November 2006 it was 79 (15.8%); and by November 2007 only 18 (3.6%), a long way behind gigabit Ethernet at 54% and InfiniBand at 24.2%.
In the June 2014 TOP500 list, the number of supercomputers using Myrinet interconnect was 1 (0.2%).
In November 2013, the assets of Myricom (including the Myrinet technology) were acquired by CSP Inc. In 2016, it was reported that Google had also offered to buy the company.
|
https://en.wikipedia.org/wiki?curid=20016
|
Musique concrète
Musique concrète (French for "concrete music") is a type of music composition that utilizes recorded sounds as raw material. Sounds are often modified through the application of audio effects and tape manipulation techniques, and may be assembled into a form of montage. It can feature sounds derived from recordings of musical instruments, the human voice, and the natural environment as well as those created using synthesizers and computer-based digital signal processing. Compositions in this idiom are not restricted to the normal musical rules of melody, harmony, rhythm, metre, and so on. It exploits acousmatic listening, meaning sound identities can often be intentionally obscured or appear unconnected to their source cause.
The theoretical basis of "musique concrète" as a compositional practice was developed by French composer Pierre Schaeffer beginning in the early 1940s, and originally contrasted with "pure" "elektronische Musik" (based solely on the use of electronically produced sounds rather than recorded sounds). Schaeffer's work resulted in the establishment of France's Groupe de Recherches de Musique Concrète (GRMC), which attracted important figures including Pierre Henry, Luc Ferrari, Pierre Boulez, Karlheinz Stockhausen, Edgar Varèse, and Iannis Xenakis. From the late 1960s onward, and particularly in France, the term acousmatic music ("musique acousmatique") started to be used in reference to fixed media compositions that utilized both musique concrète based techniques and live sound spatialisation.
In 1928 music critic André Cœuroy wrote in his book "Panorama of Contemporary Music" that "perhaps the time is not far off when a composer will be able to represent through recording, music specifically composed for the gramophone". In the same period the American composer Henry Cowell, referring to the projects of Nikolai Lopatnikoff, believed that "there was a wide field open for the composition of music for phonographic discs." This sentiment was echoed further in 1930 by Igor Stravinsky, when he stated in the revue "Kultur und Schallplatte" that "there will be a greater interest in creating music in a way that will be peculiar to the gramophone record." The following year, 1931, Boris de Schloezer also expressed the opinion that one could write for the gramophone or for the wireless just as one can for the piano or the violin. Shortly after, German art theorist Rudolf Arnheim discussed the effects of microphonic recording in an essay entitled "Radio", published in 1936. In it the idea of a creative role for the recording medium was introduced, and Arnheim stated that: "The rediscovery of the musicality of sound in noise and in language, and the reunification of music, noise and language in order to obtain a unity of material: that is one of the chief artistic tasks of radio".
In 1942 French composer and theoretician Pierre Schaeffer began his exploration of radiophony when he joined Jacques Copeau and his pupils in the foundation of the Studio d'Essai de la Radiodiffusion nationale. The studio originally functioned as a center for the Resistance movement in French radio, which in August 1944 was responsible for the first broadcasts in liberated Paris. It was here that Schaeffer began to experiment with creative radiophonic techniques using the sound technologies of the time.
The development of Schaeffer's practice was informed by encounters with voice actors, and microphone usage and radiophonic art played an important part in inspiring and consolidating Schaeffer's conception of sound-based composition. Another important influence on Schaeffer's practice was cinema, and the techniques of recording and montage, which were originally associated with cinematographic practice, came to "serve as the substrate of musique concrète." Marc Battier notes that, prior to Schaeffer, Jean Epstein drew attention to the manner in which sound recording revealed what was hidden in the act of basic acoustic listening. Epstein's reference to this "phenomenon of an epiphanic being", which appears through the transduction of sound, proved influential on Schaeffer's concept of reduced listening. Schaeffer would explicitly cite Jean Epstein with reference to his use of extra-musical sound material. Epstein had already imagined that "through the transposition of natural sounds, it becomes possible to create chords and dissonances, melodies and symphonies of noise, which are a new and specifically cinematographic music".
The activity of Egyptian composer Halim El-Dabh may even predate Schaeffer's preliminary experiments in sound manipulation (assuming those took place no earlier than 1944, rather than as early as the foundation of the Studio d'Essai in 1942). As a student in Cairo in the early to mid-1940s he began experimenting with "tape music" using a cumbersome wire recorder. He recorded the sounds of an ancient "zaar" ceremony and at the Middle East Radio studios processed the material using reverberation, echo, voltage controls, and re-recording. The resulting tape-based composition, entitled "The Expression of Zaar", was presented in 1944 at an art gallery event in Cairo. El-Dabh has described his initial activities as an attempt to unlock "the inner sound" of the recordings. While his early compositional work was not widely known outside of Egypt at the time, El-Dabh would eventually gain recognition for his influential work at the Columbia-Princeton Electronic Music Center in the late 1950s.
Following Schaeffer's work with the Studio d'Essai at Radiodiffusion Nationale during the early 1940s he was credited with originating the theory and practice of "musique concrète." The Studio d'Essai was renamed Club d'Essai de la Radiodiffusion-Télévision Française in 1946 and in the same year Schaeffer discussed, in writing, the question surrounding the transformation of time perceived through recording. The essay evidenced knowledge of sound manipulation techniques he would further exploit compositionally. In 1948 Schaeffer formally initiated "research into noises" at the Club d'Essai, and on 5 October 1948 the results of his initial experimentation were premiered at a concert given in Paris. Five works for phonograph, known collectively as "Cinq études de bruits" (Five Studies of Noises) and including "Étude violette" ("Study in Purple") and "Étude aux chemins de fer" ("Study with Railroads"), were presented.
By 1949 Schaeffer's compositional work was known publicly as "musique concrète". Schaeffer stated: "when I proposed the term 'musique concrète,' I intended … to point out an opposition with the way musical work usually goes. Instead of notating musical ideas on paper with the symbols of solfege and entrusting their realization to well-known instruments, the question was to collect concrete sounds, wherever they came from, and to abstract the musical values they were potentially containing". According to Pierre Henry, "musique concrète was not a study of timbre, it is focused on envelopes, forms. It must be presented by means of non-traditional characteristics, you see … one might say that the origin of this music is also found in the interest in 'plastifying' music, of rendering it plastic like sculpture…musique concrète, in my opinion … led to a manner of composing, indeed, a new mental framework of composing". Schaeffer had developed an aesthetic that was centred upon the use of sound as a primary compositional resource. The aesthetic also emphasised the importance of play ("jeu") in the practice of sound based composition. Schaeffer's use of the word "jeu", from the verb "jouer", carries the same double meaning as the English verb play: 'to enjoy oneself by interacting with one's surroundings', as well as 'to operate a musical instrument'.
By 1951 the work of Schaeffer, composer-percussionist Pierre Henry, and sound engineer Jacques Poullin had received official recognition and the Groupe de Recherches de Musique Concrète, Club d'Essai de la Radiodiffusion-Télévision Française was established at RTF in Paris, the ancestor of the ORTF. At RTF the GRMC established the first purpose-built electroacoustic music studio. It quickly attracted many who either were or were later to become notable composers, including Olivier Messiaen, Pierre Boulez, Jean Barraqué, Karlheinz Stockhausen, Edgard Varèse, Iannis Xenakis, Michel Philippot, and Arthur Honegger. Compositional output from 1951 to 1953 comprised "Étude I" (1951) and "Étude II" (1951) by Boulez, "Timbres-durées" (1952) by Messiaen, "Étude aux mille collants" (1952) by Stockhausen, "Le microphone bien tempéré" (1952) and "La voile d'Orphée" (1953) by Henry, "Étude I" (1953) by Philippot, "Étude" (1953) by Barraqué, the mixed pieces "Toute la lyre" (1951) and "Orphée 53" (1953) by Schaeffer/Henry, and the film music "Masquerage" (1952) by Schaeffer and "Astrologie" (1953) by Henry. In 1954 Varèse and Honegger visited to work on the tape parts of "Déserts" and "La rivière endormie".
In the early and mid-1950s Schaeffer's commitments to RTF included official missions which often required extended absences from the studios. This led him to invest Philippe Arthuys with responsibility for the GRMC in his absence, with Pierre Henry operating as Director of Works. Pierre Henry's composing talent developed greatly during this period at the GRMC and he worked with experimental filmmakers such as Max de Haas, Jean Grémillon, Enrico Fulchignoni, and Jean Rouch, and with choreographers including Dick Sanders and Maurice Béjart. Schaeffer returned to run the group at the end of 1957, and immediately stated his disapproval of the direction the GRMC had taken. A proposal was then made to "renew completely the spirit, the methods and the personnel of the Group, with a view to undertake research and to offer a much needed welcome to young composers".
Following the emergence of differences within the GRMC, Pierre Henry, Philippe Arthuys, and several of their colleagues resigned in April 1958. Schaeffer created a new collective, called the Groupe de Recherches Musicales (GRM), and set about recruiting new members including Luc Ferrari, Beatriz Ferreyra, François-Bernard Mâche, Iannis Xenakis, Bernard Parmegiani, and Mireille Chamass-Kyrou. Later arrivals included Ivo Malec, Philippe Carson, Romuald Vandelle, Edgardo Canton and François Bayle.
GRM was one of several theoretical and experimental groups working under the umbrella of the Schaeffer-led Service de la Recherche at ORTF (1960–74). Together with the GRM, three other groups existed: the Groupe de Recherches Image (GRI), the Groupe de Recherches Technologiques (GRT) and the Groupe de Recherches which became the Groupe d'Etudes Critiques. Communication was the one theme that unified the various groups, all of which were devoted to production and creation. In terms of the question "who says what to whom?" Schaeffer added "how?", thereby creating a platform for research into audiovisual communication and mass media, audible phenomena and music in general (including non-Western musics) (Beatriz Ferreyra, new preface to Schaeffer and Reibel 1967, reedition of 1998, 9). At the GRM the theoretical teaching remained based on practice and could be summed up in the catch phrase "do and listen".
Schaeffer kept up a practice established with the GRMC of delegating the functions (though not the title) of Group Director to colleagues. Since 1961 GRM has had six Group Directors: Michel Philippot (1960–61), Luc Ferrari (1962–63), and Bernard Baschet and François Vercken (1964–66). From the beginning of 1966, François Bayle took over the direction for thirty-one years, until 1997. He was then replaced by Daniel Teruggi.
The group continued to refine Schaeffer's ideas and strengthened the concept of "musique acousmatique". Schaeffer had borrowed the term acousmatic from Pythagoras and defined it as: ""Acousmatic, adjective: referring to a sound that one hears without seeing the causes behind it"". In 1966 Schaeffer published the book "Traité des objets musicaux" (Treatise on Musical Objects) which represented the culmination of some 20 years of research in the field of "musique concrète". In conjunction with this publication, a set of sound recordings was produced, entitled "Le solfège de l'objet sonore" (Music Theory of the Acoustic Object), to provide examples of concepts dealt with in the treatise.
The development of musique concrète was facilitated by the emergence of new music technology in post-war Europe. Access to microphones, phonographs, and later magnetic tape recorders (created in 1939 and acquired by Schaeffer's Groupe de Recherche de Musique Concrète (Research Group on Concrete Music) in 1952), facilitated by an association with the French national broadcasting organization, at that time the Radiodiffusion-Télévision Française, gave Schaeffer and his colleagues an opportunity to experiment with recording technology and tape manipulation.
In 1948, a typical radio studio consisted of a series of shellac record players, a shellac record recorder, a mixing desk with rotating potentiometers, mechanical reverberation, filters, and microphones. This technology made a limited number of operations available to a composer.
The application of these technologies in the creation of musique concrète led to the development of a number of sound manipulation techniques.
The first tape recorders started arriving at ORTF in 1949; however, their functioning was much less reliable than that of the shellac players, to the point that the "Symphonie pour un homme seul", which was composed in 1950–51, was mainly composed with records, even though the tape recorder was available. In 1950, when the machines finally functioned correctly, the techniques of musique concrète were expanded. A range of new sound manipulation practices were explored using improved media manipulation methods and operations such as speed variation. A completely new possibility of organising sounds appeared with tape editing, which permitted tape to be spliced and arranged with extraordinary new precision. The "axe-cut junctions" were replaced with micrometric junctions, and a whole new technique of production, less dependent on performance skills, could be developed. Tape editing brought a new technique called "micro-editing", in which very tiny fragments of sound, representing milliseconds of time, were edited together, thus creating completely new sounds or structures.
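Micro-editing has a direct digital analogue in array slicing and concatenation. The sketch below (Python with NumPy; the fragment positions and lengths are invented) splices millisecond-scale fragments of a recorded signal into a new montage:

```python
# Digital analogue of tape micro-editing: cut millisecond-scale
# fragments out of a recording and splice them into a new order.
import numpy as np

RATE = 44100  # samples per second
t = np.arange(RATE) / RATE
recording = np.sin(2 * np.pi * 220 * t)  # stand-in for a recorded sound

def fragment(signal, start_ms, length_ms, rate=RATE):
    """Cut a fragment a few milliseconds long, as a tape splice would."""
    a = int(start_ms * rate / 1000)
    b = a + int(length_ms * rate / 1000)
    return signal[a:b]

# Splice order and fragment lengths are invented for the example.
montage = np.concatenate([
    fragment(recording, 0, 15),
    fragment(recording, 500, 8),
    fragment(recording, 120, 30),
    fragment(recording, 700, 8),
])
print(f"montage of {len(montage)} samples "
      f"({1000 * len(montage) / RATE:.1f} ms)")
```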
During the GRMC period, from 1951 to 1958, Schaeffer and Jacques Poullin developed a number of novel sound creation tools, including a three-track tape recorder; a machine with ten playback heads to replay tape loops in echo (the morphophone); a keyboard-controlled machine to replay tape loops at twenty-four preset speeds (the keyboard, chromatic, or Tolana phonogène); a slide-controlled machine to replay tape loops at a continuously variable range of speeds (the handle, continuous, or Sareg phonogène); and a device to distribute an encoded track across four loudspeakers, including one hanging from the centre of the ceiling (the potentiomètre d'espace).
Speed variation was a powerful tool for sound design applications. It had been identified that transformations brought about by varying playback speed led to modifications in the character of the sound material.
The phonogène was a machine capable of modifying sound structure significantly, and it provided composers with a means to adapt sound to meet specific compositional contexts. The initial phonogènes were manufactured in 1953 by two subcontractors: the chromatic phonogène by a company called Tolana, and the sliding version by the SAREG Company. A third version was developed later at ORTF.
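What the phonogène did mechanically, changing pitch and duration together by changing tape speed, corresponds digitally to resampling. A minimal sketch follows, with invented speed ratios apart from the equal-tempered semitone 2^(1/12) that a chromatic preset might approximate:

```python
# Speed variation as resampling: playing a signal back at k times the
# original speed divides its duration by k and multiplies its pitch by k.
import numpy as np

RATE = 44100
t = np.arange(RATE) / RATE
source = np.sin(2 * np.pi * 440 * t)  # one second of A440

def play_at_speed(signal, factor):
    """Nearest-neighbour resampling, like dragging tape at a new speed."""
    idx = (np.arange(int(len(signal) / factor)) * factor).astype(int)
    return signal[idx]

# 2 ** (1 / 12) is one equal-tempered semitone; a sliding phonogène
# could instead vary the factor continuously.
semitone_up = play_at_speed(source, 2 ** (1 / 12))
octave_up = play_at_speed(source, 2.0)
print(len(source), len(semitone_up), len(octave_up))  # durations shrink
```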
The three-track tape recorder mentioned above was one of the first machines permitting the simultaneous listening of several synchronised sources. Until 1958 musique concrète, radio and the studio machines were monophonic. The three-head tape recorder superposed three magnetic tapes that were dragged by a common motor, each tape having an independent spool. The objective was to keep the three tapes synchronised from a common starting point. Works could then be conceived polyphonically, and thus each head conveyed a part of the information and was listened to through a dedicated loudspeaker. It was an ancestor of the multi-track player (four, then eight, tracks) that appeared in the 1960s. "Timbres Durées" by Olivier Messiaen, with the technical assistance of Pierre Henry, was the first work composed for this tape recorder, in 1952. A rapid rhythmic polyphony was distributed over the three channels.
The morphophone was conceived to build complex forms through repetition and accumulation of events, through delays, filtering and feedback. It consisted of a large rotating disk, 50 cm in diameter, on which was stuck a tape with its magnetic side facing outward. A series of twelve movable magnetic heads (one recording head, one erasing head, and ten playback heads) were positioned around the disk, in contact with the tape. A sound up to four seconds long could be recorded on the looped tape and the ten playback heads would then read the information with different delays, according to their (adjustable) positions around the disk. A separate amplifier and band-pass filter for each head could modify the spectrum of the sound, and additional feedback loops could transmit the information to the recording head. The resulting repetitions of a sound occurred at different time intervals, and could be filtered or modified through feedback. This system was also easily capable of producing artificial reverberation or continuous sounds.
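Functionally, the morphophone is a multi-tap delay line with feedback. The sketch below models only that core: the tap times, gains and feedback amount are invented example values, and the per-head band-pass filters of the real device are omitted.

```python
# A multi-tap delay with feedback, the digital core of what the
# morphophone's ten playback heads did mechanically. The feedback is
# applied once here; the real device fed back continuously.
import numpy as np

RATE = 44100

def morphophone(dry, tap_delays_ms, tap_gains, feedback=0.3, rate=RATE):
    taps = [int(ms * rate / 1000) for ms in tap_delays_ms]
    out = np.zeros(len(dry) + max(taps))
    loop = np.copy(dry)                       # signal on the rotating loop
    out[:len(dry)] += dry
    for delay, gain in zip(taps, tap_gains):  # each playback head reads
        out[delay:delay + len(loop)] += gain * loop
    # Crude single-pass feedback onto the first tap position.
    out[taps[0]:taps[0] + len(loop)] += feedback * loop
    return out

impulse = np.zeros(RATE // 4)
impulse[0] = 1.0
wet = morphophone(impulse, tap_delays_ms=[50, 90, 140],
                  tap_gains=[0.8, 0.6, 0.4])
print(np.nonzero(wet)[0][:8])  # repeats appear at the tap positions
```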
At the premiere of Pierre Schaeffer's "Symphonie pour un homme seul" in 1951, a system that was designed for the spatial control of sound was tested. It was called a "relief desk" ("pupitre de relief", but also referred to as "pupitre d'espace" or "potentiomètre d'espace") and was intended to control the dynamic level of music played from several shellac players. This created a stereophonic effect by controlling the positioning of a monophonic sound source. One of five tracks, provided by a purpose-built tape machine, was controlled by the performer and the other four tracks each supplied a single loudspeaker. This provided a mixture of live and preset sound positions. The placement of loudspeakers in the performance space included two loudspeakers at the front right and left of the audience, one placed at the rear, and in the centre of the space a loudspeaker was placed in a high position above the audience. The sounds could therefore be moved around the audience, rather than just across the front stage. On stage, the control system allowed a performer to position a sound either to the left or right, above or behind the audience, simply by moving a small, hand held transmitter coil towards or away from four somewhat larger receiver coils arranged around the performer in a manner reflecting the loudspeaker positions. A contemporary eyewitness described the "potentiomètre d'espace" in normal use:
One found one's self sitting in a small studio which was equipped with four loudspeakers—two in front of one—right and left; one behind one and a fourth suspended above. In the front center were four large loops and an "executant" moving a small magnetic unit through the air. The four loops controlled the four speakers, and while all four were giving off sounds all the time, the distance of the unit from the loops determined the volume of sound sent out from each. The music thus came to one at varying intensity from various parts of the room, and this "spatial projection" gave new sense to the rather abstract sequence of sound originally recorded. The central concept underlying this method was the notion that music should be controlled during public presentation in order to create a performance situation, an attitude that has stayed with acousmatic music to the present day.
After the longstanding rivalry with the "electronic music" of the Cologne studio had subsided, in 1970 the GRM finally created an electronic studio using tools developed by the physicist Enrico Chiarucci, called the Studio 54, which featured the "Coupigny modular synthesiser" and a Moog synthesiser. The Coupigny synthesiser, named for its designer François Coupigny, director of the Group for Technical Research (Battier 2007, 200), and the Studio 54 mixing desk had a major influence on the evolution of GRM, and from the point of their introduction they brought a new quality to the music. The mixing desk and synthesiser were combined in one unit and were created specifically for the creation of musique concrète.
The design of the desk was influenced by trade union rules at French National Radio that required technicians and production staff to have clearly defined duties. The solitary practice of musique concrète composition did not suit a system that involved three operators: one in charge of the machines, a second controlling the mixing desk, and a third to provide guidance to the others. Because of this the synthesiser and desk were combined and organised in a manner that allowed the unit to be used easily by a composer. Independently of the mixing tracks (twenty-four in total), it had a coupled connection patch that permitted the organisation of the machines within the studio. It also had a number of remote controls for operating tape recorders. The system was easily adaptable to any context, particularly that of interfacing with external equipment.
Before the late 1960s the musique concrète produced at GRM had largely been based on the recording and manipulation of sounds, but synthesised sounds had featured in a number of works prior to the introduction of the Coupigny. Pierre Henry had used oscillators to produce sounds as early as 1955. But a synthesiser with envelope control was something Pierre Schaeffer was against, since it favoured the preconception of music and therefore deviated from Schaeffer's principle of "making through listening". Because of Schaeffer's concerns the Coupigny synthesiser was conceived as a sound-event generator with parameters controlled globally, without a means to define values as precisely as some other synthesisers of the day.
The development of the machine was constrained by several factors. It needed to be modular, and the modules had to be easily interconnected (so that the synthesiser would have more modules than slots and an easy-to-use patch). It also needed to include all the major functions of a modular synthesiser, including oscillators, noise generators, filters and ring modulators, but an intermodulation facility was viewed as the primary requirement: to enable complex synthesis processes such as frequency modulation, amplitude modulation, and modulation via an external source. No keyboard was attached to the synthesiser; instead a specific and somewhat complex envelope generator was used to shape sound. This synthesiser was well adapted to the production of continuous and complex sounds using intermodulation techniques such as cross-synthesis and frequency modulation, but was less effective in generating precisely defined frequencies and triggering specific sounds.
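The intermodulation processes named above are easy to sketch digitally. In the fragment below, frequency and amplitude modulation are shown with invented parameter values; this illustrates the general technique, not the Coupigny's actual circuitry:

```python
# Frequency modulation: one oscillator's output drives another
# oscillator's frequency, giving complex, continuously evolving spectra.
import numpy as np

RATE = 44100
t = np.arange(RATE) / RATE

carrier_hz = 300.0    # example values, not Coupigny presets
modulator_hz = 70.0
index = 5.0           # modulation depth

modulator = np.sin(2 * np.pi * modulator_hz * t)
fm = np.sin(2 * np.pi * carrier_hz * t + index * modulator)

# Amplitude modulation, another intermodulation the patch allowed:
am = (0.5 + 0.5 * modulator) * np.sin(2 * np.pi * carrier_hz * t)
print(fm[:3], am[:3])
```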
The Coupigny synthesiser also served as the model for a smaller, portable unit, which has been used down to the present day.
In 1966 composer and technician François Bayle was placed in charge of the Groupe de Recherches Musicales, and in 1975 GRM was integrated with the new Institut national de l'audiovisuel (INA, the Audiovisual National Institute) with Bayle as its head. Taking the lead on work that began in the early 1950s with Jacques Poullin's potentiomètre d'espace, a system designed to move monophonic sound sources across four speakers, Bayle and the engineer Jean-Claude Lallemand created an orchestra of loudspeakers ("un orchestre de haut-parleurs") known as the Acousmonium in 1974. An inaugural concert took place on 14 February 1974 at the Espace Pierre Cardin in Paris with a presentation of Bayle's "Expérience acoustique".
The Acousmonium is a specialised sound reinforcement system consisting of between 50 and 100 loudspeakers of varying shape and size, depending on the character of the concert. The system was designed specifically for the concert presentation of musique-concrète-based works but with the added enhancement of sound spatialisation. Loudspeakers are placed both on stage and at positions throughout the performance space, and a mixing console is used to manipulate the placement of acousmatic material across the speaker array, using a performative technique known as "sound diffusion". Bayle has commented that the purpose of the Acousmonium is to "substitute a momentary classical disposition of sound making, which diffuses the sound from the circumference towards the centre of the hall, by a group of sound projectors which form an 'orchestration' of the acoustic image".
As of 2010, the Acousmonium was still performing, with 64 speakers, 35 amplifiers, and 2 consoles.
|
https://en.wikipedia.org/wiki?curid=20017
|
Metric space
In mathematics, a metric space is a set together with a metric on the set. The metric is a function that defines a concept of "distance" between any two members of the set, which are usually called points. The metric satisfies a few simple properties. Informally: the distance from a point to itself is zero; the distance between two distinct points is positive; the distance from "A" to "B" is the same as the distance from "B" to "A"; and the direct distance from "A" to "B" is never greater than the distance from "A" to "B" via any third point "C" (the triangle inequality).
A metric on a space induces topological properties like open and closed sets, which lead to the study of more abstract topological spaces.
The most familiar metric space is 3-dimensional Euclidean space. In fact, a "metric" is the generalization of the Euclidean metric arising from the four long-known properties of the Euclidean distance. The Euclidean metric defines the distance between two points as the length of the straight line segment connecting them. Other metric spaces occur for example in elliptic geometry and hyperbolic geometry, where distance on a sphere measured by angle is a metric, and the hyperboloid model of hyperbolic geometry is used by special relativity as a metric space of velocities.
In 1906 Maurice Fréchet introduced metric spaces in his work "Sur quelques points du calcul fonctionnel". However the name is due to Felix Hausdorff.
A metric space is an ordered pair ("M", "d") where "M" is a set and "d" is a metric on "M", i.e., a function
"d" : "M" × "M" → ℝ
such that for any "x", "y", "z" in "M", the following holds:
1. "d"("x", "y") = 0 if and only if "x" = "y" (identity of indiscernibles),
2. "d"("x", "y") = "d"("y", "x") (symmetry),
3. "d"("x", "z") ≤ "d"("x", "y") + "d"("y", "z") (triangle inequality).
Given the above three axioms, we also have that "d"("x", "y") ≥ 0 for any "x", "y" in "M". This is deduced as follows: 2"d"("x", "y") = "d"("x", "y") + "d"("y", "x") ≥ "d"("x", "x") = 0, using symmetry, the triangle inequality, and the identity of indiscernibles in turn.
The function "d" is also called "distance function" or simply "distance". Often, "d" is omitted and one just writes "M" for a metric space if it is clear from the context what metric is used.
Ignoring mathematical details, for any system of roads and terrains the distance between two locations can be defined as the length of the shortest route connecting those locations. To be a metric there shouldn't be any one-way roads. The triangle inequality expresses the fact that detours aren't shortcuts. If the distance between two points is zero, the two points are indistinguishable from one-another. Many of the examples below can be seen as concrete versions of this general idea.
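As a concrete illustration (a numerical spot check on sample points, not a proof), the snippet below implements two metrics on the plane, the usual Euclidean metric and the discrete metric, and verifies the axioms on a handful of points:

```python
# Two metrics on R^2 and a spot check of the metric axioms.
import math
import itertools

def euclidean(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def discrete(p, q):
    # The discrete metric: 0 for identical points, 1 otherwise.
    return 0.0 if p == q else 1.0

points = [(0.0, 0.0), (3.0, 4.0), (-1.0, 2.0)]
for d in (euclidean, discrete):
    for x, y, z in itertools.product(points, repeat=3):
        assert d(x, y) >= 0                         # non-negativity
        assert (d(x, y) == 0) == (x == y)           # identity of indiscernibles
        assert d(x, y) == d(y, x)                   # symmetry
        assert d(x, z) <= d(x, y) + d(y, z) + 1e-9  # triangle inequality
print("axioms hold on the sample points")
```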
Every metric space is a topological space in a natural manner, and therefore all definitions and theorems about general topological spaces also apply to all metric spaces.
About any point "x" in a metric space "M" we define the open ball of radius "r" > 0 (where "r" is a real number) about "x" as the set B("x"; "r") = {"y" ∈ "M" : "d"("x", "y") < "r"}.
These open balls form the base for a topology on "M", making it a topological space.
Explicitly, a subset "U" of "M" is called open if for every "x" in "U" there exists an "r" > 0 such that B("x"; "r") is contained in "U". The complement of an open set is called closed. A neighborhood of the point "x" is any subset of "M" that contains an open ball about "x" as a subset.
A topological space which can arise in this way from a metric space is called a metrizable space.
A sequence ("x""n") in a metric space "M" is said to converge to the limit "x" in "M" if and only if for every "ε" > 0, there exists a natural number "N" such that "d"("x""n", "x") < "ε" for all "n" > "N". Equivalently, one can use the general definition of convergence available in all topological spaces.
A subset "A" of the metric space "M" is closed if and only if every sequence in "A" that converges to a limit in "M" has its limit in "A".
A metric space "M" is said to be complete if every Cauchy sequence converges in "M". That is to say: if "d"("x""n", "x""m") → 0 as both "n" and "m" independently go to infinity, then there is some "y" in "M" with "d"("x""n", "y") → 0.
Every Euclidean space is complete, as is every closed subset of a complete space. The rational numbers, using the absolute value metric "d"("x", "y") = |"x" − "y"|, are not complete.
Every metric space has a unique (up to isometry) completion, which is a complete space that contains the given space as a dense subset. For example, the real numbers are the completion of the rationals.
If "A" is a complete subset of the metric space "M", then "A" is closed in "M". Indeed, a space is complete if and only if it is closed in any containing metric space.
Every complete metric space is a Baire space.
A metric space "M" is called bounded if there exists some number "r", such that for all "x" and "y" in "M". The smallest possible such "r" is called the diameter of "M". The space "M" is called precompact or totally bounded if for every there exist finitely many open balls of radius "r" whose union covers "M". Since the set of the centres of these balls is finite, it has finite diameter, from which it follows (using the triangle inequality) that every totally bounded space is bounded. The converse does not hold, since any infinite set can be given the discrete metric (one of the examples above) under which it is bounded and yet not totally bounded.
Note that in the context of intervals in the space of real numbers, and occasionally regions in a Euclidean space ℝ"n", a bounded set is referred to as "a finite interval" or "finite region". However, boundedness should not in general be confused with "finite", which refers to the number of elements, not to how far the set extends; finiteness implies boundedness, but not conversely. Also note that an unbounded subset of ℝ"n" may have a finite volume.
A metric space "M" is compact if every sequence in "M" has a subsequence that converges to a point in "M". This is known as sequential compactness and, in metric spaces (but not in general topological spaces), is equivalent to the topological notions of countable compactness and compactness defined via open covers.
Examples of compact metric spaces include the closed interval [0,1] with the absolute value metric, all metric spaces with finitely many points, and the Cantor set. Every closed subset of a compact space is itself compact.
A metric space is compact if and only if it is complete and totally bounded. This is known as the Heine–Borel theorem. Note that compactness depends only on the topology, while boundedness depends on the metric.
Lebesgue's number lemma states that for every open cover of a compact metric space "M", there exists a "Lebesgue number" δ such that every subset of "M" of diameter < δ is contained in some member of the cover.
Every compact metric space is second countable, and is a continuous image of the Cantor set. (The latter result is due to Pavel Alexandrov and Urysohn.)
A metric space is said to be locally compact if every point has a compact neighborhood. Euclidean spaces are locally compact, but infinite-dimensional Banach spaces are not.
A space is proper if every closed ball {"y" : "d"("x","y") ≤ "r"} is compact. Proper spaces are locally compact, but the converse is not true in general.
A metric space "M" is connected if the only subsets that are both open and closed are the empty set and "M" itself.
A metric space "M" is path connected if for any two points "x", "y" in "M" there exists a continuous map "f" : [0, 1] → "M" with "f"(0) = "x" and "f"(1) = "y".
Every path connected space is connected, but the converse is not true in general.
There are also local versions of these definitions: locally connected spaces and locally path connected spaces.
Simply connected spaces are those that, in a certain sense, do not have "holes".
A metric space is a separable space if it has a countable dense subset. Typical examples are the real numbers or any Euclidean space. For metric spaces (but not for general topological spaces) separability is equivalent to second-countability and also to the Lindelöf property.
If "M" is a nonempty metric space and "x"0 ∈ "M", then ("M", "d", "x"0) is called a "pointed metric space", and "x"0 is called a "distinguished point". Note that a pointed metric space is just a nonempty metric space with attention drawn to its distinguished point, and that any nonempty metric space can be viewed as a pointed metric space. The distinguished point is sometimes denoted 0 due to its similar behavior to zero in certain contexts.
Suppose ("M"1,"d"1) and ("M"2,"d"2) are two metric spaces.
The map "f":"M"1→"M"2 is continuous
if it has one (and therefore all) of the following equivalent properties:
Moreover, "f" is continuous if and only if it is continuous on every compact subset of "M"1.
The image of every compact set under a continuous function is compact, and the image of every connected set under a continuous function is connected.
The map "ƒ" : "M"1 → "M"2 is uniformly continuous if for every "ε" > 0 there exists "δ" > 0 such that
Every uniformly continuous map "ƒ" : "M"1 → "M"2 is continuous. The converse is true if "M"1 is compact (Heine–Cantor theorem).
Uniformly continuous maps turn Cauchy sequences in "M"1 into Cauchy sequences in "M"2. For continuous maps this is generally wrong; for example, a continuous map from the open interval (0, 1) "onto" the real line (such as "x" ↦ tan(π"x" − π/2)) turns some Cauchy sequences into unbounded sequences.
Given a real number "K" > 0, the map "ƒ" : "M"1 → "M"2 is "K"-Lipschitz continuous if
Every Lipschitz-continuous map is uniformly continuous, but the converse is not true in general.
If "K" < 1, then "ƒ" is called a contraction. Suppose "M"2 = "M"1 and "M"1 is complete. If "ƒ" is a contraction, then "ƒ" admits a unique fixed point (Banach fixed point theorem). If "M"1 is compact, the condition can be weakened a bit: "ƒ" admits a unique fixed point if
The map "f":"M"1→"M"2 is an isometry if
Isometries are always injective; the image of a compact or complete set under an isometry is compact or complete, respectively. However, if the isometry is not surjective, then the image of a closed (or open) set need not be closed (or open).
The map "f" : "M"1 → "M"2 is a quasi-isometry if there exist constants "A" ≥ 1 and "B" ≥ 0 such that
and a constant "C" ≥ 0 such that every point in "M"2 has a distance at most "C" from some point in the image "f"("M"1).
Note that a quasi-isometry is not required to be continuous. Quasi-isometries compare the "large-scale structure" of metric spaces; they find use in geometric group theory in relation to the word metric.
Given two metric spaces ("M"1, "d"1) and ("M"2, "d"2), they can be considered equivalent in several inequivalent senses, depending on how much of the metric structure a correspondence between them is required to preserve: for example, they may be homeomorphic (topologically equivalent) or isometric (metrically equivalent).
Metric spaces are paracompact Hausdorff spaces and hence normal (indeed they are perfectly normal). An important consequence is that every metric space admits partitions of unity and that every continuous real-valued function defined on a closed subset of a metric space can be extended to a continuous map on the whole space (Tietze extension theorem). It is also true that every real-valued Lipschitz-continuous map defined on a subset of a metric space can be extended to a Lipschitz-continuous map on the whole space.
Metric spaces are first countable since one can use balls with rational radius as a neighborhood base.
The metric topology on a metric space "M" is the coarsest topology on "M" relative to which the metric "d" is a continuous map from the product of "M" with itself to the non-negative real numbers.
A simple way to construct a function separating a point from a closed set (as required for a completely regular space) is to consider the distance between the point and the set. If ("M", "d") is a metric space, "S" is a subset of "M" and "x" is a point of "M", we define the distance from "x" to "S" as "d"("x", "S") = inf{"d"("x", "s") : "s" ∈ "S"}.
Then "d"("x", "S") = 0 if and only if "x" belongs to the closure of "S". Furthermore, we have the following generalization of the triangle inequality: "d"("x", "S") ≤ "d"("x", "y") + "d"("y", "S"),
which in particular shows that the map "x" ↦ "d"("x", "S") is continuous.
Given two subsets "S" and "T" of "M", we define their Hausdorff distance to be "d"H("S", "T") = max{sup{"d"("s", "T") : "s" ∈ "S"}, sup{"d"("t", "S") : "t" ∈ "T"}}.
In general, the Hausdorff distance "d"H("S","T") can be infinite. Two sets are close to each other in the Hausdorff distance if every element of either set is close to some element of the other set.
The Hausdorff distance "d"H turns the set "K"("M") of all non-empty compact subsets of "M" into a metric space. One can show that "K"("M") is complete if "M" is complete.
One can then define the Gromov–Hausdorff distance between any two metric spaces by considering the minimal Hausdorff distance of isometrically embedded versions of the two spaces. Using this distance, the class of all (isometry classes of) compact metric spaces becomes a metric space in its own right.
If ("M"1, "d"1), …, ("M""n", "d""n") are metric spaces, and "N" is the Euclidean norm on ℝ"n", then "M"1 × … × "M""n" is a metric space, where the product metric is defined by "d"(("x"1, …, "x""n"), ("y"1, …, "y""n")) = "N"("d"1("x"1, "y"1), …, "d""n"("x""n", "y""n")),
and the induced topology agrees with the product topology. By the equivalence of norms in finite dimensions, an equivalent metric is obtained if "N" is the taxicab norm, a p-norm, the maximum norm, or any other norm which is non-decreasing as the coordinates of a positive "n"-tuple increase (yielding the triangle inequality).
Similarly, a countable product of metric spaces ("M""i", "d""i") can be obtained using the metric "d"("x", "y") = Σ"i"≥1 2^(−"i") "d""i"("x""i", "y""i") / (1 + "d""i"("x""i", "y""i")).
An uncountable product of metric spaces need not be metrizable. For example, ℝ^ℝ (a product of uncountably many copies of the real line) is not first-countable and thus isn't metrizable.
In the case of a single space ("M", "d"), the distance map "d" : "M" × "M" → ℝ (from the definition) is uniformly continuous with respect to any of the above product metrics on "M" × "M", and in particular is continuous with respect to the product topology of "M" × "M".
If "M" is a metric space with metric "d", and "~" is an equivalence relation on "M", then we can endow the quotient set "M/~" with a pseudometric. Given two equivalence classes ["x"] and ["y"], we define
where the infimum is taken over all finite sequences formula_151 and formula_152 with formula_153, formula_154, formula_155. In general this will only define a pseudometric, i.e. formula_156 does not necessarily imply that formula_157. However, for some equivalence relations (e.g., those given by gluing together polyhedra along faces), "d"' is a metric.
The quotient metric "d" is characterized by the following universal property. If formula_158 is a metric map between metric spaces (that is, formula_159 for all "x", "y") satisfying "f"("x")="f"("y") whenever formula_160 then the induced function formula_161, given by formula_162, is a metric map formula_163
A topological space is sequential if and only if it is a quotient of a metric space.
The ordered set ([0, ∞], ≥) can be seen as a category by requesting exactly one morphism "a" → "b" if "a" ≥ "b" and none otherwise. By using + as the tensor product and 0 as the identity, it becomes a monoidal category "R"*.
Every metric space ("M", "d") can now be viewed as a category "M"* enriched over "R"*: the objects of "M"* are the points of "M", for every pair of points "x", "y" the hom-object is the distance "d"("x", "y"), composition is supplied by the triangle inequality "d"("x", "y") + "d"("y", "z") ≥ "d"("x", "z"), and the identity at "x" by "d"("x", "x") = 0.
See the paper by F. W. Lawvere, "Metric spaces, generalized logic, and closed categories", reprinted (with author commentary) in Reprints in Theory and Applications of Categories, and also (with an author commentary) in "Enriched categories in the logic of geometry and analysis", Repr. Theory Appl. Categ. No. 1 (2002), 1–37.
|
https://en.wikipedia.org/wiki?curid=20018
|
Marine biology
Marine biology is the scientific study of marine life, organisms in the sea. Given that in biology many phyla, families and genera have some species that live in the sea and others that live on land, marine biology classifies species based on the environment rather than on taxonomy.
A large proportion of all life on Earth lives in the ocean. The exact size of this "large proportion" is unknown, since many ocean species are still to be discovered. The ocean is a complex three-dimensional world covering approximately 71% of the Earth's surface. The habitats studied in marine biology include everything from the tiny layers of surface water in which organisms and abiotic items may be trapped in surface tension between the ocean and atmosphere, to the depths of the oceanic trenches, sometimes 10,000 meters or more beneath the surface of the ocean. Specific habitats include coral reefs, kelp forests, seagrass meadows, the surrounds of seamounts and thermal vents, tidepools, muddy, sandy and rocky bottoms, and the open ocean (pelagic) zone, where solid objects are rare and the surface of the water is the only visible boundary. The organisms studied range from microscopic phytoplankton and zooplankton to huge cetaceans (whales) up to about 30 metres (100 feet) in length. Marine ecology is the study of how marine organisms interact with each other and the environment.
Marine life is a vast resource, providing food, medicine, and raw materials, in addition to helping to support recreation and tourism all over the world. At a fundamental level, marine life helps determine the very nature of our planet. Marine organisms contribute significantly to the oxygen cycle, and are involved in the regulation of the Earth's climate. Shorelines are in part shaped and protected by marine life, and some marine organisms even help create new land.
Many species are economically important to humans, including both finfish and shellfish. It is also becoming understood that the well-being of marine organisms and other organisms are linked in fundamental ways. The human body of knowledge regarding the relationship between life in the sea and important cycles is rapidly growing, with new discoveries being made nearly every day. These cycles include those of matter (such as the carbon cycle) and of air (such as Earth's respiration, and movement of energy through ecosystems including the ocean). Large areas beneath the ocean surface still remain effectively unexplored.
The study of marine biology dates back to Aristotle (384–322 BC), who made many observations of life in the sea around Lesbos, laying the foundation for many future discoveries. In 1768, Samuel Gottlieb Gmelin (1744–1774) published the "Historia Fucorum", the first work dedicated to marine algae and the first book on marine biology to use the then new binomial nomenclature of Linnaeus. It included elaborate illustrations of seaweed and marine algae on folded leaves. The British naturalist Edward Forbes (1815–1854) is generally regarded as the founder of the science of marine biology. The pace of oceanographic and marine biology studies quickly accelerated during the course of the 19th century.
The observations made in the first studies of marine biology fueled the age of discovery and exploration that followed. During this time, a vast amount of knowledge was gained about the life that exists in the oceans of the world. Many voyages contributed significantly to this pool of knowledge. Among the most significant were the voyages of HMS "Beagle", during which Charles Darwin came up with his theories of evolution and of the formation of coral reefs. Another important expedition was undertaken by HMS "Challenger", where findings were made of unexpectedly high species diversity among fauna, stimulating much theorizing by population ecologists on how such varieties of life could be maintained in what was thought to be such a hostile environment. This era was important for the history of marine biology, but naturalists were still limited in their studies because they lacked technology that would allow them to adequately examine species that lived in deep parts of the oceans.
The creation of marine laboratories was important because it allowed marine biologists to conduct research and process their specimens from expeditions. The oldest marine laboratory in the world, Station biologique de Roscoff, was established in France in 1872. In the United States, the Scripps Institution of Oceanography dates back to 1903, while the prominent Woods Hole Oceanographic Institute was founded in 1930. The development of technology such as sound navigation ranging, scuba diving gear, submersibles and remotely operated vehicles allowed marine biologists to discover and explore life in deep oceans that was once thought to not exist.
As inhabitants of the largest environment on Earth, microbial marine systems drive changes in every global system. Microbes are responsible for virtually all the photosynthesis that occurs in the ocean, as well as the cycling of carbon, nitrogen, phosphorus and other nutrients and trace elements.
Microscopic life undersea is incredibly diverse and still poorly understood. For example, the role of viruses in marine ecosystems is barely being explored even in the beginning of the 21st century.
The role of phytoplankton is better understood due to their critical position as the most numerous primary producers on Earth. Phytoplankton are categorized into cyanobacteria (also called blue-green algae/bacteria), various types of algae (red, green, brown, and yellow-green), diatoms, dinoflagellates, euglenoids, coccolithophorids, cryptomonads, chrysophytes, chlorophytes, prasinophytes, and silicoflagellates.
Zooplankton tend to be somewhat larger, and not all are microscopic. Many Protozoa are zooplankton, including dinoflagellates, zooflagellates, foraminiferans, and radiolarians. Some of these (such as dinoflagellates) are also phytoplankton; the distinction between plants and animals often breaks down in very small organisms. Other zooplankton include cnidarians, ctenophores, chaetognaths, molluscs, arthropods, urochordates, and annelids such as polychaetes. Many larger animals begin their life as zooplankton before they become large enough to take their familiar forms. Two examples are fish larvae and sea stars (also called starfish).
Microscopic algae and plants provide important habitats for life, sometimes acting as hiding places for larval forms of larger fish and foraging places for invertebrates.
Algal life is widespread and very diverse under the ocean. Microscopic photosynthetic algae contribute a larger proportion of the world's photosynthetic output than all the terrestrial forests combined. Most of the niche occupied by plants on land is actually occupied by macroscopic algae in the ocean, such as "Sargassum" and kelp, which are commonly known as seaweeds and which create kelp forests.
Plants that survive in the sea are often found in shallow waters, such as the seagrasses (examples of which are eelgrass, "Zostera", and turtle grass, "Thalassia"). These plants have adapted to the high salinity of the ocean environment. The intertidal zone is also a good place to find plant life in the sea, where mangroves or cordgrass or beach grass might grow.
As on land, invertebrates (animals without a backbone) make up a huge portion of all life in the sea. Invertebrate sea life includes Cnidaria such as jellyfish and sea anemones; Ctenophora; sea worms including the phyla Platyhelminthes, Nemertea, Annelida, Sipuncula, Echiura, Chaetognatha, and Phoronida; Mollusca including shellfish, squid, and octopus; Arthropoda including Chelicerata and Crustacea; Porifera; Bryozoa; Echinodermata including starfish; and Urochordata including sea squirts or tunicates. There are over a million species.
Over 1500 species of fungi are known from marine environments. These are parasitic on marine algae or animals, or are saprobes on algae, corals, protozoan cysts, sea grasses, wood and other substrata, and can also be found in sea foam. Spores of many species have special appendages which facilitate attachment to the substratum. A very diverse range of unusual secondary metabolites is produced by marine fungi.
A reported 33,400 species of fish, including bony and cartilaginous fish, had been described by 2016, more than all other vertebrates combined. About 60% of fish species live in saltwater.
Reptiles which inhabit or frequent the sea include sea turtles, sea snakes, terrapins, the marine iguana, and the saltwater crocodile. Most extant marine reptiles, except for some sea snakes, are oviparous and need to return to land to lay their eggs. Thus most species, excepting sea turtles, spend most of their lives on or near land rather than in the ocean. Despite their marine adaptations, most sea snakes prefer shallow waters nearby land, around islands, especially waters that are somewhat sheltered, as well as near estuaries. Some extinct marine reptiles, such as ichthyosaurs, evolved to be viviparous and had no requirement to return to land.
Birds adapted to living in the marine environment are often called seabirds. Examples include albatross, penguins, gannets, and auks. Although they spend most of their lives in the ocean, species such as gulls can often be found thousands of miles inland.
There are five main types of marine mammals: cetaceans (toothed whales and baleen whales); sirenians such as manatees; pinnipeds, including seals and the walrus; sea otters; and the polar bear. All are air-breathing, and while some, such as the sperm whale, can dive for prolonged periods, all must return to the surface to breathe.
Marine habitats can be divided into coastal and open ocean habitats. Coastal habitats are found in the area that extends from the shoreline to the edge of the continental shelf. Most marine life is found in coastal habitats, even though the shelf area occupies only seven percent of the total ocean area. Open ocean habitats are found in the deep ocean beyond the edge of the continental shelf. Alternatively, marine habitats can be divided into pelagic and demersal habitats. Pelagic habitats are found near the surface or in the open water column, away from the bottom of the ocean and affected by ocean currents, while demersal habitats are near or on the bottom. Marine habitats can be modified by their inhabitants. Some marine organisms, like corals, kelp and sea grasses, are ecosystem engineers which reshape the marine environment to the point where they create further habitat for other organisms.
Intertidal zones, the areas that are close to the shore, are constantly being exposed and covered by the ocean's tides. A huge array of life can be found within this zone. Shore habitats span from the upper intertidal zones to the area where land vegetation takes prominence. These habitats can be underwater anywhere from daily to very infrequently. Many species here are scavengers, living off sea life that is washed up on the shore. Many land animals also make much use of the shore and intertidal habitats. A subgroup of organisms in this habitat bores and grinds exposed rock through the process of bioerosion.
Estuaries are also near shore and influenced by the tides. An estuary is a partially enclosed coastal body of water with one or more rivers or streams flowing into it and with a free connection to the open sea. Estuaries form a transition zone between freshwater river environments and saltwater maritime environments. They are subject both to marine influences—such as tides, waves, and the influx of saline water—and to riverine influences—such as flows of fresh water and sediment. The shifting flows of both sea water and fresh water provide high levels of nutrients both in the water column and in sediment, making estuaries among the most productive natural habitats in the world.
Reefs comprise some of the densest and most diverse habitats in the world. The best-known types of reefs are tropical coral reefs which exist in most tropical waters; however, reefs can also exist in cold water. Reefs are built up by corals and other calcium-depositing animals, usually on top of a rocky outcrop on the ocean floor. Reefs can also grow on other surfaces, which has made it possible to create artificial reefs. Coral reefs also support a huge community of life, including the corals themselves, their symbiotic zooxanthellae, tropical fish and many other organisms.
Much attention in marine biology is focused on coral reefs and the El Niño weather phenomenon. In 1998, coral reefs experienced the most severe mass bleaching events on record, when vast expanses of reefs across the world died because sea surface temperatures rose well above normal. Some reefs are recovering, but scientists say that between 50% and 70% of the world's coral reefs are now endangered and predict that global warming could exacerbate this trend.
The open ocean is relatively unproductive because of a lack of nutrients, yet because it is so vast, in total it produces the most primary productivity. The open ocean is separated into different zones, and the different zones each have different ecologies. Zones which vary according to their depth include the epipelagic, mesopelagic, bathypelagic, abyssopelagic, and hadopelagic zones. Zones which vary by the amount of light they receive include the photic and aphotic zones. Much of the aphotic zone's energy is supplied from the open ocean above in the form of sinking detritus.
The deepest recorded oceanic trench measured to date is the Mariana Trench, near the Philippines, in the Pacific Ocean, at a depth of about 11,000 metres (36,000 ft). At such depths, water pressure is extreme and there is no sunlight, but some life still exists. A white flatfish, a shrimp and a jellyfish were seen by the American crew of the bathyscaphe "Trieste" when it dove to the bottom in 1960. In general, the deep sea is considered to start at the aphotic zone, the point beyond which sunlight can no longer penetrate the water. Many life forms that live at these depths have the ability to create their own light, known as bioluminescence. Marine life also flourishes around seamounts that rise from the depths, where fish and other sea life congregate to spawn and feed. Hydrothermal vents along the mid-ocean ridge spreading centers act as oases, as do their opposites, cold seeps. Such places support unique biomes, and many new microbes and other lifeforms have been discovered at these locations.
The marine ecosystem is large, and thus there are many sub-fields of marine biology. Most involve studying specializations of particular groups of organisms, such as phycology (the study of algae), invertebrate zoology and ichthyology (the study of fish). Other subfields study the physical effects of continual immersion in sea water and the ocean in general, adaptation to a salty environment, and the effects of changing various oceanic properties on marine life. A subfield of marine biology studies the relationships between oceans and ocean life, and global warming and environmental issues (such as carbon dioxide displacement). Recent marine biotechnology has focused largely on marine biomolecules, especially proteins, that may have uses in medicine or engineering. Marine environments are the home to many exotic biological materials that may inspire biomimetic materials.
Marine biology is a branch of biology. It is closely linked to oceanography and may be regarded as a sub-field of marine science. It also encompasses many ideas from ecology. Fisheries science and marine conservation can be considered partial offshoots of marine biology (as well as environmental studies). Marine chemistry, physical oceanography and atmospheric sciences are closely related to this field.
An active research topic in marine biology is to discover and map the life cycles of various species and where they spend their time. Technologies that aid in this discovery include pop-up satellite archival tags, acoustic tags, and a variety of other data loggers. Marine biologists study how the ocean currents, tides and many other oceanic factors affect ocean life forms, including their growth, distribution and well-being. This has only recently become technically feasible with advances in GPS and newer underwater visual devices.
Most ocean life breeds in specific places, nests (or not) in others, spends time as juveniles in still others, and matures in yet others. Scientists know little about where many species spend different parts of their life cycles, especially in the infant and juvenile years. For example, it is still largely unknown where juvenile sea turtles and some year-1 sharks travel. Recent advances in underwater tracking devices are illuminating what we know about marine organisms that live at great ocean depths. The information from pop-up satellite archival tags helps set seasonal fishing closures and guide the development of marine protected areas. This data is important to both scientists and fishermen because they are discovering that by restricting commercial fishing in one small area they can have a large impact in maintaining a healthy fish population in a much larger area.
|
https://en.wikipedia.org/wiki?curid=20021
|
Microkernel
In computer science, a microkernel (often abbreviated as μ-kernel) is the near-minimum amount of software that can provide the mechanisms needed to implement an operating system (OS). These mechanisms include low-level address space management, thread management, and inter-process communication (IPC).
If the hardware provides multiple rings or CPU modes, the microkernel may be the only software executing at the most privileged level, which is generally referred to as supervisor or kernel mode. Traditional operating system functions, such as device drivers, protocol stacks and file systems, are typically removed from the microkernel itself and are instead run in user space.
In terms of the source code size, microkernels are often smaller than monolithic kernels. The MINIX 3 microkernel, for example, has only approximately 12,000 lines of code.
Microkernels trace their roots back to the Danish computer pioneer Per Brinch Hansen and his tenure at the Danish computer company Regnecentralen, where he led software development efforts for the RC 4000 computer.
In 1967, Regnecentralen was installing an RC 4000 prototype in a Polish fertilizer plant in Puławy. The computer used a small real-time operating system tailored for the needs of the plant. Brinch Hansen and his team became concerned with the lack of generality and reusability of the RC 4000 system. They feared that each installation would require a different operating system, so they started to investigate novel and more general ways of creating software for the RC 4000.
In 1969, their effort resulted in the completion of the RC 4000 Multiprogramming System. Its nucleus provided inter-process communication based on message-passing for up to 23 unprivileged processes, out of which 8 at a time were protected from one another. It further implemented scheduling of time slices of programs executed in parallel, initiation and control of program execution at the request of other running programs, and initiation of data transfers to or from peripherals. Besides these elementary mechanisms, it had no built-in strategy for program execution and resource allocation. This strategy was to be implemented by a hierarchy of running programs in which parent processes had complete control over child processes and acted as their operating systems.
Following Brinch Hansen's work, microkernels have been developed since the 1970s. The term microkernel itself first appeared no later than 1981. Microkernels were meant as a response to changes in the computer world, and to several challenges adapting existing "mono-kernels" to these new systems. New device drivers, protocol stacks, file systems and other low-level systems were being developed all the time. This code was normally located in the monolithic kernel, and thus required considerable work and careful code management to work on. Microkernels were developed with the idea that all of these services would be implemented as user-space programs, like any other, allowing them to be worked on monolithically and started and stopped like any other program. This would not only allow these services to be more easily worked on, but also separated the kernel code to allow it to be finely tuned without worrying about unintended side effects. Moreover, it would allow entirely new operating systems to be "built up" on a common core, aiding OS research.
Microkernels were a very hot topic in the 1980s when the first usable local area networks were being introduced. The same mechanisms that allowed the kernel to be distributed into user space also allowed the system to be distributed across network links. The first microkernels, notably Mach, proved to have disappointing performance, but the inherent advantages appeared so great that it was a major line of research into the late 1990s. However, during this time the speed of computers grew greatly in relation to networking systems, and the disadvantages in performance came to overwhelm the advantages in development terms. Many attempts were made to adapt the existing systems to have better performance, but the overhead was always considerable and most of these efforts required the user-space programs to be moved back into the kernel. By 2000, most large-scale (Mach-like) efforts had ended, although Apple's macOS, released in 2001, uses a hybrid kernel called XNU, which combines a heavily modified (hybrid) OSFMK 7.3 kernel with code from BSD UNIX, and this kernel is also used in iOS, tvOS, and watchOS. The Mach-based GNU Hurd is also functional and included in testing versions of Arch Linux and Debian.
Although major work on microkernels had largely ended, experimenters continued development. It has since been shown that many of the performance problems of earlier designs were not a fundamental limitation of the concept, but instead due to the designer's desire to use single-purpose systems to implement as many of these services as possible. Using a more pragmatic approach to the problem, including assembly code and relying on the processor to enforce concepts normally supported in software led to a new series of microkernels with dramatically improved performance.
Microkernels are closely related to exokernels.
They also have much in common with hypervisors,
but the latter make no claim to minimality and are specialized to supporting virtual machines; indeed, the L4 microkernel frequently finds use in a hypervisor capacity.
Early operating system kernels were rather small, partly because computer memory was limited. As the capability of computers grew, the number of devices the kernel had to control also grew. Throughout the early history of Unix, kernels were generally small, even though they contained various device drivers and file system implementations. When address spaces increased from 16 to 32 bits, kernel design was no longer constrained by the hardware architecture, and kernels began to grow larger.
The Berkeley Software Distribution (BSD) of Unix began the era of larger kernels. In addition to operating a basic system consisting of the CPU, disks and printers, BSD added a complete TCP/IP networking system and a number of "virtual" devices that allowed the existing programs to work 'invisibly' over the network. This growth continued for many years, resulting in kernels with millions of lines of source code. As a result of this growth, kernels were prone to bugs and became increasingly difficult to maintain.
The microkernel was intended to address this growth of kernels and the difficulties that resulted. In theory, the microkernel design allows for easier management of code due to its division into user space services. This also allows for increased security and stability resulting from the reduced amount of code running in kernel mode. For example, if a networking service crashed due to a buffer overflow, only the networking service's memory would be corrupted, leaving the rest of the system still functional.
Inter-process communication (IPC) is any mechanism which allows separate processes to communicate with each other, usually by sending messages. Shared memory is strictly speaking also an inter-process communication mechanism, but the abbreviation IPC usually only refers to message passing, and it is the latter that is particularly relevant to microkernels. IPC allows the operating system to be built from a number of small programs called servers, which are used by other programs on the system, invoked via IPC. Most or all support for peripheral hardware is handled in this fashion, with servers for device drivers, network protocol stacks, file systems, graphics, etc.
IPC can be synchronous or asynchronous. Asynchronous IPC is analogous to network communication: the sender dispatches a message and continues executing. The receiver checks (polls) for the availability of the message, or is alerted to it via some notification mechanism. Asynchronous IPC requires that the kernel maintains buffers and queues for messages, and deals with buffer overflows; it also requires double copying of messages (sender to kernel and kernel to receiver). In synchronous IPC, the first party (sender or receiver) blocks until the other party is ready to perform the IPC. It does not require buffering or multiple copies, but the implicit rendezvous can make programming tricky. Most programmers prefer asynchronous send and synchronous receive.
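The contrast can be sketched in code. The API below is hypothetical, invented purely for illustration (the names ipc_send, ipc_recv, ipc_send_async, ipc_poll, msg_t and endpoint_t belong to no real kernel):

#include <stdbool.h>

typedef struct { unsigned long words[8]; } msg_t;  /* small fixed-size message */
typedef int endpoint_t;                            /* kernel-managed endpoint */

/* Synchronous IPC: each side blocks until the other is ready, so the kernel
   can hand the message over at the rendezvous (a single copy, or none if it
   fits in registers) and needs no buffers or queues. */
void ipc_send(endpoint_t ep, const msg_t *m);   /* blocks until a receiver waits */
void ipc_recv(endpoint_t ep, msg_t *m);         /* blocks until a sender arrives */

/* Asynchronous IPC: the sender continues immediately, so the kernel must
   buffer the message (sender-to-kernel and kernel-to-receiver copies) and
   the receiver polls for it or is notified. */
void ipc_send_async(endpoint_t ep, const msg_t *m);  /* enqueue and continue */
bool ipc_poll(endpoint_t ep, msg_t *m);              /* true if a message arrived */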
First-generation microkernels typically supported synchronous as well as asynchronous IPC, and suffered from poor IPC performance. Jochen Liedtke assumed the design and implementation of the IPC mechanisms to be the underlying reason for this poor performance. In his L4 microkernel he pioneered methods that lowered IPC costs by an order of magnitude. These include an IPC system call that supports a send as well as a receive operation, making all IPC synchronous, and passing as much data as possible in registers. Furthermore, Liedtke introduced the concept of the "direct process switch", where during an IPC execution an (incomplete) context switch is performed from the sender directly to the receiver. If, as in L4, part or all of the message is passed in registers, this transfers the in-register part of the message without any copying at all. Furthermore, the overhead of invoking the scheduler is avoided; this is especially beneficial in the common case where IPC is used in an RPC-type fashion by a client invoking a server. Another optimization, called "lazy scheduling", avoids traversing scheduling queues during IPC by leaving threads that block during IPC in the ready queue. Once the scheduler is invoked, it moves such threads to the appropriate waiting queue. As in many cases a thread gets unblocked before the next scheduler invocation, this approach saves significant work. Similar approaches have since been adopted by QNX and MINIX 3.
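Concretely, in seL4 (a current member of the L4 family) an RPC-style request is a single system call combining the send with the blocking receive of the reply, and short messages travel entirely in message registers. The fragment below is only a sketch: the ECHO_REQ label is invented, and the endpoint capability service_ep is assumed to have been handed to the client beforehand.

#include <sel4/sel4.h>

#define ECHO_REQ 1   /* hypothetical protocol label */

seL4_Word echo(seL4_CPtr service_ep, seL4_Word value)
{
    /* One-word message: label ECHO_REQ, no capabilities, length 1. */
    seL4_MessageInfo_t info = seL4_MessageInfo_new(ECHO_REQ, 0, 0, 1);
    seL4_SetMR(0, value);            /* payload goes in message register 0 */

    /* seL4_Call = send plus receive of the reply in one kernel entry; on
       the fast path the kernel switches directly to the server without
       invoking the scheduler, and register contents are not copied. */
    info = seL4_Call(service_ep, info);

    return seL4_GetMR(0);            /* the server's reply, also in a register */
}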
In a series of experiments, Chen and Bershad compared memory cycles per instruction (MCPI) of monolithic Ultrix with those of microkernel Mach combined with a 4.3BSD Unix server running in user space. Their results explained Mach's poorer performance by higher MCPI and demonstrated that IPC alone is not responsible for much of the system overhead, suggesting that optimizations focused exclusively on IPC will have limited impact. Liedtke later refined Chen and Bershad's results by making an observation that the bulk of the difference between Ultrix and Mach MCPI was caused by capacity cache-misses and concluding that drastically reducing the cache working set of a microkernel will solve the problem.
In a client-server system, most communication is essentially synchronous, even if using asynchronous primitives, as the typical operation is a client invoking a server and then waiting for a reply. As it also lends itself to more efficient implementation, most microkernels generally followed L4's lead and only provided a synchronous IPC primitive. Asynchronous IPC could be implemented on top by using helper threads. However, experience has shown that the utility of synchronous IPC is dubious: synchronous IPC forces a multi-threaded design onto otherwise simple systems, with the resulting synchronization complexities. Moreover, an RPC-like server invocation sequentializes client and server, which should be avoided if they are running on separate cores. Versions of L4 deployed in commercial products have therefore found it necessary to add an asynchronous notification mechanism to better support asynchronous communication. This signal-like mechanism does not carry data and therefore does not require buffering by the kernel. By having two forms of IPC, they have nonetheless violated the principle of minimality. Other versions of L4 have switched to asynchronous IPC completely.
As synchronous IPC blocks the first party until the other is ready, unrestricted use could easily lead to deadlocks. Furthermore, a client could easily mount a denial-of-service attack on a server by sending a request and never attempting to receive the reply. Therefore, synchronous IPC must provide a means to prevent indefinite blocking. Many microkernels provide timeouts on IPC calls, which limit the blocking time. In practice, choosing sensible timeout values is difficult, and systems almost inevitably use infinite timeouts for clients and zero timeouts for servers. As a consequence, the trend is towards not providing arbitrary timeouts, but only a flag which indicates that the IPC should fail immediately if the partner is not ready. This approach effectively provides a choice of the two timeout values of zero and infinity. Recent versions of L4 and MINIX have gone down this path (older versions of L4 used timeouts). QNX avoids the problem by requiring the client to specify the reply buffer as part of the message send call. When the server replies the kernel copies the data to the client's buffer, without having to wait for the client to receive the response explicitly.
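In code, the surviving choice amounts to a single flag rather than an arbitrary timeout. Extending the hypothetical API sketched earlier (these flag names are likewise invented):

#define IPC_BLOCK    0   /* infinite timeout: wait until the partner is ready */
#define IPC_NONBLOCK 1   /* zero timeout: fail immediately if it is not */

int ipc_send_flags(endpoint_t ep, const msg_t *m, int flags);

/* Typical usage: a client blocks on its request, while a server replies
   non-blocking so that an absent client cannot hold it hostage:
       ipc_send_flags(ep, &request, IPC_BLOCK);
       ipc_send_flags(ep, &reply,   IPC_NONBLOCK);   */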
Microkernel servers are essentially daemon programs like any others, except that the kernel grants some of them privileges to interact with parts of physical memory that are otherwise off limits to most programs. This allows some servers, particularly device drivers, to interact directly with hardware.
A basic set of servers for a general-purpose microkernel includes file system servers, device driver servers, networking servers, display servers, and user interface device servers. This set of servers (drawn from QNX) provides roughly the set of services offered by a Unix monolithic kernel. The necessary servers are started at system startup and provide services, such as file, network, and device access, to ordinary application programs. With such servers running in the environment of a user application, server development is similar to ordinary application development, rather than the build-and-boot process needed for kernel development.
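Schematically, such a server is an ordinary user program built around a receive-dispatch-reply loop. The skeleton below reuses the hypothetical IPC API sketched earlier; the opcodes and the do_read/do_close helpers are invented stand-ins for the server's internal logic:

enum { FS_READ, FS_WRITE, FS_CLOSE };  /* hypothetical request opcodes */

unsigned long do_read(unsigned long fd, unsigned long nbytes);  /* server-internal */
unsigned long do_close(unsigned long fd);                       /* server-internal */

void file_server_main(endpoint_t ep)
{
    msg_t req, rsp;
    for (;;) {
        ipc_recv(ep, &req);           /* block until a client sends a request */
        switch (req.words[0]) {       /* word 0 carries the opcode */
        case FS_READ:
            rsp.words[0] = do_read(req.words[1], req.words[2]);  /* fd, nbytes */
            break;
        case FS_CLOSE:
            rsp.words[0] = do_close(req.words[1]);               /* fd */
            break;
        default:
            rsp.words[0] = (unsigned long)-1;  /* unknown request */
            break;
        }
        ipc_send(ep, &rsp);           /* reply; control returns to the client */
    }
}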
Additionally, many "crashes" can be corrected by simply stopping and restarting the server. However, part of the system state is lost with the failing server, hence this approach requires applications to cope with failure. A good example is a server responsible for TCP/IP connections: If this server is restarted, applications will experience a "lost" connection, a normal occurrence in a networked system. For other services, failure is less expected and may require changes to application code. For QNX, restart capability is offered as the QNX High Availability Toolkit.
Device drivers frequently perform direct memory access (DMA), and therefore can write to arbitrary locations of physical memory, including various kernel data structures. Such drivers must therefore be trusted. It is a common misconception that this means that they must be part of the kernel. In fact, a driver is not inherently more or less trustworthy by being part of the kernel.
While running a device driver in user space does not necessarily reduce the damage a misbehaving driver can cause, in practice it is beneficial for system stability in the presence of buggy (rather than malicious) drivers: memory-access violations by the driver code itself (as opposed to the device) may still be caught by the memory-management hardware. Furthermore, many devices are not DMA-capable, so their drivers can be made untrusted by running them in user space. Recently, an increasing number of computers feature IOMMUs, many of which can be used to restrict a device's access to physical memory. This also allows user-mode drivers to become untrusted.
User-mode drivers actually predate microkernels. The Michigan Terminal System (MTS), in 1967, supported user-space drivers (including its file system support), making it the first operating system to be designed with that capability.
Historically, drivers were less of a problem, as devices were few and trusted anyway, so having them in the kernel simplified the design and avoided potential performance problems. This led to the traditional driver-in-the-kernel style of Unix, Linux, and Windows NT.
With the proliferation of various kinds of peripherals, the amount of driver code escalated and in modern operating systems dominates the kernel in code size.
As a microkernel must allow building arbitrary operating system services on top, it must provide some core functionality. At a minimum, this includes some mechanisms for managing address spaces (required for memory protection), an execution abstraction for allocating the CPU (typically threads), and inter-process communication (required to invoke the servers that run in their own address spaces).
This minimal design was pioneered by Brinch Hansen's Nucleus and the hypervisor of IBM's VM. It has since been formalised in Liedtke's "minimality principle":
A concept is tolerated inside the microkernel only if moving it outside the kernel, i.e., permitting competing implementations, would prevent the implementation of the system's required functionality.
Everything else can be done in a usermode program, although device drivers implemented as user programs may on some processor architectures require special privileges to access I/O hardware.
Related to the minimality principle, and equally important for microkernel design, is the separation of mechanism and policy; it is what enables the construction of arbitrary systems on top of a minimal kernel. Any policy built into the kernel cannot be overwritten at user level and therefore limits the generality of the microkernel.
Policy implemented in user-level servers can be changed by replacing the servers (or letting the application choose between competing servers offering similar services).
For efficiency, most microkernels contain schedulers and manage timers, in violation of the minimality principle and the principle of policy-mechanism separation.
Start up (booting) of a microkernel-based system requires device drivers, which are not part of the kernel. Typically this means that they are packaged with the kernel in the boot image, and the kernel supports a bootstrap protocol that defines how the drivers are located and started; this is the traditional bootstrap procedure of L4 microkernels. Some microkernels simplify this by placing some key drivers inside the kernel (in violation of the minimality principle), LynxOS and the original Minix are examples. Some even include a file system in the kernel to simplify booting. A microkernel-based system may boot via multiboot compatible boot loader. Such systems usually load statically-linked servers to make an initial bootstrap or mount an OS image to continue bootstrapping.
A key component of a microkernel is a good IPC system and virtual-memory-manager design that allows implementing page-fault handling and swapping in usermode servers in a safe way. Since all services are performed by usermode programs, efficient means of communication between programs are essential, far more so than in monolithic kernels. The design of the IPC system makes or breaks a microkernel. To be effective, the IPC system must not only have low overhead, but also interact well with CPU scheduling.
On most mainstream processors, obtaining a service is inherently more expensive in a microkernel-based system than a monolithic system. In the monolithic system, the service is obtained by a single system call, which requires two "mode switches" (changes of the processor's ring or CPU mode). In the microkernel-based system, the service is obtained by sending an IPC message to a server, and obtaining the result in another IPC message from the server. This requires a context switch if the drivers are implemented as processes, or a function call if they are implemented as procedures. In addition, passing actual data to the server and back may incur extra copying overhead, while in a monolithic system the kernel can directly access the data in the client's buffers.
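A schematic tally for a single read() request makes the difference concrete (the counts below are illustrative only; real costs depend on the architecture and on optimizations such as register-based IPC or shared buffers):

Monolithic kernel:  app -> kernel -> app
                    2 mode switches, 0 context switches,
                    kernel reads directly into the app's buffer

Microkernel:        client -> kernel -> FS server -> kernel -> client
                    4 mode switches, 2 context switches (client/server),
                    plus message copies unless registers or shared memory are used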
Performance is therefore a potential issue in microkernel systems. Indeed, the experience of first-generation microkernels such as Mach and ChorusOS showed that systems based on them performed very poorly. However, Jochen Liedtke showed that Mach's performance problems were the result of poor design and implementation, specifically Mach's excessive cache footprint.
Liedtke demonstrated with his own L4 microkernel that through careful design and implementation, and especially by following the minimality principle, IPC costs could be reduced by more than an order of magnitude compared to Mach. L4's IPC performance is still unbeaten across a range of architectures.
While these results demonstrate that the poor performance of systems based on first-generation microkernels is not representative for second-generation kernels such as L4, this constitutes no proof that microkernel-based systems can be built with good performance. It has been shown that a monolithic Linux server ported to L4 exhibits only a few percent overhead over native Linux.
However, such a single-server system exhibits few, if any, of the advantages microkernels are supposed to provide by structuring operating system functionality into separate servers.
A number of commercial multi-server systems exist, in particular the real-time systems QNX and Integrity. No comprehensive comparison of performance relative to monolithic systems has been published for those multiserver systems. Furthermore, performance does not seem to be the overriding concern for those commercial systems, which instead emphasize reliably quick interrupt handling response times (QNX) and simplicity for the sake of robustness. An attempt to build a high-performance multiserver operating system was the IBM Sawmill Linux project.
However, this project was never completed.
It has been shown in the meantime that user-level device drivers can come close to the performance of in-kernel drivers even for such high-throughput, high-interrupt devices as Gigabit Ethernet. This seems to imply that high-performance multi-server systems are possible.
The security benefits of microkernels have been frequently discussed. In the context of security the minimality principle of microkernels is, some have argued, a direct consequence of the principle of least privilege, according to which all code should have only the privileges needed to provide required functionality. Minimality requires that a system's trusted computing base (TCB) should be kept minimal. As the kernel (the code that executes in the privileged mode of the hardware) has unvetted access to any data and can thus violate its integrity or confidentiality, the kernel is always part of the TCB. Minimizing it is natural in a security-driven design.
Consequently, microkernel designs have been used for systems designed for high-security applications, including KeyKOS, EROS and military systems. In fact, Common Criteria (CC) at the highest assurance level (Evaluation Assurance Level (EAL) 7) has an explicit requirement that the target of evaluation be "simple", an acknowledgment of the practical impossibility of establishing true trustworthiness for a complex system. Unfortunately, again, the term "simple" is misleading and ill-defined; at least the Department of Defense Trusted Computer System Evaluation Criteria introduced somewhat more precise verbiage at its B3/A1 classes.
More recent work on microkernels has been focusing on formal specifications of the kernel API, and formal proofs of the API's security properties and implementation correctness. The first example of this is a mathematical proof of the confinement mechanisms in EROS, based on a simplified model of the EROS API. More recently (in 2007) a comprehensive set of machine-checked proofs was performed of the properties of the protection model of seL4, a version of L4.
This has led to what is referred to as "third-generation microkernels", characterised by a security-oriented API with resource access controlled by capabilities, virtualization as a first-class concern, novel approaches to kernel resource management, and a design goal of suitability for formal analysis, besides the usual goal of high performance. Examples are Coyotos, seL4, Nova, Redox and Fiasco.OC.
In the case of seL4, complete formal verification of the implementation has been achieved, i.e. a mathematical proof that the kernel's implementation is consistent with its formal specification. This provides a guarantee that the properties proved about the API actually hold for the real kernel, a degree of assurance which goes beyond even CC EAL7. It was followed by proofs of security-enforcement properties of the API, and a proof demonstrating that the executable binary code is a correct translation of the C implementation, taking the compiler out of the TCB. Taken together, these proofs establish an end-to-end proof of security properties of the kernel.
The term "nanokernel" or "picokernel" historically referred to:
There is also at least one case where the term nanokernel is used to refer not to a small kernel, but one that supports a nanosecond clock resolution.
|
https://en.wikipedia.org/wiki?curid=20023
|
Multihull
A multihull is a ship or boat with more than one hull, whereas a vessel with a single hull is a monohull.
Multihull ships can be classified by the number of hulls, by their arrangement and by their shapes and sizes.
The first multihull vessels were Austronesian canoes. The builders hollowed out logs to make canoes and stabilized them by attaching outriggers to prevent them from capsizing. This led in due course to the proa, catamaran, and trimaran.
In Polynesian terminology the catamaran is a pair of "Vaka" held together by "Aka", whereas the trimaran is a central "Vaka", with "Ama" on each side, attached by "Aka". Catamarans and trimarans share the same terminology.
Modern pioneers of multihull design include James Wharram (UK), Derek Kelsall (UK), Tom Lack (UK), Lock Crowther (Aust), Hedly Nicol (Aust), Malcolm Tennant (NZ), Jim Brown (USA), Arthur Piver (USA), Chris White (US), Ian Farrier (NZ), LOMOcean (NZ), and Dick Newick (USA).
The vast majority of multihull sailboats are catamarans. Trimarans are less common, and proas are virtually unknown outside the South Pacific.
An outrigger canoe is a canoe with a slender outrigger ("ama") attached by two or more struts ("akas"). This craft will normally be propelled by paddles. If the craft has a sail, it is known as a proa. While canoes and proas both derive stability from the outrigger, the proa has the greater need of the outrigger to counter the heeling effect of the sail. The difficulty with the proa is that the outrigger must be on the lee side to be effective, which means that a change of tack will need the sail to be rearranged.
A catamaran is a vessel with twin hulls. Commercial catamarans began in 17th century England. Separate attempts at steam-powered catamarans were carried out by the middle of the 20th century. However, success required better materials and more developed hydrodynamic technologies. During the second half of the 20th century catamaran designs flourished. Nowadays, catamarans are used as racing, sailing, tourist and fishing boats. Cruising catamarans are becoming important in the holiday charter market. Some 70% of fast passenger RoRo ferries are catamarans.
The hulls of a catamaran are typically connected by a bridgedeck, although some simpler cruising catamarans simply have a trampoline stretched between the crossbeams (or "akas"). Small beachable catamarans, such as the Hobie Cat, also have only a trampoline between the hulls.
Catamarans have no ballast and their stability is derived from the width between the hulls. The distance between hulls is called the "transverse clearance", and the greater this distance, the more stable the catamaran will be.
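To a first approximation (a standard naval-architecture simplification offered here for illustration, ignoring crew weight, buoyancy shifts and dynamic effects), the maximum righting moment is reached as the windward hull just lifts clear of the water, at which point the boat's whole weight acts at half the hull-centreline spacing:

% W = vessel weight (displacement), b = transverse distance between hull centrelines
RM_{\max} \approx W \cdot \frac{b}{2}

Doubling the hull spacing thus roughly doubles the heeling force the boat can resist, which is why beam, rather than ballast, sets a catamaran's sail-carrying power.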
A catamaran's hulls are slim, although they may flare above the waterline to give reserve buoyancy. Catamarans are prone to "slamming", an unpleasant phenomenon where the waves slam against the underside of the bridge deck. The distance between the design waterplane and the bottom of the bridgedeck is called the "vertical clearance"; the greater this distance, the less slamming will be encountered. Although a large vertical clearance normally increases a catamaran's seaworthiness, the designer must take care not to raise the overall centre of gravity (CoG) too much.
A trimaran is a vessel with three hulls. Unlike a catamaran where the hulls are mirror-images of each other, a trimaran is rather like a monohull with two slim outriggers. A trimaran has less accommodation space than a catamaran, but may be capable of even faster speeds.
The trimaran has the widest range of interactions of wave systems generated by hulls at speed. The interactions can be favorable or unfavorable, depending on relative hull arrangement and speed. No full-scale trimarans of this type have been built; model test results and corresponding simulations provide estimates of the power requirements of full-scale ships. The calculations show possible advantages in a defined band of relative speeds.
A new type of super-fast vessel, the "wave-piercing" trimaran (WPT), is partially supported by aerodynamic lift (unloading up to 25% of its displacement) and can achieve roughly twice the speed for comparable relative power.
Some trimaran configurations use the outlying hulls to enhance stability and allow for shallow draft, examples include the experimental ship RV Triton and the "Independence" class of littoral combat ships (US).
Some multihulls with four (quadrimaran) or five (pentamaran) hulls have been proposed; few have been built. A Swiss entrepreneur is attempting to raise €25 million to build a sail-driven quadrimaran that would use solar power to scoop plastic from the ocean; the project is scheduled for launch in 2020. A French manufacturer, Tera-4, produces motor quadrimarans which use aerodynamic lift between the four hulls to promote planing and reduce power consumption.
Design concepts for vessels with two pairs of outriggers have been referred to as pentamarans. The design concept comprises a narrow, long hull that cuts through waves. The outriggers then provide the stability that such a narrow hull needs. While the aft sponsons act as trimaran sponsons do, the front sponsons do not normally touch the water; only if the ship rolls to one side do they provide added buoyancy to correct the roll. BMT Group, a shipbuilding and engineering company in the UK, has proposed a fast cargo ship and a yacht using this kind of hull.
Multihull designs may have hull beams that are slimmer at the water surface ("waterplane") than underwater. This arrangement allows good wave-piercing, while keeping a buoyant hydrodynamic hull beneath the waterplane. In a catamaran configuration this is called a small waterplane area twin hull, or SWATH. While SWATHs are stable in rough seas, they have the drawbacks, compared with other catamarans, of having a deeper draft, being more sensitive to loading, and requiring more power because of their higher underwater surface areas. Triple-hull configurations of small waterplane area craft had been studied, but not built, as of 2008.
Multihulls differ significantly from monohulls in a number of ways:
Multihulls have a broader variety of hull and payload geometries. Compared to a monohull, they have a relatively large beam, deck area (upper and inner), above-water capacity, shallower draft (allowing operation in shallower water) but a limited payload.
Early Austronesians discovered that round logs tied together into a raft were more stable than a single log. Hollowing out the logs further increased buoyancy and payload. Separating two logs by a pair of cross-members (or "akas") further increased stability. Spanning the intervening distance with a platform provides space for accommodation.
Compared to monohulls, multihulls are much less prone to heeling (tilt); a sailing catamaran will rarely heel more than 5° whereas a monohull will frequently heel to 45°. This is particularly noticeable when running before the wind; a monohull will roll incessantly, while a catamaran will remain upright. A catamaran's stable motion reduces seasickness and tiredness of the crew, making it safer and more suitable for family cruising. The stability also allows more efficient solar energy collection and radar operation. However, shorter multihulls may be more prone towards an uncomfortable motion called "hobby horsing", especially when lightly loaded.
Being heavier (because of its ballast), a monohull's momentum will temporarily maintain progress if the wind drops, while a (lighter) multihull has less momentum and may be prone to going "in irons" when going about; multihulls need to keep the jib "aback" to complete the turn. However, multihull skippers will frequently choose to "gybe" instead, as gybing is much less of an event in a multihull than in a monohull. Multihulls "ghost" well under sail as they respond readily in light airs.
From the earliest times, monohulls (whether or not fitted with sails) were stabilized by carrying ballast (such as rocks) in the bilges; and all modern monohull yachts and ships still rely on ballast for stability. Naval architects arrange the vessel's centre of gravity to be well below the hull's metacentre. The low centre of gravity acts as a counterweight as the craft heels around its centre of buoyancy; that is, as a monohull heels, its ballast operates to restore it to its upright position.
By contrast, a multihull's stability is derived from its width, and modern multihulls are much wider than earlier designs, with the beam sometimes more than half the LOA. Should the weather hull lift from the water, the weight of the vessel will seek to restore the multihull to its normal position. However, unlike a monohull, a multihull has a "point of no return" which will lead to the vessel becoming inverted. A catamaran dinghy may be righted by its crew, but a large cruising cat will normally remain inverted unless it can be craned upright.
A catamaran that is being pressed too hard may "pitchpole" (capsize end over end), a disastrous event whereby the lee (downwind) bow digs into the water and trips, followed by the stern rising and the entire vessel somersaulting.
Having no ballast, multihulls that become holed or inverted have a high rate of survivability; water-tight bulkheads should prevent sinking if the hulls fail. Catamarans may have increased reliability because most have an engine in each hull. Whereas capsized monohulls typically right themselves, capsized multihulls remain inverted. Large multihulls may have escape hatches in the hulls or bridgedeck.
Having a low displacement, and long, narrow hulls, a multihull typically produces very small bow waves and wakes, a consequence of a favorable Froude number. Vessels with beamy hulls (typically monohulls) normally create a large bow wave and wake. Such a vessel is limited by its "hull speed", being unable to "climb over" its bow wave unless it changes from displacement mode to planing mode. Vessels with slim hulls (typically multihulls) will normally create no appreciable bow wave to limit their progress.
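The standard relations behind this behaviour (well established, though simplified here) are the Froude number, which scales speed against waterline length, and the conventional "hull speed", the speed at which the bow wave's wavelength equals the waterline length, corresponding to a Froude number of roughly 0.4:

% v = speed, g = gravitational acceleration, L = waterline length
Fr = \frac{v}{\sqrt{g L}}
% conventional hull speed, with v_{hull} in knots and L_{WL} in feet
v_{hull} \approx 1.34 \sqrt{L_{WL}}

A slim multihull generates so small a bow wave that it can pass this nominal limit without planing.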
In 1978, 101 years after catamarans like "Amaryllis" were banned from yacht racing, they returned to the sport. This started with the victory of the trimaran "Olympus Photo", skippered by Mike Birch, in the first Route du Rhum. Thereafter, no open ocean race was won by a monohull. Winning times have dropped by 70% since 1978: Olympus Photo's time of 23 days 6 h 58' 35" had fallen to Gitana 11's 7 d 17 h 19' 6" by 2006. Around 2016 the first large wind-driven foil-borne racing catamarans were built. These cats rise onto foils and T-foiled rudders only at higher speeds.
The increasing popularity of catamarans since the 1960s is down to the added space, speed, shallow draft, and lack of heeling underway. The stability of a multihull makes sailing much less tiring for the crew, and is particularly suitable for families. Having no need for ballast for stability, multihulls are much lighter than monohull sailboats; but a multihull's fine hull sections mean that one must take care not to overload the vessel. Powered catamarans are increasingly used for racing, cruising and as workboats and fishing boats. Speed, the stable working platform, safety, and added space are the prime advantages for power cats.
"The weight of a multihull, of this length, is probably not much more than half the weight of a monohull of the same length and it can be sailed with less crew effort."
Racing catamarans and trimarans are popular in France, New Zealand and Australia. Cruising cats are commonest in the Caribbean and Mediterranean (where they form the bulk of the charter business) and Australia. Multihulls are less common in the US, perhaps because their increased beam requires wider docks and slips. Smaller multihulls may be collapsible and trailerable, and thus suitable for daysailers and racers. Until the 1960s most multihull sailboats (except for beach cats) were built either by their owners or by boat builders; since then companies have been selling mass-produced boats, of which there are more than 150 models.
Small sailing catamarans are also called beach catamarans. The Malibu Outrigger (1950) was one of the first beach-launched multihull sailboats. The most recognised racing classes are the Hobie Cat 14, Formula 18 cats, A-cats, the current Olympic Nacra 17, the former Olympic multihull Tornado and New Zealand's Weta trimaran.
Mega or super catamarans are those over 60 feet in length. These often receive substantial customisation following the request of the owner. Builders include Corsair Marine (mid-sized trimarans) and HanseYachts' "Privilège" brand (large catamarans). The largest manufacturer of large multihulls is Fountaine-Pajot in France.
Powerboats range from small single-pilot Formula 1 craft to large multi-engined or gas-turbine powered boats that are used in offshore racing and employ 2 to 4 pilots.
|
https://en.wikipedia.org/wiki?curid=20025
|
Multics Relational Data Store
The Multics Relational Data Store, or MRDS for short, was the first commercial relational database management system. It was written in PL/I by Honeywell for the Multics operating system and first sold in June 1976. Unlike the SQL systems that emerged in the late 1970s and early 1980s, MRDS used a command language only for basic data manipulation, equivalent to SQL's basic query and update statements. Other operations, like creating a new database, or general file management, required the use of a separate command program.
|
https://en.wikipedia.org/wiki?curid=20029
|
Mike Oldfield
Michael Gordon Oldfield (born 15 May 1953) is an English multi-instrumentalist and composer. His work blends progressive rock with world, folk, classical, electronic, ambient, and new-age music. His biggest commercial success is the 1973 album "Tubular Bells", which launched Virgin Records and became a hit in America after its opening was used as the theme for the horror film "The Exorcist". He recorded the 1983 hit single "Moonlight Shadow" and a rendition of the Christmas piece "In Dulci Jubilo".
Oldfield has released 26 albums, most recently a sequel to his 1975 album "Ommadawn" titled "Return to Ommadawn", on 20 January 2017.
Oldfield was born on 15 May 1953 in Reading, Berkshire to Raymond Oldfield, a general practitioner, and Maureen ("née" Liston), a nurse of Irish descent. He has two elder siblings, sister Sally and brother Terence. When Oldfield was seven his mother gave birth to a younger brother, David, but he had Down syndrome and died in infancy. She was prescribed barbiturates, to which she became addicted. She suffered from mental health problems and spent much of the rest of her life in mental institutions. She died in early 1975, shortly after Oldfield had started writing "Ommadawn".
Oldfield attended St Joseph's Convent School, Highlands Junior School, St Edward's Preparatory School, and Presentation College, all in Reading. When he was thirteen the family moved to Harold Wood, then in Essex, and Oldfield attended Hornchurch Grammar School where, having already displayed musical talent, he earned one GCE qualification in English.
Oldfield took up the guitar aged ten, first learning on a 6-string acoustic that his father gave him. He learned technique by copying parts from songs by folk guitarists Bert Jansch and John Renbourn that he played on a portable record player. He tried to learn musical notation but was a "very, very slow" learner; "If I have to, I can write things down. But I don't like to". By the time he was 12, Oldfield played the electric guitar and performed in local folk and youth clubs and dances, earning as much as £4 per gig. During a six-month break from music that Oldfield had around this time, he took up painting. In May 1968, when Oldfield turned fifteen, his school headmaster requested that he cut his long hair. Oldfield refused and left abruptly. He then decided to pursue music on a full-time, professional basis.
After leaving school Oldfield accepted an invitation from his sister Sally to form a folk duo The Sallyangie, taking its name from her name and Oldfield's favourite Jansch tune, "Angie". They toured England and Paris and struck a deal with Transatlantic Records, for which they recorded one album, "Children of the Sun" (1969). After they split in the following year Oldfield suffered a nervous breakdown. He auditioned as bassist for Family in 1969 following the departure of Ric Grech, but the group did not share Roger Chapman's enthusiasm towards Oldfield's performance. Oldfield spent much of the next year living off his father and performing in an electric rock band named Barefoot that included his brother Terry on flute, until the group disbanded in early 1970.
In February 1970, Oldfield auditioned as the bassist in The Whole World, a new backing band that former Soft Machine vocalist Kevin Ayers was putting together. He landed the position despite the bass being a new instrument for him, but he also played occasional lead guitar and later looked back on this time as providing valuable training on the bass. Oldfield went on to play on Ayers's albums "Shooting at the Moon" (1970) and "Whatevershebringswesing" (1971), and played mandolin on "Edgar Broughton Band" (1971). All three albums were recorded at Abbey Road Studios, where Oldfield familiarised himself with a variety of instruments, such as orchestral percussion, piano, Mellotron, and harpsichord, and started to write and put down musical ideas of his own. While doing so Oldfield took up work as a reserve guitarist in a stage production of "Hair" at the Shaftesbury Theatre, where he played and gigged with Alex Harvey. After ten performances Oldfield grew bored of the job and was fired after he decided to play his part for "Let the Sunshine In" in 7/8 time.
By mid-1971, Oldfield had assembled a demo tape containing sections of a longform instrumental that became "Tubular Bells (Part One)", initially entitled "Opus One". After attempts to persuade record labels to take on the project came to nothing, in September 1971 Oldfield, now a session musician and bass guitarist for the Arthur Louis Band, attended recording sessions at The Manor Studio near Kidlington, Oxfordshire, owned by businessman Richard Branson and run by engineers Tom Newman and Simon Heyworth. Branson already had several business ventures and was about to launch Virgin Records with Simon Draper. Newman and Heyworth heard some of Oldfield's demos and took them to Branson and Draper, who eventually gave Oldfield one week of recording time at The Manor. During this week, he completed "Part One" of "Tubular Bells"; "Part Two" was compiled between February and April 1973.
By the end of January 1973, Branson had agreed to release "Tubular Bells" himself and secured Oldfield with a six-album deal with Virgin, with an additional four albums as optional. "Tubular Bells" was released on 25 May 1973 as the first album on the Virgin label. Oldfield played more than twenty different instruments in the multi-layered recording, and its style moved through diverse musical genres. Its 2,630,000 UK sales put it at No. 34 on the list of the best-selling albums in the country. The title track became a top 10 hit single in the US after the opening was used in "The Exorcist" film in 1973. It is today considered to be a forerunner of the new-age music movement.
In 1974, Oldfield played the guitar on the critically acclaimed album "Rock Bottom" by Robert Wyatt.
In late 1974, his follow-up LP, "Hergest Ridge", was No. 1 in the UK for three weeks before being dethroned by "Tubular Bells". Although "Hergest Ridge" was released over a year after "Tubular Bells", it reached No. 1 first. "Tubular Bells" spent 11 weeks (10 of them consecutive) at No. 2 before its one week at the top. Like "Tubular Bells", "Hergest Ridge" is a two-movement instrumental piece, this time evoking scenes from Oldfield's Herefordshire country retreat. It was followed in 1975 by the pioneering world music piece "Ommadawn" released after the death of his mother Maureen.
In 1975, Oldfield recorded a version of the Christmas piece "In Dulci Jubilo" which charted at No. 4 in the UK.
In 1975, Oldfield received a Grammy award for Best Instrumental Composition in "Tubular Bells – Theme from "The Exorcist"".
In 1976, Oldfield and his sister joined his friend and band member Pekka Pohjola to play on his album "Mathematician's Air Display", which was released in 1977. The album was recorded and edited at Oldfield's Througham Slad Manor in Gloucestershire by Oldfield and Paul Lindsay. Oldfield's 1976 rendition of "Portsmouth" remains his best-performing single on the UK Singles Chart, reaching No. 3.
Oldfield recorded the double album "Incantations" between December 1977 and September 1978. It introduced more diverse choral performances from Sally Oldfield, Maddy Prior, and the Queen's College Girls Choir. When it was released on 1 December 1978, the album went to No. 14 in the UK and reached platinum certification for 300,000 copies sold.
In June 1978, during the recording of "Incantations", Oldfield and his siblings completed a three-day Exegesis seminar, a controversial self-assertiveness program based on Werner Erhard's EST training program. The experience had a significant effect on Oldfield's personality, who recalled that he underwent a "rebirth experience" by reliving past fears. "It was like opening some huge cathedral doors and facing the monster, and I saw that the monster was myself as a newborn infant, because I'd started life in a panic." Following the Exegesis seminar, the formerly reclusive Oldfield granted press interviews, posed nude for a promotional photo shoot for "Incantations", and went drinking with news reporters. He had also conquered his fear of flying, gained a pilot's license, and bought his own plane.
In 1979, Oldfield supported "Incantations" with a European tour that spanned 21 dates between March and May 1979. The tour was documented with the live album and concert film, "Exposed". Initially marketed as a limited pressing of 100,000 copies, sales of the album were strong enough for Virgin to abandon the idea shortly after, transferring it to regular production.
Oldfield's music was used for the score of "The Space Movie" (1980), a Virgin Films production that celebrated the tenth anniversary of the Apollo 11 mission. In 1979, he recorded a version of the signature tune for the BBC children's television programme "Blue Peter", which was used by the show for 10 years.
Oldfield's fifth album, "Platinum", was released in November 1979 and marked the start of his transition from long compositions towards mainstream and pop music. Oldfield performed across Europe between April and December 1980 with the In Concert 1980 tour.
In 1980, Oldfield released "QE2", named after the ocean liner, which features a variety of guest musicians including Phil Collins on drums. This was followed by the European Adventure Tour 1981, during which Oldfield accepted an invitation to perform at a free concert celebrating the wedding of Prince Charles and Lady Diana in Guildhall. He wrote a new track, "Royal Wedding Anthem", for the occasion.
His next album, "Five Miles Out", followed in March 1982, which features the 24-minute track "Taurus II" occupying side one. The Five Miles Out World Tour 1982 saw Oldfield perform from April to December of that year. "Crises" saw Oldfield continue the pattern of one long composition with shorter songs. The first single from the album, "Moonlight Shadow", with Maggie Reilly on vocals, became Oldfield's most successful single, reaching No. 4 in the UK and No. 1 in nine other countries. The subsequent Crises Tour in 1983 concluded with a concert at Wembley Arena to commemorate the tenth anniversary of "Tubular Bells".
Oldfield later turned to film and video, writing the score for Roland Joffé's acclaimed film "The Killing Fields" and producing substantial video footage for his album "Islands". "Islands" continued what Oldfield had been doing on the past couple of albums, with an instrumental piece on one side and rock/pop singles on the other. Of these, "Islands", sung by Bonnie Tyler and "Magic Touch", with vocals by Max Bacon (in the US version) and Glasgow vocalist Jim Price (Southside Jimmy) in the rest of the world, were the major hits. In the US "Magic Touch" reached the top 10 on the Billboard album rock charts in 1988. During the 1980s, Oldfield's then-wife, Norwegian singer Anita Hegerland, contributed vocals to many songs including "Pictures in the Dark".
Released in July 1989, "Earth Moving" features seven vocalists across the album's nine tracks. It is Oldfield's first to consist solely of rock and pop songs, several of which were released as singles: "Innocent" and "Holy" in Europe, and "Hostage" in the US.
For his next instrumental album, Virgin insisted that Oldfield use the title "Tubular Bells 2". Oldfield's rebellious response was "Amarok", an hour-long work featuring rapidly changing themes, unpredictable bursts of noise and a hidden Morse code insult, stating "Fuck off RB", allegedly directed at Branson. Oldfield did everything in his power to make it impossible to make extracts and Virgin returned the favour by barely promoting the album.
In February 1991, Oldfield released his final album for Virgin, "Heaven's Open", under the name "Michael Oldfield". It marks the first time he handled all lead vocals. In 2013, Oldfield invited Branson to the opening of St. Andrew's International School of The Bahamas, where two of Oldfield's children were pupils. This was the occasion of the debut of "Tubular Bells for Schools", a piano solo adaptation of Oldfield's work.
By early 1992, Oldfield had secured Clive Banks as his new manager and had several record label owners listen to his demo of "Tubular Bells II" at his house. Oldfield signed with Rob Dickins of WEA Warner and recorded the album with Trevor Horn as producer. Released in August 1992, the album went to No. 1 in the UK. Its live premiere followed on 4 September at Edinburgh Castle which was released on home video as "Tubular Bells II Live". Oldfield supported the album with his Tubular Bells II 20th Anniversary Tour in 1992 and 1993, his first concert tour since 1984. By April 1993, the album had sold over three million copies worldwide.
Oldfield continued to embrace new musical styles, with "The Songs of Distant Earth" (based on Arthur C. Clarke's novel of the same name) exhibiting a softer new-age sound. In 1994, he also had an asteroid, 5656 Oldfield, named after him.
In 1995, Oldfield continued to embrace new musical styles by producing the Celtic-themed album "Voyager". In 1992, Oldfield met Luar na Lubre, a Galician Celtic-folk band (from A Coruña, Spain), with the singer Rosa Cedrón. The band's popularity grew after Oldfield covered their song "O son do ar" ("The sound of the air") on his "Voyager" album.
In 1998, Oldfield produced the third "Tubular Bells" album (also premiered at a concert, this time in Horse Guards Parade, London), drawing on the dance music scene at his then new home on the island of Ibiza. This album was inspired by themes from "Tubular Bells", but differed in lacking a clear two-part structure.
During 1999, Oldfield released two albums. The first, "Guitars", used guitars as the source for all the sounds on the album, including percussion. The second, "The Millennium Bell", consisted of pastiches of a number of styles of music that represented various historical periods over the past millennium. The work was performed live in Berlin for the city's millennium celebrations in 1999–2000.
He added to his repertoire the MusicVR project, combining his music with a virtual reality-based computer game. His first work on this project was "Tr3s Lunas", launched in 2002, a virtual game in which the player can interact with a world full of new music. The project appeared as a double CD, one disc with the music and the other with the game.
In 2002 and 2003, Oldfield re-recorded "Tubular Bells" using modern equipment to coincide with the 30th anniversary of the original. He had wanted to do so years before, but his contract with Virgin kept him from it. The new version features John Cleese as the Master of Ceremonies, since Viv Stanshall, who performed the role on the original, had died in the interim. "Tubular Bells 2003" was released in May 2003.
On 12 April 2004 Oldfield launched his next virtual reality project, "Maestro", which contains music from the "Tubular Bells 2003" album and some new chillout melodies. The games have since been made available free of charge on Tubular.net. A double album, "Light + Shade", was released on Mercury Records in 2005, with whom Oldfield had recently signed a three-album deal. The two discs contain music of contrasting moods, one relaxed ("Light") and the other more edgy and moody ("Shade"). Oldfield headlined the pan-European Night of the Proms tour, consisting of 21 concerts in 2006 and 2007.
His autobiography "Changeling" was published in May 2007 by Virgin Books. In March 2008 Oldfield released his first classical album, "Music of the Spheres"; Karl Jenkins assisted with the orchestration. In the first week of release the album topped the UK Classical chart and reached number 9 on the main UK Album Chart. A single "Spheres", featuring a demo version of pieces from the album, was released digitally. The album was nominated for a Classical Brit Award, the NS&I Best Album of 2009.
In 2008, when Oldfield's original 35-year deal with Virgin Records ended, the rights to "Tubular Bells" and his other Virgin releases were returned to him, and were then transferred to Mercury Records. Mercury issued a press release on 15 April 2009, noting that Oldfield's Virgin albums would be re-released, starting 8 June 2009. These releases include special features from the archives. A further seven albums have since been reissued, and compilation albums such as "Two Sides" have been released.
In March 2010, "Music Week" reported that publishing company Stage Three Music had acquired a 50% stake in the songs of Oldfield's entire recorded output in a seven-figure deal.
In 2008, Oldfield contributed an exclusive song ("Song for Survival") to a charity album called "Songs for Survival", in support of Survival International. Oldfield's daughter, Molly, played a large part in the project. In 2010 lyricist Don Black said in an interview with "Music Week" that he had been working with Oldfield. In 2012, Oldfield was featured on Terry Oldfield's "Journey into Space" album and on a track called "Islanders" by German producer Torsten Stenzel's York project. In 2013 Oldfield and York released a remix album titled "Tubular Beats".
At the 2012 Summer Olympics opening ceremony, Oldfield performed renditions of "Tubular Bells", "Far Above the Clouds" and "In Dulci Jubilo" during a segment about the National Health Service. This track appears on the "Isles of Wonder" album which contains music from the Danny Boyle-directed show.
In October 2013, the BBC broadcast "Tubular Bells: The Mike Oldfield Story", an hour-long appreciation of Oldfield's life and musical career, filmed on location at his home recording studio in Nassau.
Oldfield's latest rock-themed album of songs, titled "Man on the Rocks", was released on 3 March 2014 by Virgin EMI. The album was produced by Steve Lipson. It marks a return of Oldfield to a Virgin-branded label, through the merger of Mercury Records UK and Virgin Records after Universal Music's purchase of EMI. The track "Nuclear" was used in the E3 trailer for "Metal Gear Solid V: The Phantom Pain".
Interviewed by Steve Wright in May 2015 for his BBC Radio 2 show, Oldfield said that he was working on a "prequel to "Tubular Bells"" which was being recorded using analogue equipment as much as possible, and suggested that the album might only be released on vinyl. He said the project was in its infancy, would follow his reissue campaign, and would be released "in a couple of years".
On 16 October 2015, Oldfield tweeted via his official Twitter account: "I am continuing to work on ideas for "A New Ommadawn" for the last week or so to see if [...] the idea actually works." On 8 May 2016, Oldfield announced via his Facebook group page that the new "Ommadawn" project, with the tentative title of "Return to Ommadawn", was finished and that he was awaiting a release date from the record company. He also suggested that he might soon start work on a possible fourth "Tubular Bells" album.
Oldfield's latest album, "Return to Ommadawn", was released on 20 January 2017 and reached No. 4 in the UK Albums Chart. On 29 January 2017, Oldfield again hinted at a "Tubular Bells 4" album via his official Facebook fan page; he uploaded photos of new equipment and a new Fender Telecaster guitar with the caption "New sounds for TB4!"
Although Oldfield considers himself primarily a guitarist, he is also one of popular music's most skilled and diverse multi-instrumentalists. His 1970s recordings were characterised by a very broad variety of instrumentation predominantly played by himself, plus assorted guitar sound treatments to suggest other instrumental timbres (such as the bagpipe, mandolin, "Glorfindel" and varispeed guitars on the original "Tubular Bells").
During the 1980s Oldfield became expert in the use of digital synthesizers and sequencers (notably the Fairlight CMI) which began to dominate the sound of his recordings: from the late 1990s onwards, he became a keen user of software synthesizers. He has, however, regularly returned to projects emphasising detailed, manually played and part-acoustic instrumentation (such as 1990's "Amarok", 1996's "Voyager" and 1999's "Guitars").
Oldfield has played over forty distinct instruments on record.
While generally preferring the sound of guest vocalists, Oldfield has frequently sung both lead and backup parts for his songs and compositions. He has also contributed experimental vocal effects such as fake choirs and the notorious "Piltdown Man" impression on "Tubular Bells".
Although recognised as a highly skilled guitarist, Oldfield is self-deprecating about his other instrumental skills, describing them as having been developed out of necessity to perform and record the music he composes. He has been particularly dismissive of his violin-playing and singing abilities.
Over the years, Oldfield has used a wide range of notable guitars.
Oldfield used a modified Roland GP8 effects processor in conjunction with his PRS Artist to get many of his heavily overdriven guitar sounds from the "Earth Moving" album onwards. Oldfield has also been using guitar synthesizers since the mid-1980s, using a 1980s Roland GR-300/G-808 type system, then a 1990s Roland GK2 equipped red PRS Custom 24 (sold in 2006) with a Roland VG8, and most recently a Line 6 Variax.
Oldfield has an unusual playing style, using fingers and long right-hand fingernails and different ways of creating vibrato: a "very fast side-to-side vibrato" and "violinist's vibrato". Oldfield has stated that his playing style originates from his musical roots playing folk music and the bass guitar.
Over the years, Oldfield has owned and used a vast number of synthesizers and other keyboard instruments. In the 1980s, he composed the score for the film "The Killing Fields" on a Fairlight CMI. Some examples of keyboard and synthesised instruments which Oldfield has made use of include Sequential Circuits Prophet-5s (notably on "Platinum" and "The Killing Fields"), Roland JV-1080/JV-2080 units (1990s), a Korg M1 (as seen in the "Innocent" video), a Clavia Nord Lead and Steinway pianos. In recent years, he has also made use of software synthesis products, such as Native Instruments.
Oldfield has occasionally sung himself on his records and live performances, sometimes using a vocoder as a resource. It is not unusual for him to collaborate with diverse singers and to hold auditions before deciding on the most appropriate voice for a particular song or album. Featured lead vocalists who have collaborated with him include Maggie Reilly, Bonnie Tyler, Anita Hegerland, and Max Bacon.
Oldfield has self-recorded and produced many of his albums, and played the majority of the featured instruments, largely at his home studios. In the 1990s and 2000s he mainly used DAWs such as Apple Logic, Avid Pro Tools and Steinberg Nuendo as recording suites. For composing orchestral music Oldfield has been quoted as using the software notation program Sibelius running on Apple Macintoshes. He also used the FL Studio DAW on his 2005 double album "Light + Shade". Among the mixing consoles Oldfield has owned are an AMS Neve Capricorn 33238, a Harrison Series X, and a Euphonix System 5-MC.
Oldfield and his siblings were raised as Roman Catholics, their mother's faith. In his early life, Oldfield used drugs including LSD, whose effects on his mental health he discussed in his autobiography. In the early 1990s, he underwent a course on mental health problems and subsequently set up a foundation called Tonic, which sponsored people to have counselling and therapy. The trustee was the Professor of Psychiatry at Guy's Hospital, London.
Oldfield has been married three times and has seven children. In 1978, he married Diana Fuller, a relative of the Exegesis group leader; the marriage lasted three months. From 1979 to 1986, Oldfield was married to Sally Cooper, with whom he had three children: Molly, Dougal (who died in 2015, aged 33), and Luke. He was then in a long-term partnership with Norwegian singer Anita Hegerland from 1986 to 1991; they have two children, Greta and Noah. Between 2002 and 2013, Oldfield was married to Fanny Vandekerckhove, whom he met while living in Ibiza; they have two sons, Jake and Eugene.
Oldfield is a motorcycle fan and has five bikes. These include a BMW R1200GS, a Suzuki GSX-R750, a Suzuki GSX-R1000, and a Yamaha R1. He says that some of his inspiration for composing comes from riding them. Throughout his life Oldfield has also had a passion for building and flying model aircraft. Since 1980, he has been a licensed pilot and has flown fixed wing aircraft (the first of which was a Beechcraft Sierra) and helicopters (including the Agusta Bell 47G, which featured on the sleeve of his cover version of the ABBA song "Arrival" as a pastiche of their album artwork). He is also interested in cars and has owned a Ferrari and a Bentley which was a gift from Richard Branson as an incentive for him to give his first live performance of "Tubular Bells". He has endorsed the Mercedes-Benz S-Class in the Mercedes UK magazine. Oldfield also considers himself to be a Trekkie. He noted in an interview in 2008 that he had two boats.
In 2007, Oldfield criticised Britain for being too controlling and protective, specifically concentrating on the smoking ban which England and Wales had introduced that year. Oldfield then moved from his South Gloucestershire home to Palma de Mallorca, Spain and then to Monaco. He has lived outside the UK in the past, including in Los Angeles and Ibiza in the 1990s and, for tax reasons, Switzerland in the mid-1980s. In 2009, he moved to the Bahamas and put his home in Mallorca up for sale. Oldfield stated in an interview with "The Times" in 2017 that he is a supporter of US President Donald Trump, and that he would have been delighted to have played at the president's inauguration ceremony. In the same interview, he also stated he was in favour of Brexit.
|
https://en.wikipedia.org/wiki?curid=20032
|
Mutual recursion
In mathematics and computer science, mutual recursion is a form of recursion where two mathematical or computational objects, such as functions or data types, are defined in terms of each other. Mutual recursion is very common in functional programming and in some problem domains, such as recursive descent parsers, where the data types are naturally mutually recursive.
The most important basic example of a data type that can be defined by mutual recursion is a tree, which can be defined mutually recursively in terms of a forest (a list of trees). Symbolically:
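f: [t[1], ..., t[k]]    (a forest is a list of trees)
t: (v, f)               (a tree is a pair of a value and a forest)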
A forest "f" consists of a list of trees, while a tree "t" consists of a pair of a value "v" and a forest "f" (its children). This definition is elegant and easy to work with abstractly (such as when proving theorems about properties of trees), as it expresses a tree in simple terms: a list of one type, and a pair of two types. Further, it matches many algorithms on trees, which consist of doing one thing with the value, and another thing with the children.
This mutually recursive definition can be converted to a singly recursive definition by inlining the definition of a forest:
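t: (v, [t[1], ..., t[k]])    (a tree is a pair of a value and a list of trees)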
A tree "t" consists of a pair of a value "v" and a list of trees (its children). This definition is more compact, but somewhat messier: a tree consists of a pair of one type and a list of another, which require disentangling to prove results about.
In Standard ML, the tree and forest data types can be mutually recursively defined as follows, allowing empty trees:
datatype 'a tree = Empty | Node of 'a * 'a forest
and 'a forest = Nil | Cons of 'a tree * 'a forest
Just as algorithms on recursive data types can naturally be given by recursive functions, algorithms on mutually recursive data structures can naturally be given by mutually recursive functions. Common examples include algorithms on trees and recursive descent parsers. As with direct recursion, tail call optimization is necessary if the recursion depth is large or unbounded, such as when using mutual recursion for multitasking. Note that tail call optimization in general (where the function called is not the same as the original function, as it is in tail-recursive calls) may be more difficult to implement than the special case of tail-recursive call optimization, and thus efficient implementation of mutual tail recursion may be absent from languages that only optimize tail-recursive calls. In languages such as Pascal that require declaration before use, mutually recursive functions require forward declaration, as a forward reference cannot be avoided when defining them.
As with directly recursive functions, a wrapper function may be useful, with the mutually recursive functions defined as nested functions within its scope if this is supported. This is particularly useful for sharing state across a set of functions without having to pass parameters between them.
A standard example of mutual recursion, which is admittedly artificial, determines whether a non-negative number is even or odd by defining two separate functions that call each other, decrementing each time. In C:
#include <stdbool.h>

bool is_odd(unsigned int n);   // forward declaration, required in C

bool is_even(unsigned int n) {
    if (n == 0)
        return true;           // 0 is even
    return is_odd(n - 1);      // n is even iff n - 1 is odd
}

bool is_odd(unsigned int n) {
    if (n == 0)
        return false;          // 0 is not odd
    return is_even(n - 1);     // n is odd iff n - 1 is even
}
These functions are based on the observation that the question "is 4 even?" is equivalent to "is 3 odd?", which is in turn equivalent to "is 2 even?", and so on down to 0. This example is mutual single recursion, and could easily be replaced by iteration. In this example, the mutually recursive calls are tail calls, and tail call optimization would be necessary to execute in constant stack space. In C, this would take "O"("n") stack space, unless rewritten to use jumps instead of calls.
This could be reduced to a single recursive function, "is_even". In that case "is_odd", which could be inlined, would call "is_even", but "is_even" would only call itself.
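Concretely (a sketch building on the C definitions above), inlining is_odd into is_even gives:

// is_even now calls only itself; is_odd survives as a wrapper.
bool is_even(unsigned int n) {
    if (n == 0) return true;
    if (n == 1) return false;   // this line is the inlined base case of is_odd(n - 1)
    return is_even(n - 2);      // the inlined recursive call
}

bool is_odd(unsigned int n) {
    return n == 0 ? false : is_even(n - 1);
}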
As a more general class of examples, an algorithm on a tree can be decomposed into its behavior on a value and its behavior on children, and can be split up into two mutually recursive functions, one specifying the behavior on a tree, calling the forest function for the forest of children, and one specifying the behavior on a forest, calling the tree function for the tree in the forest. In Python:
def f_tree(tree) -> None:
    f_value(tree.value)      # do one thing with the value (assumed helper)
    f_forest(tree.children)  # recurse on the children via the forest function

def f_forest(forest) -> None:
    for tree in forest:
        f_tree(tree)         # recurse on each tree in the forest
In this case the tree function calls the forest function by single recursion, but the forest function calls the tree function by multiple recursion.
Using the Standard ML data type above, the size of a tree (number of nodes) can be computed via the following mutually recursive functions:
fun size_tree Empty = 0
  | size_tree (Node (_, f)) = 1 + size_forest f
and size_forest Nil = 0
  | size_forest (Cons (t, f')) = size_tree t + size_forest f'
A more detailed example in Scheme, counting the leaves of a tree:
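One way such a pair might look (a sketch assuming "leaf?" and "children" helper procedures, not verbatim from any particular source):

(define (count-leaves tree)
  (if (leaf? tree)
      1                                           ; a leaf counts as one
      (count-leaves-in-forest (children tree))))  ; otherwise count the children

(define (count-leaves-in-forest forest)
  (if (null? forest)
      0
      (+ (count-leaves (car forest))              ; leaves of the first tree
         (count-leaves-in-forest (cdr forest))))) ; plus leaves of the rest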
These examples reduce easily to a single recursive function by inlining the forest function in the tree function, which is commonly done in practice: directly recursive functions that operate on trees sequentially process the value of the node and recurse on the children within one function, rather than dividing these into two separate functions.
A more complicated example is given by recursive descent parsers, which can be naturally implemented by having one function for each production rule of a grammar, which then mutually recurse; this will in general be multiple recursion, as production rules generally combine multiple parts. This can also be done without mutual recursion, for example by still having separate functions for each production rule, but having them called by a single controller function, or by putting all the grammar in a single function.
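As an illustration (a sketch, not from any particular parser, with hypothetical names), the following C program parses and evaluates the toy grammar expr → term ('+' term)*, term → factor ('*' factor)*, factor → digit | '(' expr ')'; expr() and factor() are mutually recursive through the parenthesized case:

#include <stdio.h>

static const char *p;       /* cursor into the input being parsed */

static int expr(void);      /* forward declaration, required in C */

static int factor(void) {
    if (*p == '(') {        /* '(' expr ')': the mutually recursive case */
        ++p;                /* consume '(' */
        int v = expr();
        ++p;                /* consume ')' (no error handling in this sketch) */
        return v;
    }
    return *p++ - '0';      /* a single digit */
}

static int term(void) {
    int v = factor();
    while (*p == '*') { ++p; v *= factor(); }
    return v;
}

static int expr(void) {
    int v = term();
    while (*p == '+') { ++p; v += term(); }
    return v;
}

int main(void) {
    p = "2*(3+4)";
    printf("%d\n", expr()); /* prints 14 */
    return 0;
}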
Mutual recursion can also implement a finite-state machine, with one function for each state, and single recursion in changing state; this requires tail call optimization if the number of state changes is large or unbounded. This can be used as a simple form of cooperative multitasking. A similar approach to multitasking is to instead use coroutines which call each other, where rather than terminating by calling another routine, one coroutine yields to another but does not terminate, and then resumes execution when it is yielded back to. This allows individual coroutines to hold state, without it needing to be passed by parameters or stored in shared variables.
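For instance, a two-state recognizer for strings containing an even number of the letter 'a' might be sketched as follows (hypothetical names; each state is a function, and every state change is a tail call, so tail call optimization would keep the stack bounded):

#include <stdbool.h>

static bool state_odd(const char *s);   /* forward declaration */

static bool state_even(const char *s) { /* accepting state */
    if (*s == '\0') return true;
    return *s == 'a' ? state_odd(s + 1) : state_even(s + 1);
}

static bool state_odd(const char *s) {  /* non-accepting state */
    if (*s == '\0') return false;
    return *s == 'a' ? state_even(s + 1) : state_odd(s + 1);
}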
There are also some algorithms which naturally have two phases, such as minimax (min and max), and these can be implemented by having each phase in a separate function with mutual recursion, though they can also be combined into a single function with direct recursion.
In mathematics, the Hofstadter Female and Male sequences are an example of a pair of integer sequences defined in a mutually recursive manner.
Fractals can be computed (up to a given resolution) by recursive functions. This can sometimes be done more elegantly via mutually recursive functions; the Sierpiński curve is a good example.
Mutual recursion is very common in the functional programming style, and is often used for programs written in LISP, Scheme, ML, and similar languages. For example, Abelson and Sussman describe how a metacircular evaluator can be used to implement LISP with an eval-apply cycle. In languages such as Prolog, mutual recursion is almost unavoidable.
Some programming styles discourage mutual recursion, claiming that it can be confusing to distinguish the conditions which will return an answer from the conditions that would allow the code to run forever without producing an answer; Peter Norvig points to a design pattern which discourages the use entirely.
Mutual recursion is also known as indirect recursion, by contrast with direct recursion, where a single function calls itself directly. This is simply a difference of emphasis, not a different notion: "indirect recursion" emphasises an individual function, while "mutual recursion" emphasises the set of functions, and does not single out an individual function. For example, if "f" calls itself, that is direct recursion. If instead "f" calls "g" and then "g" calls "f," which in turn calls "g" again, from the point of view of "f" alone, "f" is indirectly recursing, while from the point of view of "g" alone, "g" is indirectly recursing, while from the point of view of both, "f" and "g" are mutually recursing on each other. Similarly a set of three or more functions that call each other can be called a set of mutually recursive functions.
Mathematically, a set of mutually recursive functions are primitive recursive, which can be proven by course-of-values recursion, building a single function "F" that lists the values of the individual recursive functions in order (for two functions "f" and "g", for instance, "F"(2"n") = "f"("n") and "F"(2"n"+1) = "g"("n")) and rewriting the mutual recursion as a primitive recursion.
Any mutual recursion between two procedures can be converted to direct recursion by inlining the code of one procedure into the other. If there is only one site where one procedure calls the other, this is straightforward, though if there are several it can involve code duplication. In terms of the call stack, two mutually recursive procedures yield a stack ABABAB..., and inlining B into A yields the direct recursion (AB)(AB)(AB)...
Alternately, any number of procedures can be merged into a single procedure that takes as argument a variant record (or algebraic data type) representing the selection of a procedure and its arguments; the merged procedure then dispatches on its argument to execute the corresponding code and uses direct recursion to call self as appropriate. This can be seen as a limited application of defunctionalization. This translation may be useful when any of the mutually recursive procedures can be called by outside code, so there is no obvious case for inlining one procedure into the other. Such code then needs to be modified so that procedure calls are performed by bundling arguments into a variant record as described; alternately, wrapper procedures may be used for this task.
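Applied to the even/odd pair above, the translation might look like this (a sketch; the enum selector stands in for the variant record):

#include <stdbool.h>

enum which { IS_EVEN, IS_ODD };         /* the "variant record" selector */

static bool parity(enum which w, unsigned int n) {
    if (n == 0)
        return w == IS_EVEN;            /* base cases of both procedures */
    /* each original call site becomes a self-call with the other selector */
    return parity(w == IS_EVEN ? IS_ODD : IS_EVEN, n - 1);
}

/* wrapper procedures preserve the original interfaces */
static bool is_even(unsigned int n) { return parity(IS_EVEN, n); }
static bool is_odd (unsigned int n) { return parity(IS_ODD,  n); }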
|
https://en.wikipedia.org/wiki?curid=20034
|
Metasyntactic variable
A metasyntactic variable is a specific word or set of words identified as a placeholder in computer science and specifically computer programming. These words are commonly found in source code and are intended to be modified or substituted to be applicable to the specific usage before compilation (translation to an executable). The words foo and bar are good examples as they are used in over 330 Internet Engineering Task Force Requests for Comments, which are documents explaining foundational internet technologies like HTTP (websites), TCP/IP, and email protocols.
By mathematical analogy, a metasyntactic variable is a word that is a variable for other words, just as in algebra letters are used as variables for numbers.
Metasyntactic variables are used to name entities such as variables, functions, and commands whose exact identity is unimportant and which serve only to demonstrate a concept; this is useful for teaching programming.
Due to English being the foundation-language, or lingua franca, of most computer programming languages these variables are commonly seen even in programs and examples of programs written for other spoken-language audiences.
The typical names may depend however on the subculture that has developed around a given programming language.
Metasyntactic variables used commonly across all programming languages include "foobar", "foo", "bar", "baz", "qux", "quux", "quuz", "corge", "grault", "garply", "waldo", "fred", "plugh", "xyzzy", and "thud"; several of these words are references to the game "Colossal Cave Adventure". "Wibble", "wobble", "wubble", and "flob" are also used in the UK.
A complete reference can be found in an MIT Press book titled "The Hacker's Dictionary".
In Japanese, the words "hoge" (ほげ) and "piyo" (ぴよ) are commonly used, with other common words and variants being "fuga" (ふが), "hogera" (ほげら), and "hogehoge" (ほげほげ). Note that "-ra" is a pluralizing ending in Japanese, and reduplication is also used for pluralizing. The origin of "hoge" as a metasyntactic variable is not known, but it is believed to date to the early 1980s.
In France, the word "toto" is widely used, with the variants "tata", "titi", and "tutu" as related placeholders. One commonly cited source for the use of "toto" is a reference to Tête à Toto, a stock character used in jokes.
In the following example the function name foo and the variable name bar are both metasyntactic variables. Lines beginning with // are comments.
// The function named foo
int foo(void)
{
    // The variable named bar
    int bar = 1;
    return bar;
}
Spam, ham, and eggs are the principal metasyntactic variables used in the Python programming language. This is a reference to the famous comedy sketch, "Spam", by Monty Python, the eponym of the language.
In the following example spam, ham, and eggs are metasyntactic variables and lines beginning with # are comments.
def spam():
    # ham and eggs are metasyntactic variables
    ham = "Hello, "
    eggs = "world!"
    print(ham + eggs)
Both the IETF RFCs and computer programming languages are rendered in plain text, making it necessary to distinguish metasyntactic variables by a naming convention, since it would not be obvious from context.
A plain-text example of this appears in RFC 772 (cited in RFC 3092).
A related convention is that a metavariable is to be uniformly substituted with the same instance in all its appearances in a given schema. This is in contrast with nonterminal symbols in formal grammars, where the nonterminals on the right of a production can be substituted by different instances.
This section includes bits of code which show how metasyntactic variables are used in teaching computer programming concepts.
Function prototypes with different argument passing mechanisms:
void Foo(Fruit bar);         // pass by value
void Foo(Fruit* bar);        // pass by pointer
void Foo(const Fruit& bar);  // pass by constant reference
Example showing the function overloading capabilities of the C++ language:
void Foo(int bar);
void Foo(int bar, int baz);
void Foo(int bar, int baz, int qux);
|
https://en.wikipedia.org/wiki?curid=20036
|
Mondegreen
A mondegreen is a mishearing or misinterpretation of a phrase in a way that gives it a new meaning. Mondegreens are most often created by a person listening to a poem or a song; the listener, being unable to clearly hear a lyric, substitutes words that sound similar and make some kind of sense. American writer Sylvia Wright coined the term in 1954, writing that as a girl, when her mother read to her from Percy's "Reliques", she had misheard the lyric "layd him on the green" in the fourth line of the Scottish ballad "The Bonny Earl of Murray" as "Lady Mondegreen".
"Mondegreen" was included in the 2000 edition of the "Random House Webster's College Dictionary", and in the "Oxford English Dictionary" in 2002. Merriam-Webster's "Collegiate Dictionary" added the word in 2008.
In a 1954 essay in "Harper's Magazine", Wright described how, as a young girl, she misheard the last line of the first stanza of the seventeenth-century ballad "The Bonnie Earl O' Moray".
The correct fourth line is, "And "laid him on the green"." Wright explained the need for a new term: "The point about what I shall hereafter call mondegreens, since no one else has thought up a word for them, is that they are better than the original."
People are more likely to notice what they expect than things not part of their everyday experiences; this is known as confirmation bias. Similarly, one may mistake an unfamiliar stimulus for a familiar and more plausible version. For example, to consider a well-known mondegreen in the song "Purple Haze", one would be more likely to hear Jimi Hendrix singing that he is about to "kiss this guy" than that he is about to "kiss the sky". Similarly, if a lyric uses words or phrases that the listener is unfamiliar with, they may be misheard as using more familiar terms.
The creation of mondegreens may be driven in part by cognitive dissonance, as the listener finds it psychologically uncomfortable to listen to a song and not make out the words. Steven Connor suggests that mondegreens are the result of the brain's constant attempts to make sense of the world by making assumptions to fill in the gaps when it cannot clearly determine what it is hearing. Connor sees mondegreens as the "wrenchings of nonsense into sense".
This dissonance will be most acute when the lyrics are in a language the listener is fluent in.
On the other hand, Steven Pinker has observed that mondegreen mishearings tend to be "less" plausible than the original lyrics, and that once a listener has "locked in" to a particular misheard interpretation of a song's lyrics, it can remain unquestioned, even when that plausibility becomes strained. Pinker gives the example of a student "stubbornly" mishearing the chorus to "Venus" ("I'm your Venus") as "I'm your penis," and being surprised that the song was allowed on the radio. The phenomenon may, in some cases, be triggered by people hearing "what they want to hear", as in the case of the song "Louie Louie": parents heard obscenities in the Kingsmen recording where none existed.
James Gleick claims that the mondegreen is a distinctly modern phenomenon. Without the improved communication and language standardization brought about by radio, he believes there would have been no way to recognize and discuss this shared experience. Just as mondegreens transform songs based on experience, a folk song learned by repetition often is transformed over time when sung by people in a region where some of the song's references have become obscure. A classic example is "The Golden Vanity", which contains the line "As she sailed upon the lowland sea". British immigrants carried the song to Appalachia, where singers, not knowing what the term "lowland sea" refers to, transformed it over generations from "lowland" to "lonesome".
The classicist and linguist Steve Reece has collected examples of English mondegreens in song lyrics, religious creeds and liturgies, commercials and advertisements, and jokes and riddles. He has used this collection to shed light on the process of "junctural metanalysis" during the oral transmission of the ancient Greek epics, the "Iliad" and the "Odyssey".
The national anthem of the United States is highly susceptible to the creation of mondegreens (especially for young grade-school students), with two in the first line alone. Francis Scott Key's "The Star-Spangled Banner" begins with the line "O say can you see, by the dawn's early light." This has been accidentally and deliberately misinterpreted, countless times, as "Jose, can you see," another example of the Hobson-Jobson effect. The second half of the line has been misheard as well, as "by the donzerly light" or other variants; this has led many people to believe that "donzerly" is an actual word.
Religious songs, learned by ear (and often by children), are another common source of mondegreens. The most-cited example is "Gladly, the cross-eyed bear", from the line "Kept by Thy tender care, gladly the cross I'll bear" in the hymn "Keep Thou My Way" by Fanny Crosby and Theodore E. Perkins. Jon Carroll and many others quote it as "Gladly the cross I'd bear".
Mondegreens expanded as a phenomenon with radio, and, especially, the growth of rock and roll (and even more so with rap). Amongst the most-reported examples are "there's a bathroom on the right" (for "there's a bad moon on the rise" in Creedence Clearwater Revival's "Bad Moon Rising") and "'scuse me while I kiss this guy" (for "'scuse me while I kiss the sky" in Jimi Hendrix's "Purple Haze").
Both Creedence's John Fogerty and Hendrix eventually acknowledged these mishearings by deliberately singing the "mondegreen" versions of their songs in concert.
"Blinded by the Light", a cover of a Bruce Springsteen song by Manfred Mann's Earth Band, contains what has been called "probably the most misheard lyric of all time". The phrase "revved up like a deuce", altered from Springsteen's original "cut loose like a deuce," both lyrics referring to the hot rodders slang "deuce" (short for deuce coupé) for a 1932 Ford coupé, is frequently misheard as "wrapped up like a douche". Springsteen himself has joked about the phenomenon, claiming that it was not until Manfred Mann rewrote the song to be about a "feminine hygiene product" that the song became popular.
Another commonly-cited example of a song susceptible to mondegreens is Nirvana's "Smells Like Teen Spirit", with the line "here we are now, entertain us" variously being misinterpreted as "here we are now, "in containers"", and "here we are now, "hot potatoes"", amongst other renditions.
Rap and hip hop lyrics may be particularly susceptible to being misheard because they do not necessarily follow standard pronunciations. The delivery of rap lyrics relies heavily upon an often regional pronunciation or non-traditional accenting of words and their phonemes to adhere to the artist's stylizations and the lyrics' written structure. This issue is exemplified in controversies over alleged transcription errors in Yale University Press's 2010 "Anthology of Rap."
Sometimes, the modified version of a lyric becomes standard, as is the case with "The Twelve Days of Christmas". The original has "four colly birds" ("colly" means "black"; cf. "A Midsummer Night's Dream": "Brief as the lightning in the collied night."); sometime around the turn of the twentieth century, these became "calling" birds, which is the lyric used in the 1909 Frederic Austin version.
A number of misheard lyrics have been recorded, turning a mondegreen into a real title. The song "Sea Lion Woman", recorded in 1939 by Christine and Katherine Shipp, was performed by Nina Simone under the title, "See Line Woman". According to the liner notes from the compilation "A Treasury of Library of Congress Field Recordings", the correct title of this playground song might also be "See [the] Lyin' Woman" or "C-Line Woman". Jack Lawrence's misinterpretation of the French phrase "pauvre Jean" ("poor John") as the identically pronounced "pauvres gens" ("poor people") led to the translation of "La Goualante du pauvre Jean" ("The Ballad of Poor John") as "The Poor People of Paris", a hit song in 1956.
"A Monk Swimming" by author Malachy McCourt is so titled because of a childhood mishearing of a phrase from the Catholic rosary prayer, Hail Mary. "Amongst women" became "a monk swimmin'".
The title and plot of the short science fiction story "Come You Nigh: Kay Shuns" ("Com-mu-ni-ca-tions") by Lawrence A. Perkins, in "Analog Science Fiction and Fact" magazine (April 1970), deals with securing interplanetary radio communications by encoding them with mondegreens.
"Olive, the Other Reindeer" is a 1997 children's book by Vivian Walsh, which borrows its title from a mondegreen of the line, "all of the other reindeer" in the song "Rudolph the Red-Nosed Reindeer". The book was adapted into an animated Christmas special in 1999.
The travel guide book series Lonely Planet is named after the misheard phrase "lovely planet" sung by Joe Cocker in Matthew Moore's song "Space Captain".
The title of J. D. Salinger's novel "The Catcher in the Rye" (1951) comes from the name of the poem "Comin' Thro' the Rye". The protagonist, Holden Caulfield, misinterprets a part of this poem as "if a body catch a body" rather than "if a body meet a body". He keeps picturing children playing in a field of rye near the edge of a cliff, with himself catching them when they start to fall off.
A monologue of mondegreens appears in the 1971 film "Carnal Knowledge". The camera focuses on actress Candice Bergen laughing as she recounts various phrases that fooled her as a child, including "Round John Virgin" (instead of '"Round yon virgin...") and "Gladly, the cross-eyed bear" (instead of “Gladly the cross I’d bear”).
The enigmatic title of the 2013 film "Ain't Them Bodies Saints" is a misheard lyric from a folk song; director David Lowery decided to use it because it evoked the "classical, regional" feel of 1970s rural Texas.
Mondegreens have been used in many television advertising campaigns.
The traditional game Chinese whispers ("Telephone" in North America) involves mishearing a whispered sentence to produce successive mondegreens that gradually distort the original sentence as it is repeated by successive listeners.
Among schoolchildren in the U.S., daily rote recitation of the Pledge of Allegiance has long provided opportunities for the genesis of mondegreens.
In Dutch, mondegreens are popularly referred to as "Mama appelsap" ("Mommy applejuice"), from the Michael Jackson song "Wanna Be Startin' Somethin'" which features the lyrics "Mama-se mama-sa ma-ma-coo-sa", and was once misheard as "Mama say mama sa mam[a]appelsap". The Dutch radio station 3FM had a show "Superrradio" (originally "Timur Open Radio") run by Timur Perlin and Ramon with an item in which listeners were encouraged to send in mondegreens under the name "Mama appelsap". The segment was popular for years.
In French, the phenomenon is also known as 'hallucination auditive', especially when referring to pop songs.
The title of the film "La Vie en rose" depicting the life of Édith Piaf can be mistaken for "L'Avion rose" (The pink airplane).
The title of the 1983 French novel "Le Thé au harem d'Archi Ahmed" ("Tea in the Harem of Archi Ahmed") by Mehdi Charef (and the 1985 movie of the same name) is based on the main character mishearing "le théorème d'Archimède" ("the theorem of Archimedes") in his mathematics class.
A classic example in French is similar to the "Lady Mondegreen" anecdote: in his 1962 collection of children's quotes "La Foire aux cancres", the humorist Jean-Charles refers to a misunderstood lyric of "La Marseillaise" (the French national anthem): "Entendez-vous ... mugir ces féroces soldats" (Do you hear those savage soldiers roar?) is heard as "...Séféro, ce soldat" (that soldier Séféro).
Mondegreens are a well-known phenomenon in German, especially where non-German songs are concerned. They are sometimes called, after a well-known example, "Agathe Bauer"-songs ("I got the power", a song by Snap!, transferred to a German female name). Journalist Axel Hacke published a series of books about them, beginning with "Der weiße Neger Wumbaba" ("The White Negro Wumbaba", after the line "der weiße Nebel wunderbar" from Der Mond ist aufgegangen).
It is at least an urban legend that children, when painting nativity scenes, occasionally include, next to the Child, Mary, Joseph, the shepherds and so forth, yet another laughing creature: the "Owi", who must be depicted laughing. The reason is found in the line "Gottes Sohn! O wie lacht / Lieb' aus Deinem göttlichen Mund" (God's Son! Oh, how love laughs out of Thy divine mouth!) from "Silent Night". The subject is "Lieb'", a poetic contraction of "die Liebe" that drops the final -e and the definite article (mandatory in such a context in German, though not in English), so the phrase is not easily understood and might well be taken as a statement about a person named Owi laughing "in a loveable manner" (the adverb "lieb"), even though the rest of the sentence then makes no sense. "Owi lacht" is the title of at least one book about Christmas and Christmas songs.
Ghil'ad Zuckermann cites the Hebrew example "mukhrakhím liyót saméakh" ("we must be happy", with a grammar mistake) instead of (the high-register) "úru 'akhím belév saméakh" ("wake up, brothers, with a happy heart"), from the well-known song "Háva Nagíla" ("Let's be happy").
The Israeli site dedicated to Hebrew mondegreens has coined the term "avatiach" (Hebrew for watermelon) for "mondegreen", named for a common mishearing of Shlomo Artzi's award-winning 1970 song "Ahavtia" ("I loved her", using a form uncommon in spoken Hebrew).
A paper in phonology cites the memoirs of the poet Antoni Słonimski, who confessed that in the recited poem "Konrad Wallenrod" he used to hear "zwierz Alpuhary" ("a beast of Alpujarras") rather than "z wież Alpuhary" ("from the towers of Alpujarras").
The most well-known mondegreen in Brazil is in the song "Noite do Prazer" (Night of Pleasure) by Claudio Zoli: the line "Na madrugada a vitrola rolando um blues, tocando B. B. King sem parar" (At dawn the phonograph playing blues, playing B. B. King nonstop), is often misheard as "Na madrugada a vitrola rolando um blues, trocando de biquini sem parar" (at dawn the phonograph playing blues, changing bikinis nonstop).
Russian author Fyodor Dostoyevsky, in 1875, cited a line from Fyodor Glinka's song "Troika" (1825) "колокольчик, дар Валдая" ("the bell, gift of Valday") claiming that it is usually understood as "колокольчик, дарвалдая" ("the bell "darvaldaying""—supposedly an onomatopoeia of ringing).
A reverse mondegreen is the intentional production, in speech or writing, of words or phrases that seem to be gibberish but disguise meaning. A prominent example is "Mairzy Doats", a 1943 novelty song by Milton Drake, Al Hoffman, and Jerry Livingston. The lyrics are a reverse mondegreen, made up of oronyms, so pronounced (and written) as to challenge the listener (or reader) to interpret them: "Mairzy doats and dozy doats and liddle lamzy divey / A kiddley divey too, wooden shoe?"
The clue to the meaning is contained in the bridge: "If the words sound queer and funny to your ear, a little bit jumbled and jivey, / Sing 'Mares eat oats and does eat oats and little lambs eat ivy.'"
This makes it clear that the last line is "A kid'll eat ivy, too; wouldn't you?"
Two authors have written books of supposed foreign-language poetry that are actually mondegreens of nursery rhymes in English. Luis van Rooten's pseudo-French "Mots D'Heures: Gousses, Rames" includes critical, historical, and interpretive apparatus, as does John Hulme's "Mörder Guss Reims", attributed to a fictitious German poet. Both titles sound like the phrase "Mother Goose Rhymes". Both works can also be considered soramimi, which produces different meanings when interpreted in another language. Wolfgang Amadeus Mozart produced a similar effect in his canon "Difficile Lectu", which, though ostensibly in Latin, is actually an opportunity for scatological humor in both German and Italian.
Some performers and writers have used deliberate mondegreens to create double entendres. The phrase "if you see Kay" (F-U-C-K) has been employed many times, notably as a line from James Joyce's 1922 novel "Ulysses" and in many songs, including by blues pianist Memphis Slim in 1963, R. Stevie Moore in 1977, April Wine on its 1982 album "Power Play", the Poster Children via their "Daisy Chain Reaction" in 1991, Turbonegro in 2005, Aerosmith in "Devil's Got a New Disguise" in 2006, and The Script in their 2008 song "If You See Kay". Britney Spears did the same thing with the song "If U Seek Amy". A similar effect was created in Hindi in the 2011 Bollywood movie "Delhi Belly" in the song "Bhaag D.K. Bose". While "D. K. Bose" appears to be a person's name, it is sung repeatedly in the chorus to form the deliberate mondegreen ""bhosadi ke"" (Hindi: भोसडी के), a Hindi expletive.
"Mondegreen" is a song by Yeasayer on their 2010 album, "Odd Blood". The lyrics are intentionally obscure (for instance, "Everybody sugar in my bed" and "Perhaps the pollen in the air turns us into a stapler") and spoken hastily to encourage the mondegreen effect.
Closely related categories are Hobson-Jobson, where a word from a foreign language is homophonically translated into one's own language, e.g. "cockroach" from Spanish "cucaracha", and "soramimi", a Japanese term for homophonic translation of song lyrics.
An unintentionally incorrect use of similar-sounding words or phrases, resulting in a changed meaning, is a malapropism. If there is a connection in meaning, it may be called an eggcorn. If a person stubbornly continues to mispronounce a word or phrase after being corrected, that person has committed a mumpsimus.
|
https://en.wikipedia.org/wiki?curid=20038
|
Merge sort
In computer science, merge sort (also commonly spelled mergesort) is an efficient, general-purpose, comparison-based sorting algorithm. Most implementations produce a stable sort, which means that the order of equal elements is the same in the input and output. Merge sort is a divide and conquer algorithm that was invented by John von Neumann in 1945. A detailed description and analysis of bottom-up mergesort appeared in a report by Goldstine and von Neumann as early as 1948.
Conceptually, a merge sort works as follows: first, divide the unsorted list into "n" sublists, each containing one element (a list of one element is considered sorted); then repeatedly merge sublists to produce new sorted sublists until only one sublist remains, which is the sorted list.
Example C-like code using indices for the top-down merge sort algorithm that recursively splits the list (called "runs" in this example) into sublists until sublist size is 1, then merges those sublists to produce a sorted list. The copy-back step is avoided by alternating the direction of the merge with each level of recursion (except for an initial one-time copy). To help understand this, consider an array with two elements. The elements are copied to B[], then merged back to A[]. If there are four elements, when the bottom recursion level is reached, single-element runs from A[] are merged to B[], and then at the next higher level of recursion, those two-element runs are merged to A[]. This pattern continues with each level of recursion.
// Array A[] has the items to sort; array B[] is a work array.
void TopDownMergeSort(A[], B[], n)
{
    CopyArray(A, 0, n, B);           // one-time copy of A[] to B[]
    TopDownSplitMerge(B, 0, n, A);   // sort data from B[] into A[]
}

// Sort the given run of array A[] using array B[] as a source.
// iBegin is inclusive; iEnd is exclusive (A[iEnd] is not in the set).
void TopDownSplitMerge(B[], iBegin, iEnd, A[])
{
    if (iEnd - iBegin <= 1)                     // if run size == 1,
        return;                                 //   consider it sorted
    iMiddle = (iEnd + iBegin) / 2;              // split the run into two halves
    TopDownSplitMerge(A, iBegin, iMiddle, B);   // recursively sort the left run from A[] into B[]
    TopDownSplitMerge(A, iMiddle, iEnd, B);     // recursively sort the right run from A[] into B[]
    TopDownMerge(B, iBegin, iMiddle, iEnd, A);  // merge the resulting runs from B[] into A[]
}

// Left source half is A[ iBegin:iMiddle-1].
// Right source half is A[iMiddle:iEnd-1 ].
// Result is B[ iBegin:iEnd-1 ].
void TopDownMerge(A[], iBegin, iMiddle, iEnd, B[])
{
    i = iBegin; j = iMiddle;
    // While there are elements in the left or right runs...
    for (k = iBegin; k < iEnd; k++) {
        // If left run head exists and is <= existing right run head.
        if (i < iMiddle && (j >= iEnd || A[i] <= A[j])) {
            B[k] = A[i];
            i = i + 1;
        } else {
            B[k] = A[j];
            j = j + 1;
        }
    }
}

void CopyArray(A[], iBegin, iEnd, B[])
{
    for (k = iBegin; k < iEnd; k++)
        B[k] = A[k];
}
Example C-like code using indices for bottom-up merge sort algorithm which treats the list as an array of "n" sublists (called "runs" in this example) of size 1, and iteratively merges sub-lists back and forth between two buffers:
// array A[] has the items to sort; array B[] is a work array
void BottomUpMergeSort(A[], B[], n)
{
    // Each 1-element run in A is already "sorted".
    // Make successively longer sorted runs of length 2, 4, 8, ... until the whole array is sorted.
    for (width = 1; width < n; width = 2 * width)
    {
        // Array A is full of runs of length width.
        for (i = 0; i < n; i = i + 2 * width)
        {
            // Merge two runs: A[i:i+width-1] and A[i+width:i+2*width-1] to B[],
            // or copy A[i:n-1] to B[] (if i+width >= n).
            BottomUpMerge(A, i, min(i + width, n), min(i + 2 * width, n), B);
        }
        // Now work array B is full of runs of length 2*width.
        // Copy array B to array A for the next iteration.
        // (A more efficient implementation would swap the roles of A and B.)
        CopyArray(B, A, n);
        // Now array A is full of runs of length 2*width.
    }
}

// Left run is A[iLeft :iRight-1].
// Right run is A[iRight:iEnd-1 ].
void BottomUpMerge(A[], iLeft, iRight, iEnd, B[])
{
    i = iLeft; j = iRight;
    // While there are elements in the left or right runs...
    for (k = iLeft; k < iEnd; k++) {
        // If left run head exists and is <= existing right run head.
        if (i < iRight && (j >= iEnd || A[i] <= A[j])) {
            B[k] = A[i];
            i = i + 1;
        } else {
            B[k] = A[j];
            j = j + 1;
        }
    }
}

void CopyArray(B[], A[], n)
{
    for (i = 0; i < n; i++)
        A[i] = B[i];
}
Pseudocode for top-down merge sort algorithm which recursively divides the input list into smaller sublists until the sublists are trivially sorted, and then merges the sublists while returning up the call chain.
In this example, the "merge" function merges the left and right sublists; a sketch of both functions follows.
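One plausible rendering of the two functions (a sketch, not verbatim from any particular source):

function merge_sort(list m) is
    // Base case. A list of zero or one elements is sorted, by definition.
    if length of m ≤ 1 then
        return m
    // Recursive case. Divide the list into two halves.
    var left := first half of m
    var right := second half of m
    // Recursively sort both sublists, then merge them.
    left := merge_sort(left)
    right := merge_sort(right)
    return merge(left, right)

function merge(left, right) is
    var result := empty list
    // Repeatedly take the smaller head element until one list is exhausted.
    while left is not empty and right is not empty do
        if first(left) ≤ first(right) then
            append first(left) to result; left := rest(left)
        else
            append first(right) to result; right := rest(right)
    // Append whatever remains (at most one of the lists is non-empty).
    append all remaining elements of left to result
    append all remaining elements of right to result
    return result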
Pseudocode for a bottom-up merge sort algorithm which uses a small fixed-size array of references to nodes, where array[i] is either a reference to a list of size 2"i" or "nil". "node" is a reference or pointer to a node. The merge() function would be similar to the one shown in the top-down merge-lists example: it merges two already sorted lists and handles empty lists. In this case, merge() would use "node" for its input parameters and return value. A sketch appears below.
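A sketch consistent with that description (an array of size 32 handles lists of up to 2^32 nodes):

function merge_sort(node head) is
    if head = nil then
        return nil
    var node array[32]           // array[i] holds a sorted list of size 2^i, or nil
    var node result := head
    // Insert each node as a 1-element list, carrying merges like binary addition.
    while result ≠ nil do
        var node next := result.next
        result.next := nil
        var int i := 0
        while i < 32 and array[i] ≠ nil do
            result := merge(array[i], result)
            array[i] := nil
            i := i + 1
        if i = 32 then i := 31   // do not run past the end of the array
        array[i] := result
        result := next
    // Merge all remaining lists into a single sorted list.
    result := nil
    for i := 0 to 31 do
        result := merge(array[i], result)
    return result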
A natural merge sort is similar to a bottom-up merge sort except that any naturally occurring runs (sorted sequences) in the input are exploited. Both monotonic and bitonic (alternating up/down) runs may be exploited, with lists (or equivalently tapes or files) being convenient data structures (used as FIFO queues or LIFO stacks). In the bottom-up merge sort, the starting point assumes each run is one item long. In practice, random input data will have many short runs that just happen to be sorted. In the typical case, the natural merge sort may not need as many passes because there are fewer runs to merge. In the best case, the input is already sorted (i.e., is one run), so the natural merge sort need only make one pass through the data. In many practical cases, long natural runs are present, and for that reason natural merge sort is exploited as the key component of Timsort. Example:
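(A representative trace, with runs shown in parentheses:)

Start       :  3  4  2  1  7  5  8  9  0  6
Select runs : (3 4) (2) (1 7) (5 8 9) (0 6)
Merge       : (2 3 4) (1 5 7 8 9) (0 6)
Merge       : (1 2 3 4 5 7 8 9) (0 6)
Merge       : (0 1 2 3 4 5 6 7 8 9)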
Tournament replacement selection sorts are used to gather the initial runs for external sorting algorithms.
In sorting "n" objects, merge sort has an average and worst-case performance of O("n" log "n"). If the running time of merge sort for a list of length "n" is "T"("n"), then the recurrence "T"("n") = 2"T"("n"/2) + "n" follows from the definition of the algorithm (apply the algorithm to two lists of half the size of the original list, and add the "n" steps taken to merge the resulting two lists). The closed form follows from the master theorem for divide-and-conquer recurrences.
In the worst case, the number of comparisons merge sort makes is given by the sorting numbers. These numbers are equal to or slightly smaller than ("n" ⌈lg "n"⌉ − 2^⌈lg "n"⌉ + 1), which is between ("n" lg "n" − "n" + 1) and ("n" lg "n" + "n" + O(lg "n")).
For large "n" and a randomly ordered input list, merge sort's expected (average) number of comparisons approaches "α"·"n" fewer than the worst case, where "α" = −1 + Σ("k"=0 to ∞) 1/(2^"k" + 1) ≈ 0.2645.
In the "worst" case, merge sort does about 39% fewer comparisons than quicksort does in the "average" case. In terms of moves, merge sort's worst case complexity is O("n" log "n")—the same complexity as quicksort's best case, and merge sort's best case takes about half as many iterations as the worst case.
Merge sort is more efficient than quicksort for some types of lists if the data to be sorted can only be efficiently accessed sequentially, and is thus popular in languages such as Lisp, where sequentially accessed data structures are very common. Unlike some (efficient) implementations of quicksort, merge sort is a stable sort.
Merge sort's most common implementation does not sort in place; therefore, the memory size of the input must be allocated for the sorted output to be stored in (see below for versions that need only "n"/2 extra spaces).
Variants of merge sort are primarily concerned with reducing the space complexity and the cost of copying.
A simple alternative for reducing the space overhead to "n"/2 is to maintain "left" and "right" as a combined structure, copy only the "left" part of "m" into temporary space, and direct the "merge" routine to place the merged output into "m". With this version it is better to allocate the temporary space outside the "merge" routine, so that only one allocation is needed. The excessive copying mentioned previously is also mitigated, since the last pair of lines before the "return result" statement (the "merge" function in the pseudocode above) become superfluous.
One drawback of merge sort, when implemented on arrays, is its working memory requirement; several in-place variants have been suggested.
An alternative to reduce the copying into multiple lists is to associate a new field of information with each key (the elements in "m" are called keys). This field will be used to link the keys and any associated information together in a sorted list (a key and its related information is called a record). Then the merging of the sorted lists proceeds by changing the link values; no records need to be moved at all. A field which contains only a link will generally be smaller than an entire record so less space will also be used. This is a standard sorting technique, not restricted to merge sort.
An external merge sort is practical to run using disk or tape drives when the data to be sorted is too large to fit into memory. External sorting explains how merge sort is implemented with disk drives. A typical tape drive sort uses four tape drives. All I/O is sequential (except for rewinds at the end of each pass). A minimal implementation can get by with just two record buffers and a few program variables.
Naming the four tape drives as A, B, C, D, with the original data on A, and using only two record buffers, the algorithm is similar to the bottom-up implementation, using pairs of tape drives instead of arrays in memory. The basic algorithm can be described as follows: merge pairs of records from A, writing two-record sublists alternately to C and D; merge the two-record sublists from C and D into four-record sublists, writing these alternately to A and B; merge the four-record sublists from A and B into eight-record sublists, writing these alternately to C and D; and repeat until there is one list containing all the data, sorted, in ⌈log2("n")⌉ passes.
Instead of starting with very short runs, usually a hybrid algorithm is used, where the initial pass reads many records into memory, does an internal sort to create a long run, and then distributes those long runs onto the output set. This step avoids many early passes. For example, an internal sort of 1024 records will save nine passes. Because the benefit is so large, the internal sort is usually made as large as memory allows. In fact, there are techniques that can make the initial runs longer than the available internal memory.
With some overhead, the above algorithm can be modified to use three tapes. "O"("n" log "n") running time can also be achieved using two queues, or a stack and a queue, or three stacks. In the other direction, using "k" > 2 tapes (and "O"("k") items in memory), we can reduce the number of tape operations by a factor of "O"(log "k") by using a "k"/2-way merge.
A more sophisticated merge sort that optimizes tape (and disk) drive usage is the polyphase merge sort.
On modern computers, locality of reference can be of paramount importance in software optimization, because multilevel memory hierarchies are used. Cache-aware versions of the merge sort algorithm, whose operations have been specifically chosen to minimize the movement of pages in and out of a machine's memory cache, have been proposed. For example, the algorithm stops partitioning subarrays when subarrays of size S are reached, where S is the number of data items fitting into a CPU's cache. Each of these subarrays is sorted with an in-place sorting algorithm such as insertion sort, to discourage memory swaps, and normal merge sort is then completed in the standard recursive fashion. This algorithm has demonstrated better performance on machines that benefit from cache optimization.
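The cutoff idea can be sketched as follows (a sketch, not from any cited implementation; insertion_sort() and merge() are assumed helpers, and the tile size S is a machine-dependent tuning value):

/* Sort a[lo, hi) using work[] as merge scratch space. */
enum { S = 32 };  /* assumed tile size; in practice chosen to fit the CPU cache */

void tiled_merge_sort(int a[], int work[], int lo, int hi)
{
    if (hi - lo <= S) {
        insertion_sort(a, lo, hi);   /* small case: sort in place, staying in cache */
        return;
    }
    int mid = lo + (hi - lo) / 2;
    tiled_merge_sort(a, work, lo, mid);
    tiled_merge_sort(a, work, mid, hi);
    merge(a, work, lo, mid, hi);     /* the standard merge step */
}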
Also, many applications of external sorting use a form of merge sorting where the input gets split up into a higher number of sublists, ideally to a number for which merging them still makes the currently processed set of pages fit into main memory.
Merge sort parallelizes well due to the use of the divide-and-conquer method. Several different parallel variants of the algorithm have been developed over the years. Some parallel merge sort algorithms are strongly related to the sequential top-down merge algorithm while others have a different general structure and use the K-way merge method.
The sequential merge sort procedure can be described in two phases, the divide phase and the merge phase. The first consists of many recursive calls that repeatedly perform the same division process until the subsequences are trivially sorted (containing one or no element). An intuitive approach is the parallelization of those recursive calls. The following pseudocode describes the merge sort with parallel recursion using the fork and join keywords:
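// A sketch of the naively parallelized procedure.
// Sort elements lo through hi (exclusive) of array A.
algorithm mergesort(A, lo, hi) is
    if lo + 1 < hi then                // two or more elements
        mid := ⌊(lo + hi) / 2⌋
        fork mergesort(A, lo, mid)     // sort the left half in a parallel task
        mergesort(A, mid, hi)          // sort the right half in this task
        join                           // wait for the forked task to finish
        merge(A, lo, mid, hi)          // sequential merge: the bottleneck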
This algorithm is the trivial modification of the sequential version and does not parallelize well. Therefore, its speedup is not very impressive. It has a span of Θ("n"), which is only an improvement of Θ(log "n") compared to the sequential version (see Introduction to Algorithms). This is mainly due to the sequential merge method, as it is the bottleneck of the parallel executions.
Better parallelism can be achieved by using a parallel merge algorithm. Cormen et al. present a binary variant that merges two sorted sub-sequences into one sorted output sequence.
In one of the sequences (the longer one if unequal length), the element of the middle index is selected. Its position in the other sequence is determined in such a way that this sequence would remain sorted if this element were inserted at this position. Thus, one knows how many other elements from both sequences are smaller and the position of the selected element in the output sequence can be calculated. For the partial sequences of the smaller and larger elements created in this way, the merge algorithm is again executed in parallel until the base case of the recursion is reached.
The following pseudocode shows the modified parallel merge sort method using the parallel merge algorithm (adapted from Cormen et al.):
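// A sketch in the style of Cormen et al.; parallelMerge is the parallel
// merge procedure described above.
// A: input array, B: output array, lo/hi: bounds in A, off: offset into B
algorithm parallelMergesort(A, lo, hi, B, off) is
    len := hi - lo + 1
    if len = 1 then
        B[off] := A[lo]                    // a single element is already sorted
    else let T[1..len] be a new array
        mid := ⌊(lo + hi) / 2⌋
        mid' := mid - lo + 1
        fork parallelMergesort(A, lo, mid, T, 1)
        parallelMergesort(A, mid + 1, hi, T, mid' + 1)
        join
        parallelMerge(T, 1, mid', mid' + 1, len, B, off)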
In order to analyze a recurrence relation for the worst case span, the recursive calls of parallelMergesort have to be incorporated only once due to their parallel execution, obtaining

"T"("n") = "T"("n"/2) + Θ(log² "n"),

where the Θ(log² "n") term is the span of the parallel merge procedure.
For detailed information about the complexity of the parallel merge procedure, see Merge algorithm.
The solution of this recurrence is given by

"T"("n") = Θ(log³ "n").
This parallel merge algorithm reaches a parallelism of Θ("n" / log² "n"), which is much higher than the parallelism of the previous algorithm. Such a sort can perform well in practice when combined with a fast stable sequential sort, such as insertion sort, and a fast sequential merge as a base case for merging small arrays.
It seems arbitrary to restrict the merge sort algorithms to a binary merge method, since there are usually p > 2 processors available. A better approach may be to use a K-way merge method, a generalization of binary merge, in which $k$ sorted sequences are merged together. This merge variant is well suited to describe a sorting algorithm on a PRAM.
Given an unsorted sequence of $n$ elements, the goal is to sort the sequence with $p$ available processors. These elements are distributed equally among all processors and sorted locally using a sequential sorting algorithm. Hence, the sequence consists of sorted sequences $S_1, \dots, S_p$ of length $\lceil \frac{n}{p} \rceil$. For simplification let $n$ be a multiple of $p$, so that $\left|S_i\right| = \frac{n}{p}$ for $i = 1, \dots, p$.
These sequences will be used to perform a multisequence selection/splitter selection. For $j = 1, \dots, p$, the algorithm determines splitter elements $v_j$ with global rank $k = j \frac{n}{p}$. Then the corresponding positions of $v_1, \dots, v_p$ in each sequence $S_i$ are determined with binary search and thus the $S_i$ are further partitioned into $p$ subsequences $S_{i,1}, \dots, S_{i,p}$ with $S_{i,j} := \{x \in S_i \mid \mathrm{rank}(v_{j-1}) < \mathrm{rank}(x) \le \mathrm{rank}(v_j)\}$.
Furthermore, the elements of $S_{1,i}, \dots, S_{p,i}$ are assigned to processor $i$, i.e. all elements between rank $(i-1)\frac{n}{p}$ and rank $i\frac{n}{p}$, which are distributed over all $S_j$. Thus, each processor receives a sequence of sorted sequences. The fact that the rank $k$ of the splitter elements $v_i$ was chosen globally provides two important properties: on the one hand, $k$ was chosen so that each processor can still operate on $\frac{n}{p}$ elements after assignment, so the algorithm is perfectly load-balanced. On the other hand, all elements on processor $i$ are less than or equal to all elements on processor $i+1$. Hence, each processor performs the p-way merge locally and thus obtains a sorted sequence from its sub-sequences. Because of the second property, no further p-way merge has to be performed; the results only have to be put together in the order of the processor number.
In its simplest form, given $p$ sorted sequences $S_1, \dots, S_p$ distributed evenly on $p$ processors and a rank $k$, the task is to find an element $x$ with a global rank $k$ in the union of the sequences. Hence, this can be used to divide each $S_i$ into two parts at a splitter index $l_i$, where the lower part contains only elements which are smaller than $x$, while the elements bigger than $x$ are located in the upper part.
A sequential multisequence selection algorithm, sketched below, returns the indices of the splits in each sequence, i.e. the indices $l_i$ in sequences $S_i$ such that $S_i[l_i]$ has a global rank less than $k$ and $\mathrm{rank}\left(S_i[l_i+1]\right) \ge k$.
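One way to realize this selection in Python, sketched under the assumption of quickselect-style random pivoting; the function name and the greedy handling of tied elements are illustrative, not prescribed by the text:

```python
import bisect
import random

def multisequence_select(S, k):
    """Return split indices l with sum(l) == k over the sorted sequences S."""
    p = len(S)
    lo, hi = [0] * p, [len(s) for s in S]   # per-sequence active ranges
    while True:
        active = [j for j in range(p) if lo[j] < hi[j]]
        if not active:
            return lo                        # remaining rank k is 0 here
        i = random.choice(active)            # quickselect-style random pivot
        v = S[i][random.randrange(lo[i], hi[i])]
        less = [bisect.bisect_left(S[j], v, lo[j], hi[j]) for j in range(p)]
        leq = [bisect.bisect_right(S[j], v, lo[j], hi[j]) for j in range(p)]
        nless = sum(less[j] - lo[j] for j in range(p))
        nleq = sum(leq[j] - lo[j] for j in range(p))
        if k <= nless:                       # the rank-k boundary lies below v
            hi = less
        elif k <= nleq:                      # boundary falls among the ties of v
            need = k - nless
            splits = list(less)
            for j in range(p):               # hand out `need` copies of v greedily
                take = min(leq[j] - less[j], need)
                splits[j] += take
                need -= take
            return splits
        else:                                # boundary lies above v
            k -= nleq
            lo = leq
```

Each iteration does the p binary searches described above, and the expected number of iterations is logarithmic in n, matching the analysis that follows.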
For the complexity analysis the PRAM model is chosen. If the data is evenly distributed over all $p$ processors, the p-fold execution of the binarySearch method has a running time of $\mathcal{O}\left(p \log\left(\frac{n}{p}\right)\right)$. The expected recursion depth is $\mathcal{O}\left(\log n\right)$ as in the ordinary Quickselect. Thus the overall expected running time is $\mathcal{O}\left(p \log\left(\frac{n}{p}\right) \log n\right)$.
Applied on the parallel multiway merge sort, this algorithm has to be invoked in parallel such that all splitter elements of rank $i \frac{n}{p}$ for $i = 1, \dots, p$ are found simultaneously. These splitter elements can then be used to partition each sequence into $p$ parts, with the same total running time of $\mathcal{O}\left(p \log\left(\frac{n}{p}\right) \log n\right)$.
An outline of the complete parallel multiway merge sort algorithm is sketched below. We assume that there is a barrier synchronization before and after the multisequence selection such that every processor can determine the splitting elements and the sequence partition properly.
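Here is an illustrative Python sketch of the three phases, simulating the p processors with plain loops and reusing multisequence_select from the sketch above; a real implementation would run each phase on its own processor, with the barriers as described:

```python
import heapq

def parallel_multiway_mergesort(data, p):
    """Illustrative driver for the three phases: local sort, splitters, p-way merge."""
    n = len(data)
    # Phase 1 (local sort): "processor" i sorts its slice of about n/p elements.
    S = [sorted(data[i * n // p:(i + 1) * n // p]) for i in range(p)]
    # Barrier, then phase 2 (splitter selection): global ranks i*n/p, i = 1..p-1.
    splits = [[0] * p]
    splits += [multisequence_select(S, i * n // p) for i in range(1, p)]
    splits += [[len(s) for s in S]]
    # Barrier, then phase 3: processor i p-way-merges its assigned pieces.
    result = []
    for i in range(p):
        pieces = [S[j][splits[i][j]:splits[i + 1][j]] for j in range(p)]
        result.extend(heapq.merge(*pieces))  # sequential p-way merge per processor
    return result
```

For example, parallel_multiway_mergesort([5, 2, 9, 1, 7, 3, 8, 0], 2) returns the sorted list; each extend() call appends the locally merged output of one simulated processor, so the pieces only need to be concatenated in processor order.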
Firstly, each processor sorts the assigned $\frac{n}{p}$ elements locally using a sorting algorithm with complexity $\mathcal{O}\left(\frac{n}{p} \log\left(\frac{n}{p}\right)\right)$. After that, the splitter elements have to be calculated in time $\mathcal{O}\left(p \log\left(\frac{n}{p}\right) \log n\right)$. Finally, each group of $p$ splits has to be merged in parallel by each processor with a running time of $\mathcal{O}\left(\frac{n}{p} \log p\right)$ using a sequential p-way merge algorithm. Thus, the overall running time is given by
$\mathcal{O}\left(\frac{n}{p} \log\left(\frac{n}{p}\right) + p \log\left(\frac{n}{p}\right) \log n + \frac{n}{p} \log p\right)$.
The multiway merge sort algorithm is very scalable through its high parallelization capability, which allows the use of many processors. This makes the algorithm a viable candidate for sorting large amounts of data, such as those processed in computer clusters. Also, since in such systems memory is usually not a limiting resource, the disadvantage of space complexity of merge sort is negligible. However, other factors become important in such systems, which are not taken into account when modelling on a PRAM. Here, the following aspects need to be considered: memory hierarchy, when the data does not fit into the processor's cache, or the communication overhead of exchanging data between processors, which could become a bottleneck when the data can no longer be accessed via the shared memory.
Sanders et al. have presented in their paper a bulk synchronous parallel algorithm for multilevel multiway mergesort, which divides $p$ processors into $r$ groups of size $\frac{p}{r}$. All processors sort locally first. Unlike single level multiway mergesort, these sequences are then partitioned into $r$ parts and assigned to the appropriate processor groups. These steps are repeated recursively in those groups. This reduces communication and especially avoids problems with many small messages. The hierarchical structure of the underlying real network can be used to define the processor groups (e.g. racks, clusters, ...).
Merge sort was one of the first sorting algorithms where optimal speedup was achieved, with Richard Cole using a clever subsampling algorithm to ensure O(1) merge. Other sophisticated parallel sorting algorithms can achieve the same or better time bounds with a lower constant. For example, in 1991 David Powers described a parallelized quicksort (and a related radix sort) that can operate in O(log n) time on a CRCW parallel random-access machine (PRAM) with n processors by performing partitioning implicitly. Powers further shows that a pipelined version of Batcher's Bitonic Mergesort at O((log n)²) time on a butterfly sorting network is in practice actually faster than his O(log n) sorts on a PRAM, and he provides detailed discussion of the hidden overheads in comparison, radix and parallel sorting.
Although heapsort has the same time bounds as merge sort, it requires only Θ(1) auxiliary space instead of merge sort's Θ(n). On typical modern architectures, efficient quicksort implementations generally outperform mergesort for sorting RAM-based arrays. On the other hand, merge sort is a stable sort and is more efficient at handling slow-to-access sequential media. Merge sort is often the best choice for sorting a linked list: in this situation it is relatively easy to implement a merge sort in such a way that it requires only Θ(1) extra space, and the slow random-access performance of a linked list makes some other algorithms (such as quicksort) perform poorly, and others (such as heapsort) completely impossible.
As of Perl 5.8, merge sort is its default sorting algorithm (it was quicksort in previous versions of Perl). In Java, the Arrays.sort() methods use merge sort or a tuned quicksort depending on the datatypes, and for implementation efficiency they switch to insertion sort when fewer than seven array elements are being sorted. The Linux kernel uses merge sort for its linked lists. Python uses Timsort, another tuned hybrid of merge sort and insertion sort, which has become the standard sort algorithm in Java SE 7 (for arrays of non-primitive types), on the Android platform, and in GNU Octave.
|
https://en.wikipedia.org/wiki?curid=20039
|
Maule Air
Maule Air, Inc. is a manufacturer of light, single-engined, short take-off and landing (STOL) aircraft, based in Moultrie, Georgia, USA. The company delivered 2,500 aircraft in its first 50 years of business.
Belford D. Maule (1911–1995) designed his first aircraft, the M-1, starting at age 19. He founded the company Mechanical Products Co. in Napoleon, Michigan, to market his own starter design. In 1941 the B.D. Maule Co. was founded, and Maule produced tailwheels and fabric testers. In 1953 he began design work, and started aircraft production with the "Bee-Dee" M-4 in 1957.
The company is a family-owned enterprise. Its owner, June Maule, widow of B. D. Maule, remained directly involved with factory production until her death in 2009 at the age of 92.
The aircraft produced by Maule Air are tube-and-fabric designs and are popular with bush pilots, thanks to their very low stall speed, their tundra tires and oleo strut landing gear. Most Maules are built with tailwheel or amphibious configurations, although the newer MXT models have tricycle gear.
|
https://en.wikipedia.org/wiki?curid=20040
|
Minnesota Twins
The Minnesota Twins are an American professional baseball team based in Minneapolis, Minnesota. The Twins compete in Major League Baseball (MLB) as a member club of the American League (AL) Central division. The team is named after the Twin Cities area which includes the two adjoining cities of Minneapolis and St. Paul.
The franchise was founded in Washington, D.C. in 1901 as the Washington Senators. The team relocated to Minnesota and was renamed the Minnesota Twins at the start of the 1961 season. The Twins played in Metropolitan Stadium from 1961 to 1981 and in the Hubert H. Humphrey Metrodome from 1982 to 2009. The team played its inaugural game at Target Field on April 12, 2010. The franchise won the World Series in 1924 as the Senators, and in 1987 and 1991 as the Twins.
Through the 2019 season, the team has fielded 19 American League batting champions. The team has hosted five All-Star Games: 1937 and 1956 in Washington, D.C.; and 1965, 1985, and 2014 in Minneapolis-St. Paul.
From 1901 to 2019, the Twins' overall win–loss record is 8,903–9,603 (a .481 winning percentage).
The team was founded in Washington, D.C., in 1901 as one of the eight original teams of the American League, named the Washington Senators or Washington Nationals (both names had been used in the club's early years and no official name was used thereafter). The team endured long bouts of mediocrity immortalized in the 1955 Broadway musical "Damn Yankees".
The Washington Senators spent the first decade of their existence finishing near the bottom of the American League standings. Their fortunes began to improve with the arrival of 19-year-old pitcher Walter Johnson in 1907. Johnson blossomed in 1911 with 25 victories, although the Senators still finished the season in seventh place. In 1912, the Senators improved dramatically, as their pitching staff led the league in team earned run average and in strikeouts. Johnson won 33 games while teammate Bob Groom added another 24 wins to help the Senators finish the season in second place. Manager Clark Griffith joined the team in 1912 and became the team's owner in 1920. (The franchise remained under Griffith family ownership until 1984.) The Senators continued to perform respectably in 1913 with Johnson posting a career-high 35 victories, as the team once again finished in second place. The Senators then fell into another period of decline for the next decade.
The team had a period of prolonged success in the 1920s and 1930s, led by Walter Johnson, as well as additional Hall-of-Famer Bucky Harris, Goose Goslin, Sam Rice, Heinie Manush, and Joe Cronin. In particular, a rejuvenated Johnson rebounded in 1924 to win 23 games with the help of his catcher, Muddy Ruel, as the Senators won the American League pennant for the first time in the history of the franchise. The Senators then faced John McGraw's heavily favored New York Giants in the 1924 World Series. The two teams traded wins back and forth with three games of the first six being decided by one run. In the deciding 7th game, the Senators were trailing the Giants 3 to 1 in the 8th inning when Bucky Harris hit a routine ground ball to third which hit a pebble and took a bad hop over Giants third baseman Freddie Lindstrom. Two runners scored on the play, tying the score at three. An aging Walter Johnson then came in to pitch the ninth inning, and held the Giants scoreless into extra innings. In the bottom of the twelfth inning with Ruel at bat, he hit a high, foul ball directly over home plate. The Giants' catcher, Hank Gowdy, dropped his protective mask to field the ball but, failing to toss the mask aside, stumbled over it and dropped the ball, thus giving Ruel another chance to bat. On the next pitch, Ruel hit a double and proceeded to score the winning run when Earl McNeely hit a ground ball that took another bad hop over Lindstrom's head. This would mark the only World Series triumph for the franchise during their 60-year tenure in Washington.
The following season they repeated as American League champions but ultimately lost the 1925 World Series to the Pittsburgh Pirates. After Walter Johnson's retirement in 1927, he was hired as manager of the Senators. After enduring a few losing seasons, the team returned to contention in 1930. In 1933, Senators owner Clark Griffith returned to the formula that worked for him nine years prior: 26-year-old shortstop Joe Cronin became player-manager. The Senators posted a 99–53 record and cruised to the pennant seven games ahead of the New York Yankees, but in the 1933 World Series the Giants exacted their revenge, winning in five games. Following the loss, the Senators sank all the way to seventh place in 1934 and attendance began to fall. Despite the return of Harris as manager from 1935–42 and again from 1950–54, Washington was mostly a losing ball club for the next 25 years, contending for the pennant only during World War II. Washington came to be known as "first in war, first in peace, and last in the American League", with their hard luck being crucial to the plot of the musical and film "Damn Yankees". Cecil Travis, Buddy Myer (1935 A.L. batting champion), Roy Sievers, Mickey Vernon (batting champion in 1946 and 1953), and Eddie Yost were notable Senators players whose careers were spent in obscurity due to the team's lack of success. In 1954, the Senators signed future Hall of Fame member Harmon Killebrew. By 1959 he was the Senators' regular third baseman and led the league with 42 home runs, earning him a starting spot on the American League All-Star team.
After Griffith's death in 1955, his nephew and adopted son Calvin took over the team presidency. Calvin sold Griffith Stadium to the city of Washington and leased it back, leading to speculation that the team was planning to move, as the Boston Braves, St. Louis Browns and Philadelphia Athletics had all done in the early 1950s. By 1957, after an early flirtation with San Francisco (where the New York Giants would eventually move after that season ended), Griffith began courting Minneapolis–St. Paul, a prolonged process that resulted in his rejecting the Twin Cities' first offer before agreeing to relocate. Home attendance in Washington, D.C. steadily increased from 425,238 in 1955 to 475,288 in 1958, and then jumped to 615,372 in 1959. However, part of the Minnesota deal guaranteed a million fans a year for three years, plus the potential to double TV and radio money.
The American League opposed the move at first, but in 1960 a deal was reached: The Senators would move and would be replaced with an expansion Senators team for 1961. Thus, the old Washington Senators became the Minnesota Twins.
The Washington franchise was known as both "Senators" and "Nationals" at various times, and sometimes at the same time. In 1905, the team changed its official name to the "Washington Nationals." The name "Nationals" appeared on uniforms for only two seasons, and was then replaced with the "W" logo for the next 52 years. The media often shortened the nickname to "Nats." Many fans and newspapers (especially out-of-town papers) persisted in using the "Senators" nickname, because of potential confusion caused by an American League team using the "Nationals" name. Over time, "Nationals" faded as a nickname, and "Senators" became dominant. Baseball guides listed the club's nickname as "Nationals or Senators", acknowledging the dual-nickname situation.
The team name was officially changed to Washington Senators around the time of Clark Griffith's death. It was not until 1959 that the word "Senators" first appeared on team shirts. "Nats," from the team name's middle syllable, continued to be used by space-saving headline writers, even for the 1961 expansion team, which was never officially known as "Nationals."
The current "Nationals" and "Nats" names were revived in 2005, when the Montreal Expos relocated to Washington to become the Nationals.
In 1960, Major League Baseball granted the city of Minneapolis an expansion team. Washington owner Calvin Griffith, Clark's nephew and adopted son, requested that he be allowed to move his team to Minneapolis-St. Paul and instead give Washington the expansion team. Upon league approval, the team moved to Minnesota after the 1960 season, setting up shop in Metropolitan Stadium, while Washington fielded a brand new "Washington Senators" (which later became the Texas Rangers prior to the 1972 season).
Success came quickly to the team in Minnesota. Sluggers Harmon Killebrew and Bob Allison, who had already been stars in Washington, were joined by Tony Oliva and Zoilo Versalles, and later second baseman Rod Carew and pitchers Jim Kaat and Jim Perry, winning the American League pennant in 1965. A second wave of success came in the late 1980s and early 1990s under manager Tom Kelly, led by Kent Hrbek, Bert Blyleven, Frank Viola, and Kirby Puckett, winning the franchise's second and third World Series (and first and second in Minnesota).
The name "Twins" was derived from the popular name of the region, the Twin Cities (Minneapolis and St. Paul). The NBA's Minneapolis Lakers had relocated to Los Angeles in 1960 due to poor attendance which was believed to have been caused in part by the reluctance of fans in St. Paul to support the team. Griffith was determined not to alienate fans in either city by naming the team after one city or the other, so his desire was to name the team the "Twin Cities Twins", however MLB objected. Griffith therefore named the team the "Minnesota Twins". However, the team was allowed to keep its original "TC" (for Twin Cities) insignia for its caps. The team's logo shows two men, one in a Minneapolis Millers uniform and one in a St. Paul Saints uniform, shaking hands across the Mississippi River within an outline of the state of Minnesota. The "TC" remained on the Twins' caps until 1987, when they adopted new uniforms. By this time, the team felt it was established enough to put an "M" on its cap without having St. Paul fans think it stood for Minneapolis. The "TC" logo was moved to a sleeve on the jerseys, and occasionally appeared as an alternate cap design. Both the "TC" and "Minnie & Paul" logos remain the team's primary insignia. As of 2010, the "TC" logo has been reinstated on the cap as their logo.
The Twins were eagerly greeted in Minnesota when they arrived in 1961. They brought a nucleus of talented players: Harmon Killebrew, Bob Allison, Camilo Pascual, Zoilo Versalles, Jim Kaat, Earl Battey, and Lenny Green. Tony Oliva, who would go on to win American League batting championships in 1964, 1965 and 1971, made his major league debut in 1962. That year, the Twins won 91 games, the most by the franchise since 1933. Behind Mudcat Grant's 21 victories, Versalles' A.L. MVP season and Oliva's batting title, the Twins won 102 games and the American League Pennant in 1965, but they were defeated in the World Series by the Los Angeles Dodgers in seven games (behind the Series MVP, Sandy Koufax, who compiled a 2–1 record, including winning the seventh game).
Heading into the final weekend of the 1967 season, when Rod Carew was named the A.L. Rookie of the Year, the Twins, Boston Red Sox, Chicago White Sox, and Detroit Tigers all had a shot at clinching the American League championship. The Twins and the Red Sox started the weekend tied for 1st place and played against each other in Boston for the final three games of the season. The Red Sox won two out of the three games, seizing their first pennant since 1946 with a 92–70 record. The Twins and Tigers both finished one game back, with 91–71 records, while the White Sox finished three games back, at 89–73. In 1969, the new manager of the Twins, Billy Martin, pushed aggressive base running all-around, and Carew set the all-time Major League record by stealing home seven times in addition to winning the first of seven A.L. batting championships. With Killebrew slugging 49 homers and winning the AL MVP Award, these 1969 Twins won the very first American League Western Division Championship, but they lost three straight games to the Baltimore Orioles, winners of 109 games, in the first American League Championship Series. The Orioles would go on to be upset by the New York Mets in the World Series. Martin was fired after the season, in part due to an August fight in Detroit with 20-game winner Dave Boswell and outfielder Bob Allison, in an alley outside the Lindell A.C. bar. Bill Rigney led the Twins to a repeat division title in 1970, behind the star pitching of Jim Perry (24-12), the A.L. Cy Young Award winner, while the Orioles again won the Eastern Division Championship behind the star pitching of Jim Palmer. Once again, the Orioles won the A.L. Championship Series in a three-game sweep, and this time they would win the World Series.
After winning the division again in 1970, the team entered an eight-year dry spell, finishing around the .500 mark. Killebrew departed after 1974. Owner Calvin Griffith faced financial difficulty with the start of free agency, costing the Twins the services of Lyman Bostock and Larry Hisle, who left as free agents after the 1977 season, and Carew, who was traded after the 1978 season. In 1975, Carew won his fourth consecutive AL batting title, having already joined Ty Cobb as the only players to lead the major leagues in batting average for three consecutive seasons. In 1977, Carew batted .388, which was the highest in baseball since Boston's Ted Williams hit .406 in 1941; he won the 1977 AL MVP Award. He won another batting title in 1978, hitting .333.
In 1982, the Twins moved into the Hubert H. Humphrey Metrodome, which they shared with the Minnesota Vikings. After a 16–54 start, the Twins were on the verge of becoming the worst team in MLB history. They turned the season around somewhat, but still lost 102 games, the second-worst record in Twins history (beaten only by the 2016 team, who lost 103 games), despite the .301 average, 23 homers and 92 RBI from rookie Kent Hrbek. In 1984, Griffith sold the Twins to multi-billionaire banker/financier Carl Pohlad. The Metrodome hosted the 1985 Major League Baseball All-Star Game. After several losing seasons, the 1987 team, led by Hrbek, Gary Gaetti, Frank Viola (A.L. Cy Young winner in 1988), Bert Blyleven, Jeff Reardon, Tom Brunansky, Dan Gladden, and rising star Kirby Puckett, returned to the World Series after defeating the favored Detroit Tigers in the ALCS, 4 games to 1. Tom Kelly managed the Twins to World Series victories over the St. Louis Cardinals in 1987 and the Atlanta Braves in 1991. The 1988 Twins were the first team in American League history to draw more than 3 million fans. On July 17, 1990, the Twins became the only team in major league history to pull off two triple plays in the same game. Twins' pitcher and Minnesota native Jack Morris was the star of the series in 1991, going 2–0 in his three starts with a 1.17 ERA. 1991 also marked the first time that any team that finished in last place in their division would advance to the World Series the following season; both the Twins and the Braves did this in 1991. Contributors to the 1991 Twins' improvement from 74 wins to 95 included Chuck Knoblauch, the A.L. Rookie of the Year; Scott Erickson, 20-game winner; new closer Rick Aguilera and new designated hitter Chili Davis.
The World Series in 1991 is regarded by many as one of the classics of all time. In this Series, four games were won during the teams' final at-bat, and three of these were in extra innings. The Atlanta Braves won all three of their games in Atlanta, and the Twins won all four of their games in Minnesota. The sixth game was a legendary one for Puckett, who tripled in a run, made a sensational leaping catch against the wall, and finally in the 11th inning hit the game-winning home run. The seventh game was tied 0–0 after the regulation nine innings, and marked only the second time that the seventh game of the World Series had ever gone into extra innings. The Twins won on a walk-off RBI single by Gene Larkin in the bottom of the 10th inning, after Morris had pitched ten shutout innings against the Braves. The seventh game of the 1991 World Series is widely regarded as one of the greatest games in the history of professional baseball.
After a winning season in 1992 but falling short of Oakland in the division, the Twins fell into a years-long stretch of mediocrity, posting a losing record each season for the next eight: 71–91 in 1993, 50–63 in 1994, 56–88 in 1995, 78–84 in 1996, 68–94 in 1997, 70–92 in 1998, 63–97 in 1999 and 69–93 in 2000. From 1994 to 1997, a long sequence of retirements and injuries hurt the team badly, and Tom Kelly spent the remainder of his managerial career attempting to rebuild the Twins. In 1997, owner Carl Pohlad almost sold the Twins to North Carolina businessman Don Beaver, who would have moved the team to the Piedmont Triad area.
After the 1995 season, Puckett was forced to retire at age 35 due to loss of vision in one eye from a central retinal vein occlusion. The 1989 A.L. batting champion, he retired as the Twins' all-time leader in career hits, runs, doubles, and total bases. At the time of his retirement, his .318 career batting average was the highest by any right-handed American League batter since Joe DiMaggio. Puckett was the fourth baseball player during the 20th century to record 1,000 hits in his first five full calendar years in Major League Baseball, and was the second to record 2,000 hits during his first 10 full calendar years. He was elected to the Baseball Hall of Fame in 2001, his first year of eligibility.
The Twins dominated the Central Division in the first decade of the new century, winning the division in six of those ten years ('02, '03, '04, '06, '09 and '10), and nearly winning it in '08 as well. From 2001 to 2006, the Twins compiled the longest streak of consecutive winning seasons since moving to Minnesota.
Threatened with closure by league contraction, the 2002 team battled back to reach the American League Championship Series before being eliminated 4–1 by that year's World Series champion Anaheim Angels. The Twins have not won a playoff series since that 2002 division series against the Athletics, despite winning several division championships in the decade.
In 2006, the Twins won the division on the last day of the regular season (the only day all season they held sole possession of first place) but lost to the Oakland Athletics in the ALDS. Ozzie Guillén coined a nickname for this squad, calling the Twins "little piranhas". The Twins players embraced the label, and in response, the Twins' front office started a "Piranha Night", with piranha finger puppets given out to the first 10,000 fans. Scoreboard operators sometimes played an animated sequence of piranhas munching under that caption in situations where the Twins were scoring runs playing "small ball", and the stadium vendors sold T-shirts and hats advertising "The Little Piranhas".
The Twins also had the AL MVP in Justin Morneau, the AL batting champion in Joe Mauer, and the AL Cy Young Award winner in Johan Santana.
In 2008, the Twins finished the regular season tied with the White Sox on top of the AL Central, forcing a one-game playoff in Chicago to determine the division champion. The Twins lost that game and missed the playoffs. The game location was determined by rule of a coin flip that was conducted in mid-September. This rule was changed for the start of the 2009 season, making the site for any tiebreaker game to be determined by the winner of the regular season head-to-head record between the teams involved.
In 2009, after playing .500 baseball for most of the season, the Twins won 17 of their last 21 games to tie the Detroit Tigers for the lead in the Central Division. The Twins were able to use the play-in game rule to their advantage when they won the AL Central at the end of the regular season by way of a 6–5 tiebreaker game that concluded with a 12th-inning walk-off hit to right field by Alexi Casilla that scored Carlos Gómez. However, they failed to advance to the American League Championship Series, as they lost the American League Division Series in three straight games to the eventual World Series champion New York Yankees. That year, Joe Mauer became only the second catcher in 33 years to win the AL MVP award: Iván Rodríguez won for the Texas Rangers in 1999, and before that, the last catcher to win an AL MVP was the New York Yankees' Thurman Munson in 1976.
In their inaugural season played at Target Field, the Twins finished the regular season with a record of 94–68, clinching the AL Central Division title for the 6th time in 9 years under manager Ron Gardenhire. New regular players included rookie Danny Valencia at third base, designated hitter Jim Thome, closer Matt Capps, infielder J. J. Hardy, and infielder Orlando Hudson. In relief pitching roles were late additions Brian Fuentes and Randy Flores. On July 7, the team suffered a major blow when Justin Morneau sustained a concussion, which knocked him out for the rest of the season. In the divisional series, the Twins lost to the Yankees in a three-game sweep for the second consecutive year. Following the season, Ron Gardenhire received AL Manager of the Year honors after finishing as a runner up in several prior years.
After repeating as AL Central champions in 2010, the Twins entered 2011 with no players on the disabled list, and the team seemed poised for another strong season. During the off-season, the team signed Japanese shortstop Tsuyoshi Nishioka to fill a hole in the middle infield, re-signed Jim Thome, who was in pursuit of career home run number 600, and also re-signed Carl Pavano. However, the season was largely derailed by an extensive list of injuries. Nishioka's broken leg in a collision at second base led the way and was followed by DL stints from Kevin Slowey, Joe Mauer, Jason Repko, Thome, Delmon Young (two stints on the DL), José Mijares, Glen Perkins, Joe Nathan, Francisco Liriano, Jason Kubel, Denard Span (two stints), Justin Morneau, Scott Baker, and Alexi Casilla. The team's low point was arguably on May 1 when the team started 7 players who were batting below .235 in a game against Kansas City. From that day forward, the Twins made a strong push to get as close as five games back of the division lead by the All-Star break. However, the team struggled down the stretch and fell back out of contention. The team failed to reach the playoffs for the first time since 2008 and experienced their first losing season in four years. Despite an AL-worst 63–99 record, the team drew over 3 million fans for the second consecutive year.
Michael Cuddyer served as the Twins' representative at the All-Star Game, his first appearance. Bert Blyleven's number was retired during the season, and he was also inducted into the Baseball Hall of Fame during the month of July. On August 10, Nathan recorded his 255th save, passing Rick Aguilera for first place on the franchise's all-time saves list. On August 15, Thome hit his 599th and 600th home runs at Comerica Park to become the eighth player in Major League history to hit 600 home runs, joining Babe Ruth, Willie Mays, Hank Aaron, Barry Bonds, Sammy Sosa, Ken Griffey, Jr., and Alex Rodriguez.
The team started the 2012 season with a league worst 10–24 record. In late May and early June, the team embarked on a hot streak, winning 10 out of 13 games. By mid July, the team found themselves only 10 games out of the division lead. On July 16, the Twins defeated the Baltimore Orioles 19–7, the most runs scored in the short history of Target Field. By the end of August, the Twins were more than 20 games below .500, and last in the American League. On August 29, it was announced that the Twins would host the 2014 All-Star Game. In 2013, the Twins finished in 4th place in the AL Central, with a record of 66–96. In 2014, the team finished with a 70–92 record, last in the division and accumulated the second fewest wins in the American League. As a result, Ron Gardenhire was fired on September 29, 2014. On November 3, 2014 Paul Molitor was announced by the team as the 13th manager in Twins history.
In 2015, the team had a winning season (83-79), following four consecutive seasons of 90 or more losses.
In 2016, the Minnesota Twins finished last in the AL Central with a 59–103 record. Brian Dozier set his career high in home runs with 43, which was tied for second in baseball and led all second basemen. Tyler Duffey led all Twins starters with 9 wins throughout the season, while reliever Brandon Kintzler led the team with 17 saves. Rising stars Miguel Sanó, Max Kepler, and Byron Buxton combined for 263 total hits, 52 home runs, 167 RBIs, and a .232 batting average throughout the season. The Twins signed star Korean slugger Byung Ho Park to a 4-year, $12 million contract; he hit .191 with 12 home runs and 24 RBIs before being sent down to Rochester for the remainder of the season.
In 2017, the Twins went 85–77, finishing 2nd in the AL Central. Following Brian Dozier's 34 home runs, Miguel Sanó, Byron Buxton, and Eddie Rosario all had breakout years, while Joe Mauer hit .305. They ended up making the playoffs, becoming the first team ever to lose 100 games the previous year and make the playoffs the next season. They lost to the Yankees in the wild card round.
The 2018 season did not go as well. The Twins went 78–84, and did not return to the post-season. Sanó and Buxton were injured most of the year and eventually both sent down to the minors, while long-time Twin Brian Dozier was traded at the deadline. One bright spot came at the end of the season, when hometown hero Joe Mauer returned to catcher (his original position) for his final game, ending his career with a signature double and standing ovation. Another highlight was the team's two-game series against the Cleveland Indians in San Juan, Puerto Rico. After the season, manager Paul Molitor was fired. Free agent signing Logan Morrison and long-time veteran Ervin Santana declared free agency.
During the 2019 off-season, the Twins hired Rocco Baldelli as their new manager, signed free agents Marwin González, Jonathan Schoop, and Nelson Cruz, and claimed C. J. Cron off waivers from the Tampa Bay Rays. Cron had hit 30 home runs in the 2018 season. They also signed Martín Pérez, Ronald Torreyes, and Blake Parker.
Third baseman Miguel Sanó had surgery on his Achilles tendon in March and did not return until May.
The Twins started the 2019 MLB season hot, owning the best record in baseball through mid-May. The strong start however did not translate to strong attendance, as they had the largest attendance drop in baseball during the first month of the season, with bad weather also being a factor. On May 8, in an effort to get fans back to the ballpark, the Twins announced a flash sale of $5 tickets for their remaining home games in May. The Twins sold 20,000 tickets within a day, and had to make additional seating available due to the overwhelming demand.
The Twins set the record for the most home runs in the first half of a season with 166, putting them on track to break the single-season team record of 267 set by the 2018 New York Yankees. Through 159 games (as of September 26, 2019) they had hit 301 home runs, an average of 1.89 HR per game; at that pace, they would hit 307 home runs by the end of the regular season.
On September 17, 2019, Miguel Sanó hit a 482-foot home run to make the Twins the first team in major league history to have five players with at least 30 home runs in a season.
On September 25, 2019, the Twins clinched the American League Central division for the first time since 2010.
On September 26, 2019, the Twins became the first team in major league history to hit 300 home runs in a season.
The Twins finished the 2019 season with the second-most wins in franchise history with 101, one short of the 1965 season. The team combined for a total of 307 home runs, the most in MLB history for a single season. The team's slugging prowess earned them the nickname the "Bomba Squad". In the 2019 ALDS, the Twins' opponents were the New York Yankees, who finished one home run behind at 306, the second team to break the 300 home run mark. The Twins were swept again, extending their postseason losing streak to 16 games, dating back to the 2004 ALDS.
The quirks of the Hubert H. Humphrey Metrodome, including the turf floor and the white roof, gave the Twins a significant home-field advantage that played into their winning the World Series in both 1987 and 1991, at least in the opinion of their opponents, as the Twins went 12–1 in postseason home games during those two seasons. These were the first two World Series in professional baseball history in which a team won the championship by winning all four home games. (The feat has since been repeated once, by the Arizona Diamondbacks in 2001.) Nevertheless, the Twins argued that the Metrodome was obsolete and that the lack of a dedicated baseball-only ballpark limited team revenue and made it difficult to sustain a top-notch, competitive team (the Twins had been sharing tenancy in stadiums with the NFL's Minnesota Vikings since 1961). The team was rumored to contemplate moving to such places as New Jersey, Las Vegas, Portland, Oregon, the Greensboro/Winston-Salem, North Carolina area, and elsewhere in search of a more financially competitive market. In 2002, the team was nearly disbanded when Major League Baseball selected the Twins and the Montreal Expos (now the Washington Nationals franchise) for elimination due to their financial weakness relative to other franchises in the league. The impetus for league contraction diminished after a court decision forced the Twins to play out their lease on the Metrodome. However, Twins owner Carl Pohlad continued his efforts to relocate, pursuing litigation against the Metropolitan Stadium Commission and obtaining a state court ruling that his team was not obligated to play in the Metrodome after the 2006 season. This cleared the way for the Twins to either be relocated or disbanded prior to the 2007 season if a new deal was not reached.
In response to the threatened loss of the Twins, the Minnesota private and public sector negotiated and approved a financing package for a replacement stadium— a baseball-only outdoor, natural turf ballpark in the Warehouse District of downtown Minneapolis— owned by a new entity known as the Minnesota Ballpark Authority. Target Field was constructed at a cost of $544.4 million (including site acquisition and infrastructure), utilizing the proceeds of a $392 million public bond offering based on a 0.15% sales tax in Hennepin County and private financing of $185 million provided by the Pohlad family. As part of the deal, the Twins also signed a 30-year lease of the new stadium, effectively guaranteeing the continuation of the team in Minnesota for a long time to come. Construction of the new field began in 2007, and was completed in December 2009, in time for the 2010 season. Commissioner Bud Selig, who earlier had threatened to disband the team, observed that without the new stadium the Twins could not have committed to sign their star player, catcher Joe Mauer, to an 8-year, $184 million contract extension. The first regular season game in Target Field was played against the Boston Red Sox on April 12, 2010, with Mauer driving in two runs and going 3-for-5 to help the Twins defeat the Red Sox, 5–2.
On May 18, 2011, Target Field was named "The Best Place To Shop" by Street and Smith's "SportsBusiness Journal" at the magazine's 2011 Sports Business Awards Ceremony in New York City. It was also named "The Best Sports Stadium in North America" by "ESPN The Magazine" in a ranking that included over 120 different stadiums, ballparks and arenas from around North America.
In July 2014, Target Field hosted the 85th Major League Baseball All-Star Game and the Home Run Derby.
Minnesota Twins all-time roster: A complete list of players who played in at least one game for the Twins franchise.
The Minnesota Twins farm system consists of seven minor league affiliates.
Molitor, Morris, and Winfield were all St. Paul natives who joined the Twins late in their careers and were warmly received as "hometown heroes", but were elected to the Hall primarily on the basis of their tenures with other teams. Both Molitor and Winfield swatted their 3,000th hit with Minnesota, while Morris pitched a complete-game shutout for the Twins in the deciding Game 7 of the 1991 World Series. Molitor was the first player in history to hit a triple for his 3,000th hit.
Cronin, Goslin, Griffith, Harris, Johnson, Killebrew and Wynn are listed on the Washington Hall of Stars display at Nationals Park (previously they were listed at Robert F. Kennedy Stadium). So are Ossie Bluege, George Case, Joe Judge, George Selkirk, Roy Sievers, Cecil Travis, Mickey Vernon and Eddie Yost.
The Metrodome's upper deck in center and right fields was partly covered by a curtain containing banners of various titles won, and retired numbers. There was no acknowledgment of the Twins' prior championships in Washington, and several Senator Hall of Famers, such as Walter Johnson, played in the days prior to numbers being used on uniforms. However, Killebrew played seven seasons as a Senator, including two full seasons as a regular prior to the move to Minnesota in 1961.
Prior to the addition of the banners, the Twins acknowledged their retired numbers on the Metrodome's outfield fence. Harmon Killebrew's #3 was the first to be displayed, as it was the only one the team had retired when they moved in. It was joined by Rod Carew's #29 in 1987, Tony Oliva's #6 in 1991, Kent Hrbek's #14 in 1995, and Kirby Puckett's #34 in 1997 before the Twins began hanging the banners to reduce capacity. The championships, meanwhile were marked on the "Baggie" in right field.
The numbers that have been retired hang within Target Field in front of the tower that serves as the Twins' executive offices in left field foul territory. The championships banners have been replaced by small pennants that fly on masts at the back of the left field upper deck. Those pennants, along with the flags flying in the plaza behind right field, serve as a visual cue for the players, suggesting the wind direction and speed.
Jackie Robinson's number, 42, was retired by Major League Baseball on April 15, 1997 and formally honored by the Twins on May 23, 1997. Robinson's number was positioned to the left of the Twins numbers in both venues.
In 2007, the Twins took the rights to the broadcasts in-house and created the Twins Radio Network (TRN). With that new network in place, the Twins secured a new metro affiliate flagship radio station in KSTP (AM 1500). It replaced WCCO (AM 830), which had held broadcast rights for the Twins since the team moved to Minneapolis in 1961. For 2013, the Twins moved to FM radio on KTWN-FM "96.3 K-Twin", which is owned by the Pohlad family. The original radio voices of the Twins in 1961 were Ray Scott, Halsey Hall and Bob Wolff. After the first season, Herb Carneal replaced Wolff. Twins TV and radio broadcasts were originally sponsored by the Hamm's Brewing Company. In 2009, Treasure Island Resort & Casino became the first ever naming-rights partner for the Twins Radio Network, making the commercial name of TRN the Treasure Island Baseball Network. In 2017, it was announced that WCCO would become the flagship station of the Twins again starting in 2018, thus returning the team to its original station after 11 years.
Cory Provus is the current radio play-by-play announcer, taking over in 2012 for longtime Twins voice John Gordon, who retired following the 2011 season. Former Twins outfielder Dan Gladden serves as color commentator.
TRN broadcasts are originated from the studios at Minnesota News Network and Minnesota Farm Networks. Kris Atteberry hosts the pre-game show, the "Lineup Card" and the "Post-game Download" from those studios except when filling in for Provus or Gladden when they are on vacation.
On April 1, 2007, Herb Carneal, the radio voice of the Twins for all but one year of their existence, died at his home in Minnetonka after a long battle with multiple illnesses. Carneal is in the broadcasters' wing of the Baseball Hall of Fame.
The television rights are held by Fox Sports North with Dick Bremer as the play-by-play announcer and former Twin, 2011 National Baseball Hall of Fame inductee, Bert Blyleven as color analyst. They are sometimes joined by Roy Smalley, Justin Morneau and Jack Morris.
Bob Casey was the Twins' first public-address announcer, starting in 1961 and continuing until his death in 2005. He was well known for his unique delivery and his signature announcements of "No smoking in the Metrodome, either go outside or quit!" (or "go back to Boston", etc.), "Batting 3rd, the center-fielder, No. 34, Kirby Puckett!!!" and asking fans not to 'throw anything or anybody' onto the field.
Fans wave a "Homer Hanky" to rally the team during play-offs and other crucial games. The Homer Hanky was created by Terrie Robbins of the Star Tribune newspaper in the Twin Cities in 1987. It was her idea to originally give away 60,000 inaugural Homer Hankies. That year, over 2.3 million Homer Hankies were distributed.
The party atmosphere of the Twins clubhouse after a win is well-known, the team's players unwinding with loud rock music (usually the choice of the winning pitcher) and video games.
The club has several hazing rituals, such as requiring the most junior relief pitcher on the team to carry water and snacks to the bullpen in a brightly colored small child's backpack (Barbie in 2005, SpongeBob SquarePants in 2006, Hello Kitty in 2007, Disney Princess and Tinkerbell in 2009, Chewbacca and Darth Vader in 2010), and many of its players, both past and present, are notorious pranksters. For example, Bert Blyleven earned the nickname "The Frying Dutchman" for his ability to pull the "hotfoot" – which entails crawling under the bench in the dugout and lighting a teammate's shoelaces on fire.
|
https://en.wikipedia.org/wiki?curid=20050
|
Mach number
Mach number (M or Ma) is a dimensionless quantity in fluid dynamics representing the ratio of flow velocity past a boundary to the local speed of sound:

$\mathrm{M} = \frac{u}{c}$

where M is the local Mach number, $u$ is the local flow velocity with respect to the boundaries, and $c$ is the speed of sound in the medium.
By definition, at Mach 1, the local flow velocity $u$ is equal to the speed of sound. At Mach 0.65, $u$ is 65% of the speed of sound (subsonic), and, at Mach 1.35, $u$ is 35% faster than the speed of sound (supersonic). Pilots of high-altitude aerospace vehicles use flight Mach number to express a vehicle's true airspeed, but the flow field around a vehicle varies in three dimensions, with corresponding variations in local Mach number.
The local speed of sound, and hence the Mach number, depends on the temperature of the surrounding gas. The Mach number is primarily used to determine the approximation with which a flow can be treated as an incompressible flow. The medium can be a gas or a liquid. The boundary can be traveling in the medium, or it can be stationary while the medium flows along it, or they can both be moving, with different velocities: what matters is their relative velocity with respect to each other. The boundary can be the boundary of an object immersed in the medium, or of a channel such as a nozzle, diffuser or wind tunnel channeling the medium. As the Mach number is defined as the ratio of two speeds, it is a dimensionless number. The Mach number is named after the physicist and philosopher Ernst Mach, and is a designation proposed by aeronautical engineer Jakob Ackeret in 1929. As the Mach number is a dimensionless quantity rather than a unit of measure, the number comes "after" the unit; the second Mach number is "Mach 2" instead of "2 Mach" (or Machs). This is somewhat reminiscent of the early modern ocean sounding unit "mark" (a synonym for fathom), which was also unit-first, and may have influenced the use of the term Mach. In the decade preceding faster-than-sound human flight, aeronautical engineers referred to the speed of sound as "Mach's number", never "Mach 1".
Mach number is a measure of the compressibility characteristics of fluid flow: the fluid (air) behaves under the influence of compressibility in a similar manner at a given Mach number, regardless of other variables. As modeled in the International Standard Atmosphere, for dry air at mean sea level and a standard temperature of 15 °C (59 °F), the speed of sound is 340.3 m/s (1,116.5 ft/s). The speed of sound is not a constant; in a gas, it increases proportionally to the square root of the absolute temperature, and since atmospheric temperature generally decreases with increasing altitude between sea level and 11,000 m (36,089 ft), the speed of sound also decreases. For example, the standard atmosphere model lapses temperature to −56.5 °C (−69.7 °F) at 11,000 m (36,089 ft) altitude, with a corresponding speed of sound (Mach 1) of 295.0 m/s (967.8 ft/s), 86.7% of the sea level value.
While the terms "subsonic" and "supersonic", in the purest sense, refer to speeds below and above the local speed of sound respectively, aerodynamicists often use the same terms to talk about particular ranges of Mach values. This occurs because of the presence of a "transonic regime" around flight (free stream) M = 1 where approximations of the Navier-Stokes equations used for subsonic design no longer apply; the simplest explanation is that the flow around an airframe locally begins to exceed M = 1 even though the free stream Mach number is below this value.
Meanwhile, the "supersonic regime" is usually used to talk about the set of Mach numbers for which linearised theory may be used, where for example the (air) flow is not chemically reacting, and where heat-transfer between air and vehicle may be reasonably neglected in calculations.
In the following table, the "regimes" or "ranges of Mach values" are referred to, and not the "pure" meanings of the words "subsonic" and "supersonic".
Generally, NASA defines "high" hypersonic as any Mach number from 10 to 25, and re-entry speeds as anything greater than Mach 25. Aircraft operating in this regime include the Space Shuttle and various space planes in development.
Flight can be roughly classified in six categories: subsonic (M < 0.8), transonic (0.8 < M < 1.2), supersonic (1.2 < M < 5.0), hypersonic (5.0 < M < 10.0), high-hypersonic (10.0 < M < 25.0), and re-entry speeds (M > 25.0).
For comparison: the required speed for low Earth orbit is approximately 7.5 km/s = Mach 25.4 in air at high altitudes.
At transonic speeds, the flow field around the object includes both sub- and supersonic parts. The transonic period begins when first zones of M > 1 flow appear around the object. In case of an airfoil (such as an aircraft's wing), this typically happens above the wing. Supersonic flow can decelerate back to subsonic only in a normal shock; this typically happens before the trailing edge. (Fig.1a)
As the speed increases, the zone of M > 1 flow increases towards both leading and trailing edges. As M = 1 is reached and passed, the normal shock reaches the trailing edge and becomes a weak oblique shock: the flow decelerates over the shock, but remains supersonic. A normal shock is created ahead of the object, and the only subsonic zone in the flow field is a small area around the object's leading edge. (Fig.1b)
Fig. 1. "Mach number in transonic airflow around an airfoil; M < 1 (a) and M > 1 (b)."
When an aircraft exceeds Mach 1 (i.e. the sound barrier), a large pressure difference is created just in front of the aircraft. This abrupt pressure difference, called a shock wave, spreads backward and outward from the aircraft in a cone shape (a so-called Mach cone). It is this shock wave that causes the sonic boom heard as a fast moving aircraft travels overhead. A person inside the aircraft will not hear this. The higher the speed, the narrower the cone; at just over M = 1 it is hardly a cone at all, but closer to a slightly concave plane.
At fully supersonic speed, the shock wave starts to take its cone shape and flow is either completely supersonic, or (in case of a blunt object), only a very small subsonic flow area remains between the object's nose and the shock wave it creates ahead of itself. (In the case of a sharp object, there is no air between the nose and the shock wave: the shock wave starts from the nose.)
As the Mach number increases, so does the strength of the shock wave and the Mach cone becomes increasingly narrow. As the fluid flow crosses the shock wave, its speed is reduced and temperature, pressure, and density increase. The stronger the shock, the greater the changes. At high enough Mach numbers the temperature increases so much over the shock that ionization and dissociation of gas molecules behind the shock wave begin. Such flows are called hypersonic.
It is clear that any object traveling at hypersonic speeds will likewise be exposed to the same extreme temperatures as the gas behind the nose shock wave, and hence choice of heat-resistant materials becomes important.
As a flow in a channel becomes supersonic, one significant change takes place. The conservation of mass flow rate leads one to expect that contracting the flow channel would increase the flow speed (i.e. making the channel narrower results in faster air flow) and at subsonic speeds this holds true. However, once the flow becomes supersonic, the relationship of flow area and speed is reversed: expanding the channel actually increases the speed.
The obvious result is that in order to accelerate a flow to supersonic, one needs a convergent-divergent nozzle, where the converging section accelerates the flow to sonic speeds, and the diverging section continues the acceleration. Such nozzles are called de Laval nozzles and in extreme cases they are able to reach hypersonic speeds (Mach 13 at 20 °C).
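For reference, the standard isentropic area–Mach relation quantifies this behavior for a perfect gas; this is the textbook result rather than something stated in the passage above ($A^*$ denotes the throat area, where M = 1, and $\gamma$ the ratio of specific heats):

$\frac{A}{A^*} = \frac{1}{\mathrm{M}}\left[\frac{2}{\gamma + 1}\left(1 + \frac{\gamma - 1}{2}\mathrm{M}^2\right)\right]^\frac{\gamma + 1}{2(\gamma - 1)}$

For M < 1 the area ratio falls as M rises (a converging duct accelerates the flow), while for M > 1 it grows (a diverging duct accelerates the flow), which is exactly the reversal described above.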
An aircraft Machmeter or electronic flight information system (EFIS) can display Mach number derived from stagnation pressure (pitot tube) and static pressure.
The Mach number at which an aircraft is flying can be calculated by

$\mathrm{M} = \frac{u}{c}$

where M is the Mach number, $u$ is the velocity of the moving aircraft, and $c$ is the speed of sound at the given altitude. Note that the dynamic pressure can be found as:

$q = \frac{\gamma}{2}\, p\, \mathrm{M}^2$
Assuming air to be an ideal gas, the formula to compute Mach number in a subsonic compressible flow is derived from Bernoulli's equation for M < 1:

$\mathrm{M} = \sqrt{\frac{2}{\gamma - 1}\left[\left(\frac{q_c}{p} + 1\right)^\frac{\gamma - 1}{\gamma} - 1\right]}$

and the speed of sound varies with the thermodynamic temperature as:

$c = \sqrt{\gamma \cdot R_* \cdot T}$

where $q_c$ is impact pressure, $p$ is static pressure, $\gamma$ is the ratio of specific heats (1.4 for air), $R_*$ is the specific gas constant for air, and $T$ is the static air temperature.
The formula to compute Mach number in a supersonic compressible flow is derived from the Rayleigh supersonic pitot equation:

$\frac{p_t}{p} = \left[\frac{(\gamma + 1)^2\,\mathrm{M}^2}{4\gamma\,\mathrm{M}^2 - 2(\gamma - 1)}\right]^\frac{\gamma}{\gamma - 1} \cdot \left[\frac{1 - \gamma + 2\gamma\,\mathrm{M}^2}{\gamma + 1}\right]$
Mach number is a function of temperature and true airspeed.
Aircraft flight instruments, however, operate using pressure differential to compute Mach number, not temperature.
Assuming air to be an ideal gas, the formula to compute Mach number in a subsonic compressible flow is found from Bernoulli's equation for M < 1 (above):

$\mathrm{M} = \sqrt{5\left[\left(\frac{q_c}{p} + 1\right)^\frac{2}{7} - 1\right]}$
The formula to compute Mach number in a supersonic compressible flow can be found from the Rayleigh supersonic pitot equation (above) using parameters for air:

$\mathrm{M} = 0.88128485\,\sqrt{\left(\frac{q_c}{p} + 1\right)\left(1 - \frac{1}{7\,\mathrm{M}^2}\right)^{2.5}}$

where $q_c$ is the impact pressure measured behind a normal shock and $p$ is the static pressure.
As can be seen, M appears on both sides of the equation, and for practical purposes a root-finding algorithm must be used for a numerical solution (the equation's solution is a root of a 7th-order polynomial in M2 and, though some of these may be solved explicitly, the Abel–Ruffini theorem guarantees that there exists no general form for the roots of these polynomials). It is first determined whether M is indeed greater than 1.0 by calculating M from the subsonic equation. If M is greater than 1.0 at that point, then the value of M from the subsonic equation is used as the initial condition for fixed point iteration of the supersonic equation, which usually converges very rapidly. Alternatively, Newton's method can also be used.
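A small Python sketch of this procedure for air ($\gamma$ = 1.4), assuming the air-specific formulas above; the function name, iteration cap, and tolerance are illustrative:

```python
import math

def mach_from_pressures(qc, p, tol=1e-10, max_iter=100):
    """Compute Mach number from impact pressure qc and static pressure p (air)."""
    ratio = qc / p + 1.0
    m = math.sqrt(5.0 * (ratio ** (2.0 / 7.0) - 1.0))  # subsonic closed form
    if m <= 1.0:
        return m
    # Supersonic: fixed-point iteration of the air-specific Rayleigh relation,
    # seeded with the subsonic estimate; it usually converges very rapidly.
    for _ in range(max_iter):
        m_new = 0.88128485 * math.sqrt(ratio * (1.0 - 1.0 / (7.0 * m * m)) ** 2.5)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m
```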
|
https://en.wikipedia.org/wiki?curid=20051
|
Moving Picture Experts Group
The Moving Picture Experts Group (MPEG) is a working group of authorities that was formed by ISO and IEC to set standards for audio and video compression and transmission. MPEG's official designation is ISO/IEC JTC 1/SC 29/WG 11 – "Coding of moving pictures and audio" (ISO/IEC Joint Technical Committee 1, Subcommittee 29, Working Group 11).
MPEG was established in 1988 by the initiative of Hiroshi Yasuda (Nippon Telegraph and Telephone) and Leonardo Chiariglione, group Chair from its inception. The first MPEG meeting was in May 1988 in Ottawa, Canada.
As of late 2005, MPEG has grown to include approximately 350 members per meeting from various industries, universities, and research institutions.
On June 6, 2020, the MPEG website – hosted by Chiariglione – was updated to inform readers that he retired as convenor, and that the MPEG group "was closed". Chiariglione, in his own blog, explained his reasons for deciding to step down. The decision followed a restructuring process within SC 29, in which "some of the subgroups of WG 11 (MPEG) will become distinct MPEG working groups (WGs) and advisory groups (AGs)" in July 2020. In the interim, Prof. Jörn Ostermann has been appointed as Acting Convenor of SC 29/WG 11.
Joint Video Team (JVT) is a joint project between ITU-T SG16/Q.6 (Study Group 16 / Question 6) – VCEG (Video Coding Experts Group) and ISO/IEC JTC 1/SC 29/WG 11 – MPEG for the development of a new video coding recommendation and international standard. It was formed in 2001 and its main result has been H.264/MPEG-4 AVC (MPEG-4 Part 10).
Joint Collaborative Team on Video Coding (JCT-VC) is a group of video coding experts from ITU-T Study Group 16 (VCEG) and ISO/IEC JTC 1/SC 29/WG 11 (MPEG). It was created in 2010 to develop High Efficiency Video Coding, a new generation video coding standard that further reduces (by 50%) the data rate required for high quality video coding, as compared to the current ITU-T H.264 / ISO/IEC 14496-10 standard. JCT-VC is co-chaired by Jens-Rainer Ohm and Gary Sullivan.
Joint Video Exploration Team (JVET) is a joint group of video coding experts from ITU-T Study Group 16 (VCEG) and ISO/IEC JTC 1/SC 29/WG 11 (MPEG) created in 2017 after an exploration phase in 2015. It seeks to develop Versatile Video Coding (VVC). Like JCT-VC, JVET is co-chaired by Jens-Rainer Ohm and Gary Sullivan.
The MPEG standards consist of different "Parts". Each "part" covers a certain aspect of the whole specification. The standards also specify "Profiles" and "Levels". "Profiles" are intended to define a set of tools that are available, and "Levels" define the range of appropriate values for the properties associated with them. Some of the approved MPEG standards were revised by later amendments and/or new editions.
MPEG has standardized the following compression formats and ancillary standards. All of the MPEG formats listed below use discrete cosine transform (DCT) based lossy video compression algorithms.
MPEG-4 has been chosen as the compression scheme for over-the-air digital television in Brazil (ISDB-TB), based on the original digital television system of Japan (ISDB-T).
In addition, the following standards, while not sequential advances to the video encoding standard as with MPEG-1 through MPEG-4, are referred to by similar notation:
More recently, MPEG has also begun issuing application-oriented international standards; each of these bundles multiple MPEG technologies for a particular class of application. (For example, MPEG-A includes a number of technologies on multimedia application format.)
A standard published by ISO/IEC is the last stage of a long process that starts with the proposal of new work within a committee. Here are some abbreviations used for marking a standard with its status:
Other abbreviations:
A proposal of new work (New Proposal) is approved at the Subcommittee and then at the Technical Committee level (SC 29 and JTC 1, respectively, in the case of MPEG). When the scope of new work is sufficiently clarified, MPEG usually makes open requests for proposals, known as "Calls for Proposals". The first document that is produced for audio and video coding standards is called a Verification Model (VM). In the case of MPEG-1 and MPEG-2, this was called the Simulation Model and the Test Model, respectively. When sufficient confidence in the stability of the standard under development is reached, a Working Draft (WD) is produced. This is in the form of a standard but is kept internal to MPEG for revision. When a WD is sufficiently solid, it becomes a Committee Draft (CD), usually at the planned time. It is then sent to National Bodies (NB) for ballot. The CD becomes a Final Committee Draft (FCD) if the number of positive votes is above the quorum. After a review of comments issued by NBs, the FCD is again submitted to NBs for a second ballot. If the FCD is approved, it becomes a Final Draft International Standard (FDIS). ISO then holds a ballot with National Bodies, where no technical changes are allowed (a yes/no ballot). If approved, the document becomes an International Standard (IS).
ISO/IEC Directives allow also the so-called "Fast-track procedure". In this procedure a document is submitted directly for approval as a draft International Standard (DIS) to the ISO member bodies or as a final draft International Standard (FDIS) if the document was developed by an international standardizing body recognized by the ISO Council.
|
https://en.wikipedia.org/wiki?curid=20055
|
MPEG-1
MPEG-1 is a standard for lossy compression of video and audio. It is designed to compress VHS-quality raw digital video and CD audio down to about 1.5 Mbit/s (26:1 and 6:1 compression ratios respectively) without excessive quality loss, making video CDs, digital cable/satellite TV and digital audio broadcasting (DAB) possible.
Today, MPEG-1 has become the most widely compatible lossy audio/video format in the world, and is used in a large number of products and technologies. Perhaps the best-known part of the MPEG-1 standard is the first version of the MP3 audio format it introduced.
The MPEG-1 standard is published as ISO/IEC 11172 – Information technology—Coding of moving pictures and associated audio for digital storage media at up to about 1.5 Mbit/s.
The standard consists of the following five "Parts": Systems (Part 1), Video (Part 2), Audio (Part 3), Conformance testing (Part 4), and Reference software (Part 5), each described in more detail below.
The predecessor of MPEG-1 for video coding was the H.261 standard produced by the CCITT (now known as the ITU-T). The basic architecture established in H.261 was the motion-compensated DCT hybrid video coding structure. It uses macroblocks of size 16×16 with block-based motion estimation in the encoder and motion compensation using encoder-selected motion vectors in the decoder, with residual difference coding using a discrete cosine transform (DCT) of size 8×8, scalar quantization, and variable-length codes (like Huffman codes) for entropy coding. H.261 was the first practical video coding standard, and all of its described design elements were also used in MPEG-1.
Modeled on the successful collaborative approach and the compression technologies developed by the Joint Photographic Experts Group and CCITT's Experts Group on Telephony (creators of the JPEG image compression standard and the H.261 standard for video conferencing respectively), the Moving Picture Experts Group (MPEG) working group was established in January 1988, by the initiative of Hiroshi Yasuda (Nippon Telegraph and Telephone) and Leonardo Chiariglione (CSELT). MPEG was formed to address the need for standard video and audio formats, and to build on H.261 to get better quality through the use of somewhat more complex encoding methods (e.g., supporting higher precision for motion vectors).
Development of the MPEG-1 standard began in May 1988. Fourteen video and fourteen audio codec proposals were submitted by individual companies and institutions for evaluation. The codecs were extensively tested for computational complexity and subjective (human perceived) quality, at data rates of 1.5 Mbit/s. This specific bitrate was chosen for transmission over T-1/E-1 lines and as the approximate data rate of audio CDs. The codecs that excelled in this testing were utilized as the basis for the standard and refined further, with additional features and other improvements being incorporated in the process.
After 20 meetings of the full group in various cities around the world, and 4½ years of development and testing, the final standard (for parts 1–3) was approved in early November 1992 and published a few months later. The reported completion date of the MPEG-1 standard varies greatly: a largely complete draft standard was produced in September 1990, and from that point on, only minor changes were introduced. The draft standard was publicly available for purchase. The standard was finished with the 6 November 1992 meeting. The Berkeley Plateau Multimedia Research Group developed an MPEG-1 decoder in November 1992. In July 1990, before the first draft of the MPEG-1 standard had even been written, work began on a second standard, MPEG-2, intended to extend MPEG-1 technology to provide full broadcast-quality video (as per CCIR 601) at high bitrates (3–15 Mbit/s) and support for interlaced video. Due in part to the similarity between the two codecs, the MPEG-2 standard includes full backwards compatibility with MPEG-1 video, so any MPEG-2 decoder can play MPEG-1 videos.
Notably, the MPEG-1 standard very strictly defines the bitstream, and decoder function, but does not define how MPEG-1 encoding is to be performed, although a reference implementation is provided in ISO/IEC-11172-5. This means that MPEG-1 coding efficiency can drastically vary depending on the encoder used, and generally means that newer encoders perform significantly better than their predecessors. The first three parts (Systems, Video and Audio) of ISO/IEC 11172 were published in August 1993.
Due to its age, MPEG-1 is no longer covered by any essential patents and can thus be used without obtaining a licence or paying any fees. The ISO patent database lists one patent for ISO 11172, US 4,472,747, which expired in 2003. The near-complete draft of the MPEG-1 standard was publicly available as ISO CD 11172 by December 6, 1991. Neither the July 2008 Kuro5hin article "Patent Status of MPEG-1, H.261 and MPEG-2", nor an August 2008 thread on the gstreamer-devel mailing list were able to list a single unexpired MPEG-1 Video and MPEG-1 Audio Layer I/II patent. A May 2009 discussion on the whatwg mailing list mentioned US 5,214,678 patent as possibly covering MPEG-1 Audio Layer II. Filed in 1990 and published in 1993, this patent is now expired.
A full MPEG-1 decoder and encoder, with "Layer III audio", could not be implemented royalty-free, since there were companies that required patent fees for implementations of MPEG-1 Audio Layer III, as discussed in the MP3 article. The last patents connected to MP3 worldwide expired on 30 December 2017, making the format entirely free to use. On 23 April 2017, Fraunhofer IIS stopped charging for Technicolor's MP3 licensing program for certain MP3-related patents and software.
The following corporations filed declarations with ISO saying they held patents for the MPEG-1 Video (ISO/IEC-11172-2) format, although all such patents have since expired.
Part 1 of the MPEG-1 standard covers "systems", and is defined in ISO/IEC-11172-1.
MPEG-1 Systems specifies the logical layout and methods used to store the encoded audio, video, and other data into a standard bitstream, and to maintain synchronization between the different contents. This file format is specifically designed for storage on media, and transmission over communication channels, that are considered relatively reliable. Only limited error protection is defined by the standard, and small errors in the bitstream may cause noticeable defects.
This structure was later named an MPEG program stream: "The MPEG-1 Systems design is essentially identical to the MPEG-2 Program Stream structure." This terminology is more popular, precise (differentiates it from an MPEG transport stream) and will be used here.
Program Streams (PS) are concerned with combining multiple packetized elementary streams (usually just one audio and video PES) into a single stream, ensuring simultaneous delivery, and maintaining synchronization. The PS structure is known as a multiplex, or a container format.
Presentation time stamps (PTS) exist in PS to correct the inevitable disparity between audio and video SCR values (time-base correction). 90 kHz PTS values in the PS header tell the decoder which video SCR values match which audio SCR values. PTS determines when to display a portion of an MPEG program, and is also used by the decoder to determine when data can be discarded from the buffer. Either video or audio will be delayed by the decoder until the corresponding segment of the other arrives and can be decoded.
PTS handling can be problematic. Decoders must accept multiple "program streams" that have been concatenated (joined sequentially). This causes PTS values in the middle of the video to reset to zero, which then begin incrementing again. Such PTS wraparound disparities can cause timing issues that must be specially handled by the decoder.
Decoding Time Stamps (DTS), additionally, are required because of B-frames. With B-frames in the video stream, adjacent frames have to be encoded and decoded out-of-order (re-ordered frames). DTS is quite similar to PTS, but instead of just handling sequential frames, it contains the proper time-stamps to tell the decoder when to decode and display the next B-frame (types of frames explained below), ahead of its anchor (P- or I-) frame. Without B-frames in the video, PTS and DTS values are identical.
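To make the reordering concrete, here is a minimal sketch (hypothetical timestamps; a 90 kHz clock at 30 frames per second is assumed purely for the example) that derives the decode-order sequence for a short display-order run of frames:

```python
TICK_PER_FRAME = 3000  # 90,000 Hz clock / 30 fps (assumed for this example)

def coded_order(display_frames):
    """Reorder display-order frames so each B-frame follows both of its
    anchors in the bitstream. display_frames: list of (type, pts)."""
    out, pending_b = [], []
    for frame in display_frames:
        if frame[0] == "B":
            pending_b.append(frame)   # hold B-frames until the next anchor
        else:
            out.append(frame)         # the anchor (I or P) is emitted first...
            out.extend(pending_b)     # ...then the B-frames it anchors
            pending_b = []
    return out + pending_b            # trailing B-frames: open-GOP edge case

display = [("I", 0 * TICK_PER_FRAME), ("B", 1 * TICK_PER_FRAME),
           ("B", 2 * TICK_PER_FRAME), ("P", 3 * TICK_PER_FRAME)]
print(coded_order(display))
# -> [('I', 0), ('P', 9000), ('B', 3000), ('B', 6000)]
```

In decode order the PTS values are no longer monotonic, which is exactly why a separate, monotonically increasing DTS accompanies streams that contain B-frames.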
To generate the PS, the multiplexer will interleave the (two or more) packetized elementary streams. This is done so the packets of the simultaneous streams can be transferred over the same channel and are guaranteed to both arrive at the decoder at precisely the same time. This is a case of time-division multiplexing.
Determining how much data from each stream should be in each interleaved segment (the size of the interleave) is complicated, yet an important requirement. Improper interleaving will result in buffer underflows or overflows, as the receiver gets more of one stream than it can store (e.g. audio), before it gets enough data to decode the other simultaneous stream (e.g. video). The MPEG Video Buffering Verifier (VBV) assists in determining if a multiplexed PS can be decoded by a device with a specified data throughput rate and buffer size. This offers feedback to the muxer and the encoder, so that they can change the mux size or adjust bitrates as needed for compliance.
Part 2 of the MPEG-1 standard covers video and is defined in ISO/IEC-11172-2. The design was heavily influenced by H.261.
MPEG-1 Video exploits perceptual compression methods to significantly reduce the data rate required by a video stream. It reduces or completely discards information in certain frequencies and areas of the picture that the human eye has limited ability to fully perceive. It also exploits temporal (over time) and spatial (across a picture) redundancy common in video to achieve better data compression than would be possible otherwise. (See: Video compression)
Before encoding video to MPEG-1, the color-space is transformed to Y′CbCr (Y′=Luma, Cb=Chroma Blue, Cr=Chroma Red). Luma (brightness, resolution) is stored separately from chroma (color, hue, phase) and even further separated into red and blue components.
The chroma is also subsampled to 4:2:0, meaning it is reduced to half resolution vertically and half resolution horizontally, i.e., to just one quarter the number of samples used for the luma component of the video. This use of higher resolution for some color components is similar in concept to the Bayer pattern filter that is commonly used for the image capturing sensor in digital color cameras. Because the human eye is much more sensitive to small changes in brightness (the Y component) than in color (the Cr and Cb components), chroma subsampling is a very effective way to reduce the amount of video data that needs to be compressed. However, on videos with fine detail (high spatial complexity) this can manifest as chroma aliasing artifacts. Compared to other digital compression artifacts, this issue seems to very rarely be a source of annoyance. Because of the subsampling, Y′CbCr 4:2:0 video is ordinarily stored using even dimensions (divisible by 2 horizontally and vertically).
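As a rough sketch of the sample-count arithmetic (plain 2×2 averaging is used here purely for illustration; actual encoders and the standard's chroma siting rules differ in detail):

```python
import numpy as np

def subsample_420(chroma_plane):
    """Average each 2x2 block of a chroma plane: half resolution in both
    dimensions, i.e. one quarter of the original sample count."""
    h, w = chroma_plane.shape
    assert h % 2 == 0 and w % 2 == 0, "4:2:0 assumes even dimensions"
    return chroma_plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

luma = np.zeros((288, 352))                    # full-resolution Y' plane (SIF)
cb = subsample_420(np.zeros((288, 352)))       # subsampled chroma plane
print(luma.size, cb.size)                      # 101376 25344: a 4:1 ratio
```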
Y′CbCr color is often informally called YUV to simplify the notation, although that term more properly applies to a somewhat different color format. Similarly, the terms luminance and chrominance are often used instead of the (more accurate) terms luma and chroma.
MPEG-1 supports resolutions up to 4095×4095 (12 bits), and bit rates up to 100 Mbit/s.
MPEG-1 videos are most commonly seen using Source Input Format (SIF) resolution: 352×240, 352×288, or 320×240. These relatively low resolutions, combined with a bitrate less than 1.5 Mbit/s, make up what is known as a constrained parameters bitstream (CPB), later renamed the "Low Level" (LL) profile in MPEG-2. This is the minimum video specifications any decoder should be able to handle, to be considered MPEG-1 compliant. This was selected to provide a good balance between quality and performance, allowing the use of reasonably inexpensive hardware of the time.
MPEG-1 has several frame/picture types that serve different purposes. The most important, yet simplest, is the I-frame.
"I-frame" is an abbreviation for "Intra-frame", so called because such frames can be decoded independently of any other frame. They may also be known as I-pictures, or keyframes due to their somewhat similar function to the key frames used in animation. I-frames can be considered effectively identical to baseline JPEG images.
High-speed seeking through an MPEG-1 video is only possible to the nearest I-frame. When cutting a video it is not possible to start playback of a segment of video before the first I-frame in the segment (at least not without computationally intensive re-encoding). For this reason, I-frame-only MPEG videos are used in editing applications.
I-frame only compression is very fast, but produces very large file sizes: a factor of 3× (or more) larger than normally encoded MPEG-1 video, depending on how temporally complex a specific video is. I-frame only MPEG-1 video is very similar to MJPEG video. So much so that very high-speed and theoretically lossless (in reality, there are rounding errors) conversion can be made from one format to the other, provided a couple of restrictions (color space and quantization matrix) are followed in the creation of the bitstream.
The length between I-frames is known as the group of pictures (GOP) size. MPEG-1 most commonly uses a GOP size of 15–18, i.e., 1 I-frame for every 14–17 non-I-frames (some combination of P- and B-frames). With more intelligent encoders, the GOP size is dynamically chosen, up to some pre-selected maximum limit.
Limits are placed on the maximum number of frames between I-frames due to decoding complexity, decoder buffer size, recovery time after data errors, seeking ability, and accumulation of IDCT errors in the low-precision implementations most common in hardware decoders (see: IEEE-1180).
"P-frame" is an abbreviation for "Predicted-frame". They may also be called forward-predicted frames or inter-frames (B-frames are also inter-frames).
P-frames exist to improve compression by exploiting the temporal (over time) redundancy in a video. P-frames store only the "difference" in image from the frame (either an I-frame or P-frame) immediately preceding it (this reference frame is also called the "anchor frame").
The difference between a P-frame and its anchor frame is calculated using "motion vectors" on each "macroblock" of the frame (see below). Such motion vector data will be embedded in the P-frame for use by the decoder.
A P-frame can contain any number of intra-coded blocks, in addition to any forward-predicted blocks.
If a video drastically changes from one frame to the next (such as a cut), it is more efficient to encode it as an I-frame.
"B-frame" stands for "bidirectional-frame" or "bipredictive frame". They may also be known as backwards-predicted frames or B-pictures. B-frames are quite similar to P-frames, except they can make predictions using both the previous and future frames (i.e. two anchor frames).
It is therefore necessary for the player to first decode the next I- or P- anchor frame sequentially after the B-frame, before the B-frame can be decoded and displayed. This means decoding B-frames requires larger data buffers and causes an increased delay in both decoding and encoding. This also necessitates the decoding time stamps (DTS) feature in the container/system stream (see above). As such, B-frames have long been the subject of much controversy; they are often avoided in videos, and are sometimes not fully supported by hardware decoders.
No other frames are predicted from a B-frame. Because of this, a very low bitrate B-frame can be inserted, where needed, to help control the bitrate. If this was done with a P-frame, future P-frames would be predicted from it and would lower the quality of the entire sequence. However, similarly, the future P-frame must still encode all the changes between it and the previous I- or P- anchor frame. B-frames can also be beneficial in videos where the background behind an object is being revealed over several frames, or in fading transitions, such as scene changes.
A B-frame can contain any number of intra-coded blocks and forward-predicted blocks, in addition to backwards-predicted, or bidirectionally predicted blocks.
MPEG-1 has a unique frame type not found in later video standards. "D-frames" or DC-pictures are independently coded images (intra-frames) that have been encoded using DC transform coefficients only (AC coefficients are removed when encoding D-frames—see DCT below) and hence are very low quality. D-frames are never referenced by I-, P- or B- frames. D-frames are only used for fast previews of video, for instance when seeking through a video at high speed.
Given moderately higher-performance decoding equipment, fast preview can be accomplished by decoding I-frames instead of D-frames. This provides higher quality previews, since I-frames contain AC coefficients as well as DC coefficients. If the encoder can assume that rapid I-frame decoding capability is available in decoders, it can save bits by not sending D-frames (thus improving compression of the video content). For this reason, D-frames are seldom actually used in MPEG-1 video encoding, and the D-frame feature has not been included in any later video coding standards.
MPEG-1 operates on video in a series of 8×8 blocks for quantization. However, to reduce the bit rate needed for motion vectors and because chroma (color) is subsampled by a factor of 4, each pair of (red and blue) chroma blocks corresponds to 4 different luma blocks. This set of 6 blocks, with a resolution of 16×16, is processed together and called a "macroblock".
A macroblock is the smallest independent unit of (color) video. Motion vectors (see below) operate solely at the macroblock level.
If the height or width of the video is not an exact multiple of 16, full rows and full columns of macroblocks must still be encoded and decoded to fill out the picture (though the extra decoded pixels are not displayed).
To decrease the amount of temporal redundancy in a video, only blocks that change are updated (up to the maximum GOP size). This is known as conditional replenishment. However, this is not very effective by itself. Movement of the objects, and/or the camera, may result in large portions of the frame needing to be updated, even though only the position of the previously encoded objects has changed. Through motion estimation, the encoder can compensate for this movement and remove a large amount of redundant information.
The encoder compares the current frame with adjacent parts of the video from the anchor frame (previous I- or P- frame) in a diamond pattern, up to an (encoder-specific) predefined radius limit from the area of the current macroblock. If a match is found, only the direction and distance (i.e. the "vector" of the "motion") from the previous video area to the current macroblock need to be encoded into the inter-frame (P- or B- frame). The reverse of this process, performed by the decoder to reconstruct the picture, is called motion compensation.
A predicted macroblock rarely matches the current picture perfectly, however. The difference between the estimated matching area and the real frame/macroblock is called the prediction error. The larger the prediction error, the more data must additionally be encoded in the frame. For efficient video compression, it is very important that the encoder is capable of effectively and precisely performing motion estimation.
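A minimal full-search sketch of these two steps (real encoders use faster search patterns such as the diamond pattern mentioned above; the function name, search radius, and sum-of-absolute-differences criterion here are illustrative choices, not mandated by the standard):

```python
import numpy as np

def motion_estimate(anchor, current, by, bx, radius=8, block=16):
    """Find the motion vector minimising the sum of absolute differences
    (SAD) for the macroblock at (by, bx), then return the residual."""
    target = current[by:by + block, bx:bx + block].astype(int)
    best, best_sad = (0, 0), np.inf
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > anchor.shape[0] or x + block > anchor.shape[1]:
                continue  # candidate area falls outside the anchor frame
            sad = np.abs(anchor[y:y + block, x:x + block].astype(int) - target).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    dy, dx = best
    # Prediction error: what must still be coded after motion compensation.
    residual = target - anchor[by + dy:by + dy + block, bx + dx:bx + dx + block].astype(int)
    return best, residual
```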
Motion vectors record the "distance" between two areas on screen based on the number of pixels (also called pels). MPEG-1 video uses a motion vector (MV) precision of one half of one pixel, or half-pel. The finer the precision of the MVs, the more accurate the match is likely to be, and the more efficient the compression. There are trade-offs to higher precision, however: finer MV precision requires a larger amount of data to represent each MV, since larger numbers must be stored in the frame for every single MV; it increases coding complexity, as higher levels of interpolation on the macroblock are required for both the encoder and decoder; and it offers diminishing returns (minimal gains) at higher precisions. Half-pel precision was chosen as the ideal trade-off for that point in time. (See: qpel)
Because neighboring macroblocks are likely to have very similar motion vectors, this redundant information can be compressed quite effectively by being stored DPCM-encoded. Only the (smaller) amount of difference between the MVs for each macroblock needs to be stored in the final bitstream.
P-frames have one motion vector per macroblock, relative to the previous anchor frame. B-frames, however, can use two motion vectors; one from the previous anchor frame, and one from the future anchor frame.
Partial macroblocks, and black borders/bars encoded into the video that do not fall exactly on a macroblock boundary, cause havoc with motion prediction. The block padding/border information prevents the macroblock from closely matching with any other area of the video, and so, significantly larger prediction error information must be encoded for every one of the several dozen partial macroblocks along the screen border. DCT encoding and quantization (see below) also isn't nearly as effective when there is large/sharp picture contrast in a block.
An even more serious problem exists with macroblocks that contain significant, random, "edge noise", where the picture transitions to (typically) black. All the above problems also apply to edge noise. In addition, the added randomness is simply impossible to compress significantly. All of these effects will lower the quality (or increase the bitrate) of the video substantially.
Each 8×8 block is encoded by first applying a "forward" discrete cosine transform (FDCT) and then a quantization process. The FDCT process (by itself) is theoretically lossless, and can be reversed by applying an "Inverse" DCT (IDCT) to reproduce the original values (in the absence of any quantization and rounding errors). In reality, there are some (sometimes large) rounding errors introduced both by quantization in the encoder (as described in the next section) and by IDCT approximation error in the decoder. The minimum allowed accuracy of a decoder IDCT approximation is defined by ISO/IEC 23002-1. (Prior to 2006, it was specified by IEEE 1180-1990.)
The FDCT process converts the 8×8 block of uncompressed pixel values (brightness or color difference values) into an 8×8 indexed array of "frequency coefficient" values. One of these is the (statistically high in variance) "DC coefficient", which represents the average value of the entire 8×8 block. The other 63 coefficients are the statistically smaller "AC coefficients", which have positive or negative values each representing sinusoidal deviations from the flat block value represented by the DC coefficient.
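A direct, unoptimised statement of that transform, as a sketch (production codecs use fast factored DCT algorithms instead):

```python
import numpy as np

N = 8

def fdct_8x8(block):
    """Forward 8x8 DCT-II with orthonormal scaling. block: 8x8 array of
    sample values. Returns the 8x8 array of frequency coefficients;
    entry [0, 0] is the DC coefficient."""
    coeffs = np.zeros((N, N))
    for u in range(N):
        for v in range(N):
            cu = 1 / np.sqrt(2) if u == 0 else 1.0
            cv = 1 / np.sqrt(2) if v == 0 else 1.0
            s = sum(
                block[y][x]
                * np.cos((2 * y + 1) * u * np.pi / (2 * N))
                * np.cos((2 * x + 1) * v * np.pi / (2 * N))
                for y in range(N) for x in range(N)
            )
            coeffs[u, v] = 0.25 * cu * cv * s
    return coeffs

flat = np.full((N, N), 100.0)
print(fdct_8x8(flat)[0, 0])  # DC = 8 * block average = 800; all AC terms ~0
```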
Since the DC coefficient value is statistically correlated from one block to the next, it is compressed using DPCM encoding. Only the (smaller) amount of difference between each DC value and the value of the DC coefficient in the block to its left needs to be represented in the final bitstream.
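DPCM itself is a one-line idea; a generic sketch (the zero initial predictor here is illustrative, not the standard's actual reset rule):

```python
def dpcm_encode(values):
    """Store only the difference from the previous value (DPCM)."""
    prev, out = 0, []
    for v in values:
        out.append(v - prev)
        prev = v
    return out

def dpcm_decode(diffs):
    """Exact inverse: accumulate the differences back into values."""
    prev, out = 0, []
    for d in diffs:
        prev += d
        out.append(prev)
    return out

print(dpcm_encode([800, 790, 796, 805]))  # -> [800, -10, 6, 9]
```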
Additionally, the frequency conversion performed by applying the DCT provides a statistical decorrelation function to efficiently concentrate the signal into fewer high-amplitude values prior to applying quantization (see below).
Quantization is, essentially, the process of reducing the accuracy of a signal, by dividing it by some larger step size and rounding to an integer value (i.e. finding the nearest multiple, and discarding the remainder).
The frame-level quantizer is a number from 0 to 31 (although encoders will usually omit/disable some of the extreme values) which determines how much information will be removed from a given frame. The frame-level quantizer is typically either dynamically selected by the encoder to maintain a certain user-specified bitrate, or (much less commonly) directly specified by the user.
A "quantization matrix" is a string of 64 numbers (ranging from 0 to 255) which tells the encoder how relatively important or unimportant each piece of visual information is. Each number in the matrix corresponds to a certain frequency component of the video image.
Quantization is performed by taking each of the 64 "frequency" values of the DCT block, dividing them by the frame-level quantizer, then dividing them by their corresponding values in the quantization matrix. Finally, the result is rounded down. This significantly reduces, or completely eliminates, the information in some frequency components of the picture. Typically, high frequency information is less visually important, and so high frequencies are much more "strongly quantized" (drastically reduced). MPEG-1 actually uses two separate quantization matrices, one for intra-blocks (I-blocks) and one for inter-block (P- and B- blocks) so quantization of different block types can be done independently, and so, more effectively.
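A sketch of that per-coefficient arithmetic; the intra matrix below is the widely published MPEG-1 default (assumed here to be the ISO/IEC 11172-2 default; encoders may transmit custom matrices), and plain truncation stands in for the standard's exact rounding rules:

```python
import numpy as np

# Widely published default intra quantization matrix (an assumption here;
# see ISO/IEC 11172-2 for the normative values and exact scaling rules).
INTRA_MATRIX = np.array([
    [ 8, 16, 19, 22, 26, 27, 29, 34],
    [16, 16, 22, 24, 27, 29, 34, 37],
    [19, 22, 26, 27, 29, 34, 34, 38],
    [22, 22, 26, 27, 29, 34, 37, 40],
    [22, 26, 27, 29, 32, 35, 40, 48],
    [26, 27, 29, 32, 35, 40, 48, 58],
    [26, 27, 29, 34, 38, 46, 56, 69],
    [27, 29, 35, 38, 46, 56, 69, 83],
])

def quantize(dct_block, quantizer_scale):
    """Quantize an 8x8 DCT block as described above: divide by the
    frame-level quantizer and the matrix entry, then truncate. Note how
    the large bottom-right entries crush high frequencies toward zero."""
    return np.fix(dct_block / quantizer_scale / INTRA_MATRIX).astype(int)
```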
This quantization process usually reduces a significant number of the "AC coefficients" to zero, (known as sparse data) which can then be more efficiently compressed by entropy coding (lossless compression) in the next step.
Quantization eliminates a large amount of data, and is the main lossy processing step in MPEG-1 video encoding. This is also the primary source of most MPEG-1 video compression artifacts, like blockiness, color banding, noise, ringing, discoloration, and so on. This happens when video is encoded with an insufficient bitrate, and the encoder is therefore forced to use high frame-level quantizers ("strong quantization") through much of the video.
Several steps in the encoding of MPEG-1 video are lossless, meaning they will be reversed upon decoding to produce exactly the same (original) values. Since these lossless data compression steps don't add noise into, or otherwise change, the contents (unlike quantization), this is sometimes referred to as noiseless coding. Since lossless compression aims to remove as much redundancy as possible, it is known as entropy coding in the field of information theory.
The coefficients of quantized DCT blocks tend to zero towards the bottom-right. Maximum compression can be achieved by a zig-zag scanning of the DCT block starting from the top left and using Run-length encoding techniques.
The DC coefficients and motion vectors are DPCM-encoded.
Run-length encoding (RLE) is a simple method of compressing repetition. A sequential string of characters, no matter how long, can be replaced with a few bytes, noting the value that repeats, and how many times. For example, if someone were to say "five nines", you would know they mean the number: 99999.
RLE is particularly effective after quantization, as a significant number of the AC coefficients are now zero (called sparse data), and can be represented with just a couple of bytes. This is stored in a special 2-dimensional Huffman table that codes the run-length and the run-ending character.
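A simplified sketch of the zig-zag scan and run-length pass (the real standard maps run/level pairs directly onto Huffman codewords; the separate (run, value) tuples here are purely illustrative):

```python
def zigzag_order(n=8):
    """Visit an n x n block in zig-zag order, top-left to bottom-right."""
    return sorted(
        ((y, x) for y in range(n) for x in range(n)),
        key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else -p[0]),
    )

def run_length(block):
    """Emit (zero_run, value) pairs for the non-zero coefficients met
    along the zig-zag scan, then an end-of-block marker."""
    pairs, run = [], 0
    for y, x in zigzag_order(len(block)):
        v = block[y][x]
        if v == 0:
            run += 1
        else:
            pairs.append((run, v))
            run = 0
    pairs.append("EOB")  # trailing zeros collapse into one end-of-block code
    return pairs

block = [[42, 3, 0, 0], [2, 0, 0, 0], [0, 0, 0, 1], [0, 0, 0, 0]]
print(run_length(block))  # -> [(0, 42), (0, 3), (0, 2), (10, 1), 'EOB']
```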
Huffman Coding is a very popular and relatively simple method of entropy coding, and used in MPEG-1 video to reduce the data size. The data is analyzed to find strings that repeat often. Those strings are then put into a special table, with the most frequently repeating data assigned the shortest code. This keeps the data as small as possible with this form of compression. Once the table is constructed, those strings in the data are replaced with their (much smaller) codes, which reference the appropriate entry in the table. The decoder simply reverses this process to produce the original data.
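The Huffman tables used by MPEG-1 are fixed in the standard rather than built at encode time, but the underlying construction can be sketched in a few lines (the symbol names and frequencies below are made up for illustration):

```python
import heapq

def huffman_codes(freqs):
    """Build a prefix code: frequent symbols get the shortest codes."""
    # Each heap entry: (total_frequency, tie_breaker, {symbol: code}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least-frequent subtrees...
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}   # ...are merged,
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))   # growing codes upward
        count += 1
    return heap[0][2]

print(huffman_codes({"EOB": 50, "(0,1)": 30, "(0,-1)": 15, "(1,1)": 5}))
# e.g. {'EOB': '0', '(0,1)': '11', '(0,-1)': '101', '(1,1)': '100'}
```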
This is the final step in the video encoding process, so the result of Huffman coding is known as the MPEG-1 video "bitstream."
I-frames store complete frame information within the frame and are therefore suited for random access. P-frames provide compression using motion vectors relative to the previous frame (I or P). B-frames provide maximum compression, but require both the previous and the next frame for computation; processing of B-frames therefore requires more buffering on the decoder side. A configuration of the group of pictures (GOP) should be selected based on these factors. I-frame-only sequences give the least compression but are useful for random access, FF/FR and editability. I- and P-frame sequences give moderate compression while retaining a certain degree of random access and FF/FR functionality. I-, P- and B-frame sequences give very high compression but also increase the coding/decoding delay significantly. Such configurations are therefore not suited for video-telephony or video-conferencing applications.
The typical data rate of an I-frame is 1 bit per pixel while that of a P-frame is 0.1 bit per pixel and for a B-frame, 0.015 bit per pixel.
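Taking those per-frame figures at face value, a back-of-the-envelope estimate of the average video bitrate for a common SIF configuration might look like this (all numbers illustrative):

```python
# Rough average bitrate for 352x240 video at 29.97 frames/s using the
# typical per-frame bit densities quoted above (illustrative figures only).
PIXELS = 352 * 240
FPS = 29.97
gop = "IBBPBBPBBPBBPBB"                # a common 15-frame GOP pattern
bpp = {"I": 1.0, "P": 0.1, "B": 0.015}  # bits per pixel per frame type

bits_per_gop = sum(bpp[f] * PIXELS for f in gop)
avg_bitrate = bits_per_gop * FPS / len(gop)
print(f"{avg_bitrate / 1e6:.2f} Mbit/s")  # -> 0.26 Mbit/s, video data alone
```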
Part 3 of the MPEG-1 standard covers audio and is defined in ISO/IEC-11172-3.
MPEG-1 Audio utilizes psychoacoustics to significantly reduce the data rate required by an audio stream. It reduces or completely discards certain parts of the audio that it determines the human ear can't "hear", either because they are in frequencies where the ear has limited sensitivity, or because they are "masked" by other (typically louder) sounds.
Channel encoding modes: single channel (mono), dual channel (two independent mono channels), stereo, and joint stereo.
MPEG-1 Audio is divided into 3 layers. Each higher layer is more computationally complex, and generally more efficient at lower bitrates, than the previous. The layers are semi-backwards compatible, as higher layers reuse technologies implemented by the lower layers. A "full" Layer II decoder can also play Layer I audio, but "not" Layer III audio, although not all higher-level players are "full".
MPEG-1 Audio Layer I is a simplified version of MPEG-1 Audio Layer II. Layer I uses a smaller 384-sample frame size for very low delay, and finer resolution. This is advantageous for applications like teleconferencing, studio editing, etc. It has lower complexity than Layer II to facilitate real-time encoding on the hardware available circa 1990.
Layer I saw limited adoption in its time, and most notably was used on Philips' defunct Digital Compact Cassette at a bitrate of 384 kbit/s. With the substantial performance improvements in digital processing since its introduction, Layer I quickly became unnecessary and obsolete.
Layer I audio files typically use the extension ".mp1" or sometimes ".m1a".
MPEG-1 Audio Layer II (the first version of MP2, often informally called MUSICAM) is a lossy audio format designed to provide high quality at about 192 kbit/s for stereo sound. Decoding MP2 audio is computationally simple relative to MP3, AAC, etc.
MPEG-1 Audio Layer II was derived from the MUSICAM ("Masking pattern adapted Universal Subband Integrated Coding And Multiplexing") audio codec, developed by Centre commun d'études de télévision et télécommunications (CCETT), Philips, and Institut für Rundfunktechnik (IRT/CNET) as part of the EUREKA 147 pan-European inter-governmental research and development initiative for the development of digital audio broadcasting.
Most key features of MPEG-1 Audio were directly inherited from MUSICAM, including the filter bank, time-domain processing, audio frame sizes, etc. However, improvements were made, and the actual MUSICAM algorithm was not used in the final MPEG-1 Audio Layer II standard. The widespread usage of the term MUSICAM to refer to Layer II is entirely incorrect and discouraged for both technical and legal reasons.
MP2 is a time-domain encoder. It uses a low-delay 32 sub-band polyphase filter bank for time-frequency mapping, with overlapping ranges (i.e. polyphase) to prevent aliasing. The psychoacoustic model is based on the principles of auditory masking, simultaneous masking effects, and the absolute threshold of hearing (ATH). The size of a Layer II frame is fixed at 1152 samples (coefficients).
Time domain refers to how analysis and quantization are performed on short, discrete samples/chunks of the audio waveform. This offers low delay, as only a small number of samples are analyzed before encoding, as opposed to frequency-domain encoding (like MP3), which must analyze many times more samples before it can decide how to transform and output encoded audio. This also offers higher performance on complex, random and transient impulses (such as percussive instruments and applause), helping to avoid artifacts like pre-echo.
The 32 sub-band filter bank returns 32 amplitude coefficients, one for each equal-sized frequency band/segment of the audio, each about 700 Hz wide depending on the audio's sampling frequency (for example, 22050 Hz / 32 ≈ 689 Hz at a 44.1 kHz sampling rate). The encoder then utilizes the psychoacoustic model to determine which sub-bands contain audio information that is less important, and so, where quantization will be inaudible, or at least much less noticeable.
The psychoacoustic model is applied using a 1024-point Fast Fourier Transform (FFT). Of the 1152 samples per frame, 64 samples at the top and bottom of the frequency range are ignored for this analysis. They are presumably not significant enough to change the result. The psychoacoustic model uses an empirically determined masking model to determine which sub-bands contribute more to the masking threshold, and how much quantization noise each can contain without being perceived. Any sounds below the absolute threshold of hearing (ATH) are completely discarded. The available bits are then assigned to each sub-band accordingly.
Typically, sub-bands are less important if they contain quieter sounds (smaller coefficient) than a neighboring (i.e. similar frequency) sub-band with louder sounds (larger coefficient). Also, "noise" components typically have a more significant masking effect than "tonal" components.
Less significant sub-bands are reduced in accuracy by quantization. This basically involves compressing the frequency range (amplitude of the coefficient), i.e. raising the noise floor, and then computing an amplification factor for the decoder to use to re-expand each sub-band to the proper frequency range.
Layer II can also optionally use intensity stereo coding, a form of joint stereo. This means that the frequencies above 6 kHz of both channels are combined/down-mixed into one single (mono) channel, but the "side channel" information on the relative intensity (volume, amplitude) of each channel is preserved and encoded into the bitstream separately. On playback, the single channel is played through left and right speakers, with the intensity information applied to each channel to give the illusion of stereo sound. This perceptual trick is known as "stereo irrelevancy". This can allow further reduction of the audio bitrate without much perceivable loss of fidelity, but is generally not used with higher bitrates as it does not provide very high quality (transparent) audio.
Subjective audio testing by experts, in the most critical conditions ever implemented, has shown MP2 to offer transparent audio compression at 256 kbit/s for 16-bit 44.1 kHz CD audio using the earliest reference implementation (more recent encoders should presumably perform even better). That (approximately) 1:6 compression ratio for CD audio is particularly impressive because it is quite close to the estimated upper limit of perceptual entropy, at just over 1:8. Achieving much higher compression is simply not possible without discarding some perceptible information.
MP2 remains a favoured lossy audio coding standard due to its particularly high audio coding performance on important audio material such as castanets, symphonic orchestra, male and female voices, and particularly complex and high-energy transients (impulses) like percussive sounds: triangle, glockenspiel, and audience applause. More recent testing has shown that MPEG Multichannel (based on MP2), despite being compromised by an inferior matrixed mode (for the sake of backwards compatibility), rates just slightly lower than much more recent audio codecs, such as Dolby Digital (AC-3) and Advanced Audio Coding (AAC) (mostly within the margin of error—and substantially superior in some cases, such as audience applause). This is one reason that MP2 audio continues to be used extensively. The MPEG-2 AAC Stereo verification tests reached a vastly different conclusion, however, showing AAC to provide superior performance to MP2 at half the bitrate. The reason for this disparity with both earlier and later tests is not clear, but strangely, a sample of applause is notably absent from the latter test.
Layer II audio files typically use the extension ".mp2" or sometimes ".m2a".
MPEG-1 Audio Layer III (the first version of MP3) is a lossy audio format designed to provide acceptable quality at about 64 kbit/s for monaural audio over single-channel (BRI) ISDN links, and 128 kbit/s for stereo sound.
MPEG-1 Audio Layer III was derived from the "Adaptive Spectral Perceptual Entropy Coding" (ASPEC) codec developed by Fraunhofer as part of the EUREKA 147 pan-European inter-governmental research and development initiative for the development of digital audio broadcasting. ASPEC was adapted to fit in with the Layer II model (frame size, filter bank, FFT, etc.), to become Layer III.
ASPEC was itself based on "Multiple adaptive Spectral audio Coding" (MSC) by E. F. Schroeder, "Optimum Coding in the Frequency domain" (OCF) the doctoral thesis by Karlheinz Brandenburg at the University of Erlangen-Nuremberg, "Perceptual Transform Coding" (PXFM) by J. D. Johnston at AT&T Bell Labs, and "Transform coding of audio signals" by Y. Mahieux and J. Petit at Institut für Rundfunktechnik (IRT/CNET).
MP3 is a frequency-domain audio transform encoder. Even though it utilizes some of the lower layer functions, MP3 is quite different from MP2.
MP3 works on 1152 samples like MP2, but needs to take multiple frames for analysis before frequency-domain (MDCT) processing and quantization can be effective. It outputs a variable number of samples, using a bit buffer to enable this variable bitrate (VBR) encoding while maintaining 1152 sample size output frames. This causes a significantly longer delay before output, which has caused MP3 to be considered unsuitable for studio applications where editing or other processing needs to take place.
MP3 does not benefit from the 32 sub-band polyphased filter bank, instead just using an 18-point MDCT transformation on each output to split the data into 576 frequency components, and processing it in the frequency domain. This extra granularity allows MP3 to have a much finer psychoacoustic model, and more carefully apply appropriate quantization to each band, providing much better low-bitrate performance.
Frequency-domain processing imposes some limitations as well, causing a factor of 12 or 36 × worse temporal resolution than Layer II. This causes quantization artifacts, due to transient sounds like percussive events and other high-frequency events that spread over a larger window. This results in audible smearing and pre-echo. MP3 uses pre-echo detection routines, and VBR encoding, which allows it to temporarily increase the bitrate during difficult passages, in an attempt to reduce this effect. It is also able to switch between the normal 36 sample quantization window, and instead using 3× short 12 sample windows instead, to reduce the temporal (time) length of quantization artifacts. And yet in choosing a fairly small window size to make MP3's temporal response adequate enough to avoid the most serious artifacts, MP3 becomes much less efficient in frequency domain compression of stationary, tonal components.
Being forced to use a "hybrid" time domain (filter bank) /frequency domain (MDCT) model to fit in with Layer II simply wastes processing time and compromises quality by introducing aliasing artifacts. MP3 has an aliasing cancellation stage specifically to mask this problem, but which instead produces frequency domain energy which must be encoded in the audio. This is pushed to the top of the frequency range, where most people have limited hearing, in hopes the distortion it causes will be less audible.
Layer II's 1024 point FFT doesn't entirely cover all samples, and would omit several entire MP3 sub-bands, where quantization factors must be determined. MP3 instead uses two passes of FFT analysis for spectral estimation, to calculate the global and individual masking thresholds. This allows it to cover all 1152 samples. Of the two, it utilizes the global masking threshold level from the more critical pass, with the most difficult audio.
In addition to Layer II's intensity encoded joint stereo, MP3 can use middle/side (mid/side, m/s, MS, matrixed) joint stereo. With mid/side stereo, certain frequency ranges of both channels are merged into a single (middle, mid, L+R) mono channel, while the sound difference between the left and right channels is stored as a separate (side, L-R) channel. Unlike intensity stereo, this process does not discard any audio information. When combined with quantization, however, it can exaggerate artifacts.
If the difference between the left and right channels is small, the side channel will be small, which will offer as much as a 50% bitrate savings, and associated quality improvement. If the difference between left and right is large, standard (discrete, left/right) stereo encoding may be preferred, as mid/side joint stereo will not provide any benefits. An MP3 encoder can switch between m/s stereo and full stereo on a frame-by-frame basis.
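The matrixing itself is trivial and, before quantization, fully reversible; a sketch:

```python
def to_mid_side(left, right):
    """Matrix L/R samples into mid/side; no information is discarded."""
    mid = [(l + r) / 2 for l, r in zip(left, right)]
    side = [(l - r) / 2 for l, r in zip(left, right)]
    return mid, side

def to_left_right(mid, side):
    """Exact inverse: recover the original left and right channels."""
    left = [m + s for m, s in zip(mid, side)]
    right = [m - s for m, s in zip(mid, side)]
    return left, right

# Nearly identical channels -> the side channel is almost silent,
# and an almost-silent channel is very cheap to code.
l, r = [0.5, 0.25, -0.125], [0.5, 0.24, -0.120]
mid, side = to_mid_side(l, r)
print(side)  # -> approximately [0.0, 0.005, -0.0025]
```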
Unlike Layers I and II, MP3 uses variable-length Huffman coding (after the perceptual quantization stage) to further reduce the bitrate, without any further quality loss.
These technical limitations inherently prevent MP3 from providing critically transparent quality at any bitrate. This makes Layer II sound quality actually superior to MP3 audio, when it is used at a high enough bitrate to avoid noticeable artifacts. The term "transparent" often gets misused, however. The quality of MP3 (and other codecs) is sometimes called "transparent," even at impossibly low bitrates, when what is really meant is "good quality on average/non-critical material," or perhaps "exhibiting only non-annoying artifacts."
MP3's more fine-grained and selective quantization does prove notably superior to MP2 at lower-bitrates, however. It is able to provide nearly equivalent audio quality to Layer II, at a 15% lower bitrate (approximately). 128 kbit/s is considered the "sweet spot" for MP3; meaning it provides generally acceptable quality stereo sound on most music, and there are diminishing quality improvements from increasing the bitrate further. MP3 is also regarded as exhibiting artifacts that are less annoying than Layer II, when both are used at bitrates that are too low to possibly provide faithful reproduction.
Layer III audio files use the extension ".mp3".
The MPEG-2 standard includes several extensions to MPEG-1 Audio. These are known as MPEG-2 BC – backwards compatible with MPEG-1 Audio. MPEG-2 Audio is defined in ISO/IEC 13818-3.
The MPEG-2 extensions add sampling rates exactly half of those originally defined for MPEG-1 Audio (16, 22.05 and 24 kHz). They were introduced to maintain higher quality sound when encoding audio at lower bitrates. The even lower bitrates were introduced because tests showed that MPEG-1 Audio could provide higher quality than any existing (circa 1994) very low bitrate (i.e. speech) audio codecs.
Part 4 of the MPEG-1 standard covers conformance testing, and is defined in ISO/IEC-11172-4.
Conformance: Procedures for testing conformance.
Provides two sets of guidelines and reference bitstreams for testing the conformance of MPEG-1 audio and video decoders, as well as the bitstreams produced by an encoder.
Part 5 of the MPEG-1 standard includes reference software, and is defined in ISO/IEC TR 11172-5.
Simulation: Reference software.
C reference code for encoding and decoding of audio and video, as well as multiplexing and demultiplexing.
This includes the "ISO Dist10" audio encoder code, which LAME and TooLAME were originally based upon.
.mpg is one of a number of file extensions for MPEG-1 or MPEG-2 audio and video compression. MPEG-1 Part 2 video is rare nowadays, and this extension typically refers to an MPEG program stream (defined in MPEG-1 and MPEG-2) or MPEG transport stream (defined in MPEG-2). Other suffixes such as .m2ts also exist specifying the precise container, in this case MPEG-2 TS, but this has little relevance to MPEG-1 media.
.mp3 is the most common extension for files containing MP3 audio (typically MPEG-1 Audio, sometimes MPEG-2 Audio). An MP3 file is typically an uncontained stream of raw audio; the conventional way to tag MP3 files is by writing data to "garbage" segments of each frame, which preserve the media information but are discarded by the player. This is similar in many respects to how raw .AAC files are tagged (but this is less supported nowadays, e.g. iTunes).
Note that, although it would apply, .mpg is not normally used for raw AAC or for AAC in MPEG-2 Part 7 containers; the .aac extension normally denotes these audio files.
|
https://en.wikipedia.org/wiki?curid=20056
|
Mumia Abu-Jamal
Mumia Abu-Jamal (born Wesley Cook; April 24, 1954) is a political activist and journalist who was convicted of murder and sentenced to death in 1982 for the 1981 murder of Philadelphia police officer Daniel Faulkner. He became widely known while on death row for his writings and commentary on the criminal justice system in the United States. After numerous appeals, his death penalty sentence was overturned by a Federal court. In 2011, the prosecution agreed to a sentence of life imprisonment without parole. He entered the general prison population early the following year.
Beginning at the age of 14 in 1968, Abu-Jamal became involved with the Black Panther Party and was a member until October 1970. After he left the party, he completed his high school education, and later became a radio reporter. He eventually served as president of the Philadelphia Association of Black Journalists. He supported the MOVE Organization in Philadelphia and covered the 1978 confrontation in which one police officer was killed. The MOVE Nine were the members who were arrested and convicted of murder in that case.
Since 1982, the murder trial of Abu-Jamal has been seriously criticized for constitutional failings; some have claimed that he is innocent, and many opposed his death sentence. The Faulkner family, public authorities, police organizations, and conservative groups believe that Abu-Jamal's trial was fair, his guilt undeniable, and his death sentence appropriate.
When his death sentence was overturned by a Federal court in 2001, he was described as "perhaps the world's best known death-row inmate" by "The New York Times." During his imprisonment, Abu-Jamal has published books and commentaries on social and political issues; his first book was "Live from Death Row" (1995).
He was born Wesley Cook in Philadelphia, Pennsylvania, where he grew up. He has a younger brother named William. They attended local public schools.
In 1968, a high school teacher, a Kenyan instructing a class on African cultures, encouraged the students to take African or Arabic names for classroom use; he gave Cook the name "Mumia". According to Abu-Jamal, "Mumia" means "Prince" and was the name of a Kenyan anti-colonial African nationalist who fought against the British before Kenyan independence.
Abu-Jamal has described being "kicked ... into the Black Panther Party" as a teenager of 14, after suffering a beating from "white racists" and a policeman for trying to disrupt a 1968 rally for Independent candidate George Wallace, former governor of Alabama, who was running on a racist platform. He then helped form the Philadelphia branch of the Black Panther Party with Defense Captain Reggie Schell and other Panthers. He was appointed as the chapter's "Lieutenant of Information", responsible for writing information and news communications. In an interview in the early years, Abu-Jamal quoted Mao Zedong, saying that "political power grows out of the barrel of a gun". That same year, he dropped out of Benjamin Franklin High School and began living at the branch's headquarters.
He spent late 1969 in New York City and early 1970 in Oakland, living and working with BPP colleagues in those cities; the party had been founded in Oakland. He was a party member from May 1969 until October 1970. During this period, he was subject to illegal surveillance as part of the Federal Bureau of Investigation's COINTELPRO program, with which the Philadelphia police cooperated. The FBI was working to infiltrate black radical groups and to disrupt them by creating internal dissension.
After leaving the Panthers, Abu-Jamal returned as a student to his former high school. He was suspended for distributing literature calling for "black revolutionary student power". He led unsuccessful protests to change the school name to Malcolm X High, to honor the major African-American leader who had been killed in New York by political opponents.
After attaining his GED, Abu-Jamal studied briefly at Goddard College in rural Vermont. He returned to Philadelphia.
Cook adopted the surname Abu-Jamal ("father of Jamal" in Arabic) after the birth of his first child, son Jamal, on July 18, 1971. He married Jamal's mother Biba in 1973, but they did not stay together long. Their daughter, Lateefa, was born shortly after the wedding. The couple divorced.
In 1977 Abu-Jamal married again, to his second wife, Marilyn (known as "Peachie"). Their son, Mazi, was born in early 1978. By 1981, Abu-Jamal had divorced Peachie and married his third (and current) wife, Wadiya.
By 1975 Abu-Jamal was working in radio newscasting, first at Temple University's WRTI and then at commercial enterprises. In 1975, he was employed at radio station WHAT, and he became host of a weekly feature program at WCAU-FM in 1978. He also worked for brief periods at radio station WPEN. He became active in the local chapter of the Marijuana Users Association of America.
From 1979 to 1981 he worked at National Public Radio (NPR) affiliate WHYY. The management asked him to resign, saying that he did not maintain a sufficiently objective approach in his presentation of news. As a radio journalist, Abu-Jamal was renowned for identifying with and covering the MOVE anarcho-primitivist commune in West Philadelphia's Powelton Village neighborhood. He reported on the 1979–80 trial of certain members (the "MOVE Nine"), who were convicted of the murder of police officer James Ramp. Abu-Jamal had several high-profile interviews, including with Julius Erving, Bob Marley and Alex Haley. He was elected president of the Philadelphia Association of Black Journalists.
Before joining MOVE, Abu-Jamal reported on the organization. When he joined MOVE, he said it was because of his love of the people in the organization. Thinking back on it later, he said he "was probably enraged as well".
In December 1981, Abu-Jamal was working as a taxicab driver in Philadelphia two nights a week to supplement his income. He had been working part-time as a reporter for WDAS, then an African-American-oriented and minority-owned radio station.
At 3:55 am on December 9, 1981, in Philadelphia, close to the intersection of 13th and Locust Streets, Philadelphia Police Department officer Daniel Faulkner conducted a traffic stop on a vehicle belonging to and driven by William Cook, Abu-Jamal's younger brother. Faulkner and Cook became engaged in a physical confrontation. Driving his cab in the vicinity, Abu-Jamal observed the altercation, parked, and ran across the street toward Cook's car. Faulkner was shot from behind and then in the face; Faulkner himself shot Abu-Jamal in the stomach. Faulkner died at the scene from the gunshot wound to his head.
Police arrived and arrested Abu-Jamal, who was found to be wearing a shoulder holster. His revolver, which had five spent cartridges, was beside him. He was taken directly from the scene of the shooting to Thomas Jefferson University Hospital, where he received treatment for his wound. He was next taken to Police Headquarters, where he was charged and held for trial in the first-degree murder of Officer Faulkner.
The prosecution presented four witnesses to the court about the shootings. Robert Chobert, a cab driver who testified he was parked behind Faulkner, identified Abu-Jamal as the shooter. Cynthia White, a prostitute, testified that Abu-Jamal emerged from a nearby parking lot and shot Faulkner. Michael Scanlan, a motorist, testified that from two car lengths away, he saw a man, matching Abu-Jamal's description, run across the street from a parking lot and shoot Faulkner. Albert Magilton, a pedestrian who did not see the shooting, testified to seeing Faulkner pull over Cook's car. As Abu-Jamal started to cross the street toward them, Magilton turned away and did not see what happened next.
The prosecution presented two witnesses from the hospital where Abu-Jamal was treated. Hospital security guard Priscilla Durham and police officer Garry Bell testified that Abu-Jamal said in the hospital, "I shot the motherfucker, and I hope the motherfucker dies."
A .38 caliber Charter Arms revolver, belonging to Abu-Jamal, with five spent cartridges, was retrieved beside him at the scene. He was wearing a shoulder holster. Anthony Paul, the Supervisor of the Philadelphia Police Department's firearms identification unit, testified at trial that the cartridge cases and rifling characteristics of the weapon were consistent with bullet fragments taken from Faulkner's body. Tests to confirm that Abu-Jamal had handled and fired the weapon were not performed. Contact with arresting police and other surfaces at the scene could have compromised the forensic value of such tests.
The defense maintained that Abu-Jamal was innocent, and that the prosecution witnesses were unreliable. The defense presented nine character witnesses, including poet Sonia Sanchez, who testified that Abu-Jamal was "viewed by the black community as a creative, articulate, peaceful, genial man". Another defense witness, Dessie Hightower, testified that he saw a man running along the street shortly after the shooting, although he did not see the shooting itself. His testimony contributed to the development of a "running man theory", based on the possibility that a "running man" may have been the shooter. Veronica Jones also testified for the defense, but she did not testify to having seen another man. Other potential defense witnesses refused to appear in court. Abu-Jamal did not testify in his own defense, nor did his brother, William Cook. Cook had repeatedly told investigators at the crime scene: "I ain't got nothing to do with this!".
After three hours of deliberations, the jury presented a unanimous guilty verdict.
In the sentencing phase of the trial, Abu-Jamal read to the jury from a prepared statement. He was cross-examined about issues relevant to the assessment of his character by Joseph McGill, the prosecuting attorney.
In his statement, Abu-Jamal criticized his attorney as a "legal trained lawyer", who was imposed on him against his will and who "knew he was inadequate to the task and chose to follow the directions of this black-robed conspirator [referring to the judge], Albert Sabo, even if it meant ignoring my directions." He claimed that his rights had been "deceitfully stolen" from him by [Judge] Sabo, particularly focusing on the denial of his request to receive defense assistance from John Africa, who was not an attorney, and being prevented from proceeding "pro se". He quoted remarks of John Africa, and said:
Abu-Jamal was sentenced to death by the unanimous decision of the jury. Amnesty International has objected to the introduction by the prosecution at the time of his sentencing of statements from when he was an activist as a youth. It also protested the politicization of the trial, noting that there was documented recent history in Philadelphia of police abuse and corruption, including fabricated evidence and use of excessive force. Amnesty International concluded "that the proceedings used to convict and sentence Mumia Abu-Jamal to death were in violation of minimum international standards that govern fair trial procedures and the use of the death penalty".
The Supreme Court of Pennsylvania heard and rejected a direct appeal of his conviction on March 6, 1989, and subsequently denied rehearing. The Supreme Court of the United States denied his petition for a writ of "certiorari" on October 1, 1990, and twice denied his petition for rehearing, the last time on June 10, 1991.
On June 1, 1995, Abu-Jamal's death warrant was signed by Pennsylvania Governor Tom Ridge. Its execution was suspended while Abu-Jamal pursued state post-conviction review. At the post-conviction review hearings, new witnesses were called. William "Dales" Singletary testified that he saw the shooting, and that the gunman was the passenger in Cook's car. Singletary's account contained discrepancies which rendered it "not credible" in the opinion of the court.
The six judges of the Supreme Court of Pennsylvania ruled unanimously that all issues raised by Abu-Jamal, including the claim of ineffective assistance of counsel, were without merit. The Supreme Court of the United States denied a petition for "certiorari" against that decision on October 4, 1999, enabling Ridge to sign a second death warrant on October 13, 1999. Its execution was stayed as Abu-Jamal began to seek federal "habeas corpus" review.
In 1999, Arnold Beverly claimed that he and an unnamed assailant, not Mumia Abu-Jamal, shot Daniel Faulkner as part of a contract killing because Faulkner was interfering with graft and payoff to corrupt police. As Abu-Jamal's defense team prepared another appeal in 2001, they were divided over use of the Beverly affidavit. Some thought it usable and others rejected Beverly's story as "not credible".
Private investigator George Newman claimed in 2001 that Chobert had recanted his testimony. Commentators noted that police and news photographs of the crime scene did not show Chobert's taxi, and that Cynthia White, the only witness at the original trial to testify to seeing the taxi, had previously provided crime scene descriptions that omitted it. Cynthia White was declared to be dead by the state of New Jersey in 1992, but Pamela Jenkins claimed that she saw White alive as late as 1997. The Free Mumia Coalition has claimed that White was a police informant and that she falsified her testimony against Abu-Jamal.
Kenneth Pate, who was imprisoned with Abu-Jamal on other charges, has since claimed that his step-sister Priscilla Durham, a hospital security guard, later admitted that she had not heard the "hospital confession" to which she had testified at trial. The hospital doctors said that Abu-Jamal was "on the verge of fainting" when brought in, and that they did not hear any such confession.
In 2008, the Supreme Court of Pennsylvania rejected a further request from Abu-Jamal for a hearing into claims that the trial witnesses perjured themselves, on the grounds that he had waited too long before filing the appeal.
On March 26, 2012, the Supreme Court of Pennsylvania rejected his appeal for a retrial. His defense had asserted, based on a 2009 report by the National Academy of Sciences, that forensic evidence presented by the prosecution and accepted into evidence in the original trial was unreliable. This was reported as Abu-Jamal's last legal appeal.
On April 30, 2018, the Pennsylvania Supreme Court ruled that Abu-Jamal would not be immediately granted another appeal and that the proceedings had to continue until August 30 of that year. The defense argued that former Pennsylvania Supreme Court Chief Justice Ronald D. Castille should have recused himself from the 2012 appeals decision because of his involvement as Philadelphia District Attorney (DA) in the 1989 appeal. Both sides of the 2018 proceedings repeatedly cited a 1990 letter sent by Castille to then-Governor Bob Casey, urging Casey to sign the execution warrants of those convicted of murdering police. This letter, demanding that Casey send "a clear and dramatic message to all cop killers," was claimed to be one of many reasons to suspect Castille's bias in the case. Philadelphia's current DA, Larry Krasner, stated he could not find any document supporting the defense's claim. On August 30, 2018, the proceedings to determine another appeal were once again extended, and a ruling on the matter was delayed for at least 60 more days.
The Free Mumia Coalition published statements by William Cook and his brother Abu-Jamal in the spring of 2001. Cook, who had been stopped by the police officer, had not made any statement before April 29, 2001, and did not testify at his brother's trial. In 2001 he said that he had not seen who had shot Faulkner. Abu-Jamal did not make any public statements about Faulkner's murder until May 4, 2001. In his version of events, he claimed that he was sitting in his cab across the street when he heard shouting, saw a police vehicle, and heard the sound of gunshots. Upon seeing his brother appearing disoriented across the street, Abu-Jamal ran to him from the parking lot and was shot by a police officer.
In 2001, Judge William H. Yohn, Jr. of the United States District Court for the Eastern District of Pennsylvania upheld the conviction, saying that Abu-Jamal did not have the right to a new trial. But he vacated the sentence of death on December 18, 2001, citing irregularities in the penalty phase of the trial and the original process of sentencing. In particular, he found the sentencing-phase jury instructions and verdict form ambiguous and confusing.
Eliot Grossman and Marlene Kamish, attorneys for Abu-Jamal, criticized the ruling on the grounds that it denied the possibility of a "trial de novo," at which they could introduce evidence that their client had been framed. Prosecutors also criticized the ruling. Officer Faulkner's widow Maureen said the judgment would allow Abu-Jamal, whom she described as a "remorseless, hate-filled killer", to "be permitted to enjoy the pleasures that come from simply being alive". Both parties appealed.
On December 6, 2005, the Third Circuit Court of Appeals admitted four issues for appeal of the ruling of the District Court:
The Third Circuit Court heard oral arguments in the appeals on May 17, 2007, at the United States Courthouse in Philadelphia. The appeal panel consisted of Chief Judge Anthony Joseph Scirica, Judge Thomas Ambro, and Judge Robert Cowen. The Commonwealth of Pennsylvania sought to reinstate the sentence of death, on the basis that Yohn's ruling was flawed, as he should have deferred to the Pennsylvania Supreme Court which had already ruled on the issue of sentencing. The prosecution said that the "Batson" claim was invalid because Abu-Jamal made no complaints during the original jury selection.
The resulting jury was racially mixed, with 2 blacks and 10 whites at the time of the unanimous conviction, but defense counsel told the Third Circuit Court that Abu-Jamal did not get a fair trial because the jury was racially biased and misinformed, and the judge was racist. He noted that the prosecution used eleven out of fourteen peremptory challenges to eliminate prospective black jurors. Terri Maurer-Carter, a former Philadelphia court stenographer, stated in a 2001 affidavit that she had overheard Judge Sabo say "Yeah, and I'm going to help them fry the nigger" in the course of a conversation with three people present regarding Abu-Jamal's case. Sabo denied having made any such comment.
On March 27, 2008, the three-judge panel issued a majority 2–1 opinion upholding Yohn's 2001 opinion but rejecting the bias and "Batson" claims, with Judge Ambro dissenting on the "Batson" issue. On July 22, 2008, Abu-Jamal's formal petition seeking reconsideration of the decision by the full Third Circuit panel of 12 judges was denied. On April 6, 2009, the United States Supreme Court refused to hear Abu-Jamal's appeal, allowing his conviction to stand.
On January 19, 2010, the Supreme Court ordered the appeals court to reconsider its decision to rescind the death penalty. The same three-judge panel convened in Philadelphia on November 9, 2010, to hear oral argument. On April 26, 2011, the Third Circuit Court of Appeals reaffirmed its prior decision to vacate the death sentence on the grounds that the jury instructions and verdict form were ambiguous and confusing. The Supreme Court declined to hear the case in October 2011.
On December 7, 2011, District Attorney of Philadelphia R. Seth Williams announced that prosecutors, with the support of the victim's family, would no longer seek the death penalty for Abu-Jamal and would accept a sentence of life imprisonment without parole. This sentence was reaffirmed by the Superior Court of Pennsylvania on July 9, 2013.
After the press conference on the sentence, widow Maureen Faulkner said that she did not want to relive the trauma of another trial. She understood that it would be extremely difficult to present the case against Abu-Jamal again, after the passage of 30 years and the deaths of several key witnesses. She also reiterated her belief that Abu-Jamal would be punished further after death.
In 1991 Abu-Jamal published an essay in the "Yale Law Journal" on the death penalty and his death row experience. In May 1994, Abu-Jamal was engaged by National Public Radio's "All Things Considered" program to deliver a series of monthly three-minute commentaries on crime and punishment. The broadcast plans and commercial arrangement were canceled following condemnations from, among others, the Fraternal Order of Police and U.S. Senator Bob Dole, Republican of Kansas. Abu-Jamal sued NPR for not airing his work, but a federal judge dismissed the suit. His commentaries were later published in May 1995 as part of his first book, "Live from Death Row."
In 1996, he completed a B.A. degree via correspondence classes at Goddard College, which he had attended for a time as a young man. He has been invited as commencement speaker by a number of colleges, and has participated via recordings. In 1999, Abu-Jamal was invited to record a keynote address for the graduating class at Evergreen State College in Washington State. The event was protested by some. In 2000, he recorded a commencement address for Antioch College. The now defunct New College of California School of Law presented him with an honorary degree "for his struggle to resist the death penalty."
On October 5, 2014, he gave the commencement speech at Goddard College, via playback of a recording. As before, the choice of Abu-Jamal was controversial. Ten days later, the Pennsylvania legislature passed an addition to the Crime Victims Act called "Revictimization Relief." The new provision is intended to prevent actions that cause "a temporary or permanent state of mental anguish" to those who have previously been victimized by crime. It was signed by Republican governor Tom Corbett five days later. Commentators suggested that the bill was directed at controlling Abu-Jamal's journalism, book publication, and public speaking, and that it would be challenged on free-speech grounds.
With occasional interruptions due to prison disciplinary actions, Abu-Jamal has for many years been a regular commentator on an online broadcast, sponsored by Prison Radio. He also is published as a regular columnist for "Junge Welt," a Marxist newspaper in Germany. For almost a decade, Abu-Jamal taught introductory courses in Georgist economics by correspondence to other prisoners around the world.
In addition, he has written and published several books: "Live From Death Row" (1995), a diary of life on Pennsylvania's death row; "All Things Censored" (2000), a collection of essays examining issues of crime and punishment; "Death Blossoms: Reflections from a Prisoner of Conscience" (2003), in which he explores religious themes; and "We Want Freedom: A Life in the Black Panther Party" (2004), a history of the Black Panthers that draws on his own experience and research, and discusses the federal government's program known as COINTELPRO, to disrupt black activist organizations.
In 1995, Abu-Jamal was punished with solitary confinement for engaging in entrepreneurship contrary to prison regulations. Subsequent to the airing of the 1996 HBO documentary "Mumia Abu-Jamal: A Case for Reasonable Doubt?", which included footage from visitation interviews conducted with him, the Pennsylvania Department of Corrections banned outsiders from using any recording equipment in state prisons.
In litigation before the U.S. Court of Appeals, in 1998 Abu-Jamal successfully established his right while in prison to write for financial gain. The same litigation also established that the Pennsylvania Department of Corrections had illegally opened his mail in an attempt to establish whether he was earning money by his writing.
When, for a brief time in August 1999, Abu-Jamal began delivering his radio commentaries live on the Pacifica Network's "Democracy Now!" weekday radio newsmagazine, prison staff severed the connecting wires of his telephone from their mounting in mid-performance. He was later allowed to resume his broadcasts, and hundreds of his broadcasts have been aired on Pacifica Radio.
Following the overturning of his death sentence, Abu-Jamal was sentenced to life in prison in December 2011. At the end of January 2012, he was shifted from the isolation of death row into the general prison population at State Correctional Institution – Mahanoy.
On March 30, 2015, he suffered diabetic shock and was subsequently diagnosed with active hepatitis C. In August 2015, his attorneys filed suit in the U.S. District Court for the Middle District of Pennsylvania, alleging that he had not received appropriate medical care for his serious health conditions.
Labor unions, politicians, advocates, educators, the NAACP Legal Defense and Educational Fund, and human rights advocacy organizations such as Human Rights Watch and Amnesty International have expressed concern about the impartiality of the trial of Abu-Jamal. Amnesty International neither takes a position on the guilt or innocence of Abu-Jamal nor classifies him as a political prisoner.
The family of Daniel Faulkner, the Commonwealth of Pennsylvania, the City of Philadelphia, politicians, and the Fraternal Order of Police have continued to support the original trial and sentencing of the journalist. In August 1999, the Fraternal Order of Police called for an economic boycott against all individuals and organizations that support Abu-Jamal.
Partly based on his own writing, Abu-Jamal and his cause have become widely known internationally, and other groups have classified him as a political prisoner. About 25 cities, including Montreal, Palermo, and Paris, have made him an honorary citizen.
In 2001, he received the sixth biennial Erich Mühsam Prize, named after an anarcho-communist essayist, which recognizes activism in line with that of its namesake. In October 2002, he was made an honorary member of the German political organization Society of People Persecuted by the Nazi Regime – Federation of Anti-Fascists (VVN-BdA).
On April 29, 2006, a newly paved road in the Parisian suburb of Saint-Denis was named Rue Mumia Abu-Jamal in his honor. In protest of the street-naming, U.S. Congressman Michael Fitzpatrick and Senator Rick Santorum, both members of the Republican Party of Pennsylvania, introduced resolutions in both Houses of Congress condemning the decision. The House of Representatives voted 368–31 in favor of Fitzpatrick's resolution. In December 2006, the 25th anniversary of the murder, the executive committee of the Republican Party for the 59th Ward of the City of Philadelphia—covering approximately Germantown, Philadelphia—filed two criminal complaints in the French legal system against the cities of Paris and Saint-Denis, accusing the municipalities of "glorifying" Abu-Jamal and alleging the offense of "apology or denial of crime."
In 2007, the widow of Officer Faulkner co-authored a book with Philadelphia radio journalist Michael Smerconish titled "Murdered by Mumia: A Life Sentence of Pain, Loss, and Injustice." The book was part memoir of Faulkner's widow, and part discussion in which they chronicled Abu-Jamal's trial and discussed evidence for his conviction. They also discussed support for the death penalty.
In early 2014, President Barack Obama nominated Debo Adegbile, a former lawyer for the NAACP Legal Defense Fund, to head the civil rights division of the Justice Department. Adegbile had worked on Abu-Jamal's case, and his nomination was rejected by the U.S. Senate on a bipartisan basis because of that work.
On April 10, 2015, Marylin Zuniga, a teacher at Forest Street Elementary School in Orange, New Jersey, was suspended without pay after asking her students to write cards to Abu-Jamal, who was ill in prison due to complications from diabetes, without approval from the school or parents. Some parents and police leaders denounced her actions. On the other hand, community members, parents, teachers, and professors expressed their support and condemned Zuniga's suspension. Scholars and educators nationwide, including Noam Chomsky, Chris Hedges, and Cornel West, signed a letter calling for her immediate reinstatement. On May 13, 2015, the Orange Preparatory Academy board voted to dismiss Zuniga after hearing from her and several of her supporters.
|
https://en.wikipedia.org/wiki?curid=20057
|
Multiplicative function
In number theory, a multiplicative function is an arithmetic function f(n) of a positive integer n with the property that f(1) = 1 and, whenever a and b are coprime,
f(ab) = f(a) f(b).
An arithmetic function "f"("n") is said to be completely multiplicative (or totally multiplicative) if "f"(1) = 1 and "f"("ab") = "f"("a")"f"("b") holds "for all" positive integers "a" and "b", even when they are not coprime.
Some multiplicative functions are defined to make formulas easier to write:
Other examples of multiplicative functions include many functions of importance in number theory, such as:
An example of a non-multiplicative function is the arithmetic function r_2(n), the number of representations of n as a sum of squares of two integers, positive, negative, or zero, where in counting the number of ways, reversal of order is allowed. For example:
1 = 1^2 + 0^2 = (−1)^2 + 0^2 = 0^2 + 1^2 = 0^2 + (−1)^2,
and therefore r_2(1) = 4 ≠ 1. This shows that the function is not multiplicative. However, r_2(n)/4 is multiplicative.
In the On-Line Encyclopedia of Integer Sequences, sequences of values of a multiplicative function have the keyword "mult".
See arithmetic function for some other examples of non-multiplicative functions.
A multiplicative function is completely determined by its values at the powers of prime numbers, a consequence of the fundamental theorem of arithmetic. Thus, if n is a product of powers of distinct primes, say n = p^a q^b ..., then
f(n) = f(p^a) f(q^b) ...
This property of multiplicative functions significantly reduces the need for computation, as in the following examples for n = 144 = 2^4 · 3^2:
d(144) = d(2^4) · d(3^2) = 5 · 3 = 15, where d(n) is the number of divisors of n.
Similarly, we have:
σ(144) = σ(2^4) · σ(3^2) = 31 · 13 = 403, where σ(n) is the sum of the divisors of n.
In general, if "f"("n") is a multiplicative function and "a", "b" are any two positive integers, then
Every completely multiplicative function is a homomorphism of monoids and is completely determined by its restriction to the prime numbers.
If "f" and "g" are two multiplicative functions, one defines a new multiplicative function "f" * "g", the "Dirichlet convolution" of "f" and "g", by
where the sum extends over all positive divisors "d" of "n".
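In code, the convolution is a direct sum over divisors. A minimal Python sketch (the function names are our own; this is illustrative, not an efficient implementation):

```python
def dirichlet_convolution(f, g):
    """Return the arithmetic function f * g."""
    def h(n):
        return sum(f(d) * g(n // d) for d in range(1, n + 1) if n % d == 0)
    return h

one = lambda n: 1        # the constant function 1(n) = 1
identity = lambda n: n   # Id(n) = n

d = dirichlet_convolution(one, one)           # divisor-count function
sigma = dirichlet_convolution(one, identity)  # divisor-sum function
print(d(144), sigma(144))  # 15 403
```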
With this operation, the set of all multiplicative functions turns into an abelian group; the identity element is "ε". Convolution is commutative, associative, and distributive over addition.
Relations among the multiplicative functions discussed above include:
The Dirichlet convolution can be defined for general arithmetic functions, and yields a ring structure, the Dirichlet ring.
The Dirichlet convolution of two multiplicative functions is again multiplicative. A proof of this fact is given by the following expansion for relatively prime a and b:
(f * g)(ab) = Σ_{d | ab} f(d) g(ab/d)
= Σ_{d1 | a} Σ_{d2 | b} f(d1 d2) g(ab/(d1 d2))
= Σ_{d1 | a} f(d1) g(a/d1) · Σ_{d2 | b} f(d2) g(b/d2)
= (f * g)(a) · (f * g)(b),
using the fact that every divisor d of ab factors uniquely as d = d1 d2 with d1 | a and d2 | b, and that f and g are multiplicative on the coprime pairs (d1, d2) and (a/d1, b/d2).
More examples are shown in the article on Dirichlet series.
Let "A" = , the polynomial ring over the finite field with "q" elements. "A" is a principal ideal domain and therefore "A" is a unique factorization domain.
A complex-valued function λ on A is called multiplicative if λ(fg) = λ(f) λ(g) whenever f and g are relatively prime.
Let "h" be a polynomial arithmetic function (i.e. a function on set of monic polynomials over "A"). Its corresponding Dirichlet series is defined to be
where for formula_20 set formula_21 if formula_22 and formula_23 otherwise.
The polynomial zeta function is then
ζ_A(s) = Σ_{f monic} |f|^(−s).
Similar to the situation in the integers, every Dirichlet series of a multiplicative function h has a product representation (Euler product):
D_h(s) = Π_P ( Σ_{n ≥ 0} h(P^n) |P|^(−sn) ),
where the product runs over all monic irreducible polynomials P. For example, the product representation of the zeta function is as for the integers:
ζ_A(s) = Π_P (1 − |P|^(−s))^(−1).
Unlike the classical zeta function, ζ_A(s) is a simple rational function. Since there are exactly q^n monic polynomials of degree n,
ζ_A(s) = Σ_{n ≥ 0} q^n q^(−sn) = 1/(1 − q^(1−s)).
In a similar way, If "f" and "g" are two polynomial arithmetic functions, one defines "f" * "g", the "Dirichlet convolution" of "f" and "g", by
where the sum is over all monic divisors "d" of "m", or equivalently over all pairs ("a", "b") of monic polynomials whose product is "m". The identity formula_30 still holds.
|
https://en.wikipedia.org/wiki?curid=20059
|
MPEG-2
MPEG-2 (a.k.a. H.222/H.262 as defined by the ITU) is a standard for "the generic coding of moving pictures and associated audio information". It describes a combination of lossy video compression and lossy audio data compression methods, which permit storage and transmission of movies using currently available storage media and transmission bandwidth. While MPEG-2 is not as efficient as newer standards such as H.264/AVC and H.265/HEVC, backwards compatibility with existing hardware and software means it is still widely used, for example in over-the-air digital television broadcasting and in the DVD-Video standard.
MPEG-2 is widely used as the format of digital television signals that are broadcast by terrestrial (over-the-air), cable, and direct broadcast satellite TV systems. It also specifies the format of movies and other programs that are distributed on DVD and similar discs. TV stations, TV receivers, DVD players, and other equipment are often designed to this standard. MPEG-2 was the second of several standards developed by the Moving Picture Experts Group (MPEG) and is an international standard (ISO/IEC 13818). Parts 1 and 2 of MPEG-2 were developed in a collaboration with ITU-T, and they have a respective catalog number in the ITU-T Recommendation Series.
While MPEG-2 is the core of most digital television and DVD formats, it does not completely specify them. Regional institutions can adapt it to their needs by restricting and augmenting aspects of the standard. See Video profiles and levels.
MPEG-2 includes a Systems section, part 1, that defines two distinct but related container formats. One is the "transport stream", a data packet format designed to carry one data packet in four ATM data packets, used for streaming digital video and audio over fixed or mobile transmission mediums where the beginning and the end of the stream may not be identified, such as radio frequency, cable, and linear recording mediums; examples include ATSC/DVB/ISDB/SBTVD broadcasting and HDV recording on tape. The other is the "program stream", an extended version of the container format with less overhead than the transport stream; it is designed for random-access storage mediums such as hard disk drives, optical discs, and flash memory.
"Transport stream" file formats include M2TS, which is used on Blu-ray discs, AVCHD on re-writable DVDs and HDV on compact flash cards. "Program stream" files include VOB on DVDs and Enhanced VOB on the short lived HD DVD. The standard MPEG-2 "transport stream" contains packets of 188 bytes. M2TS prepends each packet with 4 bytes containing a 2-bit copy permission indicator and 30-bit timestamp.
MPEG-2 Systems is formally known as ISO/IEC 13818-1 and as ITU-T Rec. H.222.0. ISO authorized the "SMPTE Registration Authority, LLC" as the registration authority for MPEG-2 format identifiers. The registration descriptor of MPEG-2 transport is provided by ISO/IEC 13818-1 in order to enable users of the standard to unambiguously carry data when its format is not necessarily a recognized international standard. This provision will permit the MPEG-2 transport standard to carry all types of data while providing for a method of unambiguous identification of the characteristics of the underlying private data.
The Video section, part 2 of MPEG-2, is similar to the previous MPEG-1 standard, but also provides support for interlaced video, the format used by analog broadcast TV systems. MPEG-2 video is not optimized for low bit-rates, especially less than 1 Mbit/s at standard definition resolutions. All standards-compliant MPEG-2 Video decoders are fully capable of playing back MPEG-1 Video streams conforming to the Constrained Parameters Bitstream syntax. MPEG-2/Video is formally known as ISO/IEC 13818-2 and as ITU-T Rec. H.262.
With some enhancements, MPEG-2 Video and Systems are also used in some HDTV transmission systems, and is the standard format for over-the-air ATSC digital television.
MPEG-2 introduces new audio encoding methods compared to MPEG-1:
The MPEG-2 Audio section, defined in Part 3 (ISO/IEC 13818-3) of the standard, enhances MPEG-1's audio by allowing the coding of audio programs with more than two channels, up to 5.1 multichannel. This method is backwards-compatible (also known as MPEG-2 BC), allowing MPEG-1 audio decoders to decode the two main stereo components of the presentation. MPEG-2 part 3 also defined additional bit rates and sample rates for MPEG-1 Audio Layer I, II and III.
MPEG-2 BC (backward compatible with MPEG-1 audio formats)
Part 7 (ISO/IEC 13818-7) of the MPEG-2 standard specifies a rather different, non-backwards-compatible audio format (also known as MPEG-2 NBC). Part 7 is referred to as MPEG-2 AAC. AAC is more efficient than the previous MPEG audio standards, and is in some ways less complicated than its predecessor, MPEG-1 Audio Layer 3, in that it does not have the hybrid filter bank. It supports from 1 to 48 channels at sampling rates of 8 to 96 kHz, with multichannel, multilingual, and multiprogram capabilities. Advanced Audio Coding is also defined in Part 3 of the MPEG-4 standard.
MPEG-2 NBC (Non-Backward Compatible)
MPEG-2 standards are published as parts of ISO/IEC 13818. Each part covers a certain aspect of the whole specification.
MPEG-2 evolved out of the shortcomings of MPEG-1.
MPEG-1's known weaknesses:
Sakae Okubo of NTT was the ITU-T coordinator for developing the H.262/MPEG-2 Part 2 video coding standard and the requirements chairman in MPEG for the MPEG-2 set of standards. The majority of patents underlying MPEG-2 technology are owned by three companies: Sony (311 patents), Thomson (198 patents) and Mitsubishi Electric (119 patents). Hyundai Electronics (now SK Hynix) developed the first MPEG-2 SAVI (System/Audio/Video) decoder in 1995.
.mpg, .mpeg, .m2v, .mp2, and .mp3 are some of the filename extensions used for MPEG-1 or MPEG-2 audio and video file formats.
The DVD-Video standard uses MPEG-2 video, but imposes some restrictions:
HDV is a format for recording and playback of high-definition MPEG-2 video on a DV cassette tape.
MOD and TOD are recording formats for use in consumer digital file-based camcorders.
XDCAM is a professional file-based video recording format.
Application-specific restrictions on MPEG-2 video in the DVB standard:
Allowed resolutions for SDTV:
For HDTV:
The ATSC A/53 standard, used in the United States, uses MPEG-2 video at the Main Profile @ High Level (MP@HL), with additional restrictions such as a maximum bitrate of 19.39 Mbit/s for broadcast television and 38.8 Mbit/s for cable television, 4:2:0 chroma subsampling, and mandatory colorimetry information.
ATSC allows the following video resolutions, aspect ratios, and frame/field rates:
ATSC standard A/63 defines additional resolutions and aspect rates for 50 Hz (PAL) signal.
The ATSC specification and MPEG-2 allow the use of progressive frames, even within an interlaced video sequence. For example, a station transmitting a 1080i60 video sequence can use a coding method where those 60 fields are coded with 24 progressive frames, and metadata instructs the decoder to interlace them and perform 3:2 pulldown before display. This allows broadcasters to switch between 60 Hz interlaced (news, soap operas) and 24 Hz progressive (prime-time) content without ending the MPEG-2 sequence and introducing several seconds of delay as the TV switches formats. This is the reason why the 1080p30 and 1080p24 sequences allowed by the ATSC specification are not used in practice.
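To make the frame/field arithmetic concrete, here is a small Python sketch (our own illustration; real encoders signal the cadence with flags such as repeat_first_field rather than duplicating data) of the 3:2 pulldown that maps 24 progressive frames onto 60 interlaced fields per second:

```python
def three_two_pulldown(frames):
    """Map 24 progressive frames to 60 fields using the 3:2 cadence.

    Frames alternately contribute 3 and 2 fields:
    (3 + 2) repeated 12 times over 24 frames = 60 fields.
    """
    fields = []
    for i, frame in enumerate(frames):
        count = 3 if i % 2 == 0 else 2
        for _ in range(count):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = three_two_pulldown([f"F{i}" for i in range(24)])
print(len(fields))  # 60 fields, i.e. 60i output from 24p input
```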
The 1080-line formats are encoded with 1920 × 1088 pixel luma matrices and 960 × 540 chroma matrices, but the last 8 lines are discarded by the MPEG-2 decoding and display process.
ATSC A/72 is the newest revision of ATSC standards for digital television, which allows the use of H.264/AVC video coding format and 1080p60 signal.
MPEG-2 audio was a contender for the ATSC standard during the DTV "Grand Alliance" shootout, but lost out to Dolby AC-3.
Technical features of MPEG-2 in ATSC are also valid for ISDB-T, except that the main TS aggregates a second program for mobile devices, compressed in MPEG-4 H.264 AVC for video and AAC-LC for audio, mainly known as 1seg.
MPEG-2 is one of the three video coding formats supported by Blu-ray Disc. Early Blu-ray releases typically used MPEG-2 video, but recent releases are almost always in H.264 or occasionally VC-1. Only MPEG-2 video (MPEG-2 Part 2) is supported; Blu-ray does not support MPEG-2 audio (Parts 3 and 7). Additionally, the container format used on Blu-ray discs is an MPEG-2 transport stream, regardless of which audio and video codecs are used.
As of February 14, 2020, only Malaysia still has active patents covering MPEG-2. Patents in the rest of the world have expired, with the last US patent expiring on February 23, 2018.
MPEG LA, a private patent licensing organization, has acquired rights from over 20 corporations and one university to license a patent pool of approximately 640 worldwide patents, which it claims are "essential" to the use of MPEG-2 technology. The patent holders include Sony, Mitsubishi Electric, Fujitsu, Panasonic, Scientific Atlanta, Columbia University, Philips, General Instrument, Canon, Hitachi, JVC Kenwood, LG Electronics, NTT, Samsung, Sanyo, Sharp and Toshiba. Where software patentability is upheld and patents have not expired, the use of MPEG-2 requires the payment of licensing fees to the patent holders. Other patents are licensed by Audio MPEG, Inc. The development of the standard itself took less time than the patent negotiations. Patent pooling between essential and peripheral patent holders in the MPEG-2 pool was the subject of a study by the University of Wisconsin.
According to the MPEG-2 licensing agreement, any use of MPEG-2 technology in countries with active patents is subject to royalties. MPEG-2 encoders and decoders are subject to a royalty of US$0.35 per unit. Also, any packaged medium (DVDs/data streams) is subject to license fees according to the length of the recording/broadcast. The royalties were previously priced higher but were lowered at several points, most recently on January 1, 2018.
An earlier criticism of the MPEG-2 patent pool was that, even though the number of patents would decrease from 1,048 to 416 by June 2013, the license fee had not decreased in line with the expiration rate of MPEG-2 patents.
The following organizations have held patents for MPEG-2, as listed at MPEG LA.
The last United States patent expired on February 23, 2018.
|
https://en.wikipedia.org/wiki?curid=20060
|
MPEG-3
MPEG-3 is the designation for a group of audio and video coding standards agreed upon by the Moving Picture Experts Group (MPEG) designed to handle HDTV signals at 1080p in the range of 20 to 40 megabits per second. MPEG-3 was launched as an effort to address the need of an HDTV standard while work on MPEG-2 was underway, but it was soon discovered that MPEG-2, at high data rates, would accommodate HDTV. Thus, in 1992 HDTV was included as a separate profile in the MPEG-2 standard and MPEG-3 was rolled into MPEG-2.
|
https://en.wikipedia.org/wiki?curid=20061
|
Meditation
Meditation is a practice where an individual uses a technique – such as mindfulness, or focusing the mind on a particular object, thought, or activity – to train attention and awareness, and achieve a mentally clear and emotionally calm and stable state. Scholars have found meditation difficult to define, as practices vary both between traditions and within them.
Meditation has been practiced since antiquity in numerous religious traditions, often as part of the path towards enlightenment and self-realization. The earliest records of meditation ("dhyana") come from the Hindu traditions of Vedantism, around 1500 BCE. Since the 19th century, Asian meditative techniques have spread to other cultures, where they have also found application in non-spiritual contexts, such as business and health.
Meditation may be used with the aim of reducing stress, anxiety, depression, and pain, and increasing peace, perception, self-concept, and well-being. Meditation is under research to define its possible health (psychological, neurological, and cardiovascular) and other effects.
The English "meditation" is derived from Old French "meditacioun", in turn from Latin "meditatio" from a verb "meditari", meaning "to think, contemplate, devise, ponder". The use of the term "meditatio" as part of a formal, stepwise process of meditation goes back to the 12th century monk Guigo II.
Apart from its historical usage, the term "meditation" was introduced as a translation for Eastern spiritual practices, referred to as "dhyāna" in Hinduism and Buddhism and which comes from the Sanskrit root "dhyai", meaning to contemplate or meditate. The term "meditation" in English may also refer to practices from Islamic Sufism, or other traditions such as Jewish Kabbalah and Christian Hesychasm.
Meditation has proven difficult to define as it covers a wide range of dissimilar practices in different traditions. In popular usage, the word "meditation" and the phrase "meditative practice" are often used imprecisely to designate practices found across many cultures. These can include almost anything that is claimed to train the attention of mind or to teach calm or compassion. There remains no definition of necessary and sufficient criteria for meditation that has achieved universal or widespread acceptance within the modern scientific community. In 1971, Claudio Naranjo noted that "The word 'meditation' has been used to designate a variety of practices that differ enough from one another so that we may find trouble in defining what "meditation" is." A 2009 study noted a "persistent lack of consensus in the literature" and a "seeming intractability of defining meditation".
Dictionaries give both the original Latin meaning of "think[ing] deeply about (something)", as well as the popular usage of "focusing one's mind for a period of time", "the act of giving your attention to only one thing, either as a religious activity or as a way of becoming calm and relaxed", and "to engage in mental exercise (such as concentrating on one's breathing or repetition of a mantra) for the purpose of reaching a heightened level of spiritual awareness."
In modern psychological research, meditation has been defined and characterized in a variety of ways. Many of these emphasize the role of attention and characterize the practice of meditation as attempts to get beyond the reflexive, "discursive thinking" or "logic" mind to achieve a deeper, more devout, or more relaxed state.
Bond et al. (2009) identified criteria for defining a practice as meditation "for use in a comprehensive systematic review of the therapeutic use of meditation", using "a 5-round Delphi study with a panel of 7 experts in meditation research" who were also trained in diverse but empirically highly studied (Eastern-derived or clinical) forms of meditation:
Several other definitions of meditation have been used by influential modern reviews of research on meditation across multiple traditions:
Some of the difficulty in precisely defining meditation has been in recognizing the particularities of the many various traditions; and theories and practice can differ within a tradition. Taylor noted that even within a faith such as "Hindu" or "Buddhist", schools and individual teachers may teach distinct types of meditation.
Ornstein noted that "Most techniques of meditation do not exist as solitary practices but are only artificially separable from an entire system of practice and belief." For instance, while monks meditate as part of their everyday lives, they also engage the codified rules and live together in monasteries in specific cultural settings that go along with their meditative practices.
In the West, meditation techniques have sometimes been thought of in two broad categories: focused (or concentrative) meditation and open monitoring (or mindfulness) meditation.
"Direction of mental attention... A practitioner can focus intensively on one particular object (so-called "concentrative meditation"), on all mental events that enter the field of awareness (so-called "mindfulness meditation"), or both specific focal points and the field of awareness."
Focused methods include paying attention to the breath, to an idea or feeling (such as mettā (loving-kindness)), to a kōan, or to a mantra (such as in transcendental meditation), and single point meditation.
Open monitoring methods include mindfulness, shikantaza and other awareness states.
Practices using both methods include vipassana (which uses anapanasati as a preparation), and samatha (calm-abiding).
In "No thought" methods, ""the practitioner is fully alert, aware, and in control of their faculties but does not experience any unwanted thought activity."" This is in contrast to the common meditative approaches of being detached from, and non-judgmental of, thoughts, but not of aiming for thoughts to cease. In the meditation practice of the Sahaja yoga spiritual movement, the focus is on thoughts ceasing. Clear light yoga also aims at a state of no mental content, as does the no thought ("wu nian") state taught by Huineng, and the teaching of Yaoshan Weiyan.
One proposal is that transcendental meditation and possibly other techniques be grouped as an "automatic self-transcending" set of techniques. Other typologies include dividing meditation into concentrative, generative, receptive and reflective practices.
The Transcendental Meditation technique recommends practice of 20 minutes twice per day. Some techniques suggest less time, especially when starting meditation, and Richard Davidson has quoted research saying benefits can be achieved with a practice of only 8 minutes per day. Some meditators practice for much longer, particularly when on a course or retreat. Some meditators find practice best in the hours before dawn.
Asanas and positions such as the full-lotus, half-lotus, Burmese, Seiza, and kneeling positions are popular in Buddhism, Jainism and Hinduism, although other postures such as sitting, supine (lying), and standing are also used. Meditation is also sometimes done while walking, known as kinhin, while doing a simple task mindfully, known as samu, or while lying down, known as savasana.
Some religions have traditions of using prayer beads as tools in devotional meditation. Most prayer beads and Christian rosaries consist of pearls or beads linked together by a thread. The Roman Catholic rosary is a string of beads containing five sets with ten small beads. The Hindu japa mala has 108 beads (the figure 108 in itself having spiritual significance), as do those used in Jainism and in Buddhist prayer beads. Each bead is counted once as a person recites a mantra until the person has gone all the way around the mala. The Muslim misbaha has 99 beads.
The Buddhist literature has many stories of Enlightenment being attained through disciples being struck by their masters. According to T. Griffith Foulk, the encouragement stick was an integral part of the Zen practice:
Richard Davidson has expressed the view that having a narrative can help maintenance of daily practice. For instance he himself prostrates to the teachings, and meditates "not primarily for my benefit, but for the benefit of others".
There are many schools and styles of meditation within Hinduism. In pre-modern and traditional Hinduism, "Yoga" and "Dhyana" are practised to realize union of one's eternal self or soul, one's ātman. In Advaita Vedanta this is equated with the omnipresent and non-dual Brahman. In the dualistic Yoga school and Samkhya, the Self is called Purusha, a pure consciousness separate from matter. Depending on the tradition, the liberative event is named moksha, vimukti or kaivalya.
The earliest clear references to meditation in Hindu literature are in the middle Upanishads and the Mahabharata (including the Bhagavad Gita). According to Gavin Flood, the earlier Brihadaranyaka Upanishad is describing meditation when it states that "having become calm and concentrated, one perceives the self ("ātman") within oneself".
One of the most influential texts of classical Hindu Yoga is Patañjali's Yoga sutras (c. 400 CE), a text associated with Yoga and Samkhya, which outlines eight limbs leading to kaivalya ("aloneness"). These are ethical discipline (yamas), rules (niyamas), physical postures (āsanas), breath control (prāṇāyama), withdrawal from the senses (pratyāhāra), one-pointedness of mind (dhāraṇā), meditation (dhyāna), and finally samādhi.
Later developments in Hindu meditation include the compilation of Hatha Yoga (forceful yoga) compendiums like the Hatha Yoga Pradipika, the development of Bhakti yoga as a major form of meditation and Tantra. Another important Hindu yoga text is the Yoga Yajnavalkya, which makes use of Hatha Yoga and Vedanta Philosophy.
In Jainism, meditation and the system of spiritual practices are referred to as the path to salvation. It has three parts, called the "Ratnatraya" ("Three Jewels"): right perception and faith, right knowledge, and right conduct. Meditation in Jainism aims at realizing the self, attaining salvation, and taking the soul to complete freedom. It aims to reach and to remain in the pure state of soul which is believed to be pure consciousness, beyond any attachment or aversion. The practitioner strives to be just a knower-seer (Gyata-Drashta). Jain meditation can be broadly categorized into "Dharmya Dhyana" and "Shukla Dhyana".
Jainism uses meditation techniques such as "pindāstha-dhyāna, padāstha-dhyāna, rūpāstha-dhyāna, rūpātita-dhyāna, and savīrya-dhyāna". In "padāstha dhyāna" one focuses on a mantra. A mantra could be either a combination of core letters or words on deity or themes. There is a rich tradition of Mantra in Jainism. All Jain followers irrespective of their sect, whether Digambara or Svetambara, practice mantra. Mantra chanting is an important part of daily lives of Jain monks and followers. Mantra chanting can be done either loudly or silently in mind.
Contemplation is a very old and important meditation technique. The practitioner meditates deeply on subtle facts. In "agnya vichāya", one contemplates seven facts – life and non-life, the inflow, bondage, stoppage and removal of "karmas", and the final accomplishment of liberation. In "apaya vichāya", one contemplates the incorrect insights one indulges in, which eventually develops right insight. In "vipaka vichāya", one reflects on the eight causes or basic types of "karma". In "sansathan vichāya", one thinks about the vastness of the universe and the loneliness of the soul.
Buddhist meditation refers to the meditative practices associated with the religion and philosophy of Buddhism. Core meditation techniques have been preserved in ancient Buddhist texts and have proliferated and diversified through teacher-student transmissions. Buddhists pursue meditation as part of the path toward awakening and nirvana. The closest words for meditation in the classical languages of Buddhism are "bhāvanā", "jhāna"/"dhyāna", and "vipassana".
Buddhist meditation techniques have become popular in the wider world, with many non-Buddhists taking them up. There is considerable homogeneity across meditative practices – such as breath meditation and various recollections ("anussati") – across Buddhist schools, as well as significant diversity. In the Theravāda tradition, there are over fifty methods for developing mindfulness and forty for developing concentration, while in the Tibetan tradition there are thousands of visualization meditations. Most classical and contemporary Buddhist meditation guides are school-specific.
According to the Theravada and Sarvastivada commentatorial traditions, and the Tibetan tradition, the Buddha identified two paramount mental qualities that arise from wholesome meditative practice:
Through the meditative development of serenity, one is able to weaken the obscuring hindrances and bring the mind to a collected, pliant and still state (samadhi). This quality of mind then supports the development of insight and wisdom (Prajñā) which is the quality of mind that can "clearly see" ("vi-passana") the nature of phenomena. What exactly is to be seen varies within the Buddhist traditions. In Theravada, all phenomena are to be seen as impermanent, suffering, not-self and empty. When this happens, one develops dispassion ("viraga") for all phenomena, including all negative qualities and hindrances and lets them go. It is through the release of the hindrances and ending of craving through the meditative development of insight that one gains liberation.
In the modern era, Buddhist meditation saw increasing popularity due to the influence of Buddhist modernism on Asian Buddhism, and western lay interest in Zen and the Vipassana movement. The spread of Buddhist meditation to the Western world paralleled the spread of Buddhism in the West. The modernized concept of mindfulness (based on the Buddhist term "sati") and related meditative practices have in turn led to mindfulness based therapies.
In Sikhism, simran (meditation) and good deeds are both necessary to achieve the devotee's spiritual goals; without good deeds meditation is futile. When Sikhs meditate, they aim to feel God's presence and emerge in the divine light. It is only God's divine will or order that allows a devotee to desire to begin to meditate. Nām Japnā involves focusing one's attention on the names or great attributes of God.
Taoist meditation has developed techniques including concentration, visualization, "qi" cultivation, contemplation, and mindfulness meditations in its long history. Traditional Daoist meditative practices were influenced by Chinese Buddhism from around the 5th century, and influenced Traditional Chinese medicine and the Chinese martial arts.
Livia Kohn distinguishes three basic types of Taoist meditation: "concentrative", "insight", and "visualization". "Ding" 定 (literally means "decide; settle; stabilize") refers to "deep concentration", "intent contemplation", or "perfect absorption". "Guan" 觀 (lit. "watch; observe; view") meditation seeks to merge and attain unity with the Dao. It was developed by Tang Dynasty (618–907) Taoist masters based upon the "Tiantai" Buddhist practice of "Vipassanā" "insight" or "wisdom" meditation. "Cun" 存 (lit. "exist; be present; survive") has a sense of "to cause to exist; to make present" in the meditation techniques popularized by the Taoist Shangqing and Lingbao Schools. A meditator visualizes or actualizes solar and lunar essences, lights, and deities within their body, which supposedly results in health and longevity, even "xian" 仙/仚/僊, "immortality".
The (late 4th century BCE) "Guanzi" essay "Neiye" "Inward training" is the oldest received writing on the subject of "qi" cultivation and breath-control meditation techniques. For instance, "When you enlarge your mind and let go of it, when you relax your vital breath and expand it, when your body is calm and unmoving: And you can maintain the One and discard the myriad disturbances. ... This is called "revolving the vital breath": Your thoughts and deeds seem heavenly."
The (c. 3rd century BCE) Taoist "Zhuangzi" records "zuowang" or "sitting forgetting" meditation. Confucius asked his disciple Yan Hui to explain what "sit and forget" means: "I slough off my limbs and trunk, dim my intelligence, depart from my form, leave knowledge behind, and become identical with the Transformational Thoroughfare."
Taoist meditation practices are central to Chinese martial arts (and some Japanese martial arts), especially the "qi"-related "neijia" "internal martial arts". Some well-known examples are "daoyin" "guiding and pulling", "qigong" "life-energy exercises", "neigong" "internal exercises", "neidan" "internal alchemy", and "taijiquan" "great ultimate boxing", which is thought of as moving meditation. One common explanation contrasts "movement in stillness" referring to energetic visualization of "qi" circulation in "qigong" and "zuochan" "seated meditation", versus "stillness in movement" referring to a state of meditative calm in "taijiquan" forms.
Judaism has made use of meditative practices for thousands of years. For instance, in the Torah, the patriarch Isaac is described as going ""לשוח"" ("lasuach") in the field – a term understood by all commentators as some type of meditative practice (Genesis 24:63). Similarly, there are indications throughout the Tanakh (the Hebrew Bible) that the prophets meditated. In the Old Testament, there are two Hebrew words for meditation: "hāgâ" (), "to sigh" or "murmur", but also "to meditate", and "sîḥâ" (), "to muse", or "rehearse in one's mind".
Classical Jewish texts espouse a wide range of meditative practices, often associated with the cultivation of "kavanah" or intention. The first layer of rabbinic law, the Mishnah, describes ancient sages "waiting" for an hour before their prayers "in order to direct their hearts to the Omnipresent One" (Mishnah Berakhot 5:1). Other early rabbinic texts include instructions for visualizing the Divine Presence (B. Talmud Sanhedrin 22a) and breathing with conscious gratitude for every breath (Genesis Rabba 14:9).
One of the best known types of meditation in early Jewish mysticism was the work of the Merkabah, from the root /R-K-B/ meaning "chariot" (of God). Some meditative traditions have been encouraged in Kabbalah, and some Jews have described Kabbalah as an inherently meditative field of study. Kabbalistic meditation often involves the mental visualization of the supernal realms. Aryeh Kaplan has argued that the ultimate purpose of Kabbalistic meditation is to understand and cleave to the Divine.
Meditation has been of interest to a wide variety of modern Jews. In modern Jewish practice, one of the best known meditative practices is called ""hitbodedut"" ("התבודדות", alternatively transliterated as "hisbodedus"), and is explained in Kabbalistic, Hasidic, and Mussar writings, especially the Hasidic method of Rabbi Nachman of Breslav. The word derives from the Hebrew word "boded" (בודד), meaning the state of being alone. Another Hasidic system is the Habad method of "hisbonenus", related to the Sephirah of "Binah", Hebrew for understanding. This practice is the analytical reflective process of making oneself understand a mystical concept well, that follows and internalises its study in Hasidic writings. The Musar Movement, founded by Rabbi Israel Salanter in the middle of the nineteenth-century, emphasized meditative practices of introspection and visualization that could help to improve moral character. Conservative rabbi Alan Lew has emphasized meditation playing an important role in the process of "teshuvah" (repentance). Jewish Buddhists have adopted Buddhist styles of meditation.
Christian meditation is a term for a form of prayer in which a structured attempt is made to get in touch with and deliberately reflect upon the revelations of God. The word meditation comes from the Latin word "meditari", which means to concentrate. Christian meditation is the process of deliberately focusing on specific thoughts (e.g. a biblical scene involving Jesus and the Virgin Mary) and reflecting on their meaning in the context of the love of God. Christian meditation is sometimes taken to mean the middle level in a broad three stage characterization of prayer: it then involves more reflection than first level vocal prayer, but is more structured than the multiple layers of contemplation in Christianity.
The Rosary is a devotion for the meditation of the mysteries of Jesus and Mary. “The gentle repetition of its prayers makes it an excellent means to moving into deeper meditation. It gives us an opportunity to open ourselves to God’s word, to refine our interior gaze by turning our minds to the life of Christ. The first principle is that meditation is learned through practice. Many people who practice rosary meditation begin very simply and gradually develop a more sophisticated meditation. The meditator learns to hear an interior voice, the voice of God”.
According to Edmund P. Clowney, Christian meditation contrasts with Eastern forms of meditation as radically as the portrayal of God the Father in the Bible contrasts with depictions of Krishna or Brahman in Indian teachings. Unlike some Eastern styles, most styles of Christian meditation do not rely on the repeated use of mantras, and yet are also intended to stimulate thought and deepen meaning. Christian meditation aims to heighten the personal relationship based on the love of God that marks Christian communion. In "Aspects of Christian meditation", the Catholic Church warned of potential incompatibilities in mixing Christian and Eastern styles of meditation. In 2003, in "A Christian reflection on the New Age" the Vatican announced that the "Church avoids any concept that is close to those of the New Age".
Salah is a mandatory act of devotion performed by Muslims five times per day. The body goes through sets of different postures, as the mind attains a level of concentration called "khushu".
A second optional type of meditation, called dhikr, meaning remembering and mentioning God, is interpreted in different meditative techniques in Sufism or Islamic mysticism. This became one of the essential elements of Sufism as it was traditionally systematized. It is juxtaposed with "fikr" (thinking), which leads to knowledge. By the 12th century, the practice of Sufism included specific meditative techniques, and its followers practiced breathing controls and the repetition of holy words.
Sufism uses a meditative procedure like Buddhist concentration, involving high-intensity and sharply focused introspection. In the Oveyssi-Shahmaghsoudi Sufi order, for example, muraqaba takes the form of tamarkoz, "concentration" in Persian.
"Tafakkur" or "tadabbur" in Sufism literally means "reflection upon the universe": this is considered to permit access to a form of cognitive and emotional development that can emanate only from the higher level, i.e. from God. The sensation of receiving divine inspiration awakens and liberates both heart and intellect, permitting such inner growth that the apparently mundane actually takes on the quality of the infinite. Muslim teachings embrace life as a test of one's submission to God.
In the teachings of the Bahá'í Faith, meditation is a primary tool for spiritual development, involving reflection on the words of God. While prayer and meditation are linked, where meditation happens generally in a prayerful attitude, prayer is seen specifically as turning toward God, and meditation is seen as a communion with one's self where one focuses on the divine.
In Bahá'í teachings the purpose of meditation is to strengthen one's understanding of the words of God, to make one's soul more susceptible to their potentially transformative power, and to become more receptive to the need for both prayer and meditation to bring about and maintain a spiritual communion with God.
Bahá'u'lláh, the founder of the religion, never specified any particular form of meditation, and thus each person is free to choose their own form. However, he did state that Bahá'ís should read a passage of the Bahá'í writings twice a day, once in the morning and once in the evening, and meditate on it. He also encouraged people to reflect on their actions and worth at the end of each day. During the Nineteen Day Fast, a period of the year during which Bahá'ís adhere to a sunrise-to-sunset fast, they meditate and pray to reinvigorate their spiritual forces.
Movements which use magic, such as Wicca, Thelema, Neopaganism, and occultism, often require their adherents to meditate as a preliminary to the magical work. This is because magic is often thought to require a particular state of mind in order to make contact with spirits, or because one has to visualize one's goal or otherwise keep intent focused for a long period during the ritual in order to see the desired outcome. Meditation practice in these religions usually revolves around visualization, absorbing energy from the universe or higher self, directing one's internal energy, and inducing various trance states. Meditation and magic practice often overlap in these religions as meditation is often seen as merely a stepping stone to supernatural power, and the meditation sessions may be peppered with various chants and spells.
Mantra meditation, with the use of a japa mala and especially with focus on the Hare Krishna maha-mantra, is a central practice of the Gaudiya Vaishnava faith tradition and the International Society for Krishna Consciousness (ISKCON), also known as the Hare Krishna movement. Other popular New Religious Movements include the Ramakrishna Mission, Vedanta Society, Divine Light Mission, Chinmaya Mission, Osho, Sahaja Yoga, Transcendental Meditation, Oneness University, Brahma Kumaris and Vihangam Yoga.
New Age meditations are often influenced by Eastern philosophy, mysticism, yoga, Hinduism and Buddhism, yet may contain some degree of Western influence. In the West, meditation found its mainstream roots through the social revolution of the 1960s and 1970s, when many of the youth of the day rebelled against traditional religion as a reaction against what some perceived as the failure of Christianity to provide spiritual and ethical guidance.
New Age meditation as practised by the early hippies is known for its techniques of blanking out the mind and releasing oneself from conscious thinking. This is often aided by repetitive chanting of a mantra, or focusing on an object. New Age meditation evolved into a range of purposes and practices, from serenity and balance to access to other realms of consciousness to the concentration of energy in group meditation to the supreme goal of samadhi, as in the ancient yogic practice of meditation.
The US National Center for Complementary and Integrative Health states that "Meditation is a mind and body practice that has a long history of use for increasing calmness and physical relaxation, improving psychological balance, coping with illness, and enhancing overall health and well-being." A 2014 review found that practice of mindfulness meditation for two to six months by people undergoing long-term psychiatric or medical therapy could produce small improvements in anxiety, pain, or depression. In 2017, the American Heart Association issued a scientific statement that meditation may be a reasonable adjunct practice to help reduce the risk of cardiovascular diseases, with the qualification that meditation needs to be better defined in higher-quality clinical research of these disorders.
Low-quality evidence indicates that meditation may help with irritable bowel syndrome, insomnia, cognitive decline in the elderly, and post-traumatic stress disorder.
A 2010 review of the literature on spirituality and performance in organizations found an increase in corporate meditation programs.
As of 2016, around a quarter of U.S. employers were using stress reduction initiatives intended to reduce stress and improve employees' reactions to it. Aetna now offers its mindfulness program to its customers. Google also implements mindfulness, offering more than a dozen meditation courses, with the most prominent one, "Search Inside Yourself", having been implemented since 2007. General Mills offers the Mindful Leadership Program Series, a course which uses a combination of mindfulness meditation, yoga and dialogue with the intention of developing the mind's capacity to pay attention.
Herbert Benson of Harvard Medical School conducted a series of clinical tests on meditators from various disciplines, including the Transcendental Meditation technique and Tibetan Buddhism. In 1975, Benson published a book titled "The Relaxation Response" where he outlined his own version of meditation for relaxation. Also in the 1970s, the American psychologist Patricia Carrington developed a similar technique called Clinically Standardized Meditation (CSM). In Norway, another sound-based method called Acem Meditation developed a psychology of meditation and has been the subject of several scientific studies.
Biofeedback has been used by many researchers since the 1950s in an effort to enter deeper states of mind.
The history of meditation is intimately bound up with the religious context within which it was practiced. Some authors have even hypothesized that the emergence of the capacity for focused attention, an element of many methods of meditation, may have contributed to the latest phases of human biological evolution. Some of the earliest references to meditation are found in the Hindu Vedas of India. Wilson translates the most famous Vedic mantra "Gayatri" as: "We meditate on that desirable light of the divine Savitri, who influences our pious rites" (Rigveda: Mandala-3, Sukta-62, Rcha-10). Around the 6th to 5th centuries BCE, other forms of meditation developed via Confucianism and Taoism in China as well as Hinduism, Jainism, and early Buddhism in India.
In the Roman Empire, by 20 BCE Philo of Alexandria had written on some form of "spiritual exercises" involving attention (prosoche) and concentration and by the 3rd century Plotinus had developed meditative techniques.
The Pāli Canon from the 1st century BCE considers Buddhist meditation as a step towards liberation. By the time Buddhism was spreading in China, the "Vimalakirti Sutra", which dates to 100 CE, included a number of passages on meditation, clearly pointing to Zen (known as Chan in China, Thiền in Vietnam, and Seon in Korea). The Silk Road transmission of Buddhism introduced meditation to other Asian countries, and in 653 the first meditation hall was opened in Japan. Returning from China around 1227, Dōgen wrote the instructions for zazen.
The Islamic practice of Dhikr had involved the repetition of the 99 Names of God since the 8th or 9th century. By the 12th century, the practice of Sufism included specific meditative techniques, and its followers practiced breathing controls and the repetition of holy words. Interactions with Indians or the Sufis may have influenced the Eastern Christian meditation approach to hesychasm, but this cannot be proven. Between the 10th and 14th centuries, hesychasm was developed, particularly on Mount Athos in Greece, and involves the repetition of the Jesus prayer.
Western Christian meditation contrasts with most other approaches in that it does not involve the repetition of any phrase or action and requires no specific posture. Western Christian meditation progressed from the 6th century practice of Bible reading among Benedictine monks called Lectio Divina, i.e. divine reading. Its four formal steps as a "ladder" were defined by the monk Guigo II in the 12th century with the Latin terms "lectio", "meditatio", "oratio", and "contemplatio" (i.e. read, ponder, pray, contemplate). Western Christian meditation was further developed by saints such as Ignatius of Loyola and Teresa of Avila in the 16th century.
Meditation has spread in the West since the late 19th century, accompanying increased travel and communication among cultures worldwide. Most prominent has been the transmission of Asian-derived practices to the West. In addition, interest in some Western-based meditative practices has been revived, and these have been disseminated to a limited extent to Asian countries.
Ideas about Eastern meditation had begun "seeping into American popular culture even before the American Revolution through the various sects of European occult Christianity", and such ideas "came pouring in [to America] during the era of the transcendentalists, especially between the 1840s and the 1880s." The following decades saw the further spread of these ideas to America.
More recently, in the 1960s, another surge in Western interest in meditative practices began. The rise of communist political power in Asia led many Asian spiritual teachers to seek refuge in Western countries. In addition to spiritual forms of meditation, secular forms of meditation have taken root. Rather than focusing on spiritual growth, secular meditation emphasizes stress reduction, relaxation and self-improvement.
Research on the processes and effects of meditation is a subfield of neurological research. Modern scientific techniques, such as fMRI and EEG, have been used to observe neurological responses during meditation. Concerns have been raised about the quality of meditation research, including the particular characteristics of the individuals who tend to participate.
Since the 1970s, clinical psychology and psychiatry have developed meditation techniques for numerous psychological conditions. Mindfulness practice is employed in psychology to alleviate mental and physical conditions, such as reducing depression, stress, and anxiety. Mindfulness is also used in the treatment of drug addiction, although the quality of the research has been poor. Studies demonstrate that meditation has a moderate effect in reducing pain. There is insufficient evidence for any effect of meditation on positive mood, attention, eating habits, sleep, or body weight.
A 2017 systematic review and meta-analysis of the effects of meditation on empathy, compassion, and prosocial behaviors found that meditation practices had small to medium effects on self-reported and observable outcomes, concluding that such practices can "improve positive prosocial emotions and behaviors".
The 2012 US National Health Interview Survey (NHIS) (34,525 subjects) found 8% of US adults used meditation, with lifetime and 12-month prevalence of meditation use of 5.2% and 4.1% respectively. In the 2017 NHIS survey, meditation use among workers was 10% (up from 8% in 2002).
The psychologist Thomas Joiner argues that modern mindfulness meditation has been "corrupted" for commercial gain by self-help celebrities, and suggests that it encourages unhealthy narcissistic and self-obsessed mindsets.
Meditation has been correlated with unpleasant experiences in some people.
In one study, published in 2019, of 1,232 regular meditators with at least two months of meditation experience, about a quarter reported having had particularly unpleasant meditation-related experiences (such as anxiety, fear, distorted emotions or thoughts, altered sense of self or the world), which they thought may have been caused by their meditation practice. Meditators with high levels of repetitive negative thinking and those who only engage in deconstructive meditation were more likely to report unpleasant side effects. Adverse effects were less frequently reported in women and religious meditators.
Difficult experiences encountered in meditation are mentioned in traditional sources, and some may be considered simply an expected part of the process: for example, the seven stages of purification mentioned in Theravāda Buddhism, or the possible “unwholesome or frightening visions” mentioned in a practical manual on vipassanā meditation.
Many major traditions in which meditation is practiced, such as Buddhism and Hinduism, advise members not to consume intoxicants, while others, such as the Rastafarian movements and Native American Church, view drugs as integral to their religious lifestyle.
The fifth of the five precepts of the Pancasila, the ethical code in the Theravada and Mahayana Buddhist traditions, states that adherents must: "abstain from fermented and distilled beverages that cause heedlessness."
On the other hand, the ingestion of psychoactives has been a central feature in the rituals of many religions, in order to produce altered states of consciousness. In several traditional shamanistic ceremonies, drugs are used as agents of ritual. In the Rastafari movement, cannabis is believed to be a gift from Jah and a sacred herb to be used regularly, while alcohol is considered to debase man. Native Americans continue to use peyote as part of religious ceremonies today.
|
https://en.wikipedia.org/wiki?curid=20062
|
MPEG-4
MPEG-4 is a method of defining compression of audio and visual (AV) digital data. It was introduced in late 1998 and designated a standard for a group of audio and video coding formats and related technology agreed upon by the ISO/IEC Moving Picture Experts Group (MPEG) (ISO/IEC JTC1/SC29/WG11) under the formal standard ISO/IEC 14496 – "Coding of audio-visual objects". Uses of MPEG-4 include compression of AV data for web (streaming media) and CD distribution, voice (telephone, videophone) and broadcast television applications. The MPEG-4 standard was developed by a group led by Touradj Ebrahimi (later the JPEG president) and Fernando Pereira.
MPEG-4 absorbs many of the features of MPEG-1 and MPEG-2 and other related standards, adding new features such as (extended) VRML support for 3D rendering, object-oriented composite files (including audio, video and VRML objects), support for externally specified Digital Rights Management and various types of interactivity. AAC (Advanced Audio Coding) was standardized as an adjunct to MPEG-2 (as Part 7) before MPEG-4 was issued.
MPEG-4 is still an evolving standard and is divided into a number of parts. Companies promoting MPEG-4 compatibility do not always clearly state which "part" level compatibility they are referring to. The key parts to be aware of are MPEG-4 Part 2 (including Advanced Simple Profile, used by codecs such as DivX, Xvid, Nero Digital and 3ivx and by QuickTime 6) and MPEG-4 Part 10 (MPEG-4 AVC/H.264 or Advanced Video Coding, used by the x264 encoder, Nero Digital AVC, QuickTime 7, and high-definition video media like Blu-ray Disc).
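These coded streams are usually stored in the MP4 container defined in MPEG-4 Part 14, which is built on the ISO base media file format (Part 12) and organises a file as a sequence of length-prefixed "boxes". As a rough illustration of that layout only (a minimal sketch, not a production parser; the filename is hypothetical and real files would need more error handling):

```python
import struct

def walk_boxes(path):
    """Print the top-level boxes ("atoms") of an ISO base media / MP4 file.

    Each box begins with a 32-bit big-endian size and a 4-byte type code
    (e.g. 'ftyp', 'moov', 'mdat'). A size of 1 means a 64-bit size follows
    the type field; a size of 0 means the box extends to the end of the file.
    """
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break  # end of file
            size, box_type = struct.unpack(">I4s", header)
            name = box_type.decode("ascii", "replace")
            if size == 1:        # 64-bit "largesize" follows the type field
                size = struct.unpack(">Q", f.read(8))[0]
                payload = size - 16
            elif size == 0:      # box runs to the end of the file
                print(name, "(extends to end of file)")
                break
            else:
                payload = size - 8
            print(name, size, "bytes")
            f.seek(payload, 1)   # skip over the box payload

walk_boxes("example.mp4")  # hypothetical filename
```

Running this over a typical MP4 file would list boxes such as "ftyp" (file type), "moov" (movie metadata) and "mdat" (the media data itself).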
Most of the features included in MPEG-4 are left to individual developers to decide whether or not to implement. This means that there are probably no complete implementations of the entire MPEG-4 set of standards. To deal with this, the standard includes the concept of "profiles" and "levels", allowing a specific set of capabilities to be defined in a manner appropriate for a subset of applications.
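As a loose illustration of how a profile/level pair gates decoding (a minimal sketch of the general idea, not the standard's normative signalling; the capability table below is invented for the example):

```python
# Sketch of the "profiles and levels" idea: a decoder advertises which
# profile/level combinations it supports, and a stream is decodable only
# if its profile is supported at (or above) the stream's level.
# The entries below are illustrative, not a normative list.
DECODER_CAPABILITIES = {
    "Simple Profile": 3,            # supports Simple Profile up to Level 3
    "Advanced Simple Profile": 5,   # supports ASP up to Level 5
}

def can_decode(stream_profile: str, stream_level: int) -> bool:
    """Return True if this decoder can handle the stream's profile/level."""
    max_level = DECODER_CAPABILITIES.get(stream_profile)
    return max_level is not None and stream_level <= max_level

print(can_decode("Advanced Simple Profile", 5))  # True
print(can_decode("Advanced Simple Profile", 6))  # False: level too high
print(can_decode("Main Profile", 2))             # False: profile unsupported
```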
Initially, MPEG-4 was aimed primarily at low bit-rate video communications; however, its scope as a multimedia coding standard was later expanded. MPEG-4 is efficient across a variety of bit-rates ranging from a few kilobits per second to tens of megabits per second. MPEG-4 provides the following functions:
MPEG-4 provides a series of technologies for developers, for various service-providers and for end users:
The MPEG-4 format can perform various functions, among which might be the following:
MPEG-4 provides a large and rich set of tools for encoding.
Subsets of the MPEG-4 tool sets have been provided for use in specific applications.
These subsets, called 'Profiles', limit the size of the tool set a decoder is required to implement. In order to restrict computational complexity, one or more 'Levels' are set for each Profile. A Profile and Level combination allows:
MPEG-4 consists of several standards—termed "parts"—including the following (each part covers a certain aspect of the whole specification):
Profiles are also defined within the individual "parts", so an implementation of a part is ordinarily an implementation of one of its profiles rather than of the entire part.
MPEG-1, MPEG-2, MPEG-7 and MPEG-21 are other suites of MPEG standards.
The low profile levels are part of the MPEG-4 video encoding/decoding constraints and are compatible with the older ITU H.261 standard, as well as with former analog TV standards for broadcast and recording (such as NTSC or PAL video). The ASP profile at its highest level is suitable for most standard DVD media and players and for many online video sites, but not for Blu-ray discs or online HD video content.
More advanced profiles for HD media were defined later in the AVC profile, which is functionally identical to the ITU H.264 standard and is integrated in MPEG-4 Part 10 (see H.264/MPEG-4 AVC for the list of defined levels in this AVC profile).
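In an H.264/AVC bitstream the profile and level travel as numeric indicators, profile_idc and level_idc, in the sequence parameter set (and in the MP4 "avcC" configuration record). The sketch below interprets a few well-known values; the mapping shown is illustrative rather than exhaustive, and it ignores special cases such as Level 1b:

```python
# Illustrative (non-exhaustive) mapping of common H.264/AVC profile_idc
# values as carried in the sequence parameter set.
PROFILES = {66: "Baseline", 77: "Main", 88: "Extended", 100: "High"}

def describe_avc(profile_idc: int, level_idc: int) -> str:
    profile = PROFILES.get(profile_idc, f"profile_idc {profile_idc}")
    # level_idc is ten times the level number, so 41 means Level 4.1
    # (the special Level 1b case, signalled via constraint flags, is ignored)
    return f"{profile} Profile, Level {level_idc / 10:.1f}"

print(describe_avc(100, 41))  # "High Profile, Level 4.1", typical for Blu-ray
```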
MPEG-4 contains patented technologies, the use of which requires licensing in countries that acknowledge software algorithm patents. Over two dozen companies claim to have patents covering MPEG-4. MPEG LA licenses patents required for MPEG-4 Part 2 Visual from a wide range of companies (audio is licensed separately) and lists all of its licensors and licensees on the site. New licenses for MPEG-4 System patents are under development and no new licenses are being offered while holders of its old MPEG-4 Systems license are still covered under the terms of that license for the patents listed (MPEG LA – Patent List).
The majority of patents used for the MPEG-4 Visual format are held by three Japanese companies: Mitsubishi Electric (255 patents), Hitachi (206 patents), and Panasonic (200 patents).
|
https://en.wikipedia.org/wiki?curid=20063
|
Maritime archaeology
Maritime archaeology (also known as marine archaeology) is a discipline within archaeology as a whole that specifically studies human interaction with the sea, lakes and rivers through the study of associated physical remains, be they vessels, shore-side facilities, port-related structures, cargoes, human remains and submerged landscapes. A specialty within maritime archaeology is nautical archaeology, which studies ship construction and use.
As with archaeology as a whole, maritime archaeology can be practised within the historical, industrial, or prehistoric periods. An associated discipline, and again one that lies within archaeology itself, is underwater archaeology, which studies the past through any submerged remains be they of maritime interest or not. An example from the prehistoric era would be the remains of submerged settlements or deposits now lying under water despite having been dry land when sea levels were lower. The study of submerged aircraft lost in lakes, rivers or in the sea is an example from the historical, industrial or modern era. Many specialist sub-disciplines within the broader maritime and underwater archaeological categories have emerged in recent years.
Maritime archaeological sites often result from shipwrecks or sometimes seismic activity, and thus represent a moment in time rather than a slow deposition of material accumulated over a period of years, as is the case with port-related structures (such as piers, wharves, docks and jetties) where objects are lost or thrown off structures over extended periods of time. This fact has led to shipwrecks often being described in the media and in popular accounts as 'time capsules'.
Archaeological material in the sea or in other underwater environments is typically subject to different factors than artifacts on land. However, as with terrestrial archaeology, what survives to be investigated by modern archaeologists can often be a tiny fraction of the material originally deposited. A feature of maritime archaeology is that, despite all the material that is lost, there are occasional rare examples of substantial survival, often precisely because the sites are difficult to access, and from these a great deal can be learned.
There are those in the archaeology community who see maritime archaeology as a separate discipline with its own concerns (such as shipwrecks) and requiring the specialized skills of the underwater archaeologist. Others value an integrated approach, stressing that nautical activity has economic and social links to communities on land and that archaeology is archaeology no matter where the study is conducted. All that is required is the mastering of skills specific to the environment in which the work occurs.
Before the industrial era, travel by water was often easier than over land. As a result, marine channels, navigable rivers and sea crossings formed the trade routes of historic and ancient civilisations. For example, the Mediterranean Sea was known to the Romans as the inner sea because the Roman empire spread around its coasts. The historic record as well as the remains of harbours, ships and cargoes, testify to the volume of trade that crossed it. Later, nations with a strong maritime culture such as the United Kingdom, the Netherlands, Denmark, Portugal and Spain were able to establish colonies on other continents. Wars were fought at sea over the control of important resources. The material cultural remains that are discovered by maritime archaeologists along former trade routes can be combined with historical documents and material cultural remains found on land to understand the economic, social and political environment of the past. Of late maritime archaeologists have been examining the submerged cultural remains of China, India, Korea and other Asian nations.
There are significant differences in the survival of archaeological material depending on whether a site is wet or dry, on the nature of the chemical environment, on the presence of biological organisms and on the dynamic forces present. Thus rocky coastlines, especially in shallow water, are typically inimical to the survival of artifacts, which can be dispersed, smashed or ground by the effect of currents and surf, possibly (but not always) leaving an artifact pattern but little if any wreck structure.
Saltwater is particularly inimical to iron artifacts, including metal shipwrecks, and sea organisms will readily consume organic material such as wooden shipwrecks. On the other hand, out of all the thousands of potential archaeological sites destroyed or grossly eroded by such natural processes, occasionally sites survive with exceptional preservation of a related collection of artifacts. An example of such a collection is the "Mary Rose". Survival in this instance is largely due to the remains being buried in sediment.
Of the many examples where the sea bed provides an extremely hostile environment for submerged evidence of history, one of the most notable, RMS "Titanic", though a relatively young wreck and in deep water so calcium-starved that concretion does not occur, appears strong and relatively intact, though indications are that it has already incurred irreversible degradation of her steel and iron hull. As such degradation inevitably continues, data will be forever lost, objects' context will be destroyed and the bulk of the wreck will over centuries completely deteriorate on the floor of the Atlantic Ocean. Comparative evidence shows that all iron and steel ships, especially those in a highly oxygenated environment, continue to degrade and will continue to do so until only their engines and other machinery project much above the sea-floor. Where it remains even after the passage of time, the iron or steel hull is often fragile with no remaining metal within the layer of concretion and corrosion products. USS "Monitor", having been found in the 1970s, was subjected to a program of attempted "in situ" preservation, for example, but deterioration of the vessel progressed at such a rate that the rescue of her turret was undertaken lest nothing be saved from the wreck.
Some wrecks, lost to natural obstacles to navigation, are at risk of being smashed by subsequent wrecks sunk by the same hazard, or are deliberately destroyed because they present a hazard to navigation. Even in deep water, commercial activities such as pipe-laying operations and deep sea trawling can place a wreck at risk. Such a wreck is the "Mardi Gras" shipwreck, sunk in deep water in the Gulf of Mexico. The shipwreck lay forgotten at the bottom of the sea until it was discovered in 2002 by an oilfield inspection crew working for the Okeanos Gas Gathering Company (OGGC). Large pipelines can crush sites and render some of their remnants inaccessible as pipe is dropped from the ocean surface to the substrate thousands of feet below. Trawl nets snag and tear superstructures and separate artifacts from their context.
The wrecks, and other archaeological sites that have been preserved have generally survived because the dynamic nature of the sea bed can result in artifacts becoming rapidly buried in sediments. These sediments then provide an anaerobic environment which protects from further degradation. Wet environments, whether on land in the form of peat bogs and wells, or underwater are particularly important for the survival of organic material, such as wood, leather, fabric and horn. Cold and absence of light also aid survival of artifacts, because there is little energy available for either organic activity or chemical reactions. Salt water provides for greater organic activity than freshwater, and in particular the shipworm, "Teredo navalis", lives only in salt water, so some of the best preservation in the absence of sediments has been found in the cold, dark waters of the Great Lakes in North America and in the (low salinity) Baltic Sea (where "Vasa" was preserved).
While the land surface is continuously reused by societies, the sea bed was largely inaccessible until the advent of submarines, scuba equipment and remotely operated underwater vehicles (ROVs) in the twentieth century. Salvagers have operated in much earlier times, but much of the material was beyond the reach of anyone. Thus "Mary Rose" was subject to salvage from the sixteenth century and later, but a very large amount of material, buried in the sediments, remained to be found by maritime archaeologists of the twentieth century.
While preservation in situ is not assured, material that has survived underwater and is then recovered to land is typically in an unstable state and can only be preserved using highly specialised conservation processes. While the wooden structure of "Mary Rose" and the individual artifacts have been undergoing conservation since their recovery, SS "Great Britain" provides an example of a relatively recent (metal) wreck for which extensive conservation has been necessary to preserve the hull. While the hull remains intact, its machinery remains inoperable. The engine of "Xantho", recovered in 1985 from a saline environment after over a century underwater, is presently considered somewhat anomalous, in that after two decades of treatment it can now be turned over by hand.
A challenge for the modern archaeologist is to consider whether "in situ" preservation or recovery and conservation on land is the preferable option, or to face the fact that preservation in any form other than as an archaeological record is not feasible. A site that has been discovered has typically been subjected to disturbance of the very factors that caused its survival in the first place, for example, when a covering of sediment has been removed by storms or the action of man. Active monitoring and deliberate protection may mitigate further rapid destruction, making "in situ" preservation an option, but long-term survival can never be guaranteed. For very many sites, the costs are too great either for active measures to ensure "in situ" preservation or for satisfactory conservation on recovery. Even the cost of proper and complete archaeological investigation may be too great to enable it to occur within a timescale that ensures an archaeological record is made before data is inevitably lost.
Maritime archaeology studies prehistoric objects and sites that, because of changes in climate and geology, are now underwater.
Bodies of water, fresh and saline, have been important sources of food for people for as long as humans have existed. It should be no surprise that ancient villages were located at the water's edge. Since the last ice age, sea level has risen by roughly 120 metres (400 feet).
Therefore, a great deal of the record of human activity throughout the Ice Age is now to be found under water.
The flooding of the area now known as the Black Sea (when a land bridge, where the Bosporus is now, collapsed under the pressure of rising water in the Mediterranean Sea) submerged a great deal of human activity that had been gathered round what had been an enormous, fresh-water lake.
Significant cave art sites off the coast of western Europe such as the Grotto Cosquer can be reached only by diving, because the cave entrances are underwater, though the upper portions of the caves themselves are not flooded.
Throughout history, seismic events have at times caused submergence of human settlements. The remains of such catastrophes exist all over the world, and sites such as Alexandria and Port Royal now form important archaeological sites. As with shipwrecks, archaeological research can follow multiple themes, including evidence of the final catastrophe, the structures and landscape before the catastrophe and the culture and economy of which it formed a part. Unlike the wrecking of a ship, the destruction of a town by a seismic event can take place over many years and there may be evidence for several phases of damage, sometimes with rebuilding in between.
Not all maritime sites are underwater. There are many structures at the margin of land and water that provide evidence of the human societies of the past. Some are deliberately created for access - such as bridges and walkways. Other structures remain from exploitation of resources, such as dams and fish traps. Nautical remains include early harbours and places where ships were built or repaired. At the end of their life, ships were often beached. Valuable or easily accessed timber has often been salvaged leaving just a few frames and bottom planking.
Archaeological sites can also be found on the foreshore today that would have been on dry land when they were constructed. An example of such a site is Seahenge, a Bronze Age timber circle.
The archaeology of shipwrecks can be divided into a three-tier hierarchy, of which the first tier considers the wrecking process itself: how does a ship break up, how does a ship sink to the bottom, and how do the remains of the ship, cargo and the surrounding environment evolve over time? The second tier studies the ship as a machine, both in itself and in a military or economic system. The third tier consists of the archaeology of maritime cultures, in which nautical technology, naval warfare, trade and shipboard societies are studied. Some consider this to be the most important tier. Ships and boats are not necessarily wrecked: some are deliberately abandoned, scuttled or beached. Many such abandoned vessels have been extensively salvaged.
The earliest boats discovered date from the Bronze Age and are constructed of hollowed out logs or sewn planks. Vessels have been discovered where they have been preserved in sediments underwater or in waterlogged land sites, such as the discovery of a canoe near St Botolphs. Examples of sewn-plank boats include those found at North Ferriby and the Dover Bronze Age Boat which is now displayed at Dover Museum. These may be an evolution from boats made of sewn hides, but it is highly unlikely that hide boats could have survived.
Ships wrecked in the sea have probably not survived, although remains of cargo (particularly bronze material) have been discovered, such as those at the Salcombe B site. A close collection of artefacts on the sea bed may imply that artefacts were from a ship, even if there are no remains of the actual vessel.
Late Bronze Age ships, such as the Uluburun Shipwreck have been discovered in the Mediterranean, constructed of edge joined planks. This shipbuilding technology continued through the classical period.
In the Mediterranean area, maritime archaeologists have investigated several ancient cultures. Notable early Iron Age shipwrecks include two Phoenician ships of c. 750 BC that foundered off Gaza with cargoes of wine in amphoras. The crew of the U.S. Navy deep submergence research submarine NR-1 discovered the sites in 1997. In 1999 a team led by Robert Ballard and Harvard University archaeology Professor Lawrence Stager investigated the wrecks.
Extensive research has been carried out on the Mediterranean and Aegean coastlines of Turkey. Complete excavations have been performed on several wrecks from the Classical, Hellenistic, Byzantine, and Ottoman periods.
Maritime archaeological studies in Italy illuminate the naval and maritime activities of the Etruscans, Greek colonists, and Romans. After the 2nd century BC, the Roman fleet ruled the Mediterranean and actively suppressed piracy. During this Pax Romana, seaborne trade increased significantly throughout the region. Though sailing was the safest, fastest, and most efficient method of transportation in the ancient world, some fractional percentage of voyages ended in shipwreck. With the significantly increased sea traffic during the Roman era came a corresponding increase in shipwrecks. These wrecks and their cargo remains offer glimpses through time of the economy, culture, and politics of the ancient world. Particularly useful to archaeologists are studies of amphoras, the ceramic shipping containers used in the Mediterranean region from the 15th century BC through the Medieval period.
In addition to many discoveries in the sea, some wrecks have been examined in lakes. Most notable are Caligula's pleasure barges in Lake Nemi, Italy. The Nemi ships and other shipwreck sites occasionally yield objects of unique artistic value. For instance, the Antikythera wreck contained a staggering collection of marble and bronze statues including the Antikythera Youth. Discovered in 1900 by Greek sponge divers, the ship probably sank in the 1st century BC and may have been dispatched by the Roman general, Sulla, to carry booty back to Rome. The sponge divers also recovered from the wreck the famous Antikythera mechanism, believed to be an astronomical calculator. Further examples of fabulous works of art recovered from the sea floor are the two "bronzi" found in Riace (Calabria), Italy. In the cases of Antikythera and Riace, however, the artifacts were recovered without the direct participation of maritime archaeologists.
Recent studies in the Sarno river (near Pompeii) reveal other interesting elements of ancient life. The Sarno project suggests that on the Tyrrhenian shore there were small towns built on palafittes (stilt houses), similar to ancient Venice. In the same area, the submerged town of Puteoli (Pozzuoli, close to Naples) contains the "portus Julius", created by Marcus Vipsanius Agrippa in 37 BC and later submerged due to bradyseism.
The sea floor elsewhere in the Mediterranean holds countless archaeological sites. In Israel, Herod the Great's port at Caesarea Maritima has been extensively studied. Other finds are consistent with some passages of the Bible (like the so-called Jesus boat, which appears to have been in use during the first century AD).
Maritime archaeology in Australia commenced in the 1970s with the arrival of Jeremy Green, prompted by concerns expressed by academics and politicians about the rampant destruction of the Dutch and British East India ships lost on the west coast. As Commonwealth legislation was enacted and enforced after 1976, and as the States enacted their own legislation, the sub-discipline spread throughout Australia, concentrating initially on shipwrecks owing to ongoing funding by both the States and the Commonwealth under their shipwreck legislation. Studies now also include, as an element of underwater archaeology as a whole, submerged indigenous sites. Nautical archaeology (the specialised study of boat and ship construction) is also practised in the region. Often the sites or relics studied in Australia, as in the rest of the world, are not inundated. The study of historic submerged aircraft, a sub-discipline of aviation archaeology known as underwater aviation archaeology, is likewise practised in the region. In some states maritime and underwater archaeology is practised out of museums and in others out of cultural heritage management units, and all practitioners operate under the aegis of the Australasian Institute for Maritime Archaeology (AIMA).
|
https://en.wikipedia.org/wiki?curid=20064
|
Morihei Ueshiba
Morihei Ueshiba (December 14, 1883 – April 26, 1969) was a Japanese martial artist and the founder of the martial art of aikido. The son of a landowner from Tanabe, Ueshiba studied a number of martial arts in his youth, and served in the Japanese Army during the Russo-Japanese War. After being discharged in 1907, he moved to Hokkaidō as the head of a pioneer settlement; here he met and studied with Takeda Sōkaku, the founder of Daitō-ryū Aiki-jūjutsu. On leaving Hokkaido in 1919, Ueshiba joined the Ōmoto-kyō movement, a Shinto sect, in Ayabe, where he served as a martial arts instructor and opened his first dojo. He accompanied the head of the Ōmoto-kyō group, Onisaburo Deguchi, on an expedition to Mongolia in 1924, where they were captured by Chinese troops and returned to Japan. The following year, he had a profound spiritual experience, stating that, "a golden spirit sprang up from the ground, veiled my body, and changed my body into a golden one." After this experience, his martial arts skill appeared to be greatly increased.
Ueshiba moved to Tokyo in 1926, where he set up the Aikikai Hombu Dojo. By now he was comparatively famous in martial arts circles, and taught at this dojo and others around Japan, including in several military academies. In the aftermath of World War II the Hombu dojo was temporarily closed, but Ueshiba had by this point left Tokyo and retired to Iwama, and he continued training at the dojo he had set up there. From the end of the war until the 1960s, he worked to promote aikido throughout Japan and abroad. He died from liver cancer in 1969.
After Ueshiba's death, aikido continued to be promulgated by his students (many of whom became noted martial artists in their own right). It is now practiced around the world.
Morihei Ueshiba was born in Nishinotani village (now part of the city of Tanabe), Wakayama Prefecture, Japan, on December 14, 1883, the fourth child (and only son) born to Yoroku Ueshiba and his wife Yuki.
The young Ueshiba was raised in a somewhat privileged setting. His father Yoroku was a wealthy gentleman farmer and minor politician, being an elected member of the Nishinotani village council for 22 consecutive years. His mother Yuki was from the Itokawa clan, a prominent local family who could trace their lineage back to the Heian period. Ueshiba was a rather weak, sickly child and bookish in his inclinations. At a young age his father encouraged him to take up sumo wrestling and swimming and entertained him with stories of his great-grandfather Kichiemon, who was considered a very strong samurai in his era. The need for such strength was further emphasized when the young Ueshiba witnessed his father being attacked by followers of a competing politician.
A major influence on Ueshiba's early education was his elementary schoolteacher Tasaburo Nasu, who was a Shinto priest and who introduced Ueshiba to the religion. At the age of six Ueshiba was sent to study at the Jizōderu Temple, but had little interest in the rote learning of Confucian education. However, his schoolmaster Mitsujo Fujimoto was also a priest of Shingon Buddhism, and taught the young Ueshiba some of the esoteric chants and ritual observances of the sect, which Ueshiba found intriguing. His interest in Buddhism was sufficiently great that his mother considered enrolling him in the priesthood, but his father Yoroku vetoed the idea. Ueshiba went to Tanabe Higher Elementary School and then to Tanabe Prefectural Middle School, but left formal education in his early teens, enrolling instead at a private abacus academy, the Yoshida Institute, to study accountancy. On graduating from the academy, he worked at a local tax office for a few months, but the job did not suit him and in 1901 he left for Tokyo, funded by his father. Ueshiba Trading, the stationery business which he opened there, was short-lived; unhappy with life in the capital, he returned to Tanabe less than a year later after suffering a bout of beri-beri. Shortly thereafter he married his childhood acquaintance Hatsu Itokawa.
In 1903, Ueshiba was called up for military service. He failed the initial physical examination, being shorter than the regulation 5 ft 2 in (157 cm). To overcome this, he stretched his spine by attaching heavy weights to his legs and suspending himself from tree branches; when he re-took the physical exam he had increased his height by the necessary half-inch to pass. He was assigned to the Osaka Fourth Division, 37th Regiment, and was promoted to corporal of the 61st Wakayama regiment by the following year; after serving on the front lines during the Russo-Japanese War he was promoted to sergeant. He was discharged in 1907, and again returned to his father's farm in Tanabe. Here he befriended the writer and philosopher Minakata Kumagusu, becoming involved with Minakata's opposition to the Meiji government's Shrine Consolidation Policy. He and his wife had their first child, a daughter named Matsuko, in 1911.
Ueshiba studied several martial arts during his early life, and was renowned for his physical strength during his youth. During his sojourn in Tokyo he studied Kitō-ryū jujutsu under Takisaburo Tobari, and briefly enrolled in a school teaching Shinkage-ryū. His training in Gotō-ha Yagyū-ryu under Masakatsu Nakai was sporadic due to his military service, although he was granted a diploma in the art within a few years. In 1901 he received some instruction from Tozawa Tokusaburō in Tenjin Shin'yō-ryū jujutsu, and he studied judo with Kiyoichi Takagi in Tanabe in 1911, after his father had a dojo built on the family compound to encourage his son's training. In 1907, after his return from the war, he was also presented with a certificate of enlightenment ("shingon inkyo") by his childhood teacher Mitsujo Fujimoto.
In the early part of the 20th century, the prefectural government of Hokkaidō, Japan's northernmost island, were offering various grants and incentives for mainland Japanese groups willing to relocate there. At the time, Hokkaidō was still largely unsettled by the Japanese, being occupied primarily by the indigenous Ainu. In 1910, Ueshiba travelled to Hokkaidō in the company of his acquaintance Denzaburo Kurahashi, who had lived on the northern island before. His intent was to scout out a propitious location for a new settlement, and he found the site at Shirataki suitable for his plans. Despite the hardships he suffered on this journey (which included getting lost in snowstorms several times and an incident in which he nearly drowned in a freezing river), Ueshiba returned to Tanabe filled with enthusiasm for the project, and began recruiting families to join him. He became the leader of the Kishū Settlement Group, a collective of eighty-five pioneers who intended to settle in the Shirataki district and live as farmers; the group founded the village of Yubetsu (later Shirataki village) in August, 1912. Much of the funding for this project came from Ueshiba's father and his brothers-in-law Zenzo and Koshiro Inoue. Zenzo's son Noriaki was also a member of the settlement group.
Poor soil conditions and bad weather led to crop failures during the first three years of the project, but the group still managed to cultivate mint and farm livestock. The burgeoning timber industry provided a boost to the settlement's economy, and by 1918 there were over 500 families residing there. A fire in 1917 razed the entire village, leading to the departure of around twenty families. Ueshiba was attending a meeting over railway construction around 50 miles away, but on learning of the fire travelled back the entire distance on foot. He was elected to the village council that year, and took a prominent role in leading the reconstruction efforts. In the summer of 1918, Hatsu gave birth to their first son, Takemori.
The young Ueshiba met Takeda Sōkaku, the founder of Daitō-ryū Aiki-jūjutsu, at the Hisada Inn in Engaru, in March 1915. Ueshiba was deeply impressed with Takeda's martial art, and despite being on an important mission for his village at the time, abandoned his journey to spend the next month studying with Takeda. He requested formal instruction and began studying Takeda's style of jūjutsu in earnest, going so far as to construct a dojo at his home and inviting his new teacher to be a permanent house guest. He received a "kyōju dairi" certificate, a teaching license, for the system from Takeda in 1922, when Takeda visited him in Ayabe. Takeda also gave him a Yagyū Shinkage-ryū sword transmission scroll. Ueshiba then became a representative of Daitō-ryū, toured with Takeda as a teaching assistant and taught the system to others. The relationship between Ueshiba and Takeda was a complicated one. Ueshiba was an extremely dedicated student, dutifully attending to his teacher's needs and displaying great respect. However, Takeda overshadowed him throughout his early martial arts career, and Ueshiba's own students recorded the need to address what they referred to as "the Takeda problem".
In November 1919, Ueshiba learned that his father Yoroku was ill, and was not expected to survive. Leaving most of his possessions to Takeda, Ueshiba left Shirataki with the apparent intention of returning to Tanabe to visit his ailing parent. En route he made a detour to Ayabe, near Kyoto, intending to visit Onisaburo Deguchi, the spiritual leader of the Ōmoto-kyō religion (Ueshiba's nephew Noriaki Inoue had already joined the religion and may have recommended it to his uncle). Ueshiba stayed at the Ōmoto-kyō headquarters for several days, and met with Deguchi, who told him that, "There is nothing to worry about with your father". On his return to Tanabe, Ueshiba found that Yoroku had died. Criticised by family and friends for arriving too late to see his father, Ueshiba went into the mountains with a sword and practised solo sword exercises for several days; this almost led to his arrest when the police were informed of a sword-wielding madman on the loose.
Within a few months, Ueshiba was back in Ayabe, having decided to become a full-time student of Ōmoto-kyō. In 1920 he moved his entire family, including his mother, to the Ōmoto compound; at the same time he also purchased enough rice to feed himself and his family for several years. That same year, Deguchi asked Ueshiba to become the group's martial arts instructor, and a dojo—the first of several that Ueshiba was to lead—was constructed on the centre's grounds. Ueshiba also taught Takeda's Daitō-ryū in neighbouring Hyōgo Prefecture during this period. His second son, Kuniharu, was born in 1920 in Ayabe, but died from illness the same year, along with three-year-old Takemori.
Takeda visited Ueshiba in Ayabe to provide instruction, although he was not a follower of Ōmoto and did not get along with Deguchi, which led to a cooling of the relationship between him and Ueshiba. Ueshiba continued to teach his martial art under the name "Daitō-ryū Aiki-jūjutsu", at the behest of his teacher. However, Deguchi encouraged Ueshiba to create his own style of martial arts, "Ueshiba-ryū", and sent many Ōmoto followers to study at the dojo. He also brought Ueshiba into the highest levels of the group's bureaucracy, making Ueshiba his executive assistant and putting him in charge of the Showa Seinenkai (Ōmoto-kyō's national youth organisation) and the Ōmoto Shobotai, a volunteer fire service.
His close relationship with Deguchi introduced Ueshiba to various members of Japan's far-right; members of the ultranationalist group the Sakurakai would hold meetings at Ueshiba's dojo, and he developed a friendship with the philosopher Shūmei Ōkawa during this period, as well as meeting with Nisshō Inoue and Kozaburō Tachibana. Deguchi also offered Ueshiba's services as a bodyguard to Kingoro Hashimoto, the Sakurakai's founder. Ueshiba's commitment to the goal of world peace, stressed by many biographers, must be viewed in the light of these relationships and his Ōmoto-kyō beliefs. His association with the extreme right-wing is understandable when one considers that Ōmoto-kyō's view of world peace was of a benevolent dictatorship by the Emperor of Japan, with other nations being subjugated under Japanese rule.
In 1921, in an event known as the , the Japanese authorities raided the compound, destroying the main buildings on the site and arresting Deguchi on charges of lèse-majesté. Ueshiba's dojo was undamaged and over the following two years he worked closely with Deguchi to reconstruct the group's centre, becoming heavily involved in farming work and serving as the group's "Caretaker of Forms", a role which placed him in charge of overseeing Ōmoto's move towards self-sufficiency. His son Kisshomaru was born in the summer of 1921.
Three years later, in 1924, Deguchi led a small group of Ōmoto-kyō disciples, including Ueshiba, on a journey to Mongolia at the invitation of retired naval captain Yutaro Yano and his associates within the ultra-nationalist Black Dragon Society. Deguchi's intent was to establish a new religious kingdom in Mongolia, and to this end he had distributed propaganda suggesting that he was the reincarnation of Genghis Khan. Allied with the Mongolian bandit Lu Zhankui, Deguchi's group were arrested in Tongliao by the Chinese authorities—fortunately for Ueshiba, whilst Lu and his men were executed by firing squad, the Japanese group were released into the custody of the Japanese consul. They were returned under guard to Japan, where Deguchi was imprisoned for breaking the terms of his bail. During this expedition Ueshiba was given the Chinese alias Wang Shou-gao, rendered in Japanese as "Moritaka" – he was reportedly very taken with this name and continued to use it intermittently for the rest of his life.
After returning to Ayabe, Ueshiba began a regimen of spiritual training, regularly retreating to the mountains or performing "misogi" in the Nachi Falls. As his prowess as a martial artist increased, his fame began to spread. He was challenged by many established martial artists, some of whom later became his students after being defeated by him. In the autumn of 1925 he was asked to give a demonstration of his art in Tokyo, at the behest of Admiral Isamu Takeshita; one of the spectators was Yamamoto Gonnohyōe, who requested that Ueshiba stay in the capital to instruct the Imperial Guard in his martial art. After a couple of weeks, however, Ueshiba took issue with several government officials who voiced concerns about his connections to Deguchi; he cancelled the training and returned to Ayabe.
In 1926 Takeshita invited Ueshiba to visit Tokyo again. Ueshiba relented and returned to the capital, but while residing there was stricken with a serious illness. Deguchi visited his ailing student and, concerned for his health, commanded Ueshiba to return to Ayabe. The appeal of returning increased after Ueshiba was questioned by the police following his meeting with Deguchi; the authorities were keeping the Ōmoto-kyō leader under close surveillance. Angered at the treatment he had received, Ueshiba went back to Ayabe again. Six months later, this time with Deguchi's blessing, he and his family moved permanently to Tokyo. This move allowed Ueshiba to teach politicians, high-ranking military personnel, and members of the Imperial household; suddenly he was no longer an obscure provincial martial artist, but a sensei to some of Japan's most important citizens. Arriving in October 1927, the Ueshiba family set up home in the Shirokane district. The building proved too small to house the growing number of aikido students, and so the Ueshibas moved to larger premises, first in Mita district, then in Takanawa, and finally to a purpose-built hall in Shinjuku. This last location, originally named the Kobukan (), would eventually become the Aikikai Hombu Dojo. During its construction, Ueshiba rented a property nearby, where he was visited by Kanō Jigorō, the founder of judo.
During this period, Ueshiba was invited to teach at a number of military institutes, due to his close personal relationships with key figures in the military (among them Sadao Araki, the Japanese Minister of War). He accepted an invitation from Admiral Sankichi Takahashi to be the martial arts instructor at the Imperial Japanese Naval Academy, and also taught at the Nakano Spy School, although aikido was later judged to be too technical for the students there and karate was adopted instead. He also became a visiting instructor at the Imperial Japanese Army Academy after being challenged by (and defeating) General Makoto Miura, another student of Takeda Sōkaku's Daitō-ryū. Takeda himself met Ueshiba for the last time around 1935, while Ueshiba was teaching at the Osaka headquarters of the "Asahi Shimbun" newspaper. Frustrated by the appearance of his teacher, who was openly critical of Ueshiba's martial arts and who appeared intent on taking over the classes there, Ueshiba left Osaka during the night, bowing to the residence in which Takeda was staying and thereafter avoiding all contact with him. Between 1940 and 1942 he made several visits to Manchukuo (Japanese occupied Manchuria) where he was the principal martial arts instructor at Kenkoku University. Whilst in Manchuria, he met and defeated the sumo wrestler Tenryū Saburō during a demonstration.
The "Second Ōmoto Incident" in 1935 saw another government crackdown on Deguchi's sect, in which the Ayabe compound was destroyed and most of the group's leaders imprisoned. Although he had relocated to Tokyo, Ueshiba had retained links with the Ōmoto-kyō group (he had in fact helped Deguchi to establish a paramilitary branch of the sect only three years earlier) and expected to be arrested as one of its senior members. However, he had a good relationship with the local police commissioner Kenji Tomita and the chief of police Gīchi Morita, both of whom had been his students. As a result, although he was taken in for interrogation, he was released without charge on Morita's authority.
In 1932, Ueshiba's daughter Matsuko was married to the swordsman Kiyoshi Nakakura, who was adopted as Ueshiba's heir under the name Morihiro Ueshiba. The marriage ended after a few years, and Nakakura left the family in 1937. Ueshiba later designated his son Kisshomaru as the heir to his martial art.
The 1930s saw Japan's invasion of mainland Asia and increased military activity in Europe. Ueshiba was concerned about the prospect of war, and became involved in a number of efforts to forestall the conflict that would eventually become World War II. He was part of a group, along with Shūmei Ōkawa and several wealthy Japanese backers, that tried to broker a deal with Harry Chandler to export aviation fuel from the United States to Japan, in contravention of US export restrictions, although this effort ultimately failed. In 1941 Ueshiba also undertook a secret diplomatic mission to China at the behest of Prince Fumimaro Konoe. The intended goal was a meeting with Chiang Kai-shek to establish peace talks, but Ueshiba was unable to meet with the Chinese leader, arriving too late to fulfil his mission.
From 1935 onwards, Ueshiba had been purchasing land in Iwama in Ibaraki Prefecture, and by the early 1940s had acquired a substantial area of farmland there. In 1942, disenchanted with the war-mongering and political manoeuvring in the capital, he left Tokyo and moved to Iwama permanently, settling in a small farmer's cottage. Here he founded the Aiki Shuren Dojo, also known as the Iwama dojo, and the Aiki Shrine, a devotional shrine to the "Great Spirit of Aiki". During this time he travelled extensively in Japan, particularly in the Kansai region, teaching his aikido. Despite the prohibition on the teaching of martial arts after World War II, Ueshiba and his students continued to practice in secret at the Iwama dojo; the Hombu dojo in Tokyo was in any case being used as a refugee centre for citizens displaced by the severe firebombing. It was during this period that Ueshiba met and befriended Koun Nakanishi, an expert in kotodama. The study of kotodama was to become one of Ueshiba's passions in later life, and Nakanishi's work inspired Ueshiba's concept of "takemusu aiki".
The rural nature of his new home in Iwama allowed Ueshiba to concentrate on the second great passion of his life: farming. He had been born into a farming family and had spent much of his life cultivating the land, from his settlement days in Hokkaidō to his work in Ayabe trying to make the Ōmoto-kyō compound self-sufficient. He viewed farming as a logical complement to martial arts; both were physically demanding and required single-minded dedication. Not only did his farming activities provide a useful cover for martial arts training under the government's restrictions, they also provided food for Ueshiba, his students and other local families at a time when food shortages were commonplace.
The government prohibition (on aikido, at least) was lifted in 1948 with the creation of the Aiki Foundation, established by the Japanese Ministry of Education with permission from the Occupation forces. The Hombu dojo re-opened the following year. After the war Ueshiba effectively retired from aikido. He delegated most of the work of running the Hombu dojo and the Aiki Federation to his son Kisshomaru, and instead chose to spend much of his time in prayer, meditation, calligraphy and farming. He still travelled extensively to promote aikido, even visiting Hawaii in 1961. He also appeared in a television documentary on aikido: NTV's "The Master of Aikido", broadcast in January 1960. Ueshiba maintained links with the Japanese nationalist movement even in later life; his student Kanshu Sunadomari reported that Ueshiba temporarily sheltered Mikami Taku, one of the naval officers involved in the May 15 Incident, at Iwama.
In 1969, Ueshiba became ill. He led his last training session on March 10, and was taken to hospital where he was diagnosed with cancer of the liver. He died suddenly on April 26, 1969. His body was buried at Kōzan-ji, and he was given the posthumous Buddhist title "Aiki-in Moritake En'yū Daidōshi" (); parts of his hair were enshrined at Ayabe, Iwama and Kumano. Two months later, his wife Hatsu ( "Ueshiba Hatsu", née "Itokawa Hatsu"; 1881–1969) also died.
Aikido—usually translated as the "Way of Unifying Spirit" or the "Way of Spiritual Harmony"—is a fighting system that focuses on throws, pins and joint locks together with some striking techniques. It emphasises protecting the opponent and promotes spiritual and social development.
The technical curriculum of aikido was derived from the teachings of Takeda Sōkaku; the basic techniques of aikido stem from his Daitō-ryū system. In the earlier years of his teaching, from the 1920s to the mid-1930s, Ueshiba taught the Daitō-ryū Aiki-jūjutsu system; his early students' documents bear the term Daitō-ryū. Indeed, Ueshiba trained one of the future highest grade earners in Daitō-ryū, Takuma Hisa, in the art before Takeda took charge of Hisa's training.
The early form of training under Ueshiba was noticeably different from later forms of aikido. It had a larger curriculum, increased use of strikes to vital points ("atemi") and a greater use of weapons. The schools of aikido developed by Ueshiba's students from the pre-war period tend to reflect the harder style of the early training. These students included Kenji Tomiki (who founded the Shodokan Aikido sometimes called Tomiki-ryū), Noriaki Inoue (who founded Shin'ei Taidō), Minoru Mochizuki (who founded Yoseikan Budo) and Gozo Shioda (who founded Yoshinkan Aikido). Many of these styles are therefore considered "pre-war styles", although some of these teachers continued to train with Ueshiba in the years after World War II.
During his lifetime, Ueshiba had three spiritual experiences that had a great impact on his understanding of the martial arts. The first occurred in 1925, after Ueshiba had defeated a naval officer's "bokken" (wooden katana) attacks unarmed and without hurting the officer. Ueshiba then walked to his garden, where he had a profound spiritual realisation.
His second experience occurred in 1940 when engaged in the ritual purification process of "misogi".
His third experience was in 1942 during the worst fighting of World War II, when Ueshiba had a vision of the "Great Spirit of Peace".
After these events, Ueshiba seemed to slowly grow away from Takeda, and he began to change his art. These changes are reflected in the differing names with which he referred to his system, first as "aiki-jūjutsu", then Ueshiba-ryū, Asahi-ryū, and "aiki budō". In 1942, when Ueshiba's group joined the Dai Nippon Butoku Kai, the martial art that Ueshiba developed finally came to be known as aikido.
As Ueshiba grew older, more skilled, and more spiritual in his outlook, his art also changed and became softer and more gentle. Martial techniques became less important, and more focus was given to the control of ki. In his own expression of the art there was a greater emphasis on what is referred to as "kokyū-nage", or "breath throws" which are soft and blending, utilizing the opponent's movement in order to throw them. Ueshiba regularly practiced cold water "misogi", as well as other spiritual and religious rites, and viewed his studies of aikido as part of this spiritual training.
Over the years, Ueshiba trained a large number of students, many of whom later became famous teachers in their own right and developed their own styles of aikido. Some of them were "uchi-deshi", live-in students. Ueshiba placed many demands on his "uchi-deshi", expecting them to attend him at all times, act as training partners (even in the middle of the night), arrange his travel plans, massage and bathe him, and assist with household chores.
There were roughly four generations of students, comprising the pre-war students (training 1921–1935), students who trained during the Second World War (c.1936–1945), the post-war students in Iwama (c.1946–1955) and the students who trained with Ueshiba during his final years (c.1956–c.1969). As a result of Ueshiba's martial development throughout his life, students from each of these generations tend to have markedly different approaches to aikido. These variations are compounded by the fact that few students trained with Ueshiba for a protracted period; only Yoichiro Inoue, Kenji Tomiki, Gozo Shioda, Morihiro Saito, Tsutomu Yukawa and Mitsugi Saotome studied directly under Ueshiba for more than five or six years. After the war, Ueshiba and the Hombu Dojo dispatched some of their students to various other countries, resulting in aikido spreading around the world.
|
https://en.wikipedia.org/wiki?curid=20069
|
Memory address register
In a computer, the memory address register (MAR) is the CPU register that either stores the memory address from which data will be fetched to the CPU, or the address to which data will be sent and stored.
In other words, the MAR holds the memory location of the data that needs to be accessed. When reading from memory, the data addressed by the MAR is fed into the MDR (memory data register) and then used by the CPU. When writing to memory, the CPU writes the data from the MDR to the memory location whose address is stored in the MAR. The MAR, which is located inside the CPU, sends the address either to RAM (random access memory) or to the cache.
The memory address register is half of a minimal interface between a microprogram and computer storage; the other half is a memory data register.
In general, the MAR is a parallel-load register that contains the next memory address to be manipulated, for example the next address to be read or written.
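To make the two cycles concrete, here is a minimal sketch of the MAR/MDR handshake in TypeScript. The ToyCPU class and its read/write methods are invented for illustration; real hardware sequences these steps with control signals rather than method calls.

```typescript
// A toy CPU model showing the role of MAR and MDR in memory access.
class ToyCPU {
  private memory: Uint8Array;
  private mar = 0; // memory address register: the address to access next
  private mdr = 0; // memory data register: the data being transferred

  constructor(memorySize: number) {
    this.memory = new Uint8Array(memorySize);
  }

  // Read cycle: MAR selects the location, memory drives MDR, CPU consumes MDR.
  read(address: number): number {
    this.mar = address;               // 1. place the address in MAR
    this.mdr = this.memory[this.mar]; // 2. memory returns data into MDR
    return this.mdr;                  // 3. CPU uses the value from MDR
  }

  // Write cycle: CPU fills MDR, MAR selects where the value is stored.
  write(address: number, value: number): void {
    this.mar = address;               // 1. place the address in MAR
    this.mdr = value;                 // 2. place the data in MDR
    this.memory[this.mar] = this.mdr; // 3. memory stores MDR at address MAR
  }
}

const cpu = new ToyCPU(256);
cpu.write(0x10, 42);
console.log(cpu.read(0x10)); // 42
```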
|
https://en.wikipedia.org/wiki?curid=20070
|
Microassembler
A microassembler is a computer program that helps prepare a microprogram, called "firmware", to control the low level operation of a computer in much the same way an assembler helps prepare higher level code for a processor. The difference is that the microprogram is usually only developed by the processor manufacturer and works intimately with the computer hardware. On a microprogrammed computer the microprogram implements the operations of the instruction set in which any normal program (including both application programs and operating systems) is written. The use of a microprogram allows the manufacturer to fix certain mistakes, including working around hardware design errors, without modifying the hardware. Another means of employing microassembler-generated microprograms is in allowing the same hardware to run different instruction sets. After it is assembled, the microprogram is then loaded to a control store to become part of the logic of a CPU's control unit.
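As a sketch of what "assembling a microprogram" means in practice, the following TypeScript fragment ORs symbolic control-line names into control-store words for a two-step instruction fetch. The control-line names and bit layout are hypothetical, not taken from any real machine.

```typescript
// Hypothetical control-word layout: each micro-operation asserts one control line.
const CONTROL_LINES: Record<string, number> = {
  MAR_IN: 1 << 0,  // latch the bus into MAR
  RAM_OUT: 1 << 1, // drive RAM contents onto the bus
  IR_IN: 1 << 2,   // latch the bus into the instruction register
  PC_OUT: 1 << 3,  // drive the program counter onto the bus
  PC_INC: 1 << 4,  // increment the program counter
};

// "Assemble" one microinstruction: OR together the named control lines.
function assembleMicroinstruction(ops: string[]): number {
  return ops.reduce((word, op) => {
    const bit = CONTROL_LINES[op];
    if (bit === undefined) throw new Error(`unknown micro-op: ${op}`);
    return word | bit;
  }, 0);
}

// A two-step instruction fetch, written symbolically and assembled into
// the control-store words that would be loaded into the control unit.
const fetchMicroprogram = [
  ["PC_OUT", "MAR_IN"],           // step 1: PC -> MAR
  ["RAM_OUT", "IR_IN", "PC_INC"], // step 2: RAM[MAR] -> IR, PC++
].map(assembleMicroinstruction);

console.log(fetchMicroprogram.map((w) => w.toString(2).padStart(5, "0")));
// -> [ "01001", "10110" ]
```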
Some microassemblers are more generalized and are not targeted at a single computer architecture. For example, through the use of macro-assembler-like capabilities, Digital Equipment Corporation used their "MICRO2" microassembler for a very wide range of computer architectures and implementations.
If a given computer implementation supports a writeable control store, the microassembler is usually provided to customers as a means of writing customized microcode.
In the process of microcode assembly it is helpful to verify the microprogram with emulation tools before distribution. Microcoding has experienced a revival in recent years, since it makes it possible to correct and optimize the firmware of processing units already manufactured or sold, in order to adapt to specific operating systems or to fix hardware bugs. However, no commonly usable microassembler is available to manipulate the microcode of today's CPUs. Knowledge of a processor's microcode is usually considered proprietary, so it is difficult to obtain information about how to modify it.
|
https://en.wikipedia.org/wiki?curid=20072
|
Machine pistol
A machine pistol is a self-loading pistol capable of fully automatic or burst fire, and sometimes also a stockless handgun-style submachine gun. The term is a calque of "Maschinenpistole", the German word for submachine guns. Machine pistols were developed during World War I and originally issued to German artillery crews who needed a self-defense weapon that was lighter than a rifle but more powerful than a standard semi-automatic pistol. Today, they are considered special-purpose weapons with limited utility, difficult to control for all but the best shooters.
The Austrians introduced the world's first machine pistol, the "Steyr Repetierpistole" M1912/P16, during World War I. The Germans also experimented with machine pistols by converting various types of semi-automatic pistols to full-auto, leading to the development of the first practical submachine gun. During World War II, machine pistol development was more or less ignored, as the major powers were focused on mass-producing submachine guns. After the war, machine pistol development was limited, and only a handful of manufacturers developed new designs, with varying degrees of success.
During World War I, a machine pistol version of the Steyr M1912 called the "Repetierpistole M1912/P16" was produced. It used a 16-round fixed magazine loaded via 8-round stripper clips, a detachable shoulder stock, and a rather large exposed semi-auto/full-auto selector on the right side of the frame above the trigger (down = semi-auto, up = full-auto). It fired the 9×23mm Steyr cartridge at a full-auto rate of fire of about 800 to 1,000 rounds per minute, and weighed about 2.6 pounds. Introduced in 1916, it is considered the world's first machine pistol; only 960 examples of the M1912/P16 were made.
The Germans also experimented with machine pistols by converting various types of semi-automatic pistols to full-auto.
The armed forces never made widespread use of these modified pistols in the First World War, and most reached only the prototype stage. The German solution emerged as the Luger P08 "Artillery Pistol", with its long barrel, tangent sights, detachable stock and 32-round drum magazine. It was issued primarily to German field and mobile artillery crews who needed a self-defence weapon that was lighter than a rifle but more effective than a standard pistol. It fired the newly developed 9mm Parabellum pistol cartridge, which was designed for low recoil without sacrificing penetration and stopping power. These machine pistols proved to be quite effective in close-range trench warfare, and led the Germans to develop the 9 mm Parabellum Bergmann MP-18, the first practical submachine gun.
The Mauser C96, introduced in 1896, was one of the first commercially successful and practical semi-automatic pistols. During World War I, the Germans experimented with machine pistols by converting both 7.63mm Mauser and 9 mm Parabellum semi-automatic C96 pistols to full-auto. In the late 1920s, Spanish gunmakers introduced select-fire copies of the C96 with 20-round detachable magazines. In the early 1930s, Mauser engineers followed suit and introduced the Model 1932 or Model 712 "Schnellfeuer" variant, which included a 20-round detachable magazine and a select-fire mechanism allowing fully automatic fire at a rate of 1,000 rounds per minute.
During World War II, machine pistol development was more or less ignored, as the major powers were focused on mass-producing submachine guns. One exception was the 9 mm Parabellum Browning Hi-Power machine pistol. The artillery version, with its adjustable tangent rear sight, shoulder stock and 13-round (later 20-round) magazine, was routinely converted to full-auto-only. In German service, it was used mainly by Waffen-SS and "Fallschirmjäger" personnel, along with the Mauser M1932 "Schnellfeuer" machine pistol.
The 9×18mm Makarov Stechkin automatic pistol (APS) is a Russian selective-fire machine pistol introduced into the Russian army in 1951. Like the Makarov, the other common Russian army pistol of this era, the Stechkin uses a simple unlocked blowback mechanism and a double-action trigger. In addition, the Stechkin APS has an automatic fire mode, which is selected using the safety lever. In burst or automatic fire, the pistol should be fitted with its wooden shoulder stock; otherwise, the weapon quickly becomes uncontrollable. The Stechkin was intended as a side arm for artillery soldiers and tank crews. In practice, it earned a strong following in the ranks of the political and criminal police, special forces and the like. Many KGB and GRU operatives favored the Stechkin for its firepower and 20-round magazine.
The Škorpion vz. 61 is a Czechoslovak 7.65 mm or .32 ACP machine pistol developed in 1959 and produced from 1961 to 1979. Although it was developed for use with security and special forces, the weapon was also accepted into service with the Czechoslovak Army, as a personal sidearm for lower-ranking army staff, vehicle drivers, armoured vehicle personnel and special forces. The Skorpion's lower powered .32 ACP cartridge, coupled with a rate-of-fire limiting device housed in the grip (which allows a reasonable rate of 850 RPM with a relatively light bolt), also makes it easier to control in full-auto than the more common 9 mm Parabellum designs. Currently the weapon is in use with the armed forces of several countries as a sidearm. The Škorpion was also licence-built in Yugoslavia, designated M84.
The Beretta M951R was based on the 9 mm Parabellum Beretta M1951 pistol and produced during the 1960s in response to a request from the Italian special forces. The primary difference between the M951R and the original M1951 lay in the fire selector lever mounted on the right side of the weapon's frame, enabling either semi-automatic or continuous fire, labelled "SEM" and "AUT" respectively. Additionally, the weapon had a heavier slide and a folding wooden forward grip; the barrel was extended, as was the magazine, increasing capacity to 10 rounds.
The Inglis Company reportedly manufactured or converted full-automatic versions of its Hi-Powers, which are said to have been used by the SAS during the Cold War; the British Army had used Inglis-made Hi-Powers before adopting the L9A1 version of the Hi-Power in 1966.
The MAC-10 and MAC-11 were 1970s blowback-operated weapons with the magazine in the pistol grip and a fire selector switch. The .45 ACP MAC-10 had a rate of fire of 1,145 rounds per minute, and the 9×19mm version 1,090 rounds per minute. The MAC-11 could fire 1,200 rounds per minute with its .380 ACP cartridges. These guns were designed by Gordon Ingram and the Military Armament Corporation in the US. The weapons were used in special operations and clandestine applications in Vietnam and by Brazilian anti-terrorist units. They could be fitted with a suppressor using their threaded barrels. While some sources call the MAC-10 and MAC-11 machine pistols, the guns are also referred to as compact submachine guns.
Since machine pistols are difficult to control when fired in full automatic mode, in the 1970s some manufacturers developed an "intermittent-fire" setting that fires a three-shot burst instead of fully automatic fire, as on the Heckler & Koch VP70. The VP70 is a 9 mm Parabellum, 18-round, double-action-only, select-fire-capable polymer-frame pistol. It was the first polymer-framed pistol, predating the Glock 17. The stock incorporates a semi-auto/three-round-burst selector, and the pistol will only fire a three-round burst with the stock attached. The cyclic rate for three-round bursts is 2,200 rounds per minute. Despite the VP70's potential, it was never adopted by the "Bundeswehr".
In 1976, a shortened machine pistol version of the 9 mm Parabellum Heckler & Koch MP5 was introduced; the MP5K (K from the German "Kurz" = "short") was designed for close quarters battle use by clandestine operations and special services. The MP5K does not have a shoulder stock, and the bolt and receiver were shortened at the rear. The resultant lighter bolt led to a higher rate of fire than the standard MP5. The barrel, cocking handle and its cover were shortened and a vertical foregrip was used to replace the standard handguard. The barrel ends at the base of the front sight, which prevents the use of any sort of muzzle device.
The Stechkin APS made a comeback in the late 1970s, when Russian Spetsnaz special forces units in Afghanistan used the suppressor-equipped APB variant for clandestine missions in enemy territory, such as during the Soviet–Afghan War.
The 9 mm Parabellum Micro Uzi is a scaled-down version of the Uzi submachine gun, first introduced in 1983. It is 460 mm (18.11 inches) long with the stock extended, and just 250 mm (9.84 inches) long with the stock folded. Its barrel length is 117 mm and its muzzle velocity is 350 m/s. Used by the Israeli Isayeret and the US Secret Service, Micro-Uzis are available in open bolt or closed bolt versions. The weapon has an additional tungsten weight on the bolt to slow the rate of fire, which would otherwise make such a lightweight weapon uncontrollable.
The 9 mm Parabellum Glock 18 is a select-fire variant of the Glock 17, developed in 1986 at the request of the Austrian counter-terrorist unit EKO Cobra. This machine pistol has a lever-type fire-control selector switch installed on the left side of the slide, at the rear serrated portion (selector lever in the bottom position for continuous fire, top setting for single fire). The firearm is typically used with an extended 33-round capacity magazine and may be fired with or without a shoulder stock. The pistol's rate of fire in fully automatic mode is approximately 1,100–1,200 rounds/min.
Introduced in 1992, the Steyr TMP ("Taktische Maschinenpistole" "tactical machine pistol") is a select-fire 9×19mm Parabellum machine pistol manufactured by Steyr Mannlicher of Austria. The magazines come in 15-, 20-, 25-, or 30-round detachable box types. A suppressor can also be fitted.
Also introduced in 1992, the 9 mm Parabellum CZ 75 AUTOMATIC is the full-auto version of the CZ75. It has a longer barrel with three vent ports. This machine pistol has a horizontal rail in front of the trigger guard through which a spare 16 or 20 round magazine can be attached and be used as a fore-grip for better control during full automatic firing.
During the 1990s, the Russian Stechkin APS was once again put into service, as a weapon for VIP bodyguards and for anti-terrorist hostage rescue teams that needed the capability for full automatic fire in emergencies.
Developed in the 1990s and 2000s, the personal defense weapon, a compact submachine gun-like firearm that can fire armor-piercing, higher-powered ammunition, began to replace the machine pistol as a self-defence side arm for artillery crews, tank crews, and helicopter pilots. Introduced in 2001, the Heckler & Koch MP7 is often called a machine pistol. The MP7 uses a short-stroke piston gas system, as used on H&K's G36 and HK416 assault rifles, in place of the blowback system traditionally seen on machine pistols. The MP7 uses 20-, 30- and 40-round magazines and fires 4.6×30mm ammunition which can penetrate soft body armor. Due to the heavy use of polymers in its construction, the MP7 is much lighter than older designs.
The dividing line between machine pistols and compact submachine guns is hard to draw. The term "submachine gun" usually refers to magazine-fed, fully automatic carbines designed to fire pistol cartridges, while the term "machine pistol" usually refers to fully automatic handgun-based weapons. However, many weapons fall into both categories.
The Škorpion vz. 61 is often called a submachine gun. However, it is small enough to be carried in a pistol holster and so is also often referred to as a machine pistol. The MAC-10, MAC-11 and the compact versions of the Uzi series have been placed in both classes. The Steyr TMP (Tactical Machine Pistol) is also called a compact submachine gun. Likewise, the German Heckler & Koch MP5K falls into both categories. Personal defense weapons (PDWs) such as the Heckler & Koch MP7 are also often called machine pistols.
Machine pistols are considered a special purpose weapon with limited utility. Due to their small size, machine pistols are difficult for all but the best shooters to control. As a result, most machine pistols are fitted with an unwieldy detachable shoulder stock. Some, such as the Heckler & Koch VP70, will only fire in semi-automatic when the stock is removed because the select-fire mechanism is incorporated into the stock. The VP70 also introduced a three-round-burst limiter to improve controllability. The Beretta 93R not only uses a detachable shoulder stock and a three-round-burst limiter, but also a folding forward hand-grip to improve controllability in full auto. The MAC-10 and MAC-11 use suppressors to reduce muzzle climb, while other designs use a combination of burst limiters, forward hand-grips, ported barrels and muzzle brakes.
Gunsite, a US firearms training facility, decided against teaching machine pistol firing when it was founded in 1976. Facility experts believed it to be "a slob's weapon, useful only by half-trained or poorly motivated troops"; they claimed that the machine pistol "hits no harder than a pistol and is no more portable than a rifle." Nevertheless, even the critics from Gunsite concede that the machine pistol is useful in a few situations, such as boarding an enemy boat in low light or repelling boarders in a naval setting. In the 1970s, International Association of Chiefs of Police weapons researcher David Steele criticized the MAC-10's accuracy, describing the MAC series as "fit only for combat in a phone booth".
Walt Rauch notes that "... despite the 50 to 70 years of bad press that has accrued to the concept of shooting a hand-held machine pistol", in which critics contend that the weapon will "spray bullets indiscriminately all over the area", he believes that the 2000s-era models such as the Glock 18 are controllable and accurate in full-auto shooting. Leroy Thompson states that "...machine pistols were reasonably good for use from within a vehicle or for issue to VIP [bodyguard] drivers to give them a marginally more effective weapon during an evacuation under fire". However, he also stated that machine pistols are "...(h)ard to control in full-auto fire", which means that there is nothing that a machine pistol "...can do that other weapons available today can't do more efficiently."
|
https://en.wikipedia.org/wiki?curid=20074
|
Adobe Flash
Adobe Flash is a deprecated multimedia software platform used for production of animations, rich Internet applications, desktop applications, mobile applications, mobile games and embedded web browser video players. Flash displays text, vector graphics and raster graphics to provide animations, video games and applications. It allows streaming of audio and video, and can capture mouse, keyboard, microphone and camera input. Related development platform Adobe AIR continues to be supported.
Artists may produce Flash graphics and animations using Adobe Animate (formerly known as Adobe Flash Professional). Software developers may produce applications and video games using Adobe Flash Builder, FlashDevelop, Flash Catalyst, or any text editor when used with the Apache Flex SDK.
End-users can view Flash content via Flash Player (for web browsers), AIR (for desktop or mobile apps) or third-party players such as Scaleform (for video games). Adobe Flash Player (supported on Microsoft Windows, macOS and Linux) enables end-users to view Flash content using web browsers. Adobe Flash Lite enabled viewing Flash content on older smartphones, but has been discontinued and superseded by Adobe AIR.
The ActionScript programming language allows the development of interactive animations, video games, web applications, desktop applications and mobile applications. Programmers can implement Flash software using an IDE such as Adobe Animate, Adobe Flash Builder, Adobe Director, FlashDevelop and Powerflasher FDT. Adobe AIR enables full-featured desktop and mobile applications to be developed with Flash and published for Windows, macOS, Android, iOS, Xbox One, PlayStation 4, Wii U, and Nintendo Switch.
Although Flash was previously a dominant platform for online multimedia content, it is slowly being abandoned as Adobe favors a transition to HTML5, Unity, or other platforms. Flash Player has been deprecated and has an official end-of-life on December 31, 2020. However, Adobe will continue to develop Adobe Animate, which will focus on supporting web standards such as HTML5 instead of the obsolete Flash format.
In the early 2000s, Flash was widely installed on desktop computers, and was commonly used to display interactive web pages, online games, and to playback video and audio content. In 2005, YouTube was founded by former PayPal employees, and it used Flash Player as a means to display compressed video content on the web.
Between 2000 and 2010, numerous businesses used Flash-based websites to launch new products, or to create interactive company portals. Notable users include Nike, Hewlett-Packard, Nokia, General Electric, World Wildlife Fund, HBO, Cartoon Network, Disney and Motorola. After Adobe introduced hardware-accelerated 3D for Flash (Stage3D), Flash websites saw a growth of 3D content for product demonstrations and virtual tours.
In 2010, YouTube offered videos in HTML5 format to support the iPhone and iPad, which did not support Flash Player. After a controversy with Apple, Adobe stopped developing Flash Player for mobile devices, focusing its efforts on Adobe AIR applications and HTML5 animation. In 2015, Google introduced Google Swiffy to convert Flash animation to HTML5, a tool Google would use to automatically convert Flash web ads for mobile devices. In 2016, Google discontinued Swiffy and its support. In 2015, YouTube switched to HTML5 technology on all devices; however, it preserved the Flash-based video player for older web browsers.
After Flash 5 introduced ActionScript in 2000, developers combined the visual and programming capabilities of Flash to produce interactive experiences and applications for the Web. Such Web-based applications eventually came to be known as "Rich Internet Applications" (RIAs).
In 2004, Macromedia Flex was released, and specifically targeted the application development market. Flex introduced new user interface components, advanced data visualization components, data remoting, and a modern IDE (Flash Builder). Flex competed with Asynchronous JavaScript and XML (AJAX) and Microsoft Silverlight during its tenure. Flex was upgraded to support integration with remote data sources, using AMF, BlazeDS, Adobe LiveCycle, Amazon Elastic Compute Cloud, and others. As of 2015, Flex applications can be published for desktop platforms using Adobe AIR.
Between 2006 and 2016, the Speedtest.net web service conducted over 9.0 billion speed tests using an RIA built with Adobe Flash. In 2016, the service shifted to HTML5 due to the decreasing availability of Adobe Flash Player on PCs.
As of 2016, Web applications and RIAs can be developed with Flash using the ActionScript 3.0 programming language and related tools such as Adobe Flash Builder. Third-party IDEs such as FlashDevelop and Powerflasher FDT also enable developers to create Flash games and applications and are generally similar to Microsoft Visual Studio. Flex applications are typically built using Flex frameworks such as PureMVC.
Flash video games are popular on the Internet, with portals like Newgrounds, Miniclip, and Armor Games dedicated to hosting of Flash-based games. Popular games developed with Flash include "Angry Birds", "Clash of Clans", "FarmVille", "AdventureQuest", "Machinarium", "Hundreds", "N", "QWOP" and "Solipskier".
Adobe introduced various technologies to help build video games, including Adobe AIR (to release games for desktop or mobile platforms), Adobe Scout (to improve performance), CrossBridge (to convert C++-based games to run in Flash), and Stage3D (to support GPU-accelerated video games). 3D frameworks like Away3D and Flare3D simplified creation of 3D content for Flash.
Adobe AIR allows the creation of Flash-based mobile games, which may be published to the Google Play and Apple app stores.
Flash is also used to build interfaces and HUDs for 3D video games using Scaleform GFx, a technology that renders Flash content within non-Flash video games. Scaleform is supported by more than 10 major video game engines including Unreal Engine, UDK, CryEngine and PhyreEngine, and has been used to provide 3D interfaces for more than 150 major video game titles since its launch in 2003.
Adobe Animate is one of the common animation programs for low-cost 2D television and commercial animation, in competition with Anime Studio and Toon Boom Animation.
Notable users of Flash include DHX Media Vancouver for productions including "Pound Puppies", Fresh TV for "Total Drama", Nelvana for "6teen" and "Clone High", Williams Street for "Metalocalypse" and "Squidbillies", and Nickelodeon Animation Studios for "Wow! Wow! Wubbzy!", among others.
Flash is less commonly used for feature-length animated films; however, 2009's "The Secret of Kells", an Irish film, was animated primarily in Adobe Flash, and was nominated for an Academy Award for Best Animated Feature at the 82nd Academy Awards.
Several popular online series are currently produced in Flash, such as the Emmy Award-winning "Off-Mikes", produced by ESPN and Animax Entertainment; "Happy Tree Friends"; "Gotham Girls", produced by Warner Brothers; "Crime Time", produced by Future Thought Productions and "Homestar Runner" produced by Mike and Matt Chapman.
Various third-party software packages designed for traditionally trained cartoonists and animators can publish animations in the SWF format.
The precursor to Flash was a product named SmartSketch, published by FutureWave Software in 1993. The company was founded by Charlie Jackson, Jonathan Gay, and Michelle Welsh. SmartSketch was a vector drawing application for pen computers running the PenPoint OS. When PenPoint failed in the marketplace, SmartSketch was ported to Microsoft Windows and Mac OS.
As the Internet became more popular, FutureWave realized the potential for a vector-based web animation tool that might challenge Macromedia Shockwave technology. In 1995, FutureWave modified SmartSketch by adding frame-by-frame animation features and released this new product as FutureSplash Animator on Macintosh and PC.
FutureWave approached Adobe Systems with an offer to sell them FutureSplash in 1995, but Adobe turned down the offer at that time. Microsoft wanted to create an "online TV network" (MSN 2.0) and adopted FutureSplash animated content as a central part of it. Disney Online used FutureSplash animations for their subscription-based service Disney's Daily Blast. Fox Broadcasting Company launched The Simpsons using FutureSplash.
In November 1996, FutureSplash was acquired by Macromedia, and Macromedia re-branded and released "FutureSplash Animator" as "Macromedia Flash 1.0". Flash was a two-part system, a graphics and animation editor known as Macromedia Flash, and a player known as Macromedia Flash Player.
"FutureSplash Animator" was an animation tool originally developed for pen-based computing devices. Due to the small size of the "FutureSplash Viewer", it was particularly suited for download on the Web. Macromedia distributed Flash Player as a free browser plugin in order to quickly gain market share. By 2005, more computers worldwide had Flash Player installed than any other Web media format, including Java, QuickTime, RealNetworks and Windows Media Player.
Macromedia upgraded the Flash system between 1996 and 1999 adding MovieClips, Actions (the precursor to ActionScript), Alpha transparency, and other features. As Flash matured, Macromedia's focus shifted from marketing it as a graphics and media tool to promoting it as a Web application platform, adding scripting and data access capabilities to the player while attempting to retain its small footprint.
In 2000, the first major version of ActionScript was developed, and released with "Flash 5". ActionScript 2.0 was released with "Flash MX 2004" and supported object-oriented programming, improved UI components and other programming features. The last version of Flash released by Macromedia was "Flash 8", which focused on graphical upgrades such as filters (blur, drop shadow, etc.), blend modes (similar to Adobe Photoshop), and advanced features for FLV video.
Macromedia was acquired by Adobe Systems on December 3, 2005, and the entire Macromedia product line including Flash, Dreamweaver, Director/Shockwave, Fireworks (which has since been discontinued) and Authorware is now handled by Adobe.
In 2007, Adobe's first version release was "Adobe Flash CS3 Professional", the ninth major version of Flash. It introduced the ActionScript 3.0 programming language, which supported modern programming practices and enabled business applications to be developed with Flash. Adobe Flex Builder (built on Eclipse) targeted the enterprise application development market, and was also released the same year. Flex Builder included the Flex SDK, a set of components that included charting, advanced UI, and data services ("Flex Data Services").
In 2008, Adobe released the tenth version of Flash, "Adobe Flash CS4". Flash 10 improved animation capabilities within the Flash editor, adding a motion editor panel (similar to Adobe After Effects), inverse kinematics (bones), basic 3D object animation, object-based animation, and other text and graphics features. "Flash Player 10" included an in-built 3D engine (without GPU acceleration) that allowed basic object transformations in 3D space (position, rotation, scaling).
Also in 2008, Adobe released the first version of Adobe Integrated Runtime (later re-branded as "Adobe AIR"), a runtime engine that replaced Flash Player, and provided additional capabilities to the ActionScript 3.0 language to build desktop and mobile applications. With AIR, developers could access the file system (the user's files and folders), and connected devices such as a joystick, gamepad, and sensors for the first time.
In 2011, "Adobe Flash Player 11" was released, and with it the first version of Stage3D, allowing GPU-accelerated 3D rendering for Flash applications and games on desktop platforms such as Microsoft Windows and Mac OS X. Adobe further improved 3D capabilities from 2011 to 2013, adding support for 3D rendering on Android and iOS platforms, alpha-channels, compressed textures, texture atlases, and other features. Adobe AIR was upgraded to support 64-bit computers, and to allow developers to add additional functionality to the AIR runtime using "AIR Native Extensions" (ANE).
In 2014, Adobe AIR reached a milestone with over 100,000 unique applications built, and over 1 billion installations logged across the world (May 2014). Adobe AIR was voted the "Best Mobile Application Development" product at the Consumer Electronics Show on two consecutive years (CES 2014 and CES 2015). In 2016, Adobe renamed Flash Professional, the primary authoring software for Flash content, to Adobe Animate to reflect its growing use for authoring HTML5 content in favour of Flash content.
On May 1, 2008, Adobe announced the "Open Screen Project", with the intent of providing a consistent application interface across devices such as personal computers, mobile devices, and consumer electronics. When the project was announced, seven goals were outlined: the abolition of licensing fees for Adobe Flash Player and Adobe Integrated Runtime, the removal of restrictions on the use of the Shockwave Flash (SWF) and Flash Video (FLV) file formats, the publishing of application programming interfaces for porting Flash to new devices, and the publishing of The Flash Cast protocol and Action Message Format (AMF), which let Flash applications receive information from remote databases.
The specifications removing the restrictions on the use of the SWF and FLV/F4V formats have since been published. The Flash Cast protocol (now known as the Mobile Content Delivery Protocol) and the AMF protocol have also been made available, with AMF available as an open-source implementation, BlazeDS.
The list of mobile device providers who have joined the project includes Palm, Motorola, and Nokia, who, together with Adobe, announced a $10 million Open Screen Project fund. The Open Screen Project is no longer accepting new applications, according to partner BSQuare; however, paid licensing is still an option for device makers who want to use Adobe software.
Although Flash was previously a dominant platform for online multimedia content, it is slowly being abandoned as Adobe favors a transition to HTML5, owing to Flash's inherent security flaws and the significant resources required to maintain the platform. Apple restricted the use of Flash on iOS in 2010 due to concerns that it performed poorly on mobile devices, had a negative impact on battery life, and was unnecessary for online content. As a result, Flash was not adopted by Apple for its smartphone and tablet devices, which reduced its user base and encouraged wider adoption of HTML5 features such as the canvas and video elements, which can replace Flash without the need for plugins. In 2015, Adobe rebranded its Flash authoring environment as Adobe Animate to emphasize its expanded support for HTML5 authoring, and stated that it would "encourage content creators to build with new web standards" rather than using Flash. In July 2017, Adobe announced that it would declare Flash end-of-life at the end of 2020 and would cease support, distribution, and security updates for Flash Player. After the announcement, developers started a petition to turn Flash into an open-source project, which proved controversial.
The Flash Platform will continue in the form of Adobe AIR, which Adobe will continue to develop, and OpenFL, a multi-target open-source implementation of the Flash API. Additionally, Adobe Animate will continue to be developed by Adobe even after 2020.
Starting with Chrome 76 and Firefox 69, Flash is disabled by default, and browsers do not even show a prompt to activate Flash content. Users who want to play Flash content need to manually set the browser to prompt for Flash content, and then, during each browser session, enable the Flash plugin for every site individually. Furthermore, browsers show warnings that Flash will be removed entirely after December 2020. Microsoft Edge, based on Chromium, will follow the same plan as Google Chrome.
Google Chrome will block the Flash plugin as "out of date" in January 2021 and eventually remove it from the source code. Flash support will also be completely removed from Firefox in December 2020. In a move to further reduce the number of Flash Player installations, Adobe announced plans to add a "time bomb" to Flash to disable existing installations past the end-of-life date, to prompt users to uninstall Flash, and to remove all existing download links for Flash installers.
Flash source files are in the FLA format and contain graphics and animation, as well as embedded assets such as bitmap images, audio files, and FLV video files. The Flash source file format is a proprietary format and Adobe Animate is the only available authoring tool capable of editing such files. Flash source files (.fla) may be compiled into Flash movie files (.swf) using Adobe Animate. Note that FLA files can be edited, but output (.swf) files cannot.
Flash movie files are in the SWF format, traditionally called "ShockWave Flash" movies, "Flash movies", or "Flash applications", and usually have a .swf file extension. They may be used as a web page plug-in object, "played" in a standalone Flash Player, or incorporated into a self-executing projector movie (with the .exe extension on Microsoft Windows). Flash Video files have a .flv file extension and are either used from within .swf files or played through an FLV-aware player such as VLC, or QuickTime and Windows Media Player with external codecs added.
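Because SWF files begin with a small fixed header, the different file flavors can be distinguished programmatically. The following Node.js/TypeScript sketch reads the published 8-byte header layout (3-byte signature, 1-byte version, 4-byte little-endian length); the file name is a placeholder.

```typescript
// Inspect the 8-byte SWF header of a file on disk.
import { readFileSync } from "fs";

function inspectSwf(path: string): void {
  const buf = readFileSync(path);
  // "FWS" = uncompressed, "CWS" = zlib-compressed, "ZWS" = LZMA-compressed
  const signature = buf.toString("ascii", 0, 3);
  const version = buf.readUInt8(3);     // SWF format version number
  const length = buf.readUInt32LE(4);   // uncompressed file length in bytes
  console.log(`${path}: signature=${signature} version=${version} length=${length}`);
}

inspectSwf("movie.swf"); // "movie.swf" is a placeholder file name
```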
The use of vector graphics combined with program code allows Flash files to be smaller—and thus allows streams to use less bandwidth—than the corresponding bitmaps or video clips. For content in a single format (such as just text, video, or audio), other alternatives may provide better performance and consume less CPU power than the corresponding Flash movie, for example, when using transparency or making large screen updates such as photographic or text fades.
In addition to a vector-rendering engine, the Flash Player includes a virtual machine called the ActionScript Virtual Machine (AVM) for scripting interactivity at run-time, with video, MP3-based audio, and bitmap graphics. As of Flash Player 8, it offers two video codecs: On2 Technologies VP6 and Sorenson Spark, and run-time JPEG, Progressive JPEG, PNG, and GIF capability.
Flash Player 11 introduced a full 3D shader API, called Stage3D, which is fairly similar to WebGL. Stage3D enables GPU-accelerated rendering of 3D graphics within Flash games and applications, and has been used to build "Angry Birds" and other notable games.
Various 3D frameworks have been built for Flash using Stage3D, such as Away3D 4, CopperCube, Flare3D and Starling. Professional game engines like Unreal Engine and Unity also export Flash versions which use Stage3D to render 3D graphics.
Virtually all browser plugins for video are free of charge and cross-platform, including Adobe's offering of Flash Video, which was introduced with Flash version 6. Flash Video has been a popular choice for websites due to the large installed user base and programmability of Flash. In 2010, Apple publicly criticized Adobe Flash, including its implementation of video playback for not taking advantage of hardware acceleration, one reason Flash is not to be found on Apple's mobile devices. Soon after Apple's criticism, Adobe demoed and released a beta version of Flash 10.1, which uses available GPU hardware acceleration even on a Mac. Flash 10.2 beta, released December 2010, adds hardware acceleration for the whole video rendering pipeline.
Flash Player supports two distinct modes of video playback, and hardware accelerated video decoding may not be used for older video content. Such content causes excessive CPU usage compared to comparable content played with other players.
In tests done by Ars Technica in 2008 and 2009, Adobe Flash Player performed better on Windows than Mac OS X and Linux with the same hardware.
Performance later improved for the latter two: on Mac OS X with Flash Player 10.1, and on Linux with Flash Player 11.
Flash audio is most commonly encoded in MP3 or AAC (Advanced Audio Coding); however, it can also use ADPCM, Nellymoser (Nellymoser Asao Codec) and Speex audio codecs. Flash allows sample rates of 11, 22 and 44.1 kHz. It cannot use a 48 kHz audio sample rate, the standard sample rate for TV and DVD.
On August 20, 2007, Adobe announced on its blog that, with Update 3 of Flash Player 9, Flash Video would also support parts of the MPEG-4 international standards. Specifically, Flash Player would work with video compressed in H.264 (MPEG-4 Part 10) and audio compressed using AAC (MPEG-4 Part 3), within the F4V, MP4 (MPEG-4 Part 14), M4V, M4A, 3GP and MOV multimedia container formats. It would also support the 3GPP Timed Text specification (MPEG-4 Part 17), a standardized subtitle format, and offer partial parsing capability for the "ilst" atom, the ID3 equivalent that iTunes uses to store metadata. MPEG-4 Part 2 and H.263 would not work in the F4V file format. Adobe also announced that it would gradually move away from the FLV format to the standard ISO base media file format (MPEG-4 Part 12), owing to functional limits of the FLV structure when streaming H.264. The final release of Flash Player implementing these parts of the MPEG-4 standards became available in fall 2007.
Adobe Flash Player 10.1 does not have acoustic echo cancellation, unlike the VoIP offerings of Skype and Google Voice, making this and earlier versions of Flash less suitable for group calling or meetings. Flash Player 10.3 Beta incorporates acoustic echo cancellation.
"ActionScript" is the programming language used by Flash. It is an enhanced superset of the ECMAScript programming language, with a classical Java-style class model, rather than JavaScript's prototype model.
In October 1998, Macromedia disclosed the Flash Version 3 Specification on its website. It did this in response to many new and often semi-open formats competing with SWF, such as Xara's Flare and Sharp's Extended Vector Animation formats. Several developers quickly created a C library for producing SWF. In February 1999, MorphInk 99 was introduced, the first third-party program to create SWF files. Macromedia also hired Middlesoft to create a freely available developers' kit for the SWF file format versions 3 to 5.
Macromedia made the Flash file format specifications for versions 6 and later available only under a non-disclosure agreement, but they are widely available from various sites.
In April 2006, the Flash SWF file format specification was released with details on the then newest version format (Flash 8). Although still lacking specific information on the incorporated video compression formats (On2, Sorenson Spark, etc.), this new documentation covered all the new features offered in Flash v8 including new ActionScript commands, expressive filter controls, and so on. The file format specification document is offered only to developers who agree to a license agreement that permits them to use the specifications only to develop programs that can export to the Flash file format. The license does not allow the use of the specifications to create programs that can be used for playback of Flash files. The Flash 9 specification was made available under similar restrictions.
In June 2009, Adobe launched the Open Screen Project, which made the SWF specification available without restrictions. Previously, developers could not use the specification for making SWF-compatible players, but only for making SWF-exporting authoring software. The specification still omits information on codecs such as Sorenson Spark, however.
The Adobe Animate authoring program is primarily used to design graphics and animation and to publish them for websites, web applications, and video games. The program also offers limited support for audio and video embedding and for ActionScript scripting.
Adobe released Adobe LiveMotion, designed to create interactive animation content and export it to a variety of formats, including SWF. LiveMotion failed to gain any notable user base.
In February 2003, Macromedia purchased Presedia, which had developed a Flash authoring tool that automatically converted PowerPoint files into Flash. Macromedia subsequently released the new product as Breeze, which included many new enhancements.
Various free and commercial software packages can output animations into the Flash SWF format, suitable for display on the web.
The Flash 4 Linux project was an initiative to develop an open-source Linux application as an alternative to Adobe Animate. Development plans included authoring capability for 2D animation and tweening, as well as output to the SWF file format. F4L evolved into an editor capable of authoring 2D animation and publishing SWF files. Flash 4 Linux was later renamed UIRA, which intended to combine the resources and knowledge of the F4L and Qflash projects, both open-source applications that aimed to provide an alternative to the proprietary Adobe Flash.
Adobe provides a series of tools to develop software applications and video games for Flash.
Third-party development tools have been created to assist developers in creating software applications and video games with Flash.
Adobe Flash Player is the multimedia and application player originally developed by Macromedia and acquired by Adobe Systems. It plays SWF files, which can be created by Adobe Animate, Apache Flex, or a number of other Adobe Systems and 3rd party tools. It has support for a scripting language called ActionScript, which can be used to display Flash Video from an SWF file.
Scaleform GFx is a commercial alternative Flash player that features fully hardware-accelerated 2D graphics rendering using the GPU. Scaleform has high conformance with both Flash 10 ActionScript 3 and Flash 8 ActionScript 2. Scaleform GFx is a game development middleware solution that helps create graphical user interfaces or HUDs within 3D video games. It does not work with web browsers.
IrfanView, an image viewer, uses Flash Player to display SWF files.
OpenFL is an open-source implementation of the Adobe Flash API. It allows developers to build a single application against the OpenFL APIs and simultaneously target multiple platforms including iOS, Android, HTML5 (choice of Canvas, WebGL, SVG or DOM), Windows, macOS, Linux, WebAssembly, Flash, AIR, PlayStation 4, PlayStation 3, PlayStation Vita, Xbox One, Wii U, TiVo, Raspberry Pi, and Node.js.
OpenFL mirrors the Flash API for graphical operations. OpenFL applications can be written in Haxe, JavaScript (EcmaScript 5 or 6+), or TypeScript.
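To show what "mirroring the Flash API" looks like in practice, here is a minimal TypeScript sketch. It assumes the "openfl" npm package and its per-class module paths; treat the import style and setup as indicative rather than definitive.

```typescript
// A sketch of OpenFL's Flash-style display list in TypeScript. The import
// path mirrors the Flash API's package structure
// (flash.display.Sprite -> openfl/display/Sprite).
import Sprite from "openfl/display/Sprite";

class RedSquare extends Sprite {
  constructor() {
    super();
    // The graphics object exposes the same vector-drawing API Flash had.
    this.graphics.beginFill(0xff0000);
    this.graphics.drawRect(0, 0, 100, 100);
    this.graphics.endFill();
  }
}

// Attaching the sprite to a Stage (omitted here) would render it via
// Canvas or WebGL, depending on the build target.
export default RedSquare;
```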
Lightspark is a free and open source SWF player that supports most of ActionScript 3.0 and has a Mozilla-compatible plug-in. It will fall back on Gnash, a free SWF player supporting ActionScript 1.0 and 2.0 (AVM1) code. Lightspark supports OpenGL-based rendering for 3D content. The player is also compatible with H.264 Flash videos on YouTube.
Gnash aims to create a software player and browser-plugin replacement for the Adobe Flash Player. Gnash can play SWF files up to version 7 and supports about 80% of ActionScript 2.0. Gnash runs on Windows, Linux and other platforms, on both 32-bit and 64-bit architectures, but development has slowed significantly in recent years.
Shumway was an open source Flash Player released by Mozilla in November 2012. It was built in JavaScript and is thus compatible with modern web-browsers. In early October 2013, Shumway was included by default in the Firefox nightly branch. Shumway rendered Flash contents by translating contents inside Flash files to HTML5 elements, and running an ActionScript interpreter in JavaScript. It supported both AVM1 and AVM2, and ActionScript versions 1, 2, and 3. Development of Shumway ceased in early 2016.
Adobe Flash has been deprecated. The latest version of Adobe Flash Player is available for three major desktop platforms: Windows, macOS and Linux. On Linux, the PPAPI plug-in is available; the NPAPI version went without major-version updates for some time, until Adobe reversed its earlier plan to discontinue it in 2017 and resumed support.
Adobe Flash Player is available in four flavors:
The "ActiveX" version is an ActiveX control for use in Internet Explorer and any other Windows applications that support ActiveX technology. The "Plug-in" versions are available for browsers supporting either NPAPI or PPAPI plug-ins on Microsoft Windows, macOS and Linux. The "projector" version is a standalone player that can open SWF files directly.
The following table documents Flash Player and Adobe AIR support on desktop operating systems:
Adobe AIR, version 18, contains Adobe Flash Player 18, and is available for Windows XP and later, as well as macOS. Official support for desktop Linux distributions ceased in June 2011 with version 2.6. The latest Adobe AIR is AIR 32, while HARMAN supplies AIR 33.
Adobe Flash Player was available for a variety of mobile operating systems, including Android (between versions 2.2 and 4.0.4), Pocket PC/Windows CE, QNX (e.g. on BlackBerry PlayBook), Symbian, Palm OS, and webOS (since version 2.0). Flash Player for smart phones was made available to handset manufacturers at the end of 2009.
However, in November 2011, Adobe announced the withdrawal of support for Flash Player on mobile devices. Adobe continues to support deploying Flash-based content as mobile applications via Adobe AIR.
Adobe reaffirmed its commitment to "aggressively contribute" to HTML5. Adobe announced the end of Flash for mobile platforms and TV, instead focusing on HTML5 for browser content and Adobe AIR for the various mobile application stores, describing the change as "the beginning of the end". BlackBerry Ltd (formerly known as RIM) announced that it would continue to develop Flash Player for the PlayBook.
There is no Adobe Flash Player for iOS devices (iPhone, iPad and iPod Touch). However, Flash content can still be made to run on iOS devices indirectly, for example by packaging it as a native application with Adobe AIR.
The mobile version of Internet Explorer for Windows Phone cannot play Flash content; however, Flash support is still present in the tablet version of Windows.
Adobe AIR was released in 2008, and allows the creation of mobile applications and mobile games using Flash and ActionScript. Notable mobile games built with Flash include "Angry Birds", "Machinarium" and "Defend Your Castle".
Using AIR, developers can access the full Adobe Flash functionality, including text, vector graphics, raster graphics, video, audio, camera and microphone capability. Adobe AIR also includes additional features such as file system integration, native client extensions, desktop integration and access to connected devices and sensors.
AIR applications can be published as native phone applications on certain mobile operating systems, such as Android (ARM Cortex-A8 and above) and Apple iOS.
Adobe Flash Lite is a lightweight version of Adobe Flash Player intended for mobile phones and other portable electronic devices like Chumby and iRiver.
On the emerging single-board computer market, substantially popularized by the Raspberry Pi, official support from Adobe is lacking. However, the open-source player Gnash has been ported and found to be useful.
OpenFL, the open-source implementation of Adobe Flash technology described above, lets developers build a single application against the OpenFL APIs and simultaneously target multiple platforms, including Flash/AIR, HTML5, Windows, Android, Tizen, Neko, BlackBerry, and webOS. OpenFL mirrors the Flash API for graphical operations, and OpenFL applications are written in Haxe, a multi-platform programming language.
More than 500 video games have been developed with OpenFL, including the BAFTA-award-winning game "Papers, Please", as well as "Rymdkapsel", "Lightbot" and "Madden NFL Mobile".
HTML5 is often cited as an alternative to Adobe Flash technology usage on web pages. Adobe released a tool that converts Flash to HTML5, and in June 2011, Google released an experimental tool that does the same. In January 2015, YouTube defaulted to HTML5 players to better support more devices.
A number of tools allow Flash content to run in web browsers using HTML5, typically by translating the contents of SWF files into HTML5 elements or by re-implementing the Flash runtime in JavaScript.
Websites built with Adobe Flash will not function on most modern mobile devices running Google Android or iOS (iPhone, iPad). The practical alternative is to use HTML5 and responsive web design to build websites that support both desktop and mobile devices.
However, Flash is still used to build mobile games using Adobe AIR. Such games will not work in mobile web browsers but must be installed via the appropriate app store.
The reliance on Adobe for decoding Flash makes its use on the World Wide Web a concern: the completeness of its public specifications is debated, and no complete implementation of Flash is publicly available in source code form with a license that permits reuse. Generally, public specifications are what make a format re-implementable (see future proofing data storage), and reusable codebases can be ported to new platforms without the endorsement of the format creator.
Adobe's restrictions on the use of the SWF/FLV specifications were lifted in February 2009 (see Adobe's Open Screen Project). However, despite the efforts of projects like Gnash, Swfdec and Lightspark, a complete free Flash player had yet to appear as of September 2011; Gnash, for example, could not yet play SWF version 10 files. Notably, Gnash was listed on the Free Software Foundation's high-priority list from at least 2007 until its removal in January 2017.
Notable advocates of free software, open standards, and the World Wide Web have warned against the use of Flash:
The founder of Mozilla Europe, Tristan Nitot, stated in 2008:
Companies building websites should beware of proprietary rich-media technologies like Adobe's Flash and Microsoft's Silverlight. (...) You're producing content for your users and there's someone in the middle deciding whether users should see your content.
Representing open standards, Håkon Wium Lie, inventor of CSS and co-author of HTML5, explained in a 2007 Google tech talk, entitled "the <video> element", the proposal of Theora as the format for HTML5 video:
I believe very strongly, that we need to agree on some kind of baseline video format if [the video element] is going to succeed. Flash is today the baseline format on the web. The problem with Flash is that it's not an open standard.
Representing the free software movement, Richard Stallman stated in a speech in 2004 that: "The use of Flash in websites is a major problem for our community."
Usability consultant Jakob Nielsen published an Alertbox in 2000 entitled "Flash: 99% Bad", stating that "Flash tends to degrade websites for three reasons: it encourages design abuse, it breaks with the Web's fundamental interaction principles, and it distracts attention from the site's core value." Some problems have since been at least partially fixed: text size can be controlled using full-page zoom, and it has been possible for authors to include alternative text in Flash since Flash Player 6.
Flash content is usually embedded using the "object" or "embed" HTML element. A web browser that does not fully implement one of these elements displays the replacement text, if supplied by the web page. Often, a plugin is required for the browser to fully implement these elements, though some users cannot or will not install it.
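As a concrete illustration, the following TypeScript sketch builds such an "object" element through the DOM, including replacement text; the file name "movie.swf" and the dimensions are placeholders.

```typescript
// Illustrative only: embed an SWF via an <object> element, with replacement
// text for browsers that do not implement the element or lack the plugin.
// "movie.swf" is a placeholder file name.
const obj = document.createElement("object");
obj.setAttribute("type", "application/x-shockwave-flash");
obj.setAttribute("data", "movie.swf");
obj.setAttribute("width", "550");
obj.setAttribute("height", "400");

const param = document.createElement("param");
param.setAttribute("name", "movie");
param.setAttribute("value", "movie.swf");
obj.appendChild(param);

// The replacement text, shown when the element is not fully implemented.
obj.appendChild(
  document.createTextNode("Flash content could not be displayed.")
);

document.body.appendChild(obj);
```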
Since Flash can be used to produce content (such as advertisements) that some users find obnoxious or that takes a large amount of bandwidth to download, some web browsers, such as Konqueror and K-Meleon, do not by default play Flash content until the user clicks on it.
Most current browsers have a feature to block plugins, loading one only when the user clicks it. Opera versions since 10.5 feature native Flash blocking, Opera Turbo requires the user to click to play Flash content, and the browser also allows the user to enable this option permanently. Both Chrome and Firefox have an option to enable "click to play plugins". Equivalent "Flash blocker" extensions are also available for many popular browsers: Firefox has Flashblock and NoScript; Internet Explorer has Foxie, which contains a number of features, one of them named Flashblock; and WebKit-based browsers under macOS, such as Apple's Safari, have ClickToFlash. In June 2015, Google announced that Chrome would "pause" advertisements and "non-central" Flash content by default.
Firefox (from version 46) rewrites old Flash-only YouTube embed code into YouTube's modern embedded player, which is capable of using either HTML5 or Flash. Such embed code is used by non-YouTube sites to embed YouTube's videos and can still be encountered, for example, on old blogs and forums.
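The general shape of such a rewrite can be sketched in TypeScript. This is a simplified illustration, not Firefox's actual implementation; the selector and the URL handling are assumptions. Old Flash-only embeds pointed at youtube.com/v/VIDEO_ID, while the modern embedded player is an iframe pointed at youtube.com/embed/VIDEO_ID.

```typescript
// Simplified sketch of rewriting old Flash-only YouTube embeds into the
// modern iframe-based player. Not Firefox's actual implementation.
function rewriteOldYouTubeEmbeds(doc: Document): void {
  const oldEmbeds = doc.querySelectorAll<HTMLEmbedElement>(
    'embed[src*="youtube.com/v/"]'
  );
  oldEmbeds.forEach((embed) => {
    // Extract the video ID from the legacy /v/ URL form.
    const match = embed.src.match(/youtube\.com\/v\/([\w-]+)/);
    if (!match) return;
    const iframe = doc.createElement("iframe");
    iframe.src = `https://www.youtube.com/embed/${match[1]}`;
    iframe.width = embed.getAttribute("width") ?? "640";
    iframe.height = embed.getAttribute("height") ?? "360";
    iframe.setAttribute("allowfullscreen", "");
    embed.replaceWith(iframe);
  });
}
```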
For many years Adobe Flash Player's security record has led many security experts to recommend against installing the player, or to block Flash content. The US-CERT has recommended blocking Flash, and security researcher Charlie Miller recommended "not to install Flash"; however, for people still using Flash, Intego recommended that users get trusted updates "only directly from the vendor that publishes them." As of February 12, 2015, Adobe Flash Player has over 400 CVE entries, of which over 300 lead to arbitrary code execution, and past vulnerabilities have enabled spying via web cameras. Security experts have long predicted the demise of Flash, saying that with the rise of HTML5 "...the need for browser plugins such as Flash is diminishing", as only 7 to 10 percent of websites still use it.
Active moves by third parties to limit the risk began with Steve Jobs in 2010 saying that Apple would not allow Flash on the iPhone, iPod Touch and iPad, citing its abysmal security record as one reason. On Mac OS X, Flash dynamically patched parts of its runtime to improve its own performance, a practice that caused general instability. In July 2015, a series of newly discovered vulnerabilities prompted Facebook's chief security officer, Alex Stamos, to call on Adobe to discontinue the software entirely and on the Mozilla Firefox, Google Chrome and Apple Safari web browsers to blacklist all earlier versions of Flash Player.
As a result, "Adobe has essentially stopped trying to do anything new and innovative with Flash."
Like the HTTP cookie, a Flash cookie (also known as a "Local Shared Object") can be used to save application data. Flash cookies are not shared across domains. An August 2009 study by Ashkan Soltani and a team of researchers at UC Berkeley found that 50% of websites using Flash were also employing Flash cookies, yet privacy policies rarely disclosed them, and user controls for privacy preferences were lacking. In version 10.2 and earlier, most browsers' cache- and history-clearing functions did not affect Flash Player's writing of Local Shared Objects to its own cache, and at the time the user community was much less aware of the existence and function of Flash cookies than of HTTP cookies. Thus, users of those versions who had deleted HTTP cookies and purged browser history files and caches may have believed that they had purged all tracking data from their computers when, in fact, Flash browsing history remained. Adobe's own Flash Website Storage Settings panel, a submenu of Adobe's Flash Settings Manager web application, and other editors and toolkits can manage settings for and delete Flash Local Shared Objects.
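The mechanism can be modelled schematically. The TypeScript sketch below imitates a Local Shared Object as a per-domain, per-name store persisted outside the browser's cookie database, loosely echoing ActionScript's SharedObject.getLocal() and flush(); the class is purely hypothetical and the "disk" is simulated in memory.

```typescript
// Purely hypothetical model of a Flash "Local Shared Object" (LSO). The
// class name and methods loosely echo ActionScript's SharedObject.getLocal()
// and flush(), but this is illustrative, not a real API.
class LocalSharedObject {
  // Simulated .sol storage keyed by (domain, name). In Flash Player this
  // lived on disk in the player's own storage area, outside the browser's
  // cookie database, which is why clearing cookies did not remove it.
  private static disk = new Map<string, Record<string, unknown>>();

  data: Record<string, unknown>;

  private constructor(private key: string) {
    // Load any previously flushed data for this (domain, name) pair.
    this.data = { ...(LocalSharedObject.disk.get(key) ?? {}) };
  }

  // Like SharedObject.getLocal(name): one store per domain and name.
  static getLocal(name: string, domain: string): LocalSharedObject {
    return new LocalSharedObject(`${domain}/${name}`);
  }

  // Like SharedObject.flush(): persist the current data.
  flush(): void {
    LocalSharedObject.disk.set(this.key, { ...this.data });
  }
}

// A tracking identifier stored this way survives the user clearing HTTP
// cookies, which is the privacy concern described above.
const so = LocalSharedObject.getLocal("visitor", "example.com");
so.data["id"] = "abc123";
so.flush();
```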
|
https://en.wikipedia.org/wiki?curid=20947
|
Brainwashing
Brainwashing (also known as mind control, menticide, coercive persuasion, thought control, thought reform, and re-education) is the concept that the human mind can be altered or controlled by certain psychological techniques. Brainwashing is said to reduce its subjects' ability to think critically or independently, to allow the introduction of new, unwanted thoughts and ideas into their minds, as well as to change their attitudes, values and beliefs.
The term "brainwashing" was first used by Edward Hunter in 1950 to describe how the Chinese government appeared to make people cooperate with them. Research into the concept also looked at Nazi Germany, at some criminal cases in the United States, and at the actions of human traffickers. In the 1970s there was considerable scientific and legal debate, as well as media attention, about the possibility of brainwashing being a factor in the conversion of young people to some new religious movements, which were often referred to as cults at the time. The concept of brainwashing is sometimes involved in lawsuits, especially regarding child custody. It can also be a theme in science fiction and in political and corporate culture.
Although the term "brainwashing" appears in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) of the American Psychiatric Association, it is not generally accepted as a scientific term.
The Chinese term "xǐnăo" (洗脑, "wash brain") was originally used to describe the coercive persuasion used under the Maoist government in China, which aimed to transform "reactionary" people into "right-thinking" members of the new Chinese social system. The term punned on the Taoist custom of "cleansing / washing the heart / mind" ("xǐxīn", 洗心) before conducting ceremonies or entering holy places.
The "Oxford English Dictionary" records the earliest known English-language usage of the word "brainwashing" in an article by journalist Edward Hunter, in "Miami News", published on 24 September 1950. Hunter was an outspoken anticommunist and was alleged to be a CIA agent working undercover as a journalist. Hunter and others used the Chinese term to explain why, during the Korean War (1950-1953), some American prisoners of war (POWs) cooperated with their Chinese captors, and even in a few cases defected to their side. British radio operator Robert W. Ford and British army Colonel James Carne also claimed that the Chinese subjected them to brainwashing techniques during their imprisonment.
The U.S. military and government laid charges of brainwashing in an effort to undermine confessions made by POWs to war crimes, including biological warfare. After Chinese radio broadcasts claimed to quote Frank Schwable, Chief of Staff of the First Marine Air Wing, admitting to participating in germ warfare, United Nations commander Gen. Mark W. Clark publicly repudiated the confessions as having been extorted from the prisoners.
Beginning in 1953, Robert Jay Lifton interviewed American servicemen who had been POWs during the Korean War, as well as priests, students, and teachers who had been held in prison in China after 1951. In addition to interviews with 25 Americans and Europeans, Lifton interviewed 15 Chinese citizens who had fled after having been subjected to indoctrination in Chinese universities. (Lifton's 1961 book "Thought Reform and the Psychology of Totalism" was based on this research.) Lifton found that when the POWs returned to the United States their thinking soon returned to normal, contrary to the popular image of "brainwashing."
In 1956, after reexamining the concept of brainwashing following the Korean War, the U.S. Army published a report entitled "Communist Interrogation, Indoctrination, and Exploitation of Prisoners of War", which called brainwashing a "popular misconception". The report concludes that "exhaustive research of several government agencies failed to reveal even one conclusively documented case of 'brainwashing' of an American prisoner of war in Korea."
For 20 years starting in the early 1950s, the United States Central Intelligence Agency (CIA) and the United States Department of Defense conducted secret research, including Project MKUltra, in an attempt to develop practical brainwashing techniques; the results are unknown (see Sidney Gottlieb). CIA experiments using various psychedelic drugs such as LSD and Mescaline drew from Nazi human experimentation.
A bipartisan Senate Armed Services Committee report, released in part in December 2008 and in full in April 2009, reported that US military trainers who came to Guantánamo Bay in December 2002 had based an interrogation class on a chart copied from a 1957 Air Force study of "Chinese Communist" brainwashing techniques. The report showed how the Secretary of Defense’s 2002 authorization of the aggressive techniques at Guantánamo led to their use in Afghanistan and in Iraq, including at Abu Ghraib.
The concept of brainwashing has been raised in the defense of criminal charges. It has also been raised in child custody cases. The 1969 to 1971 case of Charles Manson, who was said to have brainwashed his followers to commit murder and other crimes, brought the issue to renewed public attention.
In 1974, Patty Hearst, a member of the wealthy Hearst family, was kidnapped by the Symbionese Liberation Army, a left-wing militant organization. After several weeks of captivity she agreed to join the group and took part in their activities. In 1975, she was arrested and charged with bank robbery and use of a gun in committing a felony. Her attorney, F. Lee Bailey, argued at her trial that she should not be held responsible for her actions, since her treatment by her captors was the equivalent of the alleged brainwashing of Korean War POWs (see also diminished responsibility). Bailey developed his case in conjunction with psychiatrist Louis Jolyon West and psychologist Margaret Singer, both of whom had studied the experiences of Korean War POWs. (In 1996 Singer published her theories in her best-selling book "Cults in Our Midst".) Despite this defense, Hearst was found guilty.
In 1990 Steven Fishman, a member of the Church of Scientology, was charged with mail fraud for a scheme in which he conspired with minority stockholders to sue large corporations in shareholder class action lawsuits and then signed settlements that left those stockholders empty-handed. Fishman's attorneys notified the court that they intended to rely on an insanity defense, using the theories of brainwashing and the expert testimony of Singer and Richard Ofshe to claim that Scientology had practiced brainwashing on him, leaving him incapable of making independent decisions. The court ruled that brainwashing theories were inadmissible in expert testimony, citing the Frye standard, which requires that scientific theories used by expert witnesses be generally accepted in their respective fields.
In 2003, the brainwashing defense was used unsuccessfully in the defense of Lee Boyd Malvo, who was charged with murder for his part in the D.C. sniper attacks.
Some legal scholars have argued that the brainwashing defense undermines the law's fundamental premise of free will. In 2003, forensic psychologist Dick Anthony said that "no reasonable person would question that there are situations where people can be influenced against their best interests, but those arguments are evaluated on the basis of fact, not bogus expert testimony."
Italy has had controversy over the concept of "plagio", a crime consisting of the absolute psychological, and ultimately physical, domination of a person. The effect is said to be the annihilation of the subject's freedom and self-determination and the consequent negation of his or her personality. The crime of plagio has rarely been prosecuted in Italy, and only one person was ever convicted. In 1981, an Italian court found that the concept is imprecise, lacks coherence and is liable to arbitrary application.
Kathleen Barry, co-founder of the United Nations NGO, the Coalition Against Trafficking in Women (CATW), prompted international awareness of human sex trafficking in her 1979 book "Female Sexual Slavery". In his 1986 book "Woman Abuse: Facts Replacing Myths," Lewis Okun reported that: "Kathleen Barry shows in "Female Sexual Slavery" that forced female prostitution involves coercive control practices very similar to thought reform." In their 1996 book, "Casting Stones: Prostitution and Liberation in Asia and the United States", Rita Nakashima Brock and Susan Brooks Thistlethwaite report that the methods commonly used by pimps to control their victims "closely resemble the brainwashing techniques of terrorists and paranoid cults."
In the 1970s, the anti-cult movement applied the concept of brainwashing to explain seemingly sudden and dramatic religious conversions to various new religious movements (NRMs) and other groups that they considered cults. News media reports tended to support the brainwashing view and social scientists sympathetic to the anti-cult movement, who were usually psychologists, developed revised models of mind control. While some psychologists were receptive to the concept, sociologists were for the most part skeptical of its ability to explain conversion to NRMs.
Philip Zimbardo defined mind control as "the process by which individual or collective freedom of choice and action is compromised by agents or agencies that modify or distort perception, motivation, affect, cognition or behavioral outcomes", and he suggested that any human being is susceptible to such manipulation. Another adherent of this view, Jean-Marie Abgrall, was heavily criticized by forensic psychologist Dick Anthony for employing a pseudo-scientific approach and lacking any evidence that anyone's worldview was substantially changed by these coercive methods. On the contrary, Anthony argued, the concept and the fear surrounding it were used as a tool for the anti-cult movement to rationalize the persecution of minority religious groups.
Eileen Barker criticized the concept of mind control because it functioned to justify costly interventions such as deprogramming or exit counseling. She has also criticized some mental health professionals, including Singer, for accepting expert witness jobs in court cases involving NRMs. Her 1984 book "The Making of a Moonie: Choice or Brainwashing?" describes the religious conversion process to the Unification Church (whose members are sometimes informally referred to as "Moonies"), which had been one of the best known groups said to practice brainwashing. Barker spent close to seven years studying Unification Church members and wrote that she rejects the "brainwashing" theory because it explains neither the many people who attended a recruitment meeting and did not become members nor the voluntary disaffiliation of members.
James Richardson observed that if the new religious movements had access to powerful brainwashing techniques, one would expect them to have high growth rates, yet in fact most have not had notable success in recruiting or retaining members. For this and other reasons, sociologists of religion including David Bromley and Anson Shupe consider the idea that "cults" are brainwashing American youth to be "implausible." Thomas Robbins, Massimo Introvigne, Lorne Dawson, Gordon Melton, Marc Galanter, and Saul Levine, amongst other scholars researching NRMs, have argued and established to the satisfaction of courts, relevant professional associations and scientific communities that there exists no generally accepted scientific theory, based upon methodologically sound research, that supports the concept of brainwashing.
Benjamin Zablocki responded that brainwashing is not "a process that is directly observable" and that the "real sociological issue" is whether "brainwashing occurs frequently enough to be considered an important social problem". He argued that Richardson misunderstands brainwashing by conceiving of it as a recruiting process instead of a retaining process, and that the number of people who attest to brainwashing in interviews (performed in accordance with guidelines of the National Institute of Mental Health and the National Science Foundation) is too large to result from anything other than a genuine phenomenon. Zablocki also pointed out that in the two most prestigious journals dedicated to the sociology of religion there have been no articles "supporting the brainwashing perspective", while over one hundred such articles have been published in journals "marginal to the field". He concluded that the concept of brainwashing has been unfairly blacklisted.
Families of converts to NRMs have attempted to invoke brainwashing theories to satisfy conservatorship statutory guidelines. Conservatorship is a legal concept in the United States that grants a responsible person custody over another adult who cannot care for himself or herself, financially or in daily life, due to physical or mental limitations. Typically, conservatorship cases involved the elderly, mainly those suffering from dementia-related illnesses. However, conservatorship cases involving younger adults and their participation in new religious movements increased during the mid-1970s, with many U.S. judges granting temporary conservatorships. The use of brainwashing theories in conservatorship cases was deemed inadmissible as a result of the Katz v. Superior Court (1977) ruling, which implied that the statutory guideline for conservatorships referred only to "needs of health, food, clothing, and shelter" and that investigating whether a conversion is "induced by faith or by coercive persuasion is ... not in turn investigating and questioning the validity of that faith." In 2016, Israeli anthropologist of religion and fellow at the Van Leer Jerusalem Institute Adam Klin-Oron criticized then-proposed "anti-cult" legislation in Israel.
In 1983, the American Psychological Association (APA) asked Singer to chair a taskforce called the APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control (DIMPAC) to investigate whether brainwashing or coercive persuasion did indeed play a role in recruitment by NRMs.
It came to the following conclusion:
"Cults and large group awareness trainings have generated considerable controversy because of their widespread use of deceptive and indirect techniques of persuasion and control. These techniques can compromise individual freedom, and their use has resulted in serious harm to thousands of individuals and families. This report reviews the literature on this subject, proposes a new way of conceptualizing influence techniques, explores the ethical ramifications of deceptive and indirect techniques of persuasion and control, and makes recommendations addressing the problems described in the report."
On 11 May 1987, the APA's Board of Social and Ethical Responsibility for Psychology (BSERP) rejected the DIMPAC report because the report "lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur", and concluded that "after much consideration, BSERP does not believe that we have sufficient information available to guide us in taking a position on this issue."
Joost Meerloo, a Dutch psychiatrist, was an early proponent of the concept of brainwashing. ("Menticide" is a neologism he coined meaning "killing of the mind".) Meerloo's view was influenced by his experiences during the German occupation of his country and his work with the Dutch government and the American military in the interrogation of accused Nazi war criminals. He later emigrated to the United States and taught at Columbia University, and he set out his theory in his best-selling 1956 book, "The Rape of the Mind".
Russian historian Daniel Romanovsky, who interviewed survivors and eyewitnesses in the 1970s, reported on what he called "Nazi brainwashing" of the people of Belarus by the occupying Germans during the Second World War, which took place through both mass propaganda and intense re-education, especially in schools. Romanovsky noted that very soon most people had adopted the Nazi view that the Jews were an inferior race and were closely tied to the Soviet government, views that had not been at all common before the German occupation.
In George Orwell's 1948 dystopian novel "Nineteen Eighty-Four" the main character is subjected to imprisonment, isolation, and torture in order to conform his thoughts and emotions to the wishes of the rulers of Orwell's fictional future totalitarian society. Orwell's vision influenced Hunter and is still reflected in the popular understanding of the concept of brainwashing.
In the 1950s some American films were made that featured brainwashing of POWs, including "The Rack", "The Bamboo Prison", "Toward the Unknown", and "The Fearmakers". "Forbidden Area" told the story of Soviet secret agents who had been brainwashed through classical conditioning by their own government so they wouldn't reveal their identities. In 1962 "The Manchurian Candidate" (based on the 1959 novel by Richard Condon) "put brainwashing front and center" by featuring a plot by the Soviet government to take over the United States by using a brainwashed sleeper agent for political assassination. The concept of brainwashing became popularly associated with the research of Russian psychologist Ivan Pavlov, which mostly involved dogs, not humans, as subjects. In "The Manchurian Candidate" the head brainwasher is Dr. Yen Lo, of the Pavlov Institute.
The science fiction stories of Cordwainer Smith (pen name of Paul Myron Anthony Linebarger (1913-1966), a US Army officer who specialized in military intelligence and psychological warfare) depict brainwashing to remove memories of traumatic events as a normal and benign part of future medical practice. Mind control remains an important theme in science fiction. A subgenre is "corporate mind control", in which a future society is run by one or more business corporations that dominate society using advertising and mass media to control the population's thoughts and feelings. Terry O'Brien commented: "Mind control is such a powerful image that if hypnotism did not exist, then something similar would have to have been invented: The plot device is too useful for any writer to ignore. The fear of mind control is equally as powerful an image."
|
https://en.wikipedia.org/wiki?curid=20948
|
Molotov–Ribbentrop Pact
The Molotov–Ribbentrop Pact was a non-aggression pact between Nazi Germany and the Soviet Union that enabled those two powers to divide up Poland between them. It was signed in Moscow on August 23, 1939, by Foreign Ministers Joachim von Ribbentrop and Vyacheslav Molotov and was officially known as the Treaty of Non-Aggression between Germany and the Union of Soviet Socialist Republics.
The Pact's clauses provided a written guarantee of peace by each party towards the other and a declared commitment that neither government would ally itself to or aid an enemy of the other. In addition to the publicly announced stipulations of non-aggression, the treaty included a secret protocol, which defined the borders of Soviet and German spheres of influence across Poland, Lithuania, Latvia, Estonia and Finland. The secret protocol also recognised the interest of Lithuania in the Vilno region, and Germany declared its complete disinterest in Bessarabia. The existence of the secret protocol was only a rumor until it was made public at the Nuremberg trials.
Soon after the pact was finalized, Germany invaded Poland on 1 September 1939. Soviet leader Joseph Stalin ordered the Soviet invasion of Poland on 17 September, one day after a Soviet–Japanese ceasefire at Khalkhin Gol came into effect. After the invasion, the new border between the two powers was confirmed by the supplementary protocol of the German–Soviet Frontier Treaty. In March 1940, parts of the Karelia and Salla regions in Finland were annexed by the Soviet Union after the Winter War. That was followed by Soviet annexations of Estonia, Latvia, Lithuania and parts of Romania (Bessarabia, Northern Bukovina and the Hertza region). Concern about ethnic Ukrainians and Belarusians had been used as a pretext for the Soviet invasion of Poland. Stalin's invasion of Bukovina in 1940 violated the pact, since it went beyond the Soviet sphere of influence that had been agreed with the Axis.
The territories of Poland annexed by the Soviet Union after the 1939 Soviet invasion, east of the Curzon line, remained in the Soviet Union at the end of World War II and are now parts of Ukraine and Belarus. The formerly Polish Vilno region is now part of Lithuania, and the city of Vilnius is now the Lithuanian capital. Only the region around Białystok and a small part of Galicia east of the San River, around Przemyśl, were returned to Poland. Of all the other territories annexed by the Soviet Union in 1939 to 1940, those detached from Finland (Western Karelia, Petsamo), Estonia (Estonian Ingria and Petseri County) and Latvia (Abrene) remain part of Russia, the successor state to the Russian Soviet Federative Socialist Republic upon the dissolution of the Soviet Union in 1991. The territories annexed from Romania were also integrated into the Soviet Union (as the Moldavian SSR or oblasts of the Ukrainian SSR). The core of Bessarabia now forms Moldova. The northern part of Bessarabia, Northern Bukovina and Hertza now form the Chernivtsi Oblast of Ukraine. Southern Bessarabia is part of the Odessa Oblast, also in Ukraine.
The pact was terminated on 22 June 1941, when Germany launched Operation Barbarossa and invaded the Soviet Union, thereby also pursuing the ideological goal of "Lebensraum". After the war, Ribbentrop was convicted of war crimes and executed. Molotov died in 1986, aged 96, five years before the Soviet Union's dissolution. Soon after World War II, the German copy of the secret protocol was found in Nazi archives and published in the West, but the Soviet government denied its existence until 1989, when it was finally acknowledged and denounced. Mikhail Gorbachev, the last leader of the Soviet Union, condemned the pact. Vladimir Putin has condemned the pact as "immoral" but also defended it as a "necessary evil". At a press conference on 19 December 2019, Putin went further and asserted that the signing of the pact was no worse than the 1938 Munich Agreement, which led to the partition of Czechoslovakia.
The outcome of World War I was disastrous for both the German and the Russian empires. Civil War broke out in Russia in late 1917 after the Bolshevik Revolution, and Vladimir Lenin, the first leader of the newly-formed Soviet Russia, recognised the independence of Finland, Estonia, Latvia, Lithuania and Poland. Moreover, facing a German military advance, Lenin and Trotsky were forced to enter into the Treaty of Brest-Litovsk, which ceded many western Russian territories to the German Empire. After Germany's collapse, a multinational Allied-led army intervened in the Russian Civil War (1917–22).
On 16 April 1922, Germany and the Soviet Union entered the Treaty of Rapallo in which they renounced territorial and financial claims against each other. Each party also pledged neutrality in the event of an attack against the other with the Treaty of Berlin (1926). While trade between the two countries fell sharply after World War I, trade agreements signed in the mid-1920s helped to increase trade to 433 million Reichsmarks per year by 1927.
At the beginning of the 1930s, the Nazi Party's rise to power increased tensions between Germany and the Soviet Union, along with other countries with ethnic Slavs, who were considered "Untermenschen" (inferior) according to Nazi racial ideology. Moreover, the anti-Semitic Nazis associated ethnic Jews with both communism and financial capitalism, both of which they opposed. Nazi theory held that Slavs in the Soviet Union were being ruled by "Jewish Bolshevik" masters. Hitler himself had spoken of an inescapable battle for the acquisition of land for Nazi Germany in the east. The resulting manifestation of German anti-Bolshevism and an increase in Soviet foreign debts caused a dramatic decline in German–Soviet trade. Imports of Soviet goods to Germany fell to 223 million Reichsmarks in 1934 as the more isolationist Stalinist regime asserted power and the abandonment of postwar Treaty of Versailles military controls decreased Germany's reliance on Soviet imports.
In 1936, Germany and Fascist Italy supported the Spanish Nationalists in the Spanish Civil War, but the Soviets supported the Second Spanish Republic. Thus the Spanish Civil War became a proxy war between Germany and the Soviet Union. In 1936, Germany and Japan entered the Anti-Comintern Pact, and they were joined a year later by Italy.
On 31 March 1939, Britain extended a guarantee to Poland stating that "if any action clearly threatened Polish independence, and if the Poles felt it vital to resist such action by force, Britain would come to their aid". Hitler was furious since that meant that the British were committed to political interests in Europe and that Hitler's land grabs such as the takeover of Czechoslovakia would not be taken lightly anymore. Hitler's response to the political checkmate would later be heard at a rally in Wilhelmshaven: "No power on earth would be able to break German might, and if the Western Allies thought Germany would stand by while they marshalled their 'satellite states' to act in their interests, then they were sorely mistaken." Ultimately, Hitler's discontent with a British-Polish alliance led to a restructuring of strategy towards Moscow. Alfred Rosenberg wrote that he had spoken to Hermann Goering of this potential alliance with the Soviet Union: "When Germany's life is at stake, even a temporary alliance with Moscow must be contemplated". Sometime in early May 1939 at the infamous Berghof, Ribbentrop showed Hitler a film of Stalin viewing his military in a recent parade. Hitler became intrigued with the idea, and Ribbentrop recalled Hitler saying that Stalin "looked like a man he could do business with". Thereafter, Ribbentrop was given the nod to pursue negotiations with Moscow.
Hitler's fierce anti-Soviet rhetoric was one of the reasons that Britain and France decided that Soviet participation in the 1938 Munich Conference on Czechoslovakia would be both dangerous and useless. The Munich Agreement that followed it marked a partial German annexation of Czechoslovakia in late 1938, followed by its complete dissolution in March 1939, which was part of the appeasement of Germany conducted by Neville Chamberlain and Édouard Daladier's cabinets. That policy immediately raised the question of whether the Soviet Union could avoid being next on Hitler's list. The Soviet leadership believed that the West wanted to encourage German aggression in the East and that France and Britain might stay neutral in a war initiated by Germany in the hope that the warring states would wear each other out and put an end to both regimes.
For Germany, because an autarkic economic approach and an alliance with Britain were both impossible, closer relations with the Soviet Union to obtain raw materials became necessary, and not only for economic reasons. An expected British blockade in the event of war would create massive shortages for Germany in a number of key raw materials. After the Munich agreement, German military supply needs increased and Soviet demands for military machinery rose, and talks between the two countries took place from late 1938 to March 1939. The third Soviet Five Year Plan required new infusions of technology and industrial equipment. German war planners had estimated serious shortfalls of raw materials if Germany entered a war without Soviet supply.
On 31 March 1939, in response to Nazi Germany's defiance of the Munich agreement and the occupation of Czechoslovakia, Britain pledged its support and that of France to guarantee the independence of Poland, Belgium, Romania, Greece and Turkey. On 6 April, Poland and Britain agreed to formalise the guarantee as a military alliance, pending negotiations. On 28 April, Hitler denounced the 1934 German–Polish Non-Aggression Pact and the 1935 Anglo–German Naval Agreement.
In mid-March 1939, attempting to contain Hitler's expansionism, the Soviet Union, Britain and France started to trade a flurry of suggestions and counterplans regarding a potential political and military agreement. Although informal consultations commenced in April, the main negotiations began only in May. At the same time, throughout early 1939, Germany had secretly hinted to Soviet diplomats that it could offer better terms for a political agreement than Britain and France.
The Soviet Union, which feared the Western powers and the possibility of "capitalist encirclement", had little faith either that war could be avoided or in the Polish army, and it wanted nothing less than an ironclad military alliance with France and Britain that would provide guaranteed support for a two-pronged attack on Germany; thus, Stalin's adherence to the collective security line was purely conditional. Britain and France believed that war could still be avoided and that the Soviet Union, weakened by the Great Purge, could not be a main military participant, a point with which many military sources were at variance, especially because of the Soviet victories over the Japanese Kwantung Army on the Manchurian frontier. France was more anxious to find an agreement with the Soviet Union than was Britain; as a continental power, it was more willing to make concessions and more fearful of the dangers of an agreement between the Soviet Union and Germany. These contrasting attitudes partly explain why the Soviets have often been charged with playing a double game in 1939: carrying on open negotiations for an alliance with Britain and France while secretly considering propositions from Germany.
By the end of May, drafts were formally presented. In mid-June, the main tripartite negotiations started. The discussion focused on potential guarantees to Central and Eastern Europe in the case of German aggression. The Soviets proposed to consider that a political turn towards Germany by the Baltic states would constitute an "indirect aggression" towards the Soviet Union. Britain opposed such proposals for fear that the Soviets' proposed language could either justify a Soviet intervention in Finland and the Baltic states or push those countries to seek closer relations with Germany. The discussion of a definition of "indirect aggression" became one of the sticking points between the parties, and by mid-July the tripartite political negotiations had effectively stalled, while the parties agreed to start negotiations on a military agreement, which the Soviets insisted must be entered into simultaneously with any political agreement. One day before the military negotiations began, the Soviet Politburo, pessimistically expecting the coming negotiations to go nowhere, formally decided to consider German proposals seriously. The military negotiations began on 12 August in Moscow, with the British delegation headed by Sir Reginald Drax, a retired admiral, the French delegation headed by General Aimé Doumenc and the Soviet delegation headed by Kliment Voroshilov, the commissar of defence, and Boris Shaposhnikov, chief of the general staff. Without written credentials, Drax was not authorised to guarantee anything to the Soviet Union, and he had been instructed by the British government to prolong the discussions as long as possible and to avoid answering the question of whether Poland would agree to permit Soviet troops to enter the country if the Germans invaded. As the negotiations failed, a great opportunity to prevent German aggression was probably lost.
From April to July, Soviet and German officials made statements on the potential for the beginning of political negotiations while no actual negotiations were taking place. "The Soviet Union had wanted good relations with Germany for years and was happy to see that feeling finally reciprocated", wrote historian Gerhard L. Weinberg. The ensuing discussion of a potential political deal between Germany and the Soviet Union had to be channelled into the framework of economic negotiations between the two countries because close military and diplomatic connections, as was the case before the mid-1930s, had been largely severed. In May, Stalin replaced his foreign minister from 1930 to 1939, Maxim Litvinov, who had advocated rapprochement with the West and who was also Jewish, with Vyacheslav Molotov, allowing the Soviet Union more latitude in discussions with more parties, not only with Britain and France.
On 23 August 1939, two Focke-Wulf Condors carrying German diplomats, officials and photographers (about 20 in each plane), headed by Joachim von Ribbentrop, landed in Moscow. As the Nazi Party leaders stepped off the planes, a Soviet military band performed "Deutschland, Deutschland über Alles". The Nazi arrival was well planned, with all the aesthetics in order: the classic hammer and sickle was propped up next to a swastika flag that had been used in a local film studio for Soviet propaganda films. After stepping off the plane and shaking hands, Ribbentrop and Gustav Hilger, along with German ambassador Friedrich-Werner von der Schulenburg and Stalin's chief bodyguard, Nikolai Vlasik, entered a limousine operated by the NKVD to travel to Red Square. The limousine arrived close to Stalin's office, where the party was greeted by Alexander Poskrebyshev, the chief of Stalin's personal chancellery. The German officials were led up a flight of stairs to a room with lavish furnishings, where Stalin and Foreign Minister Vyacheslav Molotov greeted them, much to the Nazis' surprise. It was well known that Stalin avoided meeting foreign visitors, so his presence at the meeting showed how seriously the Soviets were taking the negotiations.
In late July and early August 1939, Soviet and German officials agreed on most of the details of a planned economic agreement and specifically addressed a potential political agreement, which the Soviets stated could only come after an economic agreement.
The Nazi presence in the Soviet capital during the negotiations was rather tense. German pilot Hans Baur recalled the Soviet secret police following his every move: their job was to inform the authorities when he left his residence and where he was headed. Baur's guide informed him: "Another car would tack itself onto us and follow us fifty or so yards in the rear, and wherever we went and whatever we did, the secret police would be on our heels". Baur also recalled trying to tip his Russian driver, which led to a harsh exchange of words: "He was furious. He wanted to know whether this was the thanks he got for having done his best for us to get him into prison. We knew perfectly well it was forbidden to take tips".
In early August, Germany and the Soviet Union worked out the last details of their economic deal and started to discuss a political alliance. Diplomats of the two countries explained to each other the reasons for their foreign policy hostility in the 1930s and found common ground in the anticapitalism of both states: "there is one common element in the ideology of Germany, Italy, and the Soviet Union: opposition to the capitalist democracies", or "it seems to us rather unnatural that a socialist state would stand on the side of the western democracies."
At the same time, British, French and Soviet negotiators scheduled three-party talks on military matters to occur in Moscow in August 1939 that aimed to define what the agreement would specify on the reaction of the three powers to a German attack. The tripartite military talks, started in mid-August, hit a sticking point on the passage of Soviet troops through Poland if Germans attacked, and the parties waited as British and French officials overseas pressured Polish officials to agree to such terms. Polish officials refused to allow Soviet troops into Polish territory if Germany attacked; Polish Foreign Minister Józef Beck pointed out that the Polish government feared that once the Red Army entered Polish territory, it would never leave.
On 19 August, the 1939 German–Soviet Commercial Agreement was finally signed. On 21 August, the Soviets suspended the tripartite military talks, citing other reasons. The same day, Stalin received assurance that Germany would approve secret protocols to the proposed non-aggression pact that would place half of Poland (east of the Vistula River), Latvia, Estonia, Finland and Bessarabia in the Soviets' sphere of influence. That night, Stalin replied that the Soviets were willing to sign the pact and that he would receive Ribbentrop on 23 August.
On 25 August 1939, the "New York Times" ran a front-page story by Otto D. Tolischus, "Nazi Talks Secret", whose subtitle included "Soviet and Reich Agree on East". On 26 August 1939, the "New York Times" reported Japanese anger and French communist surprise over the pact. The same day, however, Tolischus filed a story noting Nazi troops on the move near Gleiwitz (now Gliwice), which presaged the false flag Gleiwitz incident of 31 August 1939. On 28 August 1939, the "New York Times" was still reporting on fears of a Gleiwitz raid. On 29 August 1939, the "New York Times" reported that the Supreme Soviet had failed on its first day of convening to act on the pact. The same day, the "New York Times" also reported from Montreal, Canada, that American professor Samuel N. Harper of the University of Chicago had stated publicly his belief that "the Russo-German non-aggression pact conceals an agreement whereby Russia and Germany may have planned spheres of influence for Eastern Europe". On 30 August 1939, the "New York Times" reported a Soviet buildup on its western frontiers, with 200,000 troops moved from the Far East.
On 22 August, one day after the talks broke down with France and Britain, Moscow revealed that Ribbentrop would visit Stalin the next day, even as the Soviets were still negotiating with the British and French missions in Moscow. With the Western nations unwilling to accede to Soviet demands, Stalin instead entered a secret Nazi–Soviet pact. On 23 August, a ten-year non-aggression pact was signed, with provisions that included consultation, arbitration if either party disagreed, neutrality if either went to war against a third power, and no membership of a group "which is directly or indirectly aimed at the other". The new course had been signalled in the article "On Soviet–German Relations" in the Soviet newspaper "Izvestia" of 21 August 1939.
There was also a secret protocol to the pact, which was revealed only after Germany's defeat in 1945, although hints about its provisions had been leaked much earlier, for example to influence Lithuania. According to the protocol, Romania, Poland, Lithuania, Latvia, Estonia and Finland were divided into German and Soviet "spheres of influence". In the north, Finland, Estonia and Latvia were assigned to the Soviet sphere. Poland was to be partitioned in the event of its "political rearrangement": the areas east of the Pisa, Narev, Vistula and San Rivers would go to the Soviet Union, and Germany would occupy the west. Lithuania, adjacent to East Prussia, would be in the German sphere of influence, but a second secret protocol, agreed to in September 1939, reassigned most of Lithuania to the Soviet Union. According to that protocol, Lithuania would be granted its historical capital, Vilnius, which had been under Polish control during the interwar period. Another clause of the treaty stipulated that Germany would not interfere with the Soviet Union's actions towards Bessarabia, then part of Romania; as a result, not only Bessarabia but also the Northern Bukovina and Hertza regions were occupied by the Soviets and integrated into the Soviet Union.
At the signing, Ribbentrop and Stalin enjoyed warm conversations, exchanged toasts and further addressed the prior hostilities between the countries in the 1930s. They characterised Britain as always attempting to disrupt Soviet–German relations and stated that the Anti-Comintern Pact was not aimed at the Soviet Union but was actually aimed at the Western democracies and "frightened principally the City of London [the British financiers] and the English shopkeepers".
On 24 August, "Pravda" and "Izvestia" carried news of the pact's public portions, complete with the now-infamous front-page picture of Molotov signing the treaty with a smiling Stalin looking on. The news was met with utter shock and surprise by government leaders and media worldwide, most of whom were aware only of the British–French–Soviet negotiations, which had taken place for months. The pact was received with shock by Germany's allies, notably Japan; by the Comintern and foreign communist parties; and by Jewish communities all around the world. The same day, German diplomat Hans von Herwarth, whose grandmother was Jewish, informed Guido Relli, an Italian diplomat, and American chargé d'affaires Charles Bohlen about the secret protocol concerning vital interests in the countries' allotted "spheres of influence", without revealing the annexation rights for "territorial and political rearrangement".
"Time Magazine" repeatedly referred to the Pact as the "Communazi Pact" and its participants as "communazis" until April 1941.
Soviet propaganda and representatives went to great lengths to minimise the importance of the fact that they had opposed and fought against the Nazis in various ways for a decade prior to signing the pact. Molotov tried to reassure the Germans of his good intentions by commenting to journalists that "fascism is a matter of taste". For its part, Nazi Germany also did a public volte-face regarding its virulent opposition to the Soviet Union, but Hitler still viewed an attack on the Soviet Union as "inevitable".
Concerns over the possible existence of a secret protocol were expressed first by the intelligence organisations of the Baltic states only days after the pact was signed. Speculation grew stronger when Soviet negotiators referred to its content during negotiations for military bases in those countries (see occupation of the Baltic States).
The day after the pact was signed, the French and British military negotiation delegation urgently requested a meeting with Soviet military negotiator Kliment Voroshilov. On 25 August, Voroshilov told them that "in view of the changed political situation, no useful purpose can be served in continuing the conversation". The same day, Hitler told the British ambassador to Berlin that the pact with the Soviets prevented Germany from facing a two-front war, changing the strategic situation from that in World War I, and that Britain should accept his demands on Poland.
On 25 August, surprising Hitler, Britain entered into a defense pact with Poland. Hitler postponed his planned 26 August invasion of Poland to 1 September. In accordance with the defence pact, Britain and France declared war on Germany on 3 September.
On 1 September, Germany invaded Poland from the west. Within a few days, Germany began conducting massacres of Polish and Jewish civilians and POWs, which took place in over 30 towns and villages in the first month of the German occupation. The "Luftwaffe" also took part by strafing fleeing civilian refugees on roads and by carrying out a bombing campaign. The Soviet Union assisted German air forces by allowing them to use signals broadcast by the Soviet radio station at Minsk, allegedly "for urgent aeronautical experiments."
After the city's capture, Hitler travelled to Danzig and gave a speech celebrating its return to Germany.
In the opinion of Robert Service, Stalin did not move instantly but was waiting to see whether the Germans would halt within the agreed area, and the Soviet Union also needed to secure the frontier in the Soviet–Japanese Border Wars. On 17 September, the Red Army invaded Poland, violating the 1932 Soviet–Polish Non-Aggression Pact, and occupied the Polish territory assigned to it by the Molotov–Ribbentrop Pact. That was followed by co-ordination with German forces in Poland.
Polish troops, already fighting much stronger German forces in the west, desperately tried to delay the capture of Warsaw. Consequently, Polish forces could not mount significant resistance against the Soviets.
On 21 September, the Soviets and Germans signed a formal agreement co-ordinating military movements in Poland, including the "purging" of saboteurs. Joint German–Soviet parades were held in Lvov and Brest-Litovsk, and the countries' military commanders met in the latter city. Stalin had decided in August that he was going to liquidate the Polish state, and a German–Soviet meeting in September addressed the future structure of the "Polish region." Soviet authorities immediately started a campaign of Sovietisation of the newly-acquired areas. The Soviets organised staged elections, the result of which was to become a legitimisation of the Soviet annexation of eastern Poland.
Eleven days after the Soviet invasion of the Polish Kresy, the secret protocol of the Molotov–Ribbentrop Pact was modified by the German–Soviet Treaty of Friendship, Cooperation and Demarcation, allotting Germany a larger part of Poland and transferring Lithuania, with the exception of the left bank of the River Scheschupe (the "Lithuanian Strip"), from the envisioned German sphere to the Soviet sphere. On 28 September 1939, the Soviet Union and the German Reich issued a joint declaration calling for an end to the state of war between Germany and Britain and France.
On 3 October, Friedrich Werner von der Schulenburg, the German ambassador in Moscow, informed Joachim Ribbentrop that the Soviet government was willing to cede the city of Vilnius and its environs. On 8 October 1939, a new Nazi–Soviet agreement was reached by an exchange of letters between Vyacheslav Molotov and the German ambassador.
The Baltic states of Estonia, Latvia and Lithuania were given no choice but to sign a so-called "Pact of Defence and Mutual Assistance", which permitted the Soviet Union to station troops in them.
After the Baltic states had been forced to accept the treaties, Stalin turned his sights on Finland, confident that its capitulation could be attained without great effort. The Soviets demanded territories on the Karelian Isthmus, the islands of the Gulf of Finland and a military base near the Finnish capital, Helsinki, all of which Finland rejected. The Soviets staged the shelling of Mainila and used it as a pretext to withdraw from the Soviet–Finnish Non-Aggression Pact; the Red Army attacked in November 1939. Simultaneously, Stalin set up a puppet government in the Finnish Democratic Republic. The leader of the Leningrad Military District, Andrei Zhdanov, commissioned a celebratory piece from Dmitri Shostakovich, "Suite on Finnish Themes", to be performed as the marching bands of the Red Army paraded through Helsinki. After Finnish defences surprisingly held out for over three months and inflicted stiff losses on the Soviet forces under the command of Semyon Timoshenko, the Soviets settled for an interim peace. Finland ceded southeastern areas of Karelia (10% of Finnish territory), and approximately 422,000 Karelians (12% of Finland's population) lost their homes. Soviet official casualty counts in the war exceeded 200,000, although Soviet Premier Nikita Khrushchev later claimed that the casualties may have been as high as one million.
Around that time, after several Gestapo–NKVD Conferences, Soviet NKVD officers also conducted lengthy interrogations of 300,000 Polish POWs in camps as part of a selection process to determine who would be killed. On 5 March 1940, in what would later be known as the Katyn massacre, 22,000 members of the military as well as intellectuals, labelled "nationalists and counterrevolutionaries" and held at camps and prisons in western Ukraine and Belarus, were executed.
In mid-June 1940, while international attention focused on the German invasion of France, Soviet NKVD troops raided border posts in Lithuania, Estonia and Latvia.
State administrations were liquidated and replaced by Soviet cadres, who deported or killed 34,250 Latvians, 75,000 Lithuanians and almost 60,000 Estonians. Elections took place, with a single pro-Soviet candidate listed for many positions, and the resulting people's assemblies immediately requested admission into the Soviet Union, which was granted. (The Soviets annexed the whole of Lithuania, including the Šešupė area, which had been earmarked for Germany.)
Finally, on 26 June, four days after France had sued for an armistice with the Third Reich, the Soviet Union issued an ultimatum demanding Bessarabia and, unexpectedly, Northern Bukovina from Romania. Two days later, the Romanians acceded to the Soviet demands, and the Soviets occupied the territories. The Hertza region was not initially requested by the Soviets but was later occupied by force after the Romanians had agreed to the initial Soviet demands. Subsequent waves of deportations began in Bessarabia and Northern Bukovina.
At the end of October 1939, Germany enacted the death penalty for disobedience to the German occupation. Germany began a campaign of "Germanization", which meant assimilating the occupied territories politically, culturally, socially and economically into the German Reich. 50,000–200,000 Polish children were kidnapped to be Germanised.
The elimination of Polish elites and intelligentsia was part of Generalplan Ost. The Intelligenzaktion, a plan to eliminate the Polish intelligentsia, Poland's 'leadership class', took place soon after the German invasion of Poland and lasted from the fall of 1939 to the spring of 1940. As a result of the operation, in ten regional actions, about 60,000 Polish nobles, teachers, social workers, priests, judges and political activists were killed. The campaign was continued in May 1940, when Germany launched the AB-Aktion. More than 16,000 members of the intelligentsia were murdered in Operation Tannenberg alone.
Germany also planned to incorporate all of the land into the Third Reich. That effort resulted in the forced resettlement of two million Poles. Families were forced to travel in the severe winter of 1939–1940, leaving behind almost all of their possessions without compensation. As part of Operation Tannenberg alone, 750,000 Polish peasants were forced to leave, and their property was given to Germans. A further 330,000 were murdered. Germany planned the eventual move of ethnic Poles to Siberia.
Although Germany used forced labourers in most other occupied countries, Poles and other Slavs were viewed as inferior by Nazi propaganda and thus deemed better suited for such duties. Between 1 and 2.5 million Polish citizens were transported to the Reich for forced labour, and all Polish males were made to perform forced labour. While ethnic Poles were subject to selective persecution, all ethnic Jews were targeted by the Reich. In the winter of 1939–40, about 100,000 Jews were deported to Poland. They were initially gathered into massive urban ghettos, such as the Warsaw Ghetto, which held 380,000 people; large numbers died of starvation and disease under the harsh conditions, including 43,000 in the Warsaw Ghetto alone. Poles and ethnic Jews were imprisoned in nearly every camp of the extensive concentration camp system in German-occupied Poland and the Reich. In Auschwitz, which began operating on 14 June 1940, 1.1 million people perished.
In the summer of 1940, fear of the Soviet Union, in conjunction with German support for the territorial demands of Romania's neighbours and the Romanian government's own miscalculations, resulted in more territorial losses for Romania. Between 28 June and 4 July, the Soviet Union occupied and annexed Bessarabia, northern Bukovina and the Hertza region of Romania.
On 30 August, Ribbentrop and Italian Foreign Minister Galeazzo Ciano issued the Second Vienna Award, giving Northern Transylvania to Hungary. On 7 September, Romania ceded Southern Dobruja to Bulgaria (Axis-sponsored Treaty of Craiova). After various events over the following months, Romania increasingly took on the aspect of a German-occupied country.
The Soviet-occupied territories were converted into republics of the Soviet Union. During the two years after the annexation, the Soviets arrested approximately 100,000 Polish citizens and deported between 350,000 and 1,500,000, of whom between 250,000 and 1,000,000 died, mostly civilians. Forced resettlements into gulag labour camps and exile settlements in remote areas of the Soviet Union occurred. According to Norman Davies, almost half of them were dead by July 1940.
On 10 January 1941, Germany and the Soviet Union signed an agreement settling several ongoing issues. Secret protocols in the new agreement modified the "Secret Additional Protocols" of the German–Soviet Boundary and Friendship Treaty, ceding the Lithuanian Strip to the Soviet Union in exchange for 7.5 million dollars (31.5 million Reichsmark). The agreement formally set the border between Germany and the Soviet Union between the Igorka River and the Baltic Sea. It also extended trade regulation of the 1940 German–Soviet Commercial Agreement until 1 August 1942, increased deliveries above the levels of the first year of that agreement, settled trading rights in the Baltics and Bessarabia, calculated the compensation for German property interests in the Baltic states that were now occupied by the Soviets and covered other issues. It also covered the migration to Germany within two-and-a-half months of ethnic Germans and German citizens in Soviet-held Baltic territories and the migration to the Soviet Union of Baltic and "White Russian" "nationals" in the German-held territories.
The agreement stunned the world. John Gunther, in Moscow in August 1939, recalled, "Nothing more unbelievable could be imagined. Astonishment and scepticism turned quickly to consternation and alarm." Before the pact was announced, Western communists denied that such a treaty would be signed. Herbert Biberman, a future member of the Hollywood Ten, denounced rumours as "Fascist propaganda." Earl Browder, the head of the Communist Party USA, stated that "there is as much chance of agreement as of Earl Browder being elected president of the Chamber of Commerce."
Gunther wrote, however, that some knew "communism and Fascism were more closely allied than was normally understood," and Ernst von Weizsäcker had told Nevile Henderson on 16 August that the Soviet Union would "join in sharing in the Polish spoils." In September 1939, the Soviet Comintern suspended all anti-Nazi and antifascist propaganda and explained that the war in Europe was a matter of capitalist states attacking one another for imperialist purposes. Western communists acted accordingly; although they had previously supported collective security, they now denounced Britain and France for going to war.
When anti-German demonstrations erupted in Prague, Czechoslovakia, the Comintern ordered the Communist Party of Czechoslovakia to employ all of its strength to paralyse "chauvinist elements." Moscow soon forced the French Communist Party and the Communist Party of Great Britain to adopt anti-war positions. On 7 September, Stalin called Georgi Dimitrov, who sketched a new Comintern line on the war that stated that the war was unjust and imperialist, which was approved by the secretariat of the Comintern on 9 September. Thus, western communist parties now had to oppose the war and to vote against war credits. Although the French communists had unanimously voted in Parliament for war credits on 2 September and declared their "unshakeable will" to defend the country on 19 September, the Comintern formally instructed the party to condemn the war as imperialist on 27 September. By 1 October, the French communists advocated listening to German peace proposals, and leader Maurice Thorez deserted from the French Army on 4 October and fled to Russia. Other communists also deserted from the army.
The Communist Party of Germany took a similar stance. In "Die Welt", a communist newspaper published in Stockholm, the exiled communist leader Walter Ulbricht opposed the Allies, stating that Britain represented "the most reactionary force in the world" and arguing: "The German government declared itself ready for friendly relations with the Soviet Union, whereas the English–French war bloc desires a war against the socialist Soviet Union. The Soviet people and the working people of Germany have an interest in preventing the English war plan."
Despite a warning by the Comintern, tensions with Germany rose when the Soviets stated in September that they had to enter Poland to "protect" their ethnic Ukrainian and Belarusian brethren from Germany. Molotov later admitted to German officials that the excuse was necessary because the Kremlin could find no other pretext for the Soviet invasion.
During the early months of the pact, Soviet foreign policy became critical of the Allies and, in turn, more pro-German. During the Fifth Session of the Supreme Soviet on 31 October 1939, Molotov analysed the international situation, giving direction for communist propaganda. According to Molotov, Germany had a legitimate interest in regaining its position as a great power, and the Allies had started an aggressive war in order to maintain the Versailles system.
Germany and the Soviet Union entered an intricate trade pact on 11 February 1940 that was over four times larger than the one that the two countries had signed in August 1939. The new trade pact helped Germany surmount a British blockade. In the first year, Germany received one million tons of cereals, half-a-million tons of wheat, 900,000 tons of oil, 100,000 tons of cotton, 500,000 tons of phosphates and considerable amounts of other vital raw materials, along with the transit of one million tons of soybeans from Manchuria. Those and other supplies were being transported through Soviet and occupied Polish territories. The Soviets were to receive a naval cruiser, the plans to the battleship "Bismarck", heavy naval guns, other naval gear and 30 of Germany's latest warplanes, including the Bf 109 and Bf 110 fighters and Ju 88 bomber. The Soviets would also receive oil and electric equipment, locomotives, turbines, generators, diesel engines, ships, machine tools and samples of German artillery, tanks, explosives, chemical-warfare equipment and other items.
The Soviets also helped Germany to avoid British naval blockades by providing a submarine base, Basis Nord, in the northern Soviet Union near Murmansk, which also offered a refuelling and maintenance location and a takeoff point for raids and attacks on shipping. In addition, the Soviets provided Germany with access to the Northern Sea Route for both cargo ships and raiders (though only the commerce raider Komet used the route before the German invasion), which forced Britain to protect sea lanes in both the Atlantic and the Pacific.
The Finnish and Baltic invasions began a deterioration of relations between the Soviets and Germany. Stalin's invasions were a severe irritant to Berlin, since the intent to accomplish them had not been communicated to the Germans beforehand, and they prompted concern that Stalin was seeking to form an anti-German bloc. Molotov's reassurances only intensified the Germans' mistrust. On 16 June, as the Soviets invaded Lithuania but before they had invaded Latvia and Estonia, Ribbentrop instructed his staff "to submit a report as soon as possible as to whether in the Baltic States a tendency to seek support from the Reich can be observed or whether an attempt was made to form a bloc."
In August 1940, the Soviet Union briefly suspended its deliveries under the commercial agreement after relations were strained by disagreements over policy in Romania, the Soviet war with Finland, Germany's falling behind on its deliveries of goods under the pact, and Stalin's worry that Hitler's war with the West might end quickly after France signed an armistice. The suspension created significant resource problems for Germany. By the end of August, relations had improved again: the countries had redrawn the Hungarian and Romanian borders and settled some Bulgarian claims, and Stalin was again convinced that Germany would face a long war in the west, given Britain's improvement in its air battle with Germany and the execution of an agreement between the United States and Britain regarding destroyers and bases.
In the United States, "the leftists, of course, included the Communist Party, which during the 1939–1941 era of the Nazi–Soviet pact was slavish in its effort to appease Hitler and sabotage the Allied cause and American preparedness. Their soul mate in Congress was Vito Marcantonio of New York's American Labor Party." Despite opposition from the left and the right, American aid continued to make a short war unlikely.
However, in late August, Germany arranged its own occupation of Romania, targeting its oil fields. That move raised tensions with the Soviets, who responded that Germany was supposed to have consulted with the Soviet Union under Article III of the pact.
After Germany in September 1940 entered the Tripartite Pact with Japan and Italy, Ribbentrop wrote to Stalin, inviting Molotov to Berlin for negotiations aimed to create a 'continental bloc' of Germany, Italy, Japan and the Soviet Union that would oppose Britain and the United States. Stalin sent Molotov to Berlin to negotiate the terms for the Soviet Union to join the Axis and potentially to enjoy the spoils of the pact. After negotiations during November 1940 on where to extend the Soviet sphere of influence, Hitler broke off talks and continued planning for the eventual attempts to invade the Soviet Union.
In an effort to demonstrate peaceful intentions toward Germany, on 13 April 1941, the Soviets signed a neutrality pact with Japan, an Axis power. While Stalin had little faith in Japan's commitment to neutrality, he felt that the pact was important for its political symbolism in reinforcing a public affection for Germany. Stalin felt that there was a growing split in German circles about whether Germany should initiate a war with the Soviet Union. Stalin did not know that Hitler had been secretly discussing an invasion of the Soviet Union since summer 1940 and that Hitler had ordered his military in late 1940 to prepare for war in the East, regardless of the parties' talks of a potential Soviet entry as a fourth Axis power.
Germany unilaterally terminated the pact at 03:15 on 22 June 1941 by launching a massive attack on the Soviet Union in Operation Barbarossa. Stalin had ignored repeated warnings that Germany was likely to invade and ordered no "full-scale" mobilisation of forces, although the mobilisation was ongoing. After the launch of the invasion, the territories gained by the Soviet Union as a result of the pact were lost in a matter of weeks. The southeastern part was absorbed into Greater Germany's General Government, and the rest was integrated into the Reichskommissariats Ostland and Ukraine. Within six months, the Soviet military had suffered 4.3 million casualties, and three million more had been captured. The lucrative export of Soviet raw materials to Germany had continued uninterrupted until the outbreak of hostilities; those exports in several key areas enabled Germany to maintain its stocks of rubber and grain from the first day of the invasion to October 1941.
The German original of the secret protocols was presumably destroyed in the bombing of Germany, but in late 1943, Ribbentrop had ordered the most secret records of the German Foreign Office from 1933 onward, amounting to some 9,800 pages, to be microfilmed. When the various departments of the Foreign Office in Berlin were evacuated to Thuringia at the end of the war, Karl von Loesch, a civil servant who had worked for the chief interpreter Paul Otto Schmidt, was entrusted with the microfilm copies. He eventually received orders to destroy the secret documents but decided to bury the metal container with the microfilms as personal insurance for his future well-being. In May 1945, von Loesch approached the British Lieutenant Colonel Robert C. Thomson with the request to transmit a personal letter to Duncan Sandys, Churchill's son-in-law. In the letter, von Loesch revealed that he had knowledge of the documents' whereabouts but expected preferential treatment in return. Thomson and his American counterpart, Ralph Collins, agreed to transfer von Loesch to Marburg, in the American zone if he would produce the microfilms. The microfilms contained a copy of the Non-Aggression Treaty as well as the Secret Protocol. Both documents were discovered as part of the microfilmed records in August 1945 by US State Department employee Wendell B. Blancke, the head of a special unit called "Exploitation German Archives" (EGA).
News of the secret protocols first appeared during the Nuremberg trials. Alfred Seidl, the attorney for defendant Hans Frank, was able to place into evidence an affidavit that described them. It was written from memory by Nazi Foreign Office lawyer Friedrich Gaus, who had written the text and was present at its signing in Moscow. Later, Seidl obtained the German-language text of the secret protocols from an anonymous Allied source and attempted to place them into evidence while he was questioning witness Ernst von Weizsäcker, a former Foreign Office State Secretary. The Allied prosecutors objected, and the texts were not accepted into evidence, but Weizsäcker was permitted to describe them from memory, thus corroborating the Gaus affidavit. Finally, at the request of a "St. Louis Post-Dispatch" reporter, American deputy prosecutor Thomas J. Dodd acquired a copy of the secret protocols from Seidl and had it translated into English. They were first published on 22 May 1946 in a front-page story in that newspaper. Later, in Britain, they were published by the "Manchester Guardian".
The protocols gained wider media attention when they were included in an official State Department collection, "Nazi–Soviet Relations 1939–1941", edited by Raymond J. Sontag and James S. Beddie and published on 21 January 1948. The decision to publish the key documents on German–Soviet relations, including the treaty and protocol, had been taken already in spring 1947. Sontag and Beddie prepared the collection throughout the summer of 1947. In November 1947, Truman personally approved the publication, but it was held back in view of the Foreign Ministers Conference in London scheduled for December. Since negotiations at that conference did not prove to be constructive from an American point of view, the document edition was sent to press. The documents made headlines worldwide. State Department officials counted it as a success: "The Soviet Government was caught flat-footed in what was the first effective blow from our side in a clear-cut propaganda war."
Despite the publication of the recovered copy in Western media, for decades the official policy of the Soviet Union was to deny the existence of the secret protocol, a denial it maintained until 1989. Vyacheslav Molotov, one of the signatories, went to his grave categorically rejecting its existence. The French Communist Party did not acknowledge the existence of the secret protocol until 1968, as the party de-Stalinized.
On 23 August 1986, tens of thousands of demonstrators in 21 western cities, including New York, London, Stockholm, Toronto, Seattle, and Perth participated in Black Ribbon Day Rallies to draw attention to the secret protocols.
In response to the publication of the secret protocols and other secret German–Soviet relations documents in the State Department edition "Nazi–Soviet Relations" (1948), Stalin published "Falsifiers of History", which included the claim that during the pact's operation, Stalin had rejected Hitler's claim to share in a division of the world, without mentioning the Soviet offer to join the Axis. That version persisted, without exception, in historical studies, official accounts, memoirs, and textbooks published in the Soviet Union until its dissolution.
The book also claimed that the Munich agreement was a "secret agreement" between Germany and "the west" and a "highly important phase in their policy aimed at goading the Hitlerite aggressors against the Soviet Union."
At the behest of Mikhail Gorbachev, Alexander Nikolaevich Yakovlev headed a commission investigating the existence of such a protocol. In December 1989, the commission concluded that the protocol had existed and revealed its findings to the Congress of People's Deputies of the Soviet Union. As a result, the Congress passed a declaration confirming the existence of the secret protocols and condemning and denouncing them. Both successor states of the pact parties have declared the secret protocols to be invalid from the moment that they were signed: the Federal Republic of Germany on 1 September 1989 and the Soviet Union on 24 December 1989, following an examination of the microfilmed copy of the German originals.
The Soviet copy of the original document was declassified in 1992 and published in a scientific journal in early 1993.
In August 2009, in an article written for the Polish newspaper "Gazeta Wyborcza", Russian Prime Minister Vladimir Putin condemned the Molotov–Ribbentrop Pact as "immoral".
The new Russian nationalists and revisionists, including Russian negationist Alexander Dyukov and Nataliya Narotchnitskaya, whose book carried an approving foreword by Russian Foreign Minister Sergei Lavrov, described the pact as a necessary measure because of the British and French failure to enter into an antifascist pact.
Some scholars believe that, from the very beginning of the Tripartite negotiations between the Soviet Union, Great Britain and France, it was clear that the Soviet position required the other parties to agree to a Soviet occupation of Estonia, Latvia, and Lithuania and for Finland to be included in the Soviet sphere of influence.
On the timing of German rapprochement, many historians agree that the dismissal of Maxim Litvinov, whose Jewish ethnicity was viewed unfavorably by Nazi Germany, removed an obstacle to negotiations with Germany. Stalin immediately directed Molotov to "purge the ministry of Jews." Given Litvinov's prior attempts to create an anti-fascist coalition, association with the doctrine of collective security with France and Britain and a pro-Western orientation by the standards of the Kremlin, his dismissal indicated the existence of a Soviet option of rapprochement with Germany. Likewise, Molotov's appointment served as a signal to Germany that the Soviet Union was open to offers. The dismissal also signaled to France and Britain the existence of a potential negotiation option with Germany. One British official wrote that Litvinov's termination also meant the loss of an admirable technician or shock-absorber but that Molotov's "modus operandi" was "more truly Bolshevik than diplomatic or cosmopolitan." Carr argued that the Soviet Union's replacement of Litvinov with Molotov on 3 May 1939 indicated not an irrevocable shift towards alignment with Germany but rather was Stalin's way of engaging in hard bargaining with the British and the French by appointing a proverbial hard man to the Foreign Commissariat. Historian Albert Resis stated that the Litvinov dismissal gave the Soviets freedom to pursue faster German negotiations but that they did not abandon British–French talks. Derek Watson argued that Molotov could get the best deal with Britain and France because he was not encumbered with the baggage of collective security and could negotiate with Germany. Geoffrey Roberts argued that Litvinov's dismissal helped the Soviets with British–French talks because Litvinov doubted or maybe even opposed such discussions.
Edward Hallett Carr, a frequent defender of Soviet policy, stated: "In return for 'non-intervention' Stalin secured a breathing space of immunity from German attack." According to Carr, the "bastion" created by means of the pact "was and could only be, a line of defense against potential German attack." An important advantage, in Carr's view, was that "if Soviet Russia had eventually to fight Hitler, the Western Powers would already be involved." In recent decades, however, that view has been disputed. Historian Werner Maser stated that "the claim that the Soviet Union was at the time threatened by Hitler, as Stalin supposed ... is a legend, to whose creators Stalin himself belonged." In Maser's view, "neither Germany nor Japan were in a situation [of] invading the USSR even with the least perspective of success," a fact that cannot have been unknown to Stalin. Carr further noted that, for a long time, the primary motive of Stalin's sudden change of course was assumed to be fear of German aggressive intentions.
Soviet sources have claimed that soon after the pact was signed, both Britain and the US showed understanding that the buffer zone was necessary to keep Hitler from advancing for some time and accepted the ostensible strategic reasoning; soon after World War II ended, however, those countries changed their view. Polish newspapers have published numerous articles claiming that Russia must apologise to Poland for the pact.
Two weeks after Soviet armies had entered the Baltic states, Berlin requested Finland to permit the transit of German troops, and five weeks later Hitler issued a secret directive "to take up the Russian problem, to think about war preparations," a war whose objective would include establishment of a Baltic confederation.
Historians have debated whether Stalin was planning an invasion of German territory in the summer of 1941. Most agree that the geopolitical differences between the Soviet Union and the Axis made war inevitable and that Stalin had made extensive preparations for war and exploited the military conflict in Europe to his advantage. A number of German historians, such as Andreas Hillgruber, Rolf-Dieter Müller, and Christian Hartmann, have debunked the claim that Operation Barbarossa was a preemptive strike, although they acknowledge that the Soviets were aggressive toward their neighbors.
The pact was a taboo subject in the postwar Soviet Union. In December 1989, the Congress of People's Deputies of the Soviet Union condemned the pact and its secret protocol as "legally deficient and invalid." In modern Russia, the pact is often portrayed positively or neutrally by pro-government propaganda; for example, Russian textbooks tend to describe it as a defensive measure, not one aimed at territorial expansion. In 2009, Russian President Vladimir Putin stated that "there are grounds to condemn the Pact", but in 2014 he reversed that stance, describing it as routine and "necessary for Russia's survival". Any accusation casting doubt on the one-dimensional, positive portrayal of Russia's role in World War II is seen as highly problematic by the modern Russian state, which regards victory in the war as one of "the most venerated pillars of state ideology", legitimising the current government and its policies.
In 2009, the European Parliament proclaimed 23 August, the anniversary of the Molotov–Ribbentrop Pact, as the European Day of Remembrance for Victims of Stalinism and Nazism, to be commemorated with dignity and impartiality. In connection with the pact, a parliamentary resolution of the Organisation for Security and Co-operation in Europe condemned both communism and fascism for starting World War II and called for a day of remembrance for victims of both Stalinism and Nazism on 23 August. In response to the resolution, Russian lawmakers threatened the OSCE with "harsh consequences". A similar resolution was passed by the European Parliament a decade later, blaming the 1939 Molotov–Ribbentrop Pact for the outbreak of war in Europe and again drawing criticism from Russian authorities.
During the re-ignition of Cold War tensions in 1982, the US Congress during the Reagan administration established Baltic Freedom Day, to be remembered every 14 June in the United States.
|
https://en.wikipedia.org/wiki?curid=20950
|
Mobile, Alabama
Mobile is the county seat of Mobile County, Alabama, United States. The population within the city limits was 195,111 as of the 2010 United States Census, making it the third most populous city in Alabama, the most populous in Mobile County, and the largest municipality on the Gulf Coast between New Orleans, Louisiana, and St. Petersburg, Florida.
Alabama's only saltwater port, Mobile is located on the Mobile River at the head of the Mobile Bay and the north-central Gulf Coast. The Port of Mobile has always played a key role in the economic health of the city, beginning with the settlement as an important trading center between the French colonists and Native Americans, down to its current role as the 12th-largest port in the United States.
Mobile is the principal municipality of the Mobile metropolitan area. This region of 412,992 residents is composed solely of Mobile County; it is the third-largest metropolitan statistical area in the state. Mobile is the largest city in the Mobile–Daphne–Fairhope CSA, with a total population of 604,726, the second largest in the state. The wider region surrounding Mobile is home to 1,262,907 people.
Mobile was founded in 1702 by the French as the first capital of Louisiana. During its first 100 years, Mobile was a colony of France, then Britain, and lastly Spain. Mobile became a part of the United States in 1813, with the annexation of West Florida from Spain by President James Madison. The city surrendered to Federal forces on April 12, 1865, after Union victories at two forts protecting the city. This, along with the news of Johnston's surrender negotiations with Sherman, led Confederate Lieutenant General Richard Taylor to seek a meeting with his Union counterpart, Maj. Gen. Edward R. S. Canby. The two generals met several miles north of Mobile on May 2. After agreeing to a 48-hour truce, the generals enjoyed an al fresco luncheon of food, drink, and lively music. Canby offered Taylor the same terms agreed upon between Lee and Grant. Taylor accepted the terms and surrendered his command on May 4 at Citronelle, Alabama.
Considered one of the Gulf Coast's cultural centers, Mobile has several art museums, a symphony orchestra, professional opera, professional ballet company, and a large concentration of historic architecture. Mobile is known for having the oldest organized Carnival or Mardi Gras celebrations in the United States. Its French Catholic colonial settlers celebrated this festival from the first decade of the 18th century. Beginning in 1830, Mobile was host to the first formally organized Carnival mystic society to celebrate with a parade in the United States. (In New Orleans such a group is called a krewe.)
The city gained its name from the Mobile tribe that the French colonists encountered living in the area of Mobile Bay. Although debated by Alabama historians, they may have been descendants of the Native American tribe whose small fortress town, Mabila, was used to conceal several thousand native warriors before an attack in 1540 on the expedition of Spanish explorer Hernando de Soto. About seven years after the founding of the French Mobile settlement, the Mobile tribe, along with the Tohomé, gained permission from the colonists to settle near the fort.
The European settlement of Mobile began with French colonists, who in 1702 constructed "Fort Louis de la Louisiane", at Twenty-seven Mile Bluff on the Mobile River, as the first capital of the French colony of La Louisiane. It was founded by French Canadian brothers Pierre Le Moyne d'Iberville and Jean-Baptiste Le Moyne, Sieur de Bienville, to establish control over France's claims to "La Louisiane". Bienville was appointed as royal governor of French Louisiana in 1701. Mobile's Roman Catholic parish was established on July 20, 1703, by Jean-Baptiste de la Croix de Chevrières de Saint-Vallier, Bishop of Quebec. The parish was the first French Catholic parish established on the Gulf Coast of the United States.
In 1704 the ship "Pélican" delivered 23 French women to the colony; the passengers had contracted yellow fever at a stop in Havana. Though most of the "Pélican girls" recovered, numerous colonists and neighboring Native Americans contracted the disease in turn, and many died. This early period was also the occasion of the importation of the first African slaves, transported aboard a French supply ship from the French colony of Saint-Domingue in the Caribbean, where they had first been held. The population of the colony fluctuated over the next few years, growing to 279 persons by 1708 yet shrinking to 178 persons two years later due to disease.
These additional outbreaks of disease and a series of floods resulted in Bienville ordering that the settlement be relocated in 1711 several miles downriver to its present location at the confluence of the Mobile River and Mobile Bay. A new earth-and-palisade Fort Louis was constructed at the new site during this time. By 1712, when Antoine Crozat was appointed to take over administration of the colony, its population had reached 400 persons.
The capital of La Louisiane was moved in 1720 to Biloxi, leaving Mobile to serve as a regional military and trading center. In 1723 the construction of a new brick fort with a stone foundation began and it was renamed Fort Condé in honor of Louis Henri, Duc de Bourbon and prince of Condé.
In 1763, the Treaty of Paris was signed, ending the Seven Years' War, which Britain won, defeating France. By this treaty, France ceded its territories east of the Mississippi River to Britain. This area was made part of the expanded British West Florida colony. The British changed the name of Fort Condé to Fort Charlotte, after Charlotte of Mecklenburg-Strelitz, wife and queen consort of King George III.
The British were eager not to lose any useful inhabitants and promised religious tolerance to the French colonists; ultimately 112 French colonists remained in Mobile. The first permanent Jewish settlers came to Mobile in 1763 as a result of the new British rule and religious tolerance. Jews had not been allowed to officially reside in colonial French Louisiana due to the Code Noir, a decree passed by France's King Louis XIV in 1685 that forbade the exercise of any religion other than Roman Catholicism, and ordered all Jews out of France's colonies. Most of these colonial-era Jews in Mobile were merchants and traders from Sephardic Jewish communities in Savannah, Georgia and Charleston, South Carolina; they added to the commercial development of Mobile. In 1766 the total population was estimated to be 860, though the town's borders were smaller than during the French colonial period. During the American Revolutionary War, West Florida and Mobile became a refuge for loyalists fleeing the other colonies.
While the British were dealing with their rebellious colonists along the Atlantic coast, the Spanish entered the war in 1779 as an ally of France. They took the opportunity to order Bernardo de Gálvez, Governor of Louisiana, on an expedition east to retake West Florida. He captured Mobile during the Battle of Fort Charlotte in 1780, as part of this campaign. The Spanish wished to eliminate any British threat to their Louisiana colony west of the Mississippi River, which they had received from France in the 1763 Treaty of Paris. Their actions were condoned by the revolting American colonies, as partially evidenced by the presence of Oliver Pollock, representative of the American Continental Congress. Due to strong trade ties, many residents of Mobile and West Florida remained loyal to the British Crown. The Spanish renamed the fort Fortaleza Carlota and held Mobile as a part of Spanish West Florida until 1813, when it was seized by United States General James Wilkinson during the War of 1812.
By the time Mobile was included in the Mississippi Territory in 1813, the population had dwindled to roughly 300 people. The city was included in the Alabama Territory in 1817, after Mississippi gained statehood. Alabama was granted statehood in 1819; Mobile's population had increased to 809 by that time.
Mobile was well situated for trade, as its location tied it to a river system that served as the principal navigational access for most of Alabama and a large part of Mississippi. River transportation was aided by the introduction of steamboats in the early decades of the 19th century. By 1822 the city's population was 2,800.
The Industrial Revolution in Great Britain created shortages of cotton, driving up prices on world markets. Much land well suited to growing cotton lies in the vicinity of the Mobile River and its main tributaries, the Tombigbee and Alabama rivers. A plantation economy using slave labor developed in the region, and as a consequence Mobile's population quickly grew. The city came to be settled by attorneys, cotton factors, doctors, merchants and other professionals seeking to capitalize on trade with the upriver areas.
From the 1830s onward, Mobile expanded into a city of commerce with a primary focus on the cotton and slave trades. Many slaves were transported by ship in the coastwise slave trade from the Upper South. There were many businesses in the city related to the slave trade – people to make clothes, food, and supplies for the slave traders and their wards. The city's booming businesses attracted merchants from the North; by 1850 10% of its population was from New York City, which was deeply involved in the cotton industry. Mobile was the slave-trading center of the state until the 1850s, when it was surpassed by Montgomery.
The prosperity stimulated a building boom that was underway by the mid-1830s, with the building of some of the most elaborate structures the city had seen up to that point. This was cut short in part by the Panic of 1837 and yellow fever epidemics. The waterfront was developed with wharves, terminal facilities, and fireproof brick warehouses. The exports of cotton grew in proportion to the amounts being produced in the Black Belt; by 1840 Mobile was second only to New Orleans in cotton exports in the nation.
With the economy so focused on one crop, Mobile's fortunes were always tied to those of cotton, and the city weathered many financial crises. Mobile slaveholders owned relatively few slaves compared to planters in the upland plantation areas, but many households had domestic slaves, and many other slaves worked on the waterfront and on riverboats. The last slaves to enter the United States from the African trade were brought to Mobile on the slave ship "Clotilde". Among them was Cudjoe Lewis, who in the 1920s was the last survivor of the slave trade.
By 1853, fifty Jewish families lived in Mobile, including Philip Phillips, an attorney from Charleston, South Carolina, who was elected to the Alabama State Legislature and then to the United States Congress. Many early Jewish families were descendants of Sephardic Jews who had been among the earliest colonial settlers in Charleston and Savannah.
By 1860 Mobile's population within the city limits had reached 29,258 people; it was the 27th-largest city in the United States and 4th-largest in what would soon be the Confederate States of America. The free population in the whole of Mobile County, including the city, consisted of 29,754 citizens, of which 1,195 were free people of color. Additionally, 1,785 slave owners in the county held 11,376 people in bondage, about one-quarter of the total county population of 41,130 people.
During the American Civil War, Mobile was a Confederate city. The "H. L. Hunley", the first submarine to sink an enemy ship, was built in Mobile. One of the most famous naval engagements of the war was the Battle of Mobile Bay, which resulted in the Union taking control of Mobile Bay on August 5, 1864. On April 12, 1865, three days after Robert E. Lee's surrender at Appomattox Court House, the city surrendered to the Union army to avoid destruction after Union victories at nearby Spanish Fort and Fort Blakely.
On May 25, 1865, the city suffered great loss when some three hundred people died as a result of an explosion at a federal ammunition depot on Beauregard Street. The explosion left a deep hole at the depot's location, and sank ships docked on the Mobile River; the resulting fires destroyed the northern portion of the city.
Federal Reconstruction in Mobile began after the Civil War and effectively ended in 1874, when the local Democrats gained control of the city government. The last quarter of the 19th century was a time of economic depression and municipal insolvency for Mobile. The value of exports leaving the city, for example, fell from $9 million in 1878 to $3 million in 1882.
The turn of the 20th century brought the Progressive Era to Mobile. The economic structure developed with new industries, generating new jobs and attracting a significant increase in population. The population increased from around 40,000 in 1900 to 60,000 by 1920. During this time the city received $3 million in federal grants for harbor improvements to deepen the shipping channels. During and after World War I, manufacturing became increasingly vital to Mobile's economic health, with shipbuilding and steel production being two of the most important industries.
During this time, social justice and race relations in Mobile worsened, however. The state passed a new constitution in 1901 that disenfranchised most blacks and many poor whites; and the white Democratic-dominated legislature passed other discriminatory legislation. In 1902, the city government passed Mobile's first racial segregation ordinance, segregating the city streetcars. It legislated what had been informal practice, enforced by convention. Mobile's African-American population responded to this with a two-month boycott, but the law was not repealed. After this, Mobile's "de facto" segregation was increasingly replaced with legislated segregation as whites imposed Jim Crow laws to maintain supremacy.
In 1911 the city adopted a commission form of government, with three members elected by at-large voting. Although considered progressive, as it would reduce the power of ward bosses, this change strengthened the power of the elite white majority, since only candidates it backed could win at-large elections. In addition, poor whites and blacks had already been disenfranchised. Mobile was one of the last cities to retain this form of government, which prevented smaller groups from electing candidates of their choice. Alabama's white yeomanry, by contrast, had historically favored single-member districts in order to elect candidates of their choice.
The red imported fire ant was first introduced into the United States via the Port of Mobile. Sometime in the late 1930s, the ants came ashore from cargo ships arriving from South America, carried in the soil used as ballast on those ships. They have since spread throughout the South and Southwest.
During World War II, the defense buildup in Mobile shipyards resulted in a considerable increase in the city's white middle-class and working-class population, largely due to the massive influx of workers coming to work in the shipyards and at the Brookley Army Air Field. Between 1940 and 1943, more than 89,000 people moved into Mobile to work for war effort industries.
Mobile was one of eighteen United States cities producing Liberty ships. Its Alabama Drydock and Shipbuilding Company (ADDSCO) supported the war effort by producing ships faster than the Axis powers could sink them. ADDSCO also churned out a copious number of T2 tankers for the War Department. Gulf Shipbuilding Corporation, a subsidiary of Waterman Steamship Corporation, focused on building freighters, destroyers, and minesweepers. The rapid increase of population in the city produced crowded conditions, increasing social tensions in the competition for housing and good jobs.
A race riot broke out in May 1943 of whites against blacks. ADDSCO management had long maintained segregated conditions at the shipyards, although the Roosevelt administration had ordered defense contractors to integrate facilities. That year ADDSCO promoted 12 blacks to positions as welders, previously reserved for whites; and whites objected to the change by rioting on May 24. The mayor appealed to the governor to call in the National Guard to restore order, but it was weeks before officials allowed African Americans to return to work, keeping them away for their safety.
In the late 1940s, the transition to the postwar economy was hard for the city, as thousands of jobs were lost at the shipyards with the decline in the defense industry. Eventually the city's social structure became more liberal. Replacing shipbuilding as a primary economic force, the paper and chemical industries began to expand. No longer needed for defense, most of the old military bases were converted to civilian uses. Following the war, in which many African Americans had served, veterans and their supporters stepped up activism to gain enforcement of their constitutional rights and social justice, especially in the Jim Crow South. During the 1950s the City of Mobile integrated its police force and Spring Hill College accepted students of all races. Unlike in the rest of the state, by the early 1960s the city's buses and lunch counters had voluntarily desegregated.
The Alabama legislature passed the Cater Act in 1949, allowing cities and counties to set up industrial development boards (IDB) to issue municipal bonds as incentives to attract new industry into their local areas. The city of Mobile did not establish a Cater Act board until 1962. George E. McNally, Mobile's first Republican mayor since Reconstruction, was the driving force behind the founding of the IDB. The Mobile Area Chamber of Commerce, believing its members were better qualified to attract new businesses and industry to the area, considered the new IDB as a serious rival. After several years of political squabbling, the Chamber of Commerce emerged victorious. While McNally's IDB prompted the Chamber of Commerce to become more proactive in attracting new industry, the chamber effectively shut Mobile city government out of economic development decisions.
In 1963, three African-American students brought a case against the Mobile County School Board for being denied admission to Murphy High School. This was nearly a decade after the United States Supreme Court had ruled in "Brown v. Board of Education" (1954) that segregation of public schools was unconstitutional. The federal district court ordered that the three students be admitted to Murphy for the 1964 school year, leading to the desegregation of Mobile County's school system.
The civil rights movement gained congressional passage of the Civil Rights Act of 1964 and Voting Rights Act of 1965, eventually ending legal segregation and regaining effective suffrage for African Americans. But whites in the state had more than one way to reduce African Americans' voting power. Maintaining the city commission form of government with at-large voting resulted in all positions being elected by the white majority, as African Americans could not command a majority for their candidates in the informally segregated city.
In 1969 Brookley Air Force Base was closed by the Department of Defense, dealing Mobile's economy a severe blow. The closing resulted in a 10% unemployment rate in the city. This and other factors related to industrial restructuring ushered in a period of economic depression that lasted through the 1970s. The loss of jobs created numerous problems and resulted in loss of population as residents moved away for work.
Mobile's city commission form of government was challenged and finally overturned in 1982 in "City of Mobile v. Bolden", which the United States Supreme Court remanded to the district court. Finding that the city had adopted the commission form of government and at-large positions in 1911 with discriminatory intent, the court proposed that the three members of the city commission be elected from single-member districts, which would likely end the division of executive functions among them. Mobile's state legislative delegation in 1985 finally enacted a mayor-council form of government, with seven members elected from single-member districts, and this was approved by voters. As white conservatives increasingly entered the Republican Party in the late 20th century, African-American residents of the city have elected members of the Democratic Party as their candidates of choice. Since the change to single-member districts, more women and African Americans have been elected to the council than under the at-large system.
Beginning in the late 1980s, newly elected mayor Mike Dow and the city council began an effort termed the "String of Pearls Initiative" to make Mobile into a competitive city. The city initiated construction of numerous new facilities and projects, and the restoration of hundreds of historic downtown buildings and homes. City and county leaders also made efforts to attract new business ventures to the area.
Mobile is located in the southwestern corner of the U.S. state of Alabama. According to the United States Census Bureau, 26.1% of the city's total area is covered by water, with the remainder being land. Elevation ranges from near sea level on Water Street in downtown to higher ground at the Mobile Regional Airport.
Mobile has a number of notable historic neighborhoods. These include Ashland Place, Campground, Church Street East, De Tonti Square, Leinkauf, Lower Dauphin Street, Midtown, Oakleigh Garden, Old Dauphin Way, Spring Hill, and Toulminville.
Mobile's geographical location on the Gulf of Mexico provides a mild subtropical climate (Köppen "Cfa"), with hot, humid summers and mild, rainy winters. The record low temperature was set on February 13, 1899, and the record high on August 29, 2000.
A 2007 study by WeatherBill, Inc. determined that Mobile is the wettest city in the contiguous 48 states, based on average annual rainfall over a 30-year period. Mobile averages 120 days per year with at least some measurable rain. Precipitation is heavy year-round. On average, July and August are the wettest months, with frequent and often-heavy shower and thunderstorm activity. October stands out as a slightly drier month than all others. Snow is rare in Mobile; its last snowfall was on December 8, 2017, and before that nearly four years earlier, on January 27, 2014.
Mobile is occasionally affected by major tropical storms and hurricanes. The city suffered a major natural disaster on the night of September 12, 1979, when category-3 Hurricane Frederic passed over the heart of the city. The storm caused tremendous damage to Mobile and the surrounding area. Mobile had moderate damage from Hurricane Opal on October 4, 1995, and Hurricane Ivan on September 16, 2004.
Mobile suffered millions of dollars in damage from Hurricane Katrina on August 29, 2005, which damaged much of the Gulf Coast. A storm surge, topped by higher waves, damaged eastern sections of the city, with extensive flooding in downtown, on the Battleship Parkway, and on the elevated Jubilee Parkway. Floodwaters covered the entrance stairs of the Federal Courthouse, located three blocks from the waterfront.
In late December 2012, the city suffered two tornado hits. On December 25, 2012, at 4:54 pm, a large wedge tornado touched down in the city and rapidly intensified as it moved north-northeast. The path took the tornado into Midtown, damaging or destroying at least 100 structures. The heaviest damage to houses was along Carlen Street, Rickarby Place, Dauphin Street, Old Shell Road, Margaret Street, Silverwood Street, and Springhill Avenue. In addition to residential structures, the tornado caused significant damage to the Carmelite Monastery, Little Flower Catholic Church, commercial real estate along Airport Boulevard and Government Street in the Midtown at the Loop neighborhood, Murphy High School, Trinity Episcopal Church, Springhill Avenue Temple, and Mobile Infirmary Hospital before moving into the neighboring city of Prichard. The tornado was classified as an EF2 by the National Weather Service on December 26.
The path taken through the city was just a short distance east of the path taken days earlier, on December 20, by an EF1 tornado which had touched down near Davidson High School and taken a path ending in Prichard. Initial damage estimates for insured and uninsured ranged from $140 to $150 million.
Mobile's French and Spanish colonial history has given it a culture distinguished by French, Spanish, Creole, African and Catholic heritage, in addition to later British and American influences. It is distinguished from all other cities in the state of Alabama. The annual Carnival celebration is perhaps the best example of its differences. Mobile is the birthplace of the celebration of Mardi Gras in the United States and has the oldest celebration, dating to the early 18th century during the French colonial period.
Carnival in Mobile evolved over the course of 300 years from a beginning as a sedate French Catholic tradition into the mainstream multi-week celebration that today bridges a spectrum of cultures. Mobile's official cultural ambassadors are the Azalea Trail Maids, meant to embody the ideals of Southern hospitality.
"Close Encounters of the Third Kind" (1977) and "Back Roads" (1981) were shot in Mobile.
The Carnival season has expanded throughout the late fall and winter: balls in the city may be scheduled as early as November, with the parades beginning after January 5 and the Twelfth Day of Christmas or Epiphany on January 6. Carnival celebrations end at midnight on Mardi Gras, a moveable feast related to the timing of Lent and Easter. The next day is Ash Wednesday and the beginning of Lent, the 40-day penitential season before Easter.
In Mobile, locals often use the term Mardi Gras as shorthand for the entire Carnival season. During the Carnival season, the mystic societies build colorful floats and parade throughout downtown. Masked society members toss small gifts, known as 'throws', to parade spectators. The mystic societies, which in essence are exclusive private clubs, also hold formal masquerade balls, usually by invitation only and oriented to adults.
Carnival was first celebrated in Mobile in 1703 when colonial French Catholic settlers carried out their traditional celebration at the Old Mobile Site, prior to the 1711 relocation of the city to the current site. Mobile's first Carnival society was established in 1711 with the "Boeuf Gras Society" (Fatted Ox Society). Celebrations were relatively small and consisted of local, private parties until the early 19th century.
In 1830 Mobile's Cowbellion de Rakin Society was the first formally organized and masked mystic society in the United States to celebrate with a parade. The Cowbellions got their start when Michael Krafft, a cotton factor from Pennsylvania, began a parade with rakes, hoes, and cowbells. The "Cowbellians" introduced horse-drawn floats to the parades in 1840 with a parade entitled "Heathen Gods and Goddesses". The Striker's Independent Society, formed in 1843, is the oldest surviving mystic society in the United States.
Carnival celebrations in Mobile were canceled during the American Civil War. In 1866 Joe Cain revived the Mardi Gras parades when he paraded through the city streets on Fat Tuesday while costumed as a fictional Chickasaw chief named "Slacabamorinico." He celebrated the day in front of the occupying Union Army troops. In 2002, Mobile's Tricentennial celebrated with parades that represented all of the city's mystic societies.
Founded in 2004, the Conde Explorers became in 2005 the first integrated Mardi Gras society to parade in downtown Mobile. The society has about a hundred members and welcomes men and women of all races. In addition to the parade and ball, the Conde Explorers hold several parties throughout the year, and its members also perform volunteer work. The Conde Explorers were featured in the award-winning documentary "The Order of Myths" (2008), by Margaret Brown, about Mobile's Mardi Gras.
The National African American Archives and Museum features the history of African-American participation in Mardi Gras, authentic artifacts from the era of slavery, and portraits and biographies of famous African Americans. The University of South Alabama Archives houses primary source material relating to the history of Mobile and southern Alabama, as well as the university's history. The archives are located on the ground floor of the USA Spring Hill Campus and are open to the general public.
The Mobile Municipal Archives contains the extant records of the City of Mobile, dating from the city's creation as a municipality by the Mississippi Territory in 1814. The majority of the original records of Mobile's colonial history, spanning the years 1702 through 1813, are housed in Paris, London, Seville, and Madrid. The Mobile Genealogical Society Library and Media Center is located at the Holy Family Catholic Church and School complex. It features handwritten manuscripts and published materials that are available for use in genealogical research.
The Mobile Public Library system serves Mobile and consists of eight branches across Mobile County; its large local history and genealogy division is housed in a facility next to the newly restored and enlarged Ben May Main Library on Government Street. The Saint Ignatius Archives, Museum and Theological Research Library contains primary sources, artifacts, documents, photographs and publications that pertain to the history of Saint Ignatius Church and School, the Catholic history of the city, and the history of the Roman Catholic Church.
The Mobile Museum of Art features permanent exhibits that span several centuries of art and culture. The museum was expanded in 2002 to approximately . The permanent exhibits include the African and Asian Collection Gallery, Altmayer Gallery (American art), Katharine C. Cochrane Gallery of American Fine Art, Maisel European Gallery, Riddick Glass Gallery, Smith Crafts Gallery, and the Ann B. Hearin Gallery (contemporary works).
The Centre for the Living Arts is an organization that operates the historic Saenger Theatre and Space 301, a contemporary art gallery. The Saenger Theatre opened in 1927 as a movie palace. Today it is a performing arts center and serves as a small concert venue for the city. It is home to the Mobile Symphony Orchestra, conducted by Maestro Scott Speck. Space 301 Gallery and Studio was initially housed adjacent to the Saenger, but moved to its own space in 2008. The building, donated to the Centre by the "Press-Register" after its relocation to a new modern facility, underwent a $5.2 million renovation and redesign prior to opening. The Crescent Theater in downtown Mobile has been showing arthouse films since 2008.
The Mobile Civic Center contains three facilities under one roof: an arena, a theater, and an exposition hall. It is the primary concert venue for the city, hosts a wide variety of events, and is home to the Mobile Opera and the Mobile Ballet. The 60-year-old Mobile Opera averages about 1,200 attendees per performance. A wide variety of events are also held at Mobile's Arthur C. Outlaw Convention Center, which contains an exhibit hall, a grand ballroom, and sixteen meeting rooms.
The city hosts the Greater Gulf State Fair, held each October since 1955. It also hosted BayFest, an annual three-day music festival with more than 125 live musical acts on multiple stages spread throughout downtown; it now holds the Ten Sixty Five festival, a free music festival.
The Mobile Theatre Guild is a nonprofit community theatre that has served the city since 1947. It is a member of the Mobile Arts Council, the Alabama Conference of Theatre and Speech, the Southeastern Theatre Conference, and the American Association of Community Theatres. Mobile is also host to the Joe Jefferson Players, Alabama's oldest continuously running community theatre. The group was named in honor of the famous comedic actor Joe Jefferson, who spent part of his teenage years in Mobile, and staged its first production on December 17, 1947. Drama Camp Productions and Sunny Side Theater is Mobile's home for children's theater. The group began doing summer camps in 2002, expanded to a year-round facility in 2008, and recently moved into the Azalea City Center for the Arts, a community of drama, music, art, photography, and dance teachers. The group has produced Broadway shows including "Miracle on 34th Street," "Honk," "Fame," and "Hairspray."
The Mobile Arts Council is an umbrella organization for the arts in Mobile. It was founded in 1955 as a project of the Junior League of Mobile with the mission to increase cooperation among artistic and cultural organizations in the area and to provide a forum for problems in art, music, theater, and literature.
Mobile is home to a variety of museums. Battleship Memorial Park is a military park on the shore of Mobile Bay featuring the World War II-era battleship USS Alabama, the World War II-era submarine USS Drum, Korean War and Vietnam War memorials, and a variety of historical military equipment. The History Museum of Mobile showcases 300-plus years of Mobile history and prehistory. It is housed in the historic Old City Hall (1857), a National Historic Landmark. The Oakleigh Historic Complex features three house museums that attempt to interpret the lives of people from three strata of 19th-century society in Mobile: the enslaved, the working class, and the upper class. The Mobile Carnival Museum, housing the city's Mardi Gras history and memorabilia, documents the variety of floats, costumes, and displays seen during the history of the festival season. The Bragg-Mitchell Mansion (1855), Richards DAR House (1860), and the Condé-Charlotte House (1822) are historic, furnished antebellum house museums. Fort Morgan (1819), Fort Gaines (1821), and Historic Blakeley State Park all figure prominently in local American Civil War history.
The Mobile Medical Museum is housed in the historic French colonial-style Vincent-Doan House (1827). It features artifacts and resources that chronicle the long history of medicine in Mobile. The Phoenix Fire Museum is located in the restored Phoenix Volunteer Fire Company Number 6 building and features the history of fire companies in Mobile from their organization in 1838. The Mobile Police Department Museum features exhibits that chronicle the history of law enforcement in Mobile. The Gulf Coast Exploreum Science Center is a non-profit science center located in downtown. It features permanent and traveling exhibits, an IMAX dome theater, a digital 3D virtual theater, and a hands-on chemistry laboratory. The Dauphin Island Sea Lab is located south of the city, on Dauphin Island near the mouth of Mobile Bay. It houses the Estuarium, an aquarium which illustrates the four habitats of the Mobile Bay ecosystem: the river delta, bay, barrier islands and Gulf of Mexico.
The Mobile Botanical Gardens feature a variety of flora spread over . They contain the Millie McConnell Rhododendron Garden, with 1,000 evergreen and native azaleas, and the Longleaf Pine Habitat. Bellingrath Gardens and Home, located on Fowl River, is a botanical garden and historic mansion that dates to the 1930s. The 5 Rivers Delta Resource Center is a facility that allows visitors to learn about and access the Mobile, Tensaw, Apalachee, Middle, Blakeley, and Spanish rivers. It was established to serve as an easily accessible gateway to the Mobile-Tensaw River Delta. In addition to offering several boat and adventure tours, it contains a small theater, an exhibit hall, meeting facilities, walking trails, and a canoe and kayak landing.
Mobile has more than 45 public parks within its limits, with some that are of special note. Bienville Square is a historic park in the Lower Dauphin Street Historic District. It assumed its current form in 1850 and is named for Mobile's founder, Jean-Baptiste Le Moyne, Sieur de Bienville. It was once the principal gathering place for residents, when the city was smaller, and remains popular today. Cathedral Square is a one-block performing arts park, also in the Lower Dauphin Street Historic District, which is overlooked by the Cathedral Basilica of the Immaculate Conception.
The Fort of Colonial Mobile is a reconstruction of the city's original Fort Condé, built on the original fort's footprint. It serves as the official welcome center and a colonial-era living history museum. Spanish Plaza is a downtown park that honors the Spanish phase of the city between 1780 and 1813. It features the "Arches of Friendship", a fountain presented to Mobile by the city of Málaga, Spain. Langan Park, the largest of the parks at , features lakes, natural spaces, and contains the Mobile Museum of Art, Azalea City Golf Course, Mobile Botanical Gardens and Playhouse in the Park.
Mobile has antebellum architectural examples of Greek Revival, Gothic Revival, Italianate, and Creole cottage. Later architectural styles found in the city include the various Victorian types, shotgun types, Colonial Revival, Tudor Revival, Spanish Colonial Revival, Beaux-Arts and many others. The city currently has nine major historic districts: Old Dauphin Way, Oakleigh Garden, Lower Dauphin Street, Leinkauf, De Tonti Square, Church Street East, Ashland Place, Campground, and Midtown.
Mobile has a number of historic structures in the city, including numerous churches and private homes. Some of Mobile's historic churches include Christ Church Cathedral, the Cathedral Basilica of the Immaculate Conception, Emanuel AME Church, Government Street Presbyterian Church, St. Louis Street Missionary Baptist Church, State Street AME Zion Church, Stone Street Baptist Church, Trinity Episcopal Church, St. Francis Street Methodist Church, Saint Joseph's Roman Catholic Church, Saint Francis Xavier Catholic Church, Saint Matthew's Catholic Church, Saint Paul's Episcopal Chapel, and Saint Vincent de Paul. The Sodality Chapel and St. Joseph's Chapel at Spring Hill College are two historic churches on that campus. Two historic Roman Catholic convents survive, the Convent and Academy of the Visitation and the Convent of Mercy.
Barton Academy is a historic Greek Revival school building and local landmark on Government Street. The Bishop Portier House and the Carlen House are two of the many surviving examples of Creole cottages in the city. The Mobile City Hospital and the United States Marine Hospital are both restored Greek Revival hospital buildings that predate the Civil War. The Washington Firehouse No. 5 is a Greek Revival fire station, built in 1851. The Hunter House is an example of the Italianate style and was built by a successful 19th-century African American businesswoman. The Shepard House is a good example of the Queen Anne style. The Scottish Rite Temple is the only surviving example of Egyptian Revival architecture in the city. The Gulf, Mobile and Ohio Passenger Terminal is an example of the Mission Revival style.
The city has several historic cemeteries that were established shortly after the colonial era. They replaced the colonial Campo Santo, of which no trace remains. The Church Street Graveyard contains above-ground tombs and monuments spread over and was founded in 1819, during the height of yellow fever epidemics. The nearby Magnolia Cemetery was established in 1836 and served as Mobile's primary burial site during the 19th and early 20th centuries, with approximately 80,000 burials. It features tombs and many intricately carved monuments and statues.
The Catholic Cemetery was established in 1848 by the Archdiocese of Mobile and covers more than . It contains plots for the Brothers of the Sacred Heart, Little Sisters of the Poor, Sisters of Charity, and Sisters of Mercy, in addition to many other historically significant burials. Mobile's Jewish community dates back to the 1820s and the city has two historic Jewish cemeteries, Sha'arai Shomayim Cemetery and Ahavas Chesed Cemetery. Sha'arai Shomayim is the older of the two.
The 2010 United States Census determined that there were 195,111 people residing within the city limits of Mobile. Mobile is the center of Alabama's second-largest metropolitan area, which consists of all of Mobile County. Metropolitan Mobile was estimated to have a population of 413,936 in 2012.
The 2010 census indicated that there were 78,959 households, out of which 21,073 had children under the age of 18 living with them, 28,073 were married couples living together, 17,037 had a female householder with no husband present, 3,579 had a male householder with no wife present, and 30,270 were non-families. 25,439 of all households were made up of individuals and 8,477 had someone living alone who was 65 years of age or older. The racial makeup of the city was 50.6% Black or African American, 45.0% White, 0.3% Native American, 1.8% Asian, 0.0% Pacific Islander, 0.9% from other races, 1.4% from two or more races, and 2.4% of the population were Latino. Non-Hispanic Whites were 43.9% of the population in 2010, down from 62.1% in 1980. The average household size was 2.4 and the average family size was 3.07. Estimated same-sex couple households comprised 0.3% of all households in 2010.
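Because the household figures above are given as raw counts rather than the percentages usual in census summaries, they can be converted directly against the 78,959-household total; as a worked example (a computation from the figures above, not a statistic from the source):

\[
\frac{21{,}073}{78{,}959} \approx 26.7\% \ \text{of households had children under the age of 18.}
\]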
The age distribution of the population in 2010 consisted of 6.7% under the age of five years, 75.9% over 18, and 13.7% over 65. The median age was 35.7 years. The male population was 47.0% and the female population was 53.0%. The median income for a household in the city was $37,056 for 2006 to 2010. The per capita income for the city was $22,401.
Since 1985 the government of Mobile has consisted of a mayor and a seven-member city council. The mayor is elected at-large, and the council members are elected from each of the seven city council single-member districts (SMDs). A supermajority of five votes is required to conduct council business.
This form of city government was chosen by the voters after the previous form of government, which had three city commissioners, each elected at-large, was ruled in 1975 to substantially dilute the minority vote and violate the Voting Rights Act in "Bolden v. City of Mobile". The three at-large commissioners each required a majority vote to win. Due to appeals, the case took time to reach settlement and establishment of a new electoral system. Municipal elections are held every four years.
The first mayor elected under the new system of single-member district (SMD) voting was Arthur R. Outlaw, who served his second term as mayor from 1985 to 1989; his first term, under the old system, ran from 1967 to 1968. Mike Dow defeated Outlaw in the 1989 election; re-elected three times, he served as mayor for four terms, from 1989 to 2005. His "String of Pearls" initiative, a series of projects designed to stimulate redevelopment of the city's core, is credited with reviving much of downtown Mobile. Upon his retirement, Dow endorsed Sam Jones as his successor.
Sam Jones was elected in 2005 as the first African-American mayor of Mobile. He was re-elected for a second term in 2009 without opposition. His administration continued the focus on downtown redevelopment and on bringing industries to the city. He ran for a third term in 2013 but was defeated by Sandy Stimpson, who took office on November 4, 2013, and was re-elected on August 22, 2017.
As of November 2013, the seven-member city council is made up of Fredrick Richardson, Jr. from District 1, Levon Manzie from District 2, C.J. Small from District 3, John C. Williams from District 4, Joel Daves from District 5, Bess Rich from District 6, and Gina Gregory from District 7.
Public schools in Mobile are operated by the Mobile County Public School System. The Mobile County Public School System has an enrollment of over 65,000 students, employs approximately 8,500 public school employees, and had a budget in 2005–2006 of $617,162,616. The State of Alabama operates the Alabama School of Mathematics and Science on Dauphin Street in Mobile, which boards advanced Alabama high school students. It was founded in 1989 to identify, challenge, and educate future leaders.
Mobile also has a large number of private schools, most of them parochial in nature. Many belong to the Roman Catholic Archdiocese of Mobile. The private Catholic institutions include McGill-Toolen Catholic High School (1896), Corpus Christi School, Little Flower Catholic School (1934), Most Pure Heart of Mary Catholic School (1900), Saint Dominic School (1961), Saint Ignatius School (1952), Saint Mary Catholic School (1867), Saint Pius X Catholic School (1957), and Saint Vincent DePaul Catholic School (1976).
Notable private Protestant institutions include St. Paul's Episcopal School (1947), Mobile Christian School (1961), St. Luke's Episcopal School (1961), Cottage Hill Baptist School System (1961), Faith Academy (1967), and Trinity Lutheran School (1955).
UMS-Wright Preparatory School is an independent co-educational preparatory school. It assumed its current configuration in 1988, when the University Military School (founded 1893) and the Julius T. Wright School for Girls (1923) merged to form UMS-Wright.
Major colleges and universities in Mobile that are accredited by the Southern Association of Colleges and Schools include the University of South Alabama, Spring Hill College, the University of Mobile, Faulkner University, and Bishop State Community College.
The University of South Alabama is a public, doctoral-level university established in 1963. The university is composed of the College of Arts and Sciences, the Mitchell College of Business, the College of Education, the College of Engineering, the College of Medicine, the Doctor of Pharmacy Program, the College of Nursing, the School of Computing, and the School of Continuing Education and Special Programs.
Faulkner University is a four-year private Church of Christ-affiliated university based in Montgomery, Alabama. The Mobile campus was established in 1975 and offers bachelor's degrees in Business Administration, Management of Human Resources, and Criminal Justice. It also offers associate degrees in Business Administration, Business Information Systems, Computer & Information Science, Criminal Justice, Informatics, Legal Studies, Arts, and Science.
Spring Hill College, chartered in 1830, was the first Catholic college in the southeastern United States and is the third oldest Jesuit college in the country. This four-year private college offers graduate programs in Business Administration, Education, Liberal Arts, Nursing (MSN), and Theological Studies. Undergraduate divisions and programs include the Division of Business, the Communications/Arts Division, International Studies, Inter-divisional Studies, the Language and Literature Division, Bachelor of Science in Nursing, Philosophy and Theology, Political Science, the Sciences Division, the Social Sciences Division, and the Teacher Education Division.
The University of Mobile is a four-year private Baptist-affiliated university in the neighboring city of Prichard that was founded in 1961. It consists of the College of Arts and Sciences, School of Business, School of Christian Studies, School of Education, the School of Leadership Development, and the School of Nursing.
Bishop State Community College, founded in 1927, is a public, historically African American, community college. Bishop State has four campuses in Mobile and offers a wide array of associate degrees.
Several post-secondary, vocational-type institutions have a campus in Mobile. These include the Alabama Institute of Real Estate, American Academy of Hypnosis, Bealle School of Real Estate, Charles Academy of Beauty Culture, Fortis College, Virginia College, ITT Technical Institute, Remington College, and White and Sons Barber College.
Mobile serves the central Gulf Coast as a regional center for medicine, with over 850 physicians and 175 dentists. There are four major medical centers within the city limits.
Mobile Infirmary Medical Center has 704 beds and is the largest nonprofit hospital in the state. It was founded in 1910. Providence Hospital has 349 beds. It was founded in 1854 by the Daughters of Charity from Emmitsburg, Maryland. The University of South Alabama Medical Center has 346 beds. Its roots go back to 1830 with the old city-owned Mobile City Hospital and associated medical school. A teaching hospital, it has Mobile's only level I trauma center and regional burn center. Springhill Medical Center, with 252 beds, was founded in 1975. It is Mobile's only for-profit facility.
Additionally, the University of South Alabama operates the University of South Alabama Children's and Women's Hospital, with 219 beds, dedicated exclusively to the care of women and minors. In 2008, the University of South Alabama opened the USA Mitchell Cancer Institute, the first academic cancer research center in the central Gulf Coast region.
Mobile Infirmary Medical Center operated Infirmary West, formerly Knollwood Hospital, with 100 acute care beds, but closed the facility at the end of October 2012 due to declining revenues.
BayPointe Hospital and Children's Residential Services, with 94 beds, is the only psychiatric hospital in the city. It houses a residential unit for children, an acute unit for children and adolescents, and an age-segregated involuntary hospital unit for adults undergoing evaluation ordered by the Mobile Probate Court.
The city has a broad array of outpatient surgical centers, emergency clinics, home health care services, assisted-living facilities and skilled nursing facilities.
Aerospace, steel, shipbuilding, retail, services, construction, medicine, and manufacturing are Mobile's major industries. After several decades of economic decline, Mobile's economy began to rebound in the late 1980s. Between 1993 and 2003, roughly 13,983 new jobs were created as 87 new companies were founded and 399 existing companies expanded.
Defunct companies that had been founded or based in Mobile included Alabama Drydock and Shipbuilding Company, Delchamps, and Gayfers. Current companies that were formerly based in the city include Checkers, Minolta-QMS, Morrison's, and the Waterman Steamship Corporation.
In addition to those discussed below, AlwaysHD, Foosackly's, Integrity Media, and Volkert, Inc. are headquartered in Mobile.
Mobile's Alabama State Docks underwent the largest expansion in their history in the early 21st century. The port expanded its container processing and storage facility, increasing container storage at the docks by over 1,000% at a cost of over $300 million; the project was completed in 2005. Despite the expansion of its container capabilities and the addition of two massive new cranes, the port fell from the 9th-largest to the 12th-largest in the nation by tonnage between 2008 and 2010.
Shipbuilding began to make a major comeback in Mobile in 1999 with the founding of Austal USA. A subsidiary of the Australian company Austal, it expanded its production facility for United States defense and commercial aluminum shipbuilding on Blakeley Island in 2005. In October 2012, after winning a new defense contract and completing another building within its complex on the island, Austal announced that it would expand its workforce from 3,000 to 4,500 employees.
Atlantic Marine operated a major shipyard at the former Alabama Drydock and Shipbuilding Company site on Pinto Island. It was acquired by British defense conglomerate BAE Systems in May 2010 for $352 million. Doing business as BAE Systems Southeast Shipyards, the company continues to operate the site as a full-service shipyard, employing approximately 600 workers with plans to expand.
The Mobile Aeroplex at Brookley is an industrial complex and airport located south of the central business district of the city. It is the largest industrial and transportation complex in the region, with more than 70 companies, many of them aerospace-related, spread over . Notable employers at Brookley include Airbus North America Engineering (Airbus Military North America's facilities are at the Mobile Regional Airport), ST Aerospace Mobile (a division of ST Engineering), and Continental Motors.
Plans for an Airbus A320 family aircraft assembly plant in Mobile were formally announced by Airbus CEO Fabrice Brégier from the Mobile Convention Center on July 2, 2012. The plans included a $600 million factory at the Brookley Aeroplex for the assembly of the A319, A320, and A321 aircraft, expected to employ roughly 1,000 full-time workers when fully operational. Construction began with a groundbreaking ceremony on April 8, 2013; the plant was to become operational by 2015 and to produce up to 50 aircraft per year by 2017. The assembly plant is the company's first factory to be built within the United States. It was announced on February 1, 2013 that Airbus had hired Alabama-based Hoar Construction to oversee construction of the facility.
The factory officially opened on September 14, 2015, covering one million square feet on 53 acres of flat grassland.
On October 16, 2017, Airbus announced a partnership with Bombardier Aerospace, taking over a majority share of the Bombardier CSeries airliner program. As a result of this partnership, Airbus plans to open an assembly line for CSeries aircraft in Mobile, particularly to serve the US market. This effort may allow the companies to circumvent high import tariffs on the CSeries family. The aircraft was renamed the Airbus A220 on 10 July 2018. Production started in August 2019; the first A220 from the new line is due to be delivered to Delta in the third quarter of 2020.
German technology conglomerate ThyssenKrupp broke ground on a $4.65 billion combined stainless and carbon steel processing facility in Calvert, a few miles north of Mobile, in 2007. It was originally projected to eventually employ 2,700 people. The facility became operational in July 2010.
The company put both its carbon mill in Calvert and a steel slab-making unit in Rio de Janeiro up for sale in May 2012, citing rising production costs and a worldwide decrease in demand. ThyssenKrupp's stainless steel division, Inoxum, including the stainless portion of the Calvert plant, was sold to the Finnish stainless steel company Outokumpu Oyj in 2012.
Mobile's 2011 Comprehensive Annual Financial Report lists the city's top employers for that year.
The United States Department of Labor's Bureau of Labor Statistics unemployment rate (not seasonally adjusted) for the Mobile Metropolitan Statistical Area was 7.5% for July 2013, compared with an unadjusted rate of 6.6% for Alabama as a whole and 7.4% for the entire nation.
Local airline passengers are served by the Mobile Regional Airport, with direct connections to four major hub airports. It is served by American Eagle, with service to Dallas-Fort Worth International Airport and Charlotte/Douglas International Airport; United Express, with service to George Bush Intercontinental Airport; and Delta Connection, with service to Hartsfield-Jackson Atlanta International Airport. The Mobile Downtown Airport at the Brookley Aeroplex serves corporate, cargo, and private aircraft.
In an effort to take better advantage of Mobile's waterways for recreational, as opposed to simply industrial, use, the Three Mile Creek Greenway Trail is being designed and implemented at the direction of the City Council. The linear park will ultimately span seven miles, from Langan (Municipal) Park to Dr. Martin Luther King Junior Avenue, and include trailheads, sidewalks, and bike lanes. The existing greenway is centered at Tricentennial Park.
Other trails include the paved Mobile Airport Perimeter Trail, which encircles the Mobile Downtown Airport, and mountain biking trails on the west side of the University of South Alabama.
Mobile is served by four Class I railroads, including the Canadian National Railway (CNR), CSX Transportation (CSX), the Kansas City Southern Railway (KCS), and the Norfolk Southern Railway (NS). The Alabama and Gulf Coast Railway (AGR), a Class III railroad, links Mobile to the Burlington Northern and Santa Fe Railway (BNSF) at Amory, Mississippi. These converge at the Port of Mobile, which provides intermodal freight transport service to companies engaged in importing and exporting. Other railroads include the CG Railway (CGR), a rail ship service to Coatzacoalcos, Veracruz, and the Terminal Railway Alabama State Docks (TASD), a switching railroad.
The city was served by Amtrak's "Sunset Limited" passenger train service until 2005, when the service was suspended due to the effects of Hurricane Katrina. Efforts to restart passenger rail service between Mobile and New Orleans were revived in 2019 by the 21-member Southern Rail Commission after it received a $33 million Federal Railroad Administration grant in June of that year. Louisiana quickly dedicated its $10 million toward the project, and Mississippi committed its $15 million after initially balking, but Governor Kay Ivey resisted committing the estimated $2.7 million state allocation from Alabama because of concerns over long-term financial commitments and potential competition with freight traffic from the Port of Mobile. The winter of 2019 was marked by repeated postponement of votes by the Mobile City Council, which requested more information on how rail traffic from the port would be affected and where the Amtrak station would be built, even as community support for the project, especially among millennials, became more vocal.
|
https://en.wikipedia.org/wiki?curid=20952
|
Monoamine oxidase
Monoamine oxidases (MAO) are a family of enzymes that catalyze the oxidation of monoamines, employing oxygen to clip off their amine group. They are found bound to the outer membrane of mitochondria in most cell types of the body. The first such enzyme was discovered in 1928 by Mary Bernheim in the liver and was named tyramine oxidase. The MAOs belong to the protein family of flavin-containing amine oxidoreductases.
MAOs are important in the breakdown of monoamines ingested in food, and also serve to inactivate monoamine neurotransmitters. Because of the latter, they are involved in a number of psychiatric and neurological diseases, some of which can be treated with monoamine oxidase inhibitors (MAOIs) which block the action of MAOs.
In humans there are two types of MAO: MAO-A and MAO-B.
MAO-A appears at roughly 80% of adulthood levels at birth, increasing very slightly after the first 4 years of life, while MAO-B is almost non-detectable in the infant brain. Regional distribution of the monoamine oxidases is characterized by extremely high levels of both MAOs in the hypothalamus and hippocampal uncus, as well as a large amount of MAO-B with very little MAO-A in the striatum and globus pallidus. The cortex has relatively high levels of only MAO-A, with the exception of areas of the cingulate cortex, which contains a balance of both. Autopsied brains demonstrated the predicted increased concentration of MAO-A in regions dense in serotonergic neurotransmission; MAO-B, however, correlated only with norepinephrine.
Monoamine oxidases catalyze the oxidative deamination of monoamines. Oxygen is used to remove an amine group (plus the adjacent hydrogen atom) from a molecule, resulting in the corresponding ketone (or aldehyde) and ammonia. Monoamine oxidases contain the covalently bound cofactor FAD and are, thus, classified as flavoproteins. Monoamine oxidase A and B share roughly 70% of their structure and both have substrate binding sites that are predominantly hydrophobic. Two tyrosine residues in the binding pocket (398 and 435 in MAO-B, corresponding to 407 and 444 in MAO-A) that are commonly involved in inhibitor activity have been hypothesized to be relevant to orienting substrates, and mutations of these residues are relevant to mental health. Four main models have been proposed for the mechanism of electron transfer (single electron transfer, hydrogen atom transfer, nucleophilic model, and hydride transfer), although there is insufficient evidence to support any of them.
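A worked form of the overall reaction may make this stoichiometry concrete. The following is a minimal sketch for a generic primary amine (the aldehyde-producing case described above), using the standard textbook summary in which the reduced flavin is re-oxidized by molecular oxygen, yielding hydrogen peroxide as a by-product:

\[
\mathrm{R{-}CH_2{-}NH_2 \;+\; O_2 \;+\; H_2O \;\longrightarrow\; R{-}CHO \;+\; NH_3 \;+\; H_2O_2}
\]

For example, dopamine deaminated in this way yields 3,4-dihydroxyphenylacetaldehyde (DOPAL) together with ammonia and hydrogen peroxide.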
They are well known enzymes in pharmacology, since they are the target for the action of a number of monoamine oxidase inhibitor drugs. MAO-A is particularly important in the catabolism of monoamines ingested in food. Both MAOs are also vital to the inactivation of monoamine neurotransmitters, for which they display different specificities.
Specific reactions catalyzed by MAO include the deamination of serotonin, norepinephrine, and epinephrine (preferentially by MAO-A), of phenethylamine and benzylamine (preferentially by MAO-B), and of dopamine and tyramine (by both forms).
Because of the vital role that MAOs play in the inactivation of neurotransmitters, MAO dysfunction (too much or too little MAO activity) is thought to be responsible for a number of psychiatric and neurological disorders. For example, unusually high or low levels of MAOs in the body have been associated with schizophrenia, depression, attention deficit disorder, substance abuse, migraines, and irregular sexual maturation. Monoamine oxidase inhibitors are one of the major classes of drug prescribed for the treatment of depression, although they are often last-line treatment due to risk of the drug's interaction with diet or other drugs. Excessive levels of catecholamines (epinephrine, norepinephrine, and dopamine) may lead to a hypertensive crisis, and excessive levels of serotonin may lead to serotonin syndrome.
In fact, MAO-A inhibitors act as antidepressant and anti-anxiety agents, whereas MAO-B inhibitors are used alone or in combination to treat Alzheimer's disease and Parkinson's disease. Some research suggests that certain phenotypes of depression, such as those marked by anxiety or by "atypical" symptoms involving psychomotor retardation, weight gain, and interpersonal sensitivity, may respond particularly well to MAOIs; however, the findings on this have not been consistent. MAOIs may be effective in treatment-resistant depression, especially cases that do not respond to tricyclic antidepressants.
PET research shows that use of tobacco cigarettes heavily depletes MAO-B, mimicking the action of an MAO-B inhibitor. Smokers who smoke for emotional relief may therefore be unintentionally treating depression and/or anxiety that is better addressed by an MAO-B inhibitor.
There are significant differences in MAO activity in different species. Dopamine is primarily deaminated by MAO-A in rats, but by MAO-B in vervet monkeys and humans.
Mice unable to produce either MAO-A or MAO-B display autistic-like traits. These knockout mice display an increased response to stress.
The genes encoding MAO-A and MAO-B are located side-by-side on the short arm of the X chromosome, and have about 70% sequence similarity. Rare mutations in the MAO-A gene are associated with Brunner syndrome.
A study based on the Dunedin cohort concluded that maltreated children with a low-activity polymorphism in the promoter region of the MAO-A gene were more likely to develop antisocial conduct disorders than maltreated children with the high-activity variant. Of the 442 total males in the study (maltreated or not), 37% had the low-activity variant. Of the 13 maltreated males with low MAO-A activity, 11 (about 85%) had been assessed as exhibiting adolescent conduct disorder and 4 were convicted of violent offenses. The suggested mechanism for this effect is the decreased ability of those with low MAO-A activity to quickly degrade norepinephrine, the synaptic neurotransmitter involved in sympathetic arousal and rage. This is argued to provide direct support for the idea that genetic susceptibility to disease is not determined at birth, but varies with exposure to environmental influences. However, most individuals with conduct disorder or convictions did not have low activity of MAO-A; maltreatment was found to be a stronger predictor of antisocial behavior than differences in MAO-A activity.
The claim that an interaction between low MAO-A activity and maltreatment would cause anti-social behavior has been criticized since the predisposition towards anti-social behavior could equally well have been caused by "other" genes inherited from abusive parents.
A possible link between predisposition to novelty seeking and a genotype of the MAO-A gene has been found.
A particular variant (or genotype), dubbed the "warrior gene" in the popular press, was over-represented in Māori. This supported earlier studies finding different proportions of variants in different ethnic groups, as is the case for many genetic variants: 33% of White/Non-Hispanic individuals and 61% of Asian/Pacific Islander individuals carry the low-activity MAO-A promoter variant.
Unlike many other enzymes, MAO-B activity is increased during aging in the brain of humans and other mammals. Increased MAO-B activity was also found in the pineal gland of aging rats. This may contribute to lowered levels of monoamines in aged brain and pineal gland.
|
https://en.wikipedia.org/wiki?curid=20953
|
Magna Carta
After John's death, the regency government of his young son, Henry III, reissued the document in 1216, stripped of some of its more radical content, in an unsuccessful bid to build political support for their cause. At the end of the war in 1217, it formed part of the peace treaty agreed at Lambeth, where the document acquired the name Magna Carta, to distinguish it from the smaller Charter of the Forest which was issued at the same time. Short of funds, Henry reissued the charter again in 1225 in exchange for a grant of new taxes. His son, Edward I, repeated the exercise in 1297, this time confirming it as part of England's statute law. The charter became part of English political life and was typically renewed by each monarch in turn, although as time went by and the fledgling Parliament of England passed new laws, it lost some of its practical significance.
At the end of the 16th century there was an upsurge in interest in Magna Carta. Lawyers and historians at the time believed that there was an ancient English constitution, going back to the days of the Anglo-Saxons, that protected individual English freedoms. They argued that the Norman invasion of 1066 had overthrown these rights, and that Magna Carta had been a popular attempt to restore them, making the charter an essential foundation for the contemporary powers of Parliament and legal principles such as "habeas corpus". Although this historical account was badly flawed, jurists such as Sir Edward Coke used Magna Carta extensively in the early 17th century, arguing against the divine right of kings propounded by the Stuart monarchs. Both James I and his son Charles I attempted to suppress the discussion of Magna Carta, until the issue was curtailed by the English Civil War of the 1640s and the execution of Charles. The political myth of Magna Carta and its protection of ancient personal liberties persisted after the Glorious Revolution of 1688 until well into the 19th century. It influenced the early American colonists in the Thirteen Colonies and the formation of the United States Constitution, which became the supreme law of the land in the new republic of the United States. Research by Victorian historians showed that the original 1215 charter had concerned the medieval relationship between the monarch and the barons, rather than the rights of ordinary people, but the charter remained a powerful, iconic document, even after almost all of its content was repealed from the statute books in the 19th and 20th centuries.
Magna Carta still forms an important symbol of liberty today, often cited by politicians and campaigners, and is held in great respect by the British and American legal communities, Lord Denning describing it as "the greatest constitutional document of all times – the foundation of the freedom of the individual against the arbitrary authority of the despot". In the 21st century, four exemplifications of the original 1215 charter remain in existence, two at the British Library, one at Lincoln Castle and one at Salisbury Cathedral. There are also a handful of the subsequent charters in public and private ownership, including copies of the 1297 charter in both the United States and Australia. The original charters were written on parchment sheets using quill pens, in heavily abbreviated medieval Latin, which was the convention for legal documents at that time. Each was sealed with the royal great seal (made of beeswax and resin sealing wax): very few of the seals have survived. Although scholars refer to the 63 numbered "clauses" of Magna Carta, this is a modern system of numbering, introduced by Sir William Blackstone in 1759; the original charter formed a single, long unbroken text. The four original 1215 charters were displayed together at the British Library for one day, 3 February 2015, to mark the 800th anniversary of Magna Carta.
Magna Carta originated as an unsuccessful attempt to achieve peace between royalist and rebel factions in 1215, as part of the events leading to the outbreak of the First Barons' War. England was ruled by King John, the third of the Angevin kings. Although the kingdom had a robust administrative system, the nature of government under the Angevin monarchs was ill-defined and uncertain. John and his predecessors had ruled using the principle of "vis et voluntas", or "force and will", taking executive and sometimes arbitrary decisions, often justified on the basis that a king was above the law. Many contemporary writers believed that monarchs should rule in accordance with the custom and the law, with the counsel of the leading members of the realm, but there was no model for what should happen if a king refused to do so.
John had lost most of his ancestral lands in France to King Philip II in 1204 and had struggled to regain them for many years, raising extensive taxes on the barons to accumulate money to fight a war which ended in expensive failure in 1214. Following the defeat of his allies at the Battle of Bouvines, John had to sue for peace and pay compensation. John was already personally unpopular with many of the barons, many of whom owed money to the Crown, and little trust existed between the two sides. A triumph would have strengthened his position, but in the face of his defeat, within a few months after his return from France John found that rebel barons in the north and east of England were organising resistance to his rule.
The rebels took an oath that they would "stand fast for the liberty of the church and the realm", and demanded that the King confirm the Charter of Liberties that had been declared by King Henry I in the previous century, and which was perceived by the barons to protect their rights. The rebel leadership was unimpressive by the standards of the time, even disreputable, but its members were united by their hatred of John; Robert FitzWalter, later elected leader of the rebel barons, claimed publicly that John had attempted to rape his daughter, and was implicated in a plot to assassinate John in 1212.
John held a council in London in January 1215 to discuss potential reforms, and sponsored discussions in Oxford between his agents and the rebels during the spring. Both sides appealed to Pope Innocent III for assistance in the dispute. During the negotiations, the rebellious barons produced an initial document, which historians have termed "the Unknown Charter of Liberties", which drew on Henry I's Charter of Liberties for much of its language; seven articles from that document later appeared in the "Articles of the Barons" and the subsequent charter.
It was John's hope that the Pope would give him valuable legal and moral support, and accordingly John played for time; the King had declared himself to be a papal vassal in 1213 and correctly believed he could count on the Pope for help. John also began recruiting mercenary forces from France, although some were later sent back to avoid giving the impression that the King was escalating the conflict. In a further move to shore up his support, John took an oath to become a crusader, a move which gave him additional political protection under church law, even though many felt the promise was insincere.
Letters backing John arrived from the Pope in April, but by then the rebel barons had organised into a military faction. They congregated at Northampton in May and renounced their feudal ties to John, marching on London, Lincoln, and Exeter. John's efforts to appear moderate and conciliatory had been largely successful, but once the rebels held London, they attracted a fresh wave of defectors from the royalists. The King offered to submit the problem to a committee of arbitration with the Pope as the supreme arbiter, but this was not attractive to the rebels. Stephen Langton, the archbishop of Canterbury, had been working with the rebel barons on their demands, and after the suggestion of papal arbitration failed, John instructed Langton to organise peace talks.
John met the rebel leaders at Runnymede, a water-meadow on the south bank of the River Thames, on 10 June 1215. Runnymede was a traditional place for assemblies, but it was also located on neutral ground between the royal fortress of Windsor Castle and the rebel base at Staines, and offered both sides the security of a rendezvous where they were unlikely to find themselves at a military disadvantage. Here the rebels presented John with their draft demands for reform, the 'Articles of the Barons'. Stephen Langton's pragmatic efforts at mediation over the next ten days turned these incomplete demands into a charter capturing the proposed peace agreement; a few years later, this agreement was renamed Magna Carta, meaning "Great Charter". By 15 June, general agreement had been made on a text, and on 19 June, the rebels renewed their oaths of loyalty to John and copies of the charter were formally issued.
Although, as the historian David Carpenter has noted, the charter "wasted no time on political theory", it went beyond simply addressing individual baronial complaints, and formed a wider proposal for political reform. It promised the protection of church rights, protection from illegal imprisonment, access to swift justice, and, most importantly, limitations on taxation and other feudal payments to the Crown, with certain forms of feudal taxation requiring baronial consent. It focused on the rights of free men, in particular the barons; however, the rights of serfs were included in articles 16, 20, and 28. Its style and content reflected Henry I's Charter of Liberties, as well as a wider body of legal traditions, including the royal charters issued to towns, the operations of the Church and baronial courts, and European charters such as the Statute of Pamiers.
Under what historians later labelled "clause 61", or the "security clause", a council of 25 barons would be created to monitor and ensure John's future adherence to the charter. If John did not conform to the charter within 40 days of being notified of a transgression by the council, the 25 barons were empowered by clause 61 to seize John's castles and lands until, in their judgement, amends had been made. Men were to be compelled to swear an oath to assist the council in controlling the King, but once redress had been made for any breaches, the King would continue to rule as before. In one sense this was not unprecedented; other kings had previously conceded the right of individual resistance to their subjects if the King did not uphold his obligations. Magna Carta was however novel in that it set up a formally recognised means of collectively coercing the King. The historian Wilfred Warren argues that it was almost inevitable that the clause would result in civil war, as it "was crude in its methods and disturbing in its implications". The barons were trying to force John to keep to the charter, but clause 61 was so heavily weighted against the King that this version of the charter could not survive.
John and the rebel barons did not trust each other, and neither side seriously attempted to implement the peace accord. The 25 barons selected for the new council were all rebels, chosen by the more extremist barons, and many among the rebels found excuses to keep their forces mobilised. Disputes began to emerge between the royalist faction and those rebels who had expected the charter to return lands that had been confiscated.
Clause 61 of Magna Carta contained a commitment from John that he would "seek to obtain nothing from anyone, in our own person or through someone else, whereby any of these grants or liberties may be revoked or diminished". Despite this, the King appealed to Pope Innocent for help in July, arguing that the charter compromised the Pope's rights as John's feudal lord. As part of the June peace deal, the barons were supposed to surrender London by 15 August, but this they refused to do. Meanwhile, instructions from the Pope arrived in August, written before the peace accord, with the result that papal commissioners excommunicated the rebel barons and suspended Langton from office in early September. Once aware of the charter, the Pope responded in detail: in a letter dated 24 August and arriving in late September, he declared the charter to be "not only shameful and demeaning but also illegal and unjust" since John had been "forced to accept" it, and accordingly the charter was "null, and void of all validity for ever"; under threat of excommunication, the King was not to observe the charter, nor the barons try to enforce it.
By then, violence had broken out between the two sides; less than three months after it had been agreed, John and the loyalist barons firmly repudiated the failed charter: the First Barons' War erupted. The rebel barons concluded that peace with John was impossible, and turned to Philip II's son, the future Louis VIII, for help, offering him the English throne. The war soon settled into a stalemate. The King became ill and died on the night of 18 October 1216, leaving the nine-year-old Henry III as his heir.
The preamble to Magna Carta includes the names of 27 ecclesiastical and secular magnates who had counselled John to accept its terms, among them some of the moderate reformers, notably Archbishop Stephen Langton, and some of John's loyal supporters, such as William Marshal, Earl of Pembroke.
The names of the Twenty-Five Barons appointed under clause 61 to monitor John's future conduct are not given in the charter itself, but do appear in four early sources, all seemingly based on a contemporary listing: a late 13th-century collection of law tracts and statutes, a Reading Abbey manuscript now in Lambeth Palace Library, and the "Chronica Majora" and "Liber Additamentorum" of Matthew Paris. The process of appointment is not known, but the names were drawn almost exclusively from among John's more active opponents.
In September 1215, the papal commissioners in England – Subdeacon Pandulf, Peter des Roches, Bishop of Winchester, and Simon, Abbot of Reading – excommunicated the rebels, acting on instructions earlier received from Rome. A letter sent by the commissioners from Dover on 5 September to Archbishop Langton explicitly named nine senior rebel barons (all members of the Council of Twenty-Five) and six clerics numbered among the rebel ranks.
Although the Charter of 1215 was a failure as a peace treaty, it was resurrected under the new government of the young Henry III as a way of drawing support away from the rebel faction. On his deathbed, King John appointed a council of thirteen executors to help Henry reclaim the kingdom, and requested that his son be placed into the guardianship of William Marshal, one of the most famous knights in England. William knighted the boy, and Cardinal Guala Bicchieri, the papal legate to England, then oversaw his coronation at Gloucester Cathedral on 28 October.
The young King inherited a difficult situation, with over half of England occupied by the rebels. He had substantial support though from Guala, who intended to win the civil war for Henry and punish the rebels. Guala set about strengthening the ties between England and the Papacy, starting with the coronation itself, during which Henry gave homage to the Papacy, recognising the Pope as his feudal lord. Pope Honorius III declared that Henry was the Pope's vassal and ward, and that the legate had complete authority to protect Henry and his kingdom. As an additional measure, Henry took the cross, declaring himself a crusader and thereby entitled to special protection from Rome.
The war was not going well for the loyalists, but Prince Louis and the rebel barons were also finding it difficult to make further progress. John's death had defused some of the rebel concerns, and the royal castles were still holding out in the occupied parts of the country. Henry's government encouraged the rebel barons to come back to his cause in exchange for the return of their lands, and reissued a version of the 1215 Charter, albeit having first removed some of the clauses, including those unfavourable to the Papacy and clause 61, which had set up the council of barons. The move was not successful, and opposition to Henry's new government hardened.
In February 1217, Louis set sail for France to gather reinforcements. In his absence, arguments broke out between Louis' French and English followers, and Cardinal Guala declared that Henry's war against the rebels was the equivalent of a religious crusade. This declaration resulted in a series of defections from the rebel movement, and the tide of the conflict swung in Henry's favour. Louis returned at the end of April, but his northern forces were defeated by William Marshal at the Battle of Lincoln in May.
Meanwhile, support for Louis' campaign was diminishing in France, and he concluded that the war in England was lost. He negotiated terms with Cardinal Guala, under which Louis would renounce his claim to the English throne; in return, his followers would be given back their lands, any sentences of excommunication would be lifted, and Henry's government would promise to enforce the charter of the previous year. The proposed agreement soon began to unravel amid claims from some loyalists that it was too generous towards the rebels, particularly the clergy who had joined the rebellion.
In the absence of a settlement, Louis stayed in London with his remaining forces, hoping for the arrival of reinforcements from France. When the expected fleet did arrive in August, it was intercepted and defeated by loyalists at the Battle of Sandwich. Louis entered into fresh peace negotiations, and the factions came to agreement on the final Treaty of Lambeth, also known as the Treaty of Kingston, on 12 and 13 September 1217. The treaty was similar to the first peace offer, but excluded the rebel clergy, whose lands and appointments remained forfeit; it included a promise, however, that Louis' followers would be allowed to enjoy their traditional liberties and customs, referring back to the Charter of 1216. Louis left England as agreed and joined the Albigensian Crusade in the south of France, bringing the war to an end.
A great council was called in October and November to take stock of the post-war situation; this council is thought to have formulated and issued the Charter of 1217. The charter resembled that of 1216, although some additional clauses were added to protect the rights of the barons over their feudal subjects, and the restrictions on the Crown's ability to levy taxation were watered down. There remained a range of disagreements about the management of the royal forests, which involved a special legal system that had resulted in a source of considerable royal revenue; complaints existed over both the implementation of these courts, and the geographic boundaries of the royal forests. A complementary charter, the Charter of the Forest, was created, pardoning existing forest offences, imposing new controls over the forest courts, and establishing a review of the forest boundaries. To distinguish the two charters, the term "magna carta libertatum", "the great charter of liberties", was used by the scribes to refer to the larger document, which in time became known simply as Magna Carta.
Magna Carta became increasingly embedded into English political life during Henry III's minority. As the King grew older, his government slowly began to recover from the civil war, regaining control of the counties and beginning to raise revenue once again, taking care not to overstep the terms of the charters. Henry remained a minor and his government's legal ability to make permanently binding decisions on his behalf was limited. In 1223, the tensions over the status of the charters became clear in the royal court, when Henry's government attempted to reassert its rights over its properties and revenues in the counties, facing resistance from many communities that argued—if sometimes incorrectly—that the charters protected the new arrangements. This resistance resulted in an argument between Archbishop Langton and William Brewer over whether the King had any duty to fulfil the terms of the charters, given that he had been forced to agree to them. On this occasion, Henry gave oral assurances that he considered himself bound by the charters, enabling a royal inquiry into the situation in the counties to progress.
Two years later, the question of Henry's commitment to the charters re-emerged, when Louis VIII of France invaded Henry's remaining provinces in France, Poitou and Gascony. Henry's army in Poitou was under-resourced, and the province quickly fell. It became clear that Gascony would also fall unless reinforcements were sent from England. In early 1225, a great council approved a tax of £40,000 to dispatch an army, which quickly retook Gascony. In exchange for agreeing to support Henry, the barons demanded that the King reissue Magna Carta and the Charter of the Forest. The content was almost identical to the 1217 versions, but in the new versions, the King declared that the charters were issued of his own "spontaneous and free will" and confirmed them with the royal seal, giving the new Great Charter and the Charter of the Forest of 1225 much more authority than the previous versions.
The barons anticipated that the King would act in accordance with these charters, subject to the law and moderated by the advice of the nobility. Uncertainty continued, and in 1227, when he was declared of age and able to rule independently, Henry announced that future charters had to be issued under his own seal. This brought into question the validity of the previous charters issued during his minority, and Henry actively threatened to overturn the Charter of the Forest unless the taxes promised in return for it were actually paid. In 1253, Henry confirmed the charters once again in exchange for taxation.
Henry placed a symbolic emphasis on rebuilding royal authority, but his rule was relatively circumscribed by Magna Carta. He generally acted within the terms of the charters, which prevented the Crown from taking extrajudicial action against the barons, including the fines and expropriations that had been common under his father, John. The charters did not address the sensitive issues of the appointment of royal advisers and the distribution of patronage, and they lacked any means of enforcement if the King chose to ignore them. The inconsistency with which he applied the charters over the course of his rule alienated many barons, even those within his own faction.
Despite the various charters, the provision of royal justice was inconsistent and driven by the needs of immediate politics: sometimes action would be taken to address a legitimate baronial complaint, while on other occasions the problem would simply be ignored. The royal courts, which toured the country to provide justice at the local level, typically for lesser barons and the gentry claiming grievances against major lords, had little power, allowing the major barons to dominate the local justice system. Henry's rule became lax and careless, resulting in a reduction in royal authority in the provinces and, ultimately, the collapse of his authority at court.
In 1258, a group of barons seized power from Henry in a "coup d'état", citing the need to strictly enforce Magna Carta and the Charter of the Forest, creating a new baronial-led government to advance reform through the Provisions of Oxford. The barons were not militarily powerful enough to win a decisive victory, and instead appealed to Louis IX of France in 1263–1264 to arbitrate on their proposed reforms. The reformist barons argued their case based on Magna Carta, suggesting that it was inviolable under English law and that the King had broken its terms.
Louis came down firmly in favour of Henry, but the French arbitration failed to achieve peace as the rebellious barons refused to accept the verdict. England slipped back into the Second Barons' War, which was won by Henry's son, the Lord Edward. Edward also invoked Magna Carta in advancing his cause, arguing that the reformers had taken matters too far and were themselves acting against Magna Carta. In a conciliatory gesture after the barons had been defeated, in 1267 Henry issued the Statute of Marlborough, which included a fresh commitment to observe the terms of Magna Carta.
King Edward I reissued the Charters of 1225 in 1297 in return for a new tax. It is this version which remains in statute today, although with most articles now repealed.
The "Confirmatio Cartarum" (Confirmation of Charters) was issued in Norman French by Edward I in 1297. Edward, needing money, had taxed the nobility, and they had armed themselves against him, forcing Edward to issue his confirmation of Magna Carta and the Forest Charter to avoid civil war. The nobles had sought to add another document, the "De Tallagio", to Magna Carta. Edward I's government was not prepared to concede this, they agreed to the issuing of the "Confirmatio", confirming the previous charters and confirming the principle that taxation should be by consent, although the precise manner of that consent was not laid down.
A passage mandates that copies shall be distributed in "cathedral churches throughout our realm, there to remain, and shall be read before the people two times by the year", hence the permanent installation of a copy in Salisbury Cathedral.
With the reconfirmation of the Charters in 1300, an additional document was granted, the "Articuli super Cartas" (The Articles upon the Charters). It was composed of 17 articles and sought in part to deal with the problem of enforcing the Charters. Magna Carta and the Forest Charter were to be issued to the sheriff of each county, and should be read four times a year at the meetings of the county courts. Each county should have a committee of three men who could hear complaints about violations of the Charters.
Pope Clement V continued the papal policy of supporting monarchs (who ruled by divine grace) against any claims in Magna Carta which challenged the King's rights, and annulled the "Confirmatio Cartarum" in 1305. Edward I interpreted Clement V's papal bull annulling the "Confirmatio Cartarum" as effectively applying to the "Articuli super Cartas", although the latter was not specifically mentioned. In 1306 Edward I took the opportunity given by the Pope's backing to reassert forest law over large areas which had been "disafforested". Both Edward and the Pope were accused by some contemporary chroniclers of "perjury", and it was suggested by Robert McNair Scott that Robert the Bruce refused to make peace with Edward I's son, Edward II, in 1312 with the justification: "How shall the king of England keep faith with me, since he does not observe the sworn promises made to his liege men..."
The Great Charter was referred to in legal cases throughout the medieval period. For example, in 1226, the knights of Lincolnshire argued that their local sheriff was changing customary practice regarding the local courts, "contrary to their liberty which they ought to have by the charter of the lord king". In practice, cases were not brought against the King for breach of Magna Carta and the Forest Charter, but it was possible to bring a case against the King's officers, such as his sheriffs, using the argument that the King's officers were acting contrary to liberties granted by the King in the charters.
In addition, medieval cases referred to the clauses in Magna Carta which dealt with specific issues such as wardship and dower, debt collection, and keeping rivers free for navigation. Even in the 13th century, some clauses of Magna Carta rarely appeared in legal cases, either because the issues concerned were no longer relevant, or because Magna Carta had been superseded by more relevant legislation. By 1350 half the clauses of Magna Carta were no longer actively used.
During the reign of King Edward III six measures, later known as the "Six Statutes", were passed between 1331 and 1369. They sought to clarify certain parts of the Charters. In particular the third statute, in 1354, redefined clause 29, with "free man" becoming "no man, of whatever estate or condition he may be", and introduced the phrase "due process of law" for "lawful judgement of his peers or the law of the land".
Between the 13th and 15th centuries Magna Carta was reconfirmed 32 times according to Sir Edward Coke, and possibly as many as 45 times. Often the first item of parliamentary business was a public reading and reaffirmation of the Charter, and, as in the previous century, parliaments often exacted confirmation of it from the monarch. The Charter was confirmed in 1423 by King Henry VI.
By the mid-15th century, Magna Carta ceased to occupy a central role in English political life, as monarchs reasserted authority and powers which had been challenged in the 100 years after Edward I's reign. The Great Charter remained a text for lawyers, particularly as a protector of property rights, and became more widely read than ever as printed versions circulated and levels of literacy increased.
During the 16th century, the interpretation of Magna Carta and the First Barons' War shifted. Henry VII took power at the end of the turbulent Wars of the Roses, followed by Henry VIII, and extensive propaganda under both rulers promoted the legitimacy of the regime, the illegitimacy of any sort of rebellion against royal power, and the priority of supporting the Crown in its arguments with the Papacy.
Tudor historians rediscovered the Barnwell chronicler, who was more favourable to King John than other 13th-century texts, and, as historian Ralph Turner describes, they "viewed King John in a positive light as a hero struggling against the papacy", showing "little sympathy for the Great Charter or the rebel barons". Pro-Catholic demonstrations during the 1536 uprising cited Magna Carta, accusing the King of not giving it sufficient respect.
The first mechanically printed edition of Magna Carta was probably the "Magna Carta cum aliis Antiquis Statutis" of 1508 by Richard Pynson, although the early printed versions of the 16th century incorrectly attributed the origins of Magna Carta to Henry III and 1225, rather than to John and 1215, and accordingly worked from the later text. An abridged English-language edition was published by John Rastell in 1527.
Thomas Berthelet, Pynson's successor as the royal printer during 1530–1547, printed an edition of the text along with other "ancient statutes" in 1531 and 1540. In 1534, George Ferrers published the first unabridged English-language edition of Magna Carta, dividing the Charter into 37 numbered clauses.
At the end of the 16th century, there was an upsurge in antiquarian interest in England. This work concluded that there was a set of ancient English customs and laws, temporarily overthrown by the Norman invasion of 1066, which had then been recovered in 1215 and recorded in Magna Carta, which in turn gave authority to important 16th-century legal principles. Modern historians note that although this narrative was fundamentally incorrect—many refer to it as a "myth"—it took on great importance among the legal historians of the time.
The antiquarian William Lambarde, for example, published what he believed were the Anglo-Saxon and Norman law codes, tracing the origins of the 16th-century English Parliament back to this period, albeit misinterpreting the dates of many documents concerned. Francis Bacon argued that clause 39 of Magna Carta was the basis of the 16th-century jury system and judicial processes. Antiquarians Robert Beale, James Morice, and Richard Cosin argued that Magna Carta was a statement of liberty and a fundamental, supreme law empowering English government. Those who questioned these conclusions, including the Member of Parliament Arthur Hall, faced sanctions.
In the early 17th century, Magna Carta became increasingly important as a political document in arguments over the authority of the English monarchy. James I and Charles I both propounded greater authority for the Crown, justified by the doctrine of the divine right of kings, and Magna Carta was cited extensively by their opponents to challenge the monarchy.
Magna Carta, it was argued, recognised and protected the liberty of individual Englishmen, made the King subject to the common law of the land, formed the origin of the trial by jury system, and acknowledged the ancient origins of Parliament: because of Magna Carta and this ancient constitution, an English monarch was unable to alter these long-standing English customs. Although the arguments based on Magna Carta were historically inaccurate, they nonetheless carried symbolic power, as the charter had immense significance during this period; antiquarians such as Sir Henry Spelman described it as "the most majestic and a sacrosanct anchor to English Liberties".
Sir Edward Coke was a leader in using Magna Carta as a political tool during this period. Still working from the 1225 version of the text—the first printed copy of the 1215 charter only emerged in 1610—Coke spoke and wrote about Magna Carta repeatedly. His work was challenged at the time by Lord Ellesmere, and modern historians such as Ralph Turner and Claire Breay have critiqued Coke as "misconstruing" the original charter "anachronistically and uncritically", and taking a "very selective" approach to his analysis. More sympathetically, J. C. Holt noted that the history of the charters had already become "distorted" by the time Coke was carrying out his work.
In 1621, a bill was presented to Parliament to renew Magna Carta; although this bill failed, lawyer John Selden argued during Darnell's Case in 1627 that the right of "habeas corpus" was backed by Magna Carta. Coke supported the Petition of Right in 1628, which cited Magna Carta in its preamble, attempting to extend the provisions, and to make them binding on the judiciary. The monarchy responded by arguing that the historical legal situation was much less clear-cut than was being claimed, restricted the activities of antiquarians, arrested Coke for treason, and suppressed his proposed book on Magna Carta. Charles initially did not agree to the Petition of Right, and refused to confirm Magna Carta in any way that would reduce his independence as King.
England descended into civil war in the 1640s, resulting in Charles I's execution in 1649. Under the republic that followed, some questioned whether Magna Carta, an agreement with a monarch, was still relevant. An anti-Cromwellian pamphlet published in 1660, "The English devil", said that the nation had been "compelled to submit to this Tyrant Nol or be cut off by him; nothing but a word and a blow, his Will was his Law; tell him of Magna Carta, he would lay his hand on his sword and cry Magna Farta". In a 2005 speech the Lord Chief Justice of England and Wales, Lord Woolf, repeated the claim that Cromwell had referred to Magna Carta as "Magna Farta".
The radical groups that flourished during this period held differing opinions of Magna Carta. The Levellers rejected history and law as presented by their contemporaries, holding instead to an "anti-Normanism" viewpoint. John Lilburne, for example, argued that Magna Carta contained only some of the freedoms that had supposedly existed under the Anglo-Saxons before being crushed by the Norman yoke. The Leveller Richard Overton described the charter as "a beggarly thing containing many marks of intolerable bondage". Both saw Magna Carta as a useful declaration of liberties that could be used against governments they disagreed with. Gerrard Winstanley, the leader of the more extreme Diggers, stated "the best lawes that England hath, [viz., Magna Carta] were got by our Forefathers importunate petitioning unto the kings that still were their Task-masters; and yet these best laws are yoaks and manicles, tying one sort of people to be slaves to another; Clergy and Gentry have got their freedom, but the common people still are, and have been left servants to work for them."
The first attempt at a proper historiography was undertaken by Robert Brady, who refuted the supposed antiquity of Parliament and belief in the immutable continuity of the law. Brady realised that the liberties of the Charter were limited and argued that the liberties were the grant of the King. By putting Magna Carta in historical context, he cast doubt on its contemporary political relevance; his historical understanding did not survive the Glorious Revolution, which, according to the historian J. G. A. Pocock, "marked a setback for the course of English historiography."
According to the Whig interpretation of history, the Glorious Revolution was an example of the reclaiming of ancient liberties. Reinforced with Lockean concepts, the Whigs believed England's constitution to be a social contract, based on documents such as Magna Carta, the Petition of Right, and the Bill of Rights. The "English Liberties" (1680, in later versions often "British Liberties") by the Whig propagandist Henry Care (d. 1688) was a cheap polemical book that was influential and much-reprinted, in the American colonies as well as Britain, and made Magna Carta central to the history and the contemporary legitimacy of its subject.
Ideas about the nature of law in general were beginning to change. In 1716, the Septennial Act was passed, which had a number of consequences. First, it showed that Parliament no longer considered its previous statutes unassailable, as it provided for a maximum parliamentary term of seven years, whereas the Triennial Act (1694) (enacted less than a quarter of a century previously) had provided for a maximum term of three years.
It also greatly extended the powers of Parliament. Under this new constitution, monarchical absolutism was replaced by parliamentary supremacy. It was quickly realised that Magna Carta stood in the same relation to the King-in-Parliament as it had to the King without Parliament. This supremacy would be challenged by the likes of Granville Sharp. Sharp regarded Magna Carta as a fundamental part of the constitution, and maintained that it would be treason to repeal any part of it. He also held that the Charter prohibited slavery.
Sir William Blackstone published a critical edition of the 1215 Charter in 1759, and gave it the numbering system still used today. In 1763, Member of Parliament John Wilkes was arrested for writing an inflammatory pamphlet, "No. 45, 23 April 1763"; he cited Magna Carta continually. Lord Camden denounced the treatment of Wilkes as a contravention of Magna Carta. Thomas Paine, in his "Rights of Man", would disregard Magna Carta and the Bill of Rights on the grounds that they were not a written constitution devised by elected representatives.
When English colonists left for the New World, they brought royal charters that established the colonies. The Massachusetts Bay Company charter, for example, stated that the colonists would "have and enjoy all liberties and immunities of free and natural subjects." The Virginia Charter of 1606, which was largely drafted by Sir Edward Coke, stated that the colonists would have the same "liberties, franchises and immunities" as people born in England. The Massachusetts Body of Liberties contained similarities to clause 29 of Magna Carta; when drafting it, the Massachusetts General Court viewed Magna Carta as the chief embodiment of English common law. The other colonies would follow their example. In 1638, Maryland sought to recognise Magna Carta as part of the law of the province, but the request was denied by Charles I.
In 1687, William Penn published "The Excellent Privilege of Liberty and Property: being the birth-right of the Free-Born Subjects of England", which contained the first copy of Magna Carta printed on American soil. Penn's comments reflected Coke's, indicating a belief that Magna Carta was a fundamental law. The colonists drew on English law books, leading them to an anachronistic interpretation of Magna Carta, believing that it guaranteed trial by jury and "habeas corpus".
The development of parliamentary supremacy in the British Isles did not constitutionally affect the Thirteen Colonies, which retained an adherence to English common law, but it directly affected the relationship between Britain and the colonies. When American colonists fought against Britain, they were fighting not so much for new freedom, but to preserve liberties and rights that they believed to be enshrined in Magna Carta.
In the late 18th century, the United States Constitution became the supreme law of the land, recalling the manner in which Magna Carta had come to be regarded as fundamental law. The Constitution's Fifth Amendment guarantees that "no person shall be deprived of life, liberty, or property, without due process of law", a phrase that was derived from Magna Carta. In addition, the Constitution included a similar writ in the Suspension Clause, Article 1, Section 9: "The privilege of the writ of habeas corpus shall not be suspended, unless when in cases of rebellion or invasion, the public safety may require it."
Each of these proclaims that no person may be imprisoned or detained without evidence that he or she committed a crime. The Ninth Amendment states that "The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people." The writers of the U.S. Constitution wished to ensure that the rights they already held, such as those that they believed were provided by Magna Carta, would be preserved unless explicitly curtailed.
The U.S. Supreme Court has explicitly referenced Edward Coke's analysis of Magna Carta as an antecedent of the Sixth Amendment's right to a speedy trial.
Initially, the Whig interpretation of Magna Carta and its role in constitutional history remained dominant during the 19th century. The historian William Stubbs's "Constitutional History of England", published in the 1870s, formed the high-water mark of this view. Stubbs argued that Magna Carta had been a major step in the shaping of the English nation, and he believed that the barons at Runnymede in 1215 were not just representing the nobility, but the people of England as a whole, standing up to a tyrannical ruler in the form of King John.
This view of Magna Carta began to recede. The late-Victorian jurist and historian Frederic William Maitland provided an alternative academic history in 1899, which began to return Magna Carta to its historical roots. In 1904, Edward Jenks published an article entitled "The Myth of Magna Carta", which undermined the previously accepted view of Magna Carta. Historians such as Albert Pollard agreed with Jenks in concluding that Edward Coke had largely "invented" the myth of Magna Carta in the 17th century; these historians argued that the 1215 charter had not referred to liberty for the people at large, but rather to the protection of baronial rights.
This view also became popular in wider circles, and in 1930 Sellar and Yeatman published their parody on English history, "1066 and All That", in which they mocked the supposed importance of Magna Carta and its promises of universal liberty: "Magna Charter was therefore the chief cause of Democracy in England, and thus a "Good Thing" for everyone (except the Common People)".
In many literary representations of the medieval past, however, Magna Carta remained a foundation of English national identity. Some authors used the medieval roots of the document as an argument to preserve the social status quo, while others pointed to Magna Carta to challenge perceived economic injustices. The Baronial Order of Magna Charta was formed in 1898 to promote the ancient principles and values felt to be displayed in Magna Carta. The legal profession in England and the United States continued to hold Magna Carta in high esteem; they were instrumental in forming the Magna Carta Society in 1922 to protect the meadows at Runnymede from development in the 1920s, and in 1957, the American Bar Association erected the Magna Carta Memorial at Runnymede. The prominent lawyer Lord Denning described Magna Carta in 1956 as "the greatest constitutional document of all times – the foundation of the freedom of the individual against the arbitrary authority of the despot".
Radicals such as Sir Francis Burdett believed that Magna Carta could not be repealed, but in the 19th century clauses which were obsolete or had been superseded began to be repealed. The repeal of clause 36 in 1829, by the Offences against the Person Act 1828 (9 Geo. 4 c. 31 s. 1), was the first time a clause of Magna Carta was repealed. Over the next 140 years, nearly the whole of Magna Carta (1297) as statute was repealed, leaving just clauses 1, 9, and 29 still in force (in England and Wales) after 1969. Most of the clauses were repealed in England and Wales by the Statute Law Revision Act 1863, and in modern Northern Ireland and also in the modern Republic of Ireland by the Statute Law (Ireland) Revision Act 1872.
Many later attempts to draft constitutional forms of government trace their lineage back to Magna Carta. The British dominions, Australia and New Zealand, Canada (except Quebec), and formerly the Union of South Africa and Southern Rhodesia, reflected the influence of Magna Carta in their laws, and the Charter's effects can be seen in the laws of other states that evolved from the British Empire.
Magna Carta continues to have a powerful iconic status in British society, being cited by politicians and lawyers in support of constitutional positions. Its perceived guarantee of trial by jury and other civil liberties, for example, led Tony Benn to describe the 2008 debate over whether to increase the maximum time terrorism suspects could be held without charge from 28 to 42 days as "the day Magna Carta was repealed". Although rarely invoked in court in the modern era, in 2012 the Occupy London protestors attempted to use Magna Carta in resisting their eviction from St. Paul's Churchyard by the City of London. In his judgment the Master of the Rolls gave this short shrift, noting somewhat drily that although clause 29 was considered by many the foundation of the rule of law in England, he did not consider it directly relevant to the case, and that the two other surviving clauses ironically concerned the rights of the Church and the City of London and could not help the defendants.
Magna Carta carries little legal weight in modern Britain, as most of its clauses have been repealed and relevant rights ensured by other statutes, but the historian James Holt remarks that the survival of the 1215 charter in national life is a "reflexion of the continuous development of English law and administration" and symbolic of the many struggles between authority and the law over the centuries. The historian W. L. Warren has observed that "many who knew little and cared less about the content of the Charter have, in nearly all ages, invoked its name, and with good cause, for it meant more than it said".
It also remains a topic of great interest to historians; Natalie Fryde characterised the charter as "one of the holiest of cows in English medieval history", with the debates over its interpretation and meaning unlikely to end. In many ways still a "sacred text", Magna Carta is generally considered part of the uncodified constitution of the United Kingdom; in a 2005 speech, the Lord Chief Justice of England and Wales, Lord Woolf, described it as the "first of a series of instruments that now are recognised as having a special constitutional status".
Magna Carta was reprinted in New Zealand in 1881 as one of the Imperial Acts in force there. Clause 29 of the document remains in force as part of New Zealand law.
The document also continues to be honoured in the United States as an antecedent of the United States Constitution and Bill of Rights. In 1976, the UK lent one of four surviving originals of the 1215 Magna Carta to the United States for their bicentennial celebrations and also donated an ornate display case for it. The original was returned after one year, but a replica and the case are still on display in the United States Capitol Crypt in Washington, D.C.
The 800th anniversary of the original charter occurred on 15 June 2015, and organisations and institutions planned celebratory events. The British Library brought together the four existing copies of the 1215 manuscript in February 2015 for a special exhibition. British artist Cornelia Parker was commissioned to create a new artwork, "Magna Carta (An Embroidery)", which was shown at the British Library between May and July 2015. The artwork is a copy of the Wikipedia article about Magna Carta (as it appeared on the document's 799th anniversary, 15 June 2014), hand-embroidered by over 200 people.
On 15 June 2015, a commemoration ceremony was conducted in Runnymede at the National Trust park, attended by British and American dignitaries. On the same day, Google celebrated the anniversary with a Google Doodle.
The copy held by Lincoln Cathedral was exhibited in the Library of Congress in Washington, D.C., from November 2014 until January 2015. A new visitor centre at Lincoln Castle was opened for the anniversary. The Royal Mint released two commemorative two-pound coins.
In 2014, Bury St Edmunds in Suffolk celebrated the 800th anniversary of the barons' Charter of Liberties, said to have been secretly agreed there in November 1214.
Numerous copies, known as exemplifications, were made of the various charters, and many of them still survive. The documents were written in heavily abbreviated medieval Latin in clear handwriting, using quill pens on sheets of parchment made from sheepskin. They were sealed with the royal great seal by an official called the spigurnel, equipped with a special seal press, using beeswax and resin. There were no signatures on the charter of 1215, and the barons present did not attach their own seals to it. The text was not divided into paragraphs or numbered clauses: the numbering system used today was introduced by the jurist Sir William Blackstone in 1759.
At least thirteen original copies of the charter of 1215 were issued by the royal chancery during that year, seven in the first tranche distributed on 24 June and another six later; they were sent to county sheriffs and bishops, who were probably charged for the privilege. Slight variations exist between the surviving copies, and there was probably no single "master copy". Of these documents, only four survive, all held in England: two now at the British Library, one at Salisbury Cathedral, and one, the property of Lincoln Cathedral, on permanent loan to Lincoln Castle. Each of these versions is slightly different in size and text, and each is considered by historians to be equally authoritative.
The two 1215 charters held by the British Library, known as "Cotton MS. Augustus II.106" and "Cotton Charter XIII.31a", were acquired by the antiquarian Sir Robert Cotton in the 17th century. The first had been found by Humphrey Wyems, a London lawyer, who may have discovered it in a tailor's shop, and who gave it to Cotton in January 1629. The second was found in Dover Castle in 1630 by Sir Edward Dering. The Dering charter was traditionally thought to be the copy sent in 1215 to the Cinque Ports; but in 2015 the historian David Carpenter argued that it was more probably that sent to Canterbury Cathedral, as its text was identical to a transcription made from the Cathedral's copy of the 1215 charter in the 1290s. This copy was damaged in the Cotton library fire of 1731, when its seal was badly melted. The parchment was somewhat shrivelled but otherwise relatively unscathed, and an engraved facsimile of the charter was made by John Pine in 1733. In the 1830s, however, an ill-judged and bungled attempt at cleaning and conservation rendered the manuscript largely illegible to the naked eye. This is, nonetheless, the only surviving 1215 copy still to have its great seal attached.
Lincoln Cathedral's copy has been held by the county since 1215. It was displayed in the Common Chamber in the cathedral, before being moved to another building in 1846. Between 1939 and 1940 it was displayed in the British Pavilion at the 1939 World's Fair in New York City, and at the Library of Congress. When the Second World War broke out, Winston Churchill wanted to give the charter to the American people, hoping that this would encourage the United States, then neutral, to enter the war against the Axis powers, but the cathedral was unwilling, and the plans were dropped. After December 1941, the copy was stored in Fort Knox, Kentucky, for safety, before being put on display again in 1944 and returned to Lincoln Cathedral in early 1946. It was put on display in 1976 in the cathedral's medieval library. It was subsequently displayed in San Francisco, and was taken off display for a time to undergo conservation in preparation for another visit to the United States, where it was exhibited in 2007 at the Contemporary Art Center of Virginia and the National Constitution Center in Philadelphia. In 2009 it returned to New York to be displayed at the Fraunces Tavern Museum. It is currently on permanent loan to the David P. J. Ross Vault at Lincoln Castle, along with an original copy of the 1217 Charter of the Forest.
The fourth copy, held by Salisbury Cathedral, was first given in 1215 to its predecessor, Old Sarum Cathedral. Rediscovered by the cathedral in 1812, it has remained in Salisbury throughout its history, except when being taken off-site for restoration work. It is possibly the best preserved of the four, although small pin holes can be seen in the parchment from where it was once pinned up. The handwriting on this version is different from that of the other three, suggesting that it was not written by a royal scribe but rather by a member of the cathedral staff, who then had it exemplified by the royal court.
Other early versions of the charters survive today. Only one exemplification of the 1216 charter survives, held in Durham Cathedral. Four copies of the 1217 charter exist; three of these are held by the Bodleian Library in Oxford and one by Hereford Cathedral. Hereford's copy is occasionally displayed alongside the Mappa Mundi in the cathedral's chained library and has survived along with a small document called the "Articuli super Cartas" that was sent along with the charter, telling the sheriff of the county how to observe the conditions outlined in the document. One of the Bodleian's copies was displayed at San Francisco's California Palace of the Legion of Honor in 2011.
Four exemplifications of the 1225 charter survive: the British Library holds one, which was preserved at Lacock Abbey until 1945; Durham Cathedral also holds a copy, with the Bodleian Library holding a third. The fourth copy of the 1225 exemplification was held by the museum of the Public Record Office and is now held by The National Archives. The Society of Antiquaries also holds a draft of the 1215 charter (discovered in 2013 in a late 13th-century register from Peterborough Abbey), a copy of the 1225 third re-issue (within an early 14th-century collection of statutes) and a roll copy of the 1225 reissue.
Only two exemplifications of Magna Carta are held outside England, both from 1297. One of these was purchased in 1952 by the Australian Government for £12,500 from King's School, Bruton, England. This copy is now on display in the Members' Hall of Parliament House, Canberra. The second was originally held by the Brudenell family, earls of Cardigan, before they sold it in 1984 to the Perot Foundation in the United States, which in 2007 sold it to U.S. businessman David Rubenstein for US$21.3 million. Rubenstein commented "I have always believed that this was an important document to our country, even though it wasn't drafted in our country. I think it was the basis for the Declaration of Independence and the basis for the Constitution". This exemplification is now on permanent loan to the National Archives in Washington, D.C. Only two other 1297 exemplifications survive, one of which is held in the UK's National Archives, the other in the Guildhall, London.
Seven copies of the 1300 exemplification by Edward I survive, in Faversham, Oriel College, Oxford, the Bodleian Library, Durham Cathedral, Westminster Abbey, the City of London (held in the archives at the London Guildhall) and Sandwich (held in the Kent County Council archives). The Sandwich copy was rediscovered in early 2015 in a Victorian scrapbook in the town archives of Sandwich, Kent, one of the Cinque Ports. In the case of the Sandwich and Oriel College exemplifications, the copies of the Charter of the Forest originally issued with them also survive.
Most of the 1215 charter and later versions sought to govern the feudal rights of the Crown over the barons. Under the Angevin kings, and in particular during John's reign, the rights of the King had frequently been used inconsistently, often in an attempt to maximise the royal income from the barons. Feudal relief was one way that a king could demand money, and clauses 2 and 3 fixed the fees payable when an heir inherited an estate or when a minor came of age and took possession of his lands. Scutage was a form of medieval taxation; all knights and nobles owed military service to the Crown in return for their lands, which theoretically belonged to the King, but many preferred to avoid this service and offer money instead; the Crown often used the cash to pay for mercenaries. The rate of scutage that should be payable, and the circumstances under which it was appropriate for the King to demand it, was uncertain and controversial; clauses 12 and 14 addressed the management of the process.
The English judicial system had altered considerably over the previous century, with the royal judges playing a larger role in delivering justice across the country. John had used his royal discretion to extort large sums of money from the barons, effectively taking payment to offer justice in particular cases, and the role of the Crown in delivering justice had become politically sensitive among the barons. Clauses 39 and 40 demanded due process be applied in the royal justice system, while clause 45 required that the King appoint knowledgeable royal officials to the relevant roles. Although these clauses did not have any special significance in the original charter, this part of Magna Carta became singled out as particularly important in later centuries. In the United States, for example, the Supreme Court of California interpreted clause 45 in 1974 as establishing a requirement in common law that a defendant faced with the potential of incarceration be entitled to a trial overseen by a legally trained judge.
Royal forests were economically important in medieval England and were both protected and exploited by the Crown, supplying the King with hunting grounds, raw materials, and money. They were subject to special royal jurisdiction and the resulting forest law was, according to the historian Richard Huscroft, "harsh and arbitrary, a matter purely for the King's will". The size of the forests had expanded under the Angevin kings, an unpopular development.
The 1215 charter had several clauses relating to the royal forests; clauses 47 and 48 promised to deforest the lands added to the forests under John and investigate the use of royal rights in this area, but notably did not address the forestation of the previous kings, while clause 53 promised some form of redress for those affected by the recent changes, and clause 44 promised some relief from the operation of the forest courts. Neither Magna Carta nor the subsequent Charter of the Forest proved entirely satisfactory as a way of managing the political tensions arising in the operation of the royal forests.
Some of the clauses addressed wider economic issues. The concerns of the barons over the treatment of their debts to Jewish moneylenders, who occupied a special position in medieval England and were by tradition under the King's protection, were addressed by clauses 10 and 11. The charter concluded this section with the phrase "debts owing to other than Jews shall be dealt with likewise", so it is debatable to what extent the Jews were being singled out by these clauses. Some issues were relatively specific, such as clause 33 which ordered the removal of all fishing weirs—an important and growing source of revenue at the time—from England's rivers.
The role of the English Church had been a matter for great debate in the years prior to the 1215 charter. The Norman and Angevin kings had traditionally exercised a great deal of power over the church within their territories. From the 1040s onwards successive popes had emphasised the importance of the church being governed more effectively from Rome, and had established an independent judicial system and hierarchical chain of authority. After the 1140s, these principles had been largely accepted within the English church, even if accompanied by an element of concern about centralising authority in Rome.
These changes brought the customary rights of lay rulers such as John over ecclesiastical appointments into question. As described above, John had come to a compromise with Pope Innocent III in exchange for his political support for the King, and clause 1 of Magna Carta prominently displayed this arrangement, promising the freedoms and liberties of the church. The importance of this clause may also reflect the role of Archbishop Langton in the negotiations: Langton had taken a strong line on this issue during his career.
Only three clauses of Magna Carta still remain on the statute book in England and Wales. These clauses concern 1) the freedom of the English Church, 2) the "ancient liberties" of the City of London (clause 13 in the 1215 charter, clause 9 in the 1297 statute), and 3) a right to due legal process (clauses 39 and 40 in the 1215 charter, clause 29 in the 1297 statute).
|
https://en.wikipedia.org/wiki?curid=20958
|
Möbius function
The classical Möbius function is an important multiplicative function in number theory and combinatorics. The German mathematician August Ferdinand Möbius introduced it in 1832. It is a special case of a more general object in combinatorics.
For any positive integer $n$, define $\mu(n)$ as the sum of the primitive $n$th roots of unity. It has values in $\{-1, 0, 1\}$ depending on the factorization of $n$ into prime factors:

$$\mu(n) = \begin{cases} 1 & \text{if } n \text{ is square-free with an even number of prime factors,} \\ -1 & \text{if } n \text{ is square-free with an odd number of prime factors,} \\ 0 & \text{if } n \text{ has a squared prime factor.} \end{cases}$$
The Möbius function can alternatively be represented as

$$\mu(n) = \delta_{\omega(n)\,\Omega(n)}\,\lambda(n),$$

where $\delta$ is the Kronecker delta, $\lambda(n)$ is the Liouville function, $\omega(n)$ is the number of distinct prime divisors of $n$, and $\Omega(n)$ is the number of prime factors of $n$, counted with multiplicity.
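In code, this definition reduces to trial factorization. Below is a minimal Python sketch (the function name `mobius` is our own, not a library routine): it returns 0 as soon as a squared prime factor appears, and otherwise flips the sign once per distinct prime factor. The final loop also spot-checks the divisor-sum identity discussed further down.

```python
def mobius(n: int) -> int:
    """Compute the Moebius function mu(n) by trial factorization."""
    if n < 1:
        raise ValueError("n must be a positive integer")
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:    # p^2 divides the original n, so mu(n) = 0
                return 0
            result = -result  # one more distinct prime factor
        p += 1
    if n > 1:                 # a single leftover prime factor remains
        result = -result
    return result

# First 30 values: 1, -1, -1, 0, -1, 1, -1, 0, 0, 1, ...
print([mobius(n) for n in range(1, 31)])

# Spot-check: the sum of mu(d) over the divisors d of n is 1 iff n == 1.
for n in range(1, 200):
    total = sum(mobius(d) for d in range(1, n + 1) if n % d == 0)
    assert total == (1 if n == 1 else 0)
```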
The values of $\mu(n)$ for the first 30 positive integers are:

1, -1, -1, 0, -1, 1, -1, 0, 0, 1, -1, 0, -1, 1, 1, 0, -1, 0, -1, 0, 1, 1, -1, 0, 0, 1, 0, 0, -1, -1.
The Möbius function is multiplicative (i.e. $\mu(ab) = \mu(a)\,\mu(b)$ whenever $a$ and $b$ are coprime).
The sum of the Möbius function over all positive divisors of $n$ (including $n$ itself and 1) is zero except when $n = 1$:

$$\sum_{d \mid n} \mu(d) = \begin{cases} 1 & \text{if } n = 1, \\ 0 & \text{if } n > 1. \end{cases}$$
The equality above leads to the important Möbius inversion formula and is the main reason why $\mu$ is of relevance in the theory of multiplicative and arithmetic functions.
Other applications of $\mu(n)$ in combinatorics are connected with the use of the Pólya enumeration theorem in combinatorial groups and combinatorial enumerations.
There is a formula for calculating the Möbius function without directly knowing the factorization of its argument:

$$\mu(n) = \sum_{\substack{1 \le k \le n \\ \gcd(k,\,n) = 1}} e^{2\pi i k/n},$$

i.e. $\mu(n)$ is the sum of the primitive $n$th roots of unity. (However, the computational complexity of this definition is at least the same as that of the Euler product definition.)
Using the root-of-unity formula above, the identity

$$\sum_{d \mid n} \mu(d) = \begin{cases} 1 & \text{if } n = 1, \\ 0 & \text{if } n > 1 \end{cases}$$

can be seen as a consequence of the fact that the $n$th roots of unity sum to 0 for $n > 1$, since each $n$th root of unity is a primitive $d$th root of unity for exactly one divisor $d$ of $n$.
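The root-of-unity characterisation can be verified numerically. A short Python sketch (deliberately inefficient; names are ours): sum the complex exponentials over the residues coprime to $n$ and round the real part, since the imaginary parts cancel.

```python
import cmath
from math import gcd

def mobius_via_roots(n: int) -> int:
    """mu(n) as the sum of the primitive n-th roots of unity."""
    total = sum(cmath.exp(2j * cmath.pi * k / n)
                for k in range(1, n + 1) if gcd(k, n) == 1)
    return round(total.real)  # the sum is a real integer up to rounding error

print([mobius_via_roots(n) for n in range(1, 31)])
```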
However it is also possible to prove this identity from first principles. First note that it is trivially true when $n = 1$. Suppose then that $n > 1$. Then there is a bijection between the divisors $d$ of $n$ for which $\mu(d) \neq 0$ and the subsets of the set of all prime factors of $n$. The asserted result follows from the fact that every non-empty finite set has an equal number of odd- and even-cardinality subsets.
This last fact can be shown easily by induction on the cardinality of a non-empty finite set $S$. First, if $|S| = 1$, there is exactly one odd-cardinality subset of $S$, namely $S$ itself, and exactly one even-cardinality subset, namely $\emptyset$. Next, if $|S| > 1$, then divide the subsets of $S$ into two subclasses depending on whether they contain or not some fixed element $x$ in $S$. There is an obvious bijection between these two subclasses, pairing those subsets that have the same complement relative to the subset $\{x\}$. Also, one of these two subclasses consists of all the subsets of the set $S \setminus \{x\}$, and therefore, by the induction hypothesis, has an equal number of odd- and even-cardinality subsets. These subsets in turn correspond bijectively to the even- and odd-cardinality $\{x\}$-containing subsets of $S$. The inductive step follows directly from these two bijections.
A related result is the alternating-sign identity for binomial coefficients, $\sum_{k=0}^{n} (-1)^k \binom{n}{k} = 0$ for $n \ge 1$: in each row of Pascal's triangle, the even-indexed and odd-indexed entries have equal sums.
In number theory another arithmetic function closely related to the Möbius function is the Mertens function, defined by

$$M(n) = \sum_{k=1}^{n} \mu(k)$$

for every natural number $n$. This function is closely linked with the positions of zeroes of the Riemann zeta function. See the article on the Mertens conjecture for more information about the connection between $M(n)$ and the Riemann hypothesis.
From the formula

$$\mu(n) = \sum_{\substack{1 \le k \le n \\ \gcd(k,\,n) = 1}} e^{2\pi i k/n},$$

it follows that the Mertens function is given by:

$$M(n) = -1 + \sum_{a \in \mathcal{F}_n} e^{2\pi i a},$$

where $\mathcal{F}_n$ is the Farey sequence of order $n$.
This formula is used in the proof of the Franel–Landau theorem.
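For illustration, here is a naive Python sketch of the Mertens function as a cumulative sum of $\mu$ (names are ours; a serious implementation would sieve the values of $\mu$ rather than factor each $k$ separately).

```python
def mobius(n: int) -> int:
    # trial-factorization mu(n), as in the earlier sketch
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    return -result if n > 1 else result

def mertens(n: int) -> int:
    """M(n) = sum of mu(k) for k = 1..n."""
    return sum(mobius(k) for k in range(1, n + 1))

print([mertens(n) for n in range(1, 11)])
# -> [1, 0, -1, -1, -2, -1, -2, -2, -2, -1]
```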
The Dirichlet series that generates the Möbius function is the (multiplicative) inverse of the Riemann zeta function; if $s$ is a complex number with real part larger than 1 we have

$$\sum_{n=1}^{\infty} \frac{\mu(n)}{n^s} = \frac{1}{\zeta(s)}.$$
This may be seen from its Euler product

$$\frac{1}{\zeta(s)} = \prod_{p} \left(1 - \frac{1}{p^s}\right).$$
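As a rough numerical check of this identity at $s = 2$ (a sketch only; the series converges slowly, so a long partial sum merely approximates $1/\zeta(2) = 6/\pi^2 \approx 0.6079$):

```python
from math import pi

def mobius(n: int) -> int:
    # trial-factorization mu(n), as in the earlier sketch
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    return -result if n > 1 else result

partial = sum(mobius(n) / n**2 for n in range(1, 100_000))
print(partial, 6 / pi**2)  # the two values agree to several decimal places
```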
The Lambert series for the Möbius function is:

$$\sum_{n=1}^{\infty} \frac{\mu(n)\,q^n}{1 - q^n} = q,$$

which converges for $|q| < 1$. For prime $\alpha \ge 2$, we also have

$$\sum_{n=1}^{\infty} \frac{\mu(\alpha n)\,q^n}{1 - q^n} = -\sum_{n \ge 0} q^{\alpha^n}, \quad |q| < 1.$$
Gauss proved that for a prime number $p$ the sum of its primitive roots is congruent to $\mu(p - 1) \pmod{p}$.
If $\mathbb{F}_q$ denotes the finite field of order $q$ (where $q$ is necessarily a prime power), then the number $N$ of monic irreducible polynomials of degree $n$ over $\mathbb{F}_q$ is given by:

$$N(q, n) = \frac{1}{n} \sum_{d \mid n} \mu(d)\, q^{n/d}.$$
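This necklace-counting formula is straightforward to evaluate. A Python sketch (names are ours), checked against the well-known counts over $\mathbb{F}_2$: there are 2 monic irreducibles of degree 1 ($x$ and $x+1$), 1 of degree 2 ($x^2+x+1$), 2 of degree 3, and 3 of degree 4.

```python
def mobius(n: int) -> int:
    # trial-factorization mu(n), as in the earlier sketch
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    return -result if n > 1 else result

def monic_irreducible_count(q: int, n: int) -> int:
    """N(q, n) = (1/n) * sum over d | n of mu(d) * q^(n/d)."""
    total = sum(mobius(d) * q ** (n // d)
                for d in range(1, n + 1) if n % d == 0)
    return total // n  # the sum is always divisible by n

print([monic_irreducible_count(2, n) for n in range(1, 5)])  # [2, 1, 2, 3]
```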
The average order of the Möbius function is zero. This statement is, in fact, equivalent to the prime number theorem.
$\mu(n) = 0$ if and only if $n$ is divisible by the square of a prime. The first numbers with this property are:

4, 8, 9, 12, 16, 18, 20, 24, 25, 27, 28, 32, 36, 40, 44, 45, 48, 49, 50, ...
If $n$ is prime, then $\mu(n) = -1$, but the converse is not true. The first non-prime $n$ for which $\mu(n) = -1$ is $30 = 2 \cdot 3 \cdot 5$. The first such numbers with three distinct prime factors (sphenic numbers) are:

30, 42, 66, 70, 78, 102, 105, 110, 114, 130, ...
and the first such numbers with 5 distinct prime factors are:

2310, 2730, 3570, 3990, 4290, 4830, 5610, ...
In combinatorics, every locally finite partially ordered set (poset) is assigned an incidence algebra. One distinguished member of this algebra is that poset's "Möbius function". The classical Möbius function treated in this article is essentially equal to the Möbius function of the set of all positive integers partially ordered by divisibility. See the article on incidence algebras for the precise definition and several examples of these general Möbius functions.
Constantin Popovici defined a generalised Möbius function $\mu_k$ to be the $k$-fold Dirichlet convolution of the Möbius function with itself. It is thus again a multiplicative function with

$$\mu_k(p^a) = (-1)^a \binom{k}{a},$$

where the binomial coefficient is taken to be zero if $a > k$. The definition may be extended to complex $k$ by reading the binomial as a polynomial in $k$.
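The convolution definition can be evaluated directly by recursion. A Python sketch (names are ours) that checks the closed form on prime powers:

```python
from math import comb

def mobius(n: int) -> int:
    # trial-factorization mu(n), as in the earlier sketch
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    return -result if n > 1 else result

def popovici(k: int, n: int) -> int:
    """mu_k(n): the k-fold Dirichlet self-convolution of mu."""
    if k == 1:
        return mobius(n)
    # (mu * mu_{k-1})(n) = sum over d | n of mu(d) * mu_{k-1}(n // d)
    return sum(mobius(d) * popovici(k - 1, n // d)
               for d in range(1, n + 1) if n % d == 0)

# Agrees with the closed form mu_k(p^a) = (-1)^a * C(k, a):
assert popovici(2, 4) == comb(2, 2)    # p=2, a=2 -> +1
assert popovici(3, 8) == -comb(3, 3)   # p=2, a=3 -> -1
```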
The Möbius function also arises in the primon gas or free Riemann gas model of supersymmetry. In this theory, the fundamental particles or "primons" have energies $\log p$. Under second quantization, multiparticle excitations are considered; these are given by $\log n$ for any natural number $n$. This follows from the fact that the factorization of the natural numbers into primes is unique.
In the free Riemann gas, any natural number can occur, if the primons are taken as bosons. If they are taken as fermions, then the Pauli exclusion principle excludes squares. The operator that distinguishes fermions and bosons is then none other than the Möbius function $\mu(n)$.
The free Riemann gas has a number of other interesting connections to number theory, including the fact that the partition function is the Riemann zeta function. This idea underlies Alain Connes's attempted proof of the Riemann hypothesis.
The "Disquisitiones Arithmeticae" has been translated from Latin into English and German. The German edition includes all of his papers on number theory: all the proofs of quadratic reciprocity, the determination of the sign of the Gauss sum, the investigations into biquadratic reciprocity, and unpublished notes.
|
https://en.wikipedia.org/wiki?curid=20961
|
Methadone
Methadone, sold under the brand name Dolophine among others, is an opioid used for opioid maintenance therapy in opioid dependence and for chronic pain management. Detoxification using methadone can be accomplished in less than a month, or it may be done gradually over as long as six months. While a single dose has a rapid effect, maximum effect can take up to five days of use. The pain-relieving effects last about six hours after a single dose. After long-term use, in people with normal liver function, effects last 8 to 36 hours. Methadone is usually taken by mouth and rarely by injection into a muscle or vein.
Side effects are similar to those of other opioids. These frequently include dizziness, sleepiness, vomiting, and sweating. Serious risks include opioid abuse and a decreased effort to breathe. Abnormal heart rhythms may also occur due to a prolonged QT interval. The number of deaths in the United States involving methadone poisoning declined from 4,418 in 2011 to 3,300 in 2015. Risks are greater with higher doses. Methadone is made by chemical synthesis and acts on opioid receptors.
Methadone was developed in Germany around 1937 to 1939 by Gustav Ehrhart and Max Bockmühl. It was approved for use in the United States in 1947. It is on the World Health Organization's List of Essential Medicines, the safest and most effective medicines needed in a health system. In 2013, about 41,400 kilograms were manufactured globally. It is regulated similarly to other narcotic drugs. It is not particularly expensive in the United States.
Methadone is used for the treatment of opioid use disorder. It may be used as a maintenance therapy or in shorter periods for detoxification to manage opioid withdrawal symptoms.
A 2009 Cochrane review found methadone was effective in retaining people in treatment and in the reduction or cessation of heroin use as measured by self-report and urine/hair analysis but did not affect criminal activity or risk of death.
Treatment of opioid-dependent persons with methadone follows one of two routes: maintenance or detoxification. Methadone maintenance therapy (MMT) usually takes place in outpatient settings. It is usually prescribed as a single daily dose medication for those who wish to abstain from illicit opioid use. Treatment models for MMT differ. It is not uncommon for treatment recipients to be administered methadone in a specialist clinic, where they are observed for around 15–20 minutes post dosing, to reduce risk of diversion of medication.
The duration of methadone treatment programs ranges from a few months to several years. Given that opioid dependence is characteristically a chronic relapsing/remitting disorder, MMT may be lifelong. The length of time a person remains in treatment depends on a number of factors. While starting doses may be adjusted based on the amount of opioids reportedly used, most clinical guidelines suggest doses start low (e.g. at doses not exceeding 40 mg daily) and are incremented gradually.
Methadone maintenance has been shown to reduce the transmission of blood borne viruses associated with opioid injection, such as hepatitis B and C, and/or HIV. The principal goals of methadone maintenance are to relieve opioid cravings, suppress the abstinence syndrome, and block the euphoric effects associated with opioids.
Chronic methadone dosing will eventually lead to neuroadaptation, characterised by a syndrome of tolerance and withdrawal (dependence). However, when used correctly in treatment, maintenance therapy has been found to be medically safe, non-sedating, and can provide a slow recovery from opioid addiction. Methadone has been widely used for pregnant women addicted to opioids.
Methadone is approved in the US, and many other parts of the world, for the treatment of opioid addiction. Its use for the treatment of addiction is usually strictly regulated. In the US, outpatient treatment programs must be certified by the federal Substance Abuse and Mental Health Services Administration (SAMHSA) and registered by the Drug Enforcement Administration (DEA) in order to prescribe methadone for opioid addiction.
Methadone is used as an analgesic in chronic pain, often in rotation with other opioids. Due to its activity at the NMDA receptor, it may be more effective against neuropathic pain; for the same reason, tolerance to the analgesic effects may be less than that of other opioids.
Adverse effects of methadone include both physical and cognitive symptoms.
Methadone withdrawal symptoms are reported as being significantly more protracted than withdrawal from opioids with shorter half-lives.
Methadone is sometimes administered as an oral liquid. Methadone has been implicated in contributing to significant tooth decay. Methadone causes dry mouth, reducing the protective role of saliva in preventing decay. Other putative mechanisms of methadone-related tooth decay include craving for carbohydrates related to opioids, poor dental care, and a general decrease in personal hygiene. These factors, combined with sedation, have been linked to extensive dental damage.
Methadone carries a US FDA black box warning.
Methadone overdose produces symptoms typical of opioid overdose, the most dangerous of which is respiratory depression.
The respiratory depression of an overdose can be treated with naloxone. Naloxone is preferred to the newer, longer-acting antagonist naltrexone. Despite methadone's much longer duration of action compared to heroin and other shorter-acting agonists, and the need for repeat doses of the antagonist naloxone, naloxone is still used for overdose therapy. As naltrexone has a longer half-life, it is more difficult to titrate. If too large a dose of the opioid antagonist is given to a dependent person, it will result in withdrawal symptoms (possibly severe). When using naloxone, the naloxone will be quickly eliminated and the withdrawal will be short lived. Doses of naltrexone take longer to be eliminated from the person's system. A common problem in treating methadone overdoses is that, given the short action of naloxone (versus the extremely longer-acting methadone), a dosage of naloxone given to a methadone-overdosed person will initially work to bring the person out of overdose, but once the naloxone wears off, if no further naloxone is administered, the person can go right back into overdose (based upon time and dosage of the methadone ingested).
As with other opioid medications, tolerance and dependence usually develop with repeated doses. There is some clinical evidence that tolerance to analgesia is less with methadone compared to other opioids; this may be due to its activity at the NMDA receptor. Tolerance to the different physiological effects of methadone varies; tolerance to analgesic properties may or may not develop quickly, but tolerance to euphoria usually develops rapidly, whereas tolerance to constipation, sedation, and respiratory depression develops slowly (if ever).
Methadone treatment may impair driving ability. Drug abusers had significantly more involvement in serious crashes than non-abusers in a study by the University of Queensland. In the study of a group of 220 drug abusers, most of them poly-drug abusers, 17 were involved in crashes killing people, compared with a control group of other people randomly selected having no involvement in fatal crashes. However, there have been multiple studies verifying the ability of methadone maintenance patients to drive. In the UK, persons who are prescribed oral methadone can continue to drive after they have satisfactorily completed an independent medical examination which will include a urine screen for drugs. The license will be issued for 12 months at a time and even then, only following a favourable assessment from their own doctor. Individuals who are prescribed methadone for either IV or IM administration cannot drive in the UK, mainly due to the increased sedation effects that this route of use can cause.
In the United States, deaths linked to methadone more than quadrupled in the five-year period between 1999 and 2004. According to the U.S. National Center for Health Statistics, as well as a 2006 series in the "Charleston Gazette" (West Virginia), medical examiners listed methadone as contributing to 3,849 deaths in 2004. That number was up from 790 in 1999. Approximately 82 percent of those deaths were listed as accidental, and most deaths involved combinations of methadone with other drugs (especially benzodiazepines).
Although deaths from methadone are on the rise, methadone-associated deaths are not being caused primarily by methadone intended for methadone treatment programs, according to a panel of experts convened by the Substance Abuse and Mental Health Services Administration, which released a report titled "Methadone-Associated Mortality, Report of a National Assessment". The consensus report concludes that "although the data remains incomplete, National Assessment meeting participants concurred that methadone tablets or diskettes distributed through channels other than opioid treatment programs most likely are the central factors in methadone-associated mortality."
In 2006, the U.S. Food and Drug Administration issued a caution about methadone, titled "Methadone Use for Pain Control May Result in Death." The FDA also revised the drug's package insert. The change deleted previous information about the usual adult dosage. The "Charleston Gazette" reported, "The old language about the 'usual adult dose' was potentially deadly, according to pain specialists."
Methadone acts by binding to the µ-opioid receptor, but also has some affinity for the NMDA receptor, an ionotropic glutamate receptor. Methadone is metabolized by CYP3A4, CYP2B6, CYP2D6, and is a substrate, or in this case target, for the P-glycoprotein efflux protein, a protein which helps pump foreign substances out of cells, in the intestines and brain. The bioavailability and elimination half-life of methadone are subject to substantial interindividual variability. Its main route of administration is oral. Adverse effects include sedation, hypoventilation, constipation and miosis, in addition to tolerance, dependence and withdrawal difficulties. The withdrawal period can be much more prolonged than with other opioids, spanning anywhere from two weeks to several months.
The metabolic half-life of methadone differs from its duration of action. The metabolic half-life is 8 to 59 hours (approximately 24 hours for opioid-tolerant people, and 55 hours in opioid-naive people), as opposed to a half-life of 1 to 5 hours for morphine. Methadone's long half-life allows its respiratory depressant effects to persist for an extended time in opioid-naive people.
Levomethadone (the "R" enantiomer) is a μ-opioid receptor agonist with higher intrinsic activity than morphine, but lower affinity. Dextromethadone (the "S" enantiomer) does not affect opioid receptors but binds to the glutamatergic NMDA ("N"-methyl-D-aspartate) receptor, acting as an antagonist against glutamate. Methadone has been shown to reduce neuropathic pain in rat models, primarily through NMDA receptor antagonism. Glutamate is the primary excitatory neurotransmitter in the central nervous system. NMDA receptors have a very important role in modulating long-term excitation and memory formation. NMDA antagonists such as dextromethorphan (DXM, a cough suppressant), ketamine (a dissociative anaesthetic), tiletamine (a veterinary anaesthetic) and ibogaine (from the African tree "Tabernanthe iboga") are being studied for their role in decreasing the development of tolerance to opioids and as possible agents for reducing addiction, tolerance, and withdrawal, possibly by disrupting memory circuitry. Acting as an NMDA antagonist may be one mechanism by which methadone decreases craving for opioids and tolerance, and it has been proposed as a possible mechanism for methadone's distinctive efficacy in treating neuropathic pain. The dextrorotatory form (dextromethadone), which acts as an NMDA receptor antagonist and is devoid of opioid activity, has been shown to produce analgesia in experimental models of chronic pain. Methadone also acted as a potent, noncompetitive α3β4 neuronal nicotinic acetylcholine receptor antagonist in rat receptors expressed in human embryonic kidney cell lines.
Methadone has a slow metabolism and very high fat solubility, making it longer-lasting than morphine-based drugs. Methadone has a typical elimination half-life of 15 to 60 hours, with a mean of around 22 hours. However, metabolism rates vary greatly between individuals, up to a factor of 100, ranging from as few as 4 hours to as many as 130 hours, or even 190 hours. This variability is apparently due to genetic variability in the production of the associated cytochrome enzymes CYP3A4, CYP2B6 and CYP2D6. Many substances can also induce, inhibit or compete with these enzymes, further affecting (sometimes dangerously) methadone's half-life. A longer half-life frequently allows for administration only once a day in opioid detoxification and maintenance programs. People who metabolize methadone rapidly, on the other hand, may require twice-daily dosing to obtain sufficient symptom alleviation while avoiding excessive peaks and troughs in their blood concentrations and associated effects; this can also allow lower total doses in some such people. The analgesic activity is shorter than the pharmacological half-life; dosing for pain control usually requires multiple doses per day, normally dividing the daily dosage for administration at 8-hour intervals.
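Since dosing intervals here are governed by half-life arithmetic, a small illustration may help. The following Python sketch assumes simple first-order (exponential) elimination and uses purely illustrative numbers; it is not a clinical calculation, only a demonstration of why a long half-life causes carry-over between once-daily doses:

```python
def remaining_fraction(hours: float, half_life_h: float) -> float:
    """Fraction of one dose still present after `hours`, assuming
    first-order (exponential) elimination."""
    return 0.5 ** (hours / half_life_h)

def trough_level(n_doses: int, half_life_h: float, interval_h: float = 24.0) -> float:
    """Relative trough level (in units of one dose) just before the next
    dose, after n_doses doses given every interval_h hours."""
    return sum(remaining_fraction(interval_h * k, half_life_h)
               for k in range(1, n_doses + 1))

# A longer half-life means markedly more carry-over between daily doses.
for hl in (24.0, 55.0):
    print(f"half-life {hl:>4} h: trough after 10 daily doses = "
          f"{trough_level(10, hl):.2f} doses")
```

With these assumed figures, the 55-hour half-life leaves roughly two and a half doses' worth of drug at trough, versus about one dose for the 24-hour half-life, which is the accumulation behaviour described above.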
The main metabolic pathway involves "N"-demethylation by CYP3A4 in the liver and intestine to give 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP). Both this inactive product and the inactive 2-ethyl-5-methyl-3,3-diphenyl-1-pyrroline (EMDP), produced by a second "N"-demethylation, are detectable in the urine of those taking methadone.
The most common route of administration at a methadone clinic is in a racemic oral solution, though in Germany, only the "R" enantiomer (the L optical isomer) has traditionally been used, as it is responsible for most of the desired opioid effects. The single-isomer form is becoming less common due to the higher production costs.
Methadone is available in traditional pill, sublingual tablet, and two different formulations designed for the person to drink. Drinkable forms include a ready-to-dispense liquid (sold in the United States as Methadose) and "Diskets" (known on the street as "wafers" or "biscuits"), tablets designed to disperse rapidly in water for oral administration, used in a similar fashion to Alka-Seltzer. The liquid form is the most common, as it allows for smaller dose changes. Methadone is almost as effective when administered orally as by injection. Oral medication is usually preferable because it offers safety and simplicity and represents a step away from injection-based drug abuse in those recovering from addiction. U.S. federal regulations require the oral form in addiction treatment programs. Injecting methadone pills can cause collapsed veins, bruising, swelling, and possibly other harmful effects. Methadone pills often contain talc that, when injected, produces a swarm of tiny solid particles in the blood, causing numerous minor blood clots. These particles cannot be filtered out before injection and will accumulate in the body over time, especially in the lungs and eyes, producing various complications such as pulmonary hypertension, an irreversible and progressive disease. The formulation sold under the brand name Methadose (a flavored liquid suspension for oral dosing, commonly used for maintenance purposes) should not be injected either.
Information leaflets included in packs of UK methadone tablets state that the tablets are for oral use only and that use by any other route can cause serious harm. In addition to this warning, additives are now included in the tablet formulation to make intravenous use more difficult.
Methadone and its major metabolite, 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP), are often measured in urine as part of a drug abuse testing program, in plasma or serum to confirm a diagnosis of poisoning in hospitalized victims, or in whole blood to assist in a forensic investigation of a traffic or other criminal violation or a case of sudden death. Methadone usage history is considered in interpreting the results as a chronic user can develop tolerance to doses that would incapacitate an opioid-naive individual. Chronic users often have high methadone and EDDP baseline values.
The protonated form of methadone takes on an extended conformation, while the free base is more compact. In particular, it was found that there is an interaction between the tertiary amine and the carbonyl carbon of the ketone function (R3N ••• >C=O) that limits the molecule's conformation freedom, though the distance (291 pm by X-ray) is far too long to represent a true chemical bond. However, it does represent the initial trajectory of attack of an amine on a carbonyl group and was an important piece of experimental evidence for the proposal of the Bürgi–Dunitz angle for carbonyl addition reactions.
Methadone was developed in 1937 in Germany by scientists working for I.G. Farbenindustrie AG at the Farbwerke Hoechst who were looking for a synthetic opioid that could be created with readily available precursors, to solve Germany's opium shortage problem. On September 11, 1941, Bockmühl and Ehrhart filed an application for a patent for a synthetic substance they called Hoechst 10820 or Polamidon (a name still in regular use in Germany), whose structure had only slight relation to morphine or the opiate alkaloids (Bockmühl and Ehrhart, 1949). It was brought to market in 1943 and was widely used by the German army during WWII.
In the 1930s, pethidine (meperidine) went into production in Germany; however, production of methadone, then being developed under the designation Hoechst 10820, was not carried forward because of side effects discovered in the early research. After the war, all German patents, trade names and research records were requisitioned and expropriated by the Allies. The records on the research work of the I.G. Farbenkonzern at the Farbwerke Hoechst were confiscated by the U.S. Department of Commerce Intelligence, investigated by a Technical Industrial Committee of the U.S. Department of State and then brought to the US. The report published by the committee noted that while methadone was potentially addictive, it produced less sedation and respiratory depression than morphine and was thus interesting as a commercial drug.
In the early 1950s, methadone (most times the racemic HCl salts mixture) was also investigated for use as an antitussive.
Isomethadone, noracymethadol, LAAM, and normethadone were first developed in Germany, the United Kingdom, Belgium, Austria, Canada, and the United States in the thirty or so years after the 1937 discovery of pethidine, the first synthetic opioid used in medicine. These synthetic opioids provide longer-lasting and deeper satiation of opiate cravings and generate very strong analgesic effects, owing to their long metabolic half-lives and strong affinity at the mu opioid receptor sites. Therefore, they impart much of the satiating and anti-addictive effect of methadone by suppressing drug cravings.
It was only in 1947 that the drug was given the generic name "methadone" by the Council on Pharmacy and Chemistry of the American Medical Association. Since the patent rights of the I.G. Farbenkonzern and Farbwerke Hoechst were no longer protected, each pharmaceutical company interested in the formula could buy the rights for the commercial production of methadone for just one dollar (MOLL 1990).
Methadone was introduced into the United States in 1947 by Eli Lilly and Company as an analgesic under the trade name Dolophine.
The trade name Dolophine was created by Eli Lilly after World War II and used in the United States; the claim that Nazi leader Adolf Hitler ordered the manufacture of methadone or that the brand name 'Dolophine' was named after him is an urban legend. "Dolo" stems from the Latin word for pain, "dolor", and "finis" which means "end". Therefore, Dolophine literally means "pain end". The pejorative term "adolphine" (never a widely used name for the drug) appeared in the United States in the early 1970s as a reference to the aforementioned urban myth that the trade name Dolophine was a reference to Adolf Hitler.
Brand names include Dolophine, Symoron, Amidone, Methadose, Physeptone, Metadon, Metadol, Metadol-D, Heptanon and Heptadon among others.
In the US, generic methadone tablets are inexpensive, with retail prices ranging from $0.25 to $2.50 per defined daily dose.
|
https://en.wikipedia.org/wiki?curid=20962
|
Möbius inversion formula
In mathematics, the classic Möbius inversion formula was introduced into number theory in 1832 by August Ferdinand Möbius.
A large generalization of this formula applies to summation over an arbitrary locally finite partially ordered set, with Möbius' classical formula applying to the set of the natural numbers ordered by divisibility: see incidence algebra.
The classic version states that if $g$ and $f$ are arithmetic functions satisfying

$$g(n) = \sum_{d \mid n} f(d) \quad \text{for every integer } n \ge 1,$$

then

$$f(n) = \sum_{d \mid n} \mu(d)\, g\!\left(\frac{n}{d}\right) \quad \text{for every integer } n \ge 1,$$

where $\mu$ is the Möbius function and the sums extend over all positive divisors $d$ of $n$. In effect, the original $f(n)$ can be determined given $g(n)$ by using the inversion formula. The two sequences are said to be Möbius transforms of each other.
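The statement translates directly into a short computation. The following is a minimal Python sketch (the helper names `mobius`, `divisors`, `transform`, and `invert` are our own, not from the article) that computes $\mu$ by trial division and checks the inversion identity on an arbitrary arithmetic function:

```python
def mobius(n: int) -> int:
    """mu(n): 0 if n is divisible by a squared prime, otherwise
    (-1) raised to the number of distinct prime factors of n."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:   # p^2 divided the original n
                return 0
            result = -result
        p += 1
    if n > 1:                # one remaining prime factor
        result = -result
    return result

def divisors(n: int):
    """Positive divisors of n (a naive scan is fine for small n)."""
    return [d for d in range(1, n + 1) if n % d == 0]

def transform(f, n: int) -> int:
    """Mobius transform: g(n) = sum of f(d) over divisors d of n."""
    return sum(f(d) for d in divisors(n))

def invert(g, n: int) -> int:
    """Inversion: f(n) = sum of mu(d) * g(n // d) over divisors d of n."""
    return sum(mobius(d) * g(n // d) for d in divisors(n))

# Sanity check: inverting the transform of f recovers f.
f = lambda n: n * n + 1
for n in range(1, 50):
    assert invert(lambda m: transform(f, m), n) == f(n)
```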
The formula is also correct if $f$ and $g$ are functions from the positive integers into some abelian group (viewed as a $\mathbb{Z}$-module).
In the language of Dirichlet convolutions, the first formula may be written as

$$g = f * \mathbf{1},$$

where $*$ denotes the Dirichlet convolution and $\mathbf{1}$ is the constant function $\mathbf{1}(n) = 1$. The second formula is then written as

$$f = \mu * g.$$
Many specific examples are given in the article on multiplicative functions.
The theorem follows because $*$ is (commutative and) associative, and $\mathbf{1} * \mu = \varepsilon$, where $\varepsilon$ is the identity function for the Dirichlet convolution, taking values $\varepsilon(1) = 1$, $\varepsilon(n) = 0$ for all $n > 1$. Thus

$$\mu * g = \mu * (\mathbf{1} * f) = (\mu * \mathbf{1}) * f = \varepsilon * f = f.$$
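The convolution identity $\mathbf{1} * \mu = \varepsilon$ that drives this proof is also easy to verify numerically; here is a short sketch, reusing the `mobius` helper from the previous example:

```python
def dirichlet(f, g, n: int) -> int:
    """Dirichlet convolution: (f * g)(n) = sum over d | n of f(d) * g(n // d)."""
    return sum(f(d) * g(n // d) for d in range(1, n + 1) if n % d == 0)

one = lambda n: 1                      # the constant function 1
eps = lambda n: 1 if n == 1 else 0     # identity for Dirichlet convolution

# 1 * mu = epsilon, checked over a range of n.
for n in range(1, 200):
    assert dirichlet(one, mobius, n) == eps(n)
```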
Let

$$a_n = \sum_{d \mid n} b_d,$$

so that

$$b_n = \sum_{d \mid n} \mu\!\left(\frac{n}{d}\right) a_d$$

is its transform. The transforms are related by means of series: the Lambert series

$$\sum_{n=1}^{\infty} a_n x^n = \sum_{n=1}^{\infty} b_n \frac{x^n}{1 - x^n}$$

and the Dirichlet series:

$$\sum_{n=1}^{\infty} \frac{a_n}{n^s} = \zeta(s) \sum_{n=1}^{\infty} \frac{b_n}{n^s},$$

where $\zeta(s)$ is the Riemann zeta function.
Given an arithmetic function, one can generate a bi-infinite sequence of other arithmetic functions by repeatedly applying the first summation.
For example, if one starts with Euler's totient function $\varphi$ and repeatedly applies the transformation process, one obtains: $\varphi$, the totient function; $\varphi * \mathbf{1} = \operatorname{Id}$, the identity function; $\operatorname{Id} * \mathbf{1} = \sigma_1 = \sigma$, the divisor-sum function.
If the starting function is the Möbius function itself, the list of functions is: $\mu$, the Möbius function; $\mu * \mathbf{1} = \varepsilon$, the unit function; $\varepsilon * \mathbf{1} = \mathbf{1}$, the constant function; $\mathbf{1} * \mathbf{1} = \sigma_0 = \operatorname{d} = \tau$, the number-of-divisors function.
Both of these lists of functions extend infinitely in both directions. The Möbius inversion formula enables these lists to be traversed backwards.
As an example, the sequence starting with $\varphi$ is:

$$f_n = \begin{cases} \underbrace{\mu * \cdots * \mu}_{-n \text{ factors}} * \varphi & \text{if } n < 0, \\[4pt] \varphi & \text{if } n = 0, \\[4pt] \varphi * \underbrace{\mathbf{1} * \cdots * \mathbf{1}}_{n \text{ factors}} & \text{if } n > 0. \end{cases}$$
The generated sequences can perhaps be more easily understood by considering the corresponding Dirichlet series: each repeated application of the transform corresponds to multiplication by the Riemann zeta function.
A related inversion formula more useful in combinatorics is as follows: suppose $F(x)$ and $G(x)$ are complex-valued functions defined on the interval $[1, \infty)$ such that

$$G(x) = \sum_{1 \le n \le x} F\!\left(\frac{x}{n}\right) \quad \text{for all } x \ge 1,$$

then

$$F(x) = \sum_{1 \le n \le x} \mu(n)\, G\!\left(\frac{x}{n}\right) \quad \text{for all } x \ge 1.$$

Here the sums extend over all positive integers $n$ which are less than or equal to $x$.
This in turn is a special case of a more general form. If $\alpha(n)$ is an arithmetic function possessing a Dirichlet inverse $\alpha^{-1}(n)$, then if one defines

$$G(x) = \sum_{1 \le n \le x} \alpha(n)\, F\!\left(\frac{x}{n}\right),$$

then

$$F(x) = \sum_{1 \le n \le x} \alpha^{-1}(n)\, G\!\left(\frac{x}{n}\right).$$

The previous formula arises in the special case of the constant function $\alpha(n) = 1$, whose Dirichlet inverse is $\alpha^{-1}(n) = \mu(n)$.
A particular application of the first of these extensions arises if we have (complex-valued) functions $f(n)$ and $g(n)$ defined on the positive integers, with

$$g(n) = \sum_{1 \le m \le n} f\!\left(\left\lfloor \frac{n}{m} \right\rfloor\right).$$

By defining $F(x) = f(\lfloor x \rfloor)$ and $G(x) = g(\lfloor x \rfloor)$, we deduce that

$$f(n) = \sum_{1 \le m \le n} \mu(m)\, g\!\left(\left\lfloor \frac{n}{m} \right\rfloor\right).$$
A simple example of the use of this formula is counting the number of reduced fractions $0 < a/b < 1$, where $a$ and $b$ are coprime and $b \le n$. If we let $f(n)$ be this number, then $g(n)$ is the total number of fractions $0 < a/b < 1$ with $b \le n$, where $a$ and $b$ are not necessarily coprime, so that $g(n) = n(n-1)/2$. (This is because every fraction $a/b$ with $\gcd(a, b) = d$ and $b \le n$ can be reduced to the fraction $(a/d)/(b/d)$ with $b/d \le n/d$, and vice versa.) Here it is straightforward to determine $g(n)$, but $f(n)$ is harder to compute.
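The fraction-counting example can be checked numerically with the same machinery (again reusing `mobius` from the earlier sketch); the easy total count $g(n) = n(n-1)/2$ is inverted to recover the harder count $f(n)$ of reduced fractions, verified here against a brute-force $\gcd$ scan:

```python
from math import gcd

def f_direct(n: int) -> int:
    """Count reduced fractions 0 < a/b < 1 with b <= n, by brute force."""
    return sum(1 for b in range(2, n + 1)
                 for a in range(1, b) if gcd(a, b) == 1)

def g(n: int) -> int:
    """All fractions 0 < a/b < 1 with b <= n: any pair 0 < a < b <= n."""
    return n * (n - 1) // 2

def f_inverted(n: int) -> int:
    """f(n) = sum over m <= n of mu(m) * g(floor(n / m))."""
    return sum(mobius(m) * g(n // m) for m in range(1, n + 1))

for n in range(1, 60):
    assert f_inverted(n) == f_direct(n)
```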
Another inversion formula is (where we assume that the series involved are absolutely convergent):

$$g(x) = \sum_{m=1}^{\infty} \frac{f(mx)}{m^s} \quad \text{for all } x \ge 1 \qquad \Longleftrightarrow \qquad f(x) = \sum_{m=1}^{\infty} \mu(m)\, \frac{g(mx)}{m^s} \quad \text{for all } x \ge 1.$$

As above, this generalises to the case where $\alpha(n)$ is an arithmetic function possessing a Dirichlet inverse $\alpha^{-1}(n)$:

$$g(x) = \sum_{m=1}^{\infty} \alpha(m)\, \frac{f(mx)}{m^s} \quad \text{for all } x \ge 1 \qquad \Longleftrightarrow \qquad f(x) = \sum_{m=1}^{\infty} \alpha^{-1}(m)\, \frac{g(mx)}{m^s} \quad \text{for all } x \ge 1.$$
As Möbius inversion applies to any abelian group, it makes no difference whether the group operation is written as addition or as multiplication. This gives rise to the following notational variant of the inversion formula: if

$$G(n) = \prod_{d \mid n} F(d),$$

then

$$F(n) = \prod_{d \mid n} G\!\left(\frac{n}{d}\right)^{\mu(d)}.$$
The first generalization can be proved as follows. We use Iverson's convention that $[\text{condition}]$ is the indicator function of the condition, being 1 if the condition is true and 0 if false. We use the result that

$$\sum_{d \mid n} \mu(d) = [n = 1],$$

that is, $\mathbf{1} * \mu = \varepsilon$.

We have the following:

$$\sum_{1 \le n \le x} \mu(n)\, G\!\left(\frac{x}{n}\right) = \sum_{1 \le n \le x} \mu(n) \sum_{1 \le m \le x/n} F\!\left(\frac{x}{mn}\right) = \sum_{1 \le r \le x} F\!\left(\frac{x}{r}\right) \sum_{n \mid r} \mu(n) = \sum_{1 \le r \le x} F\!\left(\frac{x}{r}\right) [r = 1] = F(x).$$

The proof in the more general case where $\alpha(n)$ replaces the constant function $\mathbf{1}$ is essentially identical, as is the second generalisation.
For a poset $P$, a set endowed with a partial order relation $\le$, define the Möbius function $\mu$ of $P$ recursively by

$$\mu(s, s) = 1, \qquad \mu(s, u) = -\sum_{s \le t < u} \mu(s, t) \quad \text{for } s < u.$$

(Here one assumes the summations are finite.) Then for $f, g : P \to K$, where $K$ is a field, we have

$$g(t) = \sum_{s \le t} f(s) \quad \text{for all } t \in P$$

if and only if

$$f(t) = \sum_{s \le t} g(s)\, \mu(s, t) \quad \text{for all } t \in P.$$
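As a concrete illustration of the poset version, the sketch below (our own construction, not from the article) computes the Möbius function of the divisors of 12 ordered by divisibility, directly from the recursive definition, and checks the inversion statement for an arbitrary test function:

```python
from functools import lru_cache

N = 12
elems = [d for d in range(1, N + 1) if N % d == 0]   # poset: divisors of 12
leq = lambda s, t: t % s == 0                        # s <= t  iff  s divides t

@lru_cache(maxsize=None)
def mu(s: int, t: int) -> int:
    """Poset Mobius function: mu(s, s) = 1 and, for s < t,
    mu(s, t) = -sum of mu(s, u) over all u with s <= u < t."""
    if s == t:
        return 1
    if not leq(s, t):
        return 0
    return -sum(mu(s, u) for u in elems if leq(s, u) and leq(u, t) and u != t)

f = {d: d * d for d in elems}                        # arbitrary test function
g = {t: sum(f[s] for s in elems if leq(s, t)) for t in elems}

# g(t) = sum_{s <= t} f(s)  implies  f(t) = sum_{s <= t} g(s) * mu(s, t)
for t in elems:
    assert f[t] == sum(g[s] * mu(s, t) for s in elems if leq(s, t))
```

For this divisor poset, `mu(s, t)` agrees with the classical Möbius function evaluated at $t/s$, which is the sense in which the poset formulation generalizes the number-theoretic one.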
|
https://en.wikipedia.org/wiki?curid=20963
|
Martin Lowry
Thomas Martin Lowry (; 26 October 1874 – 2 November 1936) was an English physical chemist who developed the Brønsted–Lowry acid–base theory simultaneously with and independently of Johannes Nicolaus Brønsted and was a founder-member and president (1928–1930) of the Faraday Society.
Lowry was born in Low Moor, Bradford, West Yorkshire, England, into a Cornish family. He was the second son of the Reverend E. P. Lowry, who was the minister of the Wesleyan Church in Aldershot from 1892 to 1919. He was educated at Kingswood School, Bath, Somerset, and then at the Central Technical College in South Kensington. During those years he realized that he wanted to be a chemist. He studied chemistry under Henry Edward Armstrong, an English chemist whose interests were primarily in organic chemistry but also included the nature of ions in aqueous solutions. From 1896 to 1913 Lowry was assistant to Armstrong, and between 1904 and 1913 he worked as a lecturer in chemistry at the Westminster Training College. In 1913, he was appointed head of the chemical department at Guy's Hospital Medical School and became the first teacher of chemistry in a medical school to be made a University Professor, at the University of London. From 1920 until his death, Lowry served as the Chair of Physical Chemistry at the University of Cambridge. He married a daughter of the Rev. C. Wood in 1904 and was survived by his widow, two sons and a daughter.
Since the establishment of the Faraday Society in 1903, Lowry had been an active member, and he served as its president between 1928 and 1930. In 1914 he was elected a fellow of the Royal Society. During and after World War I, Lowry acted as director of shell-filling (1917–1919) and worked for the Trench Warfare Committee, Chemical Warfare Committee and Ordnance Committee. For this service, he was awarded the Order of the British Empire and the Order of Saints Maurice and Lazarus.
In 1898, Lowry noted the change in optical rotation of nitro-"d"-camphor with time and coined the term "mutarotation" to describe this phenomenon. He studied changes in optical rotation caused by acid- and base-catalyzed reactions of camphor derivatives. This led in 1923 to his formulation of the protonic definition of acids and bases, now known as Brønsted–Lowry acid–base theory, independently of the work by Johannes Nicolaus Brønsted. Lowry published a few hundred papers and several books. His 1935 monograph "Optical Rotatory Power" has long been regarded as a standard work on the subject.
|
https://en.wikipedia.org/wiki?curid=20964
|
Marvel Comics
Marvel Comics is the brand name and primary imprint of Marvel Worldwide Inc., formerly Marvel Publishing, Inc. and Marvel Comics Group, a publisher of American comic books and related media. In 2009, The Walt Disney Company acquired Marvel Entertainment, Marvel Worldwide's parent company.
Marvel was started in 1939 by Martin Goodman under a number of corporations and imprints, now known collectively as Timely Comics, and by 1951 had generally become known as Atlas Comics. The Marvel era began in 1961, the year that the company launched "The Fantastic Four" and other superhero titles created by Stan Lee, Jack Kirby, Steve Ditko and many others. The Marvel brand, which had been used over the years, was solidified as the company's primary brand.
Marvel counts among its characters such well-known superheroes as Spider-Man, Iron Man, the Hulk, Thor, Captain America, Ant-Man, the Wasp, Black Widow, Wolverine, Captain Marvel, Black Panther, Doctor Strange, Ghost Rider, Blade, Daredevil, the Punisher and Deadpool. Superhero teams exist such as the Avengers, the X-Men, the Fantastic Four and the Guardians of the Galaxy as well as supervillains including Doctor Doom, Magneto, Thanos, Loki, Green Goblin, Kingpin, Red Skull, Ultron, the Mandarin, MODOK, Doctor Octopus, Kang, Dormammu, Annihilus and Galactus. Most of Marvel's fictional characters operate in a single reality known as the Marvel Universe, with most locations mirroring real-life places; many major characters are based in New York City. Additionally, Marvel has published several licensed properties from other companies. This includes "Star Wars" comics twice from 1977 to 1986 and again since 2015.
Pulp-magazine publisher Martin Goodman created the company later known as Marvel Comics under the name Timely Publications in 1939. Goodman, who had started with a Western pulp in 1933, was expanding into the emerging—and by then already highly popular—new medium of comic books. Launching his new line from his existing company's offices at 330 West 42nd Street, New York City, he officially held the titles of editor, managing editor, and business manager, with Abraham Goodman (Martin's brother) officially listed as publisher.
Timely's first publication, "Marvel Comics" #1 (cover dated Oct. 1939), included the first appearance of Carl Burgos' android superhero the Human Torch, and the first appearances of Bill Everett's anti-hero Namor the Sub-Mariner, among other features. The issue was a great success; it and a second printing the following month sold a combined nearly 900,000 copies. While its contents came from an outside packager, Funnies, Inc., Timely had its own staff in place by the following year. The company's first true editor, writer-artist Joe Simon, teamed with artist Jack Kirby to create one of the first patriotically themed superheroes, Captain America, in "Captain America Comics" #1 (March 1941). It, too, proved a hit, with sales of nearly one million. Goodman formed Timely Comics, Inc., beginning with comics cover-dated April 1941 or Spring 1941.
While no other Timely character would achieve the success of these three characters, some notable heroes—many of which continue to appear in modern-day retcon appearances and flashbacks—include the Whizzer, Miss America, the Destroyer, the original Vision, and the Angel. Timely also published one of humor cartoonist Basil Wolverton's best-known features, "Powerhouse Pepper", as well as a line of children's funny-animal comics featuring characters like Super Rabbit and the duo Ziggy Pig and Silly Seal.
Goodman hired his wife's cousin, Stanley Lieber, as a general office assistant in 1939. When editor Simon left the company in late 1941, Goodman made Lieber—by then writing pseudonymously as "Stan Lee"—interim editor of the comics line, a position Lee kept for decades except for three years during his military service in World War II. Lee wrote extensively for Timely, contributing to a number of different titles.
Goodman's business strategy involved having his various magazines and comic books published by a number of corporations all operating out of the same office and with the same staff. One of these shell companies through which Timely Comics was published was named Marvel Comics by at least "Marvel Mystery Comics" #55 (May 1944). As well, some comics' covers, such as "All Surprise Comics" #12 (Winter 1946–47), were labeled "A Marvel Magazine" many years before Goodman would formally adopt the name in 1961.
The post-war American comic market saw superheroes falling out of fashion. Goodman's comic book line dropped them for the most part and expanded into a wider variety of genres than even Timely had published, featuring horror, Westerns, humor, funny animal, men's adventure-drama, giant monster, crime, and war comics, and later adding jungle books, romance titles, espionage, and even medieval adventure, Bible stories and sports.
Goodman began using the globe logo of the Atlas News Company, the newsstand-distribution company he owned, on comics cover-dated November 1951 even though another company, Kable News, continued to distribute his comics through the August 1952 issues. This globe branding united a line put out by the same publisher, staff and freelancers through 59 shell companies, from Animirth Comics to Zenith Publications.
Atlas, rather than innovate, took a proven route of following popular trends in television and movies—Westerns and war dramas prevailing for a time, drive-in movie monsters another time—and even other comic books, particularly the EC horror line. Atlas also published a plethora of children's and teen humor titles, including Dan DeCarlo's "Homer the Happy Ghost" (similar to "Casper the Friendly Ghost") and "Homer Hooper" (à la Archie Andrews). Atlas unsuccessfully attempted to revive superheroes from late 1953 to mid-1954, with the Human Torch (art by Syd Shores and Dick Ayers, variously), the Sub-Mariner (drawn and most stories written by Bill Everett), and Captain America (writer Stan Lee, artist John Romita Sr.). Atlas did not achieve any breakout hits and, according to Stan Lee, Atlas survived chiefly because it produced work quickly, cheaply, and at a passable quality.
The first modern comic books under the Marvel Comics brand were the science-fiction anthology "Journey into Mystery" #69 and the teen-humor title "Patsy Walker" #95 (both cover dated June 1961), which each displayed an "MC" box on its cover. Then, in the wake of DC Comics' success in reviving superheroes in the late 1950s and early 1960s, particularly with the Flash, Green Lantern, Batman, Superman, Wonder Woman, Green Arrow and other members of the team the Justice League of America, Marvel followed suit.
In 1961, writer-editor Stan Lee revolutionized superhero comics by introducing superheroes designed to appeal to older readers than the predominantly child audiences of the medium, thus ushering what Marvel later called the Marvel Age of Comics. Modern Marvel's first superhero team, the titular stars of "The Fantastic Four" #1 (Nov. 1961), broke convention with other comic book archetypes of the time by squabbling, holding grudges both deep and petty, and eschewing anonymity or secret identities in favor of celebrity status. Subsequently, Marvel comics developed a reputation for focusing on characterization and adult issues to a greater extent than most superhero comics before them, a quality which the new generation of older readers appreciated. This applied to "The Amazing Spider-Man" title in particular, which turned out to be Marvel's most successful book. Its young hero suffered from self-doubt and mundane problems like any other teenager, something with which many readers could identify.
Stan Lee and freelance artist and eventual co-plotter Jack Kirby's Fantastic Four originated in a Cold War culture that led their creators to revise the superhero conventions of previous eras to better reflect the psychological spirit of their age. Eschewing such comic-book tropes as secret identities and even costumes at first, having a monster as one of the heroes, and having its characters bicker and complain in what was later called a "superheroes in the real world" approach, the series represented a change that proved to be a great success.
Marvel often presented flawed superheroes, freaks, and misfits—unlike the perfect, handsome, athletic heroes found in previous traditional comic books. Some Marvel heroes looked like villains and monsters such as the Hulk and the Thing. This naturalistic approach even extended into topical politics.
Comics historian Mike Benton also noted:
All these elements struck a chord with the older readers, including college-aged adults. In 1965, Spider-Man and the Hulk were both featured in "Esquire" magazine's list of 28 college campus heroes, alongside John F. Kennedy and Bob Dylan. In 2009, writer Geoff Boucher reflected that,
Superman and DC Comics instantly seemed like boring old Pat Boone; Marvel felt like The Beatles and the British Invasion. It was Kirby's artwork with its tension and psychedelia that made it perfect for the times—or was it Lee's bravado and melodrama, which was somehow insecure and brash at the same time?
In addition to Spider-Man and the Fantastic Four, Marvel began publishing further superhero titles featuring such heroes and antiheroes as the Hulk, Thor, Ant-Man, Iron Man, the X-Men, Daredevil, the Inhumans, Black Panther, Doctor Strange, Captain Marvel and the Silver Surfer, and such memorable antagonists as Doctor Doom, Magneto, Galactus, Loki, the Green Goblin, and Doctor Octopus, all existing in a shared reality known as the Marvel Universe, with locations that mirror real-life cities such as New York, Los Angeles and Chicago.
Marvel even lampooned itself and other comics companies in a parody comic, "Not Brand Echh" (a play on Marvel's dubbing of other companies as "Brand Echh", à la the then-common phrase "Brand X").
In 1968, while selling 50 million comic books a year, company founder Goodman revised the constraining distribution arrangement with Independent News he had reached under duress during the Atlas years, allowing him now to release as many titles as demand warranted. Late that year, he sold Marvel Comics and its parent company, Magazine Management, to the Perfect Film and Chemical Corporation, with Goodman remaining as publisher. In 1969, Goodman finally ended his distribution deal with Independent by signing with Curtis Circulation Company.
In 1971, the United States Department of Health, Education, and Welfare approached Marvel Comics editor-in-chief Stan Lee to do a comic book story about drug abuse. Lee agreed and wrote a three-part Spider-Man story portraying drug use as dangerous and unglamorous. However, the industry's self-censorship board, the Comics Code Authority, refused to approve the story because of the presence of narcotics, deeming the context of the story irrelevant. Lee, with Goodman's approval, published the story regardless in "The Amazing Spider-Man" #96–98 (May–July 1971), without the Comics Code seal. The market reacted well to the storyline, and the CCA subsequently revised the Code the same year.
Goodman retired as publisher in 1972 and installed his son, Chip, as publisher. Shortly thereafter, Lee succeeded him as publisher and also became Marvel's president for a brief time. During his time as president, he appointed his associate editor, prolific writer Roy Thomas, as editor-in-chief. Thomas added "Stan Lee Presents" to the opening page of each comic book.
A series of new editors-in-chief oversaw the company during another slow time for the industry. Once again, Marvel attempted to diversify, and with the updating of the Comics Code published titles themed to horror ("The Tomb of Dracula"), martial arts ("Shang-Chi: Master of Kung Fu"), sword-and-sorcery ("Conan the Barbarian" in 1970, "Red Sonja"), satire ("Howard the Duck") and science fiction ("2001: A Space Odyssey", "Killraven" in "Amazing Adventures", "Battlestar Galactica", "Star Trek", and, late in the decade, the long-running "Star Wars" series). Some of these were published in larger-format black-and-white magazines, under its Curtis Magazines imprint.
Marvel was able to capitalize on its successful superhero comics of the previous decade by acquiring a new newsstand distributor and greatly expanding its comics line. Marvel pulled ahead of rival DC Comics in 1972, during a time when the price and format of the standard newsstand comic were in flux. Goodman increased the price and size of Marvel's November 1971 cover-dated comics from 15 cents for 36 pages total to 25 cents for 52 pages. DC followed suit, but Marvel the following month dropped its comics to 20 cents for 36 pages, offering a lower-priced product with a higher distributor discount.
In 1973, Perfect Film and Chemical renamed itself as Cadence Industries and renamed Magazine Management as Marvel Comics Group. Goodman, now disconnected from Marvel, set up a new company called Seaboard Periodicals in 1974, reviving Marvel's old Atlas name for a new Atlas Comics line, but this lasted only a year and a half.
In the mid-1970s a decline of the newsstand distribution network affected Marvel. Cult hits such as "Howard the Duck" fell victim to the distribution problems, with some titles reporting low sales when in fact the first specialty comic book stores resold them at a later date. But by the end of the decade, Marvel's fortunes were reviving, thanks to the rise of direct market distribution—selling through those same comics-specialty stores instead of newsstands.
Marvel ventured into audio in 1975 with a radio series and a record, both narrated by Stan Lee. The radio series was "Fantastic Four". The record was "Spider-Man: Rock Reflections of a Superhero", a concept album for music fans.
Marvel held its own comic book convention, Marvelcon '75, in spring 1975, and promised a Marvelcon '76. At the 1975 event, Stan Lee used a Fantastic Four panel discussion to announce that Jack Kirby, the artist co-creator of most of Marvel's signature characters, was returning to Marvel after having left in 1970 to work for rival DC Comics. In October 1976, Marvel, which already licensed reprints in different countries, including the UK, created a superhero specifically for the British market. Captain Britain debuted exclusively in the UK, and later appeared in American comics. During this time, Marvel and the Iowa-based Register and Tribune Syndicate launched a number of syndicated comic strips — "The Amazing Spider-Man", "Howard the Duck", "Conan the Barbarian", and "The Incredible Hulk". None of the strips lasted past 1982, except for "The Amazing Spider-Man", which is still being published.
In 1978, Jim Shooter became Marvel's editor-in-chief. Although a controversial personality, Shooter cured many of the procedural ills at Marvel, including repeatedly missed deadlines. During Shooter's nine-year tenure as editor-in-chief, Chris Claremont and John Byrne's run on the "Uncanny X-Men" and Frank Miller's run on "Daredevil" became critical and commercial successes. Shooter brought Marvel into the rapidly evolving direct market, institutionalized creator royalties, starting with the Epic Comics imprint for creator-owned material in 1982; introduced company-wide crossover story arcs with "Contest of Champions" and "Secret Wars"; and in 1986 launched the ultimately unsuccessful New Universe line to commemorate the 25th anniversary of the Marvel Comics imprint. Star Comics, a children-oriented line differing from the regular Marvel titles, was briefly successful during this period.
In 1986, Marvel's parent, Marvel Entertainment Group, was sold to New World Entertainment, which within three years sold it to MacAndrews and Forbes, owned by Revlon executive Ronald Perelman in 1989. In 1991 Perelman took MEG public. Following the rapid rise of this stock, Perelman issued a series of junk bonds that he used to acquire other entertainment companies, secured by MEG stock.
Marvel earned a great deal of money with their 1980s children's comics imprint Star Comics and they earned a great deal more money and worldwide success during the comic book boom of the early 1990s, launching the successful 2099 line of comics set in the future ("Spider-Man 2099", etc.) and the creatively daring though commercially unsuccessful Razorline imprint of superhero comics created by novelist and filmmaker Clive Barker. In 1990, Marvel began selling Marvel Universe Cards with trading card maker SkyBox International. These were collectible trading cards that featured the characters and events of the Marvel Universe. The 1990s saw the rise of variant covers, cover enhancements, swimsuit issues, and company-wide crossovers that affected the overall continuity of the Marvel Universe.
Marvel suffered a blow in early 1992, when seven of its most prized artists — Todd McFarlane (known for his work on "Spider-Man"), Jim Lee ("X-Men"), Rob Liefeld ("X-Force"), Marc Silvestri ("Wolverine"), Erik Larsen ("The Amazing Spider-Man"), Jim Valentino ("Guardians of the Galaxy"), and Whilce Portacio ("Uncanny X-Men") — left to form Image Comics in a deal brokered by Malibu Comics' owner Scott Mitchell Rosenberg. Three years later, on November 3, 1994, Rosenberg sold Malibu to Marvel, which thereby acquired the then-leading standard for computer coloring of comic books (developed by Rosenberg), along with the Ultraverse, which was integrated into Marvel's multiverse, and ownership of the Genesis Universe.
In late 1994, Marvel acquired the comic book distributor Heroes World Distribution to use as its own exclusive distributor. As the industry's other major publishers made exclusive distribution deals with other companies, the ripple effect resulted in the survival of only one other major distributor in North America, Diamond Comic Distributors Inc. Then, by the middle of the decade, the industry had slumped, and in December 1996 MEG filed for Chapter 11 bankruptcy protection. In early 1997, when Marvel's Heroes World endeavor failed, Diamond also forged an exclusive deal with Marvel—giving the company its own section of its comics catalog "Previews".
In 1996, Marvel had some of its titles participate in "Heroes Reborn", a crossover that allowed Marvel to relaunch some of its flagship characters such as the Avengers and the Fantastic Four, and outsource them to the studios of two of the former Marvel artists turned Image Comics founders, Jim Lee and Rob Liefeld. The relaunched titles, which saw the characters transported to a parallel universe with a history distinct from the mainstream Marvel Universe, were a solid success amidst a generally struggling industry, but Marvel discontinued the experiment after a one-year run and returned the characters to the Marvel Universe proper.
In 1997, Toy Biz bought Marvel Entertainment Group to end the bankruptcy, forming a new corporation, Marvel Enterprises. With his business partner Avi Arad, publisher Bill Jemas, and editor-in-chief Bob Harras, Toy Biz co-owner Isaac Perlmutter helped stabilize the comics line.
In 1998, the company launched the imprint Marvel Knights, taking place just outside Marvel continuity with better production quality. The imprint was helmed by soon-to-become editor-in-chief Joe Quesada; it featured tough, gritty stories showcasing such characters as the Daredevil, Inhumans and Black Panther.
With the new millennium, Marvel Comics emerged from bankruptcy and again began diversifying its offerings. In 2001, Marvel withdrew from the Comics Code Authority and established its own Marvel Rating System for comics. The first title from this era to not have the code was "X-Force" #119 (October 2001). Marvel also created new imprints, such as MAX (an explicit-content line) and Marvel Adventures (developed for child audiences). In addition, the company created an alternate universe imprint, Ultimate Marvel, that allowed the company to reboot its major titles by revising and updating its characters to introduce to a new generation.
Some of its characters have been turned into successful film franchises, such as the "Men in Black" movie series, starting in 1997, "Blade" movie series, starting in 1998, "X-Men" movie series, starting in 2000, and the highest grossing series "Spider-Man", beginning in 2002.
Marvel's "Conan the Barbarian" title stopped in 1993 after 275 issues, and "The Savage Sword of Conan" magazine ran for 235 issues. Marvel published additional Conan titles, including miniseries, until 2000, for a total of 650 issues. The Conan license was picked up by Dark Horse three years later.
In a cross-promotion, the November 1, 2006, episode of the CBS soap opera "The Guiding Light", titled "She's a Marvel", featured the character Harley Davidson Cooper (played by Beth Ehlers) as a superheroine named the Guiding Light. The character's story continued in an eight-page backup feature, "A New Light", that appeared in several Marvel titles published November 1 and 8. Also that year, Marvel created a wiki on its Web site.
In late 2007 the company launched Marvel Digital Comics Unlimited, a digital archive of over 2,500 back issues available for viewing for a monthly or annual subscription fee. At the New York Anime Festival in December 2007, the company announced that Del Rey Manga would publish two original English-language Marvel manga books, featuring the X-Men and Wolverine, to hit the stands in spring 2009.
In 2009 Marvel Comics closed its Open Submissions Policy, in which the company had accepted unsolicited samples from aspiring comic book artists, saying the time-consuming review process had produced no suitably professional work. The same year, the company commemorated its 70th anniversary, dating to its inception as Timely Comics, by issuing the one-shot "Marvel Mystery Comics 70th Anniversary Special" #1 and a variety of other special issues.
On August 31, 2009, The Walt Disney Company announced it would acquire Marvel Comics' parent corporation, Marvel Entertainment, for a cash and stock deal worth approximately $4 billion, which if necessary would be adjusted at closing, giving Marvel shareholders $30 and 0.745 Disney shares for each share of Marvel they owned. As of 2008, Marvel and its major, longtime competitor DC Comics shared over 80% of the American comic-book market.
As of September 2010, Marvel switched its bookstores distribution company from Diamond Book Distributors to Hachette Distribution Services. Marvel moved its office to the Sports Illustrated Building in October 2010.
Marvel relaunched the CrossGen imprint, owned by Disney Publishing Worldwide, in March 2011. Marvel and Disney Publishing began jointly publishing "Disney/Pixar Presents" magazine that May.
Marvel discontinued its Marvel Adventures imprint in March 2012, and replaced it with a line of two titles connected to the Marvel Universe TV block. Also in March, Marvel announced its Marvel ReEvolution initiative, which included Infinite Comics, a line of digital comics; Marvel AR, a software application that provides an augmented reality experience to readers; and Marvel NOW!, a relaunch of most of the company's major titles with different creative teams. Marvel NOW! also saw the debut of new flagship titles including "Uncanny Avengers" and "All-New X-Men".
In April 2013, Marvel and other Disney conglomerate components began announcing joint projects. With ABC, a "Once Upon a Time" graphic novel was announced for publication in September. With Disney, Marvel announced in October 2013 that in January 2014 it would release its first title under their joint "Disney Kingdoms" imprint "Seekers of the Weird", a five-issue miniseries. On January 3, 2014, fellow Disney subsidiary Lucasfilm announced that as of 2015, "Star Wars" comics would once again be published by Marvel.
Following the events of the company-wide crossover "Secret Wars" in 2015, a relaunched Marvel universe began in September 2015, called the All-New, All-Different Marvel.
Marvel Legacy was the company's Fall 2017 relaunch banner, starting in September. The banner included comics with lenticular variant covers, which required comic book stores to double their regular issue orders to be able to order the variants. The owner of two Comix Experience stores complained that the setup forced retailers to be stuck with copies they could not sell in order to obtain the variants they could sell. Amid these and other complaints, Marvel adjusted the requirements down for new titles, but no adjustment was made for any other title. As a result, MyComicShop.com and at least 70 other comic book stores boycotted these variant covers. Despite the release of "Guardians of the Galaxy Vol. 2", "Logan", and two other Marvel films in theaters that year, none of those characters' titles featured in the top 10 sales, and the "Guardians of the Galaxy" comic book series was cancelled. Conan Properties International announced on January 12, 2018 that Conan would return to Marvel in early 2019.
On January 19, 2018, Joshua Yehl, editor of ign.com, speculated on potential changes if Disney's proposed acquisition of 21st Century Fox went through. He expected that Fox franchises licensed out to other firms would be moved to Marvel, and that Fox's Marvel film properties would be treated better by the publishing division. Meanwhile, Marvel had licensed Archie Comics to publish Marvel Digests collections for the newsstand market, and Disney had licensed IDW Publishing to produce the classic, all-ages Disney comics since the Marvel purchase, along with a "Big Hero 6" comic book, despite the fact that the Disney movie was based on a Marvel comic book. Then on July 17, 2018, Marvel Entertainment announced the licensing of Marvel characters to IDW for a line of middle-grade-reader comic books to start publishing in November 2018.
On March 1, 2019, Serial Box, a digital book platform, announced a partnership with Marvel. They will publish new and original stories tied to a number of Marvel's popular franchises. The first series will be about the character Thor and is set to be released in summer 2019.
Due to Diamond Comic Distributors halting its global distribution of comics as a result of the COVID-19 pandemic, Marvel Comics as of April 15 suspended the release of both physical and digital copies of its comic books until further notice. Dan Buckley, the president of Marvel Entertainment, stated that he would provide further information when possible.
Marvel's chief editor originally held the title of "editor". This head editor's title later became "editor-in-chief". Joe Simon was the company's first true chief editor; publisher Martin Goodman had served as titular editor only, outsourcing editorial operations.
In 1994 Marvel briefly abolished the position of editor-in-chief, replacing Tom DeFalco with five group editors-in-chief. As Carl Potts described the 1990s editorial arrangement:
Marvel reinstated the overall editor-in-chief position in 1995 with Bob Harras.
The next-highest editorial position was originally called associate editor, when Marvel's chief editor carried only the title of editor; it became executive editor once the chief editor's title became editor-in-chief. The title of associate editor was later revived, under the editor-in-chief, as an editorial position in charge of a few titles under the direction of an editor and without an assistant editor.
Located in New York City, Marvel has had successive headquarters:
In 2017, Marvel held a 38.30% share of the comics market, compared to its competitor DC Comics' 33.93%. By comparison, the companies respectively held 33.50% and 30.33% shares in 2013, and 40.81% and 29.94% shares in 2008.
Marvel characters and stories have been adapted to many other media. Some of these adaptations were produced by Marvel Comics and its sister company, Marvel Studios, while others were produced by companies licensing Marvel material.
In June 1993, Marvel issued its collectible caps for the milk caps game under the Hero Caps brand. In 2014, a Japanese TV series was launched by Bandai together with a collectible game called Bachicombat, a game similar to the milk caps game.
The RPG industry brought about the development of the collectible card game (CCG) in the early 1990s, and Marvel characters were soon featured in CCGs of their own, starting in 1995 with Fleer's OverPower (1995–1999). Later collectible card games were:
TSR published the pen-and-paper role-playing game "Marvel Super Heroes" in 1984. In 1998, TSR released the "Marvel Super Heroes Adventure Game", which used a different system from their first game, the card-based SAGA system. In 2003, Marvel Publishing published its own role-playing game, the "Marvel Universe Roleplaying Game", which used a diceless stone-pool system. In August 2011, Margaret Weis Productions announced it was developing a tabletop role-playing game based on the Marvel universe, set for release in February 2012 using its house Cortex Plus RPG system.
Video games based on Marvel characters go back to 1984 and the Atari game "Spider-Man". Since then, several dozen video games have been released, all produced by outside licensees. In 2014, a Marvel-themed release brought Marvel characters to the existing Disney sandbox video game.
As of the start of September 2015, films based on Marvel's properties represent the highest-grossing U.S. franchise, having grossed over $7.7 billion as part of a worldwide gross of over $18 billion. As of May 2019 the Marvel Cinematic Universe (MCU) has grossed over $22 billion.
Marvel first licensed two prose novels to Bantam Books, which printed "The Avengers Battle the Earth Wrecker" by Otto Binder (1967) and "Captain America: The Great Gold Steal" by Ted White (1968). Various publishers took up the licenses from 1978 to 2002. With licensed films being released beginning in 1997, several publishers also put out movie novelizations. In 2003, following publication of the prose young adult novel "Mary Jane", starring Mary Jane Watson from the Spider-Man mythos, Marvel announced the formation of the publishing imprint Marvel Press. However, Marvel moved back to licensing with Pocket Books from 2005 to 2008. With few books issued under the imprint, Marvel and Disney Books Group relaunched Marvel Press in 2011 with the Marvel Origin Storybooks line.
Many television series, both live-action and animated, have based their productions on Marvel Comics characters. These include series for popular characters such as Spider-Man, Iron Man, the Hulk, the Avengers, the X-Men, Fantastic Four, the Guardians of the Galaxy, Daredevil, Jessica Jones, Luke Cage, Iron Fist, the Punisher, the Defenders, S.H.I.E.L.D., Agent Carter, Deadpool, Legion, and others. Additionally, a handful of television movies, usually also pilots, based on Marvel Comics characters have been made.
Marvel has licensed its characters for theme parks and attractions, including Marvel Super Hero Island at Universal Orlando's Islands of Adventure in Orlando, Florida, which includes rides based on their iconic characters and costumed performers, as well as The Amazing Adventures of Spider-Man ride cloned from Islands of Adventure to Universal Studios Japan.
Years after Disney purchased Marvel in late 2009, Walt Disney Parks and Resorts announced plans to create original Marvel attractions at its theme parks, with Hong Kong Disneyland becoming the first Disney theme park to feature a Marvel attraction. Due to the licensing agreement with Universal Studios, signed prior to Disney's purchase of Marvel, Walt Disney World and Tokyo Disney Resort are barred from having Marvel characters in their parks. However, this restriction covers only the characters that Universal is currently using, other characters in their "families" (X-Men, Avengers, Fantastic Four, etc.), and the villains associated with said characters. This clause has allowed Walt Disney World to have meet-and-greets, merchandise, attractions and more with other Marvel characters not associated with the characters at Islands of Adventure, such as Star-Lord and Gamora from "Guardians of the Galaxy".
Under the joint Disney Kingdoms imprint announced in October 2013, Marvel Worldwide and Disney released their first title, "Seekers of the Weird" (January 2014), a five-issue miniseries inspired by Museum of the Weird, a never-built Disneyland attraction. The Disney Kingdoms imprint has since released comic adaptations of Big Thunder Mountain Railroad, Walt Disney's Enchanted Tiki Room, The Haunted Mansion, and two series on "Figment" based on Journey Into Imagination.
|
https://en.wikipedia.org/wiki?curid=20966
|
Meritocracy
Meritocracy ("merit", from Latin "mereō", and "-cracy", from Ancient Greek κράτος "" 'strength, power') is a political system in which economic goods and/or political power are vested in individual people on the basis of talent, effort, and achievement, rather than wealth or social class. Advancement in such a system is based on performance, as measured through examination or demonstrated achievement. Although the concept of meritocracy has existed for centuries, the term itself was coined in 1958 by the sociologist Michael Dunlop Young in his satirical essay "The Rise of the Meritocracy".
The "most common definition of meritocracy conceptualizes merit in terms of tested competency and ability, and most likely, as measured by IQ or standardized achievement tests." In government and other administrative systems, "meritocracy" refers to a system under which advancement within the system turns on "merits", like performance, intelligence, credentials, and education. These are often determined through evaluations or examinations.
In a more general sense, meritocracy can refer to any form of evaluation based on achievement. Like "utilitarian" and "pragmatic", the word "meritocratic" has also developed a broader connotation, and is sometimes used to refer to any government run by "a ruling or influential class of educated or able people".
This is in contrast to the original, condemnatory use of the term in 1958 by Michael Dunlop Young in his work "The Rise of the Meritocracy", who was satirizing the ostensibly merit-based Tripartite System of education practiced in the United Kingdom at the time; he claimed that, in the Tripartite System, "merit is equated with intelligence-plus-effort, its possessors are identified at an early age and selected for appropriate intensive education, and there is an obsession with quantification, test-scoring, and qualifications."
Meritocracy, in its wider sense, may be any general act of judgment upon the basis of various demonstrated merits; such acts frequently are described in sociology and psychology. Supporters of meritocracy do not necessarily agree on the nature of "merit"; however, they do tend to agree that "merit" itself should be a primary consideration during evaluation. Thus, the merits may extend beyond intelligence and education to any mental or physical talent or to work ethic. As such, meritocracy may be based on moral character or innate abilities such as intelligence.
In rhetoric, the demonstration of one's merit regarding mastery of a particular subject is an essential task most directly related to the Aristotelian term "Ethos". The equivalent Aristotelian conception of meritocracy is based upon aristocratic or oligarchic structures, rather than in the context of the modern state.
In the United States, the assassination of President James A. Garfield in 1881 prompted the replacement of the American Spoils System with a meritocracy. In 1883, the Pendleton Civil Service Reform Act was passed, stipulating that government jobs should be awarded on the basis of merit, through competitive exams, rather than ties to politicians or political affiliation.
The most common form of meritocratic screening found today is the college degree. Higher education is an imperfect meritocratic screening system for various reasons, such as lack of uniform standards worldwide, lack of scope (not all occupations and processes are included), and lack of access (some talented people never have an opportunity to participate because of the expense, most especially in developing countries). Nonetheless, academic degrees serve some amount of meritocratic screening purpose in the absence of a more refined methodology. Education alone, however, does not constitute a complete system, as meritocracy must automatically confer power and authority, which a degree does not accomplish independently.
Although the concept has existed for centuries, the term "meritocracy" is relatively new. It was used pejoratively by British politician and sociologist Michael Dunlop Young in his 1958 satirical essay "The Rise of the Meritocracy", which pictured the United Kingdom under the rule of a government favouring intelligence and aptitude (merit) above all else, being the combination of the root of Latin origin "merit" (from "mereō" meaning "earn") and the Ancient Greek suffix "-cracy" (meaning "power", "rule"). (The "purely" Greek word is axiocracy (αξιοκρατία), from axios (αξιος, worthy) + "-cracy" (-κρατία, power).)
In this book the term had distinctly negative connotations as Young questioned both the legitimacy of the selection process used to become a member of this elite and the outcomes of being ruled by such a narrowly defined group.
The essay, written in the first person by a fictional historical narrator in 2034, interweaves history from the politics of pre- and post-war Britain with those of fictional future events in the short (1960 onward) and long term (2020 onward).
The essay was based upon the tendency of the then-current governments, in their striving toward intelligence, to ignore shortcomings and upon the failure of education systems to utilize correctly the gifted and talented members within their societies.
Young's fictional narrator explains that, on the one hand, the greatest contributor to society is not the "stolid mass" or majority, but the "creative minority" or members of the "restless elite". On the other hand, he claims that there are casualties of progress whose influence is underestimated and that, from such stolid adherence to natural science and intelligence, arise arrogance and complacency. This problem is encapsulated in the phrase "Every selection of one is a rejection of many".
It was also used by Hannah Arendt in her essay "Crisis in Education", which was written in 1958 and refers to the use of meritocracy in the English educational system. She too uses the term pejoratively. It was not until 1972 that Daniel Bell used the term positively.
According to scholarly consensus, the earliest example of an administrative meritocracy, based on civil service examinations, dates back to Ancient China. The concept originated at least by the sixth century BC, when it was advocated by the Chinese philosopher Confucius, who "invented the notion that those who govern should do so because of merit, not of inherited status. This sets in motion the creation of the imperial examinations and bureaucracies open only to those who passed tests."
As the Qin and Han dynasties developed a meritocratic system in order to maintain power over a large, sprawling empire, it became necessary for the government to maintain a complex network of officials. Prospective officials could come from a rural background and government positions were not restricted to the nobility. Rank was determined by merit, through the civil service examinations, and education became the key for social mobility. After the fall of the Han Dynasty, the nine-rank system was established during the Three Kingdoms period.
Both Plato and Aristotle advocated meritocracy; Plato, in "The Republic", argued that the wisest should rule and hence that the rulers should be philosopher kings.
The concept of meritocracy spread from China to British India during the seventeenth century, and then into continental Europe and the United States. With the translation of Confucian texts during the Age of Enlightenment, the concept of a meritocracy reached intellectuals in the West, who saw it as an alternative to the traditional "ancien régime" of Europe. Voltaire and François Quesnay wrote favourably of the idea, with Voltaire claiming that the Chinese had "perfected moral science" and Quesnay advocating an economic and political system modeled after that of the Chinese.
The first European power to implement a successful meritocratic civil service was the British Empire, in its administration of India: "company managers hired and promoted employees based on competitive examinations in order to prevent corruption and favoritism." British colonial administrators advocated the spread of the system to the rest of the commonwealth, the most "persistent" of whom was Thomas Taylor Meadows, Britain's consul in Guangzhou, China. Meadows successfully argued in his "Desultory Notes on the Government and People of China", published in 1847, that "the long duration of the Chinese empire is solely and altogether owing to the good government which consists in the advancement of men of talent and merit only," and that the British must reform their civil service by making the institution meritocratic. The practice was adopted in the late nineteenth century by the British mainland, inspired by the "Chinese mandarin system".
The British philosopher and polymath John Stuart Mill advocated meritocracy in his book "Considerations on Representative Government". His model was to give more votes to the more educated voter. His views are explained in Estlund (2003:57–58):
Mill's proposal of plural voting has two motives. One is to prevent one group or class of people from being able to control the political process even without having to give reasons in order to gain sufficient support. He calls this the problem of class legislation. Since the most numerous class is also at a lower level of education and social rank, this could be partly remedied by giving those at the higher ranks plural votes. A second, and equally prominent motive for plural voting is to avoid giving equal influence to each person without regard to their merit, intelligence, etc. He thinks that it is fundamentally important that political institutions embody, in their spirit, the recognition that some opinions are worth more than others. He does not say that this is a route to producing better political decisions, but it is hard to understand his argument, based on this second motive, in any other way.
So, if Aristotle is right that the deliberation is best if participants are numerous (and assuming for simplicity that the voters are the deliberators) then this is a reason for giving all or many citizens a vote, but this does not yet show that the wiser subset should not have, say, two or three; in that way something would be given both to the value of the diverse perspectives, and to the value of the greater wisdom of the few. This combination of the Platonic and Aristotelian points is part of what I think is so formidable about Mill's proposal of plural voting. It is also an advantage of his view that he proposes to privilege not the wise, but the educated. Even if we agreed that the wise should rule, there is a serious problem about how to identify them. This becomes especially important if a successful political justification must be generally acceptable to the ruled. In that case, privileging the wise would require not only their being so wise as to be better rulers, but also, and more demandingly, that their wisdom be something that can be agreed to by all reasonable citizens. I turn to this conception of justification below.
Mill's position has great plausibility: good education promotes the ability of citizens to rule more wisely. So how can we deny that the educated subset would rule more wisely than others? But then why shouldn't they have more votes?
Estlund goes on to criticize Mill's education-based meritocracy on various grounds.
In the United States, the federal bureaucracy used the Spoils System from 1828 until the assassination of United States President James A. Garfield by a disappointed office seeker in 1881 proved its dangers. Two years later in 1883, the system of appointments to the United States Federal Bureaucracy was revamped by the Pendleton Civil Service Reform Act, partially based on the British meritocratic civil service that had been established years earlier. The act stipulated that government jobs should be awarded on the basis of merit, through competitive exams, rather than ties to politicians or political affiliation. It also made it illegal to fire or demote government employees for political reasons.
To enforce the merit system and the judicial system, the law also created the United States Civil Service Commission. In the modern American meritocracy, the president may hand out only a certain number of jobs, which must be approved by the United States Senate.
Australia began establishing public universities in the 1850s with the goal of promoting meritocracy by providing advanced training and credentials. The educational system was set up to service urban males of middle-class background, but of diverse social and religious origins. It was increasingly extended to all graduates of the public school system, those of rural and regional background, and then to women and finally to ethnic minorities. Both the middle classes and the working classes have promoted the ideal of meritocracy within a strong commitment to "mate-ship" and political equality.
Singapore describes meritocracy as one of its official guiding principles for domestic public policy formulation, placing emphasis on academic credentials as objective measures of merit.
There is criticism that, under this system, Singaporean society is being increasingly stratified and that an elite class is being created from a narrow segment of the population. Singapore has a growing level of tutoring for children, and top tutors are often paid better than school teachers. Defenders recall the ancient Chinese proverb "Wealth does not pass three generations", suggesting that the nepotism or cronyism of elitists eventually will be, and often is, replaced by those lower down the hierarchy.
Singaporean academics are continuously re-examining the application of meritocracy as an ideological tool and how it is stretched to encompass the ruling party's objectives. Professor Kenneth Paul Tan at the Lee Kuan Yew School of Public Policy asserts that "Meritocracy, in trying to 'isolate' merit by treating people with fundamentally unequal backgrounds as superficially the same, can be a practice that ignores and even conceals the real advantages and disadvantages that are unevenly distributed to different segments of an inherently unequal society, a practice that in fact perpetuates this fundamental inequality. In this way, those who are picked by meritocracy as having merit may already have enjoyed unfair advantages from the very beginning, ignored according to the principle of nondiscrimination."
How meritocracy in the Singaporean context relates to the application of pragmatism as an ideological device, one which combines strict adherence to market principles with no aversion to social engineering and little propensity for classical social welfarism, is further illustrated by Kenneth Paul Tan in subsequent articles:
There is a strong ideological quality in Singapore's pragmatism, and a strongly pragmatic quality in ideological negotiations within the dynamics of hegemony. In this complex relationship, the combination of ideological and pragmatic maneuvering over the decades has resulted in the historical dominance of government by the PAP in partnership with global capital whose interests have been advanced without much reservation.

Within the Ecuadorian Ministry of Labor, the Ecuadorian Meritocracy Institute was created under the technical advice of the Singaporean government.
John Rawls rejects the ideal of meritocracy.
In 2007 an anonymous British group called The Meritocracy Party published its first manifesto, to which it has since added more than two million words on the subject (discussing Hegel, Rousseau, Charles Fourier, Henri de Saint-Simon, and various other philosophers, scientists, reformers, and revolutionaries).
On its website, The Meritocracy Party lists five meritocratic principles and thirteen primary aims. The Meritocracy International is the host of all meritocratic political parties in the world and the place where these may be found by country of origin.
The term "meritocracy" was originally intended as a negative concept. One of the primary concerns with meritocracy is the unclear definition of "merit". What is considered as meritorious can differ with opinions as on which qualities are considered the most worthy, raising the question of which "merit" is the highest—or, in other words, which standard is the "best" standard. As the supposed effectiveness of a meritocracy is based on the supposed competence of its officials, this standard of merit cannot be arbitrary and has to also reflect the competencies required for their roles.
The reliability of the authority and system that assesses each individual's merit is another point of concern. As a meritocratic system relies on a standard of merit to measure and compare people against, the system by which this is done has to be reliable to ensure that their assessed merit accurately reflects their potential capabilities. Standardized testing, which reflects the meritocratic sorting process, has come under criticism for being rigid and unable to accurately assess many valuable qualities and potentials of students. Education theorist Bill Ayers, commenting on the limitations of standardized testing, writes that "Standardized tests can't measure initiative, creativity, imagination, conceptual thinking, curiosity, effort, irony, judgment, commitment, nuance, good will, ethical reflection, or a host of other valuable dispositions and attributes. What they can measure and count are isolated skills, specific facts and function, content knowledge, the least interesting and least significant aspects of learning." Merit determined through the opinionated evaluations of teachers, while able to assess the valuable qualities that cannot be assessed by standardized testing, is unreliable, as the opinions, insights, biases, and standards of the teachers vary greatly. If the system of evaluation is corrupt, non-transparent, opinionated or misguided, decisions regarding who has the highest merit can be highly fallible.
The level of education required in order to become competitive in a meritocracy may also be costly, effectively limiting candidacy for a position of power to those with the means necessary to become educated. An example of this was the Chinese student and self-declared messiah Hong Xiuquan, who, despite ranking first in a preliminary, nationwide imperial examination, was unable to afford further education. As such, although he did try to study in private, Hong was ultimately noncompetitive in later examinations and unable to become a bureaucrat. This economic aspect of meritocracies has been said to continue nowadays in countries without free education, with the Supreme Court of the United States, for example, consisting only of justices who attended Harvard or Yale and generally only considering clerkship candidates who attended a top-five university, while in the 1950s the two universities accounted for only around one fifth of the justices. Even if free education were provided, the resources that the parents of a student are able to provide outside of the curriculum, such as tutoring, exam preparation, and financial support for living costs during higher education, will influence the education the student attains and the student's social position in a meritocratic society. This limits the fairness and justness of any meritocratic system. Similarly, feminist critics have noted that many hierarchical organisations actually favour individuals who have received disproportionate support of an informal kind (e.g. mentorship, word-of-mouth opportunities, and so on), such that only those who benefit from such supports are likely to understand these organisations as meritocratic.
Another concern regards the principle of incompetence, or the "Peter Principle". As people rise in a meritocratic society through the social hierarchy by demonstrated merit, they eventually reach, and become stuck at, a level too difficult for them to perform effectively; they are promoted to incompetence. This reduces the effectiveness of a meritocratic system, the supposed main practical benefit of which is the competence of those who run the society.
In his book "Meritocratic Education and Social Worthlessness" (Palgrave, 2012), the philosopher Khen Lampert argued that educational meritocracy is nothing but a post-modern version of Social Darwinism. Its proponents argue that the theory justifies social inequality as being meritocratic. This social theory holds that Darwin's theory of evolution by natural selection is a model, not only for the development of biological traits in a population, but also as an application for human social institutions—the existing social institutions being implicitly declared as normative. Social Darwinism shares its roots with early progressivism, and was most popular from the late nineteenth century to the end of World War II. Darwin only ventured to propound his theories in a biological sense, and it is other thinkers and theorists who have applied Darwin's model normatively to unequal endowments of human ambitions.
|
https://en.wikipedia.org/wiki?curid=20971
|
Marxism–Leninism
Marxism–Leninism is a political philosophy that seeks to establish a socialist state, which is to develop further into socialism and eventually communism, a classless social system with common ownership of the means of production and with full social and economic equality of all members of society. Marxist–Leninists espouse a wide array of views depending on their understanding of orthodox Marxism and Leninism, but they generally support the idea of a vanguard party, a communist party-led state, state dominance over the economy, internationalism and opposition to capitalism, fascism, imperialism and liberal democracy. As an ideology, it was developed by Joseph Stalin in the late 1920s based on his understanding and synthesis of both orthodox Marxism and Leninism. It was the official state ideology of the Soviet Union and of the other ruling parties making up the Eastern Bloc, as well as of the political parties of the Communist International after Bolshevisation. Today, Marxism–Leninism is the ideology of Stalinist and Maoist political parties around the world and remains the official ideology of the ruling parties of China, Cuba, Laos and Vietnam.
After the death of Vladimir Lenin in 1924, Marxism–Leninism became a distinct philosophical movement in the Soviet Union when Stalin and his supporters gained control of the Soviet party. It rejected the notion, then common among Western Marxists, that world revolution was a prerequisite for building socialism, in favour of the concept of socialism in one country. According to its supporters, the gradual transition from capitalism to socialism was signified with the introduction of the first five-year plan and the 1936 Soviet Constitution. The internationalism of Marxism–Leninism was expressed in supporting revolutions in foreign countries (e.g. initially through the Communist International or through the concept of socialist-leaning countries after de-Stalinisation). By the late 1920s, Stalin established ideological orthodoxy among the Russian Communist Party (Bolsheviks), the Soviet Union and the Communist International to establish universal Marxist–Leninist praxis. In the late 1930s, Stalin's official textbook "History of the Communist Party of the Soviet Union (Bolsheviks)" (1938) made the term Marxism–Leninism common political-science usage among communists and non-communists.
The goal of Marxism–Leninism is the revolutionary transformation of a capitalist state into a socialist state by way of a two-stage revolution led by a vanguard party of professional revolutionaries drawn from the proletariat. To realise the two-stage transformation of the state, the vanguard party establishes the dictatorship of the proletariat (as opposed to that of the bourgeoisie) and determines policy through democratic centralism. The Marxist–Leninist communist party is the vanguard for the political, economic and social transformation of a capitalist society into a socialist society, the lower stage of socio-economic development, and for its progress towards the upper-stage communist society, which is stateless and classless. The socialist society features public ownership of the means of production, accelerated industrialisation, pro-active development of society's productive forces (research and development) and nationalised natural resources.
In the establishment of socialism in the former Russian Empire, Bolshevism was the ideological basis for the Soviet Union. As the vanguard party that guided the establishment and development of socialism, the communist party represented its policies as correct. Because Leninism was the revolutionary means to achieving socialism in the praxis of government, the relationship between ideology and decision-making inclined to pragmatism, and most policy decisions were taken in light of the continual and permanent development of Marxist–Leninist ideology, i.e. ideological adaptation to actual conditions.
Within five years of the death of Vladimir Lenin (1924), Joseph Stalin completed his rise to power and became the leader of the Soviet Union, theorising and applying the socialist theories of Lenin and Karl Marx as political expediencies used to realise his plans for the Soviet Union and for world socialism. The book "Concerning Questions of Leninism" (1926) represented Marxism–Leninism as a separate communist ideology and featured a global hierarchy of communist parties and revolutionary vanguard parties in each country of the world. With that, Stalin's application of Marxism–Leninism to the situation of the Soviet Union became Stalinism, the official state ideology until his death in 1953. In Marxist political discourse, Stalinism, denoting and connoting the theory and praxis of Stalin, has two usages, namely praise of Stalin by Marxist–Leninists who believe Stalin successfully developed Lenin's legacy and criticism of Stalin by Marxist–Leninists and other Marxists who repudiate Stalin's political purges, social-class repressions and bureaucratic terrorism.
As the Left Opposition to Stalin within the Soviet party and government, Leon Trotsky and the Trotskyists argued that Marxist–Leninist ideology contradicted Marxism and Leninism in theory and that Stalin's ideology was therefore not useful for the implementation of socialism in Russia. Moreover, Trotskyists within the party identified their anti-Stalinist communist ideology as Bolshevik–Leninism and supported permanent revolution to differentiate themselves from Stalin's justification and implementation of socialism in one country.
After the Sino-Soviet split of the 1960s, the Communist Party of China and the Communist Party of the Soviet Union each claimed to be the sole heir and successor to Stalin concerning the correct interpretation of Marxism–Leninism and the ideological leadership of world communism. In that vein, Mao Zedong Thought, Mao Zedong's updating and adaptation of Marxism–Leninism to Chinese conditions in which revolutionary praxis is primary and ideological orthodoxy is secondary, represents urban Marxism–Leninism adapted to pre-industrial China. The claim that Mao had adapted Marxism–Leninism to Chinese conditions evolved into the idea that he had updated it in a fundamental way applying to the world as a whole. Consequently, Mao Zedong Thought became the official state ideology of the People's Republic of China as well as the ideological basis of communist parties around the world which sympathised with China. In the late 1970s, the Peruvian communist party Shining Path developed and synthesized Mao Zedong Thought into Marxism–Leninism–Maoism, a contemporary variety of Marxism–Leninism that is a supposed higher level of Marxism–Leninism that can be applied universally.
Following the Sino-Albanian split of the 1970s, a small portion of Marxist–Leninists began to downplay or repudiate the role of Mao in the Marxist–Leninist international movement in favour of the Albanian Labor Party and a stricter adherence to Stalin. The Sino-Albanian split was caused by Albania's rejection of China's "Realpolitik" of Sino–American rapprochement, specifically the 1972 Mao–Nixon meeting, which the anti-revisionist Albanian Labor Party perceived as an ideological betrayal of Mao's own Three Worlds Theory that excluded such political rapprochement with the West. To the Albanian Marxist–Leninists, the Chinese dealings with the United States indicated Mao's lessened practical commitment to ideological orthodoxy and proletarian internationalism. In response to Mao's apparently unorthodox deviations, Enver Hoxha, head of the Albanian Labor Party, theorised anti-revisionist Marxism–Leninism, referred to as Hoxhaism, which retained orthodox Marxism–Leninism when compared to the ideology of the post-Stalin Soviet Union.
In North Korea, Marxism–Leninism was officially superseded in 1977 by "Juche". However, the government is still sometimes referred to as Marxist–Leninist, or more commonly as Stalinist, due to its political and economic structure. In the other four existing Marxist–Leninist socialist states, namely China, Cuba, Laos and Vietnam, the ruling parties hold Marxism–Leninism as their official ideology, although they give it different interpretations in terms of practical policy. Marxism–Leninism remains the ideology of mostly anti-revisionist Stalinist, Maoist and Hoxhaist as well as some de-Stalinised or reformed non-ruling communist parties worldwide. The anti-revisionists criticise the rule of some of the communist states, claiming that they were state capitalist countries ruled by revisionists. Although the periods and countries defined as state capitalist or revisionist vary among different ideologies and parties, all of them accept that the Soviet Union was socialist during Stalin's time. Maoists believe that the People's Republic of China became state capitalist after Mao's death. Hoxhaists believe that the People's Republic of China was always state capitalist and uphold the People's Socialist Republic of Albania as the only socialist state after the Soviet Union under Stalin.
Although Marxism–Leninism was created after Vladimir Lenin's death, during the regime of Joseph Stalin in the Soviet Union, and continued to be the official state ideology after de-Stalinisation and that of other Marxist–Leninist states, the basis for elements of Marxism–Leninism predates this. The philosophy of Marxism–Leninism originated as the pro-active, political praxis of the Bolshevik faction of the Russian Social Democratic Labour Party in realising political change in Tsarist Russia. Lenin's leadership transformed the Bolsheviks into the party's political vanguard, composed of professional revolutionaries who practised democratic centralism to elect leaders and officers as well as to determine policy through free discussion, then decisively realised through united action. The vanguardism of proactive, pragmatic commitment to achieving revolution was the Bolsheviks' advantage in out-manoeuvring the liberal and conservative political parties who advocated social democracy without a practical plan of action for the Russian society they wished to govern. Leninism allowed the Bolshevik party to assume command of the October Revolution in 1917.
Twelve years before the October Revolution in 1917, the Bolsheviks had failed to assume control of the Revolution of 1905 (22 January 1905 – 16 June 1907) because the centres of revolutionary action were too far apart for proper political coordination. To generate revolutionary momentum from the Tsarist army killings on Bloody Sunday (22 January 1905), the Bolsheviks encouraged workers to use political violence in order to compel the bourgeois social classes (the nobility, the gentry and the bourgeoisie) to join the proletarian revolution to overthrow the absolute monarchy of the Tsar of Russia. Most importantly, the experience of this revolution caused Lenin to conceive of the means of sponsoring socialist revolution through agitation, propaganda and a well-organised, disciplined and small political party.
Despite secret-police persecution by the Okhrana (Department for Protecting the Public Security and Order), émigré Bolsheviks returned to Russia to agitate, organise and lead, but then they returned to exile when the people's revolutionary fervour failed in 1907. The failure of the Revolution of 1905 exiled Bolsheviks, Mensheviks, Socialist Revolutionaries and anarchists such as the Black Guards from Russia. Membership in both the Bolshevik and Menshevik ranks diminished from 1907 to 1908, while the number of people taking part in strikes in 1907 was 26% of the figure during the year of the Revolution of 1905, dropping to 6% in 1908 and 2% in 1910. The 1908–1917 period was one of disillusionment in the Bolshevik party over Lenin's leadership, with members opposing him over scandals involving his expropriations and methods of raising money for the party. This political defeat was aggravated by Tsar Nicholas II's political reformations of the Imperial Russian government. In practice, the formalities of political participation (the electoral plurality of a multi-party system with the State Duma and the Russian Constitution of 1906) were the Tsar's piecemeal and cosmetic concessions to social progress, because public office remained available only to the aristocracy, the gentry and the bourgeoisie. These reforms resolved neither the illiteracy, the poverty nor the malnutrition of the proletarian majority of Imperial Russia.
In Swiss exile, Lenin developed Marx's philosophy and extrapolated decolonisation by colonial revolt as a reinforcement of proletarian revolution in Europe. In 1912, Lenin resolved a factional challenge to his ideological leadership of the RSDLP by the Forward Group in the party, usurping the all-party congress to transform the RSDLP into the Bolshevik party. In the early 1910s, Lenin remained highly unpopular, so unpopular amongst the international socialist movement that by 1914 it considered censoring him. Unlike the European socialists, who chose bellicose nationalism over anti-war internationalism, a philosophical and political break that was a consequence of the internationalist–defencist schism among socialists, the Bolsheviks opposed the Great War (1914–1918). That nationalist betrayal of socialism was denounced by a small group of socialist leaders who opposed the Great War, including Rosa Luxemburg, Karl Liebknecht and Lenin, who said that the European socialists had failed the working classes for preferring patriotic war to proletarian internationalism. To debunk patriotism and national chauvinism, Lenin explained in the essay "Imperialism, the Highest Stage of Capitalism" (1917) that capitalist economic expansion leads to colonial imperialism, which is then regulated with nationalist wars such as the Great War among the empires of Europe. To relieve strategic pressures from the Western Front (4 August 1914 – 11 November 1918), Imperial Germany impelled the withdrawal of Imperial Russia from the war's Eastern Front (17 August 1914 – 3 March 1918) by sending Lenin and his Bolshevik cohort in a diplomatically-sealed train, anticipating that they would partake in revolutionary activity.
In March 1917, the abdication of Tsar Nicholas II led to the Russian Provisional Government (March–July 1917), which then proclaimed the Russian Republic (September–November 1917). Later, in the October Revolution, the Bolsheviks' seizure of power against the Provisional Government resulted in their establishment of the Russian Soviet Federative Socialist Republic (1917–1991), yet parts of Russia remained occupied by the counter-revolutionary White Movement of anti-communists who had united to form the White Army to fight the Russian Civil War (1917–1922) against the Bolshevik government. Moreover, despite the White–Red civil war, Russia remained a combatant in the Great War until the Bolsheviks quit it with the Treaty of Brest-Litovsk, which then provoked the Allied intervention in the Russian Civil War by the armies of seventeen countries, including Great Britain, France, Italy, the United States and Imperial Japan.
Elsewhere, the successful October Revolution in Russia had facilitated the German Revolution of 1918–1919 and revolutions and interventions in Hungary (1918–1920) which produced the First Hungarian Republic and the Hungarian Soviet Republic. In Berlin, the German government and their Freikorps mercenaries fought and defeated the Spartacist uprising which began as a general strike. In Munich, the local Freikorps fought and defeated the Bavarian Soviet Republic. In Hungary, the disorganised workers who had proclaimed the Hungarian Soviet Republic were fought and defeated by the royal armies of the Kingdom of Romania and the Kingdom of Yugoslavia as well as the army of the First Republic of Czechoslovakia. These communist forces were soon crushed by anti-communist forces and attempts to create an international communist revolution failed. However, a successful revolution occurred in Asia, when the Mongolian Revolution of 1921 established the Mongolian People's Republic (1924–1992).
As promised to the Russian peoples in October 1917, the Bolsheviks quit Russia's participation in the Great War on 3 March 1918. That same year, the Bolsheviks consolidated government power by expelling the Mensheviks, the Socialist Revolutionaries and the Left Socialist-Revolutionaries from the soviets. The Bolshevik government then established the Cheka (All-Russian Extraordinary Commission) secret police to eliminate anti–Bolshevik opposition in the country. Initially, there was strong opposition to the Bolshevik régime because they had not resolved the food shortages and material poverty of the Russian peoples as promised in October 1917. From that social discontent, the Cheka reported 118 uprisings, including the Kronstadt rebellion (7–17 March 1921) against the economic austerity of the War Communism imposed by the Bolsheviks. The principal obstacles to Russian economic development and modernisation were great material poverty and the lack of modern technology which were conditions that orthodox Marxism considered unfavourable to communist revolution. Agricultural Russia was sufficiently developed for establishing capitalism, but it was insufficiently developed for establishing socialism. For Bolshevik Russia, the 1921–1924 period featured the simultaneous occurrence of economic recovery, famine (1921–1922) and a financial crisis (1924). By 1924, considerable economic progress had been achieved and by 1926 the Bolshevik government had achieved economic production levels equal to Russia's production levels in 1913.
Initial Bolshevik economic policies from 1917 to 1918 were cautious, with limited nationalisations of the means of production which had been private property of the Russian aristocracy during the Tsarist monarchy. Lenin was immediately committed to avoiding antagonising the peasantry, making efforts to coax them away from the Socialist Revolutionaries by allowing a peasant takeover of nobles' estates while enacting no immediate nationalisations of peasants' property. The Decree on Land (8 November 1917) fulfilled Lenin's promised redistribution of Russia's arable land to the peasants, who reclaimed their farmlands from the aristocrats, ensuring the peasants' loyalty to the Bolshevik party. To overcome the civil war's economic interruptions, the policy of War Communism (1918–1921), a regulated market, state-controlled means of distribution and nationalisation of large-scale farms, was adopted to requisition and distribute grain in order to feed industrial workers in the cities whilst the Red Army was fighting the White Army's attempted restoration of the Romanov dynasty as absolute monarchs of Russia. Moreover, the politically unpopular forced grain-requisitions discouraged peasants from farming, which resulted in reduced harvests and food shortages that provoked labour strikes and food riots. In the event, the Russian peoples created a barter and black-market economy to counter the Bolshevik government's voiding of the monetary economy.
In 1921, the New Economic Policy restored some private enterprise to animate the Russian economy. As part of Lenin's pragmatic compromise with external financial interests in 1918, Bolshevik state capitalism temporarily returned 91% of industry to private ownership or trusts until the Soviet Russians learned the technology and the techniques required to operate and administrate industries. Importantly, Lenin declared that the development of socialism would not be able to be pursued in the manner originally thought by Marxists. A key aspect that affected the Bolshevik regime was the backward economic conditions in Russia, which were considered unfavourable to the orthodox Marxist theory of communist revolution. At the time, orthodox Marxists claimed that Russia was ripe for the development of capitalism, not yet for socialism. In that vein, Lenin explained: "Our poverty is so great that we cannot, at one stroke, restore full-scale factory, state, socialist production". He added that the development of socialism would proceed according to the actual material and socio-economic conditions in Russia and not as abstractly described by Marx for industrialised Europe in the 19th century. To overcome the lack of educated Russians who could operate and administrate industry, Lenin advocated the development of a large corps of technical intelligentsia who would propel the industrial development of Russia to self-sufficiency and advance the Marxist economic stages of development.
As Lenin neared death after suffering strokes, his Testament (December 1922) named Trotsky and Stalin as the most able men in the Central Committee, but harshly criticised them. Lenin said that Stalin should be removed from being the General Secretary of the party and be replaced with "some other person who is superior to Stalin only in one respect, namely, in being more tolerant, more loyal, more polite, and more attentive to comrades". Upon his death (21 January 1924), Lenin's political testament was read aloud to the Central Committee, which chose to ignore Lenin's ordered removal of Stalin as General Secretary because enough members believed Stalin had been politically rehabilitated in 1923.
Consequent to personally spiteful disputes about the praxis of Leninism, the October Revolution veterans Lev Kamenev and Grigory Zinoviev said that the true threat to the ideological integrity of the party was Trotsky, who was a personally charismatic political leader as well as the commanding officer of the Red Army in the Russian Civil War and the revolutionary partner of the late Lenin. To thwart Trotsky's likely election to head the party, Stalin, Kamenev and Zinoviev formed a troika that featured Stalin as General Secretary, the "de facto" centre of power in the party and the country. The direction of the party was decided in confrontations of politics and personality between Stalin's troika and Trotsky over which Marxist policy to pursue, either Trotsky's policy of permanent revolution or Stalin's policy of socialism in one country. Trotsky's permanent revolution advocated rapid industrialisation, elimination of private farming and having the Soviet Union promote the spread of communist revolution abroad. Stalin's socialism in one country stressed moderation and development of positive relations between the Soviet Union and other countries to increase trade and foreign investment. To politically isolate and oust Trotsky from the party, Stalin expediently advocated socialism in one country, a policy to which he was indifferent. In 1925, the 14th Congress of the All-Union Communist Party (Bolsheviks) chose Stalin's policy, defeating Trotsky as a possible leader of the party and of the Soviet Union.
In the 1925–1927 period, Stalin dissolved the troika and disowned the centrists Kamenev and Zinoviev in favour of an expedient alliance with the three most right-wing Bolsheviks, namely Alexei Rykov (Premier of Russia, 1924–1929; Premier of the Soviet Union, 1924–1930), Nikolai Bukharin and Mikhail Tomsky (leader of the All-Russian Central Council of Trade Unions). In 1927, the party endorsed Stalin's policy of socialism in one country as the Soviet Union's national policy and expelled the leftist Trotsky and the centrists Kamenev and Zinoviev from the Politburo. In 1929, Stalin politically controlled the party and the Soviet Union by way of deception and administrative acumen. In that time, Stalin's centralised, socialism-in-one-country régime had negatively associated Lenin's revolutionary Bolshevism with Stalinism, i.e. government by command-policy to realise projects such as the rapid industrialisation of cities and the collectivisation of agriculture. Such Stalinism also subordinated the interests (political, national and ideological) of Asian and European communist parties to the geopolitical interests of the Soviet Union.
In the 1928–1932 period of the first five-year plan, Stalin effected the dekulakization of the farmlands of the Soviet Union, a politically radical dispossession of the kulak class of peasant-landlords from the Tsarist social order of monarchy. As Old Bolshevik revolutionaries, Bukharin, Rykov and Tomsky recommended amelioration of the dekulakization to lessen the negative social impact in the relations between the Soviet peoples and the party, but Stalin took umbrage and then accused them of uncommunist philosophical deviations from Lenin and Marx. That implicit accusation of ideological deviationism licensed Stalin to accuse Bukharin, Rykov and Tomsky of plotting against the party, and the appearance of impropriety then compelled the resignations of the Old Bolsheviks from government and from the Politburo. Stalin then completed his political purging of the party by exiling Trotsky from the Soviet Union in 1929. Afterwards, the political opposition to the practical régime of Stalinism was denounced as Trotskyism (Bolshevik–Leninism), described as a deviation from Marxism–Leninism, the state ideology of the Soviet Union.
Political developments in the Soviet Union included Stalin dismantling the remaining elements of democracy from the party by extending his control over its institutions and eliminating any possible rivals. The party's ranks grew in numbers as the party modified its organisation to include more trade unions and factories, and its ranks and files were populated with members from the trade unions and the factories, whom Stalin controlled because there were no other Old Bolsheviks to contradict Marxism–Leninism. In the late 1930s, the Soviet Union adopted the 1936 Soviet Constitution, which ended weighted-voting preferences for workers, promulgated universal suffrage for every man and woman older than 18 years of age and organised the soviets (councils of workers) into two legislatures, namely the Soviet of the Union (representing electoral districts) and the Soviet of Nationalities (representing the ethnic groups of the country). By 1939, with the exception of Stalin himself, none of the original Bolsheviks of the October Revolution of 1917 remained in the party. Unquestioning loyalty to Stalin was expected by the regime of all citizens.
Stalin exercised extensive personal control over the party and unleashed an unprecedented level of violence to eliminate any potential threat to his regime. While Stalin exercised major control over political initiatives, their implementation was in the control of localities, often with local leaders interpreting the policies in a way that served themselves best. This abuse of power by local leaders exacerbated the violent purges and terror campaigns carried out by Stalin against members of the party deemed to be traitors. With the Great Purge (1936–1938), Stalin rid himself of internal enemies in the party and rid the Soviet Union of any alleged socially dangerous and counterrevolutionary person who might have offered legitimate political opposition to Marxism–Leninism.
Stalin allowed the secret police NKVD (People's Commissariat for Internal Affairs) to rise above the law and the GPU (State Political Directorate) to use political violence to eliminate any person who might be a threat, whether real, potential or imagined. As an administrator, Stalin governed the Soviet Union by controlling the formulation of national policy, but he delegated implementation to subordinate functionaries. Such freedom of action allowed local communist functionaries much discretion to interpret the intent of orders from Moscow, but it also enabled their corruption. To Stalin, the correction of such abuses of authority and economic corruption was the responsibility of the NKVD. In the 1937–1938 period, the NKVD arrested 1.5 million people, purged from every stratum of Soviet society and every rank and file of the party, of whom 681,692 people were killed as enemies of the state. To provide manpower (manual, intellectual and technical) to realise the construction of socialism in one country, the NKVD established the Gulag system of forced-labour camps for regular criminals and political dissidents, for culturally insubordinate artists and politically incorrect intellectuals, and for homosexual people and religious anti-communists.
Beginning in 1928, Stalin's five-year plans for the national economy of the Soviet Union achieved the rapid industrialisation (coal, iron and steel, electricity and petroleum, among others) and the collectivisation of agriculture. It achieved 23.6% of collectivisation within two years (1930) and 98.0% of collectivisation within thirteen years (1941). As the revolutionary vanguard, the communist party organised Russian society to realise rapid industrialisation programs as defence against Western interference with socialism in Bolshevik Russia. The five-year plans were prepared in the 1920s whilst the Bolshevik government fought the internal Russian Civil War (1917–1922) and repelled the external Allied intervention in the Russian Civil War (1918–1925). Vast industrialisation was initiated, mostly with a focus on heavy industry.
During the 1930s, the rapid industrialisation of the country accelerated the Soviet people's sociological transition from poverty to relative plenty when politically illiterate peasants passed from Tsarist serfdom to self-determination and became politically aware urban citizens. The Marxist–Leninist economic régime modernised Russia from the illiterate, peasant society characteristic of monarchy to the literate, socialist society of educated farmers and industrial workers. Industrialisation led to a massive urbanisation in the country. Unemployment was virtually eliminated in the country during the 1930s.
Social developments in the Soviet Union included the abandonment of the relaxed social control and allowance for experimentation under Lenin in favour of Stalin's promotion of a rigid and authoritarian society based upon discipline, mixing traditional Russian values with Stalin's interpretation of Marxism. Organised religion was repressed, especially minority religious groups. Education was transformed. Under Lenin, the education system allowed relaxed discipline in schools based upon Marxist theory, but Stalin reversed this in 1934 with a conservative approach: the reintroduction of formal learning, the use of examinations and grades, the assertion of the full authority of the teacher and the introduction of school uniforms. Art and culture became strictly regulated under the principles of socialist realism, and the Russian traditions that Stalin admired were allowed to continue.
Soviet foreign policy from 1929 to 1941 underwent substantial changes. In 1933, the Marxist–Leninist geopolitical perspective was that the Soviet Union was surrounded by capitalist and anti-communist enemies. As a result, the election of Adolf Hitler and his Nazi Party government in Germany initially caused the Soviet Union to sever the diplomatic relations that had been established in the 1920s. In 1938, Stalin accommodated the Nazis and the anti-communist West by not defending Czechoslovakia, allowing Hitler's threat of pre-emptive war for the Sudetenland to annex the land and "rescue the oppressed German peoples" living in Czechoslovakia.
To challenge Nazi Germany's bid for European empire and hegemony, Stalin promoted anti-fascist front organisations to encourage European socialists and democrats to join the Soviet communists to fight throughout Nazi-occupied Europe, creating agreements with France to challenge Germany. After Germany and Britain signed the Munich Agreement (29 September 1938) which allowed the German occupation of Czechoslovakia (1938–1945), Stalin adopted pro-German policies for the Soviet Union's dealings with Nazi Germany. In 1939, the Soviet Union and Nazi Germany agreed to the Treaty of Non-aggression between Germany and the Union of Soviet Socialist Republics (Molotov–Ribbentrop Pact, 23 August 1939) and to jointly invade and partition Poland, by way of which Nazi Germany started the Second World War (1 September 1939).
In the 1941–1942 period of the Great Patriotic War, the German invasion of the Soviet Union (Operation Barbarossa, 22 June 1941) was ineffectively opposed by the Red Army, who were poorly led, ill-trained and under-equipped. As a result, they fought poorly and suffered great losses of soldiers (killed, wounded and captured). The weakness of the Red Army was partly a consequence of the Great Purge (1936–1938) of senior officers and career soldiers whom Stalin considered politically unreliable. Strategically, the Wehrmacht's extensive and effective attack threatened the territorial integrity of the Soviet Union, and the political integrity of Stalin's model of a Marxist–Leninist state, when the Nazis were initially welcomed as liberators by the anti-communist and nationalist populations in the Byelorussian Soviet Socialist Republic, the Georgian Soviet Socialist Republic and the Ukrainian Soviet Socialist Republic.
The anti-Soviet nationalists' collaboration with the Nazis lasted until the "Schutzstaffel" and the "Einsatzgruppen" began their "Lebensraum" killings of the Jewish populations, the local communists and the civil and community leaders, killings that constituted the Holocaust and were meant to realise the Nazi German colonisation of Bolshevik Russia. In response, Stalin ordered the Red Army to fight a total war against the Germanic invaders who would exterminate Slavic Russia. Hitler's attack against the Soviet Union (Nazi Germany's erstwhile ally) realigned Stalin's political priorities, from the repression of internal enemies to the existential defence against external attack. The pragmatic Stalin then brought the Soviet Union into the Grand Alliance, a common front against the Axis Powers (Nazi Germany, Fascist Italy and Imperial Japan).
In the continental European countries occupied by the Axis powers, the native communist party usually led the armed resistance (guerrilla warfare and urban guerrilla warfare) against fascist military occupation. In Mediterranean Europe, the communist Yugoslav Partisans led by Josip Broz Tito effectively resisted the German Nazi and Italian Fascist occupation. In the 1943–1944 period, the Yugoslav Partisans liberated territories with Red Army assistance and established the communist political authority that became the Socialist Federal Republic of Yugoslavia. To end the Imperial Japanese occupation of China in continental Asia, Stalin ordered Mao Zedong and the Communist Party of China to temporarily cease the Chinese Civil War (1927–1949) against Chiang Kai-shek and the anti-communist Kuomintang as the Second United Front in the Second Sino-Japanese War (1937–1945).
In 1943, the Red Army began to repel the Nazi invasion of the Soviet Union, especially at the Battle of Stalingrad (23 August 1942 – 2 February 1943) and at the Battle of Kursk (5 July – 23 August 1943). The Red Army then repelled the Nazi and Fascist occupation armies from Eastern Europe until the Red Army decisively defeated Nazi Germany in the Berlin Strategic Offensive Operation (16 April–2 May 1945). On concluding the Great Patriotic War (1941–1945), the Soviet Union was a military superpower with a say in determining the geopolitical order of the world. In accordance with the three-power Yalta Agreement (4–11 February 1945), the Soviet Union purged native fascist collaborators from the Eastern European countries occupied by the Axis Powers and installed native Communist governments.
Upon the Allied victory concluding the Second World War (1939–1945), the members of the Grand Alliance resumed their expediently suppressed, pre-war geopolitical rivalries and ideological tensions, and this disunity broke their anti-fascist wartime alliance into the anti-communist Western Bloc and the communist Eastern Bloc. The renewed competition for geopolitical hegemony resulted in the bi-polar Cold War (1945–1991), a protracted state of tension (military and diplomatic) between the United States and the Soviet Union which often threatened a Soviet–American nuclear war, but usually featured proxy wars in the Third World.
The events that precipitated the Cold War in Europe were the Soviet and the Yugoslav, Bulgarian and Albanian military interventions in the Greek Civil War (1944–1949) on behalf of the Communist Party of Greece, and the Berlin Blockade (1948–1949) by the Soviet Union. The event that precipitated the Cold War in continental Asia was the resumption of the Chinese Civil War (1927–1949) fought between the anti-communist Kuomintang and the Communist Party of China. After military defeat exiled Generalissimo Chiang Kai-shek and his Kuomintang nationalist government to Formosa island (Taiwan), Mao Zedong established the People's Republic of China on 1 October 1949.
In the late 1940s, the geopolitics of the Eastern Bloc countries under the hegemony of Stalinist Russia featured an official-and-personal style of socialist diplomacy that failed Stalin and Tito when Tito refused to subordinate Yugoslavia to the Soviet Union. In 1948, circumstance and cultural personality aggravated the matter into the Yugoslav–Soviet split (1948–1955), which resulted from Tito's rejection of Stalin's demand to subordinate the Socialist Federal Republic of Yugoslavia to the geopolitical agenda (economic and military) of the Soviet Union, i.e. to place Tito at Stalin's disposal. Stalin punished Tito's refusal by denouncing him as an ideological revisionist of Marxism–Leninism; by denouncing Yugoslavia's practice of Titoism as socialism deviated from the cause of world communism; and by expelling the Communist Party of Yugoslavia from the Communist Information Bureau (Cominform). The break from the Eastern Bloc allowed the development of a socialism with Yugoslav characteristics, which permitted doing business with the capitalist West to develop the socialist economy and the establishment of Yugoslavia's diplomatic and commercial relations with countries of the Eastern Bloc and the Western Bloc. Yugoslavia's international relations matured into the Non-Aligned Movement (1961) of countries without political allegiance to any power bloc.
At the death of Stalin in 1953, Nikita Khrushchev became leader of the Soviet Union and of the Communist Party of the Soviet Union and then consolidated an anti-Stalinist government. In a secret meeting at the 20th Congress of the Communist Party of the Soviet Union, Khrushchev denounced Stalin and Stalinism in the speech "On the Cult of Personality and Its Consequences" (25 February 1956), in which he specified and condemned Stalin's dictatorial excesses and abuses of power, such as the Great Purge (1936–1938) and the cult of personality. Khrushchev introduced the de-Stalinisation of the party and of the Soviet Union. He realised this by dismantling the Gulag archipelago of forced-labour camps and freeing the prisoners, as well as by allowing Soviet civil society greater political freedom of expression, especially for public intellectuals of the intelligentsia such as the novelist Aleksandr Solzhenitsyn, whose literature obliquely criticised Stalin and the Stalinist police state. De-Stalinisation also ended Stalin's national-purpose policy of socialism in one country, which was replaced with proletarian internationalism, by way of which Khrushchev re-committed the Soviet Union to permanent revolution to realise world communism. In that geopolitical vein, Khrushchev presented de-Stalinisation as the restoration of Leninism as the state ideology of the Soviet Union.
In the 1950s, the de-Stalinisation of the Soviet Union was ideological bad news for the People's Republic of China because Soviet and Russian interpretations and applications of Leninism and orthodox Marxism contradicted the Sinified Marxism–Leninism of Mao Zedong—his Chinese adaptations of Stalinist interpretation and praxis for establishing socialism in China. To realise that leap of Marxist faith in the development of Chinese socialism, the Communist Party of China developed Maoism as the official state ideology. As the specifically Chinese development of Marxism–Leninism, Maoism illuminated the cultural differences between the European-Russian and the Asian-Chinese interpretations and practical applications of Marxism–Leninism in each country. The political differences then provoked geopolitical, ideological and nationalist tensions, which derived from the different stages of development between the urban society of the industrialised Soviet Union and the agricultural society of pre-industrial China. The theory-versus-praxis arguments escalated to theoretic disputes about Marxist–Leninist revisionism and provoked the Sino-Soviet split (1956–1966), and the two countries broke their international relations (diplomatic, political, cultural and economic).
In Eastern Asia, the Cold War produced the Korean War (1950–1953), the first proxy war between the Eastern Bloc and the Western Bloc, which resulted from dual origins: the nationalist Koreans' post-war resumption of their Korean Civil War and an imperial war for regional hegemony sponsored by the United States and the Soviet Union. The international response to the North Korean invasion of South Korea was realised by the United Nations Security Council, which voted for war in the absence of the Soviet Union and authorised an international military expedition to intervene, expel the northern invaders from the south of Korea and restore the geopolitical "status quo ante" of the Soviet and American division of Korea at the 38th Parallel. Consequent to Chinese military intervention on behalf of North Korea, the infantry warfare reached an operational and geographic stalemate (July 1951–July 1953). Afterwards, the shooting war was ended with the Korean Armistice Agreement (27 July 1953), and the superpower Cold War in Asia then resumed along the Korean Demilitarized Zone.
Consequent to the Sino-Soviet split, a pragmatic China established a politics of détente with the United States in an effort to publicly challenge the Soviet Union for leadership of the international Marxist–Leninist movement. Mao Zedong's pragmatism permitted geopolitical rapprochement and facilitated President Richard Nixon's 1972 visit to China, which subsequently ended the policy of the existence of Two Chinas when the United States sponsored the People's Republic of China to replace the Republic of China (Taiwan) as the representative of the Chinese people at the United Nations. In the due course of Sino-American rapprochement, China also assumed membership in the Security Council of the United Nations. In the post-Mao period of Sino-American détente, the Deng Xiaoping government (1982–1987) effected policies of economic liberalisation that allowed continual growth for the Chinese economy. The ideological justification is socialism with Chinese characteristics, the Chinese adaptation of Marxism–Leninism.
Communist revolution erupted in the Americas in this period, including revolutions in Bolivia, Cuba, El Salvador, Grenada, Nicaragua, Peru and Uruguay. The Cuban Revolution (1953–1959) led by Fidel Castro and Che Guevara deposed the military dictatorship (1952–1959) of Fulgencio Batista and established the Republic of Cuba, a state formally recognised by the Soviet Union. In response, the United States attempted to overthrow the Castro government in 1961, but the CIA's unsuccessful Bay of Pigs invasion (17 April 1961) by anti-communist Cuban exiles impelled the Republic of Cuba to side with the Soviet Union in the geopolitics of the bipolar Cold War. The Cuban missile crisis (22–28 October 1962) occurred when the United States opposed Cuba being armed with nuclear missiles by the Soviet Union. After a stalemate confrontation, the United States and the Soviet Union jointly resolved the nuclear-missile crisis by respectively removing United States missiles from Turkey and Italy and Soviet missiles from Cuba.
Bolivia, Canada and Uruguay each faced Marxist–Leninist revolution in the 1960s and 1970s. In Bolivia, Che Guevara led a guerrilla campaign until he was killed there by government forces in 1967. In 1970, the October Crisis (5 October – 28 December 1970) occurred in Canada, a brief revolt in the province of Quebec in which the Marxist–Leninist and separatist Quebec Liberation Front (FLQ) kidnapped James Cross, the British Trade Commissioner in Canada, and killed Pierre Laporte, a Quebec government minister. The political manifesto of the FLQ condemned English-Canadian imperialism in French Quebec and called for an independent, socialist Quebec. The Canadian government's harsh response included the suspension of civil liberties in Quebec and compelled the FLQ leaders' flight to Cuba. In Uruguay, the Tupamaros movement waged Marxist–Leninist revolution from the 1960s to the 1970s.
In 1979, the Sandinista National Liberation Front (FSLN) led by Daniel Ortega won the Nicaraguan Revolution (1961–1990) against the government of Anastasio Somoza Debayle (1 December 1974 – 17 July 1979) and established a socialist Nicaragua. The government of Ronald Reagan then sponsored the counter-revolutionary Contras in the secret Contra War (1979–1990) against the Sandinista government. In 1989, the Contra War concluded with the signing of the Tela Accord at the port of Tela, Honduras. The Tela Accord required the subsequent, voluntary demobilisation of the Contra guerrilla armies and the FSLN army. In 1990, a second national election installed a majority of non-Sandinista political parties in government, to whom the FSLN handed political power. Since 2006, the FSLN has returned to government, winning every legislative and presidential election in the process (2006, 2011 and 2016).
The Salvadoran Civil War (1979–1992) featured the popularly supported Farabundo Martí National Liberation Front, an organisation of left-wing parties fighting against the right-wing military government of El Salvador. In 1983, the United States invasion of Grenada (25–29 October 1983) ended the rule of the New Jewel Movement (1973–1983), a Marxist–Leninist vanguard party led by Maurice Bishop.
In Asia, the Vietnam War (1945–1975) was the second East–West war fought during the Cold War (1945–1991). In the First Indochina War (1946–1954), the Việt Minh led by Ho Chi Minh defeated the French re-establishment of European colonialism in Vietnam. To fill the geopolitical power vacuum caused by the French defeat in southeast Asia, the United States then became the Western power supporting the client state of the Republic of Vietnam (1955–1975), headed by Ngo Dinh Diem. Despite possessing military superiority, the United States failed to safeguard South Vietnam from the guerrilla warfare of the Viet Cong sponsored by North Vietnam. On 30 January 1968, North Vietnam launched the Tet Offensive (the General Offensive and Uprising of Tet Mau Than, 1968). Although a military failure for the guerrillas and the army, it was a successful psychological warfare operation that decisively turned international public opinion against the United States intervention in the Vietnamese civil war, leading to the military withdrawal of the United States from Vietnam in 1973 and the subsequent Fall of Saigon to the North Vietnamese army on 30 April 1975.
With the end of the Vietnam War, Marxist–Leninist regimes were established in Vietnam's neighbouring states of Kampuchea and Laos. Consequent to the Cambodian Civil War (1968–1975), a coalition composed of Prince Norodom Sihanouk (1941–1955), the native Cambodian Marxist–Leninists and the Maoist Khmer Rouge (1951–1999) led by Pol Pot established Democratic Kampuchea (1975–1982), a Marxist–Leninist state that featured class warfare to restructure the society of old Cambodia, a restructuring to be effected and realised with the abolition of money and private property, the outlawing of religion, the killing of the intelligentsia and compulsory manual labour for the middle classes, by way of death-squad state terrorism. To eliminate Western cultural influence, Kampuchea expelled all foreigners and effected the destruction of the urban bourgeoisie of old Cambodia, first by displacing the population of the capital city, Phnom Penh, and then by displacing the national populace to work farmlands to increase food supplies. Meanwhile, the Khmer Rouge purged Kampuchea of internal enemies (social-class, political, cultural and ethnic) at the Killing Fields, killings whose scope constituted crimes against humanity for the deaths of 2,700,000 people by mass murder and genocide. That social restructuring of Cambodia into Kampuchea included attacks against the Vietnamese ethnic minority of the country, which aggravated the historical ethnic rivalries between the Viet and the Khmer peoples. Beginning in September 1977, Kampuchea and the Socialist Republic of Vietnam continually engaged in border clashes. In 1978, Vietnam invaded Kampuchea and captured Phnom Penh in January 1979, deposed the Maoist Khmer Rouge from government and established the Kampuchean United Front for National Salvation as the government of Cambodia.
A new front of Marxist–Leninist revolution erupted in Africa between 1961 and 1979. Angola, Benin, Congo, Ethiopia, Guinea-Bissau, Mozambique, Somalia and Zimbabwe became Marxist–Leninist states governed by their respective native peoples during the 1968–1980 period. Marxist–Leninist guerrillas fought the Portuguese Colonial War (1961–1974) in three countries, namely Angola, Guinea-Bissau and Mozambique. In Ethiopia, a Marxist–Leninist revolution deposed the monarchy of Emperor Haile Selassie (1930–1974) and established the Derg government (1974–1987) of the Provisional Military Government of Socialist Ethiopia. In Rhodesia (1965–1979), Robert Mugabe led the Zimbabwe War of Liberation (1964–1979) that deposed white-minority rule and then established the Republic of Zimbabwe.
In Apartheid South Africa (1948–1994), the Afrikaner government of the National Party caused much geopolitical tension between the United States and the Soviet Union because of the Afrikaners' violent social control and political repression of the black and coloured populations of South Africa, exercised under the guise of anti-communism and national security. The Soviet Union officially supported the overthrow of apartheid, while the West, and the United States in particular, maintained official neutrality on the matter. In the 1976–1977 period of the Cold War, the United States and other Western countries found it morally untenable to politically support Apartheid South Africa. This was especially so after the Afrikaner government killed 176 people (students and adults) in the police suppression of the Soweto uprising (June 1976), a political protest against Afrikaner cultural imperialism upon the non-white peoples of South Africa, specifically the imposition of the Germanic language of Afrikaans as the standard language for education, which black South Africans were required to speak when addressing white people and Afrikaners; and after the police assassination of Steve Biko (September 1977), a politically moderate leader of the internal resistance to apartheid in South Africa.
Under President Jimmy Carter, the West joined the Soviet Union and others in enacting sanctions against the trade in weapons and weapons-grade material with South Africa. However, forceful actions by the United States against Apartheid South Africa diminished under President Reagan, as the Reagan administration feared the rise of revolution in South Africa, as had happened in Zimbabwe against white-minority rule. In 1979, the Soviet Union intervened in Afghanistan to establish a Marxist–Leninist state there, although the act was seen as an invasion by the West, which responded to the Soviet military actions by boycotting the 1980 Moscow Olympics and providing clandestine support to the Mujahideen, including Osama bin Laden, as a means to challenge the Soviet Union. The war became the Soviet equivalent of the United States' Vietnam War and remained a stalemate throughout the 1980s.
Social resistance to the policies of Marxist–Leninist regimes in Eastern Europe accelerated with the rise of Solidarity, the first trade union in the Warsaw Pact not controlled by a Marxist–Leninist party, which was formed in the People's Republic of Poland in 1980.
In 1985, Mikhail Gorbachev rose to power in the Soviet Union and began policies of radical political reform involving political liberalisation, called perestroika and glasnost. Gorbachev's policies were aimed at dismantling the authoritarian elements of the state that had been developed by Stalin, seeking a return to a supposed ideal Leninist state that retained a one-party structure while allowing the democratic election of competing candidates within the party for political office. Gorbachev also aimed to seek détente with the West and to end the Cold War, which the Soviet Union could no longer afford to pursue economically. The Soviet Union and the United States under President George H. W. Bush joined in pushing for the dismantlement of apartheid and oversaw the dismantlement of South African colonial rule over Namibia.
Meanwhile, the eastern European Marxist–Leninist states politically deteriorated in response to the success of the Polish Solidarity movement and the possibility of Gorbachev-style political liberalisation. In 1989, revolts began across Eastern Europe and China against Marxist–Leninist regimes. In China, the government refused to negotiate with student protestors, resulting in the Tiananmen Square massacre that stopped the revolts by force. The revolts culminated in East Germany with the uprising against the Marxist–Leninist regime of Erich Honecker and demands for the Berlin Wall to be torn down. The event in East Germany developed into a popular mass revolt, with sections of the Berlin Wall being torn down and East and West Berliners uniting. Gorbachev's refusal to use Soviet forces based in East Germany to suppress the revolt was seen as a sign that the Cold War had ended. Honecker was pressured to resign from office and the new government committed itself to reunification with West Germany. The Marxist–Leninist regime of Nicolae Ceaușescu in Romania was forcefully overthrown in 1989 and Ceaușescu was executed. The other Warsaw Pact regimes also fell during the Revolutions of 1989, with the exception of the Socialist People's Republic of Albania, which continued until 1992.
Unrest and eventual collapse of Marxism–Leninism also occurred in Yugoslavia, although for different reasons than those of the Warsaw Pact. The death of Josip Broz Tito in 1980 and the subsequent vacuum of strong leadership allowed the rise of rival ethnic nationalism in the multinational country. The first leader to exploit such nationalism for political purposes was Slobodan Milošević, who used it to seize power as president of Serbia and demanded concessions to Serbia and Serbs by the other republics in the Yugoslav federation. This resulted in a surge of Slovene and Croat nationalism in response and the collapse of the League of Communists of Yugoslavia in 1990, the victory of nationalists in multi-party elections in most of Yugoslavia's constituent republics and eventually civil war between the various nationalities beginning in 1991. Yugoslavia was dissolved in 1992.
The Soviet Union itself collapsed between 1990 and 1991, with a rise of secessionist nationalism and a political power dispute between Gorbachev and Boris Yeltsin, the new leader of the Russian Federation. With the Soviet Union collapsing, Gorbachev prepared the country to become a loose federation of independent states called the Commonwealth of Independent States. Hardline Marxist–Leninist leaders reacted to Gorbachev's policies with the August Coup of 1991, in which they deposed Gorbachev and seized control of the government. The regime lasted only briefly, as widespread popular opposition erupted in street protests and refused to submit. Gorbachev was restored to power, but the various Soviet republics were now set for independence. On 25 December 1991, Gorbachev officially announced the dissolution of the Soviet Union, ending the existence of the world's first Marxist–Leninist-led state.
Since the fall of the Eastern European Marxist–Leninist regimes, the Soviet Union and a variety of African Marxist–Leninist regimes, only a few Marxist–Leninist parties have remained in power, namely in China, Cuba, Laos and Vietnam. Most Marxist–Leninist communist parties outside these nations have fared relatively poorly in elections, although some parties have remained or become a relatively strong force. In Russia, the Communist Party of the Russian Federation has remained a significant political force, winning the 1995 legislative election, almost winning the 1996 presidential election and generally remaining the second most popular party. In Ukraine, the Communist Party of Ukraine has also exerted influence, governing the country after the 1994 parliamentary election and again after the 2006 parliamentary election. However, the 2014 parliamentary election, following the Russian invasion of Ukraine and the annexation of Crimea, cost the party its 32 seats and left it with no parliamentary representation.
In Europe, several Marxist–Leninist parties remain strong. In Cyprus, Dimitris Christofias of AKEL won the 2008 presidential election. AKEL has consistently been between the first and third most popular party, winning the 1970, 1981, 2001 and 2006 legislative elections. In the Czech Republic and Portugal, the Communist Party of Bohemia and Moravia and the Portuguese Communist Party were the second and fourth most popular parties until the 2017 and 2009 legislative elections, respectively. Since 2017, the Communist Party of Bohemia and Moravia has supported the ANO 2011–ČSSD minority government, while the Portuguese Communist Party provided confidence and supply, along with the Ecologist Party "The Greens" and the Left Bloc, to the Socialist minority government from 2015 to 2019. In Greece, the Communist Party of Greece participated in an interim government and later a national unity government between 1989 and 1990, consistently remaining the third or fourth most popular party. In Moldova, the Party of Communists of the Republic of Moldova won the 2001, 2005 and April 2009 parliamentary elections. However, the results of the April 2009 Moldovan elections were protested and another round was held in July, resulting in the formation of the Alliance for European Integration. After the parliament failed to elect a president, new parliamentary elections were held in November 2010, which resulted in roughly the same representation in the parliament. According to Ion Marandici, a Moldovan political scientist, the Party of Communists differs from those in other countries because it managed to appeal to the ethnic minorities and the anti-Romanian Moldovans. After tracing the adaptation strategy of the party, he found confirming evidence for five of the factors contributing to its electoral success already mentioned in the theoretical literature on former Marxist–Leninist parties, namely the economic situation, the weakness of the opponents, the electoral laws, the fragmentation of the political spectrum and the legacy of the old regime. However, Marandici identified seven additional explanatory factors at work in the Moldovan case, namely the foreign support for certain political parties, separatism, the appeal to the ethnic minorities, the alliance-building capacity, the reliance on the Soviet notion of the Moldovan identity, the state-building process and the control over a significant portion of the media. It is due to these seven additional factors that the party managed to consolidate and expand its constituency. Among the post-Soviet states, the Party of Communists is the only communist party to have been in power for so long without changing the party's name.
In Asia, a number of Marxist–Leninist regimes and movements continue to exist. The People's Republic of China has continued the agenda of Deng Xiaoping's 1980s reforms by initiating significant privatisation of the national economy. At the same time, no corresponding political liberalisation has occurred, as happened in previous years in Eastern European countries. The Naxalite–Maoist insurgency, waged by various Marxist–Leninist movements against the governments of Bangladesh and India, has continued unabated since the 1960s. In India, the Manmohan Singh government depended on the parliamentary support of the Communist Party of India (Marxist), which has led state governments in Kerala, Tripura and West Bengal. The armed wing of the Communist Party of India (Maoist) has been fighting a war against the government of India since 1967 and is still active in half the country. Maoist rebels in Nepal engaged in a civil war from 1996 to 2006 that managed to topple the monarchy there and create a republic. Communist Party of Nepal (Unified Marxist–Leninist) leader Man Mohan Adhikari briefly became prime minister and national leader from 1994 to 1995, and the Maoist guerrilla leader Prachanda was elected prime minister by the Constituent Assembly of Nepal in 2008. Prachanda has since been deposed as prime minister. The Maoists, who consider Prachanda's removal to be unjust, have abandoned their legalistic approach and returned to street actions and militancy, leading sporadic general strikes by using their substantial influence on the Nepalese labour movement. These actions have oscillated between mild and intense. In the Philippines, the Maoist-oriented Communist Party of the Philippines and its armed wing, the New People's Army, have been waging armed revolution against the existing Philippine government since 1968 and are still engaged in a low-scale guerrilla insurgency.
In Africa, several Marxist–Leninist states reformed themselves and maintained power. In South Africa, the South African Communist Party is a member of the Tripartite Alliance alongside the African National Congress and the Congress of South African Trade Unions. The Economic Freedom Fighters is a pan-African, Marxist–Leninist party founded in 2013 by Julius Malema, the expelled former president of the African National Congress Youth League, and his allies. Sri Lanka has had Marxist–Leninist ministers in its national governments. In Zimbabwe, former President Robert Mugabe of the Zimbabwe African National Union – Patriotic Front, the country's long-standing leader, was a professed Marxist–Leninist.
In the Americas, there have been several insurgencies. In North America, the Revolutionary Communist Party, USA, led by its chairman Bob Avakian, organises for a revolution to overthrow the capitalist system and replace it with a socialist state. In South America, Colombia has been in the midst of a civil war waged since 1964 between the Colombian government and aligned right-wing paramilitaries on one side and two Marxist–Leninist guerrilla groups, the National Liberation Army and the Revolutionary Armed Forces of Colombia, on the other. In Peru, there has been an internal conflict between the Peruvian government and Marxist–Leninist–Maoist militants such as the Shining Path.
The goal of Marxist–Leninist political economy is the emancipation of men and women from the dehumanisation caused by mechanistic work that is psychologically alienating (without work–life balance) and that is performed in exchange for wages granting only limited financial access to the material necessities of life (i.e. food and shelter). That personal and societal emancipation from poverty (material necessity) would maximise individual liberty by enabling men and women to pursue their interests and innate talents (artistic, industrial and intellectual) whilst working by choice, without the economic coercion of poverty. In the communist society of upper-stage economic development, the elimination of alienating labour (mechanistic work) depends upon the development of high technology that improves the means of production and the means of distribution. To meet the material needs of a socialist society, the state uses a planned economy to co-ordinate the means of production and of distribution to supply and deliver the goods and services required throughout society and the national economy. The state serves as a safeguard for the ownership of the means of production and as the coordinator of production through a universal economic plan.
For the purpose of reducing waste and increasing efficiency, scientific planning replaces market mechanisms and price mechanisms as the guiding principle of the economy. The state's huge purchasing power replaces the role of market forces, with macroeconomic equilibrium not being achieved through market forces but by economic planning based on scientific assessment. The wages of the worker are determined according to the type of skills and the type of work he or she can perform within the national economy. Moreover, the economic value of the goods and services produced is based upon their use value (as material objects) and not upon the cost of production (value) or the exchange value (marginal utility). The profit motive as a driving force for production is replaced by social obligation to fulfil the economic plan. Wages are set and differentiated according to skill and intensity of work. While socially utilised means of production are under public control, personal belongings or property of a personal nature that does not involve mass production of goods remains unaffected by the state.
Because Marxism–Leninism has historically been the state ideology of countries that were economically undeveloped prior to socialist revolution, or whose economies were nearly obliterated by war such as the German Democratic Republic, the primary goal before achieving communism was the development of socialism in itself. Such was the case in the Soviet Union, where the economy was largely agrarian and urban industry was in a primitive stage. To develop socialism, the Soviet Union underwent rapid industrialisation with pragmatic programs of social engineering that transplanted peasant populations to the cities, where they were educated and trained as industrial workers and then became the workforce of the new factories and industries. Likewise, the farmer populations worked the system of collective farms to grow food to feed the industrial workers in the industrialised cities. Since the mid-1930s, Marxism–Leninism has advocated an austere social equality based upon asceticism, egalitarianism and self-sacrifice. In the 1920s, the Bolshevik party had semi-officially allowed some limited, small-scale wage inequality to boost labour productivity in the economy of the Soviet Union. These reforms were promoted to encourage materialism and acquisitiveness in order to stimulate economic growth. This pro-consumerist policy was advanced along the lines of industrial pragmatism, as it advances economic progress through the bolstering of industrialisation.
In the economic praxis of Bolshevik Russia, there was a defining difference of political economy between socialism and communism. Lenin explained their conceptual similarity to Marx's descriptions of the lower-stage and the upper-stage of economic development: namely, that immediately after a proletarian revolution, the practical economy of the socialist lower-stage society must be based upon the individual labour contributed by men and women; and that paid labour would be the basis of the communist upper-stage society that has realised the social precept of the slogan "From each according to his ability, to each according to his needs".
Marxism–Leninism aims to create an international communist society. It opposes colonialism and imperialism and advocates decolonisation and anti-colonial forces. It supports anti-fascist international alliances and has advocated the creation of popular fronts between communist and non-communist anti-fascists against strong fascist movements. This Marxist–Leninist approach to international relations derives from the analyses (political, economic, sociological and geopolitical) that Lenin presented in the essay "Imperialism, the Highest Stage of Capitalism" (1917). Lenin extrapolated from five philosophical bases of Marxism: that human history is the history of class struggle between a ruling class and an exploited class; that capitalism creates antagonistic social classes, i.e. the bourgeois exploiters and the exploited proletariat; that capitalism employs nationalist war to further private economic expansion; that socialism is an economic system that voids social classes through public ownership of the means of production and so will eliminate the economic causes of war; and that once the state (socialist or communist) withers away, so shall international relations wither away, because they are projections of national economic forces. From these bases, Lenin said that the capitalists' exhaustion of domestic sources of investment profit, by way of price-fixing trusts and cartels, prompts the same capitalists to export investment capital to undeveloped countries to finance the exploitation of natural resources and native populations and to create new markets. Moreover, the capitalists' control of national politics ensures the government's military safeguarding of colonial investments, and the consequent imperial competition for economic supremacy provokes international wars to protect their national interests.
In the vertical perspective (social-class relations) of Marxism–Leninism, the internal and international affairs of a country are a political continuum, not separate realms of human activity. This is the philosophic opposite of the horizontal perspectives (country-to-country) of the liberal and the realist approaches to international relations. Colonial imperialism is the inevitable consequence in the course of economic relations among countries when the domestic price-fixing of monopoly capitalism has voided profitable competition in the capitalist homeland. The ideology of New Imperialism, rationalised as a civilising mission, allowed the exportation of high-profit investment capital to undeveloped countries with uneducated native populations (sources of cheap labour), plentiful raw materials for exploitation (factors for manufacture) and a colonial market to consume the surplus production which the capitalist homeland cannot consume. An example is the European Scramble for Africa (1881–1914), in which imperialism was safeguarded by the national military.
To secure its economic and settler colonies, which are foreign sources of new capital-investment profit, the imperialist state seeks either political or military control of their limited resources (natural and human). The First World War (1914–1918) resulted from such geopolitical conflicts among the empires of Europe over colonial spheres of influence. For the colonised working classes who create the wealth (goods and services), the elimination of war for natural resources (access, control and exploitation) is achieved by overthrowing the militaristic capitalist state and establishing a socialist state, because a peaceful world economy is feasible only through proletarian revolutions that overthrow systems of political economy based upon the exploitation of labour.
Marxism–Leninism supports the creation of a one-party state led by a communist party as a means to develop socialism and then communism. The political structure of the Marxist–Leninist state involves the rule of a communist vanguard party over a revolutionary socialist state that represents the will and rule of the proletariat. Through the policy of democratic centralism, the communist party is the supreme political institution of the Marxist–Leninist state.
In Marxism–Leninism, elections are held for all positions within the legislative structure, municipal councils, national legislatures and presidencies. In most Marxist–Leninist states, this has taken the form of directly electing representatives to fill positions, although in some states, such as the People's Republic of China, the Republic of Cuba and the Socialist Federal Republic of Yugoslavia, this system has also included indirect elections, such as deputies being elected by the deputies of the next lower level of government. Marxism–Leninism asserts that society is united upon common interests represented through the communist party and other institutions of the Marxist–Leninist state.
Marxism–Leninism supports universal social welfare. The Marxist–Leninist state provides for the national welfare with universal healthcare, free public education (academic, technical and professional) and the social benefits (childcare and continuing education) necessary to increase the productivity of the workers and of the socialist economy in order to develop a communist society. As part of the planned economy, the Marxist–Leninist state is meant to develop the proletariat's universal education (academic and technical) and their class consciousness (political education) to facilitate their contextual understanding of the historical development of communism as presented in Marx's theory of history.
Marxism–Leninism supports the emancipation of women and the ending of the exploitation of women. Marxist–Leninist policy on family law has typically involved the elimination of the political power of the bourgeoisie, the abolition of private property and an education that teaches citizens to abide by a disciplined and self-fulfilling lifestyle dictated by the social norms of communism as a means to establish a new social order. The judicial reformation of family law eliminates patriarchy from the legal system, which facilitates the political emancipation of women from traditional social inferiority and economic exploitation. The reformation of civil law made marriage secular, a "free and voluntary union" between persons who are social and legal equals; facilitated divorce; legalised abortion; eliminated bastardy ("illegitimate children"); and voided the political power of the bourgeoisie and the private-property status of the means of production. The educational system imparts the social norms for a self-disciplined and self-fulfilling way of life, by which the socialist citizens establish the social order necessary for realising a communist society. With the advent of a classless society and the abolition of private property, society collectively assumes many of the roles traditionally assigned to mothers and wives, with women becoming integrated into industrial work. This has been promoted by Marxism–Leninism as the means to achieve women's emancipation.
Marxist–Leninist cultural policy modernises social relations among citizens by eliminating the capitalist value system of traditionalist conservatism, by which Tsarism classified, divided and controlled people with stratified social classes without any socio-economic mobility. It focuses upon modernisation and distancing society from the past, the bourgeoisie and the old intelligentsia. The socio-cultural changes required for establishing a communist society are realised with education and agitprop (agitation and propaganda) which reinforce communal and communist values. The modernisation of educational and cultural policies eliminates the societal atomisation, including anomie and social alienation, caused by cultural backwardness. Marxism–Leninism develops the New Soviet man, an educated and cultured citizen possessed of a proletarian class consciousness who is oriented towards the social cohesion necessary for developing a communist society as opposed to the antithetic bourgeois individualist associated with social atomisation.
The Marxist–Leninist worldview is atheist, wherein all human activity results from human volition and not the will of supernatural beings (gods, goddesses and demons) who have direct agency in the public and private affairs of human society. The tenets of the Soviet Union's national policy of Marxist–Leninist atheism originated from the philosophies of Georg Wilhelm Friedrich Hegel (1770–1831) and Ludwig Feuerbach (1804–1872) as well as that of Karl Marx (1818–1883) and Vladimir Lenin (1870–1924).
As a basis of Marxism–Leninism, the philosophy of materialism (the physical universe exists independently of human consciousness) is applied as dialectical materialism (a philosophy of science and nature) to examine the socio-economic relations among people and things as parts of a dynamic, material world that is unlike the immaterial world of metaphysics. Soviet astrophysicist Vitaly Ginzburg said that ideologically the "Bolshevik communists were not merely atheists, but, according to Lenin's terminology, militant atheists" in excluding religion from the social mainstream, from education and from government.
Marxism–Leninism has been widely criticised due to its relations with Stalinism, the Soviet Union and state repression in Marxist–Leninist states. Classical and orthodox Marxists were critical of Stalin's political economy and single-party government in the Soviet Union.
Italian left communist Amadeo Bordiga dismissed Marxism–Leninism as political opportunism that preserved capitalism, citing the claim that the exchange of commodities would occur under socialism and the use of popular front organisations by the Communist International, and he argued that a political vanguard organised by organic centralism was politically more effective than a vanguard organised by democratic centralism. American Marxist Raya Dunayevskaya dismissed Marxism–Leninism as a type of state capitalism, because state ownership of the means of production is a form of state capitalism and because single-party rule is undemocratic, whereas the dictatorship of the proletariat is democratic. She further argued that the doctrine is neither Marxism nor Leninism, but rather a composite ideology that Stalin used to expediently determine what is communism and what is not communism for the countries of the Eastern Bloc.
Trotskyists claim that Marxism–Leninism led to the establishment of a degenerated workers' state. Others such as philosopher Eric Voegelin claim that Marxism–Leninism is in its core inherently oppressive, arguing that the "Marxian vision dictated the Stalinist outcome not because the communist utopia was inevitable but because it was impossible". Criticism like this has itself been criticised for philosophical determinism, i.e. that the negative events in the movement's history were predetermined by their convictions. Historian Robert Vincent Daniels argues that Marxism was used to "justify Stalinism, but it was no longer allowed to serve either as a policy directive or an explanation of reality" during Stalin's rule. In complete contrast, E. Van Ree argues that Stalin continued to be in "general agreement" with the classical works of Marxism until his death.
|
https://en.wikipedia.org/wiki?curid=20972
|
Mikhail Gorbachev
Mikhail Sergeyevich Gorbachev (born 2 March 1931) is a Russian and formerly Soviet politician. The eighth and last leader of the Soviet Union, he was the general secretary of the Communist Party of the Soviet Union from 1985 until 1991. He was also the country's head of state from 1988 until 1991, serving as the chairman of the Presidium of the Supreme Soviet from 1988 to 1989, chairman of the Supreme Soviet from 1989 to 1990, and president of the Soviet Union from 1990 to 1991. Ideologically, he initially adhered to Marxism–Leninism, although by the early 1990s he had moved toward social democracy.
Of mixed Russian and Ukrainian heritage, Gorbachev was born in Privolnoye, Stavropol Krai, to a poor peasant family. Growing up under the rule of Joseph Stalin, in his youth he operated combine harvesters on a collective farm before joining the Communist Party, which then governed the Soviet Union as a one-party state according to Marxist-Leninist doctrine. While studying at Moscow State University, he married fellow student Raisa Titarenko in 1953 prior to receiving his law degree in 1955. Moving to Stavropol, he worked for the Komsomol youth organization and, after Stalin's death, became a keen proponent of the de-Stalinization reforms of Soviet leader Nikita Khrushchev. He was appointed the First Party Secretary of the Stavropol Regional Committee in 1970, in which position he oversaw construction of the Great Stavropol Canal. In 1978 he returned to Moscow to become a Secretary of the party's Central Committee and in 1979 joined its governing Politburo. Within three years of the death of Soviet leader Leonid Brezhnev, following the brief regimes of Yuri Andropov and Konstantin Chernenko, the Politburo elected Gorbachev as General Secretary, the "de facto" head of government, in 1985.
Although committed to preserving the Soviet state and to its socialist ideals, Gorbachev believed significant reform was necessary, particularly after the 1986 Chernobyl disaster. He withdrew from the Soviet–Afghan War and embarked on summits with United States President Ronald Reagan to limit nuclear weapons and end the Cold War. Domestically, his policy of "glasnost" ("openness") allowed for enhanced freedom of speech and press, while his "perestroika" ("restructuring") sought to decentralize economic decision making to improve efficiency. His democratization measures and formation of the elected Congress of People's Deputies undermined the one-party state. Gorbachev declined to intervene militarily when various Eastern Bloc countries abandoned Marxist-Leninist governance in 1989–90. Internally, growing nationalist sentiment threatened to break up the Soviet Union, leading Marxist-Leninist hardliners to launch the unsuccessful August Coup against Gorbachev in 1991. In the wake of this, the Soviet Union dissolved against Gorbachev's wishes and he resigned. After leaving office, he launched his Gorbachev Foundation, became a vocal critic of Russian Presidents Boris Yeltsin and Vladimir Putin, and campaigned for Russia's social-democratic movement.
Widely considered one of the most significant figures of the second half of the 20th century, Gorbachev remains the subject of controversy. The recipient of a wide range of awards—including the Nobel Peace Prize—he was widely praised for his pivotal role in ending the Cold War, curtailing human rights abuses in the Soviet Union, and tolerating both the fall of Marxist–Leninist administrations in eastern and central Europe and the reunification of Germany. Conversely, in Russia he is often derided for not stopping the Soviet collapse, an event which brought a decline in Russia's global influence and precipitated an economic crisis.
Gorbachev was born on 2 March 1931 in the village of Privolnoye, Stavropol Krai, then in the Russian Soviet Federative Socialist Republic, one of the constituent republics of the Soviet Union. At the time, Privolnoye was divided almost evenly between ethnic Russians and ethnic Ukrainians. Gorbachev's paternal family were ethnic Russians and had moved to the region from Voronezh several generations before; his maternal family were of ethnic Ukrainian heritage and had migrated from Chernigov.
His parents named him Victor, but at the insistence of his mother—a devout Orthodox Christian—he had a secret baptism, where his grandfather christened him Mikhail. His relationship with his father, Sergey Andreyevich Gorbachev, was close; his mother, Maria Panteleyevna Gorbacheva (née Gopkalo), was colder and more punitive. His parents were poor and lived as peasants. They had married as teenagers in 1928, and in keeping with local tradition had initially resided in Sergey's father's house, an adobe-walled hut, before a hut of their own could be built.
The Soviet Union was a one-party state governed by the Communist Party, and during Gorbachev's childhood was under the leadership of Joseph Stalin. Stalin had initiated a project of mass rural collectivization which, in keeping with his Marxist-Leninist ideas, he believed would help convert the country into a socialist society. Gorbachev's maternal grandfather joined the Communist Party and helped form the village's first kolkhoz (collective farm) in 1929, becoming its chair. This farm was outside Privolnoye village and when he was three years old, Gorbachev left his parental home and moved into the kolkhoz with his maternal grandparents.
The country was then experiencing the famine of 1932–33, in which two of Gorbachev's paternal uncles and an aunt died. This was followed by the Great Purge, in which individuals accused of being "enemies of the people"—including those sympathetic to rival interpretations of Marxism like Trotskyism—were arrested and interned in labor camps, if not executed. Both of Gorbachev's grandfathers were arrested—his maternal in 1934 and his paternal in 1937—and both spent time in Gulag labor camps prior to being released. After his December 1938 release, Gorbachev's maternal grandfather discussed having been tortured by the secret police, an account that influenced the young boy.
Following the outbreak of the Second World War in 1939, the German Army invaded the Soviet Union in June 1941. German forces occupied Privolnoye for four and a half months in 1942. Gorbachev's father had joined the Soviet Red Army and fought on the frontlines; he was wrongly declared dead during the conflict and fought in the Battle of Kursk before returning to his family, injured. After Germany was defeated, Gorbachev's parents had their second son, Aleksandr, in 1947; he and Mikhail would be their only children.
The village school had closed during much of the war but re-opened in autumn 1944. Gorbachev did not want to return but when he did he excelled academically. He read voraciously, moving from the Western novels of Thomas Mayne Reid to the work of Vissarion Belinsky, Alexander Pushkin, Nikolai Gogol, and Mikhail Lermontov. In 1946, he joined Komsomol, the Soviet political youth organization, becoming leader of his local group and then being elected to the Komsomol committee for the district. From primary school he moved to the high school in Molotovskoye; he stayed there during the week and walked home on weekends. As well as being a member of the school's drama society, he organized sporting and social activities and led the school's morning exercise class. Over the course of five consecutive summers from 1946 onward he returned home to help his father operate a combine harvester, during which they sometimes worked 20-hour days. In 1948, they harvested over 8000 centners of grain, a feat for which Sergey was awarded the Order of Lenin and his son the Order of the Red Banner of Labour.
In June 1950, Gorbachev became a candidate member of the Communist Party. He also applied to study at the law school of Moscow State University (MSU), then the most prestigious university in the country. He was accepted without being asked to take an exam, likely because of his worker-peasant origins and his possession of the Order of the Red Banner of Labour. His choice of law was unusual; it was not a well-regarded subject in Soviet society at that time. Aged 19, he traveled by train to Moscow, the first time he had left his home region.
In the city, he resided with fellow MSU students at a dormitory in Sokolniki District. He and other rural students felt at odds with their Muscovite counterparts, but he soon came to fit in. Fellow students recall him working especially hard, often late into the night. He gained a reputation as a mediator during disputes and was also known for being outspoken in class, although he would reveal a number of his views only privately; for instance, he confided in some students his opposition to the Soviet jurisprudential norm that a confession proved guilt, noting that confessions could have been forced. During his studies, an anti-semitic campaign spread through the Soviet Union, culminating in the Doctors' plot; Gorbachev publicly defended a Jewish student who was accused of disloyalty to the country by one of their fellows.
At MSU, he became the Komsomol head of his entering class, and then Komsomol's deputy secretary for agitation and propaganda at the law school. One of his first Komsomol assignments in Moscow was to monitor the election polling in Krasnopresnenskaya district to help ensure the near-total turnout that the government desired; Gorbachev found that most of those who voted did so "out of fear". In 1952, he was made a full member of the Communist Party. As a party and Komsomol member he was tasked with monitoring fellow students for potential subversion; some of his fellow students said that he did so only minimally and that they trusted him to keep confidential information secret from the authorities. Gorbachev became close friends with Zdeněk Mlynář, a Czechoslovak student who later became a primary ideologist of the 1968 Prague Spring. Mlynář recalled that the duo remained committed Marxist-Leninists despite their growing concerns about the Stalinist system. After Stalin died in March 1953, Gorbachev and Mlynář joined the crowds amassing to see Stalin's body lying in state.
At MSU, Gorbachev met Raisa Titarenko, a Ukrainian studying in the university's philosophy department. She was engaged to another man, but after that engagement fell apart, she began a relationship with Gorbachev; together they went to bookstores, museums, and art exhibits. In early 1953, he took an internship at the procurator's office in Molotovskoye district, but was angered by the incompetence and arrogance of those working there. That summer, he returned to Privolnoe to work with his father on the harvest; the money earned allowed him to pay for a wedding. On 25 September 1953, he and Raisa registered their marriage at Sokolniki Registry Office, and in October they moved in together at the Lenin Hills dormitory. Raisa discovered that she was pregnant, and although the couple wanted to keep the child, she fell ill and required a life-saving abortion.
In June 1955, Gorbachev graduated with distinction; his final paper had been on the advantages of "socialist democracy" (the Soviet political system) over "bourgeois democracy" (liberal democracy). He was subsequently assigned to the Soviet Procurator's office, which was then focusing on the rehabilitation of the innocent victims of Stalin's purges, but found that it had no work for him. He was then offered a place on an MSU graduate course specializing in kolkhoz law, but declined. He had wanted to remain in Moscow, where Raisa was enrolled in a PhD program, but instead gained employment in Stavropol; Raisa abandoned her studies to join him there.
In August 1955, Gorbachev started work at the Stavropol regional procurator's office, but disliked the job and used his contacts to get a transfer to work for Komsomol, becoming deputy director of Komsomol's agitation and propaganda department for that region. In this position, he visited villages in the area and tried to improve the lives of their inhabitants; he established a discussion circle in Gorkaya Balka village to help its peasant residents gain social contacts.
Gorbachev and his wife initially rented a small room in Stavropol, taking daily evening walks around the city and on weekends hiking in the countryside. In January 1957, Raisa gave birth to a daughter, Irina, and in 1958 they moved into two rooms in a communal apartment. In 1961, Gorbachev pursued a second degree, on agricultural production; he took a correspondence course from the local Stavropol Agricultural Institute, receiving his diploma in 1967. His wife had also pursued a second degree, attaining a PhD in sociology in 1967 from the Moscow Pedagogical Institute; while in Stavropol she too joined the Communist Party.
Stalin was ultimately succeeded as Soviet leader by Nikita Khrushchev, who denounced Stalin and his cult of personality in a speech given in February 1956, after which he launched a de-Stalinization process throughout Soviet society. Gorbachev's biographer William Taubman later suggested that Gorbachev "embodied" the "reformist spirit" of the Khrushchev era. Gorbachev was among those who saw themselves as "genuine Marxists" or "genuine Leninists" in contrast to what they regarded as the perversions of Stalin. He helped spread Khrushchev's anti-Stalinist message in Stavropol, but encountered many who continued to regard Stalin as a hero or who praised the Stalinist purges as just.
Gorbachev rose steadily through the ranks of the local administration. The authorities regarded him as politically reliable, and he would flatter his superiors, for instance gaining favor with prominent local politician Fyodor Kulakov. He had an ability to outmanoeuvre rivals, and some colleagues resented his success. In September 1956, he was promoted to First Secretary of the Stavropol city Komsomol, placing him in charge of it; in April 1958 he was made deputy head of the Komsomol for the entire region. At this point he was given better accommodation: a two-room flat with its own private kitchen, toilet, and bathroom. In Stavropol, he formed a discussion club for youths and helped mobilize local young people to take part in Khrushchev's agricultural and development campaigns.
In March 1961, Gorbachev became First Secretary of the regional Komsomol, in which position he went out of his way to appoint women as city and district leaders. In 1961, Gorbachev played host to the Italian delegation for the World Youth Festival in Moscow; that October, he also attended the 22nd Congress of the Communist Party of the Soviet Union. In January 1963, Gorbachev was promoted to personnel chief for the regional party's agricultural committee, and in September 1966 became First Secretary of the Stavropol City Party Organization ("Gorkom"). By 1968 he was increasingly frustrated with his job—in large part because Khrushchev's reforms were stalling or being reversed—and he contemplated leaving politics to work in academia. However, in August 1968, he was named Second Secretary of the Stavropol Kraikom, making him the deputy of First Secretary Leonid Yefremov and the second most senior figure in the Stavropol region. In 1969 he was elected as a deputy to the Supreme Soviet of the Soviet Union and made a member of its Standing Commission for the Protection of the Environment.
Cleared for travel to Eastern Bloc countries, in 1966 he was part of a delegation visiting East Germany, and in 1969 and 1974 he visited Bulgaria. In August 1968 the Soviet Union led an invasion of Czechoslovakia to put an end to the Prague Spring, a period of political liberalization in the Marxist–Leninist country. Although Gorbachev later stated that he had had private concerns about the invasion, he publicly supported it. In September 1969 he was part of a Soviet delegation sent to Czechoslovakia, where he found the Czechoslovak people largely unwelcoming. That year, the Soviet authorities ordered him to punish Fagien B. Sadykov, a Stavropol-based agronomist whose ideas were regarded as critical of Soviet agricultural policy; Gorbachev ensured that Sadykov was fired from teaching but ignored calls for him to face tougher punishment. Gorbachev later related that he was "deeply affected" by the incident, saying that "my conscience tormented me" for overseeing Sadykov's persecution.
In April 1970, Yefremov was promoted to a higher position in Moscow and Gorbachev succeeded him as the First Secretary of the Stavropol kraikom. This granted Gorbachev significant power over the Stavropol region. He had been personally vetted for the position by senior Kremlin leaders and was informed of their decision by the Soviet leader, Leonid Brezhnev. Aged 39, he was considerably younger than his predecessors in the position. As head of the Stavropol region, he automatically became a member of the Central Committee of the Communist Party of the Soviet Union in 1971. According to biographer Zhores Medvedev, Gorbachev "had now joined the Party's super-elite". As regional leader, Gorbachev initially attributed economic and other failures to "the inefficiency and incompetence of cadres, flaws in management structure or gaps in legislation", but eventually concluded that they were caused by an excessive centralization of decision making in Moscow. He began reading translations of restricted texts by Western Marxist authors like Antonio Gramsci, Louis Aragon, Roger Garaudy, and Giuseppe Boffa, and came under their influence.
Gorbachev's main task as regional leader was to raise agricultural production levels, something hampered by severe droughts in 1975 and 1976.
He oversaw the expansion of irrigation systems through construction of the Great Stavropol Canal. For overseeing a record grain harvest in Ipatovsky district, in March 1972 he was awarded the Order of the October Revolution by Brezhnev in a Moscow ceremony. Gorbachev always sought to maintain Brezhnev's trust; as regional leader, he repeatedly praised Brezhnev in his speeches, for instance referring to him as "the outstanding statesman of our time". Gorbachev and his wife holidayed in Moscow, Leningrad, Uzbekistan, and resorts in the North Caucasus; he holidayed with the head of the KGB, Yuri Andropov, who was favorable towards him and who became an important patron. Gorbachev also developed good relationships with senior figures like the Soviet Prime Minister, Alexei Kosygin, and the longstanding senior party member Mikhail Suslov.
The government considered Gorbachev sufficiently reliable that he was sent as part of Soviet delegations to Western Europe; he made five trips there between 1970 and 1977. In September 1971 he was part of a delegation that traveled to Italy, where they met with representatives of the Italian Communist Party; Gorbachev loved Italian culture but was struck by the poverty and inequality he saw in the country. In 1972 he visited Belgium and the Netherlands and in 1973 West Germany. Gorbachev and his wife visited France in 1976 and 1977, on the latter occasion touring the country with a guide from the French Communist Party. He was surprised by how openly West Europeans offered their opinions and criticized their political leaders, something absent from the Soviet Union, where most people did not feel safe speaking so openly. He later related that for him and his wife, these visits "shook our a priori belief in the superiority of socialist over bourgeois democracy".
Gorbachev had remained close to his parents; after his father became terminally ill in 1974, Gorbachev traveled to be with him in Privolnoe shortly before his death. His daughter, Irina, married fellow student Anatoly Virgansky in April 1978. In 1977, the Supreme Soviet appointed Gorbachev to chair the Standing Commission on Youth Affairs due to his experience with mobilizing young people in Komsomol.
In November 1978, Gorbachev was appointed a Secretary of the Central Committee. His appointment had been approved unanimously by the Central Committee's members. To take up this position, Gorbachev and his wife moved to Moscow, where they were initially given an old dacha outside the city. They then moved to another, at Sosnovka, before finally being allocated a newly built brick house. He was also given an apartment inside the city, but gave that to his daughter and son-in-law; Irina had begun work at Moscow's Second Medical Institute. As part of the Moscow political elite, Gorbachev and his wife now had access to better medical care and to specialized shops; they were also given cooks, servants, bodyguards, and secretaries, although many of these were spies for the KGB. In his new position, Gorbachev often worked twelve- to sixteen-hour days. He and his wife socialized little, but liked to visit Moscow's theaters and museums.
In 1978, Gorbachev was appointed to the Central Committee's Secretariat for Agriculture, replacing his old friend Kulakov, who had died of a heart attack. Gorbachev concentrated his attentions on agriculture: the harvests of 1979, 1980, and 1981 were all poor, due largely to weather conditions, and the country had to import increasing quantities of grain. He had growing concerns about the country's agricultural management system, coming to regard it as overly centralized and requiring more bottom-up decision making; he raised these points in his first speech at a Central Committee plenum, given in July 1978. He began to have concerns about other policies too. In December 1979, the Soviets sent the Red Army into neighbouring Afghanistan to support its Soviet-aligned government against Islamist insurgents; Gorbachev privately thought it a mistake. At times he openly supported the government position; in October 1980, for instance, he endorsed Soviet calls for Poland's Marxist–Leninist government to crack down on growing internal dissent in that country. That same month, he was promoted from a candidate member to a full member of the Politburo, the highest decision-making authority in the Communist Party. At the time, he was the Politburo's youngest member.
After Brezhnev's death in November 1982, Andropov succeeded him as General Secretary of the Communist Party, the "de facto" head of government in the Soviet Union. Gorbachev was enthusiastic about the appointment. However, although Gorbachev hoped that Andropov would introduce liberalizing reforms, the latter carried out only personnel shifts rather than structural change. Gorbachev became Andropov's closest ally in the Politburo; with Andropov's encouragement, Gorbachev sometimes chaired Politburo meetings. Andropov encouraged Gorbachev to expand into policy areas other than agriculture, preparing him for future higher office. In April 1983, Gorbachev delivered the annual speech marking the birthday of the Soviet founder Vladimir Lenin; this required him to re-read many of Lenin's later writings, in which Lenin had called for reform in the context of the New Economic Policy of the 1920s, and encouraged Gorbachev's own conviction that reform was needed. In May 1983, Gorbachev was sent to Canada, where he met Prime Minister Pierre Trudeau and spoke to the Canadian Parliament. There, he met and befriended the Soviet ambassador, Aleksandr Yakovlev, who later became a key political ally.
In February 1984, Andropov died; on his deathbed he indicated his desire that Gorbachev succeed him. Many in the Central Committee nevertheless thought the 53-year-old Gorbachev was too young and inexperienced. Instead, Konstantin Chernenko—a longstanding Brezhnev ally—was appointed General Secretary, but he too was in very poor health. Chernenko was often too sick to chair Politburo meetings, with Gorbachev stepping in at the last minute. Gorbachev continued to cultivate allies both in the Kremlin and beyond, and also gave the main speech at a conference on Soviet ideology, where he angered party hardliners by implying that the country required reform.
In April 1984, he was appointed chair of the Foreign Affairs Committee of the Soviet legislature, a largely honorific position. In June he traveled to Italy as a Soviet representative for the funeral of Italian Communist Party leader Enrico Berlinguer, and in September to Sofia, Bulgaria to attend celebrations of the fortieth anniversary of its liberation by the Red Army. In December, he visited Britain at the request of its Prime Minister Margaret Thatcher; she was aware that he was a potential reformer and wanted to meet him. At the end of the visit, Thatcher said: "I like Mr Gorbachev. We can do business together". He felt that the visit helped to erode Andrei Gromyko's dominance of Soviet foreign policy while at the same time sending a signal to the United States government that he wanted to improve Soviet-U.S. relations.
On 10 March 1985, Chernenko died. Gromyko proposed Gorbachev as the next General Secretary; coming from such a longstanding party member, the recommendation carried great weight with the Central Committee. Gorbachev expected much opposition to his nomination as General Secretary, but ultimately the rest of the Politburo supported him. Shortly after Chernenko's death, the Politburo unanimously elected Gorbachev as his successor, preferring him to another elderly leader. He thus became the eighth leader of the Soviet Union. Few in the government imagined that he would be as radical a reformer as he proved to be. Although he was not well known to the Soviet public, there was widespread relief that the new leader was not elderly and ailing. Gorbachev's first public appearance as leader was at Chernenko's Red Square funeral, held on 14 March. Two months after being elected, he left Moscow for the first time, traveling to Leningrad, where he spoke to assembled crowds. In June he traveled to Ukraine, in July to Belarus, and in September to Tyumen Oblast, urging party members in these areas to take more responsibility for fixing local problems.
Gorbachev's leadership style differed from that of his predecessors. He would stop to talk to civilians on the street, forbade the display of his portrait at the 1985 Red Square holiday celebrations, and encouraged frank and open discussions at Politburo meetings. To the West, Gorbachev was seen as a more moderate and less threatening Soviet leader; some Western commentators, however, believed this to be an act to lull Western governments into a false sense of security. His wife was his closest adviser, and took on the unofficial role of a "first lady" by appearing with him on foreign trips; her public visibility was a breach of standard practice and generated resentment. His other close aides were Georgy Shakhnazarov and Anatoly Chernyaev.
Gorbachev was aware that the Politburo could remove him from office, and that he could not pursue more radical reform without a majority of supporters in the Politburo. He sought to remove several older members from the Politburo, encouraging Grigory Romanov, Nikolai Tikhonov, and Viktor Grishin into retirement. He promoted Gromyko to head of state, a largely ceremonial role with little influence, and moved his own ally, Eduard Shevardnadze, to Gromyko's former post in charge of foreign policy. Other allies whom he saw promoted were Yakovlev, Anatoly Lukyanov, and Vadim Medvedev. Another of those promoted by Gorbachev was Boris Yeltsin, who was made a Secretary of the Central Committee in July 1985. Most of these appointees were from a new generation of well-educated officials who had been frustrated during the Brezhnev era. In his first year, 14 of the 23 heads of department in the secretariat were replaced. In doing so, Gorbachev secured dominance in the Politburo within a year, faster than Stalin, Khrushchev, or Brezhnev had managed.
Gorbachev recurrently employed the term "perestroika", first used publicly in March 1984. He saw "perestroika" as encompassing a complex series of reforms to restructure society and the economy. He was concerned by the country's low productivity, poor work ethic, and inferior quality goods; like several economists, he feared this would lead to the country becoming a second-rate power. The first stage of Gorbachev's perestroika was "uskoreniye" ("acceleration"), a term he used regularly in the first two years of his leadership. The Soviet Union was behind the United States in many areas of production, but Gorbachev claimed that it would accelerate industrial output to match that of the U.S. by 2000. The Five-Year Plan of 1985–90 was targeted to expand machine building by 50 to 100%. To boost agricultural productivity, he merged five ministries and a state committee into a single entity, Agroprom, although by late 1986 he acknowledged the merger as a failure.
The purpose of reform was to prop up the centrally planned economy—not to transition to market socialism. Speaking in late summer 1985 to the secretaries for economic affairs of the central committees of the East European communist parties, Gorbachev said: "Many of you see the solution to your problems in resorting to market mechanisms in place of direct planning. Some of you look at the market as a lifesaver for your economies. But, comrades, you should not think about lifesavers but about the ship, and the ship is socialism."
Gorbachev's perestroika also entailed attempts to move away from technocratic management of the economy by increasingly involving the labor force in industrial production. He was of the view that once freed from the strong control of central planners, state-owned enterprises would act as market agents. Gorbachev and other Soviet leaders did not anticipate opposition to the perestroika reforms; according to their interpretation of Marxism, they believed that in a socialist society like the Soviet Union there would not be "antagonistic contradictions". However, there would come to be a public perception in the country that many bureaucrats were paying lip service to the reforms while trying to undermine them. He also initiated the concept of "gospriyomka" (state acceptance of production), a quality-control measure, during his time as leader. In April 1986, he introduced an agrarian reform which linked salaries to output and allowed collective farms to sell 30% of their produce directly to shops or co-operatives rather than giving it all to the state for distribution. In a September 1986 speech, he embraced the idea of reintroducing market economics to the country alongside limited private enterprise, citing Lenin's New Economic Policy as a precedent; he nevertheless stressed that he did not regard this as a return to capitalism.
In the Soviet Union, alcohol consumption had risen steadily between 1950 and 1985. By the 1980s, drunkenness was a major social problem and Andropov had planned a major campaign to limit alcohol consumption. Encouraged by his wife, Gorbachev—who believed the campaign would improve health and work efficiency—oversaw its implementation. Alcohol production was reduced by around 40 percent, the legal drinking age rose from 18 to 21, alcohol prices were increased, stores were banned from selling it before 2 p.m., and tougher penalties were introduced for workplace or public drunkenness and home production of alcohol. The All-Union Voluntary Society for the Struggle for Temperance was formed to promote sobriety; it had over 14 million members within three years. As a result, crime rates fell and life expectancy grew slightly between 1986 and 1987. However, moonshine production rose considerably, and the reform had significant costs to the Soviet economy, resulting in losses of up to US$100 billion between 1985 and 1990. Gorbachev later considered the campaign to have been an error, and it was terminated in October 1988. After it ended, it took several years for production to return to previous levels, after which alcohol consumption soared in Russia between 1990 and 1993.
In the second year of his leadership, Gorbachev began speaking of "glasnost", or "openness". According to Doder and Branson, this meant "greater openness and candour in government affairs and for an interplay of different and sometimes conflicting views in political debates, in the press, and in Soviet culture." Encouraging reformers into prominent media positions, he brought in Sergei Zalygin as head of "Novy Mir" magazine and Yegor Yakovlev as editor-in-chief of "Moscow News". He made the historian Yuri Afanasiev dean of the State Historical Archive Faculty, from where Afanasiev could press for the opening of secret archives and the reassessment of Soviet history. Prominent dissidents like Andrei Sakharov were freed from internal exile or prison. Gorbachev saw glasnost as a necessary measure to ensure perestroika by alerting the Soviet populace to the nature of the country's problems in the hope that they would support his efforts to fix them. Particularly popular among the Soviet intelligentsia, who became key Gorbachev supporters, glasnost boosted his domestic popularity but alarmed many Communist Party hardliners. For many Soviet citizens, this newfound level of freedom of speech and press—and its accompanying revelations about the country's past—was uncomfortable.
Some in the party thought Gorbachev was not going far enough in his reforms; a prominent liberal critic was Yeltsin. He had risen rapidly since 1985, attaining the role of Moscow city boss. Like many members of the government, Gorbachev was skeptical of Yeltsin, believing that he engaged in too much self-promotion. Yeltsin was also critical of Gorbachev, regarding him as patronizing.
Yeltsin increasingly sniped at Gorbachev in Politburo meetings. At the October 1987 Central Committee plenum, Yeltsin called for more far-reaching reforms than Gorbachev was initiating and criticized the party leadership, although he did not cite Gorbachev by name, claiming that a new cult of personality was forming. Gorbachev then opened the floor to responses, after which attendees publicly criticized Yeltsin for several hours. After this, Gorbachev also criticized Yeltsin, claiming that he only cared for himself and was "politically illiterate". Yeltsin then resigned as both Moscow boss and as a member of the Politburo. From this point, tensions between the two men developed into a mutual hatred.
In April 1986 the Chernobyl disaster occurred. In the immediate aftermath, officials fed Gorbachev incorrect information to downplay the incident. As the scale of the disaster became apparent, 336,000 people were evacuated from the area around Chernobyl. Taubman noted that the disaster marked "a turning point for Gorbachev and the Soviet regime". Several days after it occurred, he gave a televised report to the nation. He cited the disaster as evidence for what he regarded as widespread problems in Soviet society, such as shoddy workmanship and workplace inertia. Gorbachev later described the incident as one which made him appreciate the scale of incompetence and cover-ups in the Soviet Union. From April to the end of the year, Gorbachev became increasingly open in his criticism of the Soviet system, including food production, state bureaucracy, the military draft, and the large size of the prison population.
In a May 1985 speech given to the Soviet Foreign Ministry—the first time a Soviet leader had directly addressed his country's diplomats—Gorbachev spoke of a "radical restructuring" of foreign policy. A major issue facing his leadership was Soviet involvement in the Afghan Civil War, which had then been going on for over five years. Over the course of the war, the Soviet Army took heavy casualties and there was much opposition to Soviet involvement among both the public and military. On becoming leader, Gorbachev saw withdrawal from the war as a key priority. In October 1985, he met with Afghan Marxist leader Babrak Karmal, urging him to acknowledge the lack of widespread public support for his government and pursue a power-sharing agreement with the opposition. That month, the Politburo approved Gorbachev's decision to withdraw combat troops from Afghanistan, although the last troops did not leave until February 1989.
Gorbachev had inherited a renewed period of high tension in the Cold War. He believed strongly in the need to sharply improve relations with the United States; he was appalled at the prospect of nuclear war, was aware that the Soviet Union was unlikely to win the arms race, and thought that the continued focus on high military spending was detrimental to his desire for domestic reform. Although privately also appalled at the prospect of nuclear war, U.S. President Ronald Reagan publicly appeared unwilling to de-escalate tensions, having scrapped détente and arms control talks, initiated a military build-up, and called the Soviet Union the "evil empire".
Both Gorbachev and Reagan wanted a summit to discuss the Cold War, but each faced some opposition to such a move within their respective governments. They agreed to hold a summit in Geneva, Switzerland in November 1985. In the buildup to this, Gorbachev sought to improve relations with the U.S.' NATO allies, visiting France in October 1985 to meet with President François Mitterrand. At the Geneva summit, discussions between Gorbachev and Reagan were sometimes heated, and Gorbachev was initially frustrated that his U.S. counterpart "does not seem to hear what I am trying to say". As well as discussing the Cold War proxy conflicts in Afghanistan and Nicaragua and human rights issues, the pair discussed the U.S.' Strategic Defense Initiative (SDI), to which Gorbachev was strongly opposed. The duo's wives also met and spent time together at the summit. The summit ended with a joint commitment to avoiding nuclear war and to meet for two further summits: in Washington D.C. in 1986 and in Moscow in 1987. Following the conference, Gorbachev traveled to Prague to inform other Warsaw Pact leaders of developments.
In January 1986, Gorbachev publicly proposed a three-stage programme for abolishing the world's nuclear weapons by the end of the 20th century. An agreement was then reached to meet with Reagan in Reykjavík, Iceland in October 1986. Gorbachev wanted to secure guarantees that SDI would not be implemented, and in return was willing to offer concessions, including a 50% reduction in Soviet long range nuclear missiles. Both leaders agreed with the shared goal of abolishing nuclear weapons, but Reagan refused to terminate the SDI program and no deal was reached. After the summit, many of Reagan's allies criticized him for going along with the idea of abolishing nuclear weapons. Gorbachev meanwhile told the Politburo that Reagan was "extraordinarily primitive, troglodyte, and intellectually feeble".
In his relations with the developing world, Gorbachev found many of the leaders professing revolutionary socialist credentials or a pro-Soviet attitude—such as Libya's Muammar Gaddafi and Syria's Hafez al-Assad—frustrating, and his best personal relationship was instead with India's Prime Minister, Rajiv Gandhi. He thought that the "socialist camp" of Marxist–Leninist-governed states—the Eastern Bloc countries, North Korea, Vietnam, and Cuba—was a drain on the Soviet economy, receiving a far greater amount of goods from the Soviet Union than they collectively gave in return. He sought improved relations with China, a country whose Marxist government had severed ties with the Soviets in the Sino-Soviet Split and had since undergone its own structural reform. In June 1985 he signed a US$14 billion five-year trade agreement with the country and in July 1986, he proposed troop reductions along the Soviet-Chinese border, hailing China as "a great socialist country". He made clear his desire for Soviet membership of the Asian Development Bank and for greater ties to Pacific countries, especially China and Japan.
In January 1987, Gorbachev attended a Central Committee plenum where he talked about perestroika and democratization while criticizing widespread corruption. He considered putting a proposal to allow multi-party elections into his speech, but decided against doing so. After the plenum, he focused his attentions on economic reform, holding discussions with government officials and economists. Many economists proposed reducing ministerial controls on the economy and allowing state-owned enterprises to set their own targets; the Soviet Prime Minister, Nikolai Ryzhkov, and other government figures were skeptical. In June, Gorbachev finished his report on economic reform. It reflected a compromise: ministers would retain the ability to set output targets but these would not be considered binding. That month, a plenum accepted his recommendations and the Supreme Soviet passed a "law on enterprises" implementing the changes. Economic problems remained: by the late 1980s there were still widespread shortages of basic goods, rising inflation, and declining living standards. These stoked a number of miners' strikes in 1989.
By 1987, the ethos of glasnost had spread through Soviet society: journalists were writing increasingly openly, many economic problems were being publicly revealed, and studies appeared that critically reassessed Soviet history. Gorbachev was broadly supportive, describing glasnost as "the crucial, irreplaceable weapon of perestroika". He nevertheless insisted that people should use the newfound freedom responsibly, stating that journalists and writers should avoid "sensationalism" and be "completely objective" in their reporting. Nearly two hundred previously restricted Soviet films were publicly released, and a range of Western films were also made available. In 1990, Soviet responsibility for the 1940 Katyn massacre was finally revealed.
In September 1987, the government stopped jamming the signal of the British Broadcasting Corporation and Voice of America. The reforms also included greater tolerance of religion; an Easter service was broadcast on Soviet television for the first time and the millennium celebrations of the Russian Orthodox Church were given media attention. Independent organizations appeared, most supportive of Gorbachev, although the largest, Pamyat, was ultra-nationalist and anti-Semitic in nature. Gorbachev also announced that Soviet Jews wishing to migrate to Israel would be allowed to do so, something previously prohibited.
In August 1987, he holidayed in Nizhniaia Oreanda, Ukraine, there writing "Perestroika: New Thinking for Our Country and Our World" at the suggestion of U.S. publishers. For the 70th anniversary of the October Revolution of 1917—which brought Lenin and the Communist Party to power—Gorbachev produced a speech on "October and Perestroika: The Revolution Continues". Delivered to a ceremonial joint session of the Central Committee and the Supreme Soviet in the Kremlin Palace of Congresses, it praised Lenin but criticized Stalin for overseeing mass human rights abuses. Party hardliners thought the speech went too far; liberalizers thought it did not go far enough.
In March 1988, the magazine "Sovetskaya Rossiya" published an open letter by the teacher Nina Andreyeva. It criticized elements of Gorbachev's reforms, attacking what she regarded as the denigration of the Stalinist era and arguing that a reformer clique—whom she implied were mostly Jews and ethnic minorities—was to blame. Over 900 Soviet newspapers reprinted it and anti-reformists rallied around it; many reformers panicked, fearing a backlash against perestroika. On returning from Yugoslavia, Gorbachev called a Politburo meeting to discuss the letter, at which he confronted those hardliners supporting its sentiment. Ultimately, the Politburo arrived at a unanimous decision to express disapproval of Andreyeva's letter and publish a rebuttal in "Pravda". Yakovlev and Gorbachev's rebuttal claimed that those who "look everywhere for internal enemies" were "not patriots" and presented Stalin's "guilt for massive repressions and lawlessness" as "enormous and unforgiveable".
Although the next party congress was not scheduled until 1991, Gorbachev convened the 19th Party Conference in its place in June 1988. He hoped that by allowing a broader range of people to attend than at previous conferences, he would gain additional support for his reforms. With sympathetic officials and academics, Gorbachev drafted plans for reforms that would shift power away from the Politburo and towards the soviets. While the soviets had become largely powerless bodies that rubber-stamped Politburo policies, he wanted them to become year-round legislatures. He proposed the formation of a new institution, the Congress of People's Deputies, whose members were to be elected in a largely free vote. This congress would in turn elect a USSR Supreme Soviet, which would do most of the legislating.
These proposals reflected Gorbachev's desire for more democracy; however, in his view there was a major impediment in that the Soviet people had developed a "slave psychology" after centuries of Tsarist autocracy and Marxist-Leninist authoritarianism. Held at the Kremlin Palace of Congresses, the conference brought together 5,000 delegates and featured arguments between hardliners and liberalizers. The proceedings were televised, and for the first time since the 1920s, voting was not unanimous. In the months following the conference, Gorbachev focused on redesigning and streamlining the party apparatus; the Central Committee staff—which then numbered around 3,000—was halved, while various Central Committee departments were merged to cut down the overall number from twenty to nine.
In March and April 1989, elections to the new Congress were held. Of the 2,250 legislators to be elected, one hundred—termed the "Red Hundred" by the press—were directly chosen by the Communist Party, with Gorbachev ensuring many were reformists. Although over 85% of elected deputies were party members, many of those elected—including Sakharov and Yeltsin—were liberalizers. Gorbachev was happy with the result, describing it as "an enormous political victory under extraordinarily difficult circumstances". The new Congress convened in May 1989. Gorbachev was then elected its chair—the new "de facto" head of state—with 2,123 votes in favor to 87 against. Its sessions were televised live, and its members elected the new Supreme Soviet. At the Congress, Sakharov spoke repeatedly, exasperating Gorbachev with his calls for greater liberalization and the introduction of private property. When Sakharov died shortly after, Yeltsin became the figurehead of the liberal opposition.
Gorbachev tried to improve relations with the UK, France, and West Germany; like previous Soviet leaders, he was interested in pulling Western Europe away from U.S. influence. Calling for greater pan-European co-operation, he publicly spoke of a "Common European Home" and of a Europe "from the Atlantic to the Urals". In March 1987, Thatcher visited Gorbachev in Moscow; despite their ideological differences, they liked one another. In April 1989 he visited London, lunching with Elizabeth II. In May 1987, Gorbachev again visited France, and in November 1988 Mitterrand visited him in Moscow. The West German Chancellor, Helmut Kohl, had initially offended Gorbachev by comparing him to Nazi propagandist Joseph Goebbels, although he later informally apologized and in October 1988 visited Moscow. In June 1989 Gorbachev then visited Kohl in West Germany. In November 1989 he also visited Italy, meeting with Pope John Paul II. Gorbachev's relationships with these West European leaders were typically far warmer than those he had with their Eastern Bloc counterparts.
Gorbachev continued to pursue good relations with China to heal the Sino-Soviet Split. In May 1989 he visited Beijing and there met its leader Deng Xiaoping; Deng shared Gorbachev's belief in economic reform but rejected calls for democratization. Pro-democracy students had amassed in Tiananmen Square during Gorbachev's visit but were massacred by troops after he left. Gorbachev did not condemn the massacre publicly, but it reinforced his commitment not to use violent force in dealing with pro-democracy protests in the Eastern Bloc.
Following the failures of earlier talks with the U.S., in February 1987, Gorbachev held a conference in Moscow, titled "For a World without Nuclear Weapons, for Mankind's Survival", which was attended by various international celebrities and politicians. By publicly pushing for nuclear disarmament, Gorbachev sought to give the Soviet Union the moral high ground and weaken the West's self-perception of moral superiority. Aware that Reagan would not budge on SDI, Gorbachev focused on reducing "Intermediate-Range Nuclear Forces", to which Reagan was receptive. In April 1987, Gorbachev discussed the issue with U.S. Secretary of State George P. Shultz in Moscow; he agreed to eliminate the Soviets' SS-23 rockets and allow U.S. inspectors to visit Soviet military facilities to ensure compliance. There was hostility to such compromises from the Soviet military, but following the May 1987 Mathias Rust incident—in which a West German teenager was able to fly undetected from Finland and land in Red Square—Gorbachev fired many senior military figures for incompetence. In December 1987, Gorbachev visited Washington D.C., where he and Reagan signed the Intermediate-Range Nuclear Forces Treaty. Taubman called it "one of the highest points of Gorbachev's career".
A second U.S.-Soviet summit occurred in Moscow in May–June 1988, which Gorbachev expected to be largely symbolic. Again, he and Reagan criticized each other's countries—Reagan raising Soviet restrictions on religious freedom; Gorbachev highlighting poverty and racial discrimination in the U.S.—but Gorbachev related that they spoke "on friendly terms". They reached an agreement on notifying each other before conducting ballistic missile tests and made agreements on transport, fishing, and radio navigation. At the summit, Reagan told reporters that he no longer considered the Soviet Union an "evil empire" and the duo revealed that they considered themselves friends.
The third summit was held in New York City in December. Arriving there, Gorbachev gave a speech to the United Nations General Assembly where he announced a unilateral reduction in the Soviet armed forces by 500,000; he also announced that 50,000 troops would be withdrawn from Central and Eastern Europe. He then met with Reagan and President-elect George H. W. Bush; he rushed home, skipping a planned visit to Cuba, to deal with the Armenian earthquake. On becoming U.S. president, Bush appeared interested in continuing talks with Gorbachev but wanted to appear tougher on the Soviets than Reagan had to allay criticism from the right wing of his Republican Party. In December 1989, Gorbachev and Bush met at the Malta Summit. Bush offered to assist the Soviet economy by suspending the Jackson-Vanik Amendment and repealing the Stevenson and Baird Amendments. There, the duo agreed to a joint press conference, the first time that a U.S. and Soviet leader had done so. Gorbachev also urged Bush to normalize relations with Cuba and meet its president, Fidel Castro, although Bush refused to do so.
On taking power, Gorbachev found some unrest among different national groups within the Soviet Union. In December 1986, riots broke out in several Kazakh cities after a Russian was appointed head of the region. In 1987, Crimean Tatars protested in Moscow to demand resettlement in Crimea, the area from which they had been deported on Stalin's orders in 1944. Gorbachev ordered a commission, headed by Gromyko, to examine their situation. Gromyko's report opposed calls for assisting Tatar resettlement in Crimea. By 1988, the Soviet "nationality question" was increasingly pressing. In February, the administration of the Nagorno-Karabakh region officially requested that it be transferred from the Azerbaijan Soviet Socialist Republic to the Armenian Soviet Socialist Republic; the majority of the region's population were ethnically Armenian and wanted unification with other majority Armenian areas. As rival Armenian and Azerbaijani demonstrations took place in Nagorno-Karabakh, Gorbachev called an emergency meeting of the Politburo. Ultimately, Gorbachev promised greater autonomy for Nagorno-Karabakh but refused the transfer, fearing that it would set off similar ethnic tensions and demands throughout the Soviet Union.
That month, in the Azerbaijani city of Sumgait, Azerbaijani gangs began killing members of the Armenian minority. Local troops tried to quell the unrest but were attacked by mobs. The Politburo ordered additional troops into the city, but in contrast to those like Yegor Ligachev who wanted a massive display of force, Gorbachev urged restraint. He believed that the situation could be resolved through a political solution, urging talks between the Armenian and Azerbaijani Communist Parties. Further anti-Armenian violence broke out in Baku in 1990. Problems also emerged in the Georgian Soviet Socialist Republic; in April 1989, Georgian nationalists demanding independence clashed with troops in Tbilisi, resulting in a number of deaths. Independence sentiment was also rising in the Baltic states; the Supreme Soviets of the Estonian, Lithuanian, and Latvian Soviet Socialist Republics declared their economic "autonomy" from Russia and introduced measures to restrict Russian immigration. In August 1989, protesters formed the Baltic Way, a human chain across the three republics to symbolize their wish for independence. That month, the Lithuanian Supreme Soviet ruled the 1940 Soviet annexation of their country to be illegal; in January 1990, Gorbachev visited the republic to encourage it to remain part of the Soviet Union.
Gorbachev rejected the "Brezhnev Doctrine", the idea that the Soviet Union had the right to intervene militarily in other Marxist-Leninist countries if their governments were threatened. In December 1987 he announced the withdrawal of 500,000 Soviet troops from Central and Eastern Europe.
While pursuing domestic reforms, he did not publicly support reformers elsewhere in the Eastern Bloc. Hoping instead to lead by example, he later related that he did not want to interfere in their internal affairs, but he may have feared that pushing reform in Central and Eastern Europe would have angered his own hardliners too much. Some Eastern Bloc leaders, like Hungary's János Kádár and Poland's Wojciech Jaruzelski, were sympathetic to reform; others, like Romania's Nicolae Ceaușescu, were hostile to it. In May 1987 Gorbachev visited Romania, where he was appalled by the state of the country, later telling the Politburo that there "human dignity has absolutely no value". He and Ceaușescu disliked each other, and argued over Gorbachev's reforms.
In the Revolutions of 1989, most of the Marxist-Leninist states of Central and Eastern Europe held multi-party elections resulting in regime change. In most countries, like Poland and Hungary, this was achieved peacefully, but in Romania the revolution turned violent and led to Ceaușescu's overthrow and execution. Gorbachev was too preoccupied with domestic problems to pay much attention to these events. He believed that democratic elections would not lead Eastern European countries into abandoning their commitment to socialism. In 1989 he visited East Germany for the fortieth anniversary of its founding; shortly after, in November, the East German government allowed its citizens to cross the Berlin Wall, a decision Gorbachev praised. Over the following years, much of the wall was demolished. Gorbachev, Thatcher, and Mitterrand all opposed a swift reunification of Germany, aware that it would likely become the dominant European power. Gorbachev wanted a gradual process of German integration but Kohl began calling for rapid reunification. With Germany reunified, many observers declared the Cold War over.
In February 1990, both liberalizers and Marxist-Leninist hardliners intensified their attacks on Gorbachev. A march by liberalizers took place in Moscow criticizing Communist Party rule, while at a Central Committee meeting, the hardliner Vladimir Brovikov accused Gorbachev of reducing the country to "anarchy" and "ruin" and of pursuing Western approval at the expense of the Soviet Union and the Marxist-Leninist cause. Gorbachev was aware that the Central Committee could still oust him as General Secretary, and so decided to reformulate the role of head of government to a presidency from which they could not remove him. He decided that the presidential election should be held by the Congress of People's Deputies. He chose this over a public vote because he thought the latter would escalate tensions and feared that he might lose it; a spring 1990 poll nevertheless still showed him as the most popular politician in the country.
In March, the Congress of People's Deputies held the first (and only) Soviet presidential election, in which Gorbachev was the only candidate. He secured 1,329 votes in favor to 495 against; 313 votes were invalid or absent. He therefore became the first executive President of the Soviet Union. A new 18-member Presidential Council "de facto" replaced the Politburo. At the same Congress meeting, he presented the idea of repealing Article 6 of the Soviet constitution, which had ratified the Communist Party as the "ruling party" of the Soviet Union. The Congress passed the reform, undermining the "de jure" nature of the one-party state.
In the 1990 elections for the Russian Supreme Soviet, the Communist Party faced challengers from an alliance of liberalisers known as "Democratic Russia"; the latter did particularly well in urban centers. Yeltsin was elected the parliament's chair, something Gorbachev was unhappy about. That year, opinion polls showed Yeltsin overtaking Gorbachev as the most popular politician in the Soviet Union. Gorbachev struggled to understand Yeltsin's growing popularity, commenting: "he drinks like a fish... he's inarticulate, he comes up with the devil knows what, he's like a worn-out record." The Russian Supreme Soviet was now out of Gorbachev's control; in June 1990, it declared that in the Russian Republic, its laws took precedence over those of the Soviet central government. Amid a growth in Russian nationalist sentiment, Gorbachev had reluctantly allowed the formation of a Communist Party of the Russian Soviet Federative Socialist Republic as a branch of the larger Soviet Communist Party. Gorbachev attended its first congress in June, but soon found it dominated by hardliners who opposed his reformist stance.
In January 1990, Gorbachev privately agreed to permit East German reunification with West Germany, but rejected the idea that a unified Germany could retain West Germany's NATO membership. His compromise that Germany might retain both NATO and Warsaw Pact memberships did not attract support. In May 1990, he visited the U.S. for talks with President Bush; there, he agreed that an independent Germany would have the right to choose its international alliances. He later revealed that he had agreed to do so because U.S. Secretary of State James Baker promised that NATO troops would not be posted to eastern Germany and that the military alliance would not expand into Eastern Europe. Privately, Bush ignored Baker's assurances and later pushed for NATO expansion. On the trip, the U.S. informed Gorbachev of its evidence that the Soviet military—possibly unbeknownst to Gorbachev—had been pursuing a biological weapons program in contravention of the 1972 Biological Weapons Convention. In July, Kohl visited Moscow and Gorbachev informed him that the Soviets would not oppose a reunified Germany being part of NATO. Domestically, Gorbachev's critics accused him of betraying the national interest; more broadly, they were angry that Gorbachev had allowed the Eastern Bloc to move away from direct Soviet influence.
In August 1990, Saddam Hussein's Iraqi government invaded Kuwait; Gorbachev endorsed President Bush's condemnation of it. This brought criticism from many in the Soviet state apparatus, who saw Hussein as a key ally in the Persian Gulf and feared for the safety of the 9,000 Soviet citizens in Iraq, although Gorbachev argued that the Iraqis were the clear aggressors in the situation. In November the Soviets endorsed a UN Resolution permitting force to be used in expelling the Iraqi Army from Kuwait. Gorbachev later called it a "watershed" in world politics, "the first time the superpowers acted together in a regional crisis." However, when the U.S. announced plans for a ground invasion, Gorbachev opposed it, urging instead a peaceful solution. In October 1990, Gorbachev was awarded the Nobel Peace Prize; he was flattered but acknowledged "mixed feelings" about the accolade. Polls indicated that 90% of Soviet citizens disapproved of the award, which was widely seen as a Western and anti-Soviet accolade.
With the Soviet budget deficit climbing and no domestic money markets to provide the state with loans, Gorbachev looked elsewhere. Throughout 1991, Gorbachev requested sizable loans from Western countries and Japan, hoping to keep the Soviet economy afloat and ensure the success of perestroika. Although the Soviet Union had been excluded from the G7, Gorbachev secured an invitation to its London summit in July 1991. There, he continued to call for financial assistance; Mitterrand and Kohl backed him, while Thatcher—no longer in office—also urged Western leaders to agree. Most G7 members were reluctant, instead offering technical assistance and proposing the Soviets receive "special associate" status—rather than full membership—of the World Bank and International Monetary Fund. Gorbachev was frustrated that the U.S. would spend $100 billion on the Gulf War but would not offer his country loans. Other countries were more forthcoming; Germany had given the Soviets DM60 billion by mid-1991. Later that month, Bush visited Moscow, where he and Gorbachev signed the START I treaty, a bilateral agreement on the reduction and limitation of strategic offensive arms, after ten years of negotiation.
At the 28th Communist Party Congress in July 1990, hardliners criticized the reformists but Gorbachev was re-elected party leader with the support of three-quarters of delegates, and his choice of Deputy General Secretary, Vladimir Ivashko, was also elected. Seeking compromise with the liberalizers, Gorbachev assembled a team of both his own and Yeltsin's advisers to come up with an economic reform package: the result was the "500 Days" programme. This called for further decentralization and some privatization. Gorbachev described the plan as "modern socialism" rather than a return to capitalism but had many doubts about it. In September, Yeltsin presented the plan to the Russian Supreme Soviet, which backed it. Many in the Communist Party and state apparatus warned against it, arguing that it would create marketplace chaos, rampant inflation, and unprecedented levels of unemployment. The 500 Days plan was abandoned. At this, Yeltsin railed against Gorbachev in an October speech, claiming that Russia would no longer accept a subordinate position to the Soviet government.
By mid-November 1990, much of the press was calling for Gorbachev to resign and predicting civil war. Hardliners were urging Gorbachev to disband the presidential council and arrest vocal liberals in the media. In November, he addressed the Supreme Soviet where he announced an eight-point program, which included governmental reforms, among them the abolition of the presidential council. By this point, Gorbachev was isolated from many of his former close allies and aides. Yakovlev had moved out of his inner circle and Shevardnadze had resigned. His support among the intelligentsia was declining, and by the end of 1990 his approval ratings had plummeted.
Amid growing dissent in the Baltics, especially Lithuania, in January 1991 Gorbachev demanded that the Lithuanian Supreme Council rescind its pro-independence reforms. Soviet troops occupied several Vilnius buildings and clashed with protesters, 15 of whom were killed. Gorbachev was widely blamed by liberalizers, with Yeltsin calling for his resignation. Gorbachev denied sanctioning the military operation, although some in the military claimed that he had; the truth of the matter was never clearly established. Fearing more civil disturbances, that month Gorbachev banned demonstrations and ordered troops to patrol Soviet cities alongside the police. This further alienated the liberalizers but was not enough to win over hardliners. Wanting to preserve the Union, in April Gorbachev and the leaders of nine Soviet republics jointly pledged to prepare a treaty that would renew the federation under a new constitution; six of the republics—Estonia, Latvia, Lithuania, Moldova, Georgia, and Armenia—did not endorse this. A referendum on the issue brought 76.4% in favor of continued federation, but the six rebellious republics had not taken part. Negotiations as to what form the new constitution would take took place, again bringing together Gorbachev and Yeltsin in discussion; it was planned to be formally signed in August.
In August, Gorbachev and his family holidayed at their dacha, "Zarya" ('Dawn') in Foros, Crimea. Two weeks into his holiday, a group of senior Communist Party figures—the "Gang of Eight"—calling themselves the State Committee on the State of Emergency launched a coup d'état to seize control of the Soviet Union. The phone lines to his dacha were cut and a group arrived, including Boldin, Shenin, Baklanov, and General Varennikov, informing him of the take-over. The coup leaders demanded that Gorbachev formally declare a state of emergency in the country, but he refused. Gorbachev and his family were kept under house arrest in their dacha. The coup plotters publicly announced that Gorbachev was ill and thus Vice President Yanayev would take charge of the country.
Yeltsin, now President of the Russian Soviet Federative Socialist Republic, went inside the Moscow White House. Tens of thousands of protesters amassed outside it to prevent troops storming the building to arrest him. Gorbachev feared that the coup plotters would order him killed, so had his guards barricade his dacha. However, the coup's leaders realized that they lacked sufficient support and ended their efforts. On 21 August, Vladimir Kryuchkov, Dmitry Yazov, Oleg Baklanov, Anatoly Lukyanov, and Vladimir Ivashko arrived at Gorbachev's dacha to inform him that the coup was over.
That evening, Gorbachev returned to Moscow, where he thanked Yeltsin and the protesters for helping to undermine the coup. At a subsequent press conference, he pledged to reform the Soviet Communist Party. Two days later, he resigned as its General Secretary and called on the Central Committee to disband. Several members of the coup committed suicide; others were fired. Gorbachev attended a session of the Russian Supreme Soviet on 23 August, where Yeltsin aggressively criticized him for having appointed and promoted many of the coup members in the first place. Yeltsin then announced a ban on the Russian Communist Party.
On 30 October, Gorbachev attended a conference in Madrid trying to revive the Israeli–Palestinian peace process. The event was co-sponsored by the U.S. and Soviet Union, one of the first examples of such cooperation between the two countries. There, he again met with Bush. En route home, he traveled to France where he stayed with Mitterrand at the latter's home near Bayonne.
After the coup, Yeltsin suspended all Communist Party activities on Russian soil, shutting down the Central Committee offices at Staraya Square; the imperial Russian tricolor was raised alongside the Soviet flag at Red Square. By the final weeks of 1991, Yeltsin had begun to take over the remnants of the Soviet government, including the Kremlin itself.
To keep unity within the country, Gorbachev continued to pursue plans for a new union treaty but found increasing opposition to the idea of a continued federal state as the leaders of various Soviet republics bowed to growing nationalist pressure. Yeltsin stated that he would veto any idea of a unified state, instead favoring a confederation with little central authority. Only the leaders of Kazakhstan and Kirghizia supported Gorbachev's approach. The referendum in Ukraine on 1 December, in which around 90% of voters backed secession from the Union, was a fatal blow; Gorbachev had expected Ukrainians to reject independence.
Without Gorbachev's knowledge, Yeltsin met with Ukrainian President Leonid Kravchuk and Belarusian President Stanislav Shushkevich in Belovezha Forest, near Brest, Belarus, on 8 December and signed the Belavezha Accords, which declared the Soviet Union had ceased to exist and formed the Commonwealth of Independent States (CIS) as its successor. Gorbachev only learned of this development when Shushkevich phoned him; he was furious. He desperately looked for an opportunity to preserve the Soviet Union, hoping in vain that the media and intelligentsia might rally against the idea of its dissolution. The Ukrainian, Belarusian, and Russian Supreme Soviets then ratified the establishment of the CIS. On 10 December, he issued a statement calling the CIS agreement "illegal and dangerous". On 20 December, the leaders of 11 of the 12 remaining republics—all except Georgia—met in Alma-Ata and signed the Alma-Ata Protocol, agreeing to dismantle the Soviet Union and formally establish the CIS. They also provisionally accepted Gorbachev's resignation as president of what remained of the Soviet Union. Gorbachev revealed that he would resign as soon as he saw that the CIS was a reality.
Yeltsin and Gorbachev agreed that the latter would formally announce his resignation as Soviet President and Commander-in-Chief on 25 December, before vacating the Kremlin by 29 December. Yakovlev, Chernyaev, and Shevardnadze joined Gorbachev to help him write a resignation speech. Gorbachev then gave his speech in the Kremlin in front of television cameras, allowing for international broadcast. In it, he announced, "I hereby discontinue my activities at the post of President of the Union of Soviet Socialist Republics." He expressed regret for the breakup of the Soviet Union but cited what he saw as the achievements of his administration: political and religious freedom, the end of totalitarianism, the introduction of democracy and a market economy, and an end to the arms race and Cold War. Gorbachev was only the third Soviet leader, after Malenkov and Khrushchev, not to die in office. The Soviet Union officially ceased to exist at midnight on 31 December 1991.
Out of office, Gorbachev had more time to spend with his wife and family. He and Raisa initially lived in their dilapidated dacha on Rublevskoe Shosse, although they were also allowed to privatize their small apartment on Kosygin Street. He focused on establishing his International Foundation for Socio-Economic and Political Studies, or "Gorbachev Foundation", launched in March 1992; Yakovlev and Grigory Revenko were its first Vice Presidents. Its initial tasks were analyzing and publishing material on the history of perestroika, as well as defending the policy from what it called "slander and falsifications". The foundation also tasked itself with monitoring and critiquing life in post-Soviet Russia and presenting alternate forms of development to those pursued by Yeltsin. In 1993, Gorbachev launched Green Cross International, which focused on encouraging sustainable futures, and then the World Political Forum.
To finance his foundation, Gorbachev began lecturing internationally, charging large fees to do so. On a visit to Japan, he was well received and given multiple honorary degrees. In 1992, he toured the U.S. in a Forbes private jet to raise money for his foundation. During the trip he met up with the Reagans for a social visit. From there he went to Spain, where he attended the Expo '92 world fair in Seville and also met with Prime Minister Felipe González, who had become a friend of his. In March, he visited Germany, where he was received warmly by many politicians who praised his role in facilitating German reunification. To supplement his lecture fees and book sales, Gorbachev appeared in print and television advertisements for companies like Pizza Hut and Louis Vuitton, enabling him to keep the foundation afloat. With his wife's assistance, Gorbachev worked on his memoirs, which were published in Russian in 1995 and in English the following year. He also began writing a monthly syndicated column for "The New York Times".
Gorbachev had promised to refrain from criticizing Yeltsin while the latter pursued democratic reforms, but soon the two men were publicly criticizing each other again. After Yeltsin's decision to lift price caps generated massive inflation and plunged many Russians into poverty, Gorbachev openly criticized him, comparing the reform to Stalin's policy of forced collectivization. After pro-Yeltsin parties did poorly in the 1993 legislative election, Gorbachev called on him to resign. In 1995 his foundation held a conference on "The Intelligentsia and Perestroika". It was there that Gorbachev proposed to the Duma a law that would reduce many of the presidential powers established by Yeltsin's 1993 constitution. Gorbachev continued to defend perestroika but acknowledged that he had made tactical errors as Soviet leader. While he still believed that Russia was undergoing a process of democratization, he concluded that it would take decades rather than years, as he had previously thought.
The Russian presidential elections were scheduled for June 1996, and although his wife and most of his friends urged him not to run, Gorbachev decided to do so. He hated the idea that the election would result in a run-off between Yeltsin and Gennady Zyuganov, the Communist Party of the Russian Federation candidate whom Yeltsin saw as a Stalinist hardliner. He never expected to win outright but thought a centrist bloc could be formed around either himself or one of the other candidates with similar views, such as Grigory Yavlinsky, Svyatoslav Fyodorov, or Alexander Lebed. After securing the necessary one million signatures of nomination, he announced his candidacy in March. Launching his campaign, he traveled across Russia giving rallies in twenty cities. He repeatedly faced anti-Gorbachev protesters, while some pro-Yeltsin local officials tried to hamper his campaign by banning local media from covering it or by refusing him access to venues. In the election, Gorbachev came seventh with around 386,000 votes, or about 0.5% of the total. Yeltsin and Zyuganov went through to the second round, where the former was victorious.
In contrast to her husband's political efforts, Raisa had focused on campaigning for children's charities. In 1997 she founded a sub-division of the Gorbachev Foundation known as Raisa Maksimovna's Club to focus on improving women's welfare in Russia. The Foundation had initially been housed in the former Social Science Institute building, but Yeltsin introduced limits to the number of rooms it could use there; the American philanthropist Ted Turner then donated over $1 million to enable the foundation to build new premises on the Leningradsky Prospekt. In 1999, Gorbachev made his first visit to Australia, where he gave a speech to the country's parliament. Shortly after, in July, Raisa was diagnosed with leukemia. With the assistance of German Chancellor Gerhard Schröder, she was transferred to a cancer center in Münster, Germany, where she underwent chemotherapy. In September she fell into a coma and died. After Raisa's death, Gorbachev's daughter Irina and his two granddaughters moved into his Moscow home to live with him. When questioned by journalists, he said that he would never remarry.
In December 1999, Yeltsin resigned and was succeeded by his deputy, Vladimir Putin, who then won the March 2000 presidential election. Gorbachev attended Putin's inauguration ceremony in May, the first time he had entered the Kremlin since 1991.
Gorbachev initially welcomed Putin's rise, seeing him as an anti-Yeltsin figure. Although he spoke out against some of the Putin government's actions, Gorbachev also had praise for the new regime; in 2002 he said that "I've been in the same skin. That's what allows me to say what [Putin's] done is in the interest of the majority". At the time, he believed Putin to be a committed democrat who nevertheless had to use "a certain dose of authoritarianism" to stabilize the economy and rebuild the state after the Yeltsin era. At Putin's request, Gorbachev became co-chair of the "Petersburg Dialogue" project between high-ranking Russians and Germans.
In 2000, Gorbachev helped form the Russian United Social Democratic Party. In June 2002 he participated in a meeting with Putin, who praised the venture, suggesting that a center-left party could be good for Russia and that he would be open to working with it. In 2003, Gorbachev's party merged with the Social Democratic Party to form the Social Democratic Party of Russia, which faced much internal division and failed to gain traction with voters. Gorbachev resigned as party leader in May 2004 following a disagreement with the party's chairman over the direction taken in the 2003 election campaign. The party was banned in 2007 by the Supreme Court of the Russian Federation for failing to establish local offices with at least 500 members in the majority of Russian regions, as Russian law requires for a political organization to be registered as a party. Later that year, Gorbachev founded a new movement, the Union of Social Democrats. Stating that it would not contest the forthcoming elections, Gorbachev declared: "We are fighting for power, but only for power over people's minds".
Gorbachev was critical of U.S. hostility to Putin, arguing that the U.S. government "doesn't want Russia to rise" again as a global power and wants "to continue as the sole superpower in charge of the world". More broadly, Gorbachev was critical of U.S. policy following the Cold War, arguing that the West had attempted to "turn [Russia] into some kind of backwater". He rejected the idea – expressed by Bush – that the U.S. had "won" the Cold War, arguing that both sides had cooperated to end the conflict. He claimed that since the fall of the Soviet Union, the U.S., rather than cooperating with Russia, had conspired to build a "new empire headed by themselves". He was critical of how the U.S. had expanded NATO right up to Russia's borders despite earlier assurances that it would not do so, citing this as evidence that the U.S. government could not be trusted. He spoke out against the 1999 NATO bombing of Yugoslavia, which lacked UN backing, as well as against the 2003 U.S.-led invasion of Iraq. In June 2004 Gorbachev nevertheless attended Reagan's state funeral, and in 2007 visited New Orleans to see the damage caused by Hurricane Katrina.
Barred by the constitution from serving more than two consecutive terms as president, Putin stood down in 2008 and was succeeded by his Prime Minister, Dmitry Medvedev, who reached out to Gorbachev in ways that Putin had not. In September 2008, Gorbachev and business oligarch Alexander Lebedev announced they would form the Independent Democratic Party of Russia, and in May 2009 Gorbachev said its launch was imminent. After the outbreak of the 2008 South Ossetia war between Russia and South Ossetian separatists on one side and Georgia on the other, Gorbachev spoke out against U.S. support for Georgian President Mikheil Saakashvili and against U.S. moves to bring the Caucasus into its sphere of national interest. Gorbachev nevertheless remained critical of Russia's government, denouncing the 2011 parliamentary elections as rigged in favor of the governing party, United Russia, and calling for them to be re-held. After protests broke out in Moscow over the election, Gorbachev praised the protesters.
In 2009 Gorbachev released "Songs for Raisa", an album of Russian romantic ballads, sung by him and accompanied by musician Andrei Makarevich, to raise money for a charity devoted to his late wife. That year he also met with U.S. President Barack Obama in efforts to "reset" strained U.S.-Russian relations, and attended an event in Berlin commemorating the twentieth anniversary of the fall of the Berlin Wall.
In 2011, an eightieth birthday gala for him was held at London's Royal Albert Hall, featuring tributes from Shimon Peres, Lech Wałęsa, Michel Rocard, and Arnold Schwarzenegger. Proceeds from the event went to the Raisa Gorbachev Foundation. That year, Medvedev awarded him the Order of St Andrew the Apostle the First-Called.
In 2012, Putin announced that he was standing again as president, something Gorbachev was critical of. He complained that Putin's new measures had "tightened the screws" on Russia and that the president was trying to "completely subordinate society", adding that United Russia now "embodied the worst bureaucratic features of the Soviet Communist party".
Gorbachev was in increasingly poor health; in 2011 he had a spinal operation and in 2014 oral surgery. In 2015, Gorbachev ceased his extensive international travel. He continued to speak out on issues affecting Russia and the world. In 2014, he defended the Crimean status referendum that led to Russia's annexation of Crimea. He noted that while Crimea was transferred from Russia to Ukraine in 1954, when both were part of the Soviet Union, the Crimean people had not been asked at the time, whereas in the 2014 referendum they had. After sanctions were placed on Russia as a result of the annexation, Gorbachev spoke out against them. His comments led to Ukraine banning him from entering the country for five years.
At a November 2014 event marking 25 years since the fall of the Berlin Wall, Gorbachev warned that the ongoing War in Donbass had brought the world to the brink of a new cold war, and he accused Western powers, particularly the U.S., of adopting an attitude of "triumphalism" towards Russia. In July 2016, Gorbachev criticized NATO for deploying more troops to Eastern Europe amid escalating tensions between the military alliance and Russia. In June 2018, he welcomed the 2018 Russia–United States summit between Putin and U.S. President Donald Trump, although in October criticized Trump's threat to withdraw from the 1987 Intermediate-Range Nuclear Forces Treaty, saying the move "is not the work of a great mind." He added: "all agreements aimed at nuclear disarmament and the limitation of nuclear weapons must be preserved for the sake of life on Earth."
According to his university friend Zdeněk Mlynář, in the early 1950s "Gorbachev, like everyone else at the time, was a Stalinist." Mlynář noted, however, that unlike most other Soviet students, Gorbachev did not view Marxism simply as "a collection of axioms to be committed to memory." Biographers Doder and Branson related that after Stalin's death, Gorbachev's "ideology would never be doctrinal again", but noted that he remained "a true believer" in the Soviet system. Doder and Branson noted that at the Twenty-Seventh Party Congress in 1986, Gorbachev was seen to be an orthodox Marxist-Leninist; that year, the biographer Zhores Medvedev stated that "Gorbachev is neither a liberal nor a bold reformist".
By the mid-1980s, when Gorbachev took power, many analysts were arguing that the Soviet Union was declining to the status of a Third World country.
In this context, Gorbachev argued that the Communist Party had to adapt and engage in creative thinking, much as Lenin had creatively interpreted and adapted the writings of Karl Marx and Friedrich Engels to the situation of early 20th-century Russia. For instance, he thought that rhetoric about global revolution and overthrowing the bourgeoisie—which had been integral to Leninist politics—had become too dangerous in an era where nuclear warfare could obliterate humanity. He began to move away from the Marxist-Leninist belief in class struggle as the engine of political change, instead viewing politics as a way of co-ordinating the interests of all classes. However, as Gooding noted, the changes that Gorbachev proposed were "expressed wholly within the terms of Marxist-Leninist ideology".
According to Doder and Branson, Gorbachev also wanted to "dismantle the hierarchical military society at home and abandon the grand-style, costly imperialism abroad". However, Jonathan Steele argued that Gorbachev failed to appreciate why the Baltic nations wanted independence and that "at heart he was, and remains, a Russian imperialist." Gooding thought that Gorbachev was "committed to democracy", something marking him out as different from his predecessors. Gooding also suggested that when in power, Gorbachev came to see socialism not as a stage on the path to communism, but as a destination in itself.
Gorbachev's political outlook was shaped by the 23 years he served as a party official in Stavropol. Doder and Branson thought that throughout most of his political career prior to becoming General Secretary, "his publicly expressed views almost certainly reflected a politician's understanding of what should be said, rather than his personal philosophy. Otherwise he could not have survived politically."
Like many Russians, Gorbachev sometimes thought of the Soviet Union as being largely synonymous with Russia and in various speeches described it as "Russia"; in one incident he had to correct himself after calling the USSR "Russia" while giving a speech in Kiev, Ukraine.
McCauley noted that perestroika was "an elusive concept", one which "evolved and eventually meant something radically different over time." McCauley stated that the concept originally referred to "radical reform of the economic and political system" as part of Gorbachev's attempt to motivate the labor force and make management more effective. It was only after initial measures to achieve this proved unsuccessful that Gorbachev began to consider market mechanisms and co-operatives, albeit with the state sector remaining dominant. The political scientist John Gooding suggested that had the perestroika reforms succeeded, the Soviet Union would have "exchanged totalitarian controls for milder authoritarian ones" although not become "democratic in the Western sense". With perestroika, Gorbachev had wanted to improve the existing Marxist-Leninist system but ultimately ended up destroying it. In this, he brought an end to state socialism in the Soviet Union and paved the way for a transition to liberal democracy.
Taubman nevertheless thought Gorbachev remained a socialist. He described Gorbachev as "a true believer—not in the Soviet system as it functioned (or didn't) in 1985 but in its potential to live up to what he deemed its original ideals." He added that "until the end, Gorbachev reiterated his belief in socialism, insisting that it wasn't worthy of the name unless it was truly democratic."
As Soviet leader, Gorbachev believed in incremental reform rather than a radical transformation; he later referred to this as a "revolution by evolutionary means". Doder and Branson noted that over the course of the 1980s, his thought underwent a "radical evolution". Taubman noted that by 1989 or 1990, Gorbachev had transformed into a social democrat. McCauley suggested that by at least June 1991 Gorbachev was a "post-Leninist", having "liberated himself" from Marxism-Leninism. After the fall of the Soviet Union, the newly formed Communist Party of the Russian Federation would have nothing to do with him. However, in 2006, he expressed his continued belief in Lenin's ideas: "I trusted him then and I still do". He claimed that "the essence of Lenin" was a desire to develop "the living creative activity of the masses". Taubman believed that Gorbachev identified with Lenin on a psychological level.
Gorbachev had a distinctive port-wine stain on the top of his head. By 1955 his hair was thinning, and by the late 1960s he was bald. Throughout the 1960s he struggled against obesity and dieted to control the problem; Doder and Branson characterized him as "stocky but not fat". He spoke with a southern Russian accent and was known to sing both folk and pop songs.
Throughout his life, he tried to dress fashionably. Having an aversion to hard liquor, he drank sparingly and did not smoke. He was protective of his private life and avoided inviting people to his home.
Gorbachev cherished his wife, who in turn was extremely protective of him.
He was an involved parent and grandparent. He sent his daughter, his only child, to a local school in Stavropol rather than to a school set aside for the children of party elites. Unlike many of his contemporaries in the Soviet administration, he was not a womanizer and was known for treating women respectfully.
Gorbachev was baptized into the Russian Orthodox Church, and his grandparents had been practicing Christians while he was growing up. In 2008, there was some press speculation that he was a practicing Christian after he visited the tomb of St Francis of Assisi, prompting him to clarify publicly that he was an atheist. From his university days, Gorbachev considered himself an intellectual; Doder and Branson thought that "his intellectualism was slightly self-conscious", noting that unlike most of the Russian intelligentsia, Gorbachev was not closely connected "to the world of science, culture, the arts, or education". When living in Stavropol, he and his wife collected hundreds of books. Among his favorite authors were Arthur Miller, Dostoevsky, and Chingiz Aitmatov, and he also enjoyed reading detective fiction. He enjoyed going for walks, loved natural environments, and was a fan of association football. He favored small gatherings where the assembled discussed topics like art and philosophy rather than the large, alcohol-fueled parties common among Soviet officials.
Gorbachev's university friend, Mlynář, described him as "loyal and personally honest". He was self-confident, polite, and tactful; he had a happy and optimistic temperament. He used self-deprecating humor, and sometimes profanities, and often referred to himself in the third person. He was a skilled manager and had a good memory. A hard worker, even a workaholic, as General Secretary he would rise at 7 or 8 in the morning and not go to bed until 1 or 2 at night. Taubman called him "a remarkably decent man" and thought Gorbachev to have "high moral standards".
Zhores Medvedev thought him a talented orator, in 1986 stating that "Gorbachev is probably the best speaker there has been in the top Party echelons" since Leon Trotsky. Medvedev also considered Gorbachev "a charismatic leader", something Brezhnev, Andropov, and Chernenko had not been. Doder and Branson called him "a charmer capable of intellectually seducing doubters, always trying to co-opt them, or at least blunt the edge of their criticism". McCauley thought Gorbachev displayed "great tactical skill" in maneuvering successfully between hardline Marxist-Leninists and liberalizers for most of his time as leader, although added that he was "much more skilled at tactical, short-term policy than strategic, long-term thinking", in part because he was "given to making policy on the hoof".
Doder and Branson thought Gorbachev "a Russian to the core, intensely patriotic as only people living in the border regions can be."
Taubman also noted that the former Soviet leader had a "sense of self-importance and self-righteousness" as well as a "need for attention and admiration" which grated on some of his colleagues. He was sensitive to personal criticism and easily took offense. Colleagues were often frustrated that he would leave tasks unfinished, and sometimes also felt underappreciated and discarded by him. Biographers Doder and Branson thought that Gorbachev was "a puritan" with "a proclivity for order in his personal life". Taubman noted that he was "capable of blowing up for calculated effect". He also thought that by 1990, when his domestic popularity was waning, Gorbachev had become "psychologically dependent on being lionized abroad", a trait for which he was criticized in the Soviet Union. McCauley was of the view that "one of his weaknesses was an inability to foresee the consequences of his actions".
Opinions on Gorbachev are deeply divided. Many, particularly in Western countries, see him as the greatest statesman of the second half of the twentieth century. U.S. press referred to the presence of "Gorbymania" in Western countries during the late 1980s and early 1990s, as represented by large crowds that turned out to greet his visits, with "Time" magazine naming him its "Man of the Decade" in the 1980s. In the Soviet Union itself, opinion polls indicated that Gorbachev was the most popular politician from 1985 through to late 1989. For his domestic supporters, Gorbachev was seen as a reformer trying to modernise the Soviet Union, and to build a form of democratic socialism. Taubman characterized Gorbachev as "a visionary who changed his country and the world—though neither as much as he wished." Taubman regarded Gorbachev as being "exceptional... as a Russian ruler and a world statesman", highlighting that he avoided the "traditional, authoritarian, anti-Western norm" of both predecessors like Brezhnev and successors like Putin. McCauley thought that in allowing the Soviet Union to move away from Marxism-Leninism, Gorbachev gave the Soviet people "something precious, the right to think and manage their lives for themselves", with all the uncertainty and risk that that entailed.
Gorbachev's negotiations with the U.S. helped bring an end to the Cold War and reduced the threat of nuclear conflict. His decision to allow the Eastern Bloc to break apart prevented significant bloodshed in Central and Eastern Europe; as Taubman noted, this meant that the "Soviet Empire" ended in a far more peaceful manner than the British Empire several decades before. Similarly, under Gorbachev, the Soviet Union broke apart without falling into civil war, as happened during the breakup of Yugoslavia at the same time. McCauley noted that in facilitating the merger of East and West Germany, Gorbachev was "a co-father of German unification", assuring him long-term popularity among the German people.
He also faced domestic criticism during his rule. During his career, Gorbachev attracted the admiration of some colleagues, but others came to hate him. Across society more broadly, his inability to reverse the decline in the Soviet economy brought discontent. Liberals thought he lacked the radicalism to really break from Marxism-Leninism and establish a free market liberal democracy. Conversely, many of his Communist Party critics thought his reforms were reckless and threatened the survival of Soviet socialism; some believed he should have followed the example of China's Communist Party and restricted himself to economic rather than governmental reforms. Many Russians saw his emphasis on persuasion rather than force as a sign of weakness.
For much of the Communist Party nomenklatura, the Soviet Union's dissolution was disastrous as it resulted in their loss of power. In Russia, he is widely despised for his role in the collapse of the Soviet Union and the ensuing economic collapse. General Varennikov, one of the orchestrators of the 1991 coup attempt against Gorbachev, for instance, called him "a renegade and traitor to your own people". Many of his critics attacked him for allowing the Marxist-Leninist governments across Eastern Europe to fall, and for allowing a reunited Germany to join NATO, something they deem contrary to Russia's national interest.
The historian Mark Galeotti stressed the connection between Gorbachev and his predecessor, Andropov. In Galeotti's view, Andropov was "the godfather of the Gorbachev revolution", because—as a former head of the KGB—he was able to put forward the case for reform without having his loyalty to the Soviet cause questioned, an approach that Gorbachev was able to build on and follow through with. According to McCauley, Gorbachev "set reforms in motion without understanding where they could lead. Never in his worst nightmare could he have imagined that perestroika would lead to the destruction of the Soviet Union".
In 1988, India awarded Gorbachev the Indira Gandhi Prize for Peace, Disarmament and Development; in 1990 he was given the Nobel Peace Prize for "his leading role in the peace process which today characterizes important parts of the international community". Out of office he continued to receive honors. In 1992 he was the first recipient of the Ronald Reagan Freedom Award, and in 1994 was given the Grawemeyer Award by the University of Louisville, Kentucky. In 1995 he was awarded the Grand-Cross of the Order of Liberty by Portuguese President Mário Soares, and in 1998 the Freedom Award from the National Civil Rights Museum in Memphis, Tennessee. In 2002, Gorbachev received the Freedom of the City of Dublin from Dublin City Council.
In 2002, Gorbachev was awarded the Charles V Prize by the European Academy of Yuste Foundation. Gorbachev, together with Bill Clinton and Sophia Loren, was awarded the 2004 Grammy Award for Best Spoken Word Album for Children for their recording of Sergei Prokofiev's "Peter and the Wolf" for Pentatone. In 2005, Gorbachev was awarded the Point Alpha Prize for his role in supporting German reunification.
|
https://en.wikipedia.org/wiki?curid=20979
|
Masada
Masada (Hebrew: "metzada", "fortress") is an ancient fortification in the Southern District of Israel situated on top of an isolated rock plateau, akin to a mesa. It is located on the eastern edge of the Judaean Desert, overlooking the Dead Sea east of Arad.
Herod the Great built two palaces for himself on the mountain and fortified Masada between 37 and 31 BCE.
According to Josephus, the siege of Masada by Roman troops from 73 to 74 CE, at the end of the First Jewish–Roman War, ended in the mass suicide of the 960 Sicarii rebels who were hiding there.
Masada is one of Israel's most popular tourist attractions.
The cliff of Masada is, geologically speaking, a horst. As the plateau ends abruptly in cliffs falling steeply to the east and west, the natural approaches to the fortress are very difficult to navigate. The top of the mesa-like plateau is flat and rhomboid-shaped. Herod built a high casemate wall around the plateau, reinforced by many towers. The fortress contained storehouses, barracks, an armory, a palace, and cisterns that were refilled by rainwater. Three narrow, winding paths led from below up to fortified gates.
Almost all historical information about Masada comes from the first-century Romano-Jewish historian Josephus.
Josephus writes that the site was first fortified by the Hasmonean ruler Alexander Jannaeus in the first century BCE. However, no Hasmonean-period building remains have so far been identified in archaeological excavations.
Josephus further writes that Herod the Great captured it in the power struggle that followed the death of his father Antipater. It survived a siege by the last Hasmonean king, Antigonus II Mattathias, who ruled with Parthian support.
According to Josephus, between 37 and 31 BCE, Herod the Great built a large fortress on the plateau as a refuge for himself in the event of a revolt, and erected there two palaces.
In 66 CE, a group of Jewish rebels, the Sicarii, overcame the Roman garrison of Masada with the aid of a ruse. After the destruction of the Second Temple in 70 CE, additional members of the Sicarii fled Jerusalem and settled on the mountaintop after slaughtering the Roman garrison. According to Josephus, the Sicarii were an extremist Jewish splinter group antagonistic to a larger grouping of Jews referred to as the Zealots, who carried the main burden of the rebellion. Josephus said that the Sicarii raided nearby Jewish villages including Ein Gedi, where they massacred 700 women and children.
In 73 CE, the Roman governor of Iudaea, Lucius Flavius Silva, headed the Roman legion X "Fretensis" and laid siege to Masada. The Roman legion surrounded Masada, built a circumvallation wall and then a siege ramp against the western face of the plateau. According to Dan Gill, geological investigations in the early 1990s confirmed earlier observations that the 114 m (375 ft) high assault ramp consisted mostly of a natural spur of bedrock. The ramp was completed in the spring of 73, after probably two to three months of siege, allowing the Romans to finally breach the wall of the fortress with a battering ram on April 16. The Romans employed the X Legion and a number of auxiliary units and Jewish prisoners of war, totaling some 15,000 (of whom an estimated 8,000 to 9,000 were fighting men), in crushing Jewish resistance at Masada. A giant siege tower with a battering ram was constructed and moved laboriously up the completed ramp. According to Josephus, when Roman troops entered the fortress, they discovered that its defenders had set all the buildings but the food storerooms ablaze and committed mass suicide or killed each other, 960 men, women, and children in total. Josephus wrote of two stirring speeches that the Sicarii leader had made to convince his men to kill themselves. Only two women and five children were found alive.
Josephus presumably based his narration upon the field commentaries of the Roman commanders that were accessible to him.
Significant discrepancies exist between archaeological findings and Josephus' writings. Josephus mentions only one of the two palaces that have been excavated, refers only to one fire, while many buildings show fire damage, and claims that 960 people were killed, while the remains of only 28 bodies at the very most have been found. Some of the other details that Josephus gives were correct – for instance, he describes the baths that were built there, the fact that the floors in some of the buildings "were paved with stones of several colours", and that many pits were cut into the living rock to serve as cisterns. Josephus must be referring to the sort of mosaics that Yadin found still partially intact on some of the floors.
The year of the siege of Masada may have been 73 or 74 CE.
Masada was last occupied during the Byzantine period, when a small church was established at the site. The church was part of a monastic settlement identified with the monastery of Marda known from hagiographical literature. This identification is generally accepted by researchers. The Aramaic common noun "marda", "fortress", corresponds in meaning to the Greek name of another desert monastery of the time, Kastellion, and is used to describe that site in the "vita" (biography) of St Sabbas, but it is only used as a proper name for the monastery at Masada, as can be seen from the "vita" of St Euthymius.
An almost inaccessible cave, dubbed Yoram Cave, located on the sheer southern cliff face 100 m below the plateau, has been found to contain numerous plant remains, among them 6,000-year-old barley seeds in such a good state of preservation that their genome could be sequenced. This was the first Chalcolithic plant genome to be sequenced, and it remains the oldest plant genome sequenced so far. The result helped determine that the earliest domestication of barley, dated elsewhere in the Fertile Crescent to 10,000 years ago, happened further north up the Jordan Rift Valley, namely in the Upper Jordan Valley in northern Israel. The Yoram Cave seeds were found to be fairly different from the wild variety, evidence of an already advanced process of domestication, but very similar to the types of barley still cultivated in the region – an indication of remarkable constancy. Considering the difficulty of reaching the cave, whose mouth opens some 4 m above the exposed access path, the researchers have speculated that it was a place of short-term refuge for Chalcolithic people fleeing an unknown catastrophe.
The site of Masada was identified in 1838 by the Americans Edward Robinson and Eli Smith, and in 1842 the American missionary Samuel W. Wolcott and the English painter W. Tipping became the first people in modern times to climb it. After visiting the site several times in the 1930s and 1940s, Shmarya Guttman conducted an initial probe excavation of the site in 1959.
Masada was extensively excavated between 1963 and 1965 by an expedition led by Israeli archaeologist and former military Chief-of-Staff Yigael Yadin.
Due to its remoteness from human habitation and its arid environment, the site remained largely untouched by humans or nature for two millennia.
Many of the ancient buildings have been restored from their remains, as have the wall paintings of Herod's two main palaces, and the Roman-style bathhouses that he built. The synagogue, storehouses, and houses of the Jewish rebels have also been identified and restored.
Water cisterns two-thirds of the way up the cliff drain the nearby wadis by an elaborate system of channels, which explains how the rebels managed to conserve enough water for such a long time.
The Roman attack ramp still stands on the western side and can be climbed on foot. The circumvallation wall that the Romans built around Masada can be seen, together with eight Roman siege camps just outside this wall. The Roman siege installations as a whole, especially the attack ramp, are the best preserved of their kind, and the reason for declaring Masada a UNESCO World Heritage site.
Inside the synagogue, an ostracon bearing the inscription "me'aser cohen" (tithe for the priest) was found, as were fragments of two scrolls: parts of Deuteronomy and of the Book of Ezekiel, including the vision of the "dry bones", found hidden in pits dug under the floor of a small room built inside the synagogue. In other loci, fragments were found of the books of Genesis, Leviticus, Psalms, and Sirach, as well as of the Songs of the Sabbath Sacrifice.
In the area in front of the Northern Palace, 11 small ostraca were recovered, each bearing a single name. One reads "ben Ya'ir" and could be short for Eleazar ben Ya'ir, the commander of the fortress. The other 10 names may be those of the men chosen by lot to kill the others and then themselves, as recounted by Josephus.
The remains of a maximum of 28 people were unearthed at Masada, possibly 29 including a foetus. The skeletal remains of 25 individuals were found in a cave outside and below the southern wall. The remains of another two males and a female were found in the bathhouse of the Northern Palace.
Of the bathhouse remains, the males were variously and unconvincingly assessed to have been aged either 40 and 20–22, or 22 and 11–12, but the dental remains of both suggest ages of 16–18; the female was described as 17–18 years old. However, the skeletal remains of the males were incomplete, and only the hair (a full head of hair with braids) but no bones of the female were found. Forensic analysis showed the hair had been cut from the woman's head with a sharp instrument while she was still alive, a practice prescribed for captured women in the Bible and the 2nd-century BCE Temple Scroll, while the braids indicate that she was married. Based on the evidence, anthropologist Joe Zias and forensic scientist Azriel Gorski believe the remains may have been Romans whom the rebels captured when they seized the garrison.
As to the sparse remains of 24 people found in the southern cave at the base of the cliff, excavator Yigael Yadin was unsure of their ethnicity; however, the rabbinical establishment concluded that they were remains of the Jewish defenders, and in July 1969 they were reburied as Jews in a state ceremony. Carbon dating of textiles found with the remains in the cave indicates they are contemporaneous with the period of the revolt, and pig bones were also present (which occasionally occur in Roman burials due to pig sacrifices); this indicates that the remains may belong to non-Jewish Roman soldiers or civilians who occupied the site before or after the siege. Zias also questioned whether as many as 24 individuals were present, since only 4% of the bones expected from that many bodies was recovered.
A 2,000-year-old Judean date palm seed discovered during archaeological excavations in the early 1960s was successfully germinated into a date plant, popularly known as "Methuselah" after the longest-living figure in the Hebrew Bible. At the time it was the oldest known germination, remaining so until a new record was set in 2012; as of September 2016, it remained the oldest germination directly from a seed.
The remnants of a Byzantine church dating from the fifth and sixth centuries have been excavated on the plateau.
Yadin's team could detect no architectural remains of the Hasmonean period, the only findings firmly dated to this period being the numerous coins of Alexander Jannaeus. Researchers have speculated that the southwestern block of the Western Palace and the auxiliary buildings east and south of it could be Hasmonean, relying on similarities to the Twin Palaces at Jericho. However, excavators have made no archaeological discovery that supports this presumption.
According to Shaye Cohen, archaeology shows that Josephus' account is "incomplete and inaccurate". Josephus only writes of one palace; archaeology reveals two. His description of the northern palace contains several inaccuracies, and he gives exaggerated figures for the height of the walls and towers. Josephus' account is contradicted by the "skeletons in the cave, and the numerous separate fires".
According to Kenneth Atkinson, no "archaeological evidence that Masada's defenders committed mass suicide" exists.
Masada was declared a UNESCO World Heritage Site in 2001.
In 2007, the Masada Museum in Memory of Yigael Yadin opened at the site, in which archeological findings are displayed in a theatrical setting. Many of the artifacts exhibited were unearthed by Yadin and his archaeological team from the Hebrew University of Jerusalem during the 1960s.
The archaeological site is situated in the Masada National Park, and the park requires an entrance fee, even for those arriving on foot. There are two hiking paths, both very steep.
Hikers frequently start an hour before sunrise, when the park opens, to avoid the midday heat, which can be extreme in the summer. In fact, the hiking paths are often closed during the day in the summer because of the heat. Visitors are encouraged to bring drinking water for the hike up, as water is only available at the top.
Alternatively, for a higher fee, visitors can take a cable car (the Masada cableway, which opens at 8 am) to the top of the mesa.
A visitors' center and the museum are at the base of the cable car.
A light-and-sound show is presented on some summer nights on the western side of the mountain (access by car from the Arad road or by foot, down the mountain via the Roman Ramp path).
In May 2015, a 20-year-old American tourist, Briana McHam, fell 25 feet on Masada's Snake Path after she became separated from her Florida State University tour group and went off the marked trail. Following an hour-and-a-half search, Magen David Adom personnel found her unresponsive and suffering from dehydration. After failed resuscitation attempts, she was declared dead at the scene.
An example of Herodian architecture, Masada was the first site Herod the Great fortified after he gained control of his kingdom.
The first of three building phases completed by Herod began in 35 BCE. During the first phase the Western Palace was built, along with three smaller palaces, a storeroom, and army barracks. Three columbarium towers and a swimming pool at the south end of the site were also completed during this building phase.
The original center of the Western Palace was square and was accessed through an open courtyard on the northwest corner of the building. The courtyard was the central room of the Western Palace and directed visitors into a portico, which served as a reception area. Visitors were then led to a throne room. Off the throne room was a corridor used by the king, with a private dressing room, which also had another entranceway connecting to the courtyard through the mosaic room. The mosaic room contained steps that led to a second floor with separate bedrooms for the king and queen.
The second building phase, in 25 BCE, included an addition to the Western Palace, a large storage complex for food, and the Northern Palace. The Northern Palace, one of Herod's more lavish palace-fortresses, was built on the hilltop at the north side of Masada and continues two levels down over the edge of the cliffs. The upper terrace of the Northern Palace included living quarters for the king and a semicircular portico to provide a view of the area. A stairway on the west side led down to the middle terrace, which held a decorative circular reception hall. The lower terrace was also for receptions and banquets. It was enclosed on all four sides with porticos and included a Roman bathhouse.
In 15 BCE, during the third and final building phase, the entire site of Masada – except for the Northern Palace – was enclosed by a casemate wall, which consisted of a double wall with a space between that was divided into rooms by perpendicular walls; these were used as living chambers for the soldiers and as extra storage space. The Western Palace was also extended for a third time to include more rooms for the servants and their duties.
The Masada story was the inspiration for the "Masada plan" devised by the British during the Mandate era. The plan was to man defensive positions on Mount Carmel with Palmach fighters, to stop Erwin Rommel's expected drive through the region in 1942. The plan was abandoned following Rommel's defeat at El Alamein.
The chief of staff of the Israel Defense Forces (IDF), Moshe Dayan, initiated the practice of holding the swearing-in ceremony of Israeli Armoured Corps soldiers who had completed their "tironut" (IDF basic training) on top of Masada. The ceremony ended with the declaration: "Masada shall not fall again." The soldiers climbed the Snake Path at night and were sworn in with torches lighting the background. These ceremonies are now also held at various other locations, including the Armoured Corps Memorial at Latrun, the Western Wall and Ammunition Hill in Jerusalem, Akko prison, and training bases.
|
https://en.wikipedia.org/wiki?curid=20985
|
Marvel Universe
The Marvel Universe is a fictional universe where the stories in most American comic book titles and other media published by Marvel Comics take place. Super-teams such as the Avengers, the X-Men, the Fantastic Four, the Guardians of the Galaxy, the Defenders, the Midnight Sons, and many Marvel superheroes live in this universe, including characters such as Spider-Man, Iron Man, the Hulk, Thor, Captain America, Black Widow, Wolverine, Captain Marvel, Black Panther, Doctor Strange, Ghost Rider, Blade, the Silver Surfer, Adam Warlock, Nova, Daredevil, Iron Fist, the Moon Knight, the Punisher, Deadpool and numerous others. It also contains well-known supervillains such as Doctor Doom, Magneto, Thanos, Loki, Green Goblin, Kingpin, Red Skull, Ultron, Doctor Octopus, the Mandarin, MODOK, Carnage, Apocalypse, Hela, Ronan the Accuser, Kang, Mephisto, Dormammu, Annihilus and Galactus.
The Marvel Universe is further depicted as existing within a "multiverse" consisting of thousands of separate universes, all of which are the creations of Marvel Comics and all of which are, in a sense, "Marvel universes". In this context, "Marvel Universe" is taken to refer to the mainstream Marvel continuity, which is known as Earth-616 or currently as Earth Prime.
Though the concept of a shared universe was not new or unique to comic books in 1961, writer/editor Stan Lee, together with several artists including Jack Kirby and Steve Ditko, created a series of titles where events in one book would have repercussions in another title and serialized stories would show characters' growth and change. Headline characters in one title would make cameos or guest appearances in other books. Eventually, many of the leading heroes assembled into a team known as the Avengers. This was not the first time that Marvel's characters had interacted with one another—Namor the Sub-Mariner and the original Human Torch had been rivals when Marvel was Timely Comics (Marvel Vault)—but it was the first time that the comic book publisher's characters seemed to share a world. The Marvel Universe was also notable for setting its central titles in New York City; by contrast, many DC heroes live in fictional cities. Care was taken to portray the city and the world as realistically as possible, with the presence of superhumans affecting the common citizens in various ways.
Over time, a few Marvel Comics writers lobbied Marvel editors to incorporate the idea of a Multiverse resembling DC's parallel worlds; this plot device allows one to create several fictional universes which normally do not overlap. What happens on Earth in the main Marvel Universe would normally not affect what happens on a parallel Earth in another Marvel-created universe. However, writers would have the creative ability to write stories in which people from one such universe would visit this alternative universe.
In 1982, Marvel published the miniseries "Contest of Champions", in which all of the major heroes in existence at the time were gathered together to deal with one threat. This was Marvel's first miniseries. Each issue contained biographical information on many major costumed characters; these biographies were a precursor to Marvel's series of reference material, "The Official Handbook of the Marvel Universe", which followed shortly on the heels of "Contest of Champions".
The Marvel Universe is strongly based on the real world. Earth in the Marvel Universe has all the features of the real one: same countries, same personalities (politicians, movie stars, etc.), same historical events (such as World War II), and so on; however, it also contains many other fictional elements: countries such as Wakanda and Latveria (very small nations) and organizations like the espionage agency S.H.I.E.L.D. and its enemies, HYDRA and A.I.M. In 2009 Marvel officially described its world's geography in a two-part miniseries, the "Marvel Atlas".
Most importantly, the Marvel Universe also incorporates examples of almost all major science fiction and fantasy concepts, with writers adding more continuously. Aliens, gods, magic, cosmic powers and extremely advanced human-developed technology all exist prominently in the Marvel Universe. (A universe incorporating "all" these types of fantastic elements is fairly rare; another example is the DC Universe.) Monsters also play a prominent role, some with East Asian origins rooted in magical incantation and outlandish sorcery; one such case is Fin Fang Foom, who arose from the ashes of tantric magic. Thanks to these extra elements, Earth in the Marvel Universe is home to a large number of superheroes and supervillains, who have gained their powers by any of these means.
Little time passes in the Marvel Universe compared to the real world, owing to the serial nature of storytelling, with the stories of certain issues picking up mere seconds after the conclusion of the previous one, while a whole month has passed by in "real time". Marvel's major heroes were created in the 1960s, but the amount of time that has passed between then and now within the universe itself has (after a prolonged period of being identified as about 10 years in the mid-to-late 1990s) most recently been identified as 13 years. Consequently, the settings of some events which were contemporary when written have to be updated every few years to "make sense" in this floating timeline. Thus, the events of previous stories are considered to have happened within a certain number of years before the publishing date of the current issue. For example, Spider-Man's high school graduation was published in "Amazing Spider-Man" #28 (September 1965), his college graduation in "Amazing Spider-Man" #185 (October 1978), and his high school reunion in "Marvel Knights Spider-Man" #7 (December 2004). Because of the floating timeline, where stories refer to real-life historic events, these references are later ignored or rewritten to suit current sensibilities; for instance, the origin of Iron Man was changed in a 2004 storyline to refer to the War on Terror in Afghanistan, whereas the original Iron Man stories had referred to the Vietnam War; similarly, the Punisher's backstory has been updated as well.
Marvel Comics itself exists as a company within the Marvel Universe, and versions of people such as Stan Lee and Jack Kirby have appeared in some of the stories, whereas characters like Steve Rogers (Captain America's alter ego) have worked for Marvel. The Marvel of this reality publishes comics that adapt the actual adventures of the superheroes (except for details not known to the public, like their secret identities); many of these are licensed with the permission of the heroes themselves, who customarily donate their share of profits to charity. Additionally, the DC Comics Universe is also said to exist in the Marvel Universe as one of the many alternative universes, and the reverse may be said concerning the DC Universe. This is one method of explaining the various crossover stories co-published by the two companies.
Pop culture characters such as Dracula and the Frankenstein Monster exist in the Marvel Universe. This is usually justified as a second-hand account of events as told to credited authors Bram Stoker and Mary Shelley, although the general public continues to believe them to be fictional. Robert E. Howard's Conan the Barbarian, Red Sonja, Kull the Conqueror, and Solomon Kane also have real-life existences in the Marvel Universe. The Hyborian Era of Conan and Kull is considered part of Earth-616 pre-recorded history. However, they rarely encounter modern Marvel superhero characters, most likely due to the uncertain legal status of Howard's works before 2006, when they became public domain. As of 2019, Conan the Barbarian, as well as Kull the Conqueror and Solomon Kane, have been firmly integrated, thanks to Marvel regaining the publishing rights to the characters. Other licensed works that have been incorporated into the Marvel Universe include Godzilla, the Transformers, the film "2001: A Space Odyssey" (in the character of Machine Man), Rom the Spaceknight, the Micronauts, and the Shogun Warriors. In most cases, such material is either restricted from use after the license expires or the characters redesigned or renamed to avoid copyright infringement.
Within the fictional history of the Marvel Universe, the tradition of using costumed secret identities to fight or commit evil had long existed, but it came into prominence during the days of the American "Wild West" with heroes such as Carter Slade/the Phantom Rider. During the 20th century, the tradition was reinvigorated by Steve Rogers/Captain America and the Invaders in the 1940s, who fought for the Allies of World War II.
Marvel's most prominent heroes were created during the Silver Age of Comic Books in the 1960s to early 1970s, including Peter Parker/Spider-Man, Tony Stark/Iron Man, Thor, Bruce Banner/the Hulk, Stephen Strange/Doctor Strange, Matt Murdock/Daredevil, Ant-Man and the Wasp (Hank Pym and Janet van Dyne), Natasha Romanoff/the Black Widow, Clint Barton/Hawkeye, Pietro Maximoff/Quicksilver, Wanda Maximoff/the Scarlet Witch, the Vision, Simon Williams/Wonder Man, Hercules, Kevin Plunder/Ka-Zar, Groot, Nick Fury, T'Challa/the Black Panther, Mar-Vell (the first Captain Marvel), Carol Danvers (also known as the first Ms. Marvel, Binary, Warbird, and the current Captain Marvel), Sam Wilson/the Falcon, Dane Whitman/the Black Knight, Norrin Radd/the Silver Surfer, Jane Foster (also known as the second Thor), Warren Worthington III/the Angel-Archangel, Hank McCoy/the Beast, Scott Summers/Cyclops, Robert "Bobby" Drake/the Iceman, Jean Grey (also known as Marvel Girl and Phoenix), Charles Xavier/Professor X, Lorna Dane/Polaris, Alex Summers/Havok, Reed Richards/Mister Fantastic, Susan Storm/the Invisible Woman, Johnny Storm/the Human Torch, Ben Grimm/the Thing, Brunnhilde/the Valkyrie, the Inhumans (composed of Blackagar Boltagon, Medusalith Amaquelin-Boltagon, Crystalia Amaquelin-Boltagon/Crystal, Gorgon, Karnak Mandel-Azur/Karnak the Shatterer, Triton and Lockjaw) and Alexi Shostakov/the Red Guardian.
Other notable heroes from the Bronze Age and Modern Age from the early-to-mid 1970s to the early 1990s include James "Logan" Howlett/Wolverine, Ororo Munroe/Storm, Piotr "Peter" Rasputin/Colossus, Kurt Wagner/Nightcrawler, Sean Cassidy/the Banshee, Luke Cage (also known as Power-Man), Danny Rand/Iron Fist, Misty Knight, Colleen Wing, Barbara "Bobbi" Morse/Mockingbird, the White Tiger (Hector Ayala), Shang-Chi, Greer Grant Nelson/Tigra, Jessica Drew (also known as Spider-Woman), the Ghost Rider (Johnny Blaze), Daimon Hellstrom, Satana Hellstrom, Theodore "Ted" Sallis/the Man-Thing, Eric Brooks/Blade the Vampire-Slayer, Michael Morbius/Morbius the Living Vampire, Howard the Duck, Monica Rambeau (also known as Photon, Pulsar, Spectrum and the second Captain Marvel), Moondragon, Drax the Destroyer, Peter Quill/Star-Lord, Gamora, Rocket Raccoon, Frank Castle/the Punisher, Marc Spector/the Moon Knight, the Eternals (composed of Ikaris, Thena, Ajak, Makkari, Kingo, Phastos, Gilgamesh and Sprite), War Machine, Nova (Richard Rider), Adam Warlock, Power Pack, Elizabeth "Betsy" Braddock/Psylocke, Scott Lang (the second Ant-Man), Felicia Hardy/the Black Cat, Silver Sable, Katherine "Kitty" Pryde (also known as Shadowcat, Ariel, Sprite, Star-Lord and the Red Queen), Emma Frost (also known as the White Queen), Jennifer Walters/the She-Hulk, Tyrone Johnson/Cloak and Tandy Bowen/Dagger, Brian Braddock/Captain Britain, Doreen Green/Squirrel Girl, Elektra Natchios, the New Mutants (composed of Illyana Rasputin/Magik, Xi'an Coy Minh/Karma, Danielle Moonstar/Mirage, Sam Guthrie/Cannonball, Rahne Sinclair/Wolfsbane, Doug Ramsey/Cypher, Warlock and others), the New Warriors, David Haller/Legion, John Proudstar/Warpath-Thunderbird, Anna Marie LeBeau/Rogue and Jubilation Lee/Jubilee.
Some of Marvel's more recent creations from the mid-to-late 1990s, 2000s and 2010s, such as Wade Wilson/Deadpool, Remy LeBeau/Gambit, Nathan Summers/Cable, Neena Thurman/Domino, Clarice Fong/Blink, the Thunderbolts, Yelena Belova (also known as the second Black Widow), the Runaways, the modern Guardians of the Galaxy, the modern Defenders (based on the Netflix MCU version of the same name), Laura Kinney/X-23 (a.k.a. the second Wolverine), Shuri, the Dora Milaje, Daisy Johnson (also known as Quake), Phil Coulson, Melinda May, Bucky Barnes/the Winter Soldier, Maria Hill, Miles Morales (the second Spider-Man of the Ultimate Marvel Universe), Hope van Dyne (also known as the Red Queen and the second Wasp), Cassandra Lang (also known as Stature, Stinger, Ant-Girl and Giant-Girl), the Stepford Cuckoos, Amadeus Cho (also known as the second Hulk), Kamala Khan (also known as the second Ms. Marvel), Kate Bishop (also known as the third Hawkeye), Lunella Lafayette/Moon Girl, America Chavez (also known as the second Miss America), Robbie Reyes (also known as the fourth Ghost Rider), Riri Williams/Ironheart and Spider-Gwen (Gwen Stacy of Earth-65) have become popular characters in their own right. Unlike the DC Universe, few of Marvel's 1940s characters have risen to prominence in modern publications; Captain America is one exception, and to a lesser extent, his contemporary, Namor the Sub-Mariner, primarily because both of these characters were reintroduced to readers and the Marvel Universe during the 1960s.
Prominent teams of superheroes include the Avengers, the X-Men, the Fantastic Four, the Defenders, the Inhumans, S.H.I.E.L.D., the Howling Commandos, the Guardians of the Galaxy, the Runaways, the Midnight Sons and the Thunderbolts. All these groups have varying lineups; the Avengers have included Marvel's major heroes as members at one time or another. The X-Men are a team of mutants led by Professor X and include many of Marvel's most popular characters, such as Wolverine and others. The Fantastic Four are viewed as "Marvel's First Family" of superheroes, usually consisting of Mister Fantastic, the Invisible Woman, the Human Torch and the Thing, as well as siblings Franklin and Valeria Richards. The Defenders were an ad-hoc team usually brought together by Doctor Strange which has included the Hulk, Namor the Sub-Mariner and the Silver Surfer, while the most recent incarnation of the team consists of street-level New York City heroes Daredevil, Jessica Jones, Luke Cage and Iron Fist. The Guardians of the Galaxy include Marvel's cosmic characters such as Star-Lord, Gamora, Drax the Destroyer, Groot and Rocket Raccoon, but the team has also introduced other heroes into the roster such as Kitty Pryde, the Silver Surfer, the Thing and Nova. The Inhumans are a royal family consisting of Black Bolt, Medusa, Crystal, Gorgon, Triton, Karnak the Shatterer and Lockjaw, who rule the city of Attilan. The Runaways are a group of teenagers and a dinosaur consisting of Alex Wilder, Nico Minoru, Karolina Dean, Chase Stein, Molly Hayes, Gert Yorkes and Old Lace, who rebel against their evil parents, known as the Pride. The Midnight Sons consist of supernatural heroes such as Blade, Ghost Rider, Moon Knight, Elsa Bloodstone, Hellstrom, the Werewolf and the Man-Thing. The Thunderbolts' original incarnation consisted of supervillains disguised as superheroes: Citizen V (a.k.a. Helmut Zemo), MACH-IV (a.k.a. the Beetle), Songbird (a.k.a. Screaming Mimi), Moonstone (a.k.a. Meteorite), Techno/the Ogre (a.k.a. the Fixer) and Jolt, while the current incarnation of the team is made up of reformed supervillains working for the government: Deadpool, the Punisher, the Red Hulk, the Winter Soldier and the Ghost. Although teams of supervillains are few and far between, notable examples include the Masters of Evil, the Emissaries of Evil, the Brotherhood of Mutants, the Sinister Six, the Frightful Four, the Lethal Legion, the Legion of the Unliving, the Black Order, the Annihilation Wave, the Starforce and the Cabal.
Most of the superhumans in Marvel's Earth owe their powers to the Celestials, cosmic entities who visited Earth millions of years ago and experimented on humanity's prehistoric ancestors (a process they also carried out on several other planets). This resulted in the creation of two hidden races, the godlike Eternals and the genetically unstable Deviants, in addition to giving some humans an "x-factor" in their genes, which sometimes activates naturally, resulting in sometimes superpowered, sometimes disfigured individuals called mutants. Others require other factors (such as radiation) for their powers to come forth. Depending on the genetic profile, individuals who are exposed to different chemicals or radiation will often suffer death or injury, while others will develop superhuman abilities. Except for psionic abilities, these powers are usually random; rarely do two people have the same set of powers. It is not clear why the Celestials did this, although it is known that they continue to observe humanity's evolution. A Marvel series titled "Earth X" explored one possible reason: that superhumans are meant to protect a Celestial embryo that grows inside Earth against any planetary threats, and have done so for eons. An X-Men villain known as Vargas claims to be a new direction in human evolution, as he was born with superpowers even though his genetic profile indicated he was an ordinary human being. The majority of the public is unaware of what may cause superhuman powers.
Other possible origins for superhuman powers include magic, genetic manipulation and/or bionic implants. Some heroes and villains have no powers at all but depend instead on hand-to-hand combat training or advanced technological equipment. In the Marvel Universe, technology is considerably more advanced than in the real world; this is due to unique individuals of genius-level intelligence, such as Reed Richards (Mister Fantastic) of the Fantastic Four. However, most of the advanced devices (such as powered armor and death rays) are too expensive for the common citizen, and are usually in the hands of government organizations like S.H.I.E.L.D. or powerful criminal organizations like A.I.M. One major company producing these devices is Stark Industries, owned by Tony Stark (Iron Man), but there are others. Advanced technology has also been given to humans by hidden races, aliens, or time travelers like Kang the Conqueror, who is known to have influenced the robotics industry in the past.
In superhumans, the energy required for their superpowers either comes from within, using their own body as a source, or, if the energy demand exceeds what their body is capable of delivering, comes from another source. In most cases, this other source seems to be what is called the universal psionic field (UPF), which they can tap into. Sometimes they are connected to another source, and more rarely they are even a host for it.
Marvel tries to explain most superpowers and their sources "scientifically", usually through fictional science-like concepts.
A degree of paranoid fear of mutants exists, fueled by stories that mutants are a species or even a subspecies of humans ("Homo superior" or "Homo sapiens superior") that is evolving and is meant to replace normal humans. This has caused organizations to form to deal with the problem; they can be divided into three camps: those who seek peaceful coexistence between mutants and normal humans (the X-Men and their affiliated groups), those who seek to control or eliminate humans to give mutants safety or dominance (Magneto and his followers, as well as other mutants such as Apocalypse), and those who seek to regulate or eliminate mutants in favor of humans. The latter often use the robots known as the Sentinels as weapons. Certain groups are regarded as subhuman, like the Morlocks, who lurk beneath New York City and have been discriminated against by the outside world because of their mutant deformities. The Morlocks have recently joined the terrorist organization Gene Nation.
In addition to mutants, Eternals, and Deviants, several other intelligent races have existed secretly on Earth. These include the Inhumans, another genetically unstable race (like the Deviants, but in their case due to their use of a substance called the "Terrigen Mists") that was created by a Kree experiment long ago; the Subterraneans, a race of humanoids adapted to living below the surface, created by the Deviants (some Subterraneans were transformed into the 'Lava Men' by a demon); and "Homo mermanus", a humanoid species of water-breathers that lives in Earth's oceans. Most of these races have advanced technology but remained hidden from humanity until recent times. More variants of humanity can be found in the Savage Land (see Places below). Most of the Savage Land races trace their origin to a group of primitive ape-men who seem to have escaped the Celestial experiments and whose influence is present in all modern "Homo sapiens". Other leftovers from the era when primitive humanoids walked the Earth still exist, such as the radiation-altered Neanderthal known as the Missing Link, an enemy of the Hulk.
The Marvel Universe also contains hundreds of intelligent alien races. Earth has interacted with many of them because a major "hyperspace warp" happens to exist in the Solar System.
The three major space empires are those of the Kree, the Skrulls, and the Shi'ar.
The three are often in direct or indirect conflict, which occasionally involves Earth humans; in particular, the Kree and Skrulls are ancient enemies, and the Kree-Skrull War has involved humans on several occasions.
The Skrulls have also been known to wage a long and consistent war against the Majesdanians, who live on a milky planet named Majesdane. The war started after two Majesdanians, Frank and Leslie Dean of the Pride, were expelled for criminal activities; the two traveled to Earth, where they forestalled a Skrull war against Earth in exchange for giving the Skrulls the location of Majesdane, which was hidden behind the corona of a white dwarf. The war went on for at least 16 years; it ended abruptly after the Skrulls fired a barrage of missiles at Majesdane, which retaliated.
Another prominent alien race is the Watchers, immortal and wise beings who watch over the Marvel Universe and have taken a sacred vow not to intervene in events, though the Watcher assigned to Earth, Uatu, has violated this oath on several occasions.
The Elders of the Universe are ancient aliens who have often had a great impact on many worlds for billions of years, acting alone or as a group. A power called the Power Primordial is channeled through them.
Many other races exist and have formed an "Intergalactic Council" to have their say on matters that affect them all, such as interference from Earth humans in their affairs.
Also abundant in the Marvel Universe are legendary creatures such as gods, demons and vampires. The 'gods' of most polytheistic pantheons are powerful, immortal human-like races residing in other dimensions who visited Earth in ancient times, and became the basis of many legends. However, all of these 'gods' share a common ancestry and connection to Earth due to Gaea, the primeval Elder Goddess that infused her life essence into all living things on Earth. Gaea is known by various names and appearances in other cultures and among the various pantheons, but she is the same being. As a result, she is a member of every polytheistic pantheon of 'gods' worshiped by humans. Besides mythological gods, many deities made up by Marvel writers exist as well, such as the Dark Gods, enemies of the Asgardians. The Dark Gods are a race of 'gods' that have been worshiped by extraterrestrial races. Well-known alien races like the Shi'ar and Skrulls also have beings they worship as 'gods', though little has been revealed about them.
Many persons and beings have falsely claimed to be gods or demons throughout history; in particular, none of those claiming to be major figures from Judeo-Christian beliefs have turned out to be the real article, although several angels have appeared in recent years, as well as an apparently genuine rebellion and expulsion of angels from a higher realm known as Paradise, proving that some form of Heaven and Hell does exist in this universe, seemingly in keeping with common real-world religious belief. Demons, by contrast, are evil magical beings who meddle in the affairs of the universe. Some of the most powerful are Blackheart, Mephisto, Nightmare, Satannish, Thog the Nether-Spawn and Zom. There are also powerful benevolent mystical entities such as the Vishanti, as well as amoral and malevolent entities who are not truly demonic, such as Dormammu and the Octessence, or ones drawing heavily upon the mythologies of H.P. Lovecraft and Robert E. Howard. Some supernatural beings, entities and human characters created by Lovecraft and Howard, who were friends and influenced each other's work, have been adapted by Marvel, including Abdul Alhazred, Conan the Barbarian, Nyarlathotep and Set. Some deities or demonic beings that are original Marvel characters, such as Shuma-Gorath, have been heavily influenced by these mythologies.
Most of the current generation of gods have been revealed to be descendants of the Elder Goddess Gaea. The two most featured pantheons are the Asgardians (of whom Thor is a member) and the Olympians (of whom Hercules is a member). The lords of the various pantheons sometimes gather in groups known as the Council of Godheads or the Council of Skyfathers. The gods were forced to stop meddling with humanity (at least openly) a thousand years ago by the Celestials, and most people today believe them to be fictional. Other pantheons that are still actively worshiped in the real world have been depicted in the Marvel Universe, including those worshiped by the Aboriginal inhabitants of Australia, the gods of Hinduism, the Shinto gods and the gods of Zoroastrianism. These deities are rarely depicted, however. One such appearance generated a good deal of controversy, as the depiction involved a fight between Marvel's incarnation of Thor and the Hindu god Shiva, a battle which Shiva lost. As Shiva is one of the principal deities of the Hindu religion, his defeat offended some followers of Hinduism. The battle was later retconned: it was Indra, the Hindu god of thunder, posing as Shiva, who met defeat. To avoid offending believers of still-active religions, Marvel features such deities only in the background or in very brief cameo appearances.
Marvel's depiction of vampires has been heavily influenced by various interpretations in popular media, such as Bram Stoker's "Dracula". As with many other supernatural creatures, Marvel entwined the origin of vampires with aspects of the mythologies created by Lovecraft and Howard. They were created by magical rites performed by priests of Atlantis before the Great Cataclysm that destroyed much of the world, with Varnae becoming the first vampire. Marvel depicted vampires as frequent antagonists of Howard characters such as Kull and Conan during the Hyborian Age. In recent years, Marvel's depiction of vampires has changed greatly, introducing various subspecies of vampires that exist in clans differing greatly in appearance and belief. All vampires are depicted with varying degrees of superhuman strength, speed, stamina, agility, reflexes and accelerated healing. Many are capable of transforming into animals such as bats or wolves; some can transform into a mist-like substance; some of the most powerful are capable of controlling the weather to a somewhat limited degree. All vampires must ingest blood to maintain their survival and physical vitality. So long as they do so regularly, they cease to age and are immune to disease. They retain the well-known vulnerabilities common to vampires in other media, including sunlight, garlic, religious icons and weapons made of silver. Vampires can be killed by a wooden stake driven through the heart, though they return to life if the stake is removed. They are highly allergic to silver and can be killed with it; while they normally heal rapidly, injuries inflicted by silver weapons heal at a much slower rate if the injuries are not fatal. Vampires can also be killed by decapitation or by fire, with burning them to ashes and then scattering the ashes being the most effective means of ensuring their demise (the ashes are scattered so that the vampire cannot be mystically resurrected).
The cosmic entities are beings of immense power (the weakest of whom can destroy entire planets) who exist to perform duties that maintain the existence of the universe. Most care nothing for "lesser beings" such as humans, and as a consequence their acts are recurrently dangerous to mortals. When the universe faces dire threats, it is not uncommon for these beings to gather to discuss the threat and even act on it.
Most conceptual entities are simply interested in furthering their essential function or in keeping the balance with an opposing force. However, certain cosmic entities, such as Galactus, the In-Betweener, Maelstrom, or the Stranger, have demonstrated personality, motivations, or (except for Galactus) even ambitions beyond their functions, but often maintain the perspective that morality is entirely relative, or that destroying civilizations of "lesser" beings is no more evil than if those beings destroyed an anthill. Others, such as Uatu the Watcher, Eon, or the Celestials Ashema and Tiamut, are aberrations in the sense of sympathizing with, and occasionally coming to the defense of, humanity.
The "Fulcrum" is a comparatively recent addition to the hierarchy, that "all" cosmic entities allegedly serve, of a level of raw power stated to far surpass the might of the Watchers and the Celestials. Unlike most other entities, it is capable of conscience, compassion, and even a sense of humor, and has stated that it wants other cosmic beings to develop such as well. He is a possible manifestation/avatar of the One Above All.
The Phoenix Force was first personified in Jean Grey. It is composed of the psionic energy of all living beings, past, present, and future; it is an embodiment of rebirth and destructive transformation through "burning away what doesn't work", and it helped to restart the universe before the Big Bang.
The Marvel Universe is part of a Multiverse, with various universes coexisting simultaneously, "usually" without affecting each other directly. According to Reed Richards, the ultimate fate of the Multiverse is to perish in all-encompassing heat death.
The action of most of the Marvel Comics titles takes place in a continuity known as Earth-616. This continuity exists in a multiverse alongside trillions of alternative continuities. Alternative continuities in the Marvel Multiverse are generally defined in terms of their differences from Earth-616.
Numerous continuities besides Earth-616 exist; for a complete listing, see Multiverse (Marvel Comics).
In addition, multiple continuities are visited in the comic book series "What If", "What The--?!" (formerly "Not Brand Echh") and "Exiles". The concept of continuity is not the same as a "dimension" or a "universe"; for example, characters like Mephisto and Dormammu hail from parallel dimensions and Galactus from the universe that existed before the Big Bang which began the current universe, but they all nevertheless belong to the Earth-616 continuity (where all the parallel dimensions and alternate universes seem to be connected to the same main timeline). A continuity should also not be confused with an imprint; for example, while the titles of some imprints, such as Ultimate Marvel, take place in a different continuity, some or all publications in other imprints, such as Epic Comics, Marvel MAX, and Marvel UK, take place within the Earth-616 continuity.
Within and sometimes "between" continuities there exist a variety of "dimensions", sometimes called "pocket dimensions", which are typically depicted not as separate continuities but as parts of one, usually Earth-616. There are scores of such dimensions, ranging from the Earth-like to the alien. Some are magical and others scientific; some are inhabited and others are not. These include realities like the Microverse, the Darkforce Dimension, Limbo, the Mojoverse, and many more. The Astral Plane is a dimensional plane that is the source of telekinesis and various other psychic powers. It is a dimension created by the Elder Goddess Oshtur that is sometimes referred to as the "Temple of Oshtur" or the "Realm of the Mind".
Despite various contradictions, the term "dimension" is sometimes interchangeable with "universe" or "reality". Every reality of the Marvel Universe has numerous interconnected dimensions, with each dimension differing from those of other realities; for example, the Ultimate Asgard has clearly been shown to be distinct from the Asgard known to Earth-616 characters. Dimensions such as Asgard or the Dark Dimension are not "pocket dimensions", as they reside completely "outside" the boundaries of the Marvel Universe rather than within it.
One cannot normally alter the Marvel Universe's history: if a time-traveler causes an alteration to the established flow of events at some point in the past, a divergent universe simply "branches out" from the existing timeline, and the time-traveler still returns to his or her unaltered original universe. Those realities can in turn spawn realities of their own, and hundreds, probably thousands, of such realities exist. It is unknown why this happens, though a warp known as the Nexus of All Realities exists in a swamp in the Florida Everglades of Earth-616. For the most part this does not matter, as most beings are unaware that it occurs, or even that their universes were recently "born" from another. However, individuals and organizations exist that try to monitor or manipulate the various realities, including Immortus, the Captain Britain Corps, the Time Variance Authority, the Timebreakers/Exiles, and Kang the Conqueror's forces. It is possible to travel through time without creating a new alternative universe, thereby altering events in the traveler's own future, but this seems to have devastating and very far-reaching repercussions, as depicted in Marvel 1602 (where it almost destroyed the whole multiverse, including the afterlife).
Time itself also passes much differently within the confines of the Marvel Universe than it does in the real world. Despite various characters having appeared in company publications for decades, few, if any, have aged to any appreciable degree. For example, the patriotic hero Captain America was created in 1941 but stopped appearing in titles soon after the end of World War II. The character was revived more than 20 years later, explained as having been frozen in a block of ice though believed to be dead, to lead Marvel's latest team of superheroes, the Avengers. This first Avengers team featured several characters that would go on to be among the company's most famous and popular. Although these characters have been portrayed in hundreds and even thousands of adventures over the decades, they have aged little or not at all.
Naturally, this tendency is purely a matter of storytelling convenience (and of a haphazardly shifting patchwork of authors), and above all of the fact that the fictional "continuity" has been maintained and expanded far beyond what Stan Lee and others originally planned or hoped for. The passing of time was more discernible in the very early years, as with Spider-Man's graduation; characters who started out as children or teenagers, such as Kitty Pryde, Franklin Richards, Valeria Richards, Power Pack, or the New Mutants, are allowed to age at wildly shifting rates (in Franklin Richards's case even backward at times), whereas surrounding characters who depend on a certain age limit do not change at all. This recurrently creates inherent contradictions, as events are routinely described as having happened several years ago, even when this would mean that some of the involved characters would have been toddlers at the time. Different approaches also exist regarding "second-generation" descendants of heroes or villains who appear full-grown, implying that over 18 years have passed since an event (for example, Hulkling, other members of the Young Avengers, the Runaways, and the Secret Warriors), whereas other books, such as "Young Allies", use the inherent contradiction to debunk similar claims. If a past storyline that directly depicted a then-current president or similar figure is referred to in a later era, it tends to be updated accordingly, sometimes with an "in-joke" acknowledgment.
A more recent explanation was given by Galactus to the Ultimates, namely that some important events (for instance, the creation of the Fantastic Four or the Avengers) have a 'gravity' all their own and warp time around them, causing the timeline to subtly change to accommodate this.
While the Marvel Universe is presumably as large as the non-fictional universe comic book readers inhabit, for all intents and purposes the Local Group is the universe; practically all action takes place in it. The Skrull Empire is located in the Andromeda Galaxy; the Kree Empire is in the Greater Magellanic Cloud, a satellite of the Milky Way galaxy in which Earth, of course, is found; and the Shi'ar Empire is located somewhere between them in one of the smaller galaxies (perhaps the Triangulum Galaxy). Frequently, these three empires are cited as the main political powers "in the universe". Similarly, the Local Group seems to be the only affected area when the Annihilation Wave cut its bloody swath "across the universe".
Four role-playing games have been set in the Marvel Universe.
For more complete lists of inhabitants of the Marvel Universe, see List of Marvel Comics characters, List of Marvel Comics teams and organizations, and List of Marvel Comics alien races.
|
https://en.wikipedia.org/wiki?curid=20986
|
Munich massacre
The Munich massacre was an attack during the 1972 Summer Olympics in Munich, West Germany, by eight members of the Palestinian terrorist group Black September, who took nine members of the Israeli Olympic team hostage after killing two others, and later killed the hostages along with a West German police officer. Black September called the operation "Iqrit and Biram", after two Palestinian Christian villages whose inhabitants were expelled by the Israel Defense Forces (IDF) during the 1948 Arab-Israeli War. The Black September commander was Luttif Afif, who was also the negotiator. West German neo-Nazis gave the group logistical assistance.
Shortly after the hostages were taken, Afif demanded the release of 234 Palestinian prisoners held in Israeli jails, as well as Andreas Baader and Ulrike Meinhof, the West German-held founders of the Red Army Faction.
Five of the eight Black September members were killed during a failed attempt to rescue the hostages. A West German policeman was also killed in the crossfire. The three surviving perpetrators were Adnan Al-Gashey, Jamal Al-Gashey, and Mohammed Safady, who were arrested. The next month, however, following the hijacking of Lufthansa Flight 615, the West German government released them in a hostage exchange. The Israeli government launched Operation Wrath of God, which authorised Mossad to track down and kill those involved in the Munich massacre.
Two days prior to the start of the 2016 Summer Olympics, in a ceremony led by Brazilian and Israeli officials, the International Olympic Committee honored the eleven Israelis who were killed at Munich.
The hostages were taken during the second week of the 1972 Summer Olympics. The West German Olympic Organizing Committee had hoped to shed Germany's militaristic image; it was wary of the image created by the 1936 Summer Olympics, which Nazi dictator Adolf Hitler had used for his propaganda. Security personnel, known as "Olys", were kept inconspicuous and were prepared to deal mostly with ticket fraud and drunkenness. The documentary film "One Day in September" claims that security in the athletes' village was unfit for the Games and that athletes could come and go as they pleased; athletes could sneak past security and enter other countries' quarters by climbing over the fencing that surrounded the village.
The absence of armed personnel had worried Israeli delegation head Shmuel Lalkin even before his team arrived in Munich. In later interviews with journalists Serge Groussard and Aaron J. Klein, Lalkin said that he had expressed concern to the relevant authorities about his team's lodgings. The team was housed in a relatively isolated part of the Olympic Village, on the ground floor of a small building close to a gate, which Lalkin felt made his team particularly vulnerable to an outside assault. The West German authorities apparently assured Lalkin that extra security would be provided for the Israeli team, but Lalkin doubted that any additional measures were ever taken.
Olympic organizers asked West German forensic psychologist Georg Sieber to create 26 terrorism scenarios to aid them in planning security. His "Situation 21" accurately forecast armed Palestinians invading the Israeli delegation's quarters, killing and taking hostages, and demanding Israel's release of prisoners and a plane to leave West Germany. Organizers balked at preparing for Situation 21 and the other scenarios, since guarding the Games against such threats would have run counter to the goal of "Carefree Games" without heavy security.
The German weekly news magazine "Der Spiegel" wrote in 2012 that West German authorities had a tip-off from a Palestinian informant in Beirut three weeks before the massacre. The informant told West Germany that Palestinians were planning an "incident" at the Olympic Games, and the Foreign Ministry in Bonn viewed the tip-off seriously enough to pass it to the secret service in Munich and urge that "all possible security measures" be taken.
But according to "Der Spiegel", the authorities failed to act on the tip and never acknowledged it in the following 40 years. The magazine said that this was only part of a 40-year cover-up by German authorities of their mishandled response to the massacre.
On Monday evening, 4 September, the Israeli athletes enjoyed a night out, watching a performance of "Fiddler on the Roof" and dining with the star of the play, Israeli actor Shmuel Rodensky, before returning to the Olympic Village. On the return trip in the team bus, Lalkin refused his 13-year-old son, who had befriended weightlifter Yossef Romano and wrestler Eliezer Halfin, permission to spend the night in their apartment at Connollystraße 31, a decision that probably saved the boy's life.
At 4:30 am local time on 5 September, as the athletes slept, eight tracksuit-clad members of the Black September faction of the Palestine Liberation Organization, carrying duffel bags loaded with AKM assault rifles, Tokarev pistols, and grenades, scaled a chain-link fence with the assistance of unsuspecting athletes who were also sneaking into the Olympic Village. Those athletes were originally identified as Americans, but decades later were claimed to have been Canadians.
Once inside, the group used stolen keys to enter two apartments being used by the Israeli team at Connollystraße 31. Yossef Gutfreund, a wrestling referee, was awakened by a faint scratching noise at the door of Apartment 1, which housed the Israeli coaches and officials. When he investigated, he saw the door begin to open and masked men with guns on the other side. He shouted a warning to his sleeping roommates and threw his 135 kg (300 lb) weight against the door in a futile attempt to stop the intruders from forcing their way in. Gutfreund's actions gave his roommate, weightlifting coach Tuvia Sokolovsky, enough time to smash a window and escape. Wrestling coach Moshe Weinberg fought the intruders, who shot him through his cheek and then forced him to help them find more hostages.
Leading the intruders past Apartment 2, Weinberg lied by telling them that the residents of the apartment were not Israelis. Instead, Weinberg led them to Apartment 3, where the gunmen corralled six wrestlers and weightlifters as additional hostages. It is possible that Weinberg had hoped that the stronger men would have a better chance of fighting off the attackers than those in Apartment 2, but they were all surprised in their sleep.
As the athletes from Apartment 3 were marched back to the coaches' apartment, the wounded Weinberg again attacked the gunmen, allowing one of his wrestlers, Gad Tsobari, to escape via the underground parking garage. Weinberg knocked one of the intruders unconscious and slashed at another with a fruit knife, but failed to draw blood before being shot to death.
Weightlifter Yossef Romano, a veteran of the 1967 Six-Day War, also attacked and wounded one of the intruders before being shot and killed. In its publication of 1 December 2015, "The New York Times" reported that Romano was castrated after he was shot.
The gunmen were left with nine hostages. They were, in addition to Gutfreund, sharpshooting coach Kehat Shorr, track and field coach Amitzur Shapira, fencing master Andre Spitzer, weightlifting judge Yakov Springer, wrestlers Eliezer Halfin and Mark Slavin, and weightlifters David Berger and Ze'ev Friedman. Berger was an expatriate American with dual citizenship; Slavin, at 18 the youngest of the hostages, had only arrived in Israel from the Soviet Union four months before the Olympic Games began. Gutfreund, physically the largest of the hostages, was bound to a chair (Groussard describes him as being tied up like a mummy); the rest were lined up four apiece on the two beds in Springer and Shapira's room, and bound at the wrists and ankles and then to each other. Romano's bullet-riddled corpse was left at his bound comrades' feet as a warning. Several of the hostages were beaten during the stand-off, with some suffering broken bones as a result.
Of the other members of Israel's team, racewalker Shaul Ladany had been jolted awake in Apartment 2 by Gutfreund's screams. He jumped from the second-story balcony of his room and fled to the American dormitory, awakening U.S. track coach Bill Bowerman and informing him of the attack. Ladany, a survivor of the Bergen-Belsen concentration camp, was the first person to spread the alert. The other four residents of Apartment 2 (sharpshooters Henry Hershkowitz and Zelig Shtroch, fencers Dan Alon and Yehuda Weisenstein), plus chef de mission Shmuel Lalkin and the two team doctors, hid and eventually fled the besieged building. The two female members of Israel's Olympic team, sprinter and hurdler Esther Shahamorov and swimmer Shlomit Nir, were housed in a separate part of the Olympic Village. Three more members of Israel's Olympic team, two sailors and their manager, were housed in Kiel, far from Munich.
The attackers were reported to be Palestinian terrorists from refugee camps in Lebanon, Syria, and Jordan. They were identified as Luttif Afif (using the codename Issa), the leader (three of Issa's brothers were also reportedly members of Black September, two of them in Israeli jails), his deputy Yusuf Nazzal ("Tony"), and junior members Afif Ahmed Hamid ("Paolo"), Khalid Jawad ("Salah"), Ahmed Chic Thaa ("Abu Halla"), Mohammed Safady ("Badran"), Adnan Al-Gashey ("Denawi"), and Al-Gashey's cousin, Jamal Al-Gashey ("Samir").
According to author Simon Reeve, Afif (the son of a Jewish mother and Christian father), Nazzal, and one of their confederates, had all worked in various capacities in the Olympic Village, and had spent a couple of weeks scouting for their potential target. A member of the Uruguayan Olympic delegation, which shared housing with the Israelis, claimed that he found Nazzal inside 31 Connollystraße less than 24 hours before the attack, but since he was recognized as a worker in the Village, nothing was thought of it at the time. The other members of the group entered Munich via train and plane in the days before the attack. All the members of the Uruguay and Hong Kong Olympic teams, which also shared the building with the Israelis, were released unharmed during the attack.
On 5 September, Golda Meir, Prime Minister of Israel, appealed to other countries to "save our citizens and condemn the unspeakable criminal acts committed." She also stated, "if we [Israel] should give in, then no Israeli anywhere in the world shall feel that his life is safe... it's blackmail of the worst kind."
King Hussein of Jordan, the only leader of an Arab country to denounce the attack publicly, called it a "savage crime against civilization ... perpetrated by sick minds."
U.S. President Richard Nixon privately discussed a number of possible American responses, such as declaring a national day of mourning (favored by Secretary of State William P. Rogers) or having Nixon fly to the athletes' funerals. Nixon and U.S. National Security Advisor Henry Kissinger decided instead to press the United Nations to take steps against international terrorism.
The hostage-takers demanded the release of 234 Palestinians and non-Arabs jailed in Israel, along with two West German insurgents held by the West German penitentiary system, Andreas Baader and Ulrike Meinhof, who were founders of the West German Red Army Faction. The hostage-takers threw the body of Weinberg out of the front door of the residence to demonstrate their resolve. Israel's response was immediate and absolute: there would be no negotiation. Israel's official policy at the time was to refuse to negotiate with terrorists under any circumstances, as according to the Israeli government such negotiations would give an incentive to future attacks.
It has been claimed that the German authorities, under the leadership of Chancellor Willy Brandt and Minister for the Interior Hans-Dietrich Genscher, rejected Israel's offer to send an Israeli special forces unit to West Germany. The Bavarian interior minister Bruno Merk, who headed the crisis centre jointly with Genscher and Munich's police chief Manfred Schreiber, denies that such an Israeli offer ever existed.
According to journalist John K. Cooley, the hostage situation presented an extremely difficult political situation for the Germans because the hostages were Jewish. Cooley reported that the Germans offered the Palestinians an unlimited amount of money for the release of the athletes, as well as the substitution of high-ranking Germans for the hostages. However, the kidnappers refused both offers.
Munich police chief Manfred Schreiber, and Bruno Merk, interior minister of Bavaria, negotiated directly with the kidnappers, repeating the offer of an unlimited amount of money. According to Cooley, the reply was that "money means nothing to us; our lives mean nothing to us." Magdi Gohary and Mohammad Khadif, both Egyptian advisers to the Arab League, and A.D. Touny, an Egyptian member of the International Olympic Committee (IOC) also helped try to win concessions from the kidnappers, but to no avail. However, the negotiators apparently were able to convince the terrorists that their demands were being considered, as "Issa" granted a total of five deadline extensions. Elsewhere in the village, athletes carried on as normal, seemingly oblivious of the events unfolding nearby. The Games continued until mounting pressure on the IOC forced a suspension some 12 hours after the first athlete had been murdered. United States marathon runner Frank Shorter, observing the unfolding events from the balcony of his nearby lodging, was quoted as saying, "Imagine those poor guys over there. Every five minutes a psycho with a machine gun says, 'Let's kill 'em now,' and someone else says, 'No, let's wait a while.' How long could you stand that?"
At 4:30 pm, a squad of 38 West German police officers was dispatched to the Olympic Village. Dressed in Olympic sweatsuits (some also wearing Stahlhelme and carrying Walther MP submachine guns), they were members of the German border police, although according to former Munich policeman Heinz Hohensinn they were regular Munich police officers with no experience in combat or hostage rescue. Their plan was to crawl down from the ventilation shafts and kill the terrorists. The police took up positions to await the codeword "Sunshine", upon hearing which they were to begin the assault. In the meantime, camera crews filmed the actions of the officers from the German apartments and broadcast the images live on television; thus the terrorists were able to watch the police prepare to attack. Footage shows one of the kidnappers peering from the balcony door while one of the police officers stood on the roof only a short distance from him. In the end, after "Issa" threatened to kill two of the hostages, the police retreated from the premises.
At one point during the crisis, the negotiators demanded direct contact with the hostages to satisfy themselves the Israelis were still alive. Fencing coach Andre Spitzer, who spoke fluent German, and shooting coach Kehat Shorr, the senior member of the Israeli delegation, had a brief conversation with West German officials while standing at the second-floor window of the besieged building, with two kidnappers holding guns on them. When Spitzer attempted to answer a question, he was clubbed with the butt of an AK-47 in full view of international television cameras and pulled away from the window. A few minutes later, Hans-Dietrich Genscher and Walter Tröger, the mayor of the Olympic Village, were briefly allowed into the apartments to speak with the hostages. Tröger spoke of being very moved by the dignity with which the Israelis held themselves, and that they seemed resigned to their fate.
Tröger noticed that several of the hostages, especially Gutfreund, showed signs of having suffered physical abuse at the hands of the kidnappers, and that David Berger had been shot in his left shoulder. While being debriefed by the crisis team, Genscher and Tröger told them that they had seen "four or five" attackers inside the apartment. Fatefully, these numbers were accepted as definitive. While Genscher and Tröger were talking with the hostages, Kehat Shorr had told the West Germans that the Israelis would not object to being flown to an Arab country, provided that strict guarantees for their safety were made by the Germans and whichever nation they landed in. At 6 pm Munich time, the Palestinians issued a new dictate, demanding transportation to Cairo.
The authorities feigned agreement to the Cairo demand (although Egyptian Prime Minister Aziz Sedki had already told the West German authorities that the Egyptians did not wish to become involved in the hostage crisis).
Two Bell UH-1 military helicopters were to transport the terrorists and hostages to nearby Fürstenfeldbruck, a NATO airbase. Initially, the perpetrators' plan was to go to Riem, which was the international airport near Munich at the time, but the negotiators convinced them that Fürstenfeldbruck would be more practical. The authorities, who preceded the Black Septemberists and hostages in a third helicopter, had an ulterior motive: they planned an armed assault at the airport.
Realizing that the Palestinians and Israelis would have to walk 200 metres through the underground garages to reach the helicopters, the West German police saw another opportunity to ambush the perpetrators and placed sharpshooters there. But "Issa" insisted on checking the route first. He and some of the other Palestinians walked it while pointing their AK-47s at Schreiber, Tröger and Genscher. The police snipers were lying behind cars in the side streets, and as the group approached they crawled away, making noise in the process. The terrorists were thus alerted to their presence and decided to use a bus instead of walking. The bus arrived at 10:00 pm and drove the contingent to the helicopters, which "Issa" checked with a flashlight before they boarded in groups.
Five West German policemen were deployed around the airport in sniper roles: three on the roof of the control tower, one hidden behind a service truck and one behind a small signal tower at ground level. However, none of them had any special sniper training, nor any special weapons; they were equipped with the H&K G3, the ordinary battle rifle of the German Armed Forces, without optics or night-vision devices. The officers had been selected because they shot competitively on weekends. During a subsequent German investigation, an officer identified as "Sniper No. 2" stated: "I am of the opinion that I am not a sharpshooter."
The members of the crisis team (Schreiber, Genscher, Merk and Schreiber's deputy Georg Wolf) supervised and observed the attempted rescue from the airport control tower. Cooley, Reeve and Groussard all place Mossad chief Zvi Zamir and Victor Cohen, one of Zamir's senior assistants, at the scene as well, but as observers only. Zamir has stated repeatedly in interviews over the years that he was never consulted by the Germans at any time during the rescue attempt and thought that his presence actually made the Germans uncomfortable.
A Boeing 727 jet was positioned on the tarmac with sixteen West German police inside dressed as flight crew. It was agreed that "Issa" and "Tony" would inspect the plane. The plan was that the West Germans would overpower them as they boarded, giving the snipers a chance to kill the remaining terrorists at the helicopters. These were believed to number no more than two or three, according to what Genscher and Tröger had seen inside 31 Connollystraße. However, during the transfer from the bus to the helicopters, the crisis team discovered that there were actually eight of them.
At the last minute, as the helicopters were arriving at Fürstenfeldbruck, the West German police aboard the airplane voted to abandon their mission without consulting the central command. This left only the five sharpshooters to try to overpower a larger and more heavily armed group. At that point, Colonel Ulrich Wegener, Genscher's senior aide and later the founder of the elite German counter-terrorist unit GSG 9, said, "I'm sure this will blow the whole affair!"
The helicopters landed just after 10:30 pm and the four pilots and six of the kidnappers emerged. While four of the Black September members held the pilots at gunpoint (breaking an earlier promise that they would not take any Germans hostage), Issa and Tony walked over to inspect the jet, only to find it empty. Realizing they had been lured into a trap, they sprinted back toward the helicopters. As they ran past the control tower, Sniper 3 took one last opportunity to eliminate "Issa", which would have left the group leaderless. However, due to the poor lighting, he struggled to see his target and missed, hitting "Tony" in the thigh instead. Meanwhile, the West German authorities gave the order for snipers positioned nearby to open fire, which occurred around 11:00 pm.
In the ensuing chaos, Ahmed Chic Thaa and Afif Ahmed Hamid, the two kidnappers holding the helicopter pilots, were killed, while the remaining gunmen, some possibly already wounded, scrambled to safety, returning fire from behind and beneath the helicopters, out of the snipers' line of sight, and shooting out many of the airport lights. A West German policeman in the control tower, Anton Fliegerbauer, was killed by the gunfire. The helicopter pilots fled; the hostages, tied up inside the craft, could not. During the gun battle, the hostages secretly worked on loosening their bonds; teeth marks were found on some of the ropes after the gunfire had ended.
The West Germans had not arranged for armored personnel carriers ahead of time and only at this point were they called in to break the deadlock. Since the roads to the airport had not been cleared, the carriers became stuck in traffic and finally arrived around midnight. With their appearance, the kidnappers felt the shift in the status quo, and possibly panicked at the thought of the failure of their operation.
At four minutes past midnight on 6 September, one of the kidnappers (likely Issa) turned on the hostages in the eastern helicopter and fired at them with a Kalashnikov assault rifle at point-blank range. Springer, Halfin and Friedman were killed instantly; Berger, shot twice in the leg, is believed to have survived the initial onslaught, as his autopsy later found that he had died of smoke inhalation. The attacker then pulled the pin on a hand grenade and tossed it into the cockpit; the ensuing explosion destroyed the helicopter and incinerated the bound Israelis inside.
Issa then dashed across the tarmac and began firing at the police, who killed him with return fire. Another, Khalid Jawad, attempted to escape and was gunned down by one of the snipers. What happened to the remaining hostages is still a matter of dispute. A German police investigation indicated that one of their snipers and a few of the hostages may have been shot inadvertently by the police. However, a "Time" magazine reconstruction of the long-suppressed Bavarian prosecutor's report indicates that a third kidnapper (Reeve identifies Adnan Al-Gashey) stood at the door of the western helicopter and raked the remaining five hostages with machine gun fire; Gutfreund, Shorr, Slavin, Spitzer and Shapira were shot an average of four times each.
Of the four hostages in the eastern helicopter, only Ze'ev Friedman's body was relatively intact; he had been blown clear of the helicopter by the explosion. In some cases, the exact cause of death for the hostages in the eastern helicopter was difficult to establish because the rest of the corpses were burned almost beyond recognition in the explosion and subsequent fire. Three of the remaining men lay on the ground, one of them feigning death, and were captured by police. Jamal Al-Gashey had been shot through his right wrist, and Mohammed Safady had sustained a flesh wound to his leg. Adnan Al-Gashey had escaped injury completely. Tony escaped the scene, but was tracked down with police dogs 40 minutes later in an airbase parking lot. Cornered and bombarded with tear gas, he was shot dead after a brief gunfight. By around 1:30 am on 6 September, the battle was over.
Initial news reports, published all over the world, indicated that all the hostages were alive and that all the attackers had been killed. Only later did a representative for the International Olympic Committee (IOC) suggest that "initial reports were overly optimistic." Jim McKay, who was covering the Olympics that year for the American Broadcasting Company (ABC), had taken on the job of reporting the events as Roone Arledge fed them into his earpiece. At 3:24 am, McKay received and delivered the official confirmation that none of the hostages had survived.
Several sources erroneously listed Ladany, who had escaped, as having been killed, a mistake he later recalled.
Author Simon Reeve, among others, writes that the shootout with the well-trained Black September members showed an egregious lack of preparation on the part of the German authorities, who were simply not equipped to deal with this sort of situation. This costly lesson led directly to the founding, less than two months later, of the police counter-terrorism branch GSG 9. The German authorities made a number of mistakes. First, because of restrictions in the post-war West German constitution, the army could not participate in the attempted rescue, as the German armed forces are not allowed to operate inside Germany during peacetime. The responsibility lay entirely in the hands of the Munich police and the Bavarian authorities.
It was known a half-hour before the hostages and kidnappers even arrived at Fürstenfeldbruck that the number of kidnappers was larger than first believed. Despite this new information, Schreiber decided to continue with the rescue operation as originally planned; in any case, the information could not reach the snipers, since they had no radios.
It is a basic tenet of sniping operations that enough snipers (at least two for each "known" target, in this case a minimum of ten) be deployed to neutralize as many of the attackers as possible with the first volley of shots. The 2006 profile of the massacre on the National Geographic Channel series "Seconds From Disaster" stated that the helicopters were supposed to land sideways-on to the west of the control tower, a maneuver which would have allowed the snipers clear shots into them as the kidnappers threw open the helicopter doors. Instead, the helicopters landed facing the control tower at the centre of the airstrip. This not only gave the kidnappers a place to hide once the gunfight began, but put Snipers 1 and 2 in the line of fire of the other three snipers on the control tower. The snipers were denied valuable shooting opportunities as a result of the positioning of the helicopters, stacking the odds against what were effectively three snipers facing eight heavily armed gunmen.
According to the same program, the crisis committee delegated to make decisions on how to deal with the incident consisted of Bruno Merk (the Bavarian interior minister), Hans-Dietrich Genscher (the West German interior minister) and Manfred Schreiber (Munich's Chief of Police); in other words, two politicians and one tactician. The program mentioned that a year before the Games, Schreiber had participated in another hostage crisis (a failed bank robbery) in which he ordered a marksman to shoot one of the perpetrators, managing only to wound the robber. As a result, the robbers shot an innocent woman dead. Schreiber was consequently charged with involuntary manslaughter. An investigation ultimately cleared him of any wrongdoing, but the program suggested that the prior incident affected his judgment in the subsequent Olympic hostage crisis.
As mentioned earlier, the five German snipers at Fürstenfeldbruck did not have radio contact with one another (nor with the German authorities conducting the rescue operation) and therefore were unable to coordinate their fire. The only contact the snipers had with the operational leadership was with Georg Wolf, who was lying next to the three snipers on the control tower giving orders directly to them. The two snipers at ground level had been given vague instructions to shoot when the other snipers began shooting, and were basically left to fend for themselves.
In addition, the snipers did not have the proper equipment for this hostage rescue operation. The Heckler & Koch G3 battle rifles used were considered by several experts to be inadequate for the distance at which the snipers were trying to shoot. The G3, the standard service rifle of the Bundeswehr at that time, had a relatively short barrel; at the distances the snipers were required to shoot, a longer barrel would have ensured far greater accuracy. None of the rifles were equipped with telescopic or infrared sights. Additionally, none of the snipers were equipped with a steel helmet or bullet-proof vest. No armored vehicles were at the scene at Fürstenfeldbruck; they were called in only after the gunfight was well underway.
There were also numerous tactical errors. As mentioned earlier, "Sniper 2", who was stationed behind the signal tower, wound up directly in the line of fire of his fellow snipers on the control tower, without any protective gear and without any other police being aware of his location. Because of this, "Sniper 2" did not fire a single shot until late in the gunfight, when hostage-taker Khalid Jawad attempted to escape on foot and ran directly at the exposed sniper. "Sniper 2" killed the fleeing perpetrator but was in turn badly wounded by a fellow police officer who was unaware that he was shooting at one of his own men. One of the helicopter pilots, Gunnar Ebel, was lying near "Sniper 2" and was also wounded by friendly fire. Both Ebel and the sniper recovered from their injuries.
Many of the errors made by the Germans during the rescue attempt were ultimately detailed by Heinz Hohensinn, who had participated in Operation Sunshine earlier that day. He stated in "One Day in September" that he had been selected to pose as a crew member. He and his fellow policemen understood that it was a suicide mission, so the group unanimously voted to flee the plane. None of them were reprimanded for that desertion.
The bodies of the five Palestinian attackers killed during the Fürstenfeldbruck gun battle (Afif, Nazzal, Chic Thaa, Hamid and Jawad) were delivered to Libya, where they received heroes' funerals and were buried with full military honours. On 8 September, Israeli planes bombed ten PLO bases in Syria and Lebanon in response to the massacre, killing scores of militants and civilians.
The three surviving Black September gunmen had been arrested after the Fürstenfeldbruck gunfight and were being held in a Munich prison awaiting trial. On 29 October, Lufthansa Flight 615 was hijacked, and the hijackers threatened to blow it up unless the Munich attackers were released. Safady and the Al-Gasheys were immediately released by West Germany, receiving a tumultuous welcome when they touched down in Libya and (as seen in "One Day in September") giving their own firsthand account of their operation at a press conference broadcast worldwide.
Further international investigations into the Lufthansa Flight 615 incident have produced theories of a secret agreement between the German government and Black September: the release of the surviving terrorists in exchange for assurances of no further attacks on Germany.
In the wake of the hostage-taking, competition was suspended for 34 hours, for the first time in modern Olympic history, after public criticism of the Olympic Committee's decision to continue the games. On 6 September, a memorial service attended by 80,000 spectators and 3,000 athletes was held in the Olympic Stadium. IOC President Avery Brundage made little reference to the murdered athletes during a speech praising the strength of the Olympic movement and equating the attack on the Israeli sportsmen with the recent arguments about encroaching professionalism and disallowing Rhodesia's participation in the Games, which outraged many listeners. The victims' families were represented by Andre Spitzer's widow Ankie, Moshe Weinberg's mother, and a cousin of Weinberg, Carmel Eliash. During the memorial service, Eliash collapsed and died of a heart attack.
Many of the 80,000 people who filled the Olympic Stadium for West Germany's football match with Hungary carried noisemakers and waved flags, but when several spectators unfurled a banner reading "17 dead, already forgotten?" security officers removed the sign and expelled those responsible from the grounds. During the memorial service, the Olympic Flag was flown at half-staff, along with the flags of most of the other competing nations at the request of Willy Brandt. Ten Arab nations objected to their flags flying at half-staff and the mandate was rescinded.
Willi Daume, president of the Munich organizing committee, initially sought to cancel the remainder of the Games, but in the afternoon Brundage and others who wished to continue prevailed, stating that they could not let the incident halt the Games. Brundage stated, "The Games must go on ... and we must continue our efforts to keep them clean, pure and honest." The decision was endorsed by the Israeli government and Israeli Olympic team chef de mission Shmuel Lalkin.
On 6 September, after the memorial service, the remaining members of the Israeli team withdrew from the Games and left Munich. All Jewish sportsmen were placed under guard. Mark Spitz, the American swimming star who had already completed his competitions, left Munich during the hostage crisis (it was feared that as a prominent Jew, Spitz might be a kidnapping target). The Egyptian team left the Games on 7 September, stating they feared reprisals. The Philippine and Algerian teams also left the Games, as did some members of the Dutch and Norwegian teams. American marathon runner Kenny Moore, who wrote about the incident for "Sports Illustrated", quoted Dutch distance runner Jos Hermens as saying "It's quite simple. We were invited to a party, and if someone comes to the party and shoots people, how can you stay?" Many athletes, dazed by the tragedy, similarly felt that their desire to compete had been destroyed, although they stayed at the Games.
Four years later at the 1976 Summer Olympics in Montreal, the Israeli team commemorated the massacre: when they entered the stadium at the Opening Ceremony, their national flag was adorned with a black ribbon.
The families of some victims have asked the IOC to establish a permanent memorial to the athletes. The IOC has declined, saying that to introduce a specific reference to the victims could "alienate other members of the Olympic community," according to the BBC. Alex Gilady, an Israeli IOC official, told the BBC: "We must consider what this could do to other members of the delegations that are hostile to Israel."
The IOC rejected an international campaign in support of a minute of silence at the Opening Ceremony of the 2012 London Olympics in honour of the Israeli victims on the 40th anniversary of the massacre. Jacques Rogge, the IOC President, said it would be "inappropriate", although the opening ceremony did include a memorial for the victims of the 7 July 2005 London bombings. Speaking of the decision, Olympian Shaul Ladany, who had survived the attack, commented: "I do not understand. I do not understand, and I do not accept it."
In 2014 the International Olympic Committee agreed to contribute $250,000 towards a memorial to the murdered Israeli athletes. After 44 years, the IOC commemorated the victims of the Munich massacre for the first time in the Rio 2016 Olympic Village on 4 August 2016.
There is a memorial outside the Olympic stadium in Munich in the form of a stone tablet at the bridge linking the stadium to the former Olympic village. There is a memorial tablet to the slain Israelis outside the front door of their former lodging at 31 Connollystraße. On 15 October 1999 (almost a year before the Sydney 2000 Games), a memorial plaque was unveiled in one of the large light towers (Tower 14) outside the Sydney Olympic Stadium.
Golda Meir and the Israeli Defense Committee secretly authorized the Mossad to track down and kill those allegedly responsible for the Munich massacre. The accusation that this was motivated by a desire for vengeance was disputed by Zvi Zamir, who described the mission as "putting an end to the type of terror that was perpetrated" in Europe. To this end Mossad set up a number of special teams to locate and kill these fedayeen, aided by the agency's stations in Europe.
In a February 2006 interview, former Mossad chief Zvi Zamir answered direct questions about the operation.
The Israeli mission later became known as "Operation Wrath of God" or "Mivtza Za'am Ha'El". Reeve quotes General Aharon Yariv—who, he writes, was the general overseer of the operation—as stating that after Munich the Israeli government felt it had no alternative but to exact justice.
Benny Morris writes that a target list was created using information from "turned" PLO personnel and friendly European intelligence services. Once completed, a wave of assassinations of suspected Black September operatives began across Europe. On 9 April 1973, Israel launched Operation "Spring of Youth", a joint Mossad–IDF operation in Beirut. The targets were Mohammad Yusuf al-Najjar (Abu Yusuf), head of Fatah's intelligence arm, which ran Black September, according to Morris; Kamal Adwan, who headed the PLO's Western Sector, which controlled PLO action inside Israel; and Kamal Nassir, the PLO spokesman. A group of Sayeret commandos were taken in nine missile boats and a small fleet of patrol boats to a deserted Lebanese beach, before driving in two cars to downtown Beirut, where they killed Najjar, Adwan and Nassir. Two further detachments of commandos blew up the PFLP's headquarters in Beirut and a Fatah explosives plant. The leader of the commando team that conducted the operations was Ehud Barak.
On 21 July 1973, in the Lillehammer affair, a team of Mossad agents mistakenly killed Ahmed Bouchiki, a Moroccan man unrelated to the Munich attack, in Lillehammer, Norway, after an informant wrongly identified Bouchiki as Ali Hassan Salameh, the head of Force 17 and a Black September operative. Five Mossad agents, including two women, were captured by the Norwegian authorities, while others managed to slip away. The five were convicted of the killing and imprisoned, but were released and returned to Israel in 1975. The Mossad later found Ali Hassan Salameh in Beirut and killed him on 22 January 1979 with a remote-controlled car bomb; the attack also killed four passersby and injured 18 others. According to CIA officer Duane "Dewey" Claridge, chief of operations of the CIA Near East Division from 1975 to 1978, in mid-1976 Salameh had offered the Americans assistance and protection, with Arafat's blessing, during the American embassy pull-out from Beirut amid the deepening chaos of the Lebanese Civil War, and there was a general feeling that Americans could be trusted. This cooperation ended abruptly with the assassination of Salameh, for which Americans, as Israel's principal benefactors, were widely blamed.
Simon Reeve writes that the Israeli operations continued for more than twenty years. He details the assassination in Paris in 1992 of Atef Bseiso, the PLO's head of intelligence, and says that an Israeli general confirmed there was a link back to Munich. Reeve also writes that while Israeli officials have stated "Operation Wrath of God" was intended to exact vengeance for the families of the athletes killed in Munich, "few relatives wanted such a violent reckoning with the Palestinians." Reeve states the families were instead desperate to know the truth of the events surrounding the Munich massacre. Reeve outlines what he sees as a lengthy cover-up by German authorities to hide the truth. After a lengthy court fight, in 2004 the families of the Munich victims reached a settlement of €3 million with the German government.
A 2012 front-page story in the German news magazine "Der Spiegel" reported that much of the information pertaining to the mishandling of the massacre had been covered up by the German authorities. For twenty years, Germany refused to release any information about the attack and did not accept responsibility for the results. The magazine reported that the government had been hiding 3,808 files, which contained tens of thousands of documents. "Der Spiegel" said it obtained secret reports by authorities, embassy cables, and minutes of cabinet meetings that demonstrate the lack of professionalism of the German officials in handling the massacre. The magazine also wrote that the German authorities had been told that Palestinians were planning an "incident" at the Olympics three weeks before the massacre, but failed to take the necessary security measures, and that these facts are missing from the official documentation of the German government.
In August 2012, "Der Spiegel" reported that following the massacre, Germany began secret meetings with Black September, at the behest of the West German government, for fear that Black September would carry out further terrorist attacks in Germany. The government proposed a clandestine meeting between German Foreign Minister Walter Scheel and a member of Black September to create a "new basis of trust": in exchange for political recognition of the Palestine Liberation Organization, the PLO would stop terrorist attacks on German soil. When French police arrested Abu Daoud, one of the chief organizers of the Munich massacre, and inquired about extraditing him to Germany, Bavaria's justice secretary recommended that Germany take no action, leading the French to release Abu Daoud and the Assad regime to shelter him until he died at a Damascus hospital in 2010.
Two of the three surviving gunmen, Mohammed Safady and Adnan Al-Gashey, were allegedly killed by Mossad as part of "Operation Wrath of God". Al-Gashey was allegedly located after making contact with a cousin in a Gulf State, and Safady was found by remaining in touch with family in Lebanon. This account was challenged in a book by Aaron J. Klein, who claims that Al-Gashey died of heart failure in the 1970s, and that Safady was killed by Christian Phalangists in Lebanon in the early 1980s. However, in July 2005, PLO veteran Tawfiq Tirawi told Klein that Safady, whom Tirawi claimed as a close friend, was "as alive as you are."
The third surviving gunman, Jamal Al-Gashey, was known to be alive as of 1999, hiding in North Africa or in Syria, claiming to still fear retribution from Israel. He is the only one of the surviving terrorists to consent to interviews since 1972, having granted an interview in 1992 to a Palestinian newspaper, and having briefly emerged from hiding in 1999 to participate in an interview for the film "One Day in September", during which he was disguised and his face shown only in blurry shadow.
Of those believed to have planned the massacre, only Abu Daoud, the man who claims that the attack was his idea, is known to have died of natural causes. Historical documents released to "Der Spiegel" by the German secret service show that Dortmund police had been aware of collaboration between Abu Daoud and the neo-Nazi E. W. Pless (officially named Willi Voss since 1979) seven weeks before the attack. In January 1977, Abu Daoud was intercepted by French police in Paris while traveling from Beirut under an assumed name. Under protest from the PLO, Iraq, and Libya, who claimed that because Abu Daoud was traveling to a PLO comrade's funeral he should receive diplomatic immunity, the French government refused a West German extradition request on the grounds that forms had not been filled in properly, and put him on a plane to Algeria before Germany could submit another request. On 27 July 1981, he was shot five times from a distance of around two meters in the coffee shop of the Victoria hotel (now a Sofitel) in Warsaw, but survived the attack, chasing his would-be assassin down to the coffee shop's front entrance before collapsing.
Abu Daoud was allowed safe passage through Israel in 1996 so he could attend a PLO meeting convened in the Gaza Strip for the purpose of rescinding an article in its charter that called for Israel's eradication. In his autobiography, "From Jerusalem to Munich", first published in France in 1999, and later in a written interview with "Sports Illustrated", Abu Daoud wrote that funds for Munich were provided by Mahmoud Abbas, Chairman of the PLO since 11 November 2004 and President of the Palestinian National Authority since 15 January 2005.
Abu Daoud believed that if the Israelis had known that Mahmoud Abbas was the financier of the operation, the 1993 Oslo Accords, at whose signing Abbas appeared in photo opportunities at the White House, would not have been achieved.
Abu Daoud, who lived with his wife on a pension provided by the Palestinian Authority, said that "the Munich operation had the endorsement of Arafat," although Arafat was not involved in conceiving or implementing the attack. In his autobiography, Abu Daoud writes that Arafat saw the team off on the mission with the words "God protect you."
Ankie Spitzer, widow of fencing coach and Munich victim Andre, declined several offers to meet with Abu Daoud, saying that the only place she wants to meet him is in a courtroom. According to Spitzer, "He [Abu Daoud] didn't pay the price for what he did." In 2006, during the release of Steven Spielberg's film, "Munich", "Der Spiegel" interviewed Abu Daoud regarding the Munich massacre. He was quoted as saying: "I regret nothing. You can only dream that I would apologize."
Daoud died of kidney failure aged 73 on 3 July 2010 in Damascus, Syria.
https://en.wikipedia.org/wiki?curid=20989
Michael Nesmith
Robert Michael Nesmith (born December 30, 1942) is an American musician, songwriter, actor, producer, novelist, businessman, and philanthropist, best known as a member of the pop rock band the Monkees and co-star of the TV series "The Monkees" (1966–1968). Nesmith's songwriting credits include "Different Drum" (sung by Linda Ronstadt with the Stone Poneys).
After the break-up of the Monkees, Nesmith continued his successful songwriting and performing career, first with the seminal country rock group the First National Band, with whom he had a top-40 hit, "Joanne", and then as a solo artist. He is a noted player of the 12-string guitar, performing on custom-built 12-string electric guitars with the Monkees (built by Gretsch) and various 12-string acoustic models during his post-Monkees career.
He is also an executive producer of the cult film "Repo Man" (1984). In 1981, Nesmith won the first Grammy Award given for Video of the Year for his hour-long television show, "Elephant Parts".
Nesmith was born in Houston, Texas, in 1942. He is an only child; his parents Warren and Bette Nesmith (née McMurray) divorced when he was four. His mother married Robert Graham in 1962, and they remained married until 1975. Nesmith and his mother moved to Dallas to be closer to her family. She took temporary jobs ranging from clerical work to graphic design, eventually attaining the position of executive secretary at Texas Bank and Trust. When Nesmith was 13, his mother invented the typewriter correction fluid known commercially as Liquid Paper. Over the next 25 years, she built the Liquid Paper Corporation into a multimillion-dollar international company which she sold to Gillette in 1979 for $48 million. She died a few months later at age 56.
Nesmith participated in choral and drama activities at Thomas Jefferson High School in Dallas, but he enlisted in the Air Force in 1960 without graduating. He completed basic training at Lackland Air Force Base in San Antonio, was trained as an aircraft mechanic at Sheppard Air Force Base in Wichita Falls, Texas, and then was permanently stationed at the Clinton-Sherman Air Force Base near Burns Flat, Oklahoma. He obtained a GED certificate and was honorably discharged in 1962. He enrolled in San Antonio College where he met John Kuehne and began a musical collaboration. They won the first San Antonio College talent award, performing a mixture of standard folk songs and a few of Nesmith's original songs. Nesmith began to write more songs and poetry, then he moved to Los Angeles and began singing in folk clubs around the city. He served as the "Hootmaster" for the Monday night hootenanny at The Troubadour, a West Hollywood nightclub that featured new artists.
Randy Sparks from the New Christy Minstrels offered Nesmith a publishing deal for his songs, and Barry Freedman told him about upcoming auditions for a new TV series called "The Monkees". In October 1965, Nesmith landed the role as the wool hat-wearing guitar player "Mike" in the show, which required real-life musical talent for writing, instrument playing, singing, and performing in live concerts as part of The Monkees band. "The Monkees" television series aired from 1966 until 1968, and has developed a cult following over the years.
After a tour of duty in the Air Force, Nesmith was given a guitar as a Christmas present from his mother and stepfather. Learning as he went, he played solo and in a series of working bands, performing folk, country, and occasionally rock and roll. His verse poems became the basis for song lyrics, and after moving to Los Angeles with his wife Phyllis and friend John London, he signed a publishing deal for his songs. Nesmith's "Mary, Mary" was recorded by the Paul Butterfield Blues Band, while "Different Drum" and "Some of Shelly's Blues" were recorded by Linda Ronstadt and the Stone Poneys. "Pretty Little Princess", written in 1965, was recorded by Frankie Laine and released as a single in 1968 on ABC Records. Later, "Some of Shelly's Blues" and "Propinquity (I've Just Begun to Care)" were made popular by the Nitty Gritty Dirt Band on their 1970 album "Uncle Charlie & His Dog Teddy".
Nesmith began his recording career in 1963 with a single on the Highness label. He followed this in 1965 with a one-off single on Edan Records and two more singles; one, titled "The New Recruit", was released under the name "Michael Blessing" on Colpix Records, coincidentally also the label of Davy Jones, though the two did not meet until the Monkees formed.
From 1965 to early 1970, Nesmith was a member of the television pop-rock band the Monkees, created for the television situation comedy of the same name. Nesmith won his role largely by appearing nonchalant when he auditioned. He rode his motorcycle to the audition, and wore a wool hat to keep his hair out of his eyes; producers Bob Rafelson and Bert Schneider remembered the "wool hat" guy, and called Nesmith back.
Once he was cast, Screen Gems bought his songs so they could be used in the show. Many of the songs Nesmith wrote for the Monkees, such as "The Girl I Knew Somewhere", "Mary, Mary", and "Listen to the Band", became minor hits. One song he wrote, "You Just May Be the One", is in mixed meter, interspersing 5/4 bars into an otherwise 4/4 structure.
As part of a promotional deal, Gretsch guitar company built a one-off, natural-finish, 12-string electric guitar for Nesmith when he was performing with the Monkees. The custom-made guitar was frequently cited at that time as being worth $5,000 (the equivalent of $36,500 in 2018), which was undoubtedly inflated for publicity purposes. He earlier played a customized Gretsch 12-string, which had originally been a six-string model. Nesmith used this guitar for his appearances on the television series, as well as the Monkees' live appearances in 1966 and 1967. Beginning in 1968, Nesmith used a white six-string Gibson SG Custom for his live appearances with the Monkees. He used that guitar in their motion picture "Head" for the live version of "Circle Sky", and also for the final original Monkees tour in 1969. In a post on his Facebook page in 2011, Nesmith reported that both guitars were stolen in the early 1970s.
Like the other Monkees, Nesmith came to be frustrated by the band's manufactured image, and he was the most publicly vocal about it.
The Monkees succeeded in ousting supervisor Don Kirshner and took control of their records and song choices, but they worked as a four-man group on only one album, 1967's "Headquarters". Nesmith withheld many of his songs from the final Monkees albums, opting to release them on his post-Monkees solo records. During the band's first independent press conference, Nesmith called "More of The Monkees" "probably the worst record in the history of the world". The band never regained its credibility after fans learned they had not played the instruments on their earlier records. Sales still continued to be profitable until the disastrous release of the movie "Head".
Nesmith's last contractual Monkees commitment was a commercial for Kool-Aid and Nerf balls in April 1970 (fittingly, the spot ends with Nesmith frowning and saying, "Enerf's enerf!"). As the band's sales declined, Nesmith asked to be released from his contract, despite it costing him: "I had three years left ... at $150,000 [equivalent to $980,940 in 2018] a year." He remained in a financial bind until 1980, when he received his inheritance from the Liquid Paper Company. In a 1980 interview with "Playboy", he said of that time: "I had to start telling little tales to the tax man while they were putting tags on the furniture."
Nesmith did not participate in the Monkees' 20th-anniversary reunion. However, he did appear during an encore with the other three members at the Greek Theatre on September 7, 1986. In a 1987 interview for Nick Rocks, Nesmith stated, "When Peter called up and said 'we're going to go out, do you want to go?' I was booked. But, if you get to L.A ... I'll play."
Nesmith next joined his fellow Monkees band members for the 1986 "Monkees Christmas Melody" video for MTV, appearing throughout dressed as Santa Claus until the finale, when he revealed his identity, and his participation, to all.
Nesmith appeared again in 1989 with Dolenz, Tork and Jones. Prior to the official kickoff of The Monkees '89 tour (on July 1 in Winnipeg, Manitoba, Canada), all four Monkees gathered in Los Angeles, California, making two live radio appearances (KLOS-FM's The Mark & Brian Show on June 28 and KIIS Radio on June 30) to promote their reunion concert at the Universal Amphitheatre, where they appeared together as a foursome live on stage on July 9. The following day (July 10), all four band members were in attendance as the Monkees received a star on the Hollywood Walk of Fame.
In 1995, Nesmith was again reunited with the Monkees to record a new studio album (the first to feature all four members since "Head"), titled "Justus", released in 1996. He also wrote and directed a Monkees television special, "Hey, Hey, It's the Monkees". To support the reunion, Nesmith, Jones, Dolenz and Tork briefly toured the UK in 1997. The UK tour was the last appearance of all four Monkees performing together.
In 2012, 2013 and 2014, after Jones's death, Nesmith reunited with Dolenz and Tork to perform concerts throughout the United States. Backed with a seven-piece band that included Nesmith's son, Christian, the trio performed 27 songs from The Monkees discography ("Daydream Believer" was sung by the audience). When asked why he had decided to return to the Monkees, Nesmith stated, "I never really left. It is a part of my youth that is always active in my thoughts and part of my overall work as an artist. It stays in a special place."
In 2016, Nesmith contributed vocally and instrumentally to the Monkees' 50th anniversary album "Good Times!". He additionally contributed a song, "I Know What I Know", and was reportedly "thrilled" at the outcome of the album. Despite not touring with Dolenz and Tork for the majority of the Monkees' 50th anniversary reunion in 2016, Nesmith did twice fill in for the ailing Peter Tork, as well as appearing at the final show of the tour, which featured all three surviving band members (the last show to do so). At the end of that show, Nesmith announced that he was retiring from touring with the Monkees.
In 2018, Nesmith and Dolenz toured together as a duo for the first time under the banner "The Monkees Present: The Mike and Micky Show". The tour was cut short with four dates remaining due to Nesmith's health issues; he was flown home and subsequently underwent quadruple bypass surgery. He contributed two songs to the Monkees' 13th studio album, "Christmas Party" (the group's first-ever Christmas album), released on October 12, 2018.
In 2019, Nesmith and Dolenz reunited again to make up the cancelled dates of the tour, adding several more dates, including a forthcoming tour of Australia and New Zealand.
In 1969, Nesmith formed the group First National Band with Kuehne, John Ware, and Red Rhodes. Nesmith wrote most of the songs for the band, including the single "Joanne", which received some airplay and was a moderate chart hit for seven weeks during 1970, rising to number 21 on the "Billboard" Top 40. The First National Band has been credited with being among the pioneers of country-rock music.
As he prepared for his exit from The Monkees in 1970, Nesmith was approached by John Ware of The Corvettes, a band that featured Nesmith's friend John London, who played on some of the earliest pre-Monkees Nesmith 45s, as well as numerous Monkees sessions, and had 45s produced by Nesmith for the Dot label in 1969. Ware wanted Nesmith to put together a band. Nesmith said he would be interested only if noted pedal steel player Orville "Red" Rhodes was part of the project; Nesmith's musical partnership with Rhodes continued until Rhodes's death in 1995. The new band was christened Michael Nesmith and the First National Band and went on to record three albums for RCA Records in 1970.
Nesmith has been considered one of the pioneers of country rock, and he had moderate commercial success with the First National Band. Their second single, "Joanne," hit number 21 on the "Billboard" chart, number 17 on Cashbox, and number four in Canada, with the follow-up "Silver Moon" making number 42 on "Billboard", number 28 on Cashbox, and number 13 in Canada. Two more singles charted ("Nevada Fighter" made number 70 on "Billboard", number 73 on Cashbox, and number 67 in Canada, and "Propinquity" reached number 95 on Cashbox), and the first two LPs charted in the lower regions of the "Billboard" album chart. No clear answer has ever been given for the band's breakup.
Nesmith followed up with The Second National Band, a band that, besides Nesmith, consisted of Michael Cohen (keyboards and Moog), Johnny Meeks (of The Strangers) (bass), jazzer Jack Ranelli (drums), and Orville Rhodes (pedal steel), as well as an appearance by singer, musician, and songwriter José Feliciano on congas. The album, "Tantamount to Treason Vol. 1", was a commercial and critical disaster. Nesmith then recorded "And the Hits Just Keep on Comin'", featuring only him on guitar and Red Rhodes on pedal steel.
Nesmith became more heavily involved in producing, working on Iain Matthews's album "Valley Hi" and Bert Jansch's "L.A. Turnaround". Nesmith was given a label of his own, Countryside, through Elektra Records, as Elektra's Jac Holzman was a fan of Nesmith's. It featured a number of artists produced by Nesmith, including Garland Frady and Red Rhodes. The staff band at Countryside also helped Nesmith on his next, and last, RCA album, "Pretty Much Your Standard Ranch Stash". Countryside folded when David Geffen replaced Holzman, as Countryside was unnecessary in Geffen's eyes.
In the mid-1970s, Nesmith briefly collaborated as a songwriter with Linda Hargrove, resulting in the tune "I've Never Loved Anyone More", a hit for Lynn Anderson and recorded by many others, as well as the songs "Winonah" and "If You Will Walk With Me," both of which were recorded by Hargrove. Of these songs, only "Winonah" was recorded by Nesmith himself. During this same period, Nesmith started his multimedia company Pacific Arts, which initially put out audio records, eight-track tapes, and cassettes, followed in 1981 with "video records." Nesmith recorded a number of LPs for his label, and had a moderate worldwide hit in 1977 with his song "Rio", the single taken from the album "From a Radio Engine to the Photon Wing". In 1983, Nesmith produced the music video for the Lionel Richie single "All Night Long". In 1987, he produced the music video for the Michael Jackson single "The Way You Make Me Feel".
During this time, Nesmith created a video clip for "Rio", which helped spur his creation of a television program called "PopClips" for the Nickelodeon cable network. In 1980, "PopClips" was sold to the Time Warner/Amex consortium, which developed it into the MTV network.
Nesmith won the first Grammy Award given for a long-form music video in 1982, for his hour-long "Elephant Parts", and also had a short-lived series on NBC inspired by the video, called "Michael Nesmith in Television Parts". "Television Parts" included many other artists who were unknown at the time but went on to become major stars in their own right: Jay Leno, Jerry Seinfeld, Garry Shandling, Whoopi Goldberg, and Arsenio Hall all became well-known after their appearances on Nesmith's show. The concept of the show was to have comics render their stand-up routines into short comedy films much like the ones in "Elephant Parts". Nesmith assembled writers Jack Handey, William Martin, John Levenstein, and Michael Kaplan, along with directors William Dear (who had directed "Elephant Parts") and Alan Myerson, as well as producer Ward Sylvester, to create the show. The half-hour show ran for eight episodes in the summer of 1985 on NBC Thursday nights in prime time.
Nesmith formed the Pacific Arts Corporation, Inc. in 1974 to manage and develop media projects. Pacific Arts Video became a pioneer in the home video market, producing and distributing a wide variety of videotaped programs, although the company eventually ceased operations after an acrimonious contract dispute with PBS over home video licensing rights and payments for several series, including Ken Burns' "The Civil War". The dispute escalated into a lawsuit that went to jury trial in federal court in Los Angeles. On February 3, 1999, a jury awarded Nesmith and his company Pacific Arts $48.875 million in compensatory and punitive damages, prompting his widely quoted comment, "It's like finding your grandmother stealing your stereo. You're happy to get your stereo back, but it's sad to find out your grandmother is a thief." PBS appealed the ruling, but the appeal never reached court and a settlement was reached, with the amount paid to Pacific Arts and Nesmith kept confidential.
Nesmith's current Pacific Arts project is Videoranch 3D, a virtual environment on the internet that hosts live performances at various virtual venues inside the ranch. He performed live inside Videoranch 3D on May 25, 2009.
Nesmith was the executive producer for the films "Repo Man", "Tapeheads", and "Timerider", as well as his own solo recording and film projects.
In 1998, Nesmith published his first novel, "The Long Sandy Hair of Neftoon Zamora". It was developed originally as an online project and was later published as a hardcover book by St Martin's Press. Nesmith's second novel, "The America Gene", was released in July 2009 as an online download from Videoranch.com.
In the early 1980s, Nesmith teamed with satirist P. J. O'Rourke to ride his vehicle "Timerider" in the annual Baja 1000 off-road race. This is chronicled in O'Rourke's 2009 book "Driving Like Crazy".
During the 1990s, Nesmith, as trustee and president of the Gihon Foundation, hosted the Council on Ideas, a gathering of intellectuals from different fields who were asked to identify the most important issues of their day and publish the result. The foundation ceased the program in 2000 and started a new program for the performing arts. Nesmith also spent a decade as a board of trustees member, nominating member and vice-chair of the American Film Institute.
In 1992, Nesmith undertook a concert tour of North America to promote the CD release of his RCA solo albums (although he included the song "Rio" from the album "From a Radio Engine to the Photon Wing"). The concert tour ended at the Britt Festival in Oregon. A video and CD, both entitled "Live at the Britt Festival", were released capturing the 1992 concert.
Nesmith continues to record and release his own music. His most recent album, "Rays", was released in 2006. In 2011, he returned to producing, working with blues singer and guitarist Carolyn Wonderland. Nesmith produced Wonderland's version of Robert Johnson's "I Believe I'll Dust My Broom" on her album "Peace Meal". Wonderland married writer-comedian A. Whitney Brown on March 4, 2011, in a ceremony officiated by Nesmith.
In 2012, Nesmith briefly toured Europe prior to rejoining the Monkees for their tours of the United States. In between the Monkees concerts, Nesmith also launched solo tours of the U.S. Unlike his 1992 U.S. tour, which predominantly featured music from his RCA recordings, Nesmith stated that his 2013 tour would feature songs he considers "thematic, chronological and most often requested by fans". Chris Scruggs, grandson of Earl Scruggs, replaced the late Red Rhodes on steel guitar. The tour was recorded for a forthcoming live album, "Movies of the Mind".
In 2014, he guest-starred in season four, episode 9 of the IFC comedy series "Portlandia" in the fictitious role of the father of the mayor of Portland, Oregon.
In 2017, he released a memoir and companion "soundtrack" album titled "Infinite Tuesday: An Autobiographical Riff".
In 2018, he announced that he would be doing a five-date tour of California with a revamped version of The First National Band, including a date at The Troubadour, where he had performed before The Monkees. On February 20, a tour was announced as "The Monkees Present: The Mike and Micky Show", their first tour as a duo. The pair would play Monkees music and promote the tour under the Monkees banner, but Nesmith stated, "there's no pretense there about Micky and I [sic] being the Monkees. We're not." The tour was cut short in June 2018, with four shows left unplayed, due to Nesmith having a "minor health issue"; he and Dolenz rescheduled the unplayed concerts and added several others, including an Australian tour in 2019.
After recovering from his health scare, Michael Nesmith and the First National Band Redux went on a tour of the U.S. with mostly the same lineup and setlist as the southern California shows.
In 2019, Nesmith toured with a focus on his 1972 album "And the Hits Just Keep on Comin'", in a two-piece configuration with pedal steel player Pete Finney, the first time he had played in this format since 1974 with Red Rhodes. Nesmith was also joined by special guests Ben Gibbard and Scott McCaughey on opening night in Seattle.
Nesmith had a cameo appearance as a taxi driver in the Whoopi Goldberg film "Burglar".
He had cameo appearances in his own films, including "Timerider" (Race Official), "Repo Man" (Rabbi), and "Tapeheads" (Water Man).
In a promotional video to support Pacific Arts's video release of "Tapeheads", Nesmith was introduced with a voice-over making fun of his Monkees persona. The narration teases Nesmith, who approaches the camera to speak, poking fun at his "missing hat".
An opportunistic lookalike from the U.S. cashed in on his similarity to Nesmith by appearing on talk shows and doing interviews in Australia during the 1980s. The scam was successful, the lookalike being far enough from America to avoid detection as a fraud (which would have been more likely in the U.S., where the real Nesmith had made many media and show-business acquaintances). An entertaining interviewee, the impersonator was not discovered until after he had vanished from the public eye. The impostor, Barry Faulkner, who had pulled various fraudulent scams for 40 years, was finally apprehended and sent to jail in 2009.
Nesmith has been married three times and has four children.
He met his first wife, Phyllis Ann Barbour, while at San Antonio College, and they married in 1964. Together, they had three children: Christian, born in 1965; Jonathan, born in 1968; and Jessica, born in 1970. Nesmith and Barbour divorced in 1972.
Nesmith also has a son, Jason, born in August 1968 to Nurit Wilde, whom he met while working on "The Monkees".
In 1976, he married his second wife, Kathryn Bild.
In 2000, he married his third wife, Victoria Kennedy, but the marriage ended in divorce in 2011.
When the Monkees' TV series ended in 1968, Nesmith enrolled part-time at the University of California, Los Angeles, where he studied American history and music history. In 1973, Nesmith founded the Countryside Records label with Jac Holzman, the founder of Elektra Records. In 1974, Nesmith started Pacific Arts Records and released what he called "a book with a soundtrack", titled "The Prison", as the company's first release.
Nesmith was forced to cancel the last four dates of his 2018 tour with Micky Dolenz due to a "minor health scare". However, in an interview with "Rolling Stone" published on July 26 of that year, Nesmith said he had undergone quadruple bypass heart surgery, and had been hospitalized for over a month.
https://en.wikipedia.org/wiki?curid=20993
McLaren
McLaren Racing Limited is a British motor racing team based at the McLaren Technology Centre, Woking, Surrey, England. McLaren is best known as a Formula One constructor, but also has a history of competing in American open-wheel racing as both an entrant and a chassis constructor, and has won the Canadian-American Challenge Cup (Can-Am) sports car racing championship. The team is the second-oldest active and second most successful Formula One team after Ferrari, having won 182 races, 12 Drivers' Championships and eight Constructors' Championships. The team is a wholly owned subsidiary of the McLaren Group.
Founded in 1963 by New Zealander Bruce McLaren, the team won its first Grand Prix at the 1968 Belgian Grand Prix, but their greatest initial success was in Can-Am, which they dominated from 1967 to 1971. Further American triumph followed, with Indianapolis 500 wins in McLaren cars for Mark Donohue in 1972 and Johnny Rutherford in 1974 and 1976. After Bruce McLaren died in a testing accident in 1970, Teddy Mayer took over and led the team to their first Formula One Constructors' Championship in 1974, with Emerson Fittipaldi and James Hunt winning the Drivers' Championship in 1974 and 1976 respectively. The year 1974 also marked the start of a long-standing sponsorship by Philip Morris' Marlboro cigarette brand.
In 1981, McLaren merged with Ron Dennis' Project Four Racing; Dennis took over as team principal and shortly after organised a buyout of the original McLaren shareholders to take full control of the team. This began the team's most successful era: with Porsche and Honda engines, Niki Lauda, Alain Prost, and Ayrton Senna took between them seven Drivers' Championships and the team took six Constructors' Championships. The combination of Prost and Senna was particularly dominant (together they won all but one race in 1988), but later their rivalry soured and Prost left for Ferrari. Fellow English team Williams offered the most consistent challenge during this period, the two winning every constructors' title between 1984 and 1994. However, by the mid-1990s, Honda had withdrawn from Formula One, Senna had moved to Williams, and the team went three seasons without a win. With Mercedes-Benz engines, West sponsorship, and former Williams designer Adrian Newey, further championships came in 1998 and 1999 with driver Mika Häkkinen, and during the 2000s the team were consistent front-runners, driver Lewis Hamilton taking their latest title in 2008.
Ron Dennis retired as McLaren team principal in 2009, handing over to long-time McLaren employee Martin Whitmarsh. However, at the end of 2013, after the team's worst season since 2004, Whitmarsh was ousted. McLaren announced in 2013 that they would be using Honda engines from 2015 onwards, replacing Mercedes-Benz. The team raced as McLaren Honda for the first time since 1992 at the 2015 Australian Grand Prix. In September 2017, McLaren announced they had agreed an engine supply with Renault from 2018 to 2020. McLaren will return to Mercedes-Benz engines from the 2021 season until at least 2024.
After initially returning to the Indianapolis 500 in 2017 as a partner to Andretti Autosport to run Fernando Alonso, McLaren announced in August 2019 that they would partner with Arrow Schmidt Peterson Motorsports starting in 2020 to contest the full IndyCar Series, the combined entry being named Arrow McLaren SP.
Bruce McLaren Motor Racing was founded in 1963 by New Zealander Bruce McLaren. Bruce was a works driver for the British Formula One team Cooper with whom he had won three Grands Prix and come second in the World Championship. Wanting to compete in the Australasian Tasman Series, Bruce approached his employers, but when team owner Charles Cooper insisted on using 1.5-litre Formula One-specification engines instead of the 2.5-litre motors permitted by the Tasman rules, Bruce decided to set up his own team to run him and his prospective Formula One teammate Timmy Mayer with custom-built Cooper cars.
Bruce won the 1964 series, but Mayer was killed in practice for the final race at the Longford Circuit in Tasmania. When Bruce McLaren approached Teddy Mayer to help him with the purchase of the Zerex sports car from Roger Penske, the two began discussing a business partnership, resulting in Teddy Mayer buying into Bruce McLaren Motor Racing Limited (BMMR) and ultimately becoming its largest shareholder. The team was based in Feltham in 1963–1964, and from 1965 until 1981 in Colnbrook, England. The team also held a British licence. Despite this, Bruce never used the traditional British racing green on his cars. Instead, he used colour schemes that were not based on national principles (e.g. his first car, the McLaren M2B, was painted white with a green stripe, to represent the fictional Yamura team in John Frankenheimer's film "Grand Prix").
During this period, Bruce drove for his team in sports car races in the United Kingdom and North America and also entered the 1965 Tasman Series with Phil Hill, but did not win it. He continued to drive in Grands Prix for Cooper, but judging that team's form to be waning, decided to race his own cars in 1966.
Bruce McLaren made the team's Grand Prix debut at the 1966 Monaco race (of the current Formula One teams, only Ferrari is older). His race ended after nine laps due to a terminal oil leak. The car was the M2B designed by Robin Herd, but the programme was hampered by a poor choice of engines: a 3.0-litre version of Ford's Indianapolis 500 engine and a Serenissima V8 were used, the latter scoring the team's first point in Britain, but both were underpowered and unreliable. For 1967, Bruce decided to use a British Racing Motors (BRM) V12 engine, but due to delays with the engine was forced initially to use a modified Formula Two car called the M4B, powered by a 2.1-litre BRM V8, later building a similar but slightly larger car called the M5A for the V12. Neither car brought great success, the best result being a fourth at Monaco.
For 1968, after driving McLaren's sole entry for the previous two years, Bruce was joined by 1967 champion and fellow New Zealander Denny Hulme, who was already racing for McLaren in Can-Am. That year's new M7A car, Herd's final design for the team, was powered by Cosworth's new and soon to be ubiquitous DFV engine (the DFV would go on to be used by McLaren until 1983), and with it came a major upturn in form. Bruce won the Race of Champions at the Brands Hatch circuit and Hulme won the International Trophy at Silverstone, both non-championship races, before Bruce took the team's first championship win at the Belgian Grand Prix. Hulme also won the Italian and Canadian Grands Prix later in the year, helping the team to second in the Constructors' Championship. Using an updated 'C' version of the M7, a further three podium finishes followed for Bruce in 1969, but the team's fifth win had to wait until the last race of the 1969 championship, when Hulme won the Mexican Grand Prix. That year, McLaren experimented with four-wheel drive in the M9A, but the car had only a single outing, driven by Derek Bell at the British Grand Prix; Bruce described driving it as like "trying to write your signature with somebody jogging your elbow".
The 1970 season started with a second place each for Hulme and Bruce in the first two Grands Prix, but in June, Bruce was killed in a crash at Goodwood while testing the new M8D Can-Am car. After his death, Teddy Mayer took over effective control of the team; Hulme continued with Dan Gurney and Peter Gethin partnering him. Gurney won the first two Can-Am events at Mosport and St. Jovite and placed ninth in the third, but left the team mid-season, and Gethin took over from there. While 1971 began promisingly when Hulme led the opening round in South Africa before retiring with broken suspension, ultimately Hulme, Gethin (who left for BRM mid-season) and Jackie Oliver again failed to score a win. The 1972 season saw improvements, though: Hulme won the team's first Grand Prix for two-and-a-half years in South Africa, and he and Peter Revson scored ten other podiums, the team finishing third in the Constructors' Championship. McLaren gave Jody Scheckter his Formula One debut at the final race at Watkins Glen. All McLaren drivers used the Ford-Cosworth engines, except for Andrea de Adamich and Nanni Galli, who used engines from Alfa Romeo in 1970.
The McLaren M23, designed by Gordon Coppuck, was the team's new car for the 1973 season. Sharing parts of the design of both McLaren's Formula One M19 and Indianapolis M16 cars (itself inspired by Lotus's 72), it was a mainstay for four years. Hulme won with it in Sweden and Revson took the only Grand Prix wins of his career, in Britain and Canada. In 1974, Emerson Fittipaldi, world champion with Lotus two years earlier, joined McLaren. Hulme, in his final Formula One campaign, won the Argentinian season-opener; Fittipaldi, with wins in Brazil, Belgium and Canada, took the Drivers' Championship. It was a close fight for Fittipaldi, who secured the title with a fourth at the season-ending United States Grand Prix, putting him three points ahead of Ferrari's Clay Regazzoni. With Hulme and multiple motorcycle world champion Mike Hailwood, he also sealed McLaren's first Constructors' Championship. The 1975 season was less successful for the team: Fittipaldi was second in the championship behind Niki Lauda, and Hulme's replacement Jochen Mass took his sole GP win in Spain.
At the end of 1975, Fittipaldi left to join his brother's Fittipaldi/Copersucar team. With the top drivers already signed to other teams, Mayer turned to James Hunt, a driver on whom biographer Gerald Donaldson reflected as having "a dubious reputation". In 1976, Lauda was again strong in his Ferrari; at midseason, he led the championship with 56 points whilst Hunt had only 26 despite wins in Spain (a race from which he was initially disqualified) and France. At the German Grand Prix, though, Lauda crashed heavily, was nearly killed, and missed the next two races. Hunt capitalised by winning four more Grands Prix, giving him a three-point deficit going into the finale in Japan. Here it rained torrentially, Lauda retired because of safety concerns, and Hunt sealed the Drivers' Championship by finishing third. McLaren, though, lost the Constructors' Championship to Ferrari.
In 1977, the M23 was gradually replaced with the M26, the M23's final works outing being Gilles Villeneuve's Formula One debut with the team in a one-off appearance at the British Grand Prix. Hunt won on three occasions that year, but the Lauda and Ferrari combination proved too strong, Hunt and McLaren managing just fifth and third in the respective championships. From there, results continued to worsen. Lotus and Mario Andretti took the 1978 titles with their 78 and 79 ground-effect cars, and neither Hunt nor Mass's replacement Patrick Tambay were able to seriously challenge with the non-ground-effect M26. Hunt was dropped at the end of 1978 in favour of Lotus's Ronnie Peterson, but when Peterson was killed by a crash at the Italian Grand Prix, John Watson was signed instead. No improvement occurred in 1979; Coppuck's M28 design was described by Mayer as "ghastly, a disaster" and "quite diabolical", and the M29 did little to change the situation. Tambay scored no points and Watson only 15, to place the team eighth at the end of the year.
The 1980s started much as the 1970s had ended: Alain Prost took over from Tambay, but he and Watson rarely scored points. Under increasing pressure since the previous year from principal sponsor Philip Morris and their executive John Hogan, Mayer was coerced into merging McLaren with Ron Dennis's Project Four Formula Two team, also sponsored by Philip Morris. Dennis had designer John Barnard who, inspired by the carbon-fibre rear wings of the BMW M1 race cars that Project Four was preparing, had ideas for an innovative Formula One chassis constructed from carbon-fibre instead of conventional aluminium alloy. On their own, they lacked the money to build it, but with the investment that came with the merger it became the McLaren MP4 (later called MP4/1) of 1981, driven by Watson and Andrea de Cesaris. In the MP4, Watson won the British Grand Prix and had three other podium finishes. Soon after the merger, McLaren moved from Colnbrook to a new base in Woking; Dennis and Mayer initially shared the managing directorship of the company, but by 1982, Mayer had departed and his and Tyler Alexander's shareholdings had been bought by the new owners.
In the early 1980s, teams like Renault, Ferrari and Brabham were using 1.5-litre turbocharged engines in favour of the 3.0-litre naturally aspirated engines that had been standard since 1966. Having seen in 1982 the need for a turbo engine of their own, Dennis convinced Williams backer Techniques d'Avant Garde (TAG) to fund Porsche-built, TAG-branded turbo engines made to Barnard's specifications; TAG's founder Mansour Ojjeh would later become a McLaren shareholder. In the meantime, they continued with Cosworth engines as old rival Lauda came out of retirement in 1982 to drive alongside Watson in that year's 1B development of the MP4. They each won two races, Watson notably from 17th place on the grid in Detroit, and at one stage of the season McLaren were second in the Constructors' Championship. As part of a dispute with FISA, they boycotted the San Marino Grand Prix. Although 1983 was not so fruitful, Watson did win again in the United States, this time from 22nd on the grid at Long Beach.
Having been fired by Renault, Prost was once again at McLaren for 1984. Now using the TAG engines, the team dominated, scoring 12 wins and two-and-a-half times as many constructors' points as nearest rival Ferrari. In the Drivers' Championship, Lauda prevailed over Prost by half a point, the narrowest margin ever. The McLaren-TAGs were again strong in 1985; a third Constructors' Championship came their way whilst this time Prost won the Drivers' Championship. In 1986, the Williams team were resurgent with their Honda engine and drivers Nigel Mansell and Nelson Piquet, whilst at McLaren, Lauda's replacement, 1982 champion Keke Rosberg, could not gel with the car. Williams took the Constructors' Championship, but for Prost, wins in San Marino, Monaco, and Austria, combined with the fact that the Williams drivers were taking points from each other, meant that he retained a chance going into the last race, the Australian Grand Prix. There, a puncture for Mansell and a precautionary pit stop for Piquet gave Prost the race win and his second title, making him the first driver to win back-to-back championships since Jack Brabham in 1959 and 1960. In 1987, Barnard departed for Ferrari, to be replaced by Steve Nichols (who himself joined Ferrari in 1989). In the hands of Prost and Stefan Johansson, though, Nichols's MP4/3 and the TAG engine could not match the Williams-Honda.
For 1988, Honda switched their supply to McLaren and, encouraged by Prost, Dennis signed Ayrton Senna to drive. Despite regulations reducing the boost pressure and fuel capacity (and therefore, power) of the turbo cars, Honda persisted with a turbocharged engine. In the MP4/4, Senna and Prost engaged in a season-long battle, winning 15 of the 16 races (at the other race, at Monza, Senna had been leading comfortably but collided with back-marker Jean-Louis Schlesser). At the Portuguese Grand Prix, their relationship soured when Senna squeezed Prost against the pit wall; Prost won, but afterwards said, "It was dangerous. If he wants the world championship that badly he can have it." Prost scored more points that year, but because only the best 11 results counted, Senna took the title at the penultimate race in Japan.
The next year, with turbos banned, Honda supplied a new 3.5-L naturally aspirated V10 engine and McLaren again won both titles with the MP4/5. Their drivers' relationship continued to deteriorate, though, especially when, at the San Marino Grand Prix, Prost felt that Senna had reneged on an agreement not to pass each other at the first corner. Believing that Honda and Dennis were favouring Senna, Prost announced mid-season that he would leave to drive at Ferrari the following year. For the second year in succession, the Drivers' Championship was decided at the Japanese Grand Prix, this time in Prost's favour after Senna and he collided (Senna initially recovered and won the race, but was later disqualified).
With former McLaren men Nichols and Prost (Barnard had moved to the Benetton team), Ferrari pushed the British team more closely in 1990. McLaren, in turn, brought in Ferrari's Gerhard Berger, but like the two seasons before, the Drivers' Championship was led by Prost and Senna and settled at the penultimate race in Japan. Here, Senna collided with Prost at the first corner, forcing both to retire, but this time Senna escaped punishment and took the title; McLaren also won the Constructors' Championship. The 1991 season was another successful one for McLaren and Senna, with the ascendant Renault-powered Williams team their closest challengers. By 1992, Williams, with their advanced FW14B car, had overtaken McLaren, breaking their four-year run as champions, despite the latter winning five races that year.
As Honda withdrew from the sport at the end of 1992 due to their entrance into the CART PPG Indy Car World Series in 1993, McLaren sought a new engine supplier. A deal to secure Renault engines fell through; McLaren subsequently switched to customer Ford engines for the 1993 season. Senna, who initially agreed only to a race-by-race contract before later signing for the whole year, won five races, including a record-breaking sixth victory at Monaco and a win at the European Grand Prix, where he went from fifth to first on the opening lap. His teammate, 1991 IndyCar champion Michael Andretti, fared much worse: he scored only seven points, and was replaced by test driver Mika Häkkinen for the final three rounds of the season. Williams ultimately won both titles, and Senna, who had flirted with moving there for 1993, signed with them for the 1994 season. During the 1993 season, McLaren took part in a seven-part BBC Television documentary called "A Season With McLaren".
McLaren tested a Lamborghini V12 engine ahead of the 1994 season, as part of a potential deal with then-Lamborghini owner Chrysler, before eventually deciding to use Peugeot engines. With Peugeot power, the MP4/9 was driven by Häkkinen and Martin Brundle; despite eight podium finishes over the season, no wins were achieved. Peugeot was dropped after a single year due to multiple engine failures and general unreliability, which cost McLaren potential race victories, and the team switched to a Mercedes-Benz-branded, Ilmor-designed engine.
The alliance with Mercedes started slowly: 1995's MP4/10 car was not a front-runner, and Brundle's replacement, former champion Nigel Mansell, was unable to fit into the car at first and departed after just two races, with Mark Blundell taking his place.
While Williams dominated in 1996, McLaren, now with David Coulthard alongside Häkkinen, went a third successive season without a win. In 1997, however, Coulthard broke this run by winning the season-opening Australian Grand Prix; he and Häkkinen would each win another race before the end of the season, and highly rated designer Adrian Newey joined the team from Williams in August that year. Despite the car's improved pace, unreliability proved costly throughout the season, with retirements at the British and Luxembourg Grands Prix occurring whilst Häkkinen was in the lead.
With Newey able to take advantage of new technical regulations for 1998, and with Williams losing their works Renault engines, McLaren were once again able to challenge for the championship; "F1 Racing" magazine stated that the only way to increase their championship hopes was to hire Ferrari's double champion Michael Schumacher. Häkkinen and Coulthard won five of the first six races despite the banning of the team's "brake steer" system, which allowed the rear brakes to be operated individually to reduce understeer, after a protest by Ferrari at the second race in Brazil. Schumacher and Ferrari provided the greatest competition; the former was level on points with Häkkinen with two races to go, but wins for Häkkinen at the Luxembourg and Japanese Grands Prix gave him the Drivers' Championship and McLaren the Constructors' Championship. Häkkinen won his second Drivers' Championship the following season, but due to a combination of driver errors and mechanical failures, the team lost the constructors' title to Ferrari.
The 2000 season was not a repeat of recent successes: McLaren won seven races in a close fight with Ferrari, but ultimately Ferrari and Schumacher prevailed in both competitions. This marked the start of a decline in form as Ferrari cemented their position at the head of Formula One. In 2001, Häkkinen was outscored by Coulthard for the first time since 1997 and retired (ending Formula One's longest-ever driver partnership), his place taken by Kimi Räikkönen; then in 2002, Coulthard took their solitary win at Monaco while Ferrari repeated McLaren's 1988 feat of 15 wins in a season.
The 2003 season started very promisingly, with one win each for Coulthard and Räikkönen at the first two Grands Prix. However, they were hampered when the MP4-18 car designed for that year suffered crash-test and reliability problems, forcing them to continue using a 'D' development of the year-old MP4-17 for longer than they had initially planned. Despite this, Räikkönen scored points consistently and challenged for the championship up to the final race, eventually losing by two points. The team began 2004 with the MP4-19, which technical director Adrian Newey described as "a debugged version of [the MP4-18]". It was not a success, though, and was replaced mid-season by the MP4-19B. With this, Räikkönen scored the team's and his only win of the year at the Belgian Grand Prix, as McLaren finished fifth in the Constructors' Championship, their worst ranking since 1983.
Coulthard left for Red Bull Racing in 2005, to be replaced by former CART champion Juan Pablo Montoya for what was McLaren's most successful season in several years, as he and Räikkönen won ten races between them. However, the team's inability to work out why the car could not heat its tyres properly in the early stages of the season, together with the overall unreliability of the MP4-20, cost a number of race victories when Räikkönen had been leading or in contention to win, and also cost him grid positions in some qualifying sessions, allowing Renault and their driver Fernando Alonso to capitalise and win both titles.
In 2006, the team failed to build on the previous year's good form as the superior reliability and speed of the Ferraris and Renaults prevented the team from gaining any victories for the first time in a decade. Montoya parted company acrimoniously with the team to race in NASCAR after the United States Grand Prix, where he crashed into Räikkönen at the start; test driver Pedro de la Rosa deputised for the remainder of the season. The team also lost Räikkönen to Ferrari at the end of the year.
Steve Matchett argued that the poor reliability of McLaren in 2006 and the preceding years was due to a lack of team continuity and stability. His cited examples of instability included logistical challenges related to the move to the McLaren Technology Centre, Adrian Newey's aborted move to Jaguar and later move to Red Bull, the subsequent move of Newey's deputy to Red Bull, and personnel changes at Ilmor.
The 2007 season had Fernando Alonso, who had been contracted over a year previously, race alongside Formula One debutant and long-time McLaren protégé Lewis Hamilton. The pair scored four wins each and led the Drivers' Championship for much of the year, but tensions arose within the team, some commentators claiming that Alonso was unable to cope with Hamilton's competitiveness. At the Hungarian Grand Prix, Alonso was judged to have deliberately impeded his teammate during qualifying, so the team were not allowed to score Constructors' points at the event. An internal agreement within the McLaren team stated that the drivers would alternately get an extra lap in qualifying, an arrangement Hamilton had refused to honour at the Hungarian Grand Prix, which was said to explain Alonso's actions. Subsequently, the McLaren team were investigated by the FIA for being in possession of proprietary detailed technical blueprints of Ferrari's car – the so-called "Spygate" controversy. At the first hearing, McLaren management consistently denied all knowledge, blaming a single "rogue engineer". However, in the final hearing, McLaren were found guilty and the team were excluded from the Constructors' Championship and fined $100 million. The drivers were allowed to continue without penalty, and whilst Hamilton led the Drivers' Championship heading into the final race in Brazil, Räikkönen in the Ferrari won the race and the Drivers' Championship, a single point ahead of both McLaren drivers. In November, Alonso and McLaren agreed to terminate their contract by mutual consent, Heikki Kovalainen filling the vacant seat alongside Hamilton.
In 2008, a close fight ensued between Hamilton and the Ferraris of Felipe Massa and Räikkönen; Hamilton won five times and, despite also crossing the finish line first at the Belgian Grand Prix, was deemed to have gained an illegal advantage by cutting a chicane during an overtake and was controversially demoted to third. Going into the final race in Brazil, Hamilton had a seven-point lead over Massa. Massa won there, but Hamilton dramatically clinched his first Drivers' Championship by moving into the necessary fifth position at the final corner of the final lap of the race. Despite winning his first Grand Prix in Hungary, Kovalainen finished the season only seventh in the overall standings, allowing Ferrari to take the constructors' title.
Before the start of the 2009 season, Dennis retired as team principal, handing responsibility to Martin Whitmarsh, but the year started badly: the MP4-24 car was off the pace and the team was given a three-race suspended ban for misleading stewards at the Australian and Malaysian Grands Prix. Despite these early problems, a late revival brought Hamilton wins at the Hungarian and Singapore Grands Prix. McLaren signed that year's champion, Jenson Button, to replace Kovalainen alongside Hamilton in 2010.
Button won twice (in Australia and China) and Hamilton three times (in Turkey, Canada, and Belgium), but they and McLaren failed to win their respective championships, as that year's MP4-25 was largely outpaced by Red Bull's RB6.
Hamilton and Button remained with the team into 2011, with Hamilton winning three races (China, Germany, and Abu Dhabi) and Button also winning three (Canada, Hungary, and Japan). Button finished the Drivers' Championship in second place with 270 points, behind 2011 Drivers' Champion Sebastian Vettel of Red Bull Racing and ahead of Hamilton's 227 points. McLaren were second in the Constructors' Championship to Red Bull Racing.
In 2012, McLaren won the first race of the year in Australia with a 1–3 finish for Button and Hamilton, and Hamilton went on to win in Canada. By the mid-way mark of the season at the team's home race at Silverstone, however, the McLaren cars managed only eighth place (Hamilton) and 10th place (Button), while the Drivers' and Constructors' Championships were being dominated by Red Bull Racing and Ferrari, whose cars occupied the first four places of the championship standings. This was partially due to pit stop problems, Button's loss of form after not working as well with the new car as Hamilton, and the car's difficulty adapting to the Pirelli tyres. The car also suffered reliability problems which cost the team and its drivers numerous potential points, most notably in Singapore and Abu Dhabi, where Hamilton had been leading from the front in both races.
Sergio Pérez replaced Hamilton for 2013, after Hamilton decided to leave for Mercedes. The team's car for the season, the MP4-28, was launched on 31 January 2013. The car struggled to compete with the other top teams, and McLaren failed to produce a podium finish for the first time since 1980.
Kevin Magnussen replaced Pérez for 2014, and Ron Dennis, who had remained at arm's length since stepping down from the team principal role, returned as CEO of the operation. McLaren were the first team to officially launch their 2014 car, the MP4-29, which was revealed on 24 January 2014. They had a largely unsuccessful 2014; their best result came in Australia, where – after Daniel Ricciardo's disqualification from second place – Magnussen finished second and Button third. Button subsequently finished fourth in Canada, Britain, and Russia. Their highest grid position was in Britain, with Button's third place on the grid.
For 2015, McLaren ended their engine deal with Mercedes, which included buying back the 40% stake that Mercedes held in the team, and reforged their historical partnership with Honda. After a prolonged period, the team announced Fernando Alonso and Jenson Button as their race drivers, with Kevin Magnussen demoted to test driver. During pre-season testing at the Circuit de Barcelona-Catalunya in February, Alonso suffered a concussion and, as a result, Magnussen replaced him for the season-opening Australian Grand Prix in March. At that inaugural race of the season, Button finished 11th, but was lapped twice and finished last of the finishing cars. Following considerable unreliability and initial suggestions that the Honda engine was underpowered relative to its competitors, steady performance gains eventually saw Button score the team's first points of the season – four – at the sixth round in Monaco. By contrast, Alonso scored his first point a further three races later at the British Grand Prix.
The Hungarian Grand Prix saw the team score their best result of the season, with Alonso and Button finishing fifth and ninth respectively. However, McLaren did not score points in the next four races, until Button finished ninth at the Russian Grand Prix. At the following United States Grand Prix, Button scored his best result of the season with sixth place. The team finished ninth in the constructors' standings with 27 points, marking McLaren's worst points finish since 1980.
McLaren retained the Alonso–Button pairing for the 2016 season. The second year of the renewed Honda partnership was much more promising than the first, with the team able to challenge for top-10 positions on a more regular basis. However, the season started with a massive crash at the Australian Grand Prix, in which Alonso sustained rib fractures and a collapsed lung after colliding with Esteban Gutiérrez and somersaulting into the crash barriers. As a result of his injuries, Alonso was forced to miss the second round of the championship, the Bahrain Grand Prix, and was replaced by reserve driver Stoffel Vandoorne. Vandoorne produced an impressive performance in his first race to score the team's first point of the season with 10th place. The next points for McLaren came at the Russian Grand Prix, with Alonso and Button finishing sixth and 10th respectively. The rain-affected Monaco Grand Prix was one of the best races of the season for the team: Alonso finished fifth, having kept Nico Rosberg's Mercedes behind him for 46 laps, while Button scored two points with ninth. At the Austrian Grand Prix, Button recorded his best result of the season with sixth place after qualifying third in a wet/dry session. After a disappointing display at their home race, the British Grand Prix at Silverstone, the team scored points at the next three rounds, with six points in Hungary, four in Germany, and six again thanks to an impressive seventh-place finish from Alonso at the Belgian Grand Prix. At the United States Grand Prix, McLaren matched their Monaco result with 12 points after an attacking race from Alonso saw him claim fifth position, while Button once again finished ninth. After a season of significant progress compared to 2015, Alonso and Button finished the championship in 10th and 15th places respectively, with the team ending the season sixth in the Constructors' Championship with 76 points. On 3 September 2016, Button announced he would take a sabbatical from Formula One for the 2017 season; he then confirmed on 25 November that he would retire from F1 altogether, with Vandoorne becoming Alonso's new teammate for 2017.
In February 2017, McLaren signed Lando Norris to their Young Driver Programme.
Alonso did not take part in the 2017 Monaco Grand Prix as he was participating in the Indianapolis 500. Instead, Jenson Button returned for that one race as his replacement.
On 15 September 2017, during the Singapore Grand Prix weekend, McLaren confirmed that they would end their partnership with Honda at the end of the 2017 season and had agreed a three-year deal to use customer Renault engines. Team boss Éric Boullier described the poor on-track performance between 2015 and 2017 as a "proper disaster" for the team's credibility. McLaren finished 2017 ninth in the Constructors' Championship with 30 points.
2018 was the first season in McLaren's history in which their cars were powered by Renault engines. McLaren announced that Fernando Alonso and Stoffel Vandoorne would remain with the team for the 2018 season, and on 6 November 2017 the team announced that Lando Norris would be their test and reserve driver.
At the season-opening Australian Grand Prix, Alonso scored the team's best finish since the 2016 Monaco Grand Prix with fifth place, and said that the team's target would be Red Bull Racing. McLaren had a relatively good start to the season with points finishes in the next four races, but in the 16 races after Spain they scored only 22 points, eight fewer than in the same period in 2017. On 14 August 2018, Alonso announced he would not compete in Formula One in 2019, ending his four-year spell at the team. Carlos Sainz Jr. was signed as his replacement on a multi-year deal. On 3 September 2018, it was announced that Vandoorne would leave the team at the end of the season, with Lando Norris promoted from reserve driver to replace him in 2019. McLaren struggled for performance throughout the season: their drivers were knocked out in the first qualifying session 21 times, and the team had the second-worst average qualifying position of any team, ahead only of Williams. Helped by the exclusion of Force India's points from the first 12 races, McLaren finished the disappointing season in sixth place with 62 points – 357 points behind their stated target, Red Bull Racing, who used the same engine.
The 2019 season was much more positive for McLaren, with the team securely establishing themselves as the best constructor behind Mercedes, Ferrari, and Red Bull. At the Brazilian Grand Prix, Sainz recorded the team's first podium since the 2014 Australian Grand Prix, finishing fourth on the road but being promoted to third after Lewis Hamilton received a post-race penalty, which meant the team missed out on the official podium ceremony. McLaren ended the season in fourth place with 145 points, their best result since 2014 and 54 points ahead of their nearest competitor, Renault.
McLaren withdrew from the 2020 season-opening Australian Grand Prix after one of their team members tested positive for COVID-19. The race was later cancelled due to the COVID-19 pandemic.
McLaren are due to return to Mercedes engines in 2021, when their deal with Renault ends; McLaren had previously used Mercedes engines from 1995 through 2014. Daniel Ricciardo is due to move from Renault to partner Lando Norris for the 2021 Formula One World Championship on a multi-year deal, replacing Carlos Sainz Jr., who is moving to Scuderia Ferrari.
McLaren's first sports-racing car was the Group 7 M1, with a small-block Chevrolet engine in a modified Elva chassis. The car was raced in North America and Europe in 1963 and 1964 in various Group 7 and United States Road Racing Championship events. For the Can-Am series, which started in 1966, McLaren created the M3, which Bruce and Chris Amon drove; customer cars also appeared in a number of races in the 1966 season. With the M3 they led two races, but scored no wins, and the inaugural title was taken by John Surtees in a Lola T70. The following year, Robin Herd purpose-designed the Chevrolet V8-powered M6A, delays with the Formula One programme allowing the team to spend extra resources on developing the Can-Am car, which was the first to be painted in McLaren orange. With Denny Hulme now partnering Bruce, they won five of six races and Bruce won the championship, setting the pattern for the next four years. In 1968, they used a new car, the M8, to win four races; non-works McLarens took the other two, but this time Hulme was victorious overall. In 1969, McLaren domination became total as they won all 11 races with the M8B; Hulme won five, and Bruce won six and the Drivers' Championship. From 1969 onwards, the McLaren M12 – the customer variant of the M8 – was driven by a number of entrants, including a version modified by Jim Hall of Chaparral fame. McLaren's success in Can-Am brought with it financial rewards, both prize money and income from selling cars to other teams, which helped to support the team and fund the nascent and relatively poor-paying Formula One programme.
When Bruce was killed testing the 1970 season's M8D, he was at first replaced by Dan Gurney, then later by Peter Gethin; they won two races and one race respectively, while Hulme won six on the way to the championship. Private entries in the 1970 Can-Am series included older M3Bs as well as the M12 – the customer version of the team's M8B. In 1971, the team held off the challenge of 1969 world champion Jackie Stewart in the Lola T260, winning eight races, with Peter Revson taking the title. Hulme also won three Can-Am races in 1972, but the McLaren M20 was defeated by the Porsche 917/10s of Mark Donohue and George Follmer. Faced with the greater resources of Porsche, McLaren decided to abandon Can-Am at the end of 1972 and focus solely on open-wheel racing. When the original Can-Am series ceased at the end of 1974, McLaren were by far its most successful constructor, with 43 wins.
McLaren first contested the United States Auto Club's (USAC) Indianapolis 500 race in 1970, encouraged by their tyre supplier Goodyear, which wanted to break competitor Firestone's stranglehold on the event. With the M15 car, Bruce, Chris Amon, and Denny Hulme entered, but after Amon withdrew and Hulme was severely burned on the hands in an incident in practice, Peter Revson and Carl Williams took their places in the race, retiring and finishing seventh respectively. The team also contested some of the more prestigious races in the USAC championship that year, as they would do in subsequent years. For 1971 they had a new car, the M16, which driver Mark Donohue said "...obsoleted every other car on track..." At that year's Indianapolis 500, Revson qualified on pole and finished second, whilst in 1972, Donohue won in privateer Team Penske's M16B. The 1973 event saw Johnny Rutherford join the team; he qualified on pole but finished ninth, while Revson crashed out. McLaren won their first Indianapolis 500 in 1974 with Rutherford. The McLaren and Rutherford combination was second in 1975 and won again in 1976. Developments of the M16 were used throughout this period until the new M24 car was introduced in 1977. The team did not reproduce their earlier success at Indianapolis in 1977, 1978, or 1979, and although they continued to win other USAC races, by the end of 1979 they decided to end their involvement.
On 12 April 2017, McLaren revealed they would participate in the 2017 Indianapolis 500 with their current Formula 1 driver Fernando Alonso at the wheel of a Honda-powered McLaren-branded Andretti Autosport IndyCar.
In qualifying, Alonso secured a second-row start in fifth. During the race, he led 27 laps in his first Indy 500 start. With 21 laps remaining, Alonso was running seventh when his Honda engine failed; he was classified 24th. After his retirement he received a standing ovation from the grandstands and was praised for his strong debut.
On 10 November 2018, McLaren announced that they would participate in the 2019 Indianapolis 500 with Fernando Alonso again at the wheel, using Chevrolet engines. However, their 2019 attempt was far less successful than their 2017 showing; after suffering mechanical difficulties and a severe crash in practice, the team failed to qualify for the race, as did two other Carlin-associated entries, one of them driven by another former F1 driver, Max Chilton.
In August 2019, it was announced that McLaren would contest the IndyCar Series full-time in 2020, collaborating with Arrow Schmidt Peterson Motorsports.
Besides the cars raced by the works team, a variety of McLaren racing cars have also been used by customer teams. In their formative years, McLaren built Formula Two, hillclimbing, Formula 5000, and sports racing cars that were sold to customers. Lacking the capacity to build the desired numbers, McLaren subcontracted Trojan to construct some of them. In Can-Am, Trojan built customer versions of the M6 and M8 cars, and ex-works cars were sold to privateers when new models arrived; at some races, half the field were McLarens. Author Mark Hughes says "over 220" McLarens were built by Trojan. In USAC competition and Formula One, too, many teams used McLarens during the late 1960s and 1970s. A 1972 M8F was rebuilt as the C8 for use in Group C racing in 1982, but had little success.
In the mid-1990s, McLaren Racing's sister company McLaren Cars (now McLaren Automotive) built a racing version of their F1 road car, the F1 GTR, which won the 1995 24 Hours of Le Mans and the 1995 and 1996 BPR Global GT Series. More recently, a GT3 version of their MP4-12C road car was announced, to be entered by CRS Racing in the FIA GT3 European Championship.
McLaren Racing is part of the McLaren Group which includes five other associated companies; in 2009 the Group was said to have "more than 1300" employees. Since 2004 the team has been based at the McLaren Technology Centre in Woking, United Kingdom. Facilities there include a wind tunnel and a driving simulator which is said to be the most sophisticated in the sport. The Mercedes engines were built by the car-maker's Mercedes AMG High Performance Powertrains subsidiary (formerly Mercedes-Ilmor) in Brixworth, Northamptonshire. Honda replaced Mercedes as McLaren's engine supplier from the 2015 season.
The team was founded in 1963 by New Zealander Bruce McLaren. After Bruce McLaren died in a testing accident in 1970, Teddy Mayer took over the team. In 1981, McLaren merged with Ron Dennis' Project Four Racing; Dennis took over as team principal and shortly afterwards organised a buyout of the original McLaren shareholders to take full control of the team.
Ron Dennis was the chairman of the Group – a role from which he resigned in 2009 before retaking it a year later – and was also team principal from 1980 to 2009. Martin Whitmarsh held the role of team principal from 2009 to 2013. Dennis later removed the position of team principal; Éric Boullier was named racing director in January 2014, becoming responsible for the F1 team. On 4 July 2018, Boullier resigned, and Gil de Ferran was appointed to the new position of sporting director, with Andrea Stella as performance director. On 1 May 2019, Andreas Seidl was appointed team principal.
On 16 January 2014, it was announced that Ron Dennis had returned to the role of Group CEO of McLaren, combining it with his existing role as chairman of McLaren Group.
On 21 November 2016, Zak Brown was announced as the new executive director of McLaren Technology Group after Ron Dennis was forced out. Instead of directly replacing Dennis as CEO, Brown will report directly to the group's Executive Committee. Both Jonathan Neale (chief operating officer) and Brown will jointly lead the businesses as part of the first step in the Group's transition to a new organisational structure.
On 10 April 2018, Brown became the CEO of McLaren Racing as part of an operational restructure of the McLaren Group. Under the new management structure, racing director Éric Boullier would report directly to Brown.
McLaren Racing Limited is a wholly owned subsidiary of McLaren Group. In 2000, Mercedes's parent company Daimler (then DaimlerChrysler) bought a 40% share of McLaren Group, which they maintained until 2009 when they bought out the championship-winning Brawn team and began to sell back their McLaren stake.
The Bahrain royal family's Mumtalakat investment company owns 56% of McLaren Group, Mansour Ojjeh (TAG Group) owns 14%, Michael Latifi owns 10%, and minor shareholders own the rest.
McLaren has had an uneasy relationship with Formula One's governing body, the FIA, and its predecessor FISA, as well as with the commercial rights holder of the sport. In the early 1980s, McLaren were involved, along with the other teams of the Formula One Constructors Association (FOCA), in a dispute over control of the sport with FISA and the teams of car manufacturers Alfa Romeo, Renault, and Ferrari. This was known as the FISA–FOCA war, during which a breakaway series was threatened, FISA refused to sanction one race, and another race was boycotted by FOCA. It was eventually resolved by a revenue-sharing deal called the Concorde Agreement. Subsequent Concorde Agreements were signed in 1987 and 1992, but in 1996 McLaren were again one of the teams pitched into dispute over the terms of a new agreement, this time with former FOCA president Bernie Ecclestone's Formula One Promotions and Administration organisation. McLaren rejected the Concorde Agreement of 1997 before signing a new 10-year agreement in 1998. Arguments over the commercial structure and regulations of the sport restarted in the mid-2000s, with McLaren and their part-owner Mercedes again amongst the teams threatening to start a rival series, until 2009, when another Concorde Agreement, effective until the end of 2012, was settled upon. In 2007, McLaren were involved in an espionage controversy after their chief designer Mike Coughlan obtained confidential technical information from Ferrari; McLaren were excluded from the Constructors' Championship and fined US$100 million.
McLaren's Formula One team was originally called Bruce McLaren Motor Racing, and for their first season ran white-and-green coloured cars, which came about as a result of a deal with the makers of the film "Grand Prix".
Between 1968 and 1971, the team used an orange design, which was also applied to cars competing in the Indianapolis 500 and Can-Am series, and which was revived as an interim testing livery in later years.
In 1968, the Royal Automobile Club and the Fédération Internationale de l'Automobile relaxed the rules regarding commercial sponsorship of Formula One cars, and in 1972 the Yardley of London cosmetics company became McLaren's first title sponsor, the livery being changed to a predominantly white one to reflect the sponsor's colours. This changed in 1974, when Philip Morris joined as title sponsor through their Marlboro cigarette brand, whilst one car continued to run – ostensibly by a separate team – in Yardley livery for the year. Marlboro's red-and-white branding lasted until 1996, during which time the team went by various names incorporating the word "Marlboro". It was then the longest-running sponsorship in Formula One and remains the longest title sponsorship, although in overall duration it has since been surpassed by Hugo Boss's sponsorship of the team, which ran from 1981 to 2014.
In 1997, Philip Morris parted ways with McLaren, moving to Ferrari instead. The Marlboro sponsorship was replaced by Reemtsma's West cigarette branding, with the team entering under the name West McLaren Mercedes and adopting a silver-and-black livery.
By mid-2005, a European Union directive banning tobacco advertising in sport forced McLaren to end its association with West. In 2006, the team competed without a title sponsor, entering under the name Team McLaren Mercedes. McLaren altered their livery to introduce red into the design and changed the silver to chrome.
In 2007, McLaren signed a seven-year contract with telecommunications company Vodafone and became known as Vodafone McLaren Mercedes. The arrangement was due to last until 2014, although the team announced at the 2013 Australian Grand Prix that the partnership would conclude at the end of that season. McLaren explained the decision as a result of Vodafone's desire to reconsider its commercial opportunities, but it was later reported that the running of the 2012 Bahrain Grand Prix despite an ongoing civil uprising and protests against the race, together with Vodafone's inability to remove its logos from the McLaren cars during that event, was a key factor in the decision to terminate the sponsorship. Diageo-owned whisky brand Johnnie Walker, an associate sponsor since 2005, offered to take over as title sponsor at the end of 2013, but their offer of £43m was turned down by McLaren chairman Ron Dennis, who believed it to be "too small".
At the end of 2015, it was announced that McLaren were due to lose sponsor TAG Heuer to Red Bull Racing. McLaren chief Ron Dennis later admitted to falling out with TAG Heuer CEO Jean-Claude Biver.
In 2015, McLaren were without a title sponsor and were set to lose a further £20m in sponsorship in 2016.
From 2015 to 2017, during their three-year partnership with Honda, the team competed under the name "McLaren Honda".
From 2018, the team have competed under the name "McLaren".
McLaren's cars were originally named with the letter M followed by a number, sometimes also followed by a letter denoting the model. After the 1981 merger with Project Four, the cars were called "MP4/x", or since 2001 "MP4-x", where x is the generation of the chassis (e.g. MP4/1, MP4-22). "MP4" stood initially for "Marlboro Project 4", so that the full title of the cars (McLaren MP4/x) reflected not only the historical name of the team, but also the names of the team's major sponsor and its new component part. Since the change of title sponsor in 1997, "MP4" was said to stand for "McLaren Project 4". From 2017, following Ron Dennis' departure from the team, the naming scheme of the cars changed to "MCL" followed by a number. The colour scheme was also changed to orange and black to reflect both McLaren's corporate colours and their original liveries.
Currently, there are no drivers signed to the McLaren Young Driver Programme.
Seven drivers have won a total of twelve Drivers' Championships with McLaren: Emerson Fittipaldi (1974), James Hunt (1976), Niki Lauda (1984), Alain Prost (1985, 1986, 1989), Ayrton Senna (1988, 1990, 1991), Mika Häkkinen (1998, 1999), and Lewis Hamilton (2008).
|
https://en.wikipedia.org/wiki?curid=20994
|
Montalcino
Montalcino is a hill town and "comune" in the province of Siena, Tuscany, central Italy.
The town is located to the west of Pienza, close to the Crete Senesi in Val d'Orcia, and lies south of Siena; Florence and Pisa are further to the north and north-west. Monte Amiata is located nearby.
The hill upon which Montalcino sits has probably been settled since Etruscan times. Its first mention in historical documents in 814 AD suggests there was a church here in the 9th century, most likely built by monks associated with the nearby Abbey of Sant'Antimo. The population grew suddenly in the middle of the tenth century, when people fleeing the nearby town of Roselle took up residence in the town.
The town takes its name from a variety of oak tree that once covered the terrain. The very high site of the town offers stunning views over the Asso, Ombrone and Arbia valleys of Tuscany, dotted with silvery olive orchards, vineyards, fields and villages. The lower slopes of the Montalcino hill itself are dominated by highly productive vines and olive orchards.
During medieval times the city was known for its tanneries and for the shoes and other leather goods that were made from the high-quality leathers that were produced there. As time went by, many medieval hill towns, including Montalcino, went into serious economic decline.
Like many of the medieval towns of Tuscany, Montalcino experienced long periods of peace and often enjoyed a measure of prosperity. This peace and prosperity was, however, interrupted by a number of extremely violent episodes.
During the late Middle Ages it was an independent commune with considerable importance owing to its location on the old Via Francigena, the main road between France and Rome, but increasingly Montalcino came under the sway of the larger and more aggressive city of Siena.
As a satellite of Siena following the Battle of Montaperti in 1260, Montalcino was deeply involved in and affected by the conflicts in which Siena became embroiled, particularly those with the city of Florence in the 14th and 15th centuries. Like many other cities in central and northern Italy, the town was also caught up in the internecine wars between the Ghibellines (supporters of the Holy Roman Empire) and the Guelphs (supporters of the Papacy); factions from each side controlled the town at various times in the late medieval period.
Once Siena had been conquered by Florence under the rule of the Medici family in 1555, Montalcino held out for almost four years, but ultimately fell to the Florentines, under whose control it remained until the Grand Duchy of Tuscany was amalgamated into a united Italy in 1861.
In the case of Montalcino, gradual economic decline has recently been reversed by growth driven by the increasing popularity of the town's famous wine, Brunello di Montalcino, made from the sangiovese grosso grapes grown within the comune. The number of producers of the wine has grown from only 11 in the 1960s to more than 200 today, producing some 330,000 cases of Brunello annually. Brunello was the first wine to be awarded "Denominazione di Origine Controllata e Garantita" (DOCG) status. In addition to Brunello di Montalcino, which must be aged five years prior to release (six for the Riserva), the comune also produces Rosso di Montalcino (DOC), made from sangiovese grosso grapes and aged one year, a variety of Super Tuscan wines, and the Moscadello sweet white wines for which it was most famous until the development of Brunello.
The first walls of the town were built in the 13th century. The fortress, built in 1361 atop the highest point of the town, was designed with a pentagonal layout by the Sienese architects Mino Foresi and Domenico di Feo. It incorporates some of the pre-existing southern walls and structures, including the keep of Santo Martini, the San Giovanni tower, and an ancient basilica which now serves as the fortress chapel. Though the town was eventually conquered, the fortress never submitted – an admirable feat considering the size of the Sienese and Florentine forces that besieged Montalcino at varying intervals.
The narrow, short street leads down from the main gate of the fortress to the "Chiesa di Sant'Agostino", with its simple 13th-century Romanesque façade. Adjacent to the church is the former convent, now the Musei Riuniti, both a civic and a diocesan museum, housing among its collections a wooden crucifix by an unknown artist of the Sienese school, two 15th-century wooden sculptures (including a "Madonna" by an anonymous artist), and several terracotta sculptures attributed to the Della Robbia school. The collection also includes a "St Peter and St Paul" by Ambrogio Lorenzetti and a "Virgin and Child" by Simone Martini. There are also modern works from the early 20th century in the museum.
The "Duomo" (cathedral), dedicated to San Salvatore, was built originally in the 14th Century, but now has a 19th-century Neoclassical façade designed by the Sienese architect Agostino Fantasici.
The "Piazza della Principessa Margherita" is down the hill from the fortress and Duomo on the via Matteotti. The principal building on the piazza is the former "Palazzo dei Priori" or "Palazzo Comunale" (built late 13th, early 14th century), now town hall. The palace is adorned with the coats of arms of the Podesta, once rulers of the city. A tall medieval tower is incorporated into the palazzo. Close by is a Renaissance-style building with six round arches, called "La Loggia", for which construction began at the very end of the 14th century and finished in the early 15th, but which has undergone much restoration work over the subsequent centuries.
Montalcino is divided, like most medieval Tuscan cities, into quarters called "contrade": Borghetto, Travaglio, Pianello, and Ruga, each with its own colours, songs, and distinct drum rhythms. Twice a year the contrade meet in a breathtaking archery contest beneath the walls of the Fortezza, conducted in medieval dress, with lords and ladies of each contrada accompanying the proceedings.
The 13th-century church of "San Francesco" in the Castelvecchio "contrada" has undergone several renovations. It contains 16th-century frescoes by Vincenzo Tamagni.
In 2010, the Festa Europea della Musica had its first edition in Montalcino, to promote tourism and the produce, especially wine, of the region. Associated with the Fête de la Musique, created in 1981 in Paris to celebrate music and musicians, the Festa was incorporated into the Italian Ministry of Culture's agenda in 1994 and has since spread across Italy.
A number of people have been awarded the honorary citizenship of Montalcino.
|
https://en.wikipedia.org/wiki?curid=20997
|
List of marine aquarium fish species
The following list of marine aquarium fish species commonly available in the aquarium trade is not completely comprehensive; certain rare specimens may be available commercially yet not be listed here. A brief section on each, with a link to the page about the particular species, is provided, along with references for further information.
These large fish are considered quite hardy, but because of their size may present a significant challenge to the keeper. They need huge aquariums – up to 180 gallons – to house one for its entire lifespan. Two angels might be kept in the same aquarium provided it is large, they are properly acclimated as juveniles, and they have very different colouring and body shape. However, because all Angelfish have essentially the same diet, mixing them is a feat that should be left only to advanced keepers. None are reef safe, and a potential owner should be aware that they need plenty of vegetable matter in their diet. They undergo major changes in colouration while maturing; unless otherwise specified, the descriptions given are for adult specimens.
Although Dwarf Angelfish are smaller and generally more manageable than their larger counterparts, they still have some specific care requirements. They are omnivores, but plenty of vegetable matter, preferably in the form of macroalgae, should be provided for their grazing pleasure. Their suitability for reef tanks is hotly debated, so add one at your own risk. Specimens that have been successfully maintained in reef aquaria include the Flame and Coral Beauty angels. However, for obvious reasons, they should not be put into tanks with expensive decorative macroalgae.
Although Anthias resemble damsels in shape and size, the two should never be confused. Anthias (also known as "fairy basslets") are finicky and many starve to death in captivity. In the wild, they eat zooplankton, and will not accept anything else in the aquarium. They also need to be fed nearly constantly, three times a day at least. The best way to ensure the health and longevity of an Anthias is to attach a refugium where copepods can be grown to "drip" into the display tank. Unlike many other saltwater aquarium inhabitants, they can be kept in groups.
In this exceedingly large group of fish, few are considered proper aquarium inhabitants, for various reasons including diet and size. Basses vary greatly from species to species, so appropriate research should be done before purchasing a specimen. Many unsuspecting hobbyists bring home cute little specimens of popular aquarium fish such as the lyretail grouper, only to realize several months later that they do not have the resources to care for a meter-long fish that may cost hundreds of dollars a month to feed.
Basslets and Assessors are small, long-bodied fish strongly resembling Anthias. Their care requirements, however, are closer to those of damsels. They should be kept individually, and generally not with other fish of similar shape and colour. Feeding is easy: they will generally eat any meaty foods offered. Good water quality should be maintained at all times.
Batfish are gorgeous and striking fish that are not common in aquaria for one major reason: they get huge. A two- or three-hundred-gallon tank is needed for one, minimum, and larger is better. They start out as tiny, manageable-looking cuties, which often fools aquarists into purchasing them for their small aquariums. However, they quickly grow to gargantuan proportions and require large amounts of food as well as space, so beware. They are not reef safe and should be fed plenty of large meaty foods.
Batfish change greatly as they grow, however the potential aquarist is most likely to see them in their juvenile form, so that is the description of the colouration here. They all have generally the same body shape: disk-like with tall dorsal and anal fins, similar to a Freshwater Angelfish.
Blennies are popular aquarium fish, and for good reason. Most of them are peaceful towards other fish, while very aggressive towards other blennies of a similar shape. Some blennies are colorful, and many are downright helpful. For example, the aptly named Lawnmower Blenny will keep your green algae well trimmed and presentable. With the exception of Fang Blennies, Blennies are totally reef safe – in fact, a reef environment is really best for them, because they can be shy and the intricate rockwork of a reef provides ample hiding spaces. They are omnivores and should be fed a varied diet of frozen or live foods and plant matter. Blennies do not have teeth or a functional jaw, so food must be small enough for them to swallow whole.
Blennies are often confused with Gobies, but there is an easy way to tell the difference. Gobies have two distinct dorsal fins, Blennies have a single dorsal fin that runs the length of their body. Also, Gobies' pelvic fins are fused to form a sucker, similar to Remoras.
The engineer goby is a close relative of cichlids and leaf fishes; juveniles can often be found in the aquarium trade, while adults are rare.
Members of the family Tetraodontidae, Boxfish, Blowfish or Pufferfish and their cousins Cowfishes and Porcupinefishes can be very personable and quirky pets, for the prepared.
They are not thought of as an ordinary aquarium tank mate, but are quickly gaining popularity. They do, however, pose a hazard in the community tank: they are capable of releasing a very powerful toxin which can kill other fish and, in some cases, the boxfish itself. They generally only use it when threatened or dying, but can become disturbed easily by aggressive tank mates or an overcrowded aquarium. Generally they are reef safe, though they will pick at invertebrates if not fed well enough.
Many people think puffed-up Pufferfish are cute, but an owner should never subject their pet to this, as they are often unable to expel the air should they be out of the water. To prevent this, never remove a puffer from the water.
When properly cared for, Butterflyfish can make beautiful and distinctive additions to fish-only marine aquariums. Specimens often grow to large sizes and are not well suited to smaller aquariums. Butterflyfish can be fussy, but when fed a varied diet and kept in pristine conditions they will usually thrive. Some species in this family do not do well in captivity, and potential keepers must take care to purchase only those species that have a fighting chance. When selecting Butterflyfish especially, specimens presenting any signs of mishandling are to be avoided.
The following species are relatively hardy and experienced aquarists should have no trouble with them, so long as they are diligent.
One of the few groups of shoaling fish commonly available to marine aquarists, Cardinalfish are nocturnal and tend to be quite shy. They require meaty foods and will often not take prepared foods such as flakes and tablets. For the best chance of success, keep a wide variety of frozen foods on hand; in the event of a hunger strike, they will almost always take adult brine shrimp. In other care requirements they are similar to damsels: not picky. So long as they are properly acclimated, they tolerate a wide range of parameters. A marine aquarist should watch the ammonia/nitrite levels of the environment, as cardinalfish are particularly sensitive to these chemicals.
Chromis are perhaps the ultimate reef fish. Generally peaceful, most species are easy to take care of and quite colorful. Like anthias, they will school, but in many cases this tendency disappears as they age. They are, nevertheless, at least ambivalent with their own species, as well as completely reef safe. Like Damsels and Anemonefish, their close cousins, Chromis are omnivores and will accept most foods offered. A flake staple is usually sufficient, but for best color and health supplement with frozen and live foods when possible.
Clownfish, more technically known as Anemonefish, are the classic aquarium fish. Both hardy and attractive, they are perhaps best known for their symbiotic relationship with Sea Anemones, a relative of coral. In the wild, Anemonefish are always found with a host, leading many potential keepers to believe that an anemone is necessary to keep them. Anemonefish are easy to keep, but their cnidarian counterparts are inordinately finicky and need high light levels; luckily, Anemonefish will thrive without them. Aquarists often find that Anemonefish will host in other things, from corals and Feather Duster Worms to powerheads and other equipment.
Anemonefish care is identical to that of Damselfish, as they are actually very closely related.
All Damselfish can be considered reef-safe, sometimes excluding larger, more aggressive Dascyllus varieties. Some Damselfish will host in anemones like clownfish. Most Damselfish are aggressive and difficult to catch once you put them in an aquarium.
Damselfish change gender as they grow larger and older. Small damselfish are ungendered. Eventually, they become males if no males prevent them from doing so. One or sometimes two males live with a female and guard over the eggs. Females are the largest fish and dominant over the males and juveniles. They will not allow other females into an area they have claimed as their territory without a fight. They may not allow new males or juveniles, either. Aggression increases with each change.
Most should be kept as pairs or small groups where all individuals are added at once.
Dragonets are often mis-categorized as gobies or blennies by fish sellers. They are bottom-dwelling fish that constantly hunt tiny invertebrates for food. Most starve to death in a marine aquarium unless you provide a refugium or place for the invertebrates to reproduce safely without any fish being able to reach them.
Most eels are easily kept in a large aquarium, although several species, such as the blue ribbon eel, should usually be avoided. With any moray eel, care must be taken to secure the lid, as one of the most common causes of death is escape from the tank onto the floor.
Less often kept than their relatives the triggerfish and puffers, there are many filefish that make good aquarium residents, and a few that require specialized diets, making it hard to sustain them in an aquarium.
See Rabbitfish
A type of Anglerfish, Frogfish are ambush predators with huge mouths. They are capable of eating fish up to twice their length so care should be taken in choosing tank mates.
While not as common a choice for aquariums as many other species, they are typically hardy and brightly colored.
They are typically hardy and do not harm invertebrates, which makes them a good choice for a reef tank.
Attractive and relatively small, Hawkfish make excellent additions to fish only or FOWLR aquariums. With extreme caution taken, they could be kept in reef aquariums, but because of their propensity to eat small ornamental shrimps and other mobile invertebrates (usually leaving sessile invertebrates alone) they are not considered reef safe. Lacking a swim bladder, Hawkfish can often be found resting in crevices of rocks or among the branches of corals or gorgonians. Hawkfish are easy to care for and not picky at all about water quality. A varied diet, including spirulina and small meaty foods like Mysis is recommended.
Jawfish are burrowers and require a sandy substrate of sufficient depth.
"Lionfish" specifically refer to the genus "Pterois" within the family Scorpaenidae. They have venomous spines and should be treated with caution. Other species within Scorpaenidae but outside "Pterois" may also have "lionfish" in their common names. Feeder goldfish are not the proper nutrition for a lion fish.
Pipefish are relatives of seahorses and require a similar level of care. They should only be bought by experienced aquarium owners. Captive bred specimens are sometimes available, and are significantly more likely to survive.
Usually only a single specimen can be kept in an aquarium. Sometimes multiple specimens can be kept in larger aquariums, but usually this requires them to be added at the same time or they will be too territorial.
Less commonly kept than some other species, many still make hardy and colorful aquarium residents.
Most rays have a venomous spine near the base of the tail. Care must be taken to avoid this animal when performing tank maintenance and during capture.
Because they are relatively inactive fishes, most species can be kept in smaller aquariums than other equally large fish, and 30-gallon tanks are not unusual. Because they are capable of eating surprisingly large fish, yet are often picked at by fish that eat invertebrates, a species tank is often set up for them. Some will never accept anything but live food; typically these specimens are fed gut-packed guppies, mollies, or ghost shrimp. Similarly to lionfish, care should be taken when handling these fish, as they are also venomous.
It takes a special aquarist to maintain these delicate beauties. A potential keeper must be dedicated and willing to throw artistic creativity to the winds, as what seahorses need is not always beautiful. They require taller tanks, live/frozen food, and many hitching posts, as well as very peaceful tankmates. In fact, beginners would be well advised not to mix seahorses with any other species until they have more experience.
Seahorses found in stores are generally Captive Bred, but occasionally one might find a wild caught (WC) specimen. WC Seahorses should only be purchased by seahorse experts who are going to breed them, as they tend to be finicky and most are endangered in the wild.
One of the advantages of Seahorses is that many species stay small and can (in fact, some "should") be kept in smaller tanks, making them ideal for aquarists who are pressed for space or money.
Seahorses are among the few popular marine aquarium species that can be temperate. Species vary in their temperature requirement, so here an extra category has been added.
TR = Tropical; ST = Sub-Tropical; TM = Temperate
Typically are hardy fish that can be kept with a wide variety of tankmates.
Many sharks will outgrow most home aquariums and/or adapt poorly to captivity. However, numerous coastal and coral reef sharks do well in good aquarium surroundings, although a keeper should have experience with other saltwater fish before attempting sharks, as they are more difficult to care for. A shark setup (preferably an oval-shaped tank for more active species) should provide: plenty of surface area, since wide, long tanks offer good gas exchange, more room for biological filtration, and space for sharks to swim, glide, and turn with little constraint, unlike tall, thin tanks; fine substrate, as coarse substrate can irritate a shark's underside; little décor and rockwork (which should be secure) so as to leave swimming space, although sharks of the orders Orectolobiformes and Heterodontiformes feel more secure in tanks with caves and ledges; excellent filtration, since sharks are messy eaters and need good water conditions; heaters, filter intakes, and other equipment protected by polyurethane foam barriers, as unprotected equipment can be dangerous to active sharks; and a secure canopy, since sharks can jump out of the water. Also required are strong, steady, linear water flow (ten or more times the volume of the aquarium per hour) moving in a gyre circling the aquarium, dissolved oxygen levels of 7–8 ppm (slightly more if ozone is used), low light levels, and no stray electrical currents or metal in the aquarium water. Many sharks feed heavily on invertebrates as well as fish (even ones larger than themselves), and although they do not eat coral, they can knock corals over and rest on them. Many fish and invertebrates can harm or irritate sharks, including Scorpionfish, Butterflyfish, large Angelfish, Filefish, Triggerfish, Pufferfish, Suckerfish (over time), Porcupinefish, certain other sharks, large crabs, Hermit crabs, sea anemones, and stinging corals. Sharks also need iodine, which can be provided through regular water changes or shark supplements (iodine deficiency, and possibly the buildup of nitrates, can result in goiter), and feeding frequency is species-specific. Copper treatments should not be administered to most shark species.
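As a rough worked example of the turnover guideline above (the tenfold figure comes from the text; the 300-gallon tank size is a hypothetical chosen purely for illustration), a 300-gallon shark aquarium would call for pumps delivering at least

    10 × 300 gallons per hour = 3,000 gallons per hour

of combined, steady circulation around the tank.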
Tangs generally feed on algae, though there are a few carnivorous species. Most tangs will not tolerate other fish of the same color and/or shape. They have a spine on their tails that can cut open other fish and unprotected hands. All tangs should be given plenty of swimming room; try to have at least a 4-foot tank. Contrary to popular belief, they will tolerate smaller (4- to 5-foot) tanks just fine, but tend to live better in larger tanks, over 5 feet.
Though often categorized as gobies, tilefish are a separate family.
While they are generally considered monsters that will chomp invertebrates, a few species can make great reef fish. More aggressive species, such as the undulated trigger and clown trigger, will sometimes be so aggressive that it is necessary to keep them as the sole inhabitants of the aquarium. All will require large tanks with good filtration.
A diverse group of fish with an equally wide range of characteristics. Some wrasse species are aggressive towards small fish and invertebrates, others are reef safe. Some are quite hardy, some typically die within weeks.
|
https://en.wikipedia.org/wiki?curid=20999
|
Magazine
A magazine is a periodical publication which is printed in gloss-coated and matte paper or electronically published (sometimes referred to as an online magazine). Magazines are generally published on a regular schedule and contain a variety of content. They are generally financed by advertising, by a purchase price, by prepaid subscriptions, or a combination of the three.
By definition, a "magazine" paginates with each issue starting at page three. However, in the technical sense a "journal" has continuous pagination throughout a volume. Thus "Business Week", which starts each issue anew with page one, is a magazine, but the "Journal of Business Communication", which continues the same sequence of pagination throughout the coterminous year, is a journal. Some professional or trade publications are also peer-reviewed, for example the "Journal of Accountancy". Non-peer-reviewed academic or professional publications are generally "professional magazines". That a publication calls itself a "journal" does not make it a journal in the technical sense; "The Wall Street Journal" is actually a newspaper.
The word derives from Middle French "magasin" ("warehouse, depot, store"), from Italian "magazzino", from Arabic "makhāzin", plural of "makhzan" ("storehouse"). At its root, the word "magazine" refers to a collection or storage location. In the case of written publication, it is a collection of written articles. This explains why magazine publications share a word root with gunpowder magazines, artillery magazines, firearms magazines, and, in French and Russian (adopted from French as "магазин"), retail stores such as department stores.
Magazines can be distributed through the mail, through sales by newsstands, bookstores, or other vendors, or through free distribution at selected pick-up locations. The subscription business models for distribution fall into three main categories:
In this model, the magazine is sold to readers for a price, either on a per-issue basis or by subscription, where an annual fee or monthly price is paid and issues are sent by post to readers. Paid circulation allows for defined readership statistics.
This means that there is no cover price and issues are given away, for example in street dispensers or on airlines, or included with other products or publications. Because this model involves giving issues away to unspecified populations, the statistics only entail the number of issues distributed, and not who reads them.
This is the model used by many trade magazines (industry-based periodicals) distributed only to qualifying readers, often for free and determined by some form of survey. Because of costs (e.g., printing and postage) associated with the medium of print, publishers may not distribute free copies to everyone who requests one (unqualified leads); instead, they operate under controlled circulation, deciding who may receive free subscriptions based on each person's qualification as a member of the trade (and likelihood of buying, for example, likelihood of having corporate purchasing authority, as determined from job title). This allows a high level of certainty that advertisements will be received by the advertiser's target audience, and it avoids wasted printing and distribution expenses. This latter model was widely used before the rise of the World Wide Web and is still employed by some titles. For example, in the United Kingdom, a number of computer-industry magazines use this model, including "Computer Weekly" and "Computing", and in finance, "Waters Magazine". For the global media industry, an example would be "VideoAge International."
The earliest example of a magazine was "Erbauliche Monaths Unterredungen", a literary and philosophy magazine launched in 1663 in Germany. "The Gentleman's Magazine", first published in 1731 in London, was the first general-interest magazine. Edward Cave, who edited "The Gentleman's Magazine" under the pen name "Sylvanus Urban", was the first to use the term "magazine", on the analogy of a military storehouse. Founded by Herbert Ingram in 1842, "The Illustrated London News" was the first illustrated magazine.
The oldest consumer magazine still in print is "The Scots Magazine", which was first published in 1739, though multiple changes in ownership and gaps in publication totalling over 90 years weaken that claim. "Lloyd's List" was founded in Edward Lloyd's coffee shop in England in 1734, and though its online platform is still updated daily, it has not been published as a magazine since 2013, after 279 years.
Under the ancien régime, the most prominent magazines were "Mercure de France", "Journal des sçavans", founded in 1665 for scientists, and "Gazette de France", founded in 1631. Jean Loret was one of France's first journalists. He disseminated the weekly news of music, dance, and Parisian society from 1650 until 1665 in verse, in what he called a "gazette burlesque", assembled in three volumes of "La Muse historique" (1650, 1660, 1665). The French press lagged a generation behind the British, for they catered to the needs of the aristocracy, while the newer British counterparts were oriented toward the middle and working classes.
Periodicals were censored by the central government in Paris. They were not totally quiescent politically—often they criticized Church abuses and bureaucratic ineptitude. They supported the monarchy and they played at most a small role in stimulating the revolution. During the Revolution, new periodicals played central roles as propaganda organs for various factions. Jean-Paul Marat (1743–1793) was the most prominent editor. His "L'Ami du peuple" advocated vigorously for the rights of the lower classes against the enemies of the people Marat hated; it closed when he was assassinated. After 1800 Napoleon reimposed strict censorship.
Magazines flourished after Napoleon left in 1815. Most were based in Paris and most emphasized literature, poetry and stories. They served religious, cultural and political communities. In times of political crisis they expressed and helped shape the views of their readership and thereby were major elements in the changing political culture. For example, there were eight Catholic periodicals in 1830 in Paris. None were officially owned or sponsored by the Church and they reflected a range of opinion among educated Catholics about current issues, such as the 1830 July Revolution that overthrew the Bourbon monarchy. Several were strong supporters of the Bourbon kings, but all eight ultimately urged support for the new government, putting their appeals in terms of preserving civil order. They often discussed the relationship between church and state. Generally, they urged priests to focus on spiritual matters and not engage in politics. Historian M. Patricia Dougherty says this process created a distance between the Church and the new monarch and enabled Catholics to develop a new understanding of church-state relationships and the source of political authority.
The "Moniteur Ottoman" was a gazette written in French and first published in 1831 on the order of Mahmud II. It was the first official gazette of the Ottoman Empire, edited by Alexandre Blacque at the expense of the Sublime Porte. Its name perhaps referred to the French newspaper "Le Moniteur Universel". It was issued weekly. "Takvim-i vekayi" was published a few months later, intended as a translation of the "Moniteur" into Ottoman Turkish. After having been edited by former Consul for Denmark ""M. Franceschi"", and later on by ""Hassuna de Ghiez"", it was lastly edited by Lucien Rouet. However, facing the hostility of embassies, it was closed in the 1840s.
Satirical magazines of Turkey have a long tradition, with the first magazine ("Diyojen") published in 1869. There are currently around 20 satirical magazines; the leading ones are "Penguen" (70,000 weekly circulation), "LeMan" (50,000) and "Uykusuz". Historical examples include Oğuz Aral's magazine "Gırgır" (which reached a circulation of 500,000 in the 1970s) and "Marko Paşa" (launched 1946). Others include "L-Manyak" and "Lombak".
In the mid-1800s, monthly magazines gained popularity. They were general-interest at first, containing some news, vignettes, poems, history, political events, and social discussion. Unlike newspapers, they served as a monthly record of current events along with entertaining stories, poems, and pictures. The first periodicals to branch out from news were "Harper's" and "The Atlantic", which focused on fostering the arts. Both "Harper's" and "The Atlantic" persist to this day, with "Harper's" being a cultural magazine and "The Atlantic" focusing mainly on world events. Early issues of "Harper's" carried famous works such as early instalments of "Moby-Dick" and reports of famous events such as the laying of the world's first transatlantic telegraph cable; however, the majority of early content trickled down from British sources.
The development of the magazines stimulated an increase in literary criticism and political debate, moving towards more opinionated pieces from the objective newspapers. The increased time between prints and the greater amount of space to write provided a forum for public arguments by scholars and critical observers.
The early periodical predecessors to magazines started to evolve to modern definition in the late 1800s. Works slowly became more specialized and the general discussion or cultural periodicals were forced to adapt to a consumer market which yearned for more localization of issues and events.
Mass circulation magazines became much more common after 1900, some with circulations in the hundreds of thousands of subscribers. Some passed the million mark in the 1920s. It was an age of mass media. Because of the rapid expansion of national advertising, the cover price fell sharply to about 10 cents. One contributing factor was the heavy coverage of corruption in politics, local government and big business by the "muckrakers": journalists who wrote for popular magazines to expose social and political sins and shortcomings, relying on their own investigative reporting. Muckraking magazines, notably "McClure's", took on corporate monopolies and crooked political machines while raising public awareness of chronic urban poverty, unsafe working conditions, and social issues like child labor.
The journalists who specialized in exposing waste, corruption, and scandal operated at the state and local level, like Ray Stannard Baker, George Creel, and Brand Whitlock. Others, like Lincoln Steffens, exposed political corruption in many large cities; Ida Tarbell went after John D. Rockefeller's Standard Oil Company. Samuel Hopkins Adams in 1905 showed the fraud involved in many patent medicines; Upton Sinclair's 1906 novel "The Jungle" gave a horrid portrayal of how meat was packed; and, also in 1906, David Graham Phillips unleashed a blistering indictment of the U.S. Senate. Roosevelt gave these journalists their nickname when he complained they were not being helpful by raking up all the muck.
In 2011, 152 magazines ceased operations. Between 2008 and 2015, Oxbridge Communications tracked magazine launches and closures; for 2012, it reported 227 launches and 82 closures in North America. Furthermore, according to MediaFinder.com, 93 new magazines launched in the first six months of 2014 and just 30 closed. The category which produced the most new publications was "regional interest", in which six new magazines were launched, including "12th & Broad" and "Craft Beer & Brewing". However, two magazines had to change their print schedules. Johnson Publishing's "Jet" stopped printing regular issues in its transition to a digital format, though it still publishes an annual print edition. "Ladies' Home Journal" stopped its monthly schedule and home delivery for subscribers to become a quarterly, newsstand-only, special-interest publication.
According to statistics from the end of 2013, subscription levels for 22 of the top 25 magazines declined from 2012 to 2013, with just "Time", "Glamour" and "ESPN The Magazine" gaining numbers.
Immortalized in movies and magazines, young women's fashions of the 1920s set both a trend and a social statement, a breaking-off from the rigid Victorian way of life. Their glamorous lifestyle was celebrated in the feature pages and in the advertisements, where they learned the brands that best exemplified the look they sought. These young, rebellious, middle-class women, labeled "flappers" by older generations, did away with the corset and donned slinky knee-length dresses, which exposed their legs and arms. The hairstyle of the decade was a chin-length bob, which had several popular variations. Cosmetics, which, until the 1920s, were not typically accepted in American society because of their association with prostitution, became, for the first time, extremely popular.
In the 1920s new magazines appealed to young German women with a sensuous image and advertisements for the appropriate clothes and accessories they would want to purchase. The glossy pages of "Die Dame" and "Das Blatt der Hausfrau" displayed the "Neue Frauen," "New Girl" – what Americans called the flapper. She was young and fashionable, financially independent, and was an eager consumer of the latest fashions. The magazines kept her up to date on fashion, arts, sports, and modern technology such as automobiles and telephones.
Religious groups have used magazines for spreading and communicating religious doctrine for over 100 years. "The Friend" was founded in Philadelphia in 1827 at the time of a major Quaker schism; it has been continually published and was renamed "Friends Journal" when the rival Quaker groups formally reconciled in the mid-1950s. Several Catholic magazines launched at the turn of the 20th century still remain in circulation, including: "St. Anthony Messenger", founded in 1893 and published by the Franciscan Friars (O.F.M.) of St. John the Baptist Province, Cincinnati, Ohio; the Los Angeles-based "Tidings", founded in 1895 (renamed "Angelus" in 2016) and published jointly by The Tidings Corporation and the Roman Catholic Archdiocese of Los Angeles; and "Maryknoll", founded in 1907 by the Foreign Mission Society of America, which brings news about the organization's charitable and missionary work in over 100 countries. There are over 100 Catholic magazines published in the United States, and thousands globally, ranging in scope from inspirational messages and the concerns of specific religious orders to faithful family life and global issues facing the worldwide Church. "The Watchtower" was started by Charles Taze Russell in July 1879 under the title "Zion's Watch Tower and Herald of Christ's Presence". "The Watchtower—Public Edition" is one of the most widely circulated magazines in the world, with an average printing of approximately 62 million copies every two months in 200 languages.
|
https://en.wikipedia.org/wiki?curid=21001
|
Multivibrator
A multivibrator is an electronic circuit used to implement a variety of simple two-state devices such as relaxation oscillators, timers and flip-flops. It consists of two amplifying devices (transistors, vacuum tubes or other devices) cross-coupled by resistors or capacitors. The first multivibrator circuit, the astable multivibrator oscillator, was invented by Henri Abraham and Eugene Bloch during World War I. They called their circuit a "multivibrator" because its output waveform was rich in harmonics.
The three types of multivibrator circuits are:
1. Astable multivibrator, in which the circuit is not stable in either state: it continually switches from one state to the other.
2. Monostable multivibrator, in which one of the states is stable but the other is unstable (transient); a trigger pulse drives the circuit into the unstable state, and after a set time it returns to the stable state.
3. Bistable multivibrator, in which the circuit is stable in either state; it can be flipped from one state to the other by an external trigger pulse.
Multivibrators find applications in a variety of systems where square waves or timed intervals are required. For example, before the advent of low-cost integrated circuits, chains of multivibrators found use as frequency dividers. A free-running multivibrator with a frequency of one-half to one-tenth of the reference frequency would accurately lock to the reference frequency. This technique was used in early electronic organs, to keep notes of different octaves accurately in tune. Other applications included early television systems, where the various line and frame frequencies were kept synchronized by pulses included in the video signal.
The first multivibrator circuit, the classic astable multivibrator oscillator (also called a "plate-coupled multivibrator") was first described by Henri Abraham and Eugene Bloch in "Publication 27" of the French "Ministère de la Guerre", and in "Annales de Physique 12, 252 (1919)". Since it produced a square wave, in contrast to the sine wave generated by most other oscillator circuits of the time, its output contained many harmonics above the fundamental frequency, which could be used for calibrating high frequency radio circuits. For this reason Abraham and Bloch called it a "multivibrateur". It is a predecessor of the Eccles-Jordan trigger which was derived from the circuit a year later.
Historically, the terminology of multivibrators has been somewhat variable:
An astable multivibrator consists of two amplifying stages connected in a positive feedback loop by two capacitive-resistive coupling networks. The amplifying elements may be junction or field-effect transistors, vacuum tubes, operational amplifiers, or other types of amplifier. Figure 1, below right, shows bipolar junction transistors.
The circuit is usually drawn in a symmetric form as a cross-coupled pair. The two output terminals can be defined at the active devices and have complementary states. One has high voltage while the other has low voltage, except during the brief transitions from one state to the other.
The circuit has two astable (unstable) states that alternate with maximum transition rate because of the "accelerating" positive feedback. It is implemented by the coupling capacitors that instantly transfer voltage changes, because the voltage across a capacitor cannot suddenly change. In each state, one transistor is switched on and the other is switched off. Accordingly, one fully charged capacitor discharges (reverse charges) slowly, thus converting the time into an exponentially changing voltage. At the same time, the other empty capacitor quickly charges, thus restoring its charge (the first capacitor acts as a time-setting capacitor and the second prepares to play this role in the next state). The circuit operation is based on the fact that the forward-biased base-emitter junction of the switched-on bipolar transistor can provide a path for the capacitor restoration.
State 1 (Q1 is switched on, Q2 is switched off)
In the beginning, the capacitor C1 is fully charged (in the previous State 2) to the power supply voltage "V" with the polarity shown in Figure 1. Q1 is "on" and connects the left-hand positive plate of C1 to ground. As its right-hand negative plate is connected to Q2 base, a maximum negative voltage (-"V") is applied to Q2 base that keeps Q2 firmly "off". C1 begins discharging (reverse charging) via the high-value base resistor R2, so that the voltage of its right-hand plate (and at the base of Q2) is rising from below ground (-"V") toward +"V". As Q2 base-emitter junction is reverse-biased, it does not conduct, so all the current from R2 goes into C1. Simultaneously, C2 that is fully discharged and even slightly charged to 0.6 V (in the previous State 2) quickly charges via the low-value collector resistor R4 and Q1 forward-biased base-emitter junction (because R4 is less than R2, C2 charges faster than C1). Thus C2 restores its charge and prepares for the next State 2, when it will act as a time-setting capacitor. Q1 is firmly saturated in the beginning by the "forcing" C2 charging current added to R3 current. In the end, only R3 provides the needed input base current. The resistance R3 is chosen small enough to keep Q1 (not deeply) saturated after C2 is fully charged.
When the voltage of C1 right-hand plate (Q2 base voltage) becomes positive and reaches 0.6 V, Q2 base-emitter junction begins diverting a part of R2 charging current. Q2 begins conducting and this starts the avalanche-like positive feedback process as follows. Q2 collector voltage begins falling; this change transfers through the fully charged C2 to Q1 base and Q1 begins cutting off. Its collector voltage begins rising; this change transfers back through the almost empty C1 to Q2 base and makes Q2 conduct more thus sustaining the initial input impact on Q2 base. Thus the initial input change circulates along the feedback loop and grows in an avalanche-like manner until finally Q1 switches off and Q2 switches on. The forward-biased Q2 base-emitter junction fixes the voltage of C1 right-hand plate at 0.6 V and does not allow it to continue rising toward +"V".
State 2 (Q1 is switched off, Q2 is switched on)
Now, the capacitor C2 is fully charged (in the previous State 1) to the power supply voltage "V" with the polarity shown in Figure 1. Q2 is "on" and connects the right-hand positive plate of C2 to ground. As its left-hand negative plate is connected to Q1 base, a maximum negative voltage (-"V") is applied to Q1 base that keeps Q1 firmly "off". C2 begins discharging (reverse charging) via the high-value base resistor R3, so that the voltage of its left-hand plate (and at the base of Q1) is rising from below ground (-"V") toward +"V". Simultaneously, C1 that is fully discharged and even slightly charged to 0.6 V (in the previous State 1) quickly charges via the low-value collector resistor R1 and Q2 forward-biased base-emitter junction (because R1 is less than R3, C1 charges faster than C2). Thus C1 restores its charge and prepares for the next State 1 when it will act again as a time-setting capacitor...and so on... (the next explanations are a mirror copy of the second part of State 1).
The duration of state 1 (low output) will be related to the time constant "R"2"C"1 as it depends on the charging of C1, and the duration of state 2 (high output) will be related to the time constant "R"3"C"2 as it depends on the charging of C2. Because they do not need to be the same, an asymmetric duty cycle is easily achieved.
The voltage on a capacitor with non-zero initial charge, charging through a resistor toward a final voltage, is:
"V"cap("t") = "V"final + ("V"initial − "V"final)"e"^(−"t"/"RC")
Looking at C2, just before Q2 turns on, the left terminal of C2 is at the base-emitter voltage of Q1 ("V"BE_Q1) and the right terminal is at "V"CC ("V"CC is used here instead of +"V" to ease notation). The voltage across C2 is "V"CC minus "V"BE_Q1. The moment after Q2 turns on, the right terminal of C2 is now at 0 V, which drives the left terminal of C2 to 0 V minus ("V"CC − "V"BE_Q1), or "V"BE_Q1 − "V"CC. From this instant in time, the left terminal of C2 must be charged back up to "V"BE_Q1. How long this takes is half our multivibrator switching time (the other half comes from C1). In the charging capacitor equation above, substituting:
results in:
"V"BE_Q1 = "V"CC + ("V"BE_Q1 − 2"V"CC)"e"^(−"t"/"RC")
Solving for "t" results in:
"t" = "RC" ln((2"V"CC − "V"BE_Q1)/("V"CC − "V"BE_Q1))
For this circuit to work, "V"CC ≫ "V"BE_Q1 (for example: "V"CC = 5 V, "V"BE_Q1 = 0.6 V), therefore the equation can be simplified to:
The period of each "half" of the multivibrator is therefore given by
"t" = ln(2)"RC".
The total period of oscillation is given by:
"T" = "t"1 + "t"2 = ln(2)"R"2 "C"1 + ln(2)"R"3 "C"2
and thus the frequency of oscillation is:
"f" = 1/"T" = 1/(ln(2)·("R"2"C"1 + "R"3"C"2))
where "t"1 = ln(2)"R"2"C"1 is the duration of state 1 and "t"2 = ln(2)"R"3"C"2 is the duration of state 2. For the special case of a symmetric circuit, where "R"2 = "R"3 = "R" and "C"1 = "C"2 = "C":
"f" = 1/(2 ln(2)"RC") ≈ 0.72/("RC")
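To make the timing arithmetic concrete, the following minimal C sketch evaluates these formulas; the component values in main() are illustrative assumptions, not values taken from the text (build with the math library, e.g. cc astable.c -lm).

#include <math.h>
#include <stdio.h>

/* Period of the two-transistor astable multivibrator,
   T = ln(2)*(R2*C1 + R3*C2), as derived above. */
static double astable_period(double r2, double c1, double r3, double c2)
{
    return log(2.0) * (r2 * c1 + r3 * c2);
}

int main(void)
{
    /* Assumed example values: R2 = R3 = 10 kilohm, C1 = C2 = 100 nF. */
    double t = astable_period(10e3, 100e-9, 10e3, 100e-9);
    printf("T = %.3f ms, f = %.1f Hz\n", t * 1e3, 1.0 / t);
    return 0;   /* prints T = 1.386 ms, f = 721.3 Hz */
}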
The output voltage has a shape that approximates a square waveform. It is considered below for the transistor Q1.
During State 1, Q2 base-emitter junction is reverse-biased and capacitor C1 is "unhooked" from ground. The output voltage of the switched-on transistor Q1 changes rapidly from high to low since this low-resistive output is loaded by a high impedance load (the series connected capacitor C1 and the high-resistive base resistor R2).
During State 2, Q2 base-emitter junction is forward-biased and capacitor C1 is "hooked" to ground. The output voltage of the switched-off transistor Q1 changes exponentially from low to high since this relatively high resistive output is loaded by a low impedance load (capacitor C1). This is the output voltage of R1C1 integrating circuit.
To approach the needed square waveform, the collector resistors have to be low in resistance. The base resistors have to be low enough to make the transistors saturate at the end of the restoration ("R"B < β·"R"C).
When the circuit is first powered up, neither transistor will be switched on. However, this means that at this stage they will both have high base voltages and therefore a tendency to switch on, and inevitable slight asymmetries will mean that one of the transistors is first to switch on. This will quickly put the circuit into one of the above states, and oscillation will ensue. In practice, oscillation always occurs for practical values of "R" and "C".
However, if the circuit is temporarily held with both bases high, for longer than it takes for both capacitors to charge fully, then the circuit will remain in this stable state, with both bases at 0.60 V, both collectors at 0 V, and both capacitors charged backwards to −0.60 V. This can occur at startup without external intervention, if "R" and "C" are both very small.
An astable multivibrator can be synchronized to an external chain of pulses. A single pair of active devices can be used to divide a reference by a large ratio, however, the stability of the technique is poor owing to the variability of the power supply and the circuit elements. A division ratio of 10, for example, is easy to obtain but not dependable. Chains of bistable flip-flops provide more predictable division, at the cost of more active elements.
While not fundamental to circuit operation, diodes connected in series with the base or emitter of the transistors are needed to prevent the base-emitter junction from being driven into reverse breakdown when the supply voltage exceeds the "V"eb breakdown voltage, typically around 5–10 volts for general-purpose silicon transistors. In the monostable configuration, only one of the transistors requires protection.
Assume all the capacitors to be discharged at first. The output of the op-amp "V"o at node c is +"V"sat initially. At node a, a voltage of +β"V"sat is formed due to voltage division, where β is the fraction of the output fed back to the non-inverting input by the resistive divider. The current that flows from nodes c and b to ground charges the capacitor C towards +"V"sat. During this charging period, the voltage at b becomes greater than +β"V"sat at some point. The voltage at the inverting terminal is then greater than the voltage at the non-inverting terminal of the op-amp. This is a comparator circuit and hence the output becomes −"V"sat. The voltage at node a becomes −β"V"sat due to voltage division. Now the capacitor discharges towards −"V"sat. At some point, the voltage at b becomes less than −β"V"sat. The voltage at the non-inverting terminal is then greater than the voltage at the inverting terminal of the op-amp, so the output of the op-amp is +"V"sat. This repeats and forms a free-running oscillator, or astable multivibrator.
If "V"C is the voltage across the capacitor, and the waveforms at the capacitor and at the output have the same time period, that period can be calculated as follows. The capacitor charges from −β"V"sat toward +"V"sat, so
"V"C("t") = "V"sat(1 − (1 + β)"e"^(−"t"/"RC"))
At "t" = "T"1, "V"C reaches the switching threshold +β"V"sat:
β"V"sat = "V"sat(1 − (1 + β)"e"^(−"T"1/"RC"))
Upon solving, we get:
"T"1 = "RC" ln((1 + β)/(1 − β))
We are taking values of R, C and β such that we get a symmetrical square wave. Thus, we get "T1" = "T2" and total time period "T" = "T1" + "T2". So, the time period of the square wave generated at the output is:
"T" = 2"RC" ln((1 + β)/(1 − β))
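As a quick numerical check of this result, a small C helper (under the same idealizing assumptions as the derivation: ideal op-amp, symmetric saturation voltages) evaluates the period:

#include <math.h>

/* Period of the op-amp astable multivibrator,
   T = 2*R*C*ln((1+beta)/(1-beta)), where beta is the feedback
   divider ratio (0 < beta < 1). For example, R = 10 kilohm,
   C = 100 nF and beta = 0.5 give T = 2e-3 * ln(3), about 2.2 ms. */
double opamp_astable_period(double r, double c, double beta)
{
    return 2.0 * r * c * log((1.0 + beta) / (1.0 - beta));
}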
In the monostable multivibrator, one resistive-capacitive network (C2-R3 in Figure 1) is replaced by a resistive network (just a resistor). The circuit can be thought as a 1/2 astable multivibrator. Q2 collector voltage is the output of the circuit (in contrast to the astable circuit, it has a perfect square waveform since the output is not loaded by the capacitor).
When triggered by an input pulse, a monostable multivibrator will switch to its unstable state for a period of time and then return to its stable state. The time the monostable multivibrator remains in the unstable state is given by "t" = ln(2)"R"2"C"1. If repeated application of the input pulse maintains the circuit in the unstable state, it is called a "retriggerable" monostable. If further trigger pulses do not affect the period, the circuit is a "non-retriggerable" multivibrator.
For the circuit in Figure 2, in the stable state Q1 is turned off and Q2 is turned on. It is triggered by zero or negative input signal applied to Q2 base (with the same success it can be triggered by applying a positive input signal through a resistor to Q1 base). As a result, the circuit goes in State 1 described above. After elapsing the time, it returns to its stable initial state.
The circuit is useful for generating a single output pulse of adjustable time duration in response to a triggering signal. The width of the output pulse depends only on external components connected to the op-amp. A diode D1 clamps the capacitor voltage to 0.7 V when the output is at +"V"sat. Let us assume that in the stable state the output "V"o = +"V"sat. The diode D1 clamps the capacitor to 0.7 V. The voltage at the non-inverting terminal through the potential divider will be +β"V"sat. Now a negative trigger of magnitude V1 is applied to the non-inverting terminal so that the effective signal at this terminal is less than 0.7 V. Then the output voltage switches from +"V"sat to −"V"sat. The diode is now reverse biased and the capacitor starts charging exponentially towards −"V"sat through R. The voltage at the non-inverting terminal through the potential divider will be −β"V"sat. After some time the capacitor charges to a voltage more negative than −β"V"sat. The voltage on the non-inverting input is now greater than on the inverting input and the output of the op-amp switches again to +"V"sat. The capacitor discharges through resistor R and charges again to 0.7 V.
The pulse width T of a monostable multivibrator is calculated as follows:
The general solution for a low-pass RC circuit is
"V"o("t") = "V"f + ("V"i − "V"f)"e"^(−"t"/"RC")
where "V"f = −"V"sat is the final (asymptotic) voltage and "V"i = "V"D, the diode forward voltage, is the initial voltage. Therefore,
"V"o("t") = −"V"sat + ("V"D + "V"sat)"e"^(−"t"/"RC")
at "t" = "T", "V"o("T") = −β"V"sat, so
−β"V"sat = −"V"sat + ("V"D + "V"sat)"e"^(−"T"/"RC")
after simplification,
"T" = "RC" ln((1 + "V"D/"V"sat)/(1 − β))
where β is the voltage-divider ratio of the feedback network. If "V"sat ≫ "V"D and the divider resistors are equal so that β = 0.5, then
"T" = "RC" ln 2 ≈ 0.69"RC"
In the bistable multivibrator, both resistive-capacitive networks (C1-R2 and C2-R3 in Figure 1) are replaced by resistive networks (just resistors or direct coupling).
This latch circuit is similar to an astable multivibrator, except that there is no charge or discharge time, due to the absence of capacitors. Hence, when the circuit is switched on, if Q1 is on, its collector is at 0 V. As a result, Q2 gets switched off. This results in more than half of +"V" being applied to R4, causing current into the base of Q1 and keeping it on. Thus, the circuit remains stable in a single state continuously. Similarly, Q2 remains on continuously if it happens to get switched on first.
Switching of state can be done via Set and Reset terminals connected to the bases. For example, if Q2 is on and Set is grounded momentarily, this switches Q2 off, and makes Q1 on. Thus, Set is used to "set" Q1 on, and Reset is used to "reset" it to off state.
|
https://en.wikipedia.org/wiki?curid=21008
|
Marsh gas
Marsh gas, also known as swamp gas or bog gas, is a mixture of methane, hydrogen sulfide, and carbon dioxide, produced naturally within some geographical marshes, swamps, and bogs.
The surface of marshes, swamps, and bogs is initially porous vegetation that rots to form a crust preventing oxygen from reaching the organic material trapped below. That condition allows anaerobic digestion and fermentation of any plant or animal material, which incidentally also produces methane.
In some cases there is sufficient heat, fuel, and oxygen to allow spontaneous combustion and underground fires to smolder for some considerable time, as has occurred at a natural reserve in Spain. Such fires can cause surface subsidence, presenting an unpredictable physical hazard as well as environmental changes or damage to the local environment and the ecosystem it supports.
|
https://en.wikipedia.org/wiki?curid=21009
|
Merseburg
Merseburg () is a town in the south of the German state of Saxony-Anhalt on the river Saale, approx. 14 km south of Halle (Saale) and 30 km west of Leipzig. It is the capital of the Saalekreis district. It had a diocese founded by Archbishop Adalbert of Magdeburg.
The University of Merseburg is located within the town. Merseburg has around 33,000 inhabitants. Merseburg is part of the Central German Metropolitan Region.
Venenien was incorporated into Merseburg on 1 January 1949. The parish of Kötzschen followed on 1 July 1950. Meuschau has been part of Merseburg since 30 May 1994; Trebnitz followed later. Beuna was annexed on 1 January 2009, and Geusa has been part of Merseburg since 1 January 2010.
Merseburg was first mentioned in 850. King Henry the Fowler built a royal palace at Merseburg; in the 933 Battle of Riade, he gained his great victory over the Hungarians in the vicinity.
Thietmar, appointed in 973, became the first bishop of the newly created bishopric of Prague in Bohemia. Prague had been part of the archbishopric of Mainz for a hundred years before that. From 968 until the Protestant Reformation, Merseburg was the seat of the Bishop of Merseburg, and in addition to being for a time the residence of the margraves of Meissen, it was a favorite residence of the German kings during the 10th, 11th and 12th centuries. Fifteen diets were held here during the Middle Ages, during which time its fairs enjoyed the importance which was afterwards transferred to those of Leipzig. Merseburg was the site of a failed assassination attempt on Polish ruler Bolesław I Chrobry in 1002. The town suffered severely during the German Peasants' War and also during the Thirty Years' War.
From 1657 to 1738 Merseburg was the residence of the Dukes of Saxe-Merseburg, after which it fell to the Electorate of Saxony. In 1815 following the Napoleonic Wars, the town became part of the Prussian Province of Saxony.
Merseburg is where the Merseburg Incantations were rediscovered in 1841. Written down in Old High German, they are hitherto the only preserved German documents with a heathen theme. One of them is a charm to release warriors caught during battle, and the other is a charm to heal a horse's sprained foot.
At the beginning of the 20th century, Merseburg was transformed into an industrial town, largely due to the pioneering work done by Carl Bosch and Friedrich Bergius, who laid down the scientific fundamentals of catalytic high-pressure ammonia synthesis from 1909 to 1913. Industry followed this pioneering work: ultimately, the Leuna works emerged at the nearby town of Leuna, and it continues to operate in the 21st century as a chemical production park serving multiple international chemical companies.
Merseburg was badly damaged in World War II. In 23 air raids 6,200 dwellings were completely or partly destroyed. The historic town centre was almost completely destroyed.
Briefly part of Saxony-Anhalt after the war, it was then administered within the "Bezirk" Halle in East Germany. It became part of Saxony-Anhalt again after reunification of Germany.
Like many towns in the former East Germany, Merseburg has had a general decline in population since German Reunification despite annexing and merging with a number of smaller nearby villages.
Population of Merseburg "(from 1960, population on 31 December, unless otherwise indicated)":
Data source from 1990: Statistical Office of Saxony Anhalt
Among the notable buildings of Merseburg are the Merseburg Cathedral of St John the Baptist (founded 1015, rebuilt in the 13th and 16th centuries) and the episcopal palace (15th century). The cathedral-and-palace ensemble also features a palace garden ("Schlossgarten").
Other attractions include the Merseburg House of Trades with a cultural stage and the German Museum of Chemistry, Merseburg.
The Merseburg Palace Festival with the Historical Pageant, the International Palace-Moat Concerts, Merseburg Organ Days and the Puppet Show Festival Week are events celebrated every year.
Merseburg station is located on the Halle–Bebra railway. Leipzig/Halle Airport is just 25 kilometers away.
Merseburg is connected with the Halle (Saale) tramway network. A tram ride from Halle's city centre to Merseburg takes about 50 minutes.
Merseburg is twinned with:
|
https://en.wikipedia.org/wiki?curid=21012
|
Microcontroller
A microcontroller (MCU for "microcontroller unit") is a small computer on a single metal-oxide-semiconductor (MOS) integrated circuit (IC) chip. In modern terminology, it is similar to, but less sophisticated than, a system on a chip (SoC); a SoC may include a microcontroller as one of its components. A microcontroller contains one or more CPUs (processor cores) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash or OTP ROM is also often included on chip, as well as a small amount of RAM. Microcontrollers are designed for embedded applications, in contrast to the microprocessors used in personal computers or other general purpose applications consisting of various discrete chips.
Microcontrollers are used in automatically controlled products and devices, such as automobile engine control systems, implantable medical devices, remote controls, office machines, appliances, power tools, toys and other embedded systems. By reducing the size and cost compared to a design that uses a separate microprocessor, memory, and input/output devices, microcontrollers make it economical to digitally control even more devices and processes. Mixed signal microcontrollers are common, integrating analog components needed to control non-digital electronic systems. In the context of the internet of things, microcontrollers are an economical and popular means of data collection, sensing and actuating the physical world as edge devices.
Some microcontrollers may use four-bit words and operate at very low clock frequencies for low power consumption (single-digit milliwatts or microwatts). They generally have the ability to retain functionality while waiting for an event such as a button press or other interrupt; power consumption while sleeping (CPU clock and most peripherals off) may be just nanowatts, making many of them well suited for long lasting battery applications. Other microcontrollers may serve performance-critical roles, where they may need to act more like a digital signal processor (DSP), with higher clock speeds and power consumption.
The origins of both the microprocessor and the microcontroller can be traced back to the invention of the MOSFET (metal-oxide-semiconductor field-effect transistor), also known as the MOS transistor. It was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959, and first demonstrated in 1960. The same year, Atalla proposed the concept of the MOS integrated circuit, which was an integrated circuit chip fabricated from MOSFETs. By 1964, MOS chips had reached higher transistor density and lower manufacturing costs than bipolar chips. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip.
The first multi-chip microprocessors, the Four-Phase Systems AL1 in 1969 and the Garrett AiResearch MP944 in 1970, were developed with multiple MOS LSI chips. The first single-chip microprocessor was the Intel 4004, released on a single MOS LSI chip in 1971. It was developed by Federico Faggin, using his silicon-gate MOS technology, along with Intel engineers Marcian Hoff and Stan Mazor, and Busicom engineer Masatoshi Shima. It was followed by the 4-bit Intel 4040, the 8-bit Intel 8008, and the 8-bit Intel 8080. All of these processors required several external chips to implement a working system, including memory and peripheral interface chips. As a result, the total system cost was several hundred (1970s US) dollars, making it impossible to economically computerize small appliances. MOS Technology introduced sub-$100 microprocessors, the 6501 and 6502, with the chief aim of addressing this economic obstacle, but these microprocessors still required external support, memory, and peripheral chips which kept the total system cost in the hundreds of dollars.
One book credits TI engineers Gary Boone and Michael Cochran with the successful creation of the first microcontroller in 1971. The result of their work was the TMS 1000, which became commercially available in 1974. It combined read-only memory, read/write memory, processor and clock on one chip and was targeted at embedded systems.
During the early-to-mid-1970s, Japanese electronics manufacturers began producing microcontrollers for automobiles, including 4-bit MCUs for in-car entertainment, automatic wipers, electronic locks, and dashboard functions, and 8-bit MCUs for engine control.
Partly in response to the existence of the single-chip TMS 1000, Intel developed a computer system on a chip optimized for control applications, the Intel 8048, with commercial parts first shipping in 1977. It combined RAM and ROM on the same chip with a microprocessor. Among numerous applications, this chip would eventually find its way into over one billion PC keyboards. At that time Intel's President, Luke J. Valenter, stated that the microcontroller was one of the most successful products in the company's history, and he expanded the microcontroller division's budget by over 25%.
Most microcontrollers at this time came in two variants. One had EPROM program memory, with a transparent quartz window in the lid of the package to allow it to be erased by exposure to ultraviolet light. These erasable chips were often used for prototyping. The other variant was either a mask programmed ROM or a PROM variant which was only programmable once. For the latter, sometimes the designation OTP was used, standing for "one-time programmable". In an OTP microcontroller, the PROM was usually of identical type as the EPROM, but the chip package had no quartz window; because there was no way to expose the EPROM to ultraviolet light, it could not be erased. Because the erasable versions required ceramic packages with quartz windows, they were significantly more expensive than the OTP versions, which could be made in lower-cost opaque plastic packages. For the erasable variants, quartz was required, instead of less expensive glass, for its transparency to ultraviolet light—to which glass is largely opaque—but the main cost differentiator was the ceramic package itself.
In 1993, the introduction of EEPROM memory allowed microcontrollers (beginning with the Microchip PIC16C84) to be electrically erased quickly without an expensive package as required for EPROM, allowing both rapid prototyping, and in-system programming. (EEPROM technology had been available prior to this time, but the earlier EEPROM was more expensive and less durable, making it unsuitable for low-cost mass-produced microcontrollers.) The same year, Atmel introduced the first microcontroller using Flash memory, a special type of EEPROM. Other companies rapidly followed suit, with both memory types.
Nowadays microcontrollers are cheap and readily available for hobbyists, with large online communities around certain processors.
In 2002, about 55% of all CPUs sold in the world were 8-bit microcontrollers and microprocessors.
Over two billion 8-bit microcontrollers were sold in 1997, and according to Semico, over four billion 8-bit microcontrollers were sold in 2006. More recently, Semico has claimed the MCU market grew 36.5% in 2010 and 12% in 2011.
A typical home in a developed country is likely to have only four general-purpose microprocessors but around three dozen microcontrollers. A typical mid-range automobile has about 30 microcontrollers. They can also be found in many electrical devices such as washing machines, microwave ovens, and telephones.
Cost to manufacture can be under $0.10 per unit.
Cost has plummeted over time, with the cheapest 8-bit microcontrollers available for a few cents apiece in 2018 (specific prices are given below), and some 32-bit microcontrollers around US$1 in similar quantities.
In 2012, following a global crisis that saw the worst-ever annual sales decline and recovery, with the average sales price plunging 17% year-over-year (the biggest reduction since the 1980s), the average price for a microcontroller was US$0.88 ($0.69 for 4-/8-bit, $0.59 for 16-bit, $1.76 for 32-bit).
In 2012, worldwide sales of 8-bit microcontrollers were around $4 billion, while 4-bit microcontrollers also saw significant sales.
In 2015, 8-bit microcontrollers could be bought for $0.311 (1,000 units), 16-bit for $0.385 (1,000 units), and 32-bit for $0.378 (1,000 units, but at $0.35 for 5,000).
In 2018, 8-bit microcontrollers can be bought for $0.03, 16-bit for $0.393 (1,000 units, but at $0.563 for 100 or $0.349 for full reel of 2,000), and 32-bit for $0.503 (1,000 units, but at $0.466 for 5,000). A lower-priced 32-bit microcontroller, in units of one, can be had for $0.891.
In 2018, the low-priced microcontrollers listed above for 2015 had all become more expensive (comparing 2018 with 2015 prices for those specific units): the 8-bit microcontroller could be bought for $0.319 (1,000 units), 2.6% higher; the 16-bit one for $0.464 (1,000 units), 21% higher; and the 32-bit one for $0.503 (1,000 units, but at $0.466 for 5,000), 33% higher.
On 21 June 2018, the "world's smallest computer" was announced by the University of Michigan. The device is a "0.04 mm3 16 nW wireless and batteryless sensor system with integrated Cortex-M0+ processor and optical communication for cellular temperature measurement." It "measures just 0.3 mm to a side—dwarfed by a grain of rice. [...] In addition to the RAM and photovoltaics, the new computing devices have processors and wireless transmitters and receivers. Because they are too small to have conventional radio antennae, they receive and transmit data with visible light. A base station provides light for power and programming, and it receives the data." The device is one-tenth the size of IBM's previously claimed world-record computer, announced in March 2018, which is "smaller than a grain of salt", has a million transistors, costs less than $0.10 to manufacture, and, combined with blockchain technology, is intended for logistics and "crypto-anchor" ("digital fingerprint") applications.
A microcontroller can be considered a self-contained system with a processor, memory and peripherals and can be used as an embedded system. The majority of microcontrollers in use today are embedded in other machinery, such as automobiles, telephones, appliances, and peripherals for computer systems.
While some embedded systems are very sophisticated, many have minimal requirements for memory and program length, with no operating system and low software complexity. Typical input and output devices include switches, relays, solenoids, LEDs, small or custom liquid-crystal displays, radio-frequency devices, and sensors for data such as temperature, humidity and light level. Embedded systems usually have no keyboard, screen, disks, printers, or other recognizable I/O devices of a personal computer, and may lack human interaction devices of any kind.
Microcontrollers must provide real-time (predictable, though not necessarily fast) response to events in the embedded system they are controlling. When certain events occur, an interrupt system can signal the processor to suspend processing the current instruction sequence and to begin an interrupt service routine (ISR, or "interrupt handler") which will perform any processing required based on the source of the interrupt, before returning to the original instruction sequence. Possible interrupt sources are device dependent, and often include events such as an internal timer overflow, completing an analog to digital conversion, a logic level change on an input such as from a button being pressed, and data received on a communication link. Where power consumption is important as in battery devices, interrupts may also wake a microcontroller from a low-power sleep state where the processor is halted until required to do something by a peripheral event.
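As an illustration of the pattern described above, here is a minimal C sketch of a timer-overflow interrupt handler. The register name, address, flag bit and write-1-to-clear behaviour are hypothetical stand-ins; on a real part they come from the vendor's datasheet, and the handler is installed in the interrupt vector table in a toolchain-specific way.

#include <stdint.h>

/* Hypothetical memory-mapped timer status register (assumed address). */
#define TIMER0_FLAGS  (*(volatile uint8_t *)0x4000)
#define TIMER0_OVF    (1u << 0)          /* overflow flag bit (assumed) */

volatile uint32_t tick_count = 0;        /* shared with the main loop */

/* Interrupt service routine: runs when the timer overflows, does the
   minimum work needed, acknowledges the interrupt, and returns so the
   interrupted instruction sequence can resume. */
void timer0_overflow_isr(void)
{
    tick_count++;                        /* record the event */
    TIMER0_FLAGS = TIMER0_OVF;           /* clear the flag (write-1-to-clear, assumed) */
}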
Typically micro-controller programs must fit in the available on-chip memory, since it would be costly to provide a system with external, expandable memory. Compilers and assemblers are used to convert both high-level and assembly language codes into a compact machine code for storage in the micro-controller's memory. Depending on the device, the program memory may be permanent, read-only memory that can only be programmed at the factory, or it may be field-alterable flash or erasable read-only memory.
Manufacturers have often produced special versions of their micro-controllers in order to help the hardware and software development of the target system. Originally these included EPROM versions that have a "window" on the top of the device through which program memory can be erased by ultraviolet light, ready for reprogramming after a programming ("burn") and test cycle. Since 1998, EPROM versions have been rare; they have been replaced by EEPROM and flash, which are easier to use (they can be erased electronically) and cheaper to manufacture.
Other versions may be available where the ROM is accessed as an external device rather than as internal memory, however these are becoming rare due to the widespread availability of cheap microcontroller programmers.
The use of field-programmable devices on a micro controller may allow field update of the firmware or permit late factory revisions to products that have been assembled but not yet shipped. Programmable memory also reduces the lead time required for deployment of a new product.
Where hundreds of thousands of identical devices are required, using parts programmed at the time of manufacture can be economical. These "mask programmed" parts have the program laid down in the same way as the logic of the chip, at the same time.
A customized micro-controller incorporates a block of digital logic that can be personalized for additional processing capability, peripherals and interfaces that are adapted to the requirements of the application. One example is the AT91CAP from Atmel.
Microcontrollers usually contain from several to dozens of general purpose input/output pins (GPIO). GPIO pins are software configurable to either an input or an output state. When GPIO pins are configured to an input state, they are often used to read sensors or external signals. Configured to the output state, GPIO pins can drive external devices such as LEDs or motors, often indirectly, through external power electronics.
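A minimal C sketch of the GPIO usage just described, assuming a hypothetical port with direction, output and input registers (the names and addresses are invented for illustration; real ones vary by vendor):

#include <stdint.h>

/* Hypothetical GPIO registers for one 8-bit port. */
#define GPIO_DIR  (*(volatile uint8_t *)0x5000)  /* 1 = output, 0 = input */
#define GPIO_OUT  (*(volatile uint8_t *)0x5001)  /* output latch */
#define GPIO_IN   (*(volatile uint8_t *)0x5002)  /* sampled input levels */

#define LED_PIN     (1u << 0)
#define BUTTON_PIN  (1u << 1)

void gpio_demo(void)
{
    GPIO_DIR |=  LED_PIN;       /* pin 0 drives an external device (via a driver stage) */
    GPIO_DIR &= ~BUTTON_PIN;    /* pin 1 reads an external signal */

    if (GPIO_IN & BUTTON_PIN)   /* mirror the button onto the LED */
        GPIO_OUT |= LED_PIN;
    else
        GPIO_OUT &= ~LED_PIN;
}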
Many embedded systems need to read sensors that produce analog signals. This is the purpose of the analog-to-digital converter (ADC). Since processors are built to interpret and process digital data, i.e. 1s and 0s, they are not able to do anything with the analog signals that may be sent to them by a device. So the analog-to-digital converter is used to convert the incoming data into a form that the processor can recognize. A less common feature on some microcontrollers is a digital-to-analog converter (DAC) that allows the processor to output analog signals or voltage levels.
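A blocking ADC read might look like the following C sketch; the control and result registers, bit names and addresses are assumptions for illustration, not a real device's interface:

#include <stdint.h>

/* Hypothetical ADC registers. */
#define ADC_CTRL   (*(volatile uint8_t  *)0x9000)
#define ADC_RESULT (*(volatile uint16_t *)0x9002)
#define ADC_START  (1u << 0)    /* write to begin a conversion */
#define ADC_BUSY   (1u << 1)    /* set while a conversion is running */

/* Start a conversion and wait for the digital result. */
uint16_t adc_read(void)
{
    ADC_CTRL = ADC_START;
    while (ADC_CTRL & ADC_BUSY)
        ;                       /* conversion in progress */
    return ADC_RESULT;          /* e.g. a 10- or 12-bit sample */
}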
In addition to the converters, many embedded microprocessors include a variety of timers as well. One of the most common types of timers is the programmable interval timer (PIT). A PIT may either count down from some value to zero, or up to the capacity of the count register, overflowing to zero. Once it reaches zero, it sends an interrupt to the processor indicating that it has finished counting. This is useful for devices such as thermostats, which periodically test the temperature around them to see if they need to turn the air conditioner on, the heater on, etc.
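A sketch, in the same hypothetical-register style, of starting a countdown PIT that interrupts on reaching zero:

#include <stdint.h>

/* Hypothetical programmable interval timer registers. */
#define PIT_RELOAD (*(volatile uint16_t *)0x6000)  /* value counted down from */
#define PIT_CTRL   (*(volatile uint8_t  *)0x6002)
#define PIT_ENABLE (1u << 0)
#define PIT_IRQ_EN (1u << 1)    /* interrupt when the count reaches zero */

/* Fire a periodic interrupt every `ticks` timer clocks; the ISR can
   then, for example, sample a temperature sensor as in the thermostat
   scenario described above. */
void pit_start(uint16_t ticks)
{
    PIT_RELOAD = ticks;
    PIT_CTRL   = PIT_ENABLE | PIT_IRQ_EN;
}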
A dedicated pulse-width modulation (PWM) block makes it possible for the CPU to control power converters, resistive loads, motors, etc., without using lots of CPU resources in tight timer loops.
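A PWM block of this kind is typically driven by writing two registers, with no CPU timing loops involved. In this sketch the period and compare registers are invented names, assuming hardware where the output is high while a free-running counter is below the compare value:

#include <stdint.h>

/* Hypothetical PWM registers. */
#define PWM_PERIOD  (*(volatile uint16_t *)0x7000)
#define PWM_COMPARE (*(volatile uint16_t *)0x7002)

/* Set the duty cycle as a percentage; the hardware does the rest. */
void pwm_set_duty(uint16_t percent)
{
    PWM_COMPARE = (uint16_t)((uint32_t)PWM_PERIOD * percent / 100u);
}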
A universal asynchronous receiver/transmitter (UART) block makes it possible to receive and transmit data over a serial line with very little load on the CPU. Dedicated on-chip hardware also often includes capabilities to communicate with other devices (chips) in digital formats such as Inter-Integrated Circuit (I²C), Serial Peripheral Interface (SPI), Universal Serial Bus (USB), and Ethernet.
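A polled UART transmit routine shows how little the CPU has to do; again the status and data registers are hypothetical placeholders for the real datasheet names:

#include <stdint.h>

/* Hypothetical UART registers. */
#define UART_STATUS   (*(volatile uint8_t *)0x8000)
#define UART_DATA     (*(volatile uint8_t *)0x8001)
#define UART_TX_READY (1u << 0)   /* transmit holding register empty */

/* Send one byte over the serial line; the hardware shifts the bits
   out on its own, so the CPU only waits for the holding register. */
void uart_putc(uint8_t byte)
{
    while (!(UART_STATUS & UART_TX_READY))
        ;                         /* wait until the transmitter is free */
    UART_DATA = byte;
}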
Micro-controllers may not implement an external address or data bus as they integrate RAM and non-volatile memory on the same chip as the CPU. Using fewer pins, the chip can be placed in a much smaller, cheaper package.
Integrating the memory and other peripherals on a single chip and testing them as a unit increases the cost of that chip, but often results in decreased net cost of the embedded system as a whole. Even if the cost of a CPU that has integrated peripherals is slightly more than the cost of a CPU and external peripherals, having fewer chips typically allows a smaller and cheaper circuit board, and reduces the labor required to assemble and test the circuit board, in addition to tending to decrease the defect rate for the finished assembly.
A micro-controller is a single integrated circuit, commonly with the following features:
This integration drastically reduces the number of chips and the amount of wiring and circuit board space that would be needed to produce equivalent systems using separate chips. Furthermore, on low pin count devices in particular, each pin may interface to several internal peripherals, with the pin function selected by software. This allows a part to be used in a wider variety of applications than if pins had dedicated functions.
Micro-controllers have proved to be highly popular in embedded systems since their introduction in the 1970s.
Some microcontrollers use a Harvard architecture: separate memory buses for instructions and data, allowing accesses to take place concurrently. Where a Harvard architecture is used, instruction words for the processor may be a different bit size than the length of internal memory and registers; for example: 12-bit instructions used with 8-bit data registers.
The decision of which peripheral to integrate is often difficult. The microcontroller vendors often trade operating frequencies and system design flexibility against time-to-market requirements from their customers and overall lower system cost. Manufacturers have to balance the need to minimize the chip size against additional functionality.
Microcontroller architectures vary widely. Some designs include general-purpose microprocessor cores, with one or more ROM, RAM, or I/O functions integrated onto the package. Other designs are purpose built for control applications. A micro-controller instruction set usually has many instructions intended for bit manipulation (bit-wise operations) to make control programs more compact. For example, a general purpose processor might require several instructions to test a bit in a register and branch if the bit is set, where a micro-controller could have a single instruction to provide that commonly required function.
Microcontrollers traditionally do not have a math coprocessor, so floating point arithmetic is performed by software. However, some recent designs do include an FPU and DSP optimized features. An example would be Microchip's PIC32 MIPS based line.
Microcontrollers were originally programmed only in assembly language, but various high-level programming languages, such as C, Python and JavaScript, are now also in common use to target microcontrollers and embedded systems. Compilers for general purpose languages will typically have some restrictions as well as enhancements to better support the unique characteristics of microcontrollers. Some microcontrollers have environments to aid developing certain types of applications. Microcontroller vendors often make tools freely available to make it easier to adopt their hardware.
Microcontrollers with specialty hardware may require their own non-standard dialects of C, such as SDCC for the 8051, which prevent using standard tools (such as code libraries or static analysis tools) even for code unrelated to hardware features. Interpreters may also contain nonstandard features, such as MicroPython, although a fork, CircuitPython, has looked to move hardware dependencies to libraries and have the language adhere to a more CPython standard.
Interpreter firmware is also available for some microcontrollers. For example, BASIC on the early microcontrollers Intel 8052; BASIC and FORTH on the Zilog Z8 as well as some modern devices. Typically these interpreters support interactive programming.
Simulators are available for some microcontrollers. These allow a developer to analyze what the behavior of the microcontroller and their program should be if they were using the actual part. A simulator will show the internal processor state and also that of the outputs, as well as allowing input signals to be generated. Most simulators are limited in that they cannot simulate much other hardware in a system, but they can exercise conditions that may otherwise be hard to reproduce at will in the physical implementation, and can be the quickest way to debug and analyze problems.
Recent microcontrollers are often integrated with on-chip debug circuitry that when accessed by an in-circuit emulator (ICE) via JTAG, allow debugging of the firmware with a debugger. A real-time ICE may allow viewing and/or manipulating of internal states while running. A tracing ICE can record executed program and MCU states before/after a trigger point.
There are several dozen microcontroller architectures and vendors.
Many others exist, some of which are used in a very narrow range of applications or are more like application processors than microcontrollers. The microcontroller market is extremely fragmented, with numerous vendors, technologies, and markets. Note that many vendors sell or have sold multiple architectures.
In contrast to general-purpose computers, microcontrollers used in embedded systems often seek to optimize interrupt latency over instruction throughput. Issues include both reducing the latency, and making it be more predictable (to support real-time control).
When an electronic device causes an interrupt, during the context switch the intermediate results (registers) have to be saved before the software responsible for handling the interrupt can run. They must also be restored after that interrupt handler is finished. If there are more processor registers, this saving and restoring process may take more time, increasing the latency. (If an ISR does not require the use of some registers, it may simply leave them alone rather than saving and restoring them, so in that case those registers are not involved with the latency.) Ways to reduce such context/restore latency include having relatively few registers in their central processing units (undesirable because it slows down most non-interrupt processing substantially), or at least having the hardware not save them all (this fails if the software then needs to compensate by saving the rest "manually"). Another technique involves spending silicon gates on "shadow registers": One or more duplicate registers used only by the interrupt software, perhaps supporting a dedicated stack.
Other factors affecting interrupt latency include:
Lower end microcontrollers tend to support fewer interrupt latency controls than higher end ones.
Two different kinds of memory are commonly used with microcontrollers, a non-volatile memory for storing firmware and a read-write memory for temporary data.
From the earliest microcontrollers to today, six-transistor SRAM is almost always used as the read/write working memory, with a few more transistors per bit used in the register file. FRAM or MRAM could potentially replace it, as they are denser, which would make them more cost-effective.
In addition to the SRAM, some microcontrollers also have internal EEPROM for data storage; and even ones that do not have any (or not enough) are often connected to external serial EEPROM chip (such as the BASIC Stamp) or external serial flash memory chip.
A few recent microcontrollers beginning in 2003 have "self-programmable" flash memory.
The earliest microcontrollers used mask ROM to store firmware. Later microcontrollers (such as the early versions of the Freescale 68HC11 and early PIC microcontrollers) had EPROM memory, which used a translucent window to allow erasure via UV light, while production versions had no such window, being OTP (one-time-programmable). Firmware updates were equivalent to replacing the microcontroller itself, thus many products were not upgradeable.
Motorola MC68HC805 was the first microcontroller to use EEPROM to store the firmware. EEPROM microcontrollers became more popular in 1993 when Microchip introduced the PIC16C84 and Atmel introduced an 8051-core microcontroller that was the first to use NOR flash memory to store the firmware. Today's microcontrollers almost exclusively use flash memory, with a few models using FRAM, and some ultra-low-cost parts still using OTP or mask ROM.
|
https://en.wikipedia.org/wiki?curid=21017
|
Michelangelo
Michelangelo di Lodovico Buonarroti Simoni (; 6 March 1475 – 18 February 1564), known best as simply Michelangelo (), was an Italian sculptor, painter, architect and poet of the High Renaissance born in the Republic of Florence, who exerted an unparalleled influence on the development of Western art. His artistic versatility was of such a high order that he is often considered a contender for the title of the archetypal Renaissance man, along with his rival, the fellow Florentine, Leonardo da Vinci. Several scholars have described Michelangelo as the greatest artist of his age and even as the greatest artist of all time.
A number of Michelangelo's works of painting, sculpture and architecture rank among the most famous in existence. His output in these fields was prodigious; given the sheer volume of surviving correspondence, sketches and reminiscences, he is the best-documented artist of the 16th century. He sculpted two of his best-known works, the "Pietà" and "David", before the age of thirty. Despite holding a low opinion of painting, he also created two of the most influential frescoes in the history of Western art: the scenes from Genesis on the ceiling of the Sistine Chapel in Rome, and "The Last Judgment" on its altar wall. His design of the Laurentian Library pioneered Mannerist architecture. At the age of 74, he succeeded Antonio da Sangallo the Younger as the architect of St. Peter's Basilica. He transformed the plan so that the western end was finished to his design, as was the dome, with some modification, after his death.
Michelangelo was the first Western artist whose biography was published while he was alive. In fact, two biographies were published during his lifetime. One of them, by Giorgio Vasari, proposed that Michelangelo's work transcended that of any artist living or dead, and was "supreme in not one art alone but in all three".
In his lifetime, Michelangelo was often called "Il Divino" ("the divine one"). His contemporaries often admired his "terribilità"—his ability to instil a sense of awe. Attempts by subsequent artists to imitate Michelangelo's impassioned, highly personal style resulted in Mannerism, the next major movement in Western art after the High Renaissance.
Michelangelo was born on 6 March 1475 in Caprese, known today as Caprese Michelangelo, a small town situated in Valtiberina, near Arezzo, Tuscany. For several generations, his family had been small-scale bankers in Florence; but the bank failed, and his father, Ludovico di Leonardo Buonarroti Simoni, briefly took a government post in Caprese, where Michelangelo was born. At the time of Michelangelo's birth, his father was the town's judicial administrator and "podestà" or local administrator of Chiusi della Verna. Michelangelo's mother was Francesca di Neri del Miniato di Siena. The Buonarrotis claimed to descend from the Countess Mathilde of Canossa—a claim that remains unproven, but which Michelangelo believed.
Several months after Michelangelo's birth, the family returned to Florence, where he was raised. During his mother's later prolonged illness, and after her death in 1481 (when he was six years old), Michelangelo lived with a nanny and her husband, a stonecutter, in the town of Settignano, where his father owned a marble quarry and a small farm. There he gained his love for marble. As Giorgio Vasari quotes him: "If there is some good in me, it is because I was born in the subtle atmosphere of your country of Arezzo. Along with the milk of my nurse I received the knack of handling chisel and hammer, with which I make my figures."
As a young boy, Michelangelo was sent to Florence to study grammar under the Humanist Francesco da Urbino. However, he showed no interest in his schooling, preferring to copy paintings from churches and seek the company of other painters.
The city of Florence was at that time Italy's greatest centre of the arts and learning. Art was sponsored by the Signoria (the town council), the merchant guilds, and wealthy patrons such as the Medici and their banking associates. The Renaissance, a renewal of Classical scholarship and the arts, had its first flowering in Florence. In the early 15th century, the architect Filippo Brunelleschi, having studied the remains of Classical buildings in Rome, had created two churches, San Lorenzo's and Santo Spirito, which embodied the Classical precepts. The sculptor Lorenzo Ghiberti had laboured for fifty years to create the bronze doors of the Baptistry, which Michelangelo was to describe as "The Gates of Paradise". The exterior niches of the Church of Orsanmichele contained a gallery of works by the most acclaimed sculptors of Florence: Donatello, Ghiberti, Andrea del Verrocchio, and Nanni di Banco. The interiors of the older churches were covered with frescos (mostly in Late Medieval, but also in the Early Renaissance style), begun by Giotto and continued by Masaccio in the Brancacci Chapel, both of whose works Michelangelo studied and copied in drawings.
During Michelangelo's childhood, a team of painters had been called from Florence to the Vatican to decorate the walls of the Sistine Chapel. Among them was Domenico Ghirlandaio, a master in fresco painting, perspective, figure drawing and portraiture who had the largest workshop in Florence. In 1488, at age 13, Michelangelo was apprenticed to Ghirlandaio. The next year, his father persuaded Ghirlandaio to pay Michelangelo as an artist, which was rare for someone of fourteen. When in 1489, Lorenzo de' Medici, de facto ruler of Florence, asked Ghirlandaio for his two best pupils, Ghirlandaio sent Michelangelo and Francesco Granacci.
From 1490 to 1492, Michelangelo attended the Humanist academy the Medici had founded along Neo-Platonic lines. There his work and outlook were influenced by many of the most prominent philosophers and writers of the day, including Marsilio Ficino, Pico della Mirandola and Poliziano. At this time, Michelangelo sculpted the reliefs "Madonna of the Steps" (1490–1492) and "Battle of the Centaurs" (1491–1492), the latter based on a theme suggested by Poliziano and commissioned by Lorenzo de Medici. Michelangelo worked for a time with the sculptor Bertoldo di Giovanni. When he was seventeen, another pupil, Pietro Torrigiano, struck him on the nose, causing the disfigurement that is conspicuous in the portraits of Michelangelo.
Lorenzo de' Medici's death on 8 April 1492 brought a reversal of Michelangelo's circumstances. Michelangelo left the security of the Medici court and returned to his father's house. In the following months he carved a polychrome wooden "Crucifix" (1493), as a gift to the prior of the Florentine church of Santo Spirito, which had allowed him to do some anatomical studies of the corpses from the church's hospital. This was the first of several instances during his career in which Michelangelo studied anatomy by dissecting cadavers.
Between 1493 and 1494 he bought a block of marble, and carved a larger-than-life statue of Hercules, which was sent to France and subsequently disappeared sometime in the 18th century. On 20 January 1494, after heavy snowfalls, Lorenzo's heir, Piero de Medici, commissioned a snow statue, and Michelangelo again entered the court of the Medici.
In the same year, the Medici were expelled from Florence as the result of the rise of Savonarola. Michelangelo left the city before the end of the political upheaval, moving to Venice and then to Bologna. In Bologna, he was commissioned to carve several of the last small figures for the completion of the Shrine of St. Dominic, in the church dedicated to that saint. At this time Michelangelo studied the robust reliefs carved by Jacopo della Quercia around the main portal of the Basilica of St Petronius, including the panel of "The Creation of Eve", the composition of which was to reappear on the Sistine Chapel ceiling. Towards the end of 1495, the political situation in Florence was calmer; the city, previously under threat from the French, was no longer in danger as Charles VIII had suffered defeats. Michelangelo returned to Florence but received no commissions from the new city government under Savonarola. He returned to the employment of the Medici. During the half year he spent in Florence, he worked on two small statues, a child "St. John the Baptist" and a sleeping "Cupid". According to Condivi, Lorenzo di Pierfrancesco de' Medici, for whom Michelangelo had sculpted "St. John the Baptist", asked that Michelangelo "fix it so that it looked as if it had been buried" so he could "send it to Rome ... pass [it off as] an ancient work and ... sell it much better." Both Lorenzo and Michelangelo were unwittingly cheated out of the real value of the piece by a middleman. Cardinal Raffaele Riario, to whom Lorenzo had sold it, discovered that it was a fraud, but was so impressed by the quality of the sculpture that he invited the artist to Rome. This apparent success in selling his sculpture abroad, as well as the conservative Florentine situation, may have encouraged Michelangelo to accept the prelate's invitation.
Michelangelo arrived in Rome on 25 June 1496 at the age of 21. On 4 July of the same year, he began work on a commission for Cardinal Riario, an over-life-size statue of the Roman wine god "Bacchus". Upon completion, the work was rejected by the cardinal, and subsequently entered the collection of the banker Jacopo Galli, for his garden.
In November 1497, the French ambassador to the Holy See, Cardinal Jean de Bilhères-Lagraulas, commissioned him to carve a "Pietà", a sculpture showing the Virgin Mary grieving over the body of Jesus. The subject, which is not part of the Biblical narrative of the Crucifixion, was common in religious sculpture of Medieval Northern Europe and would have been very familiar to the Cardinal. The contract was agreed upon in August of the following year. Michelangelo was 24 at the time of its completion. It was soon to be regarded as one of the world's great masterpieces of sculpture, "a revelation of all the potentialities and force of the art of sculpture". Contemporary opinion was summarised by Vasari: "It is certainly a miracle that a formless block of stone could ever have been reduced to a perfection that nature is scarcely able to create in the flesh." It is now located in St Peter's Basilica.
Michelangelo returned to Florence in 1499. The republic was changing after the fall of its leader, the anti-Renaissance priest Girolamo Savonarola, who was executed in 1498, and the rise of the "gonfaloniere" Piero Soderini. Michelangelo was asked by the consuls of the Guild of Wool to complete an unfinished project begun 40 years earlier by Agostino di Duccio: a colossal statue of Carrara marble portraying David as a symbol of Florentine freedom, to be placed on the gable of Florence Cathedral. Michelangelo responded by completing his most famous work, the statue of David, in 1504. The masterwork definitively established his prominence as a sculptor of extraordinary technical skill and strength of symbolic imagination. A team of consultants, including Botticelli and Leonardo da Vinci, was called together to decide upon its placement, ultimately the Piazza della Signoria, in front of the Palazzo Vecchio. It now stands in the Accademia while a replica occupies its place in the square.
With the completion of the "David" came another commission. In early 1504 Leonardo da Vinci had been commissioned to paint "The Battle of Anghiari" in the council chamber of the Palazzo Vecchio, depicting the battle between Florence and Milan in 1440. Michelangelo was then commissioned to paint the "Battle of Cascina". The two paintings are very different: Leonardo depicts soldiers fighting on horseback, while Michelangelo has soldiers being ambushed as they bathe in the river. Neither work was completed and both were lost forever when the chamber was refurbished. Both works were much admired, and copies remain of them, Leonardo's work having been copied by Rubens and Michelangelo's by Bastiano da Sangallo.
Also during this period, Michelangelo was commissioned by Angelo Doni to paint a "Holy Family" as a present for his wife, Maddalena Strozzi. It is known as the "Doni Tondo" and hangs in the Uffizi Gallery in its original magnificent frame, which Michelangelo may have designed. He also may have painted the Madonna and Child with John the Baptist, known as the "Manchester Madonna" and now in the National Gallery, London.
In 1505 Michelangelo was invited back to Rome by the newly elected Pope Julius II and commissioned to build the Pope's tomb, which was to include forty statues and be finished in five years. Under the patronage of the pope, Michelangelo experienced constant interruptions to his work on the tomb in order to accomplish numerous other tasks. Although Michelangelo worked on the tomb for 40 years, it was never finished to his satisfaction. It is located in the Church of San Pietro in Vincoli in Rome and is most famous for the central figure of Moses, completed in 1516. Of the other statues intended for the tomb, two, known as the "Rebellious Slave" and the "Dying Slave", are now in the Louvre.
During the same period, Michelangelo painted the ceiling of the Sistine Chapel, which took approximately four years to complete (1508–1512). According to Condivi's account, Bramante, who was working on the building of St. Peter's Basilica, resented Michelangelo's commission for the pope's tomb and convinced the pope to commission him in a medium with which he was unfamiliar, in order that he might fail at the task. Michelangelo was originally commissioned to paint the Twelve Apostles on the triangular pendentives that supported the ceiling, and to cover the central part of the ceiling with ornament. Michelangelo persuaded Pope Julius to give him a free hand and proposed a different and more complex scheme, representing the Creation, the Fall of Man, the Promise of Salvation through the prophets, and the genealogy of Christ. The work is part of a larger scheme of decoration within the chapel that represents much of the doctrine of the Catholic Church.
The composition stretches over 500 square metres of ceiling and contains over 300 figures. At its centre are nine episodes from the Book of Genesis, divided into three groups: God's creation of the earth; God's creation of humankind and their fall from God's grace; and lastly, the state of humanity as represented by Noah and his family. On the pendentives supporting the ceiling are painted twelve men and women who prophesied the coming of Jesus, seven prophets of Israel, and five Sibyls, prophetic women of the Classical world. Among the most famous paintings on the ceiling are The Creation of Adam, Adam and Eve in the Garden of Eden, the Deluge, the Prophet Jeremiah, and the Cumaean Sibyl.
In 1513, Pope Julius II died and was succeeded by Pope Leo X, the second son of Lorenzo de' Medici. From 1513 to 1516 Pope Leo was on good terms with Pope Julius's surviving relatives, and so encouraged Michelangelo to continue work on Julius's tomb, but the families became enemies again in 1516 when Pope Leo tried to seize the Duchy of Urbino from Julius's nephew Francesco Maria I della Rovere. Pope Leo then had Michelangelo stop working on the tomb, and commissioned him to reconstruct the façade of the Basilica of San Lorenzo in Florence and to adorn it with sculptures. He spent three years creating drawings and models for the façade, as well as attempting to open a new marble quarry at Pietrasanta specifically for the project. In 1520 the work was abruptly cancelled by his financially strapped patrons before any real progress had been made. The basilica lacks a façade to this day.
In 1520 the Medici came back to Michelangelo with another grand proposal, this time for a family funerary chapel in the Basilica of San Lorenzo. Fortunately for posterity, this project, occupying the artist for much of the 1520s and 1530s, was more fully realised. Michelangelo used his own discretion to create the composition of the Medici Chapel, which houses the large tombs of two of the younger members of the Medici family, Giuliano, Duke of Nemours, and Lorenzo, his nephew. It also serves to commemorate their more famous predecessors, Lorenzo the Magnificent and his brother Giuliano, who are buried nearby. The tombs display statues of the two Medici and allegorical figures representing Night and Day, and Dusk and Dawn. The chapel also contains Michelangelo's "Medici Madonna". In 1976 a concealed corridor was discovered with drawings on the walls that related to the chapel itself.
Pope Leo X died in 1521 and was succeeded briefly by the austere Adrian VI, and then by his cousin Giulio de' Medici as Pope Clement VII. In 1524 Michelangelo received an architectural commission from the Medici pope for the Laurentian Library at San Lorenzo's Church. He designed both the interior of the library itself and its vestibule, a building utilising architectural forms with such dynamic effect that it is seen as the forerunner of Baroque architecture. It was left to assistants to interpret his plans and carry out his instructions. The library was not opened until 1571, and the vestibule remained incomplete until 1904.
In 1527, Florentine citizens, encouraged by the sack of Rome, threw out the Medici and restored the republic. A siege of the city ensued, and Michelangelo went to the aid of his beloved Florence by working on the city's fortifications from 1528 to 1529. The city fell in 1530, and the Medici were restored to power. Michelangelo fell out of favour with the young Alessandro Medici, who had been installed as the first Duke of Florence. Fearing for his life, he fled to Rome, leaving assistants to complete the Medici chapel and the Laurentian Library. Despite Michelangelo's support of the republic and resistance to the Medici rule, he was welcomed by Pope Clement, who reinstated an allowance that he had previously granted the artist and made a new contract with him over the tomb of Pope Julius.
In Rome, Michelangelo lived near the church of Santa Maria di Loreto. It was at this time that he met the poet Vittoria Colonna, marchioness of Pescara, who was to become one of his closest friends until her death in 1547.
Shortly before his death in 1534, Pope Clement VII commissioned Michelangelo to paint a fresco of "The Last Judgement" on the altar wall of the Sistine Chapel. His successor, Pope Paul III, was instrumental in seeing that Michelangelo began and completed the project, which he laboured on from 1534 to October 1541. The fresco depicts the Second Coming of Christ and his Judgement of the souls. Michelangelo ignored the usual artistic conventions in portraying Jesus, showing him as a massive, muscular figure, youthful, beardless and naked. He is surrounded by saints, among whom Saint Bartholomew holds a drooping flayed skin, bearing the likeness of Michelangelo. The dead rise from their graves, to be consigned either to Heaven or to Hell.
Once completed, the depiction of Christ and the Virgin Mary naked was considered sacrilegious, and Cardinal Carafa and Monsignor Sernini (Mantua's ambassador) campaigned to have the fresco removed or censored, but the Pope resisted. At the Council of Trent, shortly before Michelangelo's death in 1564, it was decided to obscure the genitals and Daniele da Volterra, an apprentice of Michelangelo, was commissioned to make the alterations. An uncensored copy of the original, by Marcello Venusti, is in the Capodimonte Museum of Naples.
Michelangelo worked on a number of architectural projects at this time. They included a design for the Capitoline Hill with its trapezoid piazza displaying the ancient bronze statue of Marcus Aurelius. He designed the upper floor of the Palazzo Farnese and the interior of the Church of Santa Maria degli Angeli, in which he transformed the vaulted interior of an Ancient Roman bathhouse. Other architectural works include San Giovanni dei Fiorentini, the Sforza Chapel (Cappella Sforza) in the Basilica di Santa Maria Maggiore and the Porta Pia.
While still working on the "Last Judgement", Michelangelo received yet another commission for the Vatican. This was for the painting of two large frescos in the Cappella Paolina depicting significant events in the lives of the two most important saints of Rome, the "Conversion of Saint Paul" and the "Crucifixion of Saint Peter". Like the "Last Judgement", these two works are complex compositions containing a great number of figures. They were completed in 1550. In the same year, Giorgio Vasari published his "Vita", including a biography of Michelangelo.
In 1546, Michelangelo was appointed architect of St. Peter's Basilica, Rome. The process of replacing the Constantinian basilica of the 4th century had been underway for fifty years and in 1506 foundations had been laid to the plans of Bramante. Successive architects had worked on it, but little progress had been made. Michelangelo was persuaded to take over the project. He returned to the concepts of Bramante, and developed his ideas for a centrally planned church, strengthening the structure both physically and visually. The dome, not completed until after his death, has been called by Banister Fletcher, "the greatest creation of the Renaissance".
As construction was progressing on St Peter's, there was concern that Michelangelo would pass away before the dome was finished. However, once building commenced on the lower part of the dome, the supporting ring, the completion of the design was inevitable.
On 7 December 2007, a red chalk sketch for the dome of St Peter's Basilica, possibly the last made by Michelangelo before his death, was discovered in the Vatican archives. It is extremely rare, since he destroyed his designs later in life. The sketch is a partial plan for one of the radial columns of the cupola drum of Saint Peter's.
Michelangelo was a devout Catholic whose faith deepened at the end of his life. His poetry includes the following closing lines from what is known as poem 285 (written in 1554):
"Neither painting nor sculpture will be able any longer to calm my soul, now turned toward that divine love that opened his arms on the cross to take us in."
Michelangelo was abstemious in his personal life, and once told his apprentice, Ascanio Condivi: "However rich I may have been, I have always lived like a poor man." Condivi said he was indifferent to food and drink, eating "more out of necessity than of pleasure" and that he "often slept in his clothes and ... boots." His biographer Paolo Giovio says, "His nature was so rough and uncouth that his domestic habits were incredibly squalid, and deprived posterity of any pupils who might have followed him." He may not have minded, since he was by nature a solitary and melancholy person, "bizzarro e fantastico," a man who "withdrew himself from the company of men."
It is impossible to know for certain whether Michelangelo had physical relationships (Condivi ascribed to him a "monk-like chastity"); speculation about his sexuality is rooted in his poetry. He wrote over three hundred sonnets and madrigals. The longest sequence, displaying deep romantic feeling, was written to Tommaso dei Cavalieri (c. 1509–1587), who was 23 years old when Michelangelo met him in 1532, at the age of 57. These make up the first large sequence of poems in any modern tongue addressed by one man to another; they predate by fifty years Shakespeare's sonnets to the fair youth.
Cavalieri replied: "I swear to return your love. Never have I loved a man more than I love you, never have I wished for a friendship more than I wish for yours." Cavalieri remained devoted to Michelangelo until his death.
In 1542, Michelangelo met Cecchino dei Bracci who died only a year later, inspiring Michelangelo to write forty-eight funeral epigrams. Some of the objects of Michelangelo's affections, and subjects of his poetry, took advantage of him: the model Febo di Poggio asked for money in response to a love-poem, and a second model, Gherardo Perini, stole from him shamelessly.
What some have interpreted as the seemingly homoerotic nature of the poetry has been a source of discomfort to later generations. Michelangelo's grandnephew, Michelangelo Buonarroti the Younger, published the poems in 1623 with the gender of pronouns changed, and it was not until John Addington Symonds translated them into English in 1893 that the original genders were restored. In modern times some scholars insist that, despite the restoration of the pronouns, they represent "an emotionless and elegant re-imagining of Platonic dialogue, whereby erotic poetry was seen as an expression of refined sensibilities".
Late in life, Michelangelo nurtured a great platonic love for the poet and noble widow Vittoria Colonna, whom he met in Rome in 1536 or 1538 and who was in her late forties at the time. They wrote sonnets for each other and were in regular contact until she died. These sonnets mostly deal with the spiritual issues that occupied them. Condivi recalls Michelangelo's saying that his sole regret in life was that he did not kiss the widow's face in the same manner that he had her hand.
In a letter from late 1542, Michelangelo blamed the tensions between Julius II and himself on the envy of Bramante and Raphael, saying of the latter, "all he had in art, he got from me". According to Gian Paolo Lomazzo, Michelangelo and Raphael met once: the former was alone, while the latter was accompanied by several others. Michelangelo commented that he thought he had encountered the chief of police with such an assemblage, and Raphael replied that he thought he had met an executioner, as they are wont to walk alone.
The "Madonna of the Steps" is Michelangelo's earliest known work in marble. It is carved in shallow relief, a technique often employed by the master-sculptor of the early 15th century, Donatello, and others such as Desiderio da Settignano. While the Madonna is in profile, the easiest aspect for a shallow relief, the child displays a twisting motion that was to become characteristic of Michelangelo's work. The "Taddei Tondo" of 1502 shows the Christ Child frightened by a Bullfinch, a symbol of the Crucifixion. The lively form of the child was later adapted by Raphael in the "Bridgewater Madonna". The "Bruges Madonna" was, at the time of its creation, unlike other such statues depicting the Virgin proudly presenting her son. Here, the Christ Child, restrained by his mother's clasping hand, is about to step off into the world. The "Doni Tondo", depicting the Holy Family, has elements of all three previous works: the frieze of figures in the background has the appearance of a low-relief, while the circular shape and dynamic forms echo the Taddeo Tondo. The twisting motion present in the "Bruges Madonna" is accentuated in the painting. The painting heralds the forms, movement and colour that Michelangelo was to employ on the ceiling of the Sistine Chapel.
The kneeling angel is an early work, one of several that Michelangelo created as part of a large decorative scheme for the Arca di San Domenico in the church dedicated to that saint in Bologna. Several other artists had worked on the scheme, beginning with Nicola Pisano in the 13th century. In the late 15th century, the project was managed by Niccolò dell'Arca. An angel holding a candlestick, by Niccolò, was already in place. Although the two angels form a pair, there is a great contrast between the two works, the one depicting a delicate child with flowing hair clothed in Gothic robes with deep folds, and Michelangelo's depicting a robust and muscular youth with eagle's wings, clad in a garment of Classical style.
Everything about Michelangelo's angel is dynamic.
Michelangelo's "Bacchus" was a commission with a specified subject, the youthful God of Wine. The sculpture has all the traditional attributes, a vine wreath, a cup of wine and a fawn, but Michelangelo ingested an air of reality into the subject, depicting him with bleary eyes, a swollen bladder and a stance that suggests he is unsteady on his feet. While the work is plainly inspired by Classical sculpture, it is innovative for its rotating movement and strongly three-dimensional quality, which encourages the viewer to look at it from every angle.
In the so-called "Dying Slave", Michelangelo has again utilised the figure with marked contraposto to suggest a particular human state, in this case waking from sleep. With the "Rebellious Slave", it is one of two such earlier figures for the Tomb of Pope Julius II, now in the Louvre, that the sculptor brought to an almost finished state. These two works were to have a profound influence on later sculpture, through Rodin who studied them at the Louvre.
The "Bound Slave" is one of the later figures for Pope Julius' tomb. The works, known collectively as "The Captives", each show the figure struggling to free itself, as if from the bonds of the rock in which it is lodged. The works give a unique insight into the sculptural methods that Michelangelo employed and his way of revealing what he perceived within the rock.
The Sistine Chapel ceiling was painted between 1508 and 1512. The ceiling is a flattened barrel vault supported on twelve triangular pendentives that rise from between the windows of the chapel. The commission, as envisaged by Pope Julius II, was to adorn the pendentives with figures of the twelve apostles. Michelangelo, who was reluctant to take the job, persuaded the Pope to give him a free hand in the composition. The resultant scheme of decoration awed his contemporaries and has inspired other artists ever since. The scheme is of nine panels illustrating episodes from the Book of Genesis, set in an architectonic frame. On the pendentives, Michelangelo replaced the proposed Apostles with Prophets and Sibyls who heralded the coming of the Messiah.
Michelangelo began painting with the later episodes in the narrative, the pictures including locational details and groups of figures, the "Drunkenness of Noah" being the first of this group. In the later compositions, painted after the initial scaffolding had been removed, Michelangelo made the figures larger. One of the central images, "The Creation of Adam" is one of the best known and most reproduced works in the history of art. The final panel, showing the "Separation of Light from Darkness" is the broadest in style and was painted in a single day. As the model for the Creator, Michelangelo has depicted himself in the action of painting the ceiling.
As supporters to the smaller scenes, Michelangelo painted twenty youths who have variously been interpreted as angels, as muses, or simply as decoration. Michelangelo referred to them as "ignudi". The figure reproduced may be seen in context in the above image of the "Separation of Light from Darkness".
In the process of painting the ceiling, Michelangelo made studies for different figures, of which some, such as that for "The Libyan Sibyl" have survived, demonstrating the care taken by Michelangelo in details such as the hands and feet. The Prophet Jeremiah, contemplating the downfall of Jerusalem, is an image of the artist himself.
Michelangelo's relief of the "Battle of the Centaurs", created while he was still a youth associated with the Medici Academy, is an unusually complex relief in that it shows a great number of figures involved in a vigorous struggle. Such a complex disarray of figures was rare in Florentine art, where it would usually only be found in images showing either the Massacre of the Innocents or the Torments of Hell. The relief treatment, in which some of the figures are boldly projecting, may indicate Michelangelo's familiarity with Roman sarcophagus reliefs from the collection of Lorenzo Medici, and similar marble panels created by Nicola and Giovanni Pisano, and with the figurative compositions on Ghiberti's Baptistry Doors.
The composition of the "Battle of Cascina" is known in its entirety only from copies, as the original cartoon, according to Vasari, was so admired that it deteriorated and was eventually in pieces. It reflects the earlier relief in the energy and diversity of the figures, with many different postures, and many being viewed from the back, as they turn towards the approaching enemy and prepare for battle.
In "The Last Judgment" it is said that Michelangelo drew inspiration from a fresco by Melozzo da Forlì in Rome's Santi Apostoli. Melozzo had depicted figures from different angles, as if they were floating in the Heaven and seen from below. Melozzo's majestic figure of Christ, with windblown cloak, demonstrates a degree of foreshortening of the figure that had also been employed by Andrea Mantegna, but was not usual in the frescos of Florentine painters. In "The Last Judgement" Michelangelo had the opportunity to depict, on an unprecedented scale, figures in the action of either rising heavenward or falling and being dragged down.
In the two frescos of the Pauline Chapel, "The Crucifixion of St. Peter" and "The Conversion of Saul", Michelangelo has used the various groups of figures to convey a complex narrative. In the "Crucifixion of Peter" soldiers busy themselves about their assigned duty of digging a post hole and raising the cross while various people look on and discuss the events. A group of horrified women cluster in the foreground, while another group of Christians is led by a tall man to witness the events. In the right foreground, Michelangelo walks out of the painting with an expression of disillusionment.
Michelangelo's architectural commissions included a number that were not realised, notably the façade for Brunelleschi's Church of San Lorenzo in Florence, for which Michelangelo had a wooden model constructed, but which remains to this day unfinished rough brick. At the same church, Giulio de' Medici (later Pope Clement VII) commissioned him to design the Medici Chapel and the tombs of Giuliano and Lorenzo Medici. Pope Clement also commissioned the Laurentian Library, for which Michelangelo also designed the extraordinary vestibule with columns recessed into niches, and a staircase that appears to spill out of the library like a flow of lava, according to Nikolaus Pevsner, "... revealing Mannerism in its most sublime architectural form."
In 1546 Michelangelo produced the highly complex ovoid design for the pavement of the Campidoglio and began designing an upper storey for the Farnese Palace. In 1547 he took on the job of completing St Peter's Basilica, begun to a design by Bramante, and with several intermediate designs by several architects. Michelangelo returned to Bramante's design, retaining the basic form and concepts by simplifying and strengthening the design to create a more dynamic and unified whole. Although the late 16th-century engraving depicts the dome as having a hemispherical profile, the dome of Michelangelo's model is somewhat ovoid and the final product, as completed by Giacomo della Porta, is more so.
In his old age, Michelangelo created a number of "Pietàs" in which he apparently reflects upon mortality. They are heralded by the "Victory", perhaps created for the tomb of Pope Julius II but left unfinished. In this group, the youthful victor overcomes an older hooded figure, with the features of Michelangelo.
The "Pietà of Vittoria Colonna" is a chalk drawing of a type described as "presentation drawings", as they might be given as a gift by an artist, and were not necessarily studies towards a painted work. In this image, Mary's upraise arms and upraised hands are indicative of her prophetic role. The frontal aspect is reminiscent of Masaccio's fresco of the Holy Trinity in the Basilica of Santa Maria Novella, Florence.
In the "Florentine Pietà", Michelangelo again depicts himself, this time as the aged Nicodemus lowering the body of Jesus from the cross into the arms of Mary his mother and Mary Magdalene. Michelangelo smashed the left arm and leg of the figure of Jesus. His pupil Tiberio Calcagni repaired the arm and drilled a hole in which to fix a replacement leg which was not subsequently attached. He also worked on the figure of Mary Magdalene.
The last sculpture that Michelangelo worked on (six days before his death), the "Rondanini Pietà" could never be completed because Michelangelo carved it away until there was insufficient stone. The legs and a detached arm remain from a previous stage of the work. As it remains, the sculpture has an abstract quality, in keeping with 20th-century concepts of sculpture.
Michelangelo died in Rome in 1564, at the age of 88 (three weeks before his 89th birthday). His body was taken from Rome for interment at the Basilica of Santa Croce, fulfilling the maestro's last request to be buried in his beloved Florence.
Michelangelo, with Leonardo da Vinci and Raphael, is one of the three giants of the Florentine High Renaissance. Although their names are often cited together, Michelangelo was younger than Leonardo by 23 years, and older than Raphael by eight. Because of his reclusive nature, he had little to do with either artist and outlived both of them by more than forty years. Michelangelo took few sculpture students. He employed Francesco Granacci, who was his fellow pupil at the Medici Academy, and became one of several assistants on the Sistine Chapel ceiling. Michelangelo appears to have used assistants mainly for the more manual tasks of preparing surfaces and grinding colours. Despite this, his works were to have a great influence on painters, sculptors and architects for many generations to come.
While Michelangelo's "David" is the most famous male nude of all time and now graces cities around the world, some of his other works have had perhaps even greater impact on the course of art. The twisting forms and tensions of the "Victory", the "Bruges Madonna" and the "Medici Madonna" make them the heralds of the Mannerist art. The unfinished giants for the tomb of Pope Julius II had profound effect on late-19th- and 20th-century sculptors such as Rodin and Henry Moore.
Michelangelo's foyer of the Laurentian Library was one of the earliest buildings to utilise Classical forms in a plastic and expressive manner. This dynamic quality was later to find its major expression in Michelangelo's centrally planned St Peter's, with its giant order, its rippling cornice and its upward-launching pointed dome. The dome of St Peter's was to influence the building of churches for many centuries, including Sant'Andrea della Valle in Rome and St Paul's Cathedral, London, as well as the civic domes of many public buildings and the state capitols across America.
Artists who were directly influenced by Michelangelo include Raphael, whose monumental treatment of the figure in the "School of Athens" and "The Expulsion of Heliodorus from the Temple" owes much to Michelangelo, and whose fresco of "Isaiah" in Sant'Agostino closely imitates the older master's prophets. Other artists, such as Pontormo, drew on the writhing forms of the "Last Judgement" and the frescoes of the Capella Paolina.
The Sistine Chapel ceiling was a work of unprecedented grandeur, both for its architectonic forms, to be imitated by many Baroque ceiling painters, and also for the wealth of its inventiveness in the study of figures, an achievement that Vasari praised in the most extravagant terms.
|
https://en.wikipedia.org/wiki?curid=21019
|
Mecca
Makkah al-Mukarramah (), commonly shortened to Makkah and also known by its Latinized form, Mecca, is the holiest city in Islam and the capital of the Makkah Province of Saudi Arabia. The city is located inland from Jeddah on the Red Sea, in a narrow valley above sea level. Its last recorded population was 1,578,722 in 2015, and its estimated metro population in 2020 was 2.042 million, making it the third-most populated city in the kingdom. Pilgrims more than triple this number every year during the "Ḥajj" pilgrimage, observed in the twelfth Hijri month of "Dhūl-Ḥijjah".
Makkah is the birthplace of Muhammad. The Hira cave, located atop "Jabal al-Nur" ("Mountain of Light") just outside the city, is where Muslims believe the Qur'an was first revealed to Muhammad. Visiting Makkah for the Hajj is an obligation upon all able Muslims. The Great Mosque of Makkah, known as the "Masjid al-Haram", is home to the Ka'bah, believed by Muslims to have been built by Abraham and Ishmael. It is one of Islam's holiest sites and the direction of prayer for all Muslims ("qibla"), cementing Makkah's significance in Islam.
Muslim rulers from in and around the region long tried to take the city and keep it in their control; thus, much like most of the Hejaz region, the city has seen several regime changes over its rich history. The city was finally conquered in the Saudi conquest of Hejaz by Ibn Saud and his allies in 1925. Since then, Makkah has seen a tremendous expansion in size and infrastructure, with newer, modern buildings such as the Abraj Al Bait, the world's fourth-tallest building and third-largest by floor area, towering over the Great Mosque. The Saudi government has also carried out the destruction of several historical structures and archaeological sites, such as the Ajyad Fortress. Non-Muslims are strictly prohibited from entering the city.
Muslims from around the world visit the city, not only for the Hajj and Umrah pilgrimages, but also as tourists to visit regional landmarks such as the 'Aisha Mosque ("Masjid 'Aisha") and the sites visited by pilgrims in the Hajj and 'Umrah. Makkah is now home to two of the most expensive buildings in the world, the Masjid al-Haram, valued at 100 billion US dollars and the Abraj al-Bait complex, valued at 15 billion US dollars.
Makkah has been referred to by many names. As with many Arabic words, the etymology of "Makkah" is obscure. The name "Bakkah", widely believed to be a synonym for Makkah, is said to be more specifically the early name for the valley located therein, while Muslim scholars generally use it to refer to the sacred area of the city that immediately surrounds and includes the Ka'bah.
The Qur'an refers to the city as Bakkah in Surah Al Imran (3), verse 96, "Indeed the first House [of worship], established for mankind was that at Bakkah..."
This is presumed to have been the name of the city at the time of Abraham (Ibrahim in Islamic tradition), and it is also transliterated as Baca, Baka, Bakah, Bakka, Becca and Bekka, among others.
In South Arabic, the language in use in the southern portion of the Arabian Peninsula at the time of Muhammad, the "b" and "m" were interchangeable. This is presumed to have been the origin of the current form of the name. "Makkah" is the official transliteration used by the Saudi government and is closer to the Arabic pronunciation. The government adopted "Makkah" as the official spelling in the 1980s, but the spelling is not universally known or used worldwide. The full official name is Makkah al-Mukarramah (). "Makkah" is used to refer to the city in the Qur'an in Surah Al-Fath (48), verse 24.
The word "Mecca" in English has come to be used to refer to any place that draws large numbers of people, and because of this some English-speaking Muslims have come to regard the use of this spelling for the city as offensive. Mecca is the familiar form of the English transliteration for the Arabic name of the city,
The consensus in academic scholarship is that "Macoraba", the place mentioned in Arabia Felix by Claudius Ptolemy, is Makkah. Many etymologies have been proposed but the most suitable one is that it is derived from the Old South Arabian root "M-K-R-B" which means temple.
Another name used for Mecca in the Qur'an, at 6:92, is "Umm al-Qurā" (), meaning "Mother of all Settlements". The city has been called several other names in both the Qur'an and "ahadith". Another name used historically for Makkah is "Tihāmah". According to Arab and Islamic tradition, another name for Makkah, Fārān, is synonymous with the Desert of Paran mentioned in the Old Testament at Genesis 21:21. Arab and Islamic tradition holds that the wilderness of Paran, broadly speaking, is the Tihamah coastal plain, and that the site where Ishmael settled was Mecca. Yaqut al-Hamawi, the 12th–13th-century Syrian geographer, wrote that Fārān was "an arabized Hebrew word, one of the names of Mecca mentioned in the Torah."
In 2010, Makkah and the surrounding area became an important site for paleontology with respect to primate evolution, with the discovery of a "Saadanius" fossil. "Saadanius" is considered to be a primate closely related to the common ancestor of the Old World monkeys and apes. The fossil habitat, near what is now the Red Sea in western Saudi Arabia, was a damp forest area between 28 million and 29 million years ago. Paleontologists involved in the research hope to find further fossils in the area.
The early history of Mecca is still largely disputed, as there are no unambiguous references to it in ancient literature prior to the rise of Islam. The Roman Empire took control of part of the Hejaz in 106 CE, ruling cities such as Hegra (now known as Mada'in Saleh), located around 800 km (500 mi) north of Makkah. Even though detailed descriptions of Western Arabia were established by the Romans, such as by Procopius, there are no references to a pilgrimage and trading outpost such as Mecca.
The Greek historian Diodorus Siculus writes about Arabia in his work Bibliotheca historica, describing a holy shrine: "And a temple has been set up there, which is very holy and exceedingly revered by all Arabians". Claims have been made this could be a reference to the Ka'bah in Mecca. However, the geographic location Diodorus describes is located in northwest Arabia, around the area of Leuke Kome, closer to Petra and within the former Nabataean Kingdom and Roman province of Arabia Petraea.
Ptolemy lists the names of 50 cities in Arabia, one going by the name of "Macoraba". There has been speculation since 1646 that this could be a reference to Makkah, but many scholars see no compelling explanation to link the two names. Bowersock favors the identity of the former, his theory being that "Macoraba" is the word "Makkah" followed by the aggrandizing Aramaic adjective "rabb" (great). The Roman historian Ammianus Marcellinus also enumerated many cities of Western Arabia, most of which can be identified. According to Bowersock, he did mention Mecca, as "Geapolis" or "Hierapolis", the latter meaning "holy city", referring to the sanctuary of the Ka'bah, well known already in pagan times. Patricia Crone, from the Revisionist school of Islamic studies, on the other hand, writes that "the plain truth is that the name Macoraba has nothing to do with that of Mecca [...] if Ptolemy mentions Mecca at all, he calls it Moka, a town in Arabia Petraea" (in northwest Arabia near present-day Petra).
The first direct reference to Makkah in external literature occurs in 741 CE, in the Byzantine-Arab Chronicle, though here the author places it in Mesopotamia rather than the Hejaz. Given the inhospitable environment, and lack of historical references in Roman, Persian and Indian sources, historians including Patricia Crone and Tom Holland have cast doubt on the claim that Mecca was a major historical trading outpost. However, other scholars such as Glen W. Bowersock disagree and assert that Mecca was a major trading outpost.
Mecca is also mentioned in several early Qur'anic manuscripts.
In the Islamic view, the beginnings of Mecca are attributed to the Biblical figures, Abraham, Hagar and Ishmael. The civilization of Makkah is believed to have started after Ibrāhīm (Abraham) left his son Ismāʿīl (Ishmael) and wife Hājar (Hagar) in the valley at Allah's command. Some people from the Yemeni tribe of Jurhum settled with them, and Isma'il reportedly married two women, one after divorcing the first, on Ibrahim's advice. At least one man of the Jurhum helped Ismāʿīl and his father to construct or according to Islamic narratives, reconstruct, the "Ka'bah" ('Cube'), which would have social, religious, political and historical implications for the site and region.
Muslims see the mention of a pilgrimage at the Valley of Baca in the Old Testament chapter Psalm 84:3–6 as a reference to Makkah, similar to the Qur'an at Surah 3:96. In the "Sharḥ al-Asāṭīr", a commentary on the Samaritan midrashic chronology of the Patriarchs, of unknown date but probably composed in the 10th century CE, it is claimed that Makkah was built by the sons of Nebaioth, the eldest son of Ismāʿīl, or Ishmael.
Some Thamudic inscriptions discovered in southern Jordan contain the names of individuals such as "ʿAbd Mekkat" (, "Servant of Mecca").
Other inscriptions contained personal names such as "Makki" (, "Meccan"), but Jawwad Ali from the University of Baghdad suggested that there is also the possibility of a tribe named "Makkah".
By some time in the 5th century, the Ka'bah was a place of worship for the deities of Arabia's pagan tribes. Mecca's most important pagan deity was Hubal, which had been placed there by the ruling Quraysh tribe and remained until the Conquest of Makkah by Muhammad. In the 5th century, the Quraysh took control of Mecca and became skilled merchants and traders. In the 6th century, they joined the lucrative spice trade, since battles elsewhere were diverting trade routes from dangerous sea routes to more secure overland routes. The Byzantine Empire had previously controlled the Red Sea, but piracy had been increasing. Another previous route, which ran through the Persian Gulf via the Tigris and Euphrates rivers, was threatened by exploitations from the Sassanid Empire and was being disrupted by the Lakhmids, the Ghassanids, and the Roman–Persian Wars. Mecca's prominence as a trading center also surpassed the cities of Petra and Palmyra. The Sassanids, however, did not always pose a threat to Mecca, as in 575 CE they protected it from a Yemeni invasion led by its Christian leader Abraha. The tribes of southern Arabia asked the Persian king Khosrau I for aid, in response to which he came south to Arabia with foot-soldiers and a fleet of ships near Mecca.
By the middle of the 6th century, there were three major settlements in northern Arabia, all along the south-western coast that borders the Red Sea, in a habitable region between the sea and the Hejaz mountains to the east. Although the area around Mecca was completely barren, it was the wealthiest of the three settlements with abundant water from the renowned Zamzam Well and a position at the crossroads of major caravan routes.
The harsh conditions and terrain of the Arabian peninsula meant a near-constant state of conflict between the local tribes, but once a year they would declare a truce and converge upon Mecca in an annual pilgrimage. Up to the 7th century, this journey was intended for religious reasons by the pagan Arabs to pay homage to their shrine, and to drink Zamzam. However, it was also the time each year that disputes would be arbitrated, debts would be resolved, and trading would occur at Meccan fairs. These annual events gave the tribes a sense of common identity and made Makkah an important focus for the peninsula.
The "Year of the Elephant" is the name in Islamic history for the year approximately equating to 570 CE, when, according to Islamic sources such as Ibn Ishaq, Abraha descended upon Makkah, riding an elephant, with a large army after building a cathedral at San'aa, named "al-Qullays" in honor of the Negus of Axum. It gained widespread fame, even gaining attention from the Byzantine Empire. Abraha attempted to divert the pilgrimage of the Arabs from the Ka'bah to al-Qullays, effectively converting them to Christianity. According to Islamic tradition, this was the year of Muhammad's birth. Abraha allegedly sent a messenger named Muhammad ibn Khuza'i to Mecca and Tihamah with a message that al-Qullays was both much better than other houses of worship and purer, having not been defiled by the housing of idols. When Muhammad ibn Khuza'i got as far as the land of Kinana, the people of the lowland, knowing what he had come for, sent a man of Hudhayl called ʿUrwa bin Hayyad al-Milasi, who shot him with an arrow, killing him. His brother Qays who was with him, fled to Abraha and told him the news, which increased his rage and fury and he swore to raid the Kinana tribe and destroy the Ka'bah. Ibn Ishaq further states that one of the men of the Quraysh tribe was angered by this, and going to Sana'a, entering the church at night and defiling it; widely assumed to have done so by defecating in it.
Abraha marched upon the Ka'bah with a large army, which included one or more war elephants, intending to demolish it. When news of the advance of his army came, the Arab tribes of Quraysh, Kinanah, Khuza'a and Hudhayl united in the defense of the Ka'bah and the city. A man from the Himyarite Kingdom was sent by Abraha to advise them that Abraha only wished to demolish the Ka'bah and that, if they resisted, they would be crushed. Abdul-Muttalib told the Meccans to seek refuge in the hills while he and some members of the Quraysh remained within the precincts of the Ka'bah. Abraha sent a dispatch inviting Abdul-Muttalib to meet with him and discuss matters. When Abdul-Muttalib left the meeting he was heard saying, "The Owner of this House is its Defender, and I am sure he will save it from the attack of the adversaries and will not dishonor the servants of His House." Abraha eventually attacked Mecca. However, the lead elephant, known as Mahmud, is said to have stopped at the boundary around Mecca and refused to enter. It has been theorized that an epidemic, such as one of smallpox, could have caused the invasion of Mecca to fail. The reference to the story in the Qur'an is rather short. According to the 105th Surah of the Qur'an, Al-Fil, the next day a dark cloud of small birds sent by Allah appeared. The birds carried small rocks in their beaks and bombarded the Ethiopian forces, smashing them to a state like that of eaten straw.
Camel caravans, said to have first been used by Muhammad's great-grandfather, were a major part of Mecca's bustling economy. Alliances were struck between the merchants in Mecca and the local nomadic tribes, who would bring goods – leather, livestock, and metals mined in the local mountains – to Mecca to be loaded on the caravans and carried to cities in Shaam and Iraq. Historical accounts also provide some indication that goods from other continents may also have flowed through Mecca. Goods from Africa and the Far East passed through en route to Syria including spices, leather, medicine, cloth, and slaves; in return Mecca received money, weapons, cereals and wine, which in turn were distributed throughout Arabia. The Meccans signed treaties with both the Byzantines and the Bedouins, and negotiated safe passages for caravans, giving them water and pasture rights. Mecca became the center of a loose confederation of client tribes, which included those of the Banu Tamim. Other regional powers such as the Abyssinians, Ghassanids, and Lakhmids were in decline leaving Meccan trade to be the primary binding force in Arabia in the late 6th century.
Muhammad was born in Mecca in 570, and thus Islam has been inextricably linked with it ever since. He was born into a minor faction, the Banu Hashim, of the ruling Quraysh tribe. It was in Makkah, in the nearby mountain cave of Hira on Jabal al-Nour, that, according to Islamic tradition, Muhammad began receiving divine revelations from God through the archangel Jibreel in 610 AD. Advocating his form of Abrahamic monotheism against Meccan paganism, and after enduring persecution from the pagan tribes for 13 years, Muhammad emigrated in 622 with his companions, the "Muhajirun", to Yathrib (later renamed Madinah), an event known as the "hijrah". The conflict between the Quraysh and the Muslims is accepted to have begun at this point. Overall, Meccan efforts to annihilate Islam failed and proved to be costly and unsuccessful. During the Battle of the Trench in 627, the combined armies of Arabia were unable to defeat Muhammad's forces.
In 628, Muhammad and his followers wanted to enter Mecca for pilgrimage, but were blocked by the Quraysh. Subsequently, Muslims and Meccans entered into the Treaty of Hudaybiyyah, whereby the Quraysh and their allies promised to cease fighting Muslims and their allies, and promised that Muslims would be allowed into the city to perform the pilgrimage the following year. It was meant to be a ceasefire for 10 years; however, just two years later, the Banu Bakr, allies of the Quraysh, violated the truce by slaughtering a group of the Banu Khuza'ah, allies of the Muslims. Muhammad and his companions, now 10,000 strong, marched into Makkah and conquered the city. The pagan imagery was destroyed by Muhammad's followers and the location Islamized and rededicated to the worship of Allah alone. Mecca was declared the holiest site in Islam, ordaining it as the center of Muslim pilgrimage ("Hajj"), one of the faith's Five Pillars. Muhammad then returned to Medina, after assigning 'Akib ibn Usaid as governor of the city. His other activities in Arabia led to the unification of the peninsula under the banner of Islam. Muhammad died in 632. Within the next few hundred years, the lands under the banner of Islam stretched from North Africa into Asia and parts of Europe. As the Islamic realm grew, Makkah continued to attract pilgrims from all across the Muslim world and beyond, as Muslims came to perform the annual Hajj pilgrimage. Makkah also attracted a year-round population of scholars, pious Muslims who wished to live close to the Kaaba, and local inhabitants who served the pilgrims. Due to the difficulty and expense of the Hajj, pilgrims arrived by boat at Jeddah and came overland, or joined the annual caravans from Syria or Iraq.
Makkah was never the capital of any of the Islamic states. Muslim rulers did contribute to its upkeep, such as during the reigns of 'Umar (r. 634–644 CE) and 'Uthman ibn Affan (r. 644–656 CE) when concerns of flooding caused the caliphs to bring in Christian engineers to build barrages in the low-lying quarters and construct dykes and embankments to protect the area round the Kaaba.
Muhammad's return to Madinah shifted the focus away from Makkah, and the focus moved even further away when 'Ali, the fourth caliph, took power and chose Kufa as his capital. The Umayyad Caliphate moved the capital to Damascus in Syria and the Abbasid Caliphate to Baghdad, in modern-day Iraq, which remained the center of the Islamic Empire for nearly 500 years. Mecca re-entered Islamic political history during the Second Fitna, when it was held by Abdullah ibn az-Zubayr and the Zubayrids. The city was twice besieged by the Umayyads, in 683 and 692, and for some time thereafter the city figured little in politics, remaining a city of devotion and scholarship governed by various other factions. In 930, Mecca was attacked and sacked by the Qarmatians, a millenarian Shi'a Isma'ili Muslim sect led by Abū-Tāhir Al-Jannābī and centered in eastern Arabia. The Black Death pandemic hit Makkah in 1349.
One of the most famous travelers to Mecca in the 14th century was the Moroccan scholar and traveler Ibn Battuta. In his "rihla" (account), he provides a vast description of the city. Around the year 1327 CE, or 729 AH, Ibn Battuta arrived at the holy city. Immediately, he says, it felt like a holy sanctuary, and thus he started the rites of the pilgrimage. He remained in Mecca for three years and left in 1330 CE. During his second year in the holy city, he says his caravan arrived "with a great quantity of alms for the support of those who were staying in Mecca and Medina". While in Makkah, prayers were made at the Ka'bah for (not to) the King of Iraq and also for Salaheddin al-Ayyubi, Sultan of Egypt and Syria. Battuta says the Ka'bah was large, but was destroyed and rebuilt smaller than the original, and that it contained images of angels and prophets including Jesus, his mother Mary and many others. Battuta describes the Ka'bah as an important part of Makkah due to the fact that many people make the pilgrimage to it. Battuta describes the people of the city as being humble and kind, and also willing to give a part of everything they had to someone who had nothing. The inhabitants of Makkah and the village itself, he says, were very clean. There was also a sense of elegance to the village.
In 1517, the then Sharif of Makkah, Barakat bin Muhammad, acknowledged the supremacy of the Ottoman Caliph but retained a great degree of local autonomy. In 1803 the city was captured by the First Saudi State, which held Mecca until 1813, destroying some of the historic tombs and domes in and around the city. The Ottomans assigned the task of bringing Makkah back under Ottoman control to their powerful "Khedive" (viceroy) and "Wali" of Egypt, Muhammad Ali Pasha, who successfully returned Mecca to Ottoman control in 1813. In 1818, the House of Saud was defeated again but survived and founded the Second Saudi State, which lasted until 1891 and led on to the present country of Saudi Arabia. In 1853, Sir Richard Francis Burton undertook the Muslim pilgrimage to Mecca and Medina disguised as a Muslim. Although Burton was certainly not the first non-Muslim European to make the "Hajj" (Ludovico di Varthema did so in 1503), his pilgrimage remains one of the most famous and documented of modern times. Mecca was regularly hit by cholera outbreaks: between 1830 and 1930, cholera broke out among pilgrims at Makkah 27 times.
In World War I, the Ottoman Empire was at war with Britain and its allies. It had successfully repulsed an attack on Istanbul in the Gallipoli Campaign and on Baghdad in the Siege of Kut. The British agent T. E. Lawrence conspired with the Ottoman governor Hussein bin Ali, the Sharif of Mecca, to revolt against the Ottoman Empire, and Mecca was the first city captured by his forces in the 1916 Battle of Makkah. The Sharif's revolt proved a turning point of the war on the eastern front. Hussein declared a new state, the Kingdom of Hejaz, proclaiming himself its king with Makkah as his capital. News reports in November 1916, via contact in Cairo with returning Hajj pilgrims, stated that with the Ottoman Turkish authorities gone, the Hajj of 1916 was free of the massive extortion and monetary demands previously made by the Turks acting as agents of the Ottoman government.
Following the 1924 Battle of Makkah, the Sharif of Mecca was overthrown by the Saud family, and Makkah was incorporated into Saudi Arabia. Under Saudi rule, much of the historic city has been demolished as a result of the Saudi government's fear that these sites might become objects of association in worship beside Allah ("shirk"). The city has been expanded to include several towns previously considered separate from the holy city, and now lies just a few kilometers from the main sites of the Hajj: Mina, Muzdalifah and Arafat. Makkah itself is not served by any airport, due to concerns about the city's safety. It is instead served by King Abdulaziz International Airport in Jeddah (approximately 70 km away) for international flights and Ta'if Regional Airport (approximately 120 km away) for domestic flights.
The city today sits at the junction of the two most important highways in the Saudi Arabian highway system: Highway 40, which connects the city to Jeddah in the west and to the capital Riyadh and Dammam in the east, and Highway 15, which connects it to Madinah, Tabuk and onward to Jordan in the north, and to Abha and Jizan in the south. The Ottomans had planned to extend their railway network to the holy city, but were forced to abandon this plan due to their involvement in the First World War. This plan was later carried out by the Saudi government, which connected the two holy cities of Madinah and Makkah with the modern Haramain high-speed railway, which runs at 300 km/h (190 mph) and connects the two cities within two hours via Jeddah, King Abdulaziz International Airport and King Abdullah Economic City near Rabigh.
The haram area of Makkah, in which the entry of non-Muslims is forbidden, is much larger than that of Madinah.
On 20 November 1979, two hundred armed dissidents led by Juhayman al-Otaibi seized the Grand Mosque, claiming that the Saudi royal family no longer represented pure Islam and that the Masjid al-Haram and the Ka'bah must be held by those of true faith. The rebels seized tens of thousands of pilgrims as hostages and barricaded themselves in the mosque. The siege lasted two weeks and resulted in several hundred deaths and significant damage to the shrine, especially the Safa-Marwah gallery. A multinational force was finally able to retake the mosque from the dissidents. Since then, the Grand Mosque has been expanded several times, with further expansions still being undertaken today.
Under Saudi rule, it has been estimated that since 1985, about 95% of Mecca's historic buildings, most over a thousand years old, have been demolished. It has been reported that there are now fewer than 20 structures remaining in Makkah that date back to the time of Muhammad. Some important buildings that have been destroyed include the house of Khadijah, the wife of Muhammad, the house of Abu Bakr, Muhammad's birthplace and the Ottoman-era Ajyad Fortress. The reason for much of the destruction of historic buildings has been for the construction of hotels, apartments, parking lots, and other infrastructure facilities for Hajj pilgrims.
Mecca has been the site of several incidents and failures of crowd control because of the large numbers of people who come to make the Hajj. For example, on 2 July 1990, a pilgrimage to Mecca ended in tragedy when the ventilation system failed in a crowded pedestrian tunnel and 1,426 people were either suffocated or trampled to death in a stampede. On 24 September 2015, at least 700 pilgrims were killed in a stampede at Mina during the stoning-the-Devil ritual at Jamarat.
Makkah is governed by the Makkah Regional Municipality, a municipal council of 14 locally elected members headed by a mayor (called "Amin" in Arabic) appointed by the Saudi government; the mayor of the city is Dr. Osama bin Fadhel Al-Barr. Makkah is the capital of the Makkah Province, which includes the neighboring cities of Jeddah and Ta'if, even though Jeddah is considerably larger in population than Makkah. Prince Khalid bin Faisal Al Saud has served as Provincial Governor since his appointment on 16 May 2007.
Makkah holds an important place in Islam and is the holiest city in all branches of the religion. The city derives its importance from the role it plays in the Hajj and 'Umrah.
The "Masjid al-Haram" is the largest mosque in the world and the most expensive single building in the entire world, valued at 100 billion US dollars, as of 2020. It is the site of two of the most important rites of both the Hajj and of the Umrah, the circumambulation around the Ka'bah ("tawaf") and the walking between the two mounts of Safa and Marwa ("sa'ee"). The masjid is also the site of the Zamzam Well. According to Islamic tradition, a prayer in the masjid is equal to 100,000 prayers in any other masjid around the world.
There is a difference of opinion among Islamic scholars as to who first built the Ka'bah: some believe it was built by the angels, while others believe it was built by Adam. Regardless, it was rebuilt several times before reaching its current state, the most famous of these renovations being the one by Abraham (Ibrahim in Islamic tradition). The Ka'bah is also the common direction of prayer ("qibla") for all Muslims. The surface surrounding the Ka'bah, on which Muslims circumambulate it, is known as the Mataf.
The Black Stone is a stone considered by scientists to be a meteorite or of similar origin, and believed by Muslims to be of divine origin. It is set in a corner of the Ka'bah, and it is Sunnah to touch and kiss the stone. The area around the stone is almost always crowded and is guarded by policemen to ensure the pilgrims' safety.
The Maqam Ibrahim (Station of Abraham) is a stone on which Muslims believe Abraham stood to build the higher parts of the Ka'bah. It bears two footprints that are comparatively larger than average modern-day human feet. The stone is raised and housed in a golden hexagonal chamber beside the Ka'bah on the Mataf plate.
Muslims believe that in the divine revelation to Muhammad, the Qur'an, Allah describes the mountains of Safa and Marwah as symbols of his divinity. Walking between the two mountains seven times (four times from Safa to Marwah and three times from Marwah to Safa, alternating direction) is considered a mandatory pillar ("rukn") of 'Umrah.
The Hajj pilgrimage, also called the greater pilgrimage, attracts millions of Muslims from all over the world and almost triples Makkah's population for one week in the twelfth and final Islamic month of "Dhu al-Hijjah". In 2019, the Hajj attracted 2,489,406 pilgrims to the holy city. Every adult, healthy Muslim who has the financial and physical capacity to travel to Mecca must perform the Hajj at least once in a lifetime. The 'Umrah, or lesser pilgrimage, can be performed at any time during the year; it is not obligatory, but is recommended in the Quran.
In addition to the "Masjid al-Haram", pilgrims also must visit the nearby towns of Mina/Muna, Muzdalifah and 'Arafah/'Arafat for varying rituals that are part of the Hajj.
Jabal al-Nour (the "Mountain of Light") is believed by Muslims to have been the place where Muhammad spent his time in seclusion away from the bustling city of Makkah. The mountain is located at the eastern entrance of the city and, at 642 meters (2,106 feet), is the highest point in the city.
The Cave of Hira, on this mountain, is the place where Muslims believe Muhammad received the first revelation from Allah through the archangel Gabriel (Jibril in Islamic tradition) at the age of 40.
The Makkah Gate, known popularly as the Qur'an Gate, stands at the western entrance to the city, on the road from Jeddah. Located on Highway 40, it marks the boundary of the Haram area, which non-Muslims are prohibited from entering. The gate was designed in 1979 by an Egyptian architect, Samir Elabd, for the architectural firm IDEA Center. The structure is that of a book, representing the Quran, sitting on a "rehal", or bookrest.
Makkah is located in the Hejaz region, a 200 km (124 mi) wide strip of mountains separating the Nafud desert from the Red Sea. The city is situated in a valley of the same name around 70 km (44 mi) east of the port city of Jeddah. Makkah is one of the lowest-lying cities in the Hejaz region, at an elevation of 277 m (909 ft) above sea level, at 21°23' north latitude and 39°51' east longitude. The city is divided into 34 districts.
The city centers on the al-Haram area, which contains the Masjid al-Haram. The area around the mosque is the old city and contains the most famous district of Makkah, Ajyad. The main street running to "al-Haram" is Ibrahim al-Khalil Street, named after Ibrahim. Traditional, historic homes built of local rock, two to three stories high, are still present within the city's central area, within view of modern hotels and shopping complexes.
Mecca is at an elevation of 277 m (909 ft) above sea level, and approximately 70 km (44 mi) inland from the Red Sea, making it one of the lowest-lying cities in the Hejaz region.
The city center lies in a corridor between mountains, which is often called the "Hollow of Makkah". The area contains the valley of al-Taneem, the valley of Bakkah and the valley of Abqar. This mountainous location has defined the contemporary expansion of the city.
In pre-modern Mecca, the city relied on a few chief sources of water. The first was local wells, such as the Zamzam Well, that produced generally brackish water. The second was the spring of 'Ayn Zubaydah (Spring of Zubaydah), whose sources are the mountains of Jabal Sa'd and Jabal Kabkāb, a few kilometers east of 'Arafah/'Arafat, to the southeast of Mecca. Water was transported from it using underground channels. A very sporadic third source was rainfall, which was stored by the people in small reservoirs or cisterns. The rainfall, scant as it is, also presents the threat of flooding and has been a danger since earliest times. According to al-Kurdī, there had been 89 floods by 1965. In the last century, the most severe flood was that of 1942. Since then, dams have been built to ameliorate the problem.
Mecca features a hot desert climate (Köppen: "BWh"), spanning three plant hardiness zones: 10, 11 and 12. Like most Saudi Arabian cities, Mecca retains warm to hot temperatures even in winter, with mild nights and hot afternoons; only very rarely do temperatures fall to or below freezing. Summer afternoon temperatures are extremely hot, easing only somewhat in the evening. Rain usually falls in Mecca in small amounts scattered between November and January, with heavy thunderstorms also common during the winter.
The Meccan economy has been heavily dependent on the annual pilgrimage. Income generated from the Hajj not only powers the Meccan economy but has historically had far-reaching effects on the economy of the entire Arabian Peninsula. The income was generated in a number of ways. One method was taxing the pilgrims; taxes were especially increased during the Great Depression, and many of them persisted as late as 1972. Another way the Hajj generates income is through services to pilgrims. For example, the Saudi flag carrier, Saudia, generates 12% of its income from the pilgrimage. Fares paid by pilgrims to reach Makkah by land also generate income, as do the hotels and lodging companies that house them. The city takes in more than $100 million, while the Saudi government spends about $50 million on services for the Hajj. There are some industries and factories in the city, but Mecca no longer plays a major role in Saudi Arabia's economy, which is mainly based on oil exports. The few industries operating in Mecca include textiles, furniture, and utensils. The majority of the economy is service-oriented.
Nevertheless, many industries have been set up in Mecca. Various types of enterprises that have existed since 1970 in the city include corrugated iron manufacturing, copper extraction, carpentry, upholstery, bakeries, farming and banking. The city has grown substantially in the 20th and 21st centuries, as the convenience and affordability of jet travel has increased the number of pilgrims participating in the Hajj. Thousands of Saudis are employed year-round to oversee the Hajj and staff the hotels and shops that cater to pilgrims; these workers in turn have increased the demand for housing and services. The city is now ringed by freeways, and contains shopping malls and skyscrapers.
Formal education started to be developed in the late Ottoman period, continuing slowly into Hashemite times. The first major attempt to improve the situation was made by a Jeddah merchant, Muhammad ʿAlī Zaynal Riḍā, who founded the Madrasat al-Falāḥ in Makkah in 1911–12 at a cost of £400,000. The school system in Makkah has many public and private schools for both males and females. As of 2005, there were 532 public and private schools for male students and another 681 for female students. The medium of instruction in both public and private schools is Arabic, with emphasis on English as a second language, but some private schools founded by foreign entities, such as international schools, use English as the medium of instruction. Some schools are coeducational, while others are not. For higher education, the city has only one university, Umm Al-Qura University, which was established in 1949 as a college and became a public university in 1979.
Healthcare is provided by the Saudi government free of charge to all pilgrims. There are ten main hospitals in Mecca.
There are also many walk-in clinics available for both residents and pilgrims. Several temporary clinics are set up during the Hajj to tend to wounded pilgrims.
In February 2020, Saudi Arabia temporarily banned foreigners from entering Mecca and Medina to mitigate the COVID-19 pandemic in the Kingdom.
Makkah's culture has been affected by the large number of pilgrims that arrive annually, and thus boasts a rich cultural heritage. As a result of the vast numbers of pilgrims coming to the city each year, Mecca has become by far the most diverse city in the Muslim world. In contrast to the rest of Saudi Arabia, and particularly Najd, Makkah has, according to "The New York Times", become "a striking oasis of free thought and discussion and, also, of unlikely liberalism as Meccans see themselves as a bulwark against the creeping extremism that has overtaken much Islamic debate".
In pre-modern Mecca, the most common sports were impromptu wrestling and foot races. Football is now the most popular sport in Makkah and the kingdom, and the city hosts some of the oldest sports clubs in Saudi Arabia, such as Al Wahda FC (established in 1945). King Abdulaziz Stadium is the largest stadium in Makkah, with a capacity of 38,000.
Al Baik, a local fast-food chain, is very popular among pilgrims and locals alike. Until 2018, it was available only in Makkah, Madinah and Jeddah, and it was common for people to travel to Jeddah just to get a taste of its fried chicken.
Makkah is very densely populated. Most long-term residents live in the Old City, the area around the Great Mosque, and many work in what is known locally as the "Hajj" industry, supporting pilgrims. 'Iyad Madani, the Saudi Arabian Minister for Hajj, was quoted as saying, "We never stop preparing for the Hajj."
Year-round, pilgrims stream into the city to perform the rites of 'Umrah, and during the last weeks of the eleventh Islamic month, Dhu al-Qi'dah, on average 2–4 million Muslims arrive in the city to take part in the rites known as the Hajj. Pilgrims come from varying ethnicities and backgrounds, mainly South and Southeast Asia, Europe and Africa, and many have remained and become residents of the city. The Burmese are an older, more established community, numbering roughly 250,000. In addition, the discovery of oil in the past 50 years has brought hundreds of thousands of working immigrants.
Non-Muslims are not permitted to enter Mecca under Saudi law, and using fraudulent documents to do so may result in arrest and prosecution. The prohibition extends to Ahmadis, as they are considered non-Muslims. Nevertheless, many non-Muslims and Ahmadis have visited the city as these restrictions are loosely enforced. The first such recorded example of a non-Muslim entering the city is that of Ludovico di Varthema of Bologna in 1503. Guru Nanak Sahib, the founder of Sikhism, visited Makkah in December 1518. One of the most famous was Richard Francis Burton, who traveled as a Qadiriyya Sufi from Afghanistan in 1853.
Makkah Province is the only province where expatriates outnumber Saudis.
The first press was brought to Mecca in 1885 by Osman Nuri Pasha, an Ottoman Wāli. During the Hashemite period, it was used to print the city's official gazette, "Al Qibla". The Saudi regime expanded this press into a larger operation, introducing the new Saudi official gazette of Makkah, "Umm al-Qurā". Makkah also has its own paper owned by the city, "Al Nadwa". However, other Saudi newspapers are also provided in Makkah such as the "Saudi Gazette", "Al Madinah", "Okaz" and "Al Bilad," in addition to other international newspapers.
Telecommunications in the city were emphasized early under Saudi rule. King Abdulaziz pressed them forward as he saw them as a means of convenience and better governance. While under Hussein bin Ali there were about 20 public telephones in the entire city, in 1936 the number jumped to 450, about half the telephones in the country. During that time, telephone lines were extended to Jeddah and Ta'if, but not to the capital, Riyadh. By 1985, Mecca, like other Saudi cities, possessed modern telephone, telex, radio and television communications. Television stations serving the city area include Saudi TV1, Saudi TV2, Saudi TV Sports, Al-Ekhbariya, the Arab Radio and Television Network and various cable, satellite and other specialty television providers.
Limited radio communication was established within the Kingdom under the Hashemites. In 1929, wireless stations were set up in various towns in the region, creating a network that would become fully functional by 1932. Soon after World War II, the existing network was greatly expanded and improved. Since then, radio communication has been used extensively in directing the pilgrimage and addressing the pilgrims. This practice started in 1950, with the initiation of broadcasts on the Day of 'Arafah (9 Dhu al-Hijjah), and increased until 1957, at which time Radio Makkah became the most powerful station in the Middle East at 50 kW. Later, power was increased 9-fold to 450 kW. Music was not immediately broadcast, but gradually folk music was introduced.
The only airport close to Makkah is the Mecca East airport, which is not active. Makkah is primarily served by King Abdulaziz International Airport in Jeddah for international and regional connections and by Ta'if Regional Airport for regional connections. To cater to the large number of Hajj pilgrims, the Jeddah airport has a dedicated Hajj Terminal, specifically for use in the "Hajj" season, which can accommodate 47 planes simultaneously and can receive 3,800 pilgrims per hour.
Makkah, similar to Madinah, lies at the junction of two of the most important highways in Saudi Arabia. Highway 40 connects it to the important port city of Jeddah in the west and to the capital, Riyadh, and the other major port city, Dammam, in the east. Highway 15 connects Makkah to the other holy Islamic city of Madinah, approximately 400 km (250 mi) to the north, and onward to Tabuk and Jordan, while to the south it connects Makkah to Abha and Jizan. Makkah is served by four ring roads, which are very crowded compared to the three ring roads of Madinah.
The Al Masha'er Al Muqaddassah Metro is a metro line in Mecca, opened on 13 November 2010. The 18.1-kilometer (11.2-mile) elevated metro transports pilgrims to the holy sites of 'Arafat, Muzdalifah and Mina to reduce congestion on the roads, and is operational only during the "Hajj" season. It consists of nine stations, three in each of the aforementioned towns.
The Makkah Metro, officially known as Makkah Mass Rail Transit, is a planned four-line metro system for the city. This will be in addition to the Al Masha'er Al Muqaddassah Metro which carries pilgrims during Hajj.
In 2018, a high-speed intercity rail line, the Haramain high-speed railway, entered operation, connecting the holy cities of Makkah and Madinah via Jeddah, King Abdulaziz International Airport and King Abdullah Economic City in Rabigh. The railway consists of 35 electric trains and is capable of transporting 60 million passengers annually. Each train can achieve speeds of up to 300 km/h (190 mph), traveling a total distance of 450 km (280 mi) and reducing the travel time between the two cities to less than two hours. It was built by a business consortium from Spain.
|
https://en.wikipedia.org/wiki?curid=21021
|
Mormonism
Mormonism is the predominant religious tradition of the Latter Day Saint movement of Restorationist Christianity started by Joseph Smith in Western New York in the 1820s and 30s.
The word "Mormon" originally derived from the Book of Mormon, a religious text published by Smith, which he said he translated from golden plates with divine assistance. The book describes itself as a chronicle of early indigenous peoples of the Americas and their dealings with God. Based on the book's name, Smith's early followers were more widely known as "Mormons", and their faith "Mormonism". The term was initially considered pejorative, but Mormons no longer consider it so (although generally preferring other terms such as Latter-day Saint or LDS).
After Smith was killed in 1844, most Mormons followed Brigham Young on his westward journey to the area that became the Utah Territory, calling themselves The Church of Jesus Christ of Latter-day Saints (LDS Church). Other sects include Mormon fundamentalism, which seeks to maintain practices and doctrines such as polygamy, and other small independent denominations. The second-largest Latter Day Saint denomination, the Reorganized Church of Jesus Christ of Latter-day Saints, since 2001 called the Community of Christ, does not describe itself as "Mormon", but follows a Trinitarian Christian restorationist theology, and considers itself Restorationist in terms of Latter Day Saint doctrine.
Mormonism has common beliefs with the rest of the Latter Day Saint movement, including the use of and belief in the Bible, and in other religious texts including the Book of Mormon and Doctrine and Covenants. It also accepts the Pearl of Great Price as part of its scriptural canon, and has a history of teaching eternal marriage, eternal progression and polygamy (plural marriage), although the LDS Church formally abandoned the practice of plural marriage in 1890. Cultural Mormonism, a lifestyle promoted by Mormon institutions, includes cultural Mormons who identify with the culture, but not necessarily the theology.
Mormonism originated in the 1820s in western New York during a period of religious excitement known as the Second Great Awakening. After praying about which denomination he should join, Joseph Smith, Jr. said he received a vision in the spring of 1820. Called the "First Vision", Smith said that God the Father and His son Jesus Christ appeared to him and instructed him to join none of the existing churches because they were all wrong. During the 1820s Smith reported several angelic visitations, and was eventually told that God would use him to re-establish the true Christian church, and that the Book of Mormon would be the means of establishing correct doctrine for the restored church. Smith, Oliver Cowdery, and other early followers began baptizing new converts in 1829, and the church was formally organized in 1830 as the Church of Christ. Smith was seen by his followers as a modern-day prophet.
Joseph Smith said the Book of Mormon was translated from writing on golden plates in a reformed Egyptian language, with the assistance of the Urim and Thummim and seer stones; both the special spectacles and the seer stone were at times referred to as the "Urim and Thummim". He said an angel first showed him the location of the plates in 1823, buried in a nearby hill, but he was not allowed to take the plates until 1827. Smith began dictating the text of the Book of Mormon around the fall of 1827 and continued until the summer of 1828, when 116 pages were lost. Translation began again in April 1829 and finished in June 1829; Smith said that he translated the text "by the gift and power of God". Oliver Cowdery acted as scribe for the majority of the translation. After the translation was completed, Smith said the plates were returned to the angel. During the period Smith said he possessed the plates, very few people were allowed to "witness" them.
The book describes itself as a chronicle of an early Israelite diaspora, integrating with the pre-existing indigenous peoples of the Americas, written by a people called the Nephites. According to the Book of Mormon, Lehi's family left Jerusalem at the urging of God c. 600 BC, and later sailed to the Americas c. 589 BC. The Nephites are described as descendants of Nephi, the fourth son of the prophet Lehi, and are portrayed as having a belief in Christ hundreds of years before his birth. The historical accuracy and veracity of the Book of Mormon were and continue to be hotly contested. No archaeological, linguistic, or other evidence of the use of Egyptian writing in ancient America has been discovered.
To avoid confrontation with New York residents, the members moved to Kirtland, Ohio, and hoped to establish a permanent New Jerusalem or City of Zion in Jackson County, Missouri. However, they were expelled from Jackson County in 1833 and fled to other parts of Missouri in 1838. Violence between the Missourians and church members resulted in the governor of Missouri issuing an "extermination order," again forcing the church to relocate. The displaced Mormons fled to Illinois, to a small town called Commerce. The church bought the town, renamed it Nauvoo, and lived with a degree of peace and prosperity for a few years. However, tensions between Mormons and non-Mormons again escalated, and in 1844 Smith was killed by a mob, precipitating a succession crisis.
The largest group of Mormons (LDS Church) accepted Brigham Young as the new prophet/leader and emigrated to what became the Utah Territory. There, the church began the open practice of plural marriage, a form of polygyny which Smith had instituted in Nauvoo. Plural marriage became the faith's most sensational characteristic during the 19th century, but vigorous opposition by the United States Congress threatened the church's existence as a legal institution. Further, polygamy was also a major cause for the opposition to Mormonism in the states of Idaho and Arizona. In the 1890 Manifesto, church president Wilford Woodruff announced the official end of plural marriage.
Because of the formal abolition of plural marriage in 1890, several smaller groups of Mormons broke with the LDS Church, forming several denominations of Mormon fundamentalism. Meanwhile, the LDS Church became a proponent of monogamy and patriotism, extended its reach internationally through a vigorous missionary program, and grew in size to a reported membership of over 16 million. The church is becoming a part of the American and international mainstream. However, it consciously and intentionally retains its identity as a "peculiar people," believing its members' unique relationship with God helps save them from "worldliness" (non-spiritual influences).
Like most other Christian groups, Mormonism teaches that the Father, the Son, and the Holy Spirit exist, but unlike trinitarian faiths, the LDS Church teaches that they are separate and distinct beings, with the Father and Son having perfected physical bodies and the Holy Ghost having only a body of spirit. While the three beings are physically distinct, in Mormon theology they are one in thoughts, actions, and purpose, and are commonly referred to collectively as the "Godhead". Also, Mormonism teaches that God the Father is the literal father of the spirits of all men and women, which existed prior to their mortal existence. The LDS Church also believes that a Heavenly Mother exists. Further, it is believed that all humans, as children of God, can become exalted, inheriting all that God has, as joint-heirs with Christ, and becoming like him as a God. Lorenzo Snow is quoted as saying, "As man now is, God once was; as God now is, man may be."
Mormonism describes itself as falling within world Christianity, but as a distinct restored dispensation; it characterizes itself as the only true form of the Christian religion since the time of a Great Apostasy that began not long after the ascension of Jesus Christ. According to Mormons this Apostasy involved the corruption of the pure, original Christian doctrine with Greek and other philosophies, and followers dividing into different ideological groups.
Additionally, Mormons claim the martyrdom of the Apostles led to the loss of Priesthood authority to administer the Church and its ordinances.
Mormons believe that God re-established the early Christian Church as found in the New Testament through Joseph Smith. In particular, Mormons believe that angels such as Peter, James, John, and John the Baptist appeared to Joseph Smith and others and bestowed various Priesthood authorities on them. Mormons thus believe that their Church is the "only true and living church" because divine authority was restored to it through Smith. In addition, Mormons believe that Smith and his legitimate successors are modern prophets who receive revelation from God to guide the church. They maintain that other religions have a portion of the truth and are guided by the light of Christ.
Smith's cosmology is laid out mostly in his later revelations and sermons, particularly the Book of Abraham, the Book of Moses, and the King Follett discourse. Mormon cosmology presents a unique view of God and the universe, and places a high importance on human agency. In Mormonism, life on earth is just a short part of an eternal existence. Mormons believe that in the beginning, all people existed as spirits or "intelligences" in the presence of God. In this state, God proposed a plan of salvation whereby they could progress and "have a privilege to advance like himself." The spirits were free to accept or reject this plan, and a "third" of them, led by Satan, rejected it. The rest accepted the plan, coming to earth and receiving bodies with an understanding that they would experience sin and suffering.
In Mormonism, the central part of God's plan is the atonement of Jesus Christ. Mormons believe that one purpose of earthly life is to learn to choose good over evil. In this process, people inevitably make mistakes, becoming unworthy to return to the presence of God. Mormons believe that Jesus paid for the sins of the world and that all people can be saved through his atonement. Mormons accept Christ's atonement through faith, repentance, formal covenants or ordinances such as baptism, and consistently trying to live a Christ-like life.
According to Mormon scripture, the Earth's creation was not "ex nihilo", but organized from existing matter. The Earth is just one of many inhabited worlds, and there are many governing heavenly bodies, including the planet or star Kolob, which is said to be nearest the throne of God.
In Mormonism, an ordinance is a religious ritual of special significance, often involving the formation of a covenant with God. Ordinances are performed by the authority of the priesthood and in the name of Jesus Christ. The term has a meaning roughly similar to that of the term "sacrament" in other Christian denominations.
Saving ordinances (or ordinances viewed as necessary for salvation) include: baptism by immersion after the age of accountability (normally age 8); confirmation and reception of the gift of the Holy Ghost, performed by laying hands on the head of a newly baptized member; ordination to the Aaronic and Melchizedek priesthoods for males; an endowment (including washing and anointing) received in temples; and marriage (or sealing) to a spouse.
Mormons also perform other ordinances, which include the Lord's supper (commonly called the sacrament), naming and blessing children, giving priesthood blessings and patriarchal blessings, anointing and blessing the sick, participating in prayer circles, and setting apart individuals who are called to church positions.
In Mormonism, the saving ordinances are seen as necessary for salvation, but they are not sufficient in and of themselves. For example, baptism is required for exaltation, but simply having been baptized does not guarantee any eternal reward. The baptized person is expected to be obedient to God's commandments, to repent of any sinful conduct subsequent to baptism, and to receive the other saving ordinances.
Because Mormons believe that everyone must receive certain ordinances to be saved, Mormons perform ordinances on behalf of deceased persons. These ordinances are performed vicariously or by "proxy" on behalf of the dead. In accordance with their belief in each individual's "free agency", living or dead, Mormons believe that the deceased may accept or reject the offered ordinance in the spirit world, just as all spirits decided to accept or reject God's plan originally. In addition, these "conditional" ordinances on behalf of the dead are performed only when a deceased person's genealogical information has been submitted to a temple and correctly processed there before the ordinance ritual is performed. Only ordinances for salvation are performed on behalf of deceased persons.
Mormons believe in the Old and New Testaments, and the LDS Church uses the King James Bible as its official scriptural text of the Bible. While Mormons believe in the general accuracy of the modern-day text of the Bible, they also believe that it is incomplete and that errors have been introduced. In Mormon theology, many lost truths are restored in the Book of Mormon, which Mormons hold to be divine scripture and equal in authority to the Bible.
The Mormon scriptural canon also includes a collection of revelations and writings contained in the Doctrine and Covenants, which contains doctrine and prophecy, and the Pearl of Great Price, which briefly addresses material from Genesis to Exodus. These books, as well as the Joseph Smith Translation of the Bible, have varying degrees of acceptance as divine scripture among different denominations of the Latter Day Saint movement.
In Mormonism, continuous revelation is the principle that God or his divine agents still continue to communicate to mankind. This communication can be manifest in many ways: influences of the Holy Ghost (the principal form in which this principle is manifest), visions, visitations of divine beings, and others. Joseph Smith used the example of the Lord's revelations to Moses in Deuteronomy to explain the importance of continuous revelation.
Mormons believe that Smith and subsequent church leaders could speak scripture "when moved upon by the Holy Ghost." In addition, many Mormons believe that ancient prophets in other regions of the world received revelations that resulted in additional scriptures that have been lost and may, one day, be forthcoming. In Mormonism, revelation is not limited to church members. For instance, Latter Day Saints believe that the United States Constitution is a divinely inspired document.
Mormons are encouraged to develop a personal relationship with the Holy Ghost and receive personal revelation for their own direction and that of their family. The Latter Day Saint concept of revelation includes the belief that revelation from God is available to all those who earnestly seek it with the intent of doing good. It also teaches that everyone is entitled to "personal" revelation with respect to his or her stewardship (leadership responsibility). Thus, parents may receive inspiration from God in raising their families, individuals can receive divine inspiration to help them meet personal challenges, and church officers may receive revelation for those whom they serve.
The important consequence of this is that each person may receive confirmation that particular doctrines taught by a prophet are true, as well as gain divine insight in using those truths for their own benefit and eternal progress. In the church, personal revelation is expected and encouraged, and many converts believe that personal revelation from God was instrumental in their conversion.
Mormonism categorizes itself within Christianity, and nearly all Mormons self-identify as Christian. For some who define Christianity within the doctrines of Catholicism, Eastern Orthodoxy, and Protestantism, Mormonism's differences place it outside the umbrella of Christianity.
Since its beginnings, the faith has proclaimed itself to be Christ's Church restored with its original authority, structure and power, maintaining that existing denominations believed in incorrect doctrines and were not acknowledged by God as his church and kingdom. Though the religion quickly gained a large following of Christian seekers, in the 1830s many American Christians came to view the church's early doctrines and practices as politically and culturally subversive, as well as doctrinally heretical, abominable, and condemnable. This discord led to a series of sometimes-deadly conflicts between Mormons and others who saw themselves as orthodox Christians. Although such violence declined during the twentieth century, the religion's unique doctrinal views and practices still generate criticism, sometimes vehemently so. This has given rise to efforts by Mormons and Christians of opposing views to proselytize each other.
Mormons believe in Jesus Christ as the literal Son of God and Messiah, his crucifixion as a conclusion of a sin offering, and subsequent resurrection. However, Latter-day Saints (LDS) reject the ecumenical creeds and the definition of the Trinity. (In contrast, the second largest Latter Day Saint denomination, the Community of Christ, is Trinitarian and monotheistic.) Mormons hold the view that the New Testament prophesied both the apostasy from the teachings of Christ and his apostles as well as the restoration of all things prior to the second coming of Christ.
Some notable differences with mainstream Christianity include: A belief that Jesus began his atonement in the garden of Gethsemane and continued it to his crucifixion, rather than the orthodox belief that the crucifixion alone was the physical atonement; and an afterlife with three degrees of glory, with hell (often called spirit prison) being a temporary repository for the wicked between death and the resurrection. Additionally, Mormons do not believe in creation "ex nihilo", believing that matter is eternal, and creation involved God organizing existing matter.
Much of the Mormon belief system is geographically oriented around the North and South American continents. Mormons believe that the people of the Book of Mormon lived in the western hemisphere, that Christ appeared in the western hemisphere after his death and resurrection, that the true faith was restored in Upstate New York by Joseph Smith, that the Garden of Eden was located in North America, and that the New Jerusalem would be built in Missouri. For this and other reasons, including a belief by many Mormons in American exceptionalism, Molly Worthen speculates that this may be why Leo Tolstoy described Mormonism as the "quintessential 'American religion'".
Although Mormons do not claim to be part of Judaism, Mormon theology claims to situate Mormonism within the context of Judaism to an extent that goes beyond what most other Christian denominations claim. The faith incorporates many Old Testament ideas into its theology, and the beliefs of Mormons sometimes parallel those of Judaism and certain elements of Jewish culture. In the earliest days of Mormonism, Joseph Smith taught that the Indigenous peoples of the Americas were members of some of the Lost Tribes of Israel. Later, he taught that Mormons were Israelites, and that they may learn of their tribal affiliation within the twelve Israelite tribes. Members of the LDS Church receive patriarchal blessings which declare the recipient's lineage within one of the tribes of Israel. The lineage is determined either through direct bloodline or through adoption; the LDS Church teaches that if one is not a direct descendant of one of the twelve tribes, upon baptism he or she is adopted into one of them. Patriarchal blessings also include personal information which is revealed through a patriarch by the power of the priesthood.
The Mormon affinity for Judaism is expressed by the many references to Judaism in the Mormon liturgy. For example, Smith named the largest Mormon settlement he founded "Nauvoo", which means "to be beautiful" in Hebrew. Brigham Young named a tributary of the Great Salt Lake the "Jordan River". The LDS Church created a writing scheme called the Deseret Alphabet, which was based, in part, on Hebrew. The LDS Church has a Jerusalem Center in Israel, where students focus their study on Near Eastern history, culture, language, and the Bible.
There has been some controversy involving Jewish groups who see the actions of some elements of Mormonism as offensive. In the 1990s, Jewish groups vocally opposed the LDS practice of baptism for the dead on behalf of Jewish victims of the Holocaust and Jews in general. According to LDS Church general authority Monte J. Brough, "Mormons who baptized 380,000 Holocaust victims posthumously were motivated by love and compassion and did not understand their gesture might offend Jews ... they did not realize that what they intended as a 'Christian act of service' was 'misguided and insensitive'". Mormons believe that when the dead are baptized through proxy, they have the option of accepting or rejecting the ordinance.
Since its origins in the 19th century, Mormonism has been compared to Islam, often by detractors of one religion or the other. For instance, Joseph Smith was referred to as "the modern mahomet" by the "New York Herald", shortly after his murder in June 1844. This epithet repeated a comparison that had been made from Smith's earliest career, one that was not intended at the time to be complimentary. Comparison of the Mormon and Muslim prophets still occurs today, sometimes for derogatory or polemical reasons but also for more scholarly (and neutral) purposes. While Mormonism and Islam have many similarities, there are also significant, fundamental differences between the two religions. Mormon–Muslim relations have been historically cordial; recent years have seen increasing dialogue between adherents of the two faiths, and cooperation in charitable endeavors, especially in the Middle and Far East.
Islam and Mormonism both originate in the Abrahamic traditions. Each religion sees its founder (Muhammad for Islam, and Joseph Smith for Mormonism) as being a true prophet of God, called to re-establish the truths of these ancient theological belief systems that have been altered, corrupted, or lost. In addition, both prophets received visits from an angel, leading to additional books of scripture. Both religions share a high emphasis on family life, charitable giving, chastity, abstention from alcohol, and a special reverence for, though not worship of, their founding prophet. Before the 1890 Manifesto against plural marriage, Mormonism and Islam also shared in the belief in and practice of plural marriage, a practice now held in common by Islam and various branches of Mormon fundamentalism.
The religions differ significantly in their views on God. Islam insists upon the complete oneness and uniqueness of God (Allah), while Mormonism asserts that the Godhead is made up of three distinct "personages."
Mormonism sees Jesus Christ as the promised Messiah and the literal Son of God, while Islam insists that the title "Messiah" means that Jesus (or "Isa") was a prophet sent to establish the true faith, not that he was the Son of God or a divine being. Despite opposition from other Christian denominations, Mormonism identifies itself as a Christian religion, the "restoration" of primitive Christianity. Islam does not refer to itself as "Christian", asserting that Jesus and all true followers of Christ's teachings were (and are) Muslims, a term that means "submitters to God". Islam proclaims that its prophet Muhammad was the "seal of the prophets" and that no further prophets would come after him. Mormons, though honoring Joseph Smith as the first prophet in modern times, see him as just one in a long line of prophets, with Jesus Christ being the premier figure of the religion. For these and many other reasons, group membership is generally mutually exclusive: both religious groups would agree that a person cannot be both Mormon and Muslim.
Mormon theology includes three main movements. By far the largest of these is "mainstream Mormonism", defined by the leadership of The Church of Jesus Christ of Latter-day Saints (LDS Church). The two broad movements outside mainstream Mormonism are Mormon fundamentalism, and liberal reformist Mormonism.
Mainstream Mormonism is defined by the leadership of the LDS Church which identifies itself as Christian. Members of the LDS Church consider their top leaders to be prophets and apostles, and are encouraged to accept their positions on matters of theology, while seeking confirmation of them through personal study of the Book of Mormon and the Bible. Personal prayer is encouraged as well. The LDS Church is by far the largest branch of Mormonism. It has continuously existed since the succession crisis of 1844 that split the Latter Day Saint movement after the death of founder Joseph Smith, Jr.
The LDS Church seeks to distance itself from other branches of Mormonism, particularly those that practice polygamy.
The church maintains a degree of orthodoxy by excommunicating or disciplining its members who take positions or engage in practices viewed as apostasy. For example, the LDS Church excommunicates members who practice polygamy or who adopt the beliefs and practices of Mormon fundamentalism.
One way Mormon fundamentalism distinguishes itself from mainstream Mormonism is through the practice of plural marriage. Fundamentalists initially broke from the LDS Church after that doctrine was discontinued around the beginning of the 20th century. Mormon fundamentalism teaches that plural marriage is a requirement for exaltation (the highest degree of salvation), which will allow them to live as gods and goddesses in the afterlife. Mainstream Mormons, by contrast, believe that a single Celestial marriage is necessary for exaltation.
In distinction with the LDS Church, Mormon fundamentalists also often believe in a number of other doctrines taught and practiced by Brigham Young in the 19th century, which the LDS Church has either abandoned, repudiated, or put in abeyance. These include:
Mormon fundamentalists believe that these principles were wrongly abandoned or changed by the LDS Church, in large part due to the desire of its leadership and members to assimilate into mainstream American society and avoid the persecutions and conflict that had characterized the church throughout its early years. Others believe that abandoning these principles was at some point necessary for "a restoration of all things" to be a truly restored Church.
Some LDS Church members have worked towards a more liberal reform of the church. Others have left the LDS Church and still consider themselves to be cultural Mormons. Others have formed new religions (many of them now defunct). For instance the Godbeites broke away from the LDS Church in the late 19th century, on the basis of both political and religious liberalism, and in 1985 the Restoration Church of Jesus Christ broke away from the LDS Church as an LGBT-friendly denomination, which was formally dissolved in 2010.
As the largest denomination within Mormonism, the LDS Church has been the subject of criticism since it was founded by Joseph Smith in 1830.
Perhaps the most controversial, and a key contributing factor for Smith's murder, is the claim that plural marriage (as defenders call it) or polygamy (as critics call it) is biblically authorized. Under heavy pressure — Utah would not be accepted as a state if polygamy was practiced — the church formally and publicly renounced the practice in 1890. Utah's statehood soon followed. However, plural marriage remains a controversial and divisive issue, as despite the official renunciation of 1890, it still has sympathizers, defenders, and semi-secret practitioners within Mormonism, though not within the LDS Church.
More recent criticism has focused on questions of historical revisionism, homophobia, racism, sexist policies, inadequate financial disclosure, and the historical authenticity of the Book of Mormon.
|
https://en.wikipedia.org/wiki?curid=21023
|
Meat Loaf
Michael Lee Aday (born Marvin Lee Aday; September 27, 1947), known professionally as Meat Loaf, is an American singer and actor. He is noted for his powerful, wide-ranging voice and for his theatrical live shows.
Meat Loaf's "Bat Out of Hell" trilogy of albums—"Bat Out of Hell", "", and ""—has sold more than 50 million albums worldwide. More than 40 years after its release, "Bat Out of Hell" still sells an estimated 200,000 copies annually and stayed on the charts for over nine years, making it one of the best selling albums in history.
After the commercial success of "Bat Out of Hell" and "Bat Out of Hell II: Back Into Hell", and earning a Grammy Award for Best Solo Rock Vocal Performance for the song "I'd Do Anything for Love", Meat Loaf nevertheless experienced some difficulty establishing a steady career within the United States. This did not stop him from becoming one of the best-selling music artists of all time, with worldwide sales of more than 80 million records. The key to this success was his retention of iconic status and popularity in Europe, especially the United Kingdom, where he received the 1994 Brit Award for best-selling album and single, appeared in the 1997 film "Spice World", and ranks 23rd for the number of weeks spent on the UK charts. He ranks 96th on VH1's "100 Greatest Artists of Hard Rock".
Sometimes credited as "Meat Loaf Aday", he has also appeared in over 50 movies and television shows, sometimes as himself or as characters resembling his stage persona. His most notable film roles include Eddie in "The Rocky Horror Picture Show" (1975), and Robert "Bob" Paulson in "Fight Club" (1999). His early stage work included dual roles in the original cast of "The Rocky Horror Show", and he was also in the musical "Hair", both on- and off-Broadway.
Marvin Lee Aday was born in Dallas, Texas, the only child of Wilma Artie ("née" Hukel), a school teacher and a member of the Vo-di-o-do Girls gospel quartet, and Orvis Wesley Aday, a former police officer who went into business with his wife and one of their friends as the Griffin Grocery Company, selling a homemade cough remedy. His father was an alcoholic who would go on drinking binges for days at a time; this had started after he was invalided out of the army during World War II, hit with shrapnel from a mortar explosion. Aday and his mother would drive around to all the bars in Dallas, looking for Orvis to take him home. As a result, Aday often stayed with his grandmother, Charlsee Norrod.
Meat Loaf relates a story in his autobiography, "To Hell and Back", about how he, a friend, and his friend's father drove out to Love Field on November 22, 1963, to watch John F. Kennedy land. After watching him leave the airport, they went to Market Hall, which was on Kennedy's parade route. On the way, they heard that Kennedy had been shot, so they headed to Parkland Hospital, where they saw Jackie Kennedy get out of the car and Governor John Connally get pulled out, although they did not see the president taken out.
In 1965, Aday graduated from Thomas Jefferson High School, having already started his acting career via school productions such as "Where's Charley?" and "The Music Man". After attending college at Lubbock Christian College, he transferred to North Texas State University (now the University of North Texas). After he received his inheritance from his mother's death, he rented an apartment in Dallas and isolated himself for three and a half months. Eventually, a friend found him. A short time later, Aday went to the airport and caught the next flight leaving. The plane took him to Los Angeles.
In Los Angeles, Aday formed his first band, "Meat Loaf Soul", after a nickname coined by his football coach because of his weight. He was immediately offered three recording contracts, which he turned down. Meat Loaf Soul's first gig was in Huntington Beach in 1968 at the Cave, opening for Van Morrison's band Them, and Question Mark and the Mysterians. While performing their cover of the Howlin' Wolf song "Smokestack Lightning", the smoke machine they used made too much smoke and the club had to be cleared out. Later, the band was the opening act at Cal State Northridge for Renaissance, Taj Mahal, and Janis Joplin. The band then underwent several changes of lead guitarists, changing the name of the band each time. The new names included Popcorn Blizzard and Floating Circus. As Floating Circus, they opened for the Who, the Fugs, the Stooges, MC5, Grateful Dead, and the Grease Band. Their regional success led them to release a single, "Once Upon a Time", backed with "Hello". Then Meat Loaf joined the Los Angeles production of the musical "Hair". During an interview with New Zealand radio station ZM, Meat Loaf stated that the biggest life struggle he had to overcome was not being taken seriously in the music industry. He compared his treatment to that of a "circus clown".
With the publicity generated from "Hair", Meat Loaf was invited to record with Motown. They suggested he do a duet with Shaun "Stoney" Murphy, who had performed with him in "Hair", to which he agreed. The Motown production team in charge of the album wrote and selected the songs, while Meat Loaf and Stoney came in only to lay down their vocals. The album, titled "Stoney & Meatloaf" (with Meat Loaf misspelled as one word), was completed in the summer of 1971 and released in September of that year. A single released in advance of the album, "What You See Is What You Get", reached number thirty-six on the Best Selling Soul Singles chart (now titled Hot R&B/Hip-Hop Songs) and seventy-one on the "Billboard" Hot 100 chart. To support their album, Meat Loaf and Stoney toured with Jake Wade and the Soul Searchers, opening for Richie Havens, the Who, the Stooges, Bob Seger, Alice Cooper, and Rare Earth. Meat Loaf left soon after Motown replaced his and Stoney's vocals on the one song he liked, "Who Is the Leader of the People?", with new vocals by Edwin Starr. The album was re-released after Meat Loaf's later success, with Stoney's vocals removed and Meat Loaf's version of "Who Is the Leader of the People?" restored; however, the re-release failed.
In December 1972, Meat Loaf was in the original off-Broadway production of "Rainbow" at the Orpheum Theatre in New York.
After the tour, Meat Loaf rejoined the cast of "Hair", this time on Broadway. After hiring an agent, he auditioned for the Public Theater's production of "More Than You Deserve", where he met his future collaborator Jim Steinman. He sang a Stoney and Meatloaf favorite of his, "(I'd Love to Be) As Heavy as Jesus", and subsequently got the part of Rabbit, a maniac who blows up his fellow soldiers so they can "go home". Ron Silver and Fred Gwynne were also in the show. In the summer between the show's workshop production (April 1973) and full production (November 1973 – January 1974), he appeared in a Shakespeare in the Park production of "As You Like It" with Raul Julia and Mary Beth Hurt.
He recorded a single of "More Than You Deserve", with a cover of "Presence of the Lord" as the B-side. He was only able to save three copies of it, because the record company did not allow its release. He re-recorded the song in 1981 in a slightly rougher voice. The original single came out on RSO SO-407, with some promotional copies bearing both songs and some being double-A-side copies with "More Than You Deserve" in mono and stereo.
In late 1973, Meat Loaf was cast in the original L.A. Roxy cast of "The Rocky Horror Show", playing the parts of Eddie and Dr. Everett Scott. Two other cast members from "More Than You Deserve" were also part of this cast: Graham Jarvis (playing the Narrator) and Kim Milford (playing Rocky). The success of the musical led to the filming of "The Rocky Horror Picture Show", in which Meat Loaf played only Eddie, a decision he said made the movie not as good as the musical. Around the same time, Meat Loaf and Steinman started work on "Bat Out of Hell". Meat Loaf convinced Epic Records to shoot videos for four songs: "Bat Out of Hell", "Paradise by the Dashboard Light", "You Took the Words Right Out of My Mouth", and "Two Out of Three Ain't Bad". He then convinced Lou Adler, the producer of "Rocky Horror", to run the "Paradise" video as a trailer to the movie. Meat Loaf's final theatrical show in New York was Gower Champion's "Rockabye Hamlet" (1976), a short-lived Hamlet musical that closed two weeks into its initial run. Meat Loaf later returned occasionally to perform "Hot Patootie – Bless My Soul" at "Rocky Horror" reunions and conventions, and rarely at his own live shows (one such performance was released in the 1996 "Live Around the World" CD set).
During the soundtrack sessions for "Rocky Horror", Meat Loaf recorded two more songs: "Stand by Me" (a Ben E. King cover) and "Clap Your Hands". They remained unreleased until 1984, when they appeared as B-sides to the "Nowhere Fast" single.
In 1976, Meat Loaf recorded lead vocals for Ted Nugent's album "Free-for-All" when regular Nugent lead vocalist Derek St. Holmes temporarily quit the band. Meat Loaf sang lead on five of the album's nine tracks. As on the "Stoney & Meatloaf" album, he was credited as Meatloaf (one word) on the "Free-for-All" liner notes.
Meat Loaf and Steinman started "Bat Out of Hell" in 1972 but did not get serious about it until the end of 1974, when Meat Loaf decided to leave theatre and concentrate exclusively on music. Then the National Lampoon show "Lemmings" opened on Broadway and needed an understudy for John Belushi, a close friend of Meat Loaf's since 1972. It was at the "Lampoon" show that Meat Loaf met Ellen Foley, the co-star who sang "Paradise by the Dashboard Light" and "Bat Out of Hell" with him on the album "Bat Out of Hell".
After the "Lampoon" show ended, Meat Loaf and Steinman spent time seeking a record deal. Their approaches were rejected by each record company, because their songs did not fit any specific recognized music industry style. Finally, they performed the songs for Todd Rundgren, who decided to produce the album, as well as play lead guitar on it (other members of Rundgren's band Utopia also lent their musical talents). They then shopped the record around, but still had no takers until Cleveland International Records decided to take a chance. In October 1977, "Bat Out of Hell" was finally released.
Meat Loaf and Steinman formed the band The Neverland Express to tour in support of "Bat Out of Hell". Their first gig was opening for Cheap Trick in Chicago. He gained national exposure as musical guest on "Saturday Night Live" on March 25, 1978. Guest host Christopher Lee jokingly introduced him by saying, "And now ladies and gentlemen I would like you to meet Loaf. (pauses, looks dumbfounded) I beg your pardon, what? (he listens to the director's aside) Oh! Why...why I'm sorry, yes, of course...ah... Ladies and gentlemen, Meat Loaf!"
"Bat Out of Hell" has sold an estimated 43 million copies globally (15 million of those in the United States), making it one of the highest selling albums of all time. In the United Kingdom alone, its 2.1 million sales put it in 38th place. Despite peaking at No. 9 and spending only two weeks in the top ten in 1981, it has now clocked up 485 weeks on the UK Albums Chart (May 2015), a figure bettered only by "Rumours" by Fleetwood Mac—487 weeks. In Australia, it knocked the Bee Gees off the number No. 1 spot and went on to become the biggest-selling Australian album of all time. "Bat Out of Hell" is also one of only two albums that has never exited the Top 200 in the UK charts; this makes it the longest stay in any music chart in the world, although the "published" chart contains just 75 positions.
Steinman started to work on "Bad for Good", the album that was supposed to be the follow-up to 1977's "Bat Out of Hell", in 1979. During that time, a combination of touring, drugs and exhaustion had caused Meat Loaf to lose his voice. Without a singer, and pressured by the record company, Steinman decided that he should sing on "Bad for Good" himself and write a new album for Meat Loaf; the result was "Dead Ringer", released in 1981, after the release of Steinman's "Bad for Good".
After playing the role of Travis Redfish in the movie "Roadie", Meat Loaf's singing voice returned, and he started work on his new album in 1980. Steinman had written five new songs which, together with the track "More Than You Deserve" (sung by Meat Loaf in the stage musical of the same name) and a reworked monologue, formed the album "Dead Ringer", produced by Meat Loaf and Stephan Galfas, with backing tracks produced by Todd Rundgren, Jimmy Iovine, and Steinman. (In 1976, Meat Loaf had appeared on the track "Keeper Keep Us", from the Intergalactic Touring Band's self-titled album, produced by Galfas.) The song "Dead Ringer for Love", with Cher providing the lead female vocals, was the pinnacle of the album, and launched Meat Loaf to even greater success after it reached No. 5 in the United Kingdom and stayed in the charts for 19 weeks.
A comedy/documentary movie was filmed to accompany the release of "Dead Ringer", written and produced by Meat Loaf's managers David Sonenberg and Al Dellentash. It featured Meat Loaf playing two roles: himself, and a Meat Loaf fan, 'Marvin'. Sonenberg persuaded CBS to advance money for the making of the movie, which was shown at the Toronto International Film Festival and won some favorable reviews.
The album reached No. 1 in the United Kingdom, and three singles were released from the album: "Dead Ringer for Love" (with Cher), "I'm Gonna Love Her for Both of Us", and "Read 'Em and Weep".
Following a dispute with his former songwriter Jim Steinman, Meat Loaf was contractually obliged to release a new album. Pressed for time, and with no resolution to the dispute in sight (eventually, Steinman sued Meat Loaf, who countersued), he was forced to find songwriters wherever he could. The resulting album was "Midnight at the Lost and Found".
According to Meat Loaf, Steinman had offered him the songs "Total Eclipse of the Heart" and "Making Love Out of Nothing at All" for this album, but Meat Loaf's record company refused to pay for Steinman. This was hard luck for Meat Loaf: Bonnie Tyler's version of "Total Eclipse of the Heart" and Air Supply's version of "Making Love Out of Nothing at All" topped the charts together, holding No. 1 and No. 2 for a period during 1983.
Meat Loaf is credited as a co-writer on numerous tracks on the album, including the title track, "Midnight at the Lost and Found".
The title track still regularly forms part of Meat Loaf's concerts, and was one of the few 1980s songs to feature on the 1998 hit album "The Very Best of Meat Loaf". This was Meat Loaf's final album to be released through Epic, and his last album released via Sony Music until 2011's "Hell in a Handbasket".
On December 5, 1981, Meat Loaf and the Neverland Express were the musical guests on "Saturday Night Live", where he and former fellow "Rocky Horror Picture Show" actor Tim Curry performed a skit depicting a one-stop Rocky Horror shop. Later, Curry performed "The Zucchini Song" and Meat Loaf and the Neverland Express performed "Bat Out of Hell" and "Promised Land". In 1983, he released the self-written "Midnight at the Lost and Found".
In 1984, Meat Loaf went to England to record the album "Bad Attitude", released that year. It features two songs by Steinman, both previously recorded, and was a minor success, producing a few charting singles, the most successful being "Modern Girl". The American release, on RCA Records in April 1985, features a slightly different track list, as well as alternate mixes of some songs. The title track is a duet with the Who's lead singer Roger Daltrey.
"Modern Girl" was the first single taken from the album but was only a moderate success on the European Charts.
In 1985, Meat Loaf took part in some comedy sketches in the UK with Hugh Laurie, and also tried stand-up comedy, appearing several times in Connecticut. In 1986, he and songwriter John Parr started recording a new album, "Blind Before I Stop".
"Blind Before I Stop" was released in 1986. It features production, mixing, and general influence by Frank Farian. Meat Loaf gave songwriting another shot with this album and wrote three of the songs on the album. Released as a single (in the United Kingdom) was "Rock 'n' Roll Mercenaries", which was a duet with rock singer John Parr. Another single released in the United Kingdom was "Special Girl".
According to Meat Loaf's 1998 autobiography, the album sold poorly because of its production. Meat Loaf would have preferred to cancel the project and wait to work with more Steinman material. However, the album gained a cult following over the years, with the songs "Execution Day" and "Standing on the Outside" as standout tracks on the record. "Standing on the Outside" was also featured during the third season of the 1980s television series "Miami Vice"; it was used several times during the episode titled "Forgive Us Our Debts" (first aired December 12, 1986).
In the USSR, this was the first Meat Loaf album officially permitted for release, as the Iron Curtain began to fall.
The song "Masculine" was the only song from the record that was a live show mainstay from 1987 to 1992. He then omitted that song in favor of "Life Is a Lemon and I Want My Money Back", with the success of "Bat Out of Hell II: Back into Hell".
Meat Loaf performed "Thrashin" for the soundtrack of the 1986 skateboarding film "Thrashin'" (directed by David Winters and starring Josh Brolin).
Following the success of Meat Loaf's touring in the 1980s, he and Steinman began work during the Christmas of 1990 on the sequel to "Bat Out of Hell". After two years, "Bat Out of Hell II: Back into Hell" was finished. The artist's then manager, Tommy Manzi, later told HitQuarters that music industry insiders were wholly unenthusiastic about the idea of a comeback and considered the project "a joke". The immediate success of "Bat Out of Hell II" quickly proved the doubters wrong, with the album going on to sell over 15 million copies, and the single "I'd Do Anything for Love (But I Won't Do That)" reaching number one in 28 countries and staying at No. 1 in the United Kingdom charts for seven consecutive weeks. Meat Loaf won the 1994 Grammy Award for Best Rock Vocal Performance, Solo for the song. The single features a female vocalist credited only as "Mrs. Loud", later identified as Lorraine Crosby, a performer from England. Meat Loaf promoted the song with American vocalist Patti Russo, who performed lead female vocals on tour with him. In Germany, too, Meat Loaf was commercially successful following the release of "Bat Out of Hell II".
Also in 1994, he sang the U.S. national anthem, "The Star-Spangled Banner", at the Major League Baseball All-Star Game. He released the single "Rock and Roll Dreams Come Through", which reached No. 13 in the United States.
In 1995, Meat Loaf released his seventh studio album, "Welcome to the Neighborhood". The album went platinum in the United States and the United Kingdom and produced three top-40 singles, including "I'd Lie for You (And That's the Truth)" (which reached No. 13 in the United States and No. 2 in the United Kingdom) and "Not a Dry Eye in the House" (which reached No. 7 in the UK charts). "I'd Lie for You (And That's the Truth)" was a duet with Patti Russo, who had been touring with Meat Loaf and singing on his albums since 1993.
Of the twelve songs on the album, two were written by Steinman, and both are cover versions: "Original Sin" first appeared on Pandora's Box's album of the same name, while "Left in the Dark" had appeared on Steinman's own "Bad for Good" as well as on Barbra Streisand's 1984 album "Emotion". The video for "I'd Lie for You (And That's the Truth)" had a bigger budget than any of his previous videos. Both "I'd Lie for You (And That's the Truth)" and "Not a Dry Eye in the House" were written by Diane Warren.
In 1998, Meat Loaf released "The Very Best of Meat Loaf". Although it did not reach the top ten in the United Kingdom, it went platinum there in December of that year, having already gone platinum around the rest of the world shortly after its release. The album featured all of Meat Loaf's best-known songs, a few from his less popular albums of the 1980s, and three new songs. Two of the new songs had lyrics by Steinman with music composed by Andrew Lloyd Webber. The single from the album was "Is Nothing Sacred", written by Steinman with lyrics by Don Black; the single version is a duet with Patti Russo, whereas the album version is a solo song by Meat Loaf. The album did not feature any songs from his 1986 album "Blind Before I Stop".
In 2003, Meat Loaf released his album "Couldn't Have Said It Better". For only the third time in his career, Meat Loaf released an album without any songs written by Steinman (not counting live bonus tracks on special edition releases). Although Meat Loaf claimed that "Couldn't Have Said It Better" was "the most perfect album [he] did since 'Bat Out of Hell'", it was only a minor commercial success worldwide, reaching No. 4 in the UK charts and accompanied by a sellout world tour to promote the album and some of Meat Loaf's best-selling singles. One such performance was at Sydney's 2003 NRL grand final. The album had many writers, including Diane Warren, who had written commercially successful singles for Meat Loaf in the past, and James Michael, who had never written for him before; both were later asked to contribute to his 2006 album "Bat Out of Hell III: The Monster Is Loose". Only Michael's songs were released as singles from the album. The album featured duets with Patti Russo and with Meat Loaf's daughter Pearl Aday.
From February 20 to 22, 2004, during an Australian tour, Meat Loaf performed a series of shows with the Melbourne Symphony Orchestra. The performances included the Australian Boys' Choir singing back-up on "Testify", a track from "Couldn't Have Said It Better". The show was released as a DVD and a CD called "Meat Loaf and The Neverland Express featuring Patti Russo Live with the Melbourne Symphony Orchestra". The CD contained a few edited songs from the concert.
On November 17, 2003, during a performance at London's Wembley Arena on his "Couldn't Have Said It Better" tour, he collapsed from what was later diagnosed as Wolff-Parkinson-White syndrome. The following week, he underwent a surgical procedure intended to correct the problem. As a result, Meat Loaf's insurance agency did not allow him to perform for longer than one hour and 45 minutes. He nevertheless sold out over 160 concerts on his 2005 tour, "Hair of the Dog".
As well as singing his best-known songs, Meat Loaf performed a cover version of the hit single "Black Betty". During this tour he also sang "Only When I Feel", a song meant to appear on his then-upcoming album "Bat Out of Hell III"; it subsequently evolved into "If It Ain't Broke (Break It)".
Meat Loaf and Steinman had begun to work on the third installment of "Bat Out of Hell" when Steinman suffered some health setbacks, including a heart attack. According to Meat Loaf, Steinman was too ill to work on such an intense project while Steinman's manager said health was not an issue. Steinman had registered the phrase "Bat Out of Hell" as a trademark in 1995. In May 2006, Meat Loaf sued Steinman and his manager in federal District Court in Los Angeles, seeking $50 million and an injunction against Steinman's use of the phrase. Steinman and his representatives attempted to block the album's release. An agreement was reached in July 2006. According to Virgin, "the two came to an amicable agreement that ensured that Jim Steinman's music would be a continuing part of the 'Bat Out of Hell' legacy." Denying reports in the press over the years of a rift between Meat Loaf and Steinman, Meat Loaf told Dan Rather that he and Steinman never stopped talking, and that the lawsuits reported in the press were between lawyers and managers, and not between Meat Loaf and Steinman.
The album was released on October 31, 2006, and was produced by Desmond Child. The first single from the album, "It's All Coming Back to Me Now" (featuring Marion Raven), was released on October 16, 2006. It entered the UK singles chart at No. 6, giving Meat Loaf his highest UK chart position in nearly 11 years. The album debuted at No. 8 on the Billboard 200 and sold 81,000 copies in its opening week, but after that it did not sell well in the United States and yielded no hit singles, although it was certified gold. The album also featured duets with Patti Russo and Jennifer Hudson.
In the weeks following the release of "Bat III", Meat Loaf and the NLE (the Neverland Express) did a brief tour of America and Europe, known as the Bases Loaded Tour. In 2007, a bigger worldwide tour began, the Seize the Night Tour, with Marion Raven serving as a supporting act throughout the European and American legs. Portions of the tour in February 2007 were featured in the documentary "Meat Loaf: In Search of Paradise", directed by Bruce David Klein. The film was an official selection of the Montreal World Film Festival in 2007; it opened in theaters in March 2008 and was released on DVD in May 2008.
During a performance at the Metro Radio Arena in Newcastle upon Tyne, England, on October 31, 2007, at the opening of "Paradise by the Dashboard Light", he suggested that the crowd of thousands should enjoy the performance as it was the last of his career. He attempted to sing the first line of the song, but instead said, "Ladies and gentlemen, I love you, thank you for coming, but I can no longer continue." Removing the jacket he was wearing, he thanked the audience for 30 years, said "goodbye forever" and left the stage. His tour promoter, Andrew Miller, denied that this was the end for Meat Loaf and said he would continue touring after suitable rest. The next two gigs in the tour, at the NEC and the Manchester Evening News Arena, were cancelled because of "acute laryngitis" and rescheduled for late November; the concert scheduled for November 6, 2007 at London's Wembley Arena was also cancelled. Meat Loaf then cancelled his entire 2007 European tour after being diagnosed with a cyst on his vocal cords. In a statement, he said, "It really breaks my heart not to be able to perform these shows," adding, "I will be back."
On June 27, 2008, Meat Loaf returned to the stage in Plymouth, England, for the first show of The Casa de Carne Tour alongside his longtime duet partner Patti Russo, who debuted one of her own original songs during his show. The tour continued through July and August with twenty dates throughout England, Ireland, Germany, Portugal, the Netherlands, Norway, Sweden, Finland and Denmark. Six U.S. showdates were also added for October and December 2008.
In May 2009, Meat Loaf began work on the album "Hang Cool Teddy Bear" in the studio with Rob Cavallo, producer of Green Day's "American Idiot", working with such writers as Justin Hawkins, Rick Brantley, Ollie Wride, Tommy Henriksen and Jon Bon Jovi. Though not much was revealed officially at first, Meat Loaf gave away some information through videos he posted on Twitter and YouTube. The album is based on the story of a fictional soldier. During his concert of March 19, 2011, held outside Vancouver, B.C., Canada, Meat Loaf explained that he had wanted an insert included with the album to explain its premise, but said there were too many "bleeping" record label politics and it did not get done. He went on to tell the audience that the story was of a wounded soldier whose life flashes forward before his eyes, with the songs telling the story of that life.
The album is based on a short story by L.A.-based screenwriter and director Kilian Kerwin, a long-time friend of the singer. Hugh Laurie and Jack Black both perform on the album: Laurie plays piano on the song "If I Can't Have You", while Black sings a duet with Meat Loaf on "Like a Rose". Patti Russo and Kara DioGuardi also duet on the album, and Queen's Brian May features on guitar along with Steve Vai. The album received positive reviews from critics and fans alike. The first single, "Los Angeloser", was released for download on April 5, with the album charting at number 4 in the official UK album chart on April 25, 2010.
The Hang Cool Tour followed in the United States, United Kingdom and Canada, drawing rave reviews from fans and critics. Patti Russo accompanied him on the tour, which continued through the summer of 2011.
In May 2011, Meat Loaf confirmed in a video on his YouTube account that he was recording a new album called "Hell in a Handbasket". According to Meat Loaf, the album was recorded and produced by Paul Crook, with Doug McKean doing the mix with input from Rob Cavallo. The album features songs called "All of Me", "Blue Sky", "The Giving Tree" and "Mad, Mad World", and a duet with Patti Russo called "Our Love and Our Souls". The album was due at the record company by July 6; it was released in October 2011 in Australia and New Zealand, and in February 2012 in the rest of the world. Meat Loaf said, "It's really the first record I've ever put out about how I feel about life and how I feel about what's going on at the moment."
The "Mad, Mad World" tour in connection with the album "Hell in a Handbasket" was launched in late June 2012. For the tour Meat Loaf has said, "People who come to Meat Loaf shows know what to expect. They know they're going to get full-on energy with the best rock 'n' roll band in the world. That's not an opinion. That's the truth."
At the 2011 Australian Football League Grand Final, the pre-match entertainment was headlined by a 12-minute medley performed by Meat Loaf. The performance was panned as the worst in the 34-year history of AFL Grand Final pre-game entertainment in a multitude of online reviews by football fans and Australian sport commentators. Meat Loaf responded by calling online critics "butt-smellers" and the AFL "jerks", saying, "I will go out of my way to tell any artist, 'Do not play for them.'"
In response to this criticism, the AFL changed the format of the entertainment, effective from the 2012 Grand Final, to have a small pre-match show, a larger half-time show, and, for the first time, a free concert open to the public at the Melbourne Cricket Ground after the match.
Meat Loaf said in 2011 that he planned to release a Christmas album called "Hot Holidays". As of 2019, the album has not yet been released.
In media interviews to promote his 2013 "Last at Bat" tour, Meat Loaf said he would work with Steinman again on an upcoming album called "Brave and Crazy". The album was eventually released as "Braver Than We Are" on September 9, 2016 (Europe) and September 16, 2016 (North America), featuring 10 tracks. Meat Loaf said in several interviews that he would record reworked versions of Steinman's songs "Braver Than We Are", "Speaking in Tongues", "Who Needs the Young", and "More" (previously recorded by the Sisters of Mercy) for the album. Additionally, the song "Prize Fight Lover", originally issued as a download-only bonus track for "Hang Cool Teddy Bear", was re-recorded for the album.
In January 2020, in an interview for "The Mirror", Meat Loaf announced, "I'm not old. I've got songs for another record and I'm reading a script." In a February 2020 Facebook post, Meat Loaf announced his intention to record a new album containing "4 or 5 new tracks", including Steinman's "What Part of My Body Hurts the Most" (a song long requested by fans, but previously under contract restrictions for the Bat Out of Hell musical), along with the original 1975 demo recordings made for the "Bat Out of Hell" album.
In 1984, Meat Loaf legally changed his first name from Marvin to Michael.
Meat Loaf identifies as a Christian.
Meat Loaf is a baseball fan and supporter of the New York Yankees. He is a fantasy baseball player and participates in multiple leagues every season.
He is also a supporter of the English football team Hartlepool United and, in 2003, the BBC reported he was seeking a residence in the nearby area. He currently resides just outside Calabasas, California, near Saddle Peak and Calabasas Peak. In June 2008, he took part in a football penalty shootout competition on behalf of two cancer charities in Newcastle upon Tyne in the United Kingdom. He auctioned shots to the 100 highest bidders and then took his place between the goal posts. He also participates in celebrity golf tournaments.
Meat Loaf has said that he has social anxiety: "I never meet anybody much in a social situation because when I go into a social situation, I have no idea what to do." He revealed that he does not "even go anywhere" and feels he leads a "boring life", saying that he "completely freaked" when having to attend a party and that he was "so nervous, so scared". He also said he met fellow musicians chiefly in work-related situations, as he was working a lot.
On June 25, 2019, "The New York Times Magazine" listed Meat Loaf among hundreds of artists whose material was reportedly destroyed in the 2008 Universal fire.
In December 1978, he went to Woodstock, New York, to work with Steinman. It was at the Bearsville studio that Meat Loaf met his future wife, Leslie G. Edmonds; they were married within a month. Leslie had a daughter, Pearl, from a previous marriage (Pearl later married Scott Ian, the rhythm guitarist for the thrash metal band Anthrax).
Meat Loaf and his family moved to Stamford, Connecticut, in 1979. In 1981, Leslie gave birth to Amanda Aday, later a television actress. For a brief time after Amanda's birth, they lived in nearby Westport. According to Meat Loaf, Pearl, then in the fifth grade, came home crying "because she had the wrong type of jeans and I said, 'That's it. We're gone.'" The family then moved to Redding, Connecticut, "which is much more of a blue-collar, working-class kind of town, and it really didn't make any difference what kind of jeans you were wearing. I really liked it there." Meat Loaf coached children's baseball or softball in each of the Connecticut towns where he lived. In 1998, Meat Loaf relocated to California. He and Leslie divorced in 2001, and he married Deborah Gillespie in 2007. At the start of his 2012 tour in Austin on June 22, Meat Loaf announced that he had been a resident of Austin, Texas, for one month.
Meat Loaf was a vegetarian for ten years and declared in late 2019 that he would "go vegan for Veganuary" (January 2020).
In October 2006, his private jet had to make an emergency landing at London Stansted Airport after the plane's forward landing gear failed.
In 2011, Meat Loaf fainted on stage while performing in Pittsburgh. He collapsed again on stage in Edmonton on June 16, 2016, due to severe dehydration, after having cancelled two other shows due to illness. The playback containing his pre-recorded voice-over vocal track continued while he lay unconscious on the stage. In 2019, at the Texas Frightmare horror convention in Texas, Meat Loaf fell off an interview stage and broke his collarbone.
Meat Loaf is not officially registered with any political party. He performed at the 1997 pre-inauguration ball for re-elected Democratic President Bill Clinton and attended the 2001 inauguration of Republican President George W. Bush. In 2008, Meat Loaf donated to the presidential campaigns of Republican Party candidates Rick Santorum and John McCain, the latter of whom became the party's nominee in that year's election.
On October 25, 2012, Meat Loaf endorsed Mitt Romney for President of the United States, citing poor relations with Russia as a major reason he had been "arguing for Mitt Romney for a year". Meat Loaf explained, "I have never been in any political agenda in my life, but I think that in 2012 this is the most important election in the history of the United States." He cited "storm clouds" over the United States, and "thunder storms over Europe. There are hail storms – and I mean major hail storms! – in the Middle East. There are storms brewing through China, through Asia, through everywhere." The same day, he performed "America the Beautiful" standing next to Romney.
|
https://en.wikipedia.org/wiki?curid=21026
|
James Cameron
James Francis Cameron (born August 16, 1954) is a Canadian filmmaker and environmentalist, best known for making science fiction and epic films for the Hollywood mainstream. Cameron first gained recognition for directing "The Terminator" (1984). He found further critical and commercial success with "Aliens" (1986), "The Abyss" (1989), "Terminator 2: Judgment Day" (1991), and the comedy thriller "True Lies" (1994). His other productions include "Titanic" (1997) and "Avatar" (2009), the former earning him Academy Awards for Best Picture, Best Director and Best Film Editing. "Avatar", filmed in 3D technology, garnered him nominations in the same categories.
He also co-founded Lightstorm Entertainment, Digital Domain and Earthship Productions. In addition to his filmmaking, he is a National Geographic sea explorer and has produced a number of documentaries on the subject, including "Ghosts of the Abyss" (2003) and "Aliens of the Deep" (2005). Cameron has also contributed to underwater filming and remote-vehicle technologies, and helped create the digital 3D Fusion Camera System. In 2012, Cameron became the first person to perform a solo descent to the bottom of the Mariana Trench, the deepest part of the Earth's ocean, in the "Deepsea Challenger" submersible.
In total, Cameron's films have grossed approximately US$2 billion in North America and US$6 billion worldwide. Cameron's "Avatar" and "Titanic" are the second and third highest-grossing films of all time, earning $2.78 billion and $2.19 billion, respectively. Cameron holds the achievement of having directed the first two of the five films in history to gross over $2 billion worldwide. In 2010, "Time" magazine named Cameron one of the 100 most influential people in the world.
James Cameron was born on August 16, 1954, in Kapuskasing, Ontario, to Philip Cameron, an electrical engineer, and Shirley (née Lowe), an artist and nurse. His paternal great-great-great-grandfather emigrated from Balquhidder, Scotland, in 1825. Cameron is the eldest of five siblings. As a child, he described the Lord's Prayer as a "tribal chant". He attended Stamford Collegiate School in Niagara Falls. At age 17, Cameron and his family moved from Chippawa to Brea, California, where he attended Sonora High School and then Brea Olinda High School. Classmates recalled that he was not a sportsman but instead enjoyed building things that "either went up into the air or into the deep".
After high school, Cameron enrolled at Fullerton College, a community college, in 1973 to study physics. He switched subjects to English, but left the college at the end of 1974. He worked odd jobs, including as a truck driver and a janitor, but wrote in his free time. During this period, he learnt about special effects by reading other students' work on "optical printing, or front screen projection, or dye transfers, anything that related to film technology" at the library. After the excitement of seeing "Star Wars" in 1977, Cameron quit his job as a truck driver to enter the film industry.
Cameron's directing career began in 1978 when, after borrowing money from a consortium of dentists, he directed, wrote and produced his first short film, "Xenogenesis" (1978), with a friend. Learning as they went, Cameron said he felt like a doctor doing his first surgical procedure. He then served as a production assistant on "Rock and Roll High School" (1979). While educating himself about filmmaking techniques, Cameron took a job as a miniature-model maker at Roger Corman Studios. He was soon employed as an art director on the science-fiction film "Battle Beyond the Stars" (1980). He carried out the special effects for John Carpenter's "Escape from New York" (1981), served as production designer for "Galaxy of Terror" (1981), and consulted on the design for "Android" (1982).
Cameron was hired as the special effects director for "Piranha II: The Spawning" (1982), the sequel to "Piranha" (1978). The original director, Miller Drake, left the project due to creative differences with producer Ovidio Assonitis. Shot in Rome, Italy, and on Grand Cayman Island, the film gave Cameron his first opportunity to direct a major film, although he later said that it did not feel like his first movie because of power struggles with Assonitis. Disillusioned in Rome and suffering from a fever, Cameron had a nightmare about an invincible robot hit-man sent from the future to assassinate him, which later inspired "The Terminator" (1984). Upon release of "Piranha II: The Spawning", critics were not impressed; author Tim Healey called it "a marvellously bad movie which splices cliches from every conceivable source".
Inspired by John Carpenter's "Halloween" (1978) and other science fiction work, Cameron in 1982 wrote the script for "The Terminator" (1984), a thriller about a cyborg sent from the future to carry out a lethal mission. Cameron wanted to sell the script so that he could direct the movie. While some film studios expressed interest in the project, many executives were unwilling to let a new and unfamiliar director make the movie. Gale Anne Hurd, a colleague and founder of Pacific Western Productions, to whom Cameron was married from 1984 to 1989, agreed to buy Cameron's script for one dollar, on the condition that Cameron direct the film. Eventually, he convinced the president of Hemdale Pictures to make the film, with Cameron as director and Hurd as producer. Lance Henriksen, who had starred in "Piranha II: The Spawning", was considered for the lead role, but Cameron decided that Arnold Schwarzenegger was more suitable as the cyborg villain because of his bodybuilder appearance; Henriksen was given a smaller role instead. Michael Biehn and Cameron's future wife, Linda Hamilton, also joined the cast. "The Terminator" was a box office success, exceeding expectations set by Orion Pictures. The film proved popular with audiences and earned over $78 million worldwide. George Perry of the BBC praised Cameron's direction, writing, "Cameron laces the action with ironic jokes, but never lets up on hinting that the terror may strike at any moment". In 2008, the film was selected for preservation in the United States National Film Registry, being deemed "culturally, historically, or aesthetically significant".
In 1984, Cameron co-wrote the screenplay to "Rambo: First Blood Part II" with Sylvester Stallone. Cameron then moved on to his next directorial feature, a sequel to "Alien" (1979), Ridley Scott's science fiction horror film. Titling the sequel "Aliens" (1986), Cameron recast Sigourney Weaver as Ellen Ripley, who had first appeared in "Alien". "Aliens" follows Ripley as she helps a group of marines fight off extraterrestrials. Despite conflicts with cast and crew during production, and having to replace one of the lead actors (James Remar was replaced by Michael Biehn), "Aliens" was a box office success, generating over $130 million worldwide. The film was nominated for seven Academy Awards in 1987, including Best Actress, Best Art Direction, Best Film Editing, Best Original Score and Best Sound, and won the awards for Best Sound Editing and Best Visual Effects. In addition, the film, featuring Weaver, made the cover of "Time" magazine in July 1986.
After "Aliens", Cameron and Gale Anne Hurd decided to make "The Abyss", a story of oil-rig workers who discover strange intelligent life in the ocean. Based on an idea which Cameron had conceived of during high school, the film was initially budgeted at $41 million, although it ran considerably over this amount. It starred Ed Harris, Mary Elizabeth Mastrantonio and Michael Biehn. The production process began in the Cayman Islands and then in South Carolina, inside the building of an unfinished nuclear power plant with two huge tanks. The cast and crew recall Cameron's tough stance and the filming of underwater scenes which were mentally and physically exhausting. Upon the film's release, "The Abyss" was praised for its special effects, and earned $90 million at the worldwide box office. "The Abyss" received four Academy Award nominations and won Best Visual Effects.
In 1990, Cameron co-founded the firm Lightstorm Entertainment with partner Lawrence Kasanoff. In 1991, Cameron served as executive producer for "Point Break" (1991), directed by Kathryn Bigelow, to whom he was married from 1989 to 1991. After the success of "The Terminator", there were discussions about a sequel, and in the late 1980s Mario Kassar of Carolco Pictures secured the rights, allowing Cameron to begin production of "Terminator 2: Judgment Day" (1991). Written by Cameron and William Wisher Jr., the film sees Schwarzenegger and Linda Hamilton reprise their roles. The story follows on from "The Terminator", depicting a new villain, the T-1000, which possesses shape-shifting abilities and hunts Sarah Connor's son, John (Edward Furlong). Cameron cast Robert Patrick as the T-1000 because of his lean and thin appearance, a sharp contrast to Schwarzenegger. Cameron explained, "I wanted someone who was extremely fast and agile. If the T-800 is a human Panzer tank, then the T-1000 is a Porsche". Like its predecessor, "Terminator 2" was one of the most expensive films ever produced, costing at least $94 million. Despite the challenging use of computer-generated imagery, the film was completed on time and released on July 3, 1991. "Terminator 2" broke box office records (including the opening-weekend record for an R-rated film), earning over $200 million in North America and becoming the first film to earn over $300 million worldwide. It won four Academy Awards: Best Makeup, Best Sound Mixing, Best Sound Editing, and Best Visual Effects. It also received nominations for Best Cinematography and Best Film Editing, but lost both to the political thriller "JFK".
In subsequent years, Cameron planned to make a third "Terminator" film, but those plans never materialized; the rights to the "Terminator" franchise were eventually purchased by Kassar from a bankruptcy sale of Carolco's assets. Cameron moved on to other projects, and in 1993 he co-founded Digital Domain, a visual effects production company. In 1994, Cameron and Schwarzenegger reunited for their third collaboration, "True Lies" (1994), a remake of the 1991 French comedy "La Totale!" The story depicts an American secret agent who leads a double life as a married man whose wife believes he is a computer salesman. The film co-stars Jamie Lee Curtis, Eliza Dushku and Tom Arnold. Cameron's Lightstorm Entertainment signed a deal with 20th Century Fox for the production of "True Lies". Budgeted at a minimum of $100 million, the film earned $146 million in North America and $232 million worldwide. It was nominated for an Academy Award for Best Visual Effects, and Curtis won a Golden Globe Award for Best Actress. In 1995, Cameron co-produced "Strange Days" (1995), a science fiction thriller directed by Kathryn Bigelow and co-written by Jay Cocks; it was critically and financially unsuccessful. In 1996, Cameron reunited with the cast of "Terminator 2" to film "T2 3-D: Battle Across Time", an attraction at Universal Studios Florida and other parks around the world.
His next major project was "Titanic" (1997), an epic film about the RMS "Titanic", which sank in 1912 after striking an iceberg. With a production budget of $200 million, "Titanic" is one of the most expensive films ever made. The production ran over budget and past its filming schedule, which made headlines before the film's release. Starting in 1995, Cameron took several dives to the bottom of the Atlantic Ocean to capture footage of the wreck, which would later be used in the film. A replica of the ship was built in Rosarito Beach, and principal photography began in September 1996. His completed screenplay depicts two star-crossed lovers, portrayed by Leonardo DiCaprio and Kate Winslet, from different social classes who fall in love amid the backdrop of the tragedy, a radical departure from his previous science fiction work. The supporting cast includes Billy Zane, Kathy Bates, Frances Fisher, Gloria Stuart, Bernard Hill, Jonathan Hyde, Victor Garber, Danny Nucci, David Warner and Bill Paxton. After months of delay, "Titanic" premiered on December 19, 1997. The film received strong critical acclaim and became the highest-grossing film of all time worldwide in 1998, holding this position for 12 years until Cameron's "Avatar" (2009) beat the record in 2010. The costumes and sets were very realistic, and "The Washington Post" considered the CGI graphics spectacular. "Titanic" received a record-tying fourteen nominations (matching "All About Eve" (1950)) at the 1998 Academy Awards and won eleven of them (tying the record for most wins with "Ben-Hur" (1959) and, later, "The Lord of the Rings: The Return of the King" (2003)): Best Picture, Best Director, Best Art Direction, Best Cinematography, Best Visual Effects, Best Film Editing, Best Costume Design, Best Sound Mixing, Best Sound Editing, Best Original Score, and Best Original Song. Upon receiving Best Picture, Cameron and producer Jon Landau asked for a moment of silence to remember the 1,500 people who died when the ship sank. Film critic Roger Ebert praised Cameron's storytelling, writing, "It is flawlessly crafted, intelligently constructed, strongly acted, and spellbinding". Kevin Sandler and Gaylyn Studlar wrote in 1999 that the mix of romance, historical nostalgia and James Horner's music contributed to the film's cultural phenomenon. In 2017, "Titanic" became Cameron's second film to be selected for preservation in the United States National Film Registry.
After the huge success of "Titanic", Cameron kept a low profile. In 1998, he and his brother, John, formed Earthship Productions, a company for documentaries about the deep sea, one of Cameron's interests. He had planned to make a film about Spider-Man, a project developed by Menahem Golan of Cannon Films, and Columbia hired David Koepp to adapt Cameron's ideas into a screenplay, but due to various disagreements Cameron abandoned the project; in 2002, "Spider-Man" was released with the screenplay credited solely to Koepp. In 2000, Cameron ventured into television and co-created "Dark Angel" with Charles H. Eglee, a television series influenced by cyberpunk, biopunk, contemporary superheroes and third-wave feminism. "Dark Angel" starred Jessica Alba as Max Guevara, a genetically enhanced super-soldier created by a secretive organization. While the first season was moderately successful, the second season did less well, leading to the show's cancellation.
In 2002, Cameron served as producer on the film "Solaris", a science fiction drama directed by Steven Soderbergh, which received mixed reviews and did poorly at the box office. Keen to make documentaries, Cameron directed "Expedition: Bismarck" (2002), about the German battleship "Bismarck". In 2003, he directed "Ghosts of the Abyss", a documentary about the RMS "Titanic", which was released by Walt Disney Pictures and Walden Media and designed for 3D theaters. Cameron also told "The Guardian" of his intention to film everything in 3D. In 2005, he co-directed "Aliens of the Deep", a documentary about the various forms of life in the ocean, and starred with Tony Robinson in "Titanic Adventure", another documentary about the "Titanic" shipwreck. In 2006, Cameron co-created and narrated "The Exodus Decoded", a documentary exploring the Biblical account of the Exodus. In 2007, Cameron and fellow director Simcha Jacobovici produced "The Lost Tomb of Jesus". Broadcast on Discovery Channel on March 4, 2007, the documentary was controversial for arguing that the Talpiot Tomb was the burial place of Jesus of Nazareth.
In the mid-2000s, Cameron returned to directing and producing his first mainstream feature since 1997's "Titanic". As early as June 2005 he had mentioned two projects, "Avatar" (2009) and "Alita: Battle Angel" (2019) (the latter of which he produced), both to be shot in 3D technology. He originally wanted to make "Alita: Battle Angel" first, followed by "Avatar", but switched the order in February 2006. Although Cameron had written an 80-page treatment for "Avatar" in 1995, he stated that he wanted the necessary technology to improve before starting production. "Avatar", with its story line set in the mid-22nd century, had an estimated budget in excess of $300 million. The cast includes Sam Worthington, Zoe Saldana, Stephen Lang, Michelle Rodriguez and Sigourney Weaver. Much of the film was composed with computer-generated animation, using an advanced version of the performance-capture technique previously used by director Robert Zemeckis in "The Polar Express". Cameron intended "Avatar" to be 3D-only but decided to adapt it for conventional viewing as well.
Intended for release in May 2009, "Avatar" eventually premiered on December 18, 2009; the delay allowed more time for post-production and gave theaters the opportunity to install 3D projectors. On release, "Avatar" broke several box office records during its initial theatrical run. It grossed $749.7 million in the United States and Canada and more than $2.74 billion worldwide, becoming the highest-grossing film of all time in the United States and Canada, surpassing "Titanic". It was the first film ever to earn more than $2 billion worldwide. "Avatar" was nominated for nine Academy Awards, including Best Picture and Best Director, and won three, for Best Art Direction, Best Cinematography and Best Visual Effects. In July 2010, an extended theatrical re-release generated a worldwide total of $33.2 million at the box office. In his mixed review, Sukhdev Sandhu of "The Telegraph" complimented the 3D but opined that Cameron "should have been more brutal in his editing". That year, "Vanity Fair" reported that Cameron earned $257 million, making him the highest earner in Hollywood.
In 2011, Cameron served as an executive producer for "Sanctum", a disaster-survival film about a cave-diving expedition that turns deadly. Although it received mixed reviews, the film earned a respectable $108 million at the worldwide box office. Cameron re-investigated the sinking of the RMS "Titanic" with eight experts in a 2012 TV documentary special, "Titanic: The Final Word with James Cameron", which premiered on April 8 on the National Geographic Channel; in it, the experts revised the CGI animation of the sinking conceived in 1995. In March 2010, Cameron had announced that "Titanic" would be converted and re-released in 3D to commemorate the centennial anniversary of the tragedy, and on March 27, 2012, "Titanic" 3D premiered at the Royal Albert Hall, London. He also served as executive producer of "Cirque du Soleil: Worlds Away" (2012) and "Deepsea Challenge 3D" (2014).
Cameron starred in the 2017 documentary "Atlantis Rising" with collaborator Simcha Jacobovici; the pair go on an adventure to explore the possible existence of the city of Atlantis. The programme aired on January 29 on the National Geographic Channel. Next, Cameron produced and appeared in "James Cameron's Story of Science Fiction", a six-episode documentary series about the history of science fiction, broadcast on AMC in 2018. The series featured interviews with guests including Ridley Scott, Steven Spielberg, George Lucas and Christopher Nolan. Cameron stated, "Without Jules Verne and H. G. Wells, there wouldn't have been Ray Bradbury or Robert A. Heinlein, and without them, there wouldn't be [George] Lucas, [Steven] Spielberg, Ridley Scott or me".
"" was finally released in 2019 after being in parallel development with "Avatar". Written by Cameron and his friend, Jon Landau, the film was directed by Robert Rodriguez. The film is based on a 1990s Japanese manga series "Battle Angel Alita," depicting a cyborg who cannot remember anything of her past life and tries to uncover the truth. Produced with similar techniques and technology as used in "Avatar," the film starred Rosa Salazar, Christoph Waltz, Jennifer Connelly, Mahershala Ali, Ed Skrein, Jackie Earle Haley and Keean Johnson. The film premiered on January 31, 2019 in London and received generally positive reviews from critics, and was financially successful, earning $404 million worldwide. In her review, Monica Castillo of "RogerEbert.com" called it, "an awe-inspiring jump for [Rodriguez]" and "a visual bonanza" despite the bulky script. Cameron returned to the "Terminator" franchise as producer and writer for "" (2019), directed by Tim Miller.
In August 2013, Cameron announced plans to direct three sequels to "Avatar" simultaneously, for release in December 2016, 2017, and 2018. However, the release dates have been postponed: the first sequel is due on December 17, 2021, with the following three to be released, respectively, on December 22, 2023, December 19, 2025, and December 17, 2027. "Deadline Hollywood" estimated the combined budget of the sequels at over $1 billion. "Avatar 2" and "Avatar 3" began simultaneous production in Manhattan Beach, California, on August 15, 2017, with principal photography beginning in New Zealand on September 25, 2017. The other sequels are expected to begin production as soon as "Avatar 2" and "3" have finished. Although sequels "4" and "5" have been given the green light, Cameron stated in a 2017 interview, "Let's face it, if "Avatar 2" and "3" don't make enough money, there's not going to be a "4" and "5"".
Lightstorm Entertainment bought the film rights to the Taylor Stevens novel "The Informationist", a thriller set in Africa, which Cameron plans to direct. He is also working on a film adaptation of "The Last Train from Hiroshima", Charles R. Pellegrino's book about the survivors of the atomic bombings of Hiroshima and Nagasaki; Cameron met with survivor Tsutomu Yamaguchi before his death in 2010.
As of 2012, Cameron and his family have adopted a vegan diet. Cameron states that "by changing what you eat, you will change the entire contract between the human species and the natural world". He and his wife are advocates of plant-based food and have called for constructive action to produce more plant-based food and less meat to mitigate the impact of climate change. In 2006, Cameron's wife co-founded MUSE School, which became the first K-12 vegan school in the United States. In early 2014, Cameron purchased the Beaufort Vineyard and Estate Winery in Courtenay, British Columbia, for $2.7 million, to pursue his passion for sustainable agribusiness. In 2018, Cameron served as executive producer of "The Game Changers", a documentary showcasing vegan athletes and other celebrities. In June 2019, Cameron announced a business venture with film director Peter Jackson to produce plant-based meat, cheese, and dairy products in New Zealand, suggesting the need for "a nice transition to a meatless or relatively meatless world in 20 or 30 years".
In June 2010, Cameron met with officials of the Environmental Protection Agency to discuss possible solutions to the Deepwater Horizon oil spill; it was reported that he offered his assistance to help stop the well from leaking. Cameron is a member of the NASA Advisory Council, and he worked with the space agency to build cameras for the Curiosity rover sent to Mars. However, NASA launched the rover without Cameron's technology due to a lack of testing time. He has expressed interest in a project about Mars, stating, "I've been very interested in the Humans to Mars movement [...] and I've done a tremendous amount of personal research for a novel, a miniseries, and a 3D film". Cameron is a member of the Mars Society, a non-profit organization lobbying for the colonization of Mars. In 2016, Cameron endorsed Democratic candidate Hillary Clinton for the 2016 U.S. presidential election.
Cameron has been married five times. He was married to Sharon Williams from 1978 to 1984. A year after their divorce, Cameron married film producer Gale Anne Hurd, a close collaborator on his 1980s films; they divorced in 1989. Soon after separating from Hurd, Cameron met the director Kathryn Bigelow, whom he wed in 1989 and divorced in 1991. Cameron then began a relationship with Linda Hamilton, the actress of "The Terminator" series; their daughter was born in 1993, and they married in 1997. Amid speculation of an affair between Cameron and actress Suzy Amis, Cameron and Hamilton separated after two years of marriage, with Hamilton receiving a settlement of $50 million. He married Amis, his fifth wife, in 2000. They have one son and two daughters together.
Cameron has resided in the United States since 1971, but remains a Canadian citizen. He applied for American citizenship in 2004, but withdrew his application after George W. Bush won that year's presidential election. Captivated by New Zealand while filming "Avatar", Cameron bought a home there, and he divides his time between California and New Zealand. Cameron has said he is a "converted agnostic", adding, "I've sworn off agnosticism, which I now call cowardly atheism". Cameron met his close friend Guillermo del Toro on the production of del Toro's 1993 film "Cronos". In 1998, del Toro's father was kidnapped in Guadalajara, and Cameron gave del Toro more than $1 million in cash to pay a ransom and have his father released.
In 2012, Cameron purchased more than 1,000 hectares of land in remote South Wairarapa, New Zealand; subsequent purchases have seen that grow to approximately 5,000 hectares. The Camerons grow a range of organic fruit, nuts and vegetables on the land. Nearby, in Greytown, they run a café and grocery store, Forest Food Organics, selling produce from their land.
Cameron became an expert on deep-sea exploration, in part through his work on "The Abyss" and "Titanic", as well as his childhood fascination with shipwrecks. In 2011, Cameron became a National Geographic explorer-in-residence. In that role, on March 7, 2012, he dived five miles down to the bottom of the New Britain Trench in the "Deepsea Challenger". Nineteen days later, Cameron reached the Challenger Deep, the deepest part of the Mariana Trench. He spent more than three hours exploring the ocean floor, becoming the first person to accomplish the trip alone. During his dive to the Challenger Deep, he discovered new species of sea cucumber, squid worm and a giant single-celled amoeba. He was preceded by unmanned dives in 1995 and 2009, as well as by Jacques Piccard and Don Walsh, the first men to reach the bottom of the Mariana Trench, aboard the bathyscaphe "Trieste" in 1960.
In June 2013, British artist Roger Dean filed a copyright complaint against Cameron, seeking damages of $50 million and accusing him of "wilful and deliberate copying, dissemination and exploitation" of his original images in "Avatar"; the case was dismissed by U.S. district judge Jesse Furman in 2014. In 2016, Premier Exhibitions, owner of many RMS "Titanic" artifacts, filed for bankruptcy. Cameron supported the decision of the U.K.'s National Maritime Museum and National Museums Northern Ireland to bid for the artifacts, but they were acquired by an investment group before a formal bid took place.
Cameron has been regarded as an innovative filmmaker in the industry, as well as a difficult one to work for. According to Ed Harris, who worked with Cameron on "The Abyss", Cameron behaved in an autocratic manner. Rebecca Keegan, author of "The Futurist: The Life and Films of James Cameron", describes Cameron as "comically hands-on", saying he would try to do every job on the set. Andrew Gumbel, of "The Independent", says Cameron "is a nightmare to work with. Studios [...] fear his habit of straying way over schedule and over budget. He is notorious on set for his uncompromising and dictatorial manner, as well as his flaming temper". Keller writes that Cameron is an egomaniac obsessed with his vision, but praises his "technological ingenuity" at creating a "visceral viewing experience".
Speaking of her experience filming "Titanic", Kate Winslet said that she admired Cameron but "there were times I was genuinely frightened of him". Describing him as having "a temper like you wouldn't believe", she said she would not work with him again unless it was "for a lot of money". Winslet, however, has since joined the cast of "Avatar 2". Her co-star Leonardo DiCaprio told "Esquire" magazine, "when somebody felt a different way on the set, there was a confrontation. He lets you know exactly how he feels", but complimented Cameron: "he's of the lineage of John Ford. He knows what he wants his film to be." Sam Worthington, who starred in "Avatar", said that if a mobile phone rang during filming, Cameron would "nail it to the wall with a nail gun". Film score composer James Horner was also not immune to Cameron's demands; he recalled having to write music in a short time frame for "Aliens" (1986). After the strained experience, Horner did not work with Cameron for a decade. They reconciled in 1996, and Horner went on to compose the scores for "Titanic" and "Avatar".
Despite this reputation, Bill Paxton and Sigourney Weaver have praised Cameron's perfectionism and attention to detail. Weaver said, "He really does want us to risk our lives and limbs for the shot, but he doesn't mind risking his own". In 2015, Weaver, along with Jamie Lee Curtis, applauded Cameron again. Curtis remarked, "he can do every other job [than acting]. I'm talking about every single department, from art direction to props to wardrobe to cameras, he knows more than everyone doing the job [...] He is so generous to actors". Weaver referred to Cameron as a "genius". Michael Biehn, a frequent collaborator, also praised Cameron, saying "Jim is a really passionate person. He cares more about his movies than other directors care about their movies", adding, "I've never seen him yell at anybody". Biehn, however, acknowledged that Cameron is "not real ["sic"] sensitive when it comes to actors and their trailers, and waiting for actors to come to the set". Worthington commented, "He demands excellence. If you don't give it to him, you're going to get chewed out. And that's a good thing". When asked in 2012 about his reputation, Cameron drily responded, "I don't have to shout any more, because the word is out there already".
Cameron's work has had an influence on the Hollywood film industry. "The Avengers" (2012), directed by Joss Whedon, was inspired by Cameron's approach to action sequences. Whedon also admires Cameron's ability to write heroic female characters such as Ellen Ripley (of "Aliens"), adding that he is "the leader and the teacher and the Yoda". Director Michael Bay idolizes Cameron and was convinced by him to use 3D cameras for filming "Transformers: Dark of the Moon" (2011). Cameron's approach to 3D also inspired Baz Luhrmann during the production of "The Great Gatsby" (2013). Other directors who have been inspired by Cameron include Peter Jackson, Neill Blomkamp, and Xavier Dolan.
Cameron's films are often built on themes that explore the conflicts between intelligent machines and humanity or nature, the dangers of corporate greed, strong female characters, and a romance subplot. Characters suffering in emotionally intense and dramatic sea-wilderness environments are explored in "The Abyss" and "Titanic". The "Terminator" series portrays technology as an enemy that could lead to the devastation of mankind. Similarly, "Avatar" portrays its tribal people as an honest group, whereas a "technologically advanced imperial culture is fundamentally evil".
Cameron received the inaugural Ray Bradbury Award from the Science Fiction and Fantasy Writers of America in 1992 for "Terminator 2: Judgment Day". In recognition of "a distinguished career as a Canadian filmmaker", Carleton University, Ottawa, awarded Cameron the honorary degree of Doctor of Fine Arts on June 13, 1998. He also received an honorary doctorate in 1998 from Brock University in St. Catharines, Ontario, for his accomplishments in the international film industry.
That year, Cameron attended a convocation to receive an honorary degree from Ryerson University, Toronto. The university awards its highest honor to those who have made extraordinary contributions in Canada or internationally. A year later, Cameron received an honorary Doctor of Fine Arts degree from California State University, Fullerton. He accepted the degree at the university's annual summer commencement exercise.
For his work in film, Cameron has been recognized by the Academy of Motion Picture Arts and Sciences. He is one of the few directors to have won three Academy Awards in a single year. For "Titanic", he won Best Director, Best Picture (shared with Jon Landau) and Best Film Editing (shared with Conrad Buff and Richard A. Harris). In 2009, he was nominated for Best Film Editing (shared with John Refoua and Stephen E. Rivkin), Best Director, and Best Picture for "Avatar". Cameron has won two Golden Globes: Best Director for "Titanic" and for "Avatar". He has been nominated for a number of BAFTA Awards, including Best Film for the same titles.
In recognition of his contributions to underwater filming and remote vehicle technology, the University of Southampton awarded Cameron the honorary degree of Doctor of the University in July 2004. Cameron accepted the award at the National Oceanography Centre. In 2008, Cameron received a star on Canada's Walk of Fame, and a year later he received the 2,396th star on the Hollywood Walk of Fame. On February 28, 2010, Cameron was honored with a Visual Effects Society (VES) Lifetime Achievement Award. In June 2012, Cameron was inducted into the Science Fiction Hall of Fame at the Museum of Pop Culture for his contributions to the science fiction and fantasy field. Inspired by "Avatar", Disney constructed "Pandora – The World of Avatar" at Disney's Animal Kingdom in Florida, which opened to the public on May 27, 2017. A species of frog, "Pristimantis jamescameroni", was named after Cameron for his work in promoting environmental awareness and his advocacy of veganism.
In 2010, Cameron was ranked at the top of "The Guardian" Film Power 100 list. In the same year, the British magazine "New Statesman" ranked Cameron 30th in its list of "The World's 50 Most Influential Figures 2010". In 2013, Cameron received the Nierenberg Prize for Science in the Public Interest, which is awarded annually by the Scripps Institution of Oceanography. In 2019, Cameron was appointed a Companion of the Order of Canada by Governor General Julie Payette, giving him the post-nominal letters "CC" for life.
As a director:
Cameron has frequently cast the same actors in films that he has directed, including Arnold Schwarzenegger and Michael Biehn. Bill Paxton appeared in five of Cameron's films before Paxton's death in 2017. Lance Henriksen and Jenette Goldstein have appeared in four and three of Cameron's films, respectively. For the "Avatar" sequels, much of the original cast will be reuniting with Cameron: C.C.H. Pounder, Giovanni Ribisi, Sam Worthington, Sigourney Weaver, Stephen Lang and Zoe Saldana.
Apart from acting, William Wisher Jr. also collaborated with Cameron on writing credits.
https://en.wikipedia.org/wiki?curid=15622
Judaism
Judaism (originally from Hebrew יהודה, "Yehudah", "Judah"; via Latin and Greek) is an ethnic religion comprising the collective religious, cultural and legal tradition and civilization of the Jewish people. Judaism is considered by religious Jews to be the expression of the covenant that God established with the Children of Israel. It encompasses a wide body of texts, practices, theological positions, and forms of organization. The Torah is part of the larger text known as the Tanakh or the Hebrew Bible, and is supplemented by an oral tradition represented in later texts such as the Midrash and the Talmud. With between 14.5 and 17.4 million adherents worldwide, Judaism is the tenth largest religion in the world.
Within Judaism there are a variety of movements, most of which emerged from Rabbinic Judaism, which holds that God revealed his laws and commandments to Moses on Mount Sinai in the form of both the Written and Oral Torah. Historically, all or part of this assertion was challenged by various groups such as the Sadducees and Hellenistic Judaism during the Second Temple period; the Karaites and Sabbateans during the early and later medieval period; and among segments of the modern non-Orthodox denominations. Modern branches of Judaism such as Humanistic Judaism may be nontheistic. Today, the largest Jewish religious movements are Orthodox Judaism (Haredi Judaism and Modern Orthodox Judaism), Conservative Judaism, and Reform Judaism. Major sources of difference between these groups are their approaches to Jewish law, the authority of the Rabbinic tradition, and the significance of the State of Israel. Orthodox Judaism maintains that the Torah and Jewish law are divine in origin, eternal and unalterable, and that they should be strictly followed. Conservative and Reform Judaism are more liberal, with Conservative Judaism generally promoting a more traditionalist interpretation of Judaism's requirements than Reform Judaism. A typical Reform position is that Jewish law should be viewed as a set of general guidelines rather than as a set of restrictions and obligations whose observance is required of all Jews. Historically, special courts enforced Jewish law; today, these courts still exist but the practice of Judaism is mostly voluntary. Authority on theological and legal matters is not vested in any one person or organization, but in the sacred texts and the rabbis and scholars who interpret them.
Judaism has its roots as an organized religion in the Middle East during the Bronze Age. It evolved from ancient Israelite religions around 500 BCE, and is considered one of the oldest monotheistic religions. The Hebrews and Israelites were already referred to as "Jews" in later books of the Tanakh such as the Book of Esther, with the term Jews replacing the title "Children of Israel". Judaism's texts, traditions and values strongly influenced later Abrahamic religions, including Christianity, Islam and the Baha'i Faith. Many aspects of Judaism have also directly or indirectly influenced secular Western ethics and civil law. Hebraism, like Hellenism, played a seminal role in the formation of Western civilization through its impact as a core background element of Early Christianity.
Jews are an ethnoreligious group including those born Jewish, in addition to converts to Judaism. In 2015, the world Jewish population was estimated at about 14.3 million, or roughly 0.2% of the total world population. About 43% of all Jews reside in Israel and another 43% reside in the United States and Canada, with most of the remainder living in Europe, and other minority groups spread throughout Latin America, Asia, Africa, and Australia.
Unlike other ancient Near Eastern gods, the Hebrew God is portrayed as unitary and solitary; consequently, the Hebrew God's principal relationships are not with other gods, but with the world, and more specifically, with the people he created. Judaism thus begins with ethical monotheism: the belief that God is one and is concerned with the actions of mankind. According to the Tanakh (Hebrew Bible), God promised Abraham to make of his offspring a great nation. Many generations later, he commanded the nation of Israel to love and worship only one God; that is, the Jewish nation is to reciprocate God's concern for the world. He also commanded the Jewish people to love one another; that is, Jews are to imitate God's love for people. These commandments are but two of a large corpus of commandments and laws that constitute this covenant, which is the substance of Judaism.
Thus, although there is an esoteric tradition in Judaism (Kabbalah), Rabbinic scholar Max Kadushin has characterized normative Judaism as "normal mysticism", because it involves everyday personal experiences of God through ways or modes that are common to all Jews. This is played out through the observance of the Halakha (Jewish law) and given verbal expression in the Birkat Ha-Mizvot, the short blessings that are spoken every time a positive commandment is to be fulfilled.
Whereas Jewish philosophers often debate whether God is immanent or transcendent, and whether people have free will or their lives are determined, Halakha is a system through which any Jew acts to bring God into the world.
Ethical monotheism is central in all sacred or normative texts of Judaism. However, monotheism has not always been followed in practice. The Jewish Bible (Tanakh) records and repeatedly condemns the widespread worship of other gods in ancient Israel. In the Greco-Roman era, many different interpretations of monotheism existed in Judaism, including the interpretations that gave rise to Christianity.
Moreover, some have argued that Judaism is a non-creedal religion that does not require one to believe in God. For some, observance of Jewish law is more important than belief in God "per se". In modern times, some liberal Jewish movements do not accept the existence of a personified deity active in history. The debate about whether one can speak of authentic or normative Judaism is not only a debate among religious Jews but also among historians.
Scholars throughout Jewish history have proposed numerous formulations of Judaism's core tenets, all of which have met with criticism. The most popular formulation is Maimonides' thirteen principles of faith, developed in the 12th century. According to Maimonides, any Jew who rejects even one of these principles would be considered an apostate and a heretic. Jewish scholars have held points of view diverging in various ways from Maimonides' principles.
In Maimonides' time, his list of tenets was criticized by Hasdai Crescas and Joseph Albo. Albo and the Raavad argued that Maimonides' principles contained too many items that, while true, were not fundamentals of the faith.
Along these lines, the ancient historian Josephus emphasized practices and observances rather than religious beliefs, associating apostasy with a failure to observe Jewish law and maintaining that the requirements for conversion to Judaism included circumcision and adherence to traditional customs. Maimonides' principles were largely ignored over the next few centuries. Later, two poetic restatements of these principles (""Ani Ma'amin"" and ""Yigdal"") became integrated into many Jewish liturgies, leading to their eventual near-universal acceptance.
In modern times, Judaism lacks a centralized authority that would dictate an exact religious dogma. Because of this, many different variations on the basic beliefs are considered within the scope of Judaism. Even so, all Jewish religious movements are, to a greater or lesser extent, based on the principles of the Hebrew Bible and various commentaries such as the Talmud and Midrash. Judaism also universally recognizes the Biblical Covenant between God and the Patriarch Abraham as well as the additional aspects of the Covenant revealed to Moses, who is considered Judaism's greatest prophet. In the Mishnah, a core text of Rabbinic Judaism, acceptance of the Divine origins of this covenant is considered an essential aspect of Judaism and those who reject the Covenant forfeit their share in the World to Come.
Establishing the core tenets of Judaism in the modern era is even more difficult, given the number and diversity of the contemporary Jewish denominations. Even if the problem is restricted to the most influential intellectual trends of the nineteenth and twentieth centuries, the matter remains complicated. Thus, for instance, Joseph Soloveitchik (associated with the Modern Orthodox movement) answers modernity by identifying Judaism with following the halakha, whose ultimate goal is to bring holiness down to the world. Mordecai Kaplan, the founder of Reconstructionist Judaism, abandons the idea of religion in favor of identifying Judaism with civilization; by means of the latter term and a secular translation of the core ideas, he tries to embrace as many Jewish denominations as possible. In turn, Solomon Schechter's Conservative Judaism identified Judaism with tradition, understood as the interpretation of the Torah: a history of constant updating and adjustment of the Law through creative interpretation. Finally, David Philipson draws the outlines of the Reform movement by opposing it to the strict and traditional rabbinical approach, and thus comes to conclusions similar to those of the Conservative movement.
The following is a basic, structured list of the central works of Jewish practice and thought.
Many traditional Jewish texts are available online in various Torah databases (electronic versions of the Traditional Jewish Bookshelf). Many of these have advanced search options available.
The basis of Jewish law and tradition (halakha) is the Torah (also known as the Pentateuch or the Five Books of Moses). According to rabbinic tradition, there are 613 commandments in the Torah. Some of these laws are directed only to men or to women, some only to the ancient priestly groups, the Kohanim and Leviyim (members of the tribe of Levi), some only to farmers within the Land of Israel. Many laws were only applicable when the Temple in Jerusalem existed, and only 369 of these commandments are still applicable today.
While there have been Jewish groups whose beliefs were based on the written text of the Torah alone (e.g., the Sadducees, and the Karaites), most Jews believe in the oral law. These oral traditions were transmitted by the Pharisee school of thought of ancient Judaism and were later recorded in written form and expanded upon by the rabbis.
According to Rabbinical Jewish tradition, God gave both the Written Law (the Torah) and the Oral law to Moses on Mount Sinai. The Oral law is the oral tradition as relayed by God to Moses and from him, transmitted and taught to the sages (rabbinic leaders) of each subsequent generation.
For centuries, the Torah appeared only as a written text transmitted in parallel with the oral tradition. Fearing that the oral teachings might be forgotten, Rabbi Judah haNasi undertook the mission of consolidating the various opinions into one body of law which became known as the "Mishnah".
The Mishnah consists of 63 tractates codifying Jewish law, which are the basis of the "Talmud." According to Abraham ben David, the "Mishnah" was compiled by Rabbi Judah haNasi after the destruction of Jerusalem, in anno mundi 3949, which corresponds to 189 CE.
Over the next four centuries, the Mishnah underwent discussion and debate in both of the world's major Jewish communities (in Israel and Babylonia). The commentaries from each of these communities were eventually compiled into the two Talmuds, the Jerusalem Talmud ("Talmud Yerushalmi") and the Babylonian Talmud ("Talmud Bavli"). These have been further expounded by commentaries of various Torah scholars during the ages.
In the text of the Torah, many words are left undefined and many procedures are mentioned without explanation or instructions. Such phenomena are sometimes offered to validate the viewpoint that the Written Law has always been transmitted with a parallel oral tradition, illustrating the assumption that the reader is already familiar with the details from other, i.e., oral, sources.
Halakha, the rabbinic Jewish way of life, then, is based on a combined reading of the Torah and the oral tradition: the Mishnah, the halakhic Midrash, the Talmud, and its commentaries. The Halakha has developed slowly, through a precedent-based system. The literature of questions to rabbis, and their considered answers, is referred to as responsa (in Hebrew, "Sheelot U-Teshuvot"). Over time, as practices develop, codes of Jewish law are written that are based on the responsa; the most important code, the Shulchan Aruch, largely determines Orthodox religious practice today.
Jewish philosophy refers to the conjunction between serious study of philosophy and Jewish theology. Major Jewish philosophers include Solomon ibn Gabirol, Saadia Gaon, Judah Halevi, Maimonides, and Gersonides. Major changes occurred in response to the Enlightenment (late 18th to early 19th century) leading to the post-Enlightenment Jewish philosophers. Modern Jewish philosophy consists of both Orthodox and non-Orthodox oriented philosophy. Notable among Orthodox Jewish philosophers are Eliyahu Eliezer Dessler, Joseph B. Soloveitchik, and Yitzchok Hutner. Well-known non-Orthodox Jewish philosophers include Martin Buber, Franz Rosenzweig, Mordecai Kaplan, Abraham Joshua Heschel, Will Herberg, and Emmanuel Lévinas.
Orthodox and many other Jews do not believe that the revealed Torah consists solely of its written contents, but of its interpretations as well. The study of Torah (in its widest sense, including poetry, narrative, and law, and encompassing both the Hebrew Bible and the Talmud) is in Judaism itself a sacred act of central importance. For the sages of the Mishnah and Talmud, and for their successors today, the study of Torah was therefore not merely a means to learn the contents of God's revelation, but an end in itself. According to the Talmud,
In Judaism, "the study of Torah can be a means of experiencing God". Reflecting on the contribution of the Amoraim and Tanaim to contemporary Judaism, Professor Jacob Neusner observed:
To study the Written Torah and the Oral Torah in light of each other is thus also to study "how" to study the word of God.
In the study of Torah, the sages formulated and followed various logical and hermeneutical principles. According to David Stern, all Rabbinic hermeneutics rest on two basic axioms:
These two principles make possible a great variety of interpretations. According to the Talmud,
Observant Jews thus view the Torah as dynamic, because it contains within it a host of interpretations.
According to Rabbinic tradition, all valid interpretations of the written Torah were revealed to Moses at Sinai in oral form, and handed down from teacher to pupil (the oral revelation is in effect coextensive with the Talmud itself). When different rabbis put forward conflicting interpretations, they sometimes appealed to hermeneutic principles to legitimize their arguments; some rabbis claim that these principles were themselves revealed by God to Moses at Sinai.
Thus, Hillel called attention to seven commonly used hermeneutical principles in the interpretation of laws (baraita at the beginning of Sifra); R. Ishmael, thirteen (baraita at the beginning of Sifra; this collection is largely an amplification of that of Hillel). Eliezer b. Jose ha-Gelili listed 32, largely used for the exegesis of narrative elements of Torah. All the hermeneutic rules scattered through the Talmudim and Midrashim have been collected by Malbim in "Ayyelet ha-Shachar", the introduction to his commentary on the Sifra. Nevertheless, R. Ishmael's 13 principles are perhaps the ones most widely known; they constitute an important, and one of Judaism's earliest, contributions to logic, hermeneutics, and jurisprudence. Judah Hadassi incorporated Ishmael's principles into Karaite Judaism in the 12th century. Today R. Ishmael's 13 principles are incorporated into the Jewish prayer book to be read by observant Jews on a daily basis.
The term "Judaism" derives from "Iudaismus", a Latinized form of the Ancient Greek "Ioudaismos" (Ἰουδαϊσμός) (from the verb , "to side with or imitate the [Judeans]"). Its ultimate source was the Hebrew יהודה, "Yehudah", "Judah", which is also the source of the Hebrew term for Judaism: יַהֲדוּת, "Yahadut". The term "Ἰουδαϊσμός" first appears in the Hellenistic Greek book of 2 Maccabees in the 2nd century BCE. In the context of the age and period it meant "seeking or forming part of a cultural entity" and it resembled its antonym "hellenismos", a word that signified a people's submission to Hellenic (Greek) cultural norms. The conflict between "iudaismos" and "hellenismos" lay behind the Maccabean revolt and hence the invention of the term "iudaismos".
Shaye J. D. Cohen writes in his book "The Beginnings of Jewishness":
According to the "Oxford English Dictionary" the earliest citation in English where the term was used to mean "the profession or practice of the Jewish religion; the religious system or polity of the Jews" is Robert Fabyan's "The newe cronycles of Englande and of Fraunce" (1516). "Judaism" as a direct translation of the Latin "Iudaismus" first occurred in a 1611 English translation of the apocrypha (Deuterocanon in Catholic and Eastern Orthodoxy), 2 Macc. ii. 21: "Those that behaved themselves manfully to their honour for Iudaisme."
According to Daniel Boyarin, the underlying distinction between religion and ethnicity is foreign to Judaism itself, and is one form of the dualism between spirit and flesh that has its origin in Platonic philosophy and that permeated Hellenistic Judaism. Consequently, in his view, Judaism does not fit easily into conventional Western categories, such as religion, ethnicity, or culture. Boyarin suggests that this in part reflects the fact that much of Judaism's more than 3,000-year history predates the rise of Western culture and occurred outside the West (that is, Europe, particularly medieval and modern Europe). During this time, Jews experienced slavery, anarchic and theocratic self-government, conquest, occupation, and exile. In the Diaspora, they were in contact with, and influenced by, ancient Egyptian, Babylonian, Persian, and Hellenic cultures, as well as modern movements such as the Enlightenment (see Haskalah) and the rise of nationalism, which would bear fruit in the form of a Jewish state in their ancient homeland, the Land of Israel. They also saw an elite population convert to Judaism (the Khazars), only to disappear as the centers of power in the lands once occupied by that elite fell to the people of Rus and then the Mongols. Thus, Boyarin has argued that "Jewishness disrupts the very categories of identity, because it is not national, not genealogical, not religious, but all of these, in dialectical tension."
In contrast to this point of view, practices such as Humanistic Judaism reject the religious aspects of Judaism, while retaining certain cultural traditions.
According to Rabbinic Judaism, a Jew is anyone who was either born of a Jewish mother or who converted to Judaism in accordance with Jewish Law. Reconstructionist Judaism and the larger denominations of worldwide Progressive Judaism (also known as Liberal or Reform Judaism) accept a child as Jewish if one of the parents is Jewish and the parents raise the child with a Jewish identity; the smaller regional branches do not. All mainstream forms of Judaism today are open to sincere converts, although conversion has traditionally been discouraged since the time of the Talmud. The conversion process is evaluated by an authority, and the convert is examined on his or her sincerity and knowledge. Converts are called "ben Abraham" or "bat Abraham" (son or daughter of Abraham). Conversions have on occasion been overturned. In 2008, Israel's highest religious court invalidated the conversion of 40,000 Jews, mostly from Russian immigrant families, even though they had been approved by an Orthodox rabbi.
Rabbinical Judaism maintains that a Jew, whether by birth or conversion, is a Jew forever. Thus a Jew who claims to be an atheist or converts to another religion is still considered by traditional Judaism to be Jewish. According to some sources, the Reform movement has maintained that a Jew who has converted to another religion is no longer a Jew, and the Israeli Government has also taken that stance after Supreme Court cases and statutes. However, the Reform movement has indicated that this is not so cut and dried, and different situations call for consideration and differing actions. For example, Jews who have converted under duress may be permitted to return to Judaism "without any action on their part but their desire to rejoin the Jewish community" and "A proselyte who has become an apostate remains, nevertheless, a Jew".
Karaite Judaism believes that Jewish identity can only be transmitted by patrilineal descent, although a minority of modern Karaites believe that Jewish identity requires both parents to be Jewish, not only the father. They argue that only patrilineal descent can transmit Jewish identity on the grounds that all descent in the Torah went according to the male line.
The question of what determines Jewish identity in the State of Israel was given new impetus when, in the 1950s, David Ben-Gurion requested opinions on "mihu Yehudi" ("Who is a Jew") from Jewish religious authorities and intellectuals worldwide in order to settle citizenship questions. This is still not settled, and occasionally resurfaces in Israeli politics.
Historical definitions of Jewish identity have traditionally been based on "halakhic" definitions of matrilineal descent and halakhic conversions. Historical definitions of who is a Jew date back to the codification of the Oral Torah into the Babylonian Talmud, around 200 CE. Interpretations of sections of the Tanakh, such as Deuteronomy 7:1–5, by Jewish sages are used as a warning against intermarriage between Jews and Canaanites because "[the non-Jewish husband] will cause your child to turn away from Me and they will worship the gods (i.e., idols) of others." Leviticus 24:10 says that the son in a marriage between a Hebrew woman and an Egyptian man is "of the community of Israel." This is complemented by Ezra 10:2–3, where Israelites returning from Babylon vow to put aside their gentile wives and their children. A popular theory is that the rape of Jewish women in captivity brought about the law of Jewish identity being inherited through the maternal line, although scholars challenge this theory, citing the Talmudic establishment of the law from the pre-exile period. Since the anti-religious "Haskalah" movement of the late 18th and 19th centuries, "halakhic" interpretations of Jewish identity have been challenged.
The total number of Jews worldwide is difficult to assess because the definition of "who is a Jew" is problematic; not all Jews identify themselves as Jewish, and some who identify as Jewish are not considered so by other Jews. According to the "Jewish Year Book" (1901), the global Jewish population in 1900 was around 11 million. The latest available data is from the World Jewish Population Survey of 2002 and the Jewish Year Calendar (2005). In 2002, according to the Jewish Population Survey, there were 13.3 million Jews around the world. The Jewish Year Calendar cites 14.6 million. Jewish population growth is currently near zero percent, with 0.3% growth from 2000 to 2001.
Rabbinic Judaism (or in some Christian traditions, Rabbinism) (Hebrew: "Yahadut Rabanit" – יהדות רבנית) has been the mainstream form of Judaism since the 6th century CE, after the codification of the Talmud. It is characterised by the belief that the Written Torah (Written Law) cannot be correctly interpreted without reference to the Oral Torah and the voluminous literature specifying what behavior is sanctioned by the Law.
The Jewish Enlightenment of the late 18th century resulted in the division of Ashkenazi (Western) Jewry into religious movements or denominations, especially in North America and Anglophone countries. The main denominations today outside Israel (where the situation is rather different) are Orthodox, Conservative, and Reform.
While traditions and customs (see also "Sephardic law and customs") vary between discrete communities, it can be said that Sephardi and Mizrahi Jewish communities do not generally adhere to the "movement" framework popular in and among Ashkenazi Jewry. Historically, Sephardi and Mizrahi communities have eschewed denominations in favour of a "big tent" approach. This is particularly the case in contemporary Israel, which is home to the largest communities of Sephardi and Mizrahi Jews in the world. (However, individual Sephardi and Mizrahi Jews may be members of or attend synagogues that do adhere to one Ashkenazi-inflected movement or another.)
Sephardi and Mizrahi observance of Judaism tends toward the conservative, and prayer rites are reflective of this, with the text of each rite being largely unchanged since their respective inception. Observant Sephardim may follow the teachings of a particular rabbi or school of thought; for example, the Sephardic Chief Rabbi of Israel.
Most Jewish Israelis classify themselves as "secular" ("hiloni"), "traditional" ("masorti"), "religious" ("dati") or "Haredi". The term "secular" is more popular as a self-description among Israeli families of western (European) origin, whose Jewish identity may be a very powerful force in their lives, but who see it as largely independent of traditional religious belief and practice. This portion of the population largely ignores organized religious life, be it of the official Israeli rabbinate (Orthodox) or of the liberal movements common to diaspora Judaism (Reform, Conservative).
The term "traditional" ("masorti") is most common as a self-description among Israeli families of "eastern" origin (i.e., the Middle East, Central Asia, and North Africa). This term, as commonly used, has nothing to do with the Conservative Judaism, which also names itself "Masorti" outside North America. There is a great deal of ambiguity in the ways "secular" and "traditional" are used in Israel: they often overlap, and they cover an extremely wide range in terms of worldview and practical religious observance. The term "Orthodox" is not popular in Israeli discourse, although the percentage of Jews who come under that category is far greater than in the diaspora. What would be called "Orthodox" in the diaspora includes what is commonly called "dati" (religious) or "haredi" (ultra-Orthodox) in Israel. The former term includes what is called "Religious Zionism" or the "National Religious" community, as well as what has become known over the past decade or so as "haredi-leumi" (nationalist "haredi"), or "Hardal", which combines a largely "haredi" lifestyle with nationalist ideology. (Some people, in Yiddish, also refer to observant Orthodox Jews as "frum", as opposed to "frei" (more liberal Jews)).
"Haredi" applies to a populace that can be roughly divided into three separate groups along both ethnic and ideological lines: (1) "Lithuanian" (non-hasidic) "haredim" of Ashkenazic origin; (2) Hasidic "haredim" of Ashkenazic origin; and (3) Sephardic "haredim".
Karaite Judaism defines itself as the remnants of the non-Rabbinic Jewish sects of the Second Temple period, such as the Sadducees. The Karaites ("Scripturalists") accept only the Hebrew Bible and what they view as the Peshat ("simple" meaning); they do not accept non-biblical writings as authoritative. Some European Karaites do not see themselves as part of the Jewish community at all, although most do.
The Samaritans, a very small community located entirely around Mount Gerizim in the Nablus/Shechem region of the West Bank and in Holon, near Tel Aviv in Israel, regard themselves as the descendants of the Israelites of the Iron Age kingdom of Israel. Their religious practices are based on the literal text of the written Torah (Five Books of Moses), which they view as the only authoritative scripture (with a special regard also for the Samaritan Book of Joshua).
"See also: Haymanot; Beta Israel."
Haymanot (meaning "religion" in Ge'ez and Amharic) refers to the Judaism practiced by Ethiopian Jews. This version of Judaism differs substantially from Rabbinic, Karaite, and Samaritan Judaisms, Ethiopian Jews having diverged from their coreligionists earlier. Sacred scriptures (the Orit) are written in Ge'ez, not Hebrew, and dietary laws are based strictly on the text of the Orit, without explication from ancillary commentaries. Holidays also differ, with some Rabbinic holidays not observed in Ethiopian Jewish communities, and some additional holidays, like Sigd.
"See also: Noahidism."
Noahidism or Noachidism is a monotheistic branch of Judaism based on the Seven Laws of Noah and their traditional interpretations within Rabbinic Judaism. According to Jewish law, non-Jews (Gentiles) are not obligated to convert to Judaism, but they are required to observe the Seven Laws of Noah to be assured of a place in the World to Come (Olam Ha-Ba), the final reward of the righteous. The divinely ordained penalty for violating any of these Noahide Laws is discussed in the Talmud, but in practical terms it is subject to the working legal system established by the society at large. Those who subscribe to the observance of the Noahic Covenant are referred to as B'nei Noach (Hebrew: בני נח, "Children of Noah") or Noahides. Supporting organizations have been established around the world over the past decades by either Noahides or Orthodox Jews.
Historically, the Hebrew term "B'nei Noach" has applied to all non-Jews as descendants of Noah. However, nowadays it is primarily used to refer specifically to those non-Jews who observe the Seven Laws of Noah.
Jewish ethics may be guided by halakhic traditions, by other moral principles, or by central Jewish virtues. Jewish ethical practice is typically understood to be marked by values such as justice, truth, peace, loving-kindness (chesed), compassion, humility, and self-respect. Specific Jewish ethical practices include practices of charity (tzedakah) and refraining from negative speech (lashon hara). Proper ethical practices regarding sexuality and many other issues are subjects of dispute among Jews.
Traditionally, Jews recite prayers three times daily, Shacharit, Mincha, and Ma'ariv with a fourth prayer, Mussaf added on Shabbat and holidays. At the heart of each service is the "Amidah" or "Shemoneh Esrei". Another key prayer in many services is the declaration of faith, the "Shema Yisrael" (or "Shema"). The "Shema" is the recitation of a verse from the Torah (Deuteronomy 6:4): "Shema Yisrael Adonai Eloheinu Adonai Echad"—"Hear, O Israel! The Lord is our God! The Lord is One!"
Most of the prayers in a traditional Jewish service can be recited in solitary prayer, although communal prayer is preferred. Communal prayer requires a quorum of ten adult Jews, called a "minyan". In nearly all Orthodox and a few Conservative circles, only male Jews are counted toward a "minyan"; most Conservative Jews and members of other Jewish denominations count female Jews as well.
In addition to prayer services, observant traditional Jews recite prayers and benedictions throughout the day when performing various acts. Prayers are recited upon waking up in the morning, before eating or drinking different foods, after eating a meal, and so on.
The approach to prayer varies among the Jewish denominations. Differences can include the texts of prayers, the frequency of prayer, the number of prayers recited at various religious events, the use of musical instruments and choral music, and whether prayers are recited in the traditional liturgical languages or the vernacular. In general, Orthodox and Conservative congregations adhere most closely to tradition, and Reform and Reconstructionist synagogues are more likely to incorporate translations and contemporary writings in their services. Also, in most Conservative synagogues, and all Reform and Reconstructionist congregations, women participate in prayer services on an equal basis with men, including roles traditionally filled only by men, such as reading from the Torah. In addition, many Reform temples use musical accompaniment such as organs and mixed choirs.
A "kippah" (Hebrew: כִּפָּה, plural "kippot"; Yiddish: יאַרמלקע, "yarmulke") is a slightly rounded brimless skullcap worn by many Jews while praying, eating, reciting blessings, or studying Jewish religious texts, and at all times by some Jewish men. In Orthodox communities, only men wear kippot; in non-Orthodox communities, some women also wear kippot. "Kippot" range in size from a small round beanie that covers only the back of the head to a large, snug cap that covers the whole crown.
"Tzitzit" (Hebrew: צִיציִת) (Ashkenazi pronunciation: "tzitzis") are special knotted "fringes" or "tassels" found on the four corners of the "tallit" (Hebrew: טַלִּית) (Ashkenazi pronunciation: "tallis"), or prayer shawl. The "tallit" is worn by Jewish men and some Jewish women during the prayer service. Customs vary regarding when a Jew begins wearing a tallit. In the Sephardi community, boys wear a tallit from bar mitzvah age. In some Ashkenazi communities, it is customary to wear one only after marriage. A "tallit katan" (small tallit) is a fringed garment worn under the clothing throughout the day. In some Orthodox circles, the fringes are allowed to hang freely outside the clothing.
Tefillin (Hebrew: תְפִלִּין), known in English as phylacteries (from the Greek word φυλακτήριον, meaning "safeguard" or "amulet"), are two square leather boxes containing biblical verses, attached to the forehead and wound around the left arm by leather straps. They are worn during weekday morning prayer by observant Jewish men and some Jewish women.
A "kittel" (Yiddish: קיטל), a white knee-length overgarment, is worn by prayer leaders and some observant traditional Jews on the High Holidays. It is traditional for the head of the household to wear a kittel at the Passover seder in some communities, and some grooms wear one under the wedding canopy. Jewish males are buried in a "tallit" and sometimes also a "kittel" which are part of the "tachrichim" (burial garments).
Jewish holidays are special days in the Jewish calendar, which celebrate moments in Jewish history, as well as central themes in the relationship between God and the world, such as creation, revelation, and redemption.
"Shabbat", the weekly day of rest lasting from shortly before sundown on Friday night to nightfall on Saturday night, commemorates God's day of rest after six days of creation. It plays a pivotal role in Jewish practice and is governed by a large corpus of religious law. At sundown on Friday, the woman of the house welcomes the Shabbat by lighting two or more candles and reciting a blessing. The evening meal begins with the Kiddush, a blessing recited aloud over a cup of wine, and the Mohtzi, a blessing recited over the bread. It is customary to have challah, two braided loaves of bread, on the table. During Shabbat, Jews are forbidden to engage in any activity that falls under 39 categories of "melakhah", translated literally as "work". In fact the activities banned on the Sabbath are not "work" in the usual sense: They include such actions as lighting a fire, writing, using money and carrying in the public domain. The prohibition of lighting a fire has been extended in the modern era to driving a car, which involves burning fuel and using electricity.
Jewish holy days ("chaggim"), celebrate landmark events in Jewish history, such as the Exodus from Egypt and the giving of the Torah, and sometimes mark the change of seasons and transitions in the agricultural cycle. The three major festivals, Sukkot, Passover and Shavuot, are called "regalim" (derived from the Hebrew word "regel", or foot). On the three regalim, it was customary for the Israelites to make pilgrimages to Jerusalem to offer sacrifices in the Temple.
The High Holidays ("Yamim Noraim" or "Days of Awe") revolve around judgment and forgiveness.
Purim (Hebrew: "Pûrîm", "lots") is a joyous Jewish holiday that commemorates the deliverance of the Persian Jews from the plot of the evil Haman, who sought to exterminate them, as recorded in the biblical Book of Esther. It is characterized by public recitation of the Book of Esther, mutual gifts of food and drink, charity to the poor, and a celebratory meal (Esther 9:22). Other customs include drinking wine, eating special pastries called hamantashen, dressing up in masks and costumes, and organizing carnivals and parties.
Purim is celebrated annually on the 14th of the Hebrew month of Adar, which occurs in February or March of the Gregorian calendar.
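The mapping from this fixed Hebrew date to a shifting Gregorian date can be computed programmatically. The following is a minimal sketch in Python, assuming the third-party "convertdate" package and its convention of numbering Hebrew months from Nisan = 1 (so Adar is month 12, or Adar II, month 13, in leap years); it is an illustrative example, not a liturgical authority.

    # Minimal sketch: find the Gregorian date on which Purim (14 Adar) falls
    # for a given Hebrew year, using the third-party convertdate package
    # (pip install convertdate). The month numbering is an assumption taken
    # from that library's convention (Nisan = 1).
    from convertdate import hebrew

    def purim_gregorian(hebrew_year):
        # In a Hebrew leap year Purim is observed in Adar II (month 13);
        # otherwise it falls in Adar (month 12).
        adar = 13 if hebrew.leap(hebrew_year) else 12
        return hebrew.to_gregorian(hebrew_year, adar, 14)  # (year, month, day)

    print(purim_gregorian(5785))  # expected: (2025, 3, 14), i.e., mid-March

The same conversion locates other fixed-date observances; for instance, the start of Hanukkah corresponds to the 25th of Kislev (month 9 under the same convention).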
Hanukkah (, "dedication") also known as the Festival of Lights, is an eight-day Jewish holiday that starts on the 25th day of Kislev (Hebrew calendar). The festival is observed in Jewish homes by the kindling of lights on each of the festival's eight nights, one on the first night, two on the second night and so on.
The holiday was called Hanukkah (meaning "dedication") because it marks the re-dedication of the Temple after its desecration by Antiochus IV Epiphanes. Spiritually, Hanukkah commemorates the "Miracle of the Oil". According to the Talmud, at the re-dedication of the Temple in Jerusalem following the victory of the Maccabees over the Seleucid Empire, there was only enough consecrated oil to fuel the eternal flame in the Temple for one day. Miraculously, the oil burned for eight days—which was the length of time it took to press, prepare and consecrate new oil.
Hanukkah is not mentioned in the Bible and was never considered a major holiday in Judaism, but it has become much more visible and widely celebrated in modern times, mainly because it falls around the same time as Christmas and has national Jewish overtones that have been emphasized since the establishment of the State of Israel.
Tisha B'Av ("the Ninth of Av") is a day of mourning and fasting commemorating the destruction of the First and Second Temples, and in later times, the expulsion of the Jews from Spain.
There are three more minor Jewish fast days that commemorate various stages of the destruction of the Temples: the 17th of Tammuz, the 10th of Tevet, and Tzom Gedaliah (the 3rd of Tishrei).
The modern holidays of Yom Ha-shoah (Holocaust Remembrance Day), Yom Hazikaron (Israeli Memorial Day) and Yom Ha'atzmaut (Israeli Independence Day) commemorate the horrors of the Holocaust, the fallen soldiers of Israel and victims of terrorism, and Israeli independence, respectively.
There are some who prefer to commemorate those who were killed in the Holocaust on the 10th of Tevet.
The core of festival and Shabbat prayer services is the public reading of the Torah, along with connected readings from the other books of the Tanakh, called Haftarah. Over the course of a year, the whole Torah is read, with the cycle starting over in the autumn, on Simchat Torah.
Synagogues are Jewish houses of prayer and study. They usually contain separate rooms for prayer (the main sanctuary), smaller rooms for study, and often an area for community or educational use. There is no set blueprint for synagogues, and the architectural shapes and interior designs of synagogues vary greatly. The Reform movement mostly refers to its synagogues as temples. Some traditional features of a synagogue are:
In addition to synagogues, other buildings of significance in Judaism include yeshivas, or institutions of Jewish learning, and mikvahs, which are ritual baths.
The Jewish dietary laws are known as "kashrut". Food prepared in accordance with them is termed kosher, and food that is not kosher is also known as "treifah" or "treif". People who observe these laws are colloquially said to be "keeping kosher".
Many of the laws apply to animal-based foods. For example, in order to be considered kosher, mammals must have split hooves and chew their cud. The pig is arguably the most well-known example of a non-kosher animal. Although it has split hooves, it does not chew its cud. For seafood to be kosher, the animal must have fins and scales. Certain types of seafood, such as shellfish, crustaceans, and eels, are therefore considered non-kosher. Concerning birds, a list of non-kosher species is given in the Torah. The exact translations of many of the species have not survived, and some non-kosher birds' identities are no longer certain. However, traditions exist about the "kashrut" status of a few birds. For example, both chickens and turkeys are permitted in most communities. Other types of animals, such as amphibians, reptiles, and most insects, are prohibited altogether.
In addition to the requirement that the species be considered kosher, meat and poultry (but not fish) must come from a healthy animal slaughtered in a process known as "shechitah". Without the proper slaughtering practices even an otherwise kosher animal will be rendered "treif". The slaughtering process is intended to be quick and relatively painless to the animal. Forbidden parts of animals include the blood, some fats, and the area in and around the sciatic nerve.
Jewish law also forbids the consumption of meat and dairy products together. The waiting period between eating meat and eating dairy varies by the order in which they are consumed and by community, and can extend for up to six hours. Based on the Biblical injunction against cooking a kid in its mother's milk, this rule is mostly derived from the Oral Torah, the Talmud and Rabbinic law. Chicken and other kosher birds are considered the same as meat under the laws of "kashrut", but the prohibition is Rabbinic, not Biblical.
The use of dishes, serving utensils, and ovens may make food "treif" that would otherwise be kosher. Utensils that have been used to prepare non-kosher food, or dishes that have held meat and are now used for dairy products, render the food "treif" under certain conditions.
Furthermore, all Orthodox and some Conservative authorities forbid the consumption of processed grape products made by non-Jews, due to ancient pagan practices of using wine in rituals. Some Conservative authorities permit wine and grape juice made without rabbinic supervision.
The Torah does not give specific reasons for most of the laws of "kashrut". However, a number of explanations have been offered, including maintaining ritual purity, teaching impulse control, encouraging obedience to God, improving health, reducing cruelty to animals and preserving the distinctness of the Jewish community. The various categories of dietary laws may have developed for different reasons, and some may exist for multiple reasons. For example, people are forbidden from consuming the blood of birds and mammals because, according to the Torah, this is where animal souls are contained. In contrast, the Torah forbids Israelites from eating non-kosher species because "they are unclean". The Kabbalah describes sparks of holiness that are released by the act of eating kosher foods, but are too tightly bound in non-kosher foods to be released by eating.
Survival concerns supersede all the laws of "kashrut", as they do for most halakhot.
The Tanakh describes circumstances in which a person who is "tahor" or ritually pure may become "tamei" or ritually impure. Some of these circumstances are contact with human corpses or graves, seminal flux, vaginal flux, menstruation, and contact with people who have become impure from any of these. In Rabbinic Judaism, Kohanim, members of the hereditary caste that served as priests in the time of the Temple, are mostly restricted from entering grave sites and touching dead bodies. During the Temple period, such priests (Kohanim) were required to eat their bread offering (Terumah) in a state of ritual purity, laws which eventually led to more rigid rules being enacted, such as hand-washing, which became a requisite for all Jews before consuming ordinary bread.
An important subcategory of the ritual purity laws relates to the segregation of menstruating women. These laws are also known as "niddah", literally "separation", or family purity. Vital aspects of halakha for traditionally observant Jews, they are not usually followed by Jews in liberal denominations.
Especially in Orthodox Judaism, the Biblical laws are augmented by Rabbinical injunctions. For example, the Torah mandates that a woman in her normal menstrual period must abstain from sexual intercourse for seven days. A woman whose menstruation is prolonged must continue to abstain for seven more days after bleeding has stopped. The Rabbis conflated ordinary "niddah" with this extended menstrual period, known in the Torah as "zavah", and mandated that a woman may not have sexual intercourse with her husband from the time she begins her menstrual flow until seven days after it ends. In addition, Rabbinical law forbids the husband from touching or sharing a bed with his wife during this period. Afterwards, purification can occur in a ritual bath called a mikveh.
Traditional Ethiopian Jews keep menstruating women in separate huts and, similar to Karaite practice, do not allow menstruating women into their temples because of a temple's special sanctity. Emigration to Israel and the influence of other Jewish denominations have led to Ethiopian Jews adopting more normative Jewish practices.
Life-cycle events, or rites of passage, occur throughout a Jew's life and serve to strengthen Jewish identity and bind the individual to the entire community.
The role of the priesthood in Judaism has significantly diminished since the destruction of the Second Temple in 70 CE when priests attended to the Temple and sacrifices. The priesthood is an inherited position, and although priests no longer have any but ceremonial duties, they are still honored in many Jewish communities. Many Orthodox Jewish communities believe that they will be needed again for a future Third Temple and need to remain in readiness for future duty.
From the time of the Mishnah and Talmud to the present, Judaism has required specialists or authorities for the practice of very few rituals or ceremonies. A Jew can fulfill most requirements for prayer by himself. Some activities—reading the Torah and "haftarah" (a supplementary portion from the Prophets or Writings), the prayer for mourners, the blessings for bridegroom and bride, the complete grace after meals—require a "minyan", the presence of ten Jews.
The most common professional clergy in a synagogue are:
Jewish prayer services involve two specified roles, which in many congregations are sometimes, but not always, filled by a rabbi or hazzan. In other congregations these roles are filled on an ad-hoc basis by members of the congregation who lead portions of services on a rotating basis:
Many congregations, especially larger ones, also rely on a:
The three preceding positions are usually voluntary and considered an honor. Since the Enlightenment, large synagogues have often adopted the practice of hiring rabbis and hazzans to act as "shatz" and "baal kriyah", and this is still typically the case in many Conservative and Reform congregations. However, in most Orthodox synagogues these positions are filled by laypeople on a rotating or ad-hoc basis. Although most congregations hire one or more rabbis, the use of a professional hazzan is generally declining in American congregations, and the use of professionals for other offices is rarer still.
At its core, the Tanakh is an account of the Israelites' relationship with God from their earliest history until the building of the Second Temple (c. 535 BCE). Abraham is hailed as the first Hebrew and the father of the Jewish people. As a reward for his act of faith in one God, he was promised that Isaac, his second son, would inherit the Land of Israel (then called Canaan). Later, the descendants of Isaac's son Jacob were enslaved in Egypt, and God commanded Moses to lead the Exodus from Egypt. At Mount Sinai, they received the Torah—the five books of Moses. These books, together with Nevi'im and Ketuvim are known as "Torah Shebikhtav" as opposed to the Oral Torah, which refers to the Mishnah and the Talmud. Eventually, God led them to the land of Israel where the tabernacle was planted in the city of Shiloh for over 300 years to rally the nation against attacking enemies. As time went on, the spiritual level of the nation declined to the point that God allowed the Philistines to capture the tabernacle. The people of Israel then told Samuel the prophet that they needed to be governed by a permanent king, and Samuel appointed Saul to be their King. When the people pressured Saul into going against a command conveyed to him by Samuel, God told Samuel to appoint David in his stead.
Once King David was established, he told the prophet Nathan that he would like to build a permanent temple. As a reward for his actions, God promised David that he would allow his son, Solomon, to build the First Temple and that the throne would never depart from his children.
Rabbinic tradition holds that the details and interpretation of the law, which are called the "Oral Torah" or "oral law", were originally an unwritten tradition based upon what God told Moses on Mount Sinai. However, as the persecutions of the Jews increased and the details were in danger of being forgotten, these oral laws were recorded by Rabbi Judah HaNasi (Judah the Prince) in the Mishnah, redacted "circa" 200 CE. The Talmud was a compilation of both the Mishnah and the Gemara, rabbinic commentaries redacted over the next three centuries. The Gemara originated in two major centers of Jewish scholarship, Palestine and Babylonia. Correspondingly, two bodies of analysis developed, and two works of Talmud were created. The older compilation is called the Jerusalem Talmud. It was compiled sometime during the 4th century in Palestine. The Babylonian Talmud was compiled from discussions in the houses of study by the scholars Ravina I, Ravina II, and Rav Ashi by 500 CE, although it continued to be edited later.
According to critical scholars, the Torah consists of inconsistent texts edited together in a way that calls attention to divergent accounts. Several of these scholars, such as Professor Martin Rose and John Bright, suggest that during the First Temple period the people of Israel believed that each nation had its own god, but that their god was superior to other gods. Some suggest that strict monotheism developed during the Babylonian Exile, perhaps in reaction to Zoroastrian dualism. In this view, it was only by the Hellenistic period that most Jews came to believe that their god was the only god, and that the notion of a clearly bounded Jewish nation, identical with the Jewish religion, took shape. John Day argues that the origins of biblical Yahweh, El, Asherah, and Ba'al may be rooted in earlier Canaanite religion, which was centered on a pantheon of gods much like the Greek pantheon.
According to the Hebrew Bible, the United Monarchy was established under Saul and continued under King David and Solomon with its capital in Jerusalem. After Solomon's reign, the nation split into two kingdoms, the Kingdom of Israel (in the north) and the Kingdom of Judah (in the south). The Kingdom of Israel was conquered by the Assyrian ruler Sargon II in the late 8th century BCE, with many people from the capital Samaria being taken captive to Media and the Khabur River valley. The Kingdom of Judah continued as an independent state until it was conquered by a Babylonian army in the early 6th century BCE, destroying the First Temple that was at the center of ancient Jewish worship. The Judean elite was exiled to Babylonia, which is regarded as the first Jewish Diaspora. This period of exile, known as the Babylonian Captivity, ended roughly seventy years later, when the Persians conquered Babylonia and many of the exiles returned to their homeland. A new Second Temple was constructed, and old religious practices were resumed.
During the early years of the Second Temple, the highest religious authority was a council known as the Great Assembly, led by Ezra of the Book of Ezra. Among other accomplishments of the Great Assembly, the last books of the Bible were written at this time and the canon sealed.
Hellenistic Judaism spread to Ptolemaic Egypt from the 3rd century BCE. After the Great Revolt (66–73 CE), the Romans destroyed the Temple. Hadrian built a pagan idol on the Temple grounds and prohibited circumcision; these acts of ethnocide provoked the Bar Kokhba revolt of 132–136 CE, after which the Romans banned the study of the Torah and the celebration of Jewish holidays, and forcibly removed virtually all Jews from Judea. In 200 CE, however, Jews were granted Roman citizenship and Judaism was recognized as a "religio licita" ("legitimate religion") until the rise of Gnosticism and Early Christianity in the fourth century.
Following the destruction of Jerusalem and the expulsion of the Jews, Jewish worship stopped being centrally organized around the Temple, prayer took the place of sacrifice, and worship was rebuilt around the community (represented by a minimum of ten adult men) and the establishment of the authority of rabbis who acted as teachers and leaders of individual communities (see Jewish diaspora).
Around the 1st century CE, there were several small Jewish sects: the Pharisees, Sadducees, Zealots, Essenes, and Christians. After the destruction of the Second Temple in 70 CE, these sects vanished. Christianity survived, but by breaking with Judaism and becoming a separate religion; the Pharisees survived but in the form of Rabbinic Judaism (today, known simply as "Judaism"). The Sadducees rejected the divine inspiration of the Prophets and the Writings, relying only on the Torah as divinely inspired. Consequently, a number of other core tenets of the Pharisees' belief system (which became the basis for modern Judaism), were also dismissed by the Sadducees. (The Samaritans practiced a similar religion, which is traditionally considered separate from Judaism.)
Like the Sadducees who relied only on the Torah, some Jews in the 8th and 9th centuries rejected the authority and divine inspiration of the oral law as recorded in the Mishnah (and developed by later rabbis in the two Talmuds), relying instead only upon the Tanakh. These included the Isunians, the Yudganites, the Malikites, and others. They soon developed oral traditions of their own, which differed from the rabbinic traditions, and eventually formed the Karaite sect. Karaites exist in small numbers today, mostly living in Israel. Rabbinical and Karaite Jews each hold that the others are Jews, but that the other faith is erroneous.
Over many centuries, Jews formed distinct ethnic groups in several different geographic areas—amongst others, the Ashkenazi Jews (of central and Eastern Europe), the Sephardi Jews (of Spain, Portugal, and North Africa), the Beta Israel of Ethiopia, the Yemenite Jews from the southern tip of the Arabian Peninsula, and the Malabari and Cochin Jews from Kerala. Many of these groups have developed differences in their prayers, traditions and accepted canons; however, these distinctions are mainly the result of their being formed at some cultural distance from normative (rabbinic) Judaism, rather than based on any doctrinal dispute.
Antisemitism arose during the Middle Ages, in the form of persecutions, pogroms, forced conversions, expulsions, social restrictions and ghettoization.
This was different in quality from the repressions of Jews which had occurred in ancient times. Ancient repressions were politically motivated, and Jews were treated the same as members of other ethnic groups. With the rise of the Churches, the main motive for attacks on Jews changed from politics to religion, and the religious motive for such attacks was specifically derived from Christian views about Jews and Judaism. During the Middle Ages, Jewish people who lived under Muslim rule generally experienced tolerance and integration, but there were occasional outbreaks of violence, such as the Almohad persecutions.
Hasidic Judaism was founded by Yisroel ben Eliezer (1700–1760), also known as the "Ba'al Shem Tov" (or "Besht"). It originated in a time of persecution of the Jewish people, when European Jews had turned inward to Talmud study; many felt that most expressions of Jewish life had become too "academic" and no longer had any emphasis on spirituality or joy. Its adherents favored small and informal gatherings called Shtiebel, which, in contrast to a traditional synagogue, could be used both as a place of worship and for celebrations involving dancing, eating, and socializing. Ba'al Shem Tov's disciples attracted many followers; they themselves established numerous Hasidic sects across Europe. Unlike other religious movements, which typically expanded through word of mouth or by use of print, Hasidism spread largely owing to its Tzadiks, who used their influence to encourage others to follow the movement. Hasidism appealed to many Europeans because it was easy to learn, did not require full immediate commitment, and presented a compelling spectacle. Hasidic Judaism eventually became the way of life for many Jews in Eastern Europe, and waves of Jewish immigration in the 1880s carried it to the United States. The movement itself claims to be nothing new, but a "refreshment" of original Judaism; as some have put it, "they merely re-emphasized that which the generations had lost". Nevertheless, early on there was a serious schism between Hasidic and non-Hasidic Jews. European Jews who rejected the Hasidic movement were dubbed by the Hasidim as Misnagdim (lit. "opponents"). Some of the reasons for the rejection of Hasidic Judaism were the exuberance of Hasidic worship, its deviation from tradition in ascribing infallibility and miracles to its leaders, and the concern that it might become a messianic sect. Over time, differences between the Hasidim and their opponents have slowly diminished, and both groups are now considered part of Haredi Judaism.
In the late 18th century CE, Europe was swept by a group of intellectual, social and political movements known as the Enlightenment. The Enlightenment led to reductions in the European laws that prohibited Jews from interacting with the wider secular world, thus allowing Jews access to secular education and experience. A parallel Jewish movement, Haskalah or the "Jewish Enlightenment", began, especially in Central and Western Europe, in response to both the Enlightenment and these new freedoms. It placed an emphasis on integration with secular society and a pursuit of non-religious knowledge through reason. With the promise of political emancipation, many Jews saw no reason to continue to observe Jewish law, and increasing numbers of Jews assimilated into Christian Europe. Modern religious movements of Judaism all formed in reaction to this trend.
In Central Europe, followed by Great Britain and the United States, Reform (or Liberal) Judaism developed, relaxing legal obligations (especially those that limited Jewish relations with non-Jews), emulating Protestant decorum in prayer, and emphasizing the ethical values of Judaism's Prophetic tradition. Modern Orthodox Judaism developed in reaction to Reform Judaism, by leaders who argued that Jews could participate in public life as citizens equal to Christians while maintaining the observance of Jewish law. Meanwhile, in the United States, wealthy Reform Jews helped European scholars, who were Orthodox in practice but critical (and skeptical) in their study of the Bible and Talmud, to establish a seminary to train rabbis for immigrants from Eastern Europe. These left-wing Orthodox rabbis were joined by right-wing Reform rabbis who felt that Jewish law should not be entirely abandoned, to form the Conservative movement. Orthodox Jews who opposed the Haskalah formed Haredi Orthodox Judaism. After massive movements of Jews following The Holocaust and the creation of the state of Israel, these movements have competed for followers from among traditional Jews in or from other countries.
Countries such as the United States, Israel, Canada, the United Kingdom, Argentina and South Africa contain large Jewish populations. Jewish religious practice varies widely across all levels of observance. According to the 2001 edition of the National Jewish Population Survey, in the United States' Jewish community—the world's second largest—4.3 million Jews out of 5.1 million had some sort of connection to the religion. Of that population of connected Jews, 80% participated in some sort of Jewish religious observance, but only 48% belonged to a congregation, and fewer than 16% attended regularly.
Birth rates for American Jews have dropped from 2.0 to 1.7. (Replacement rate is 2.1.) Intermarriage rates range from 40–50% in the US, and only about a third of children of intermarried couples are raised as Jews. Due to intermarriage and low birth rates, the Jewish population in the US shrank from 5.5 million in 1990 to 5.1 million in 2001. This is indicative of the general population trends among the Jewish community in the Diaspora, but a focus on total population obscures growth trends in some denominations and communities, such as Haredi Judaism. The Baal teshuva movement is a movement of Jews who have "returned" to religion or become more observant.
Christianity was originally a sect of Second Temple Judaism, but the two religions diverged in the first century. The differences between Christianity and Judaism originally centered on whether Jesus was the Jewish Messiah but eventually became irreconcilable. Major differences between the two faiths include the nature of the Messiah, of atonement and sin, the status of God's commandments to Israel, and, perhaps most significantly, the nature of God himself. Due to these differences, Judaism traditionally regards Christianity as Shituf, or worship of the God of Israel that is not monotheistic. Christianity has traditionally regarded Judaism as rendered obsolete by the advent of Christianity and Jews as a people replaced by the Church, though a Christian belief in dual-covenant theology emerged as a phenomenon following Christian reflection on how their theology influenced the Nazi Holocaust.
Since the time of the Middle Ages, the Catholic Church upheld the "Constitutio pro Judæis" (Formal Statement on the Jews), which stated
Until their emancipation in the late 18th and the 19th century, Jews in Christian lands were subject to humiliating legal restrictions and limitations. They included provisions requiring Jews to wear specific and identifying clothing such as the Jewish hat and the yellow badge, restricting Jews to certain cities and towns or in certain parts of towns (ghettos), and forbidding Jews to enter certain trades (for example selling new clothes in medieval Sweden). Disabilities also included special taxes levied on Jews, exclusion from public life, restraints on the performance of religious ceremonies, and linguistic censorship. Some countries went even further and completely expelled Jews, for example, England in 1290 (Jews were readmitted in 1655) and Spain in 1492 (readmitted in 1868). The first Jewish settlers in North America arrived in the Dutch colony of New Amsterdam in 1654; they were forbidden to hold public office, open a retail shop, or establish a synagogue. When the colony was seized by the British in 1664 Jewish rights remained unchanged, but by 1671 Asser Levy was the first Jew to serve on a jury in North America.
In 1791, Revolutionary France was the first country to abolish disabilities altogether, followed by Prussia in 1848. Emancipation of the Jews in the United Kingdom was achieved in 1858 after an almost 30-year struggle championed by Isaac Lyon Goldsmid, culminating in Jews gaining the ability to sit in Parliament with the passing of the Jews Relief Act 1858. The newly created German Empire in 1871 abolished Jewish disabilities in Germany, which were reinstated by the Nuremberg Laws in 1935.
Jewish life in Christian lands was marked by frequent blood libels, expulsions, forced conversions and massacres. Religious prejudice was an underlying source of hostility toward Jews in Europe. Christian rhetoric and antipathy towards Jews developed in the early years of Christianity and were reinforced by ever-increasing anti-Jewish measures over the ensuing centuries. The actions taken by Christians against Jews included acts of violence and murder, culminating in the Holocaust. These attitudes were reinforced by Christian preaching, art and popular teaching over two millennia, which expressed contempt for Jews, as well as by statutes designed to humiliate and stigmatise Jews. The Nazi Party was known for its persecution of Christian Churches; many of them, such as the Protestant Confessing Church and the Catholic Church, as well as Quakers and Jehovah's Witnesses, aided and rescued Jews who were being targeted by the antireligious régime.
The attitude of Christians and Christian Churches toward the Jewish people and Judaism has changed in a mostly positive direction since World War II. Pope John Paul II and the Catholic Church have "upheld the Church's acceptance of the continuing and permanent election of the Jewish people" as well as a reaffirmation of the covenant between God and the Jews. In December 2015, the Vatican released a 10,000-word document that, among other things, stated that Catholics should work with Jews to fight antisemitism.
Both Judaism and Islam track their origins from the patriarch Abraham, and they are therefore considered Abrahamic religions. In both Jewish and Muslim tradition, the Jewish and Arab peoples are descended from the two sons of Abraham—Isaac and Ishmael, respectively. While both religions are monotheistic and share many commonalities, they differ in that Jews do not consider Jesus or Muhammad to be prophets. The religions' adherents have interacted with each other since the 7th century, when Islam originated and spread in the Arabian peninsula. Indeed, the years 712 to 1066 CE under the Umayyad and Abbasid rulers have been called the Golden Age of Jewish culture in Spain. Non-Muslim monotheists living in these countries, including Jews, were known as dhimmis. Dhimmis were allowed to practice their own religions and administer their own internal affairs, but they were subject to certain restrictions that were not imposed on Muslims. For example, they had to pay the jizya, a per capita tax imposed on free adult non-Muslim males, and they were also forbidden to bear arms or testify in court cases involving Muslims. Many of the laws regarding dhimmis were highly symbolic. For example, dhimmis in some countries were required to wear distinctive clothing, a practice not found in either the Qur'an or the hadiths but invented in early medieval Baghdad and inconsistently enforced. Jews in Muslim countries were not entirely free from persecution—for example, many were killed, exiled or forcibly converted in the 12th century, in Persia, and by the rulers of the Almohad dynasty in North Africa and Al-Andalus, as well as by the Zaydi imams of Yemen in the 17th century (see: Mawza Exile). At times, Jews were also restricted in their choice of residence—in Morocco, for example, Jews were confined to walled quarters (mellahs) beginning in the 15th century and increasingly since the early 19th century.
In the mid-20th century, Jews were expelled from nearly all of the Arab countries. Most have chosen to live in Israel. Today, antisemitic themes including Holocaust denial have become commonplace in the propaganda of Islamic movements such as Hizbullah and Hamas, in the pronouncements of various agencies of the Islamic Republic of Iran, and even in the newspapers and other publications of Refah Partisi.
There are some movements that combine elements of Judaism with those of other religions. The best known of these is Messianic Judaism, a religious movement that arose in the 1960s and incorporates elements of Judaism with the tenets of Christianity. The movement generally states that Jesus is the Jewish Messiah, that he is one of the Three Divine Persons, and that salvation is only achieved through acceptance of Jesus as one's savior. Some members of the movement argue that Messianic Judaism is a sect of Judaism. Jewish organizations of every denomination reject this, stating that Messianic Judaism is a Christian sect, because it teaches creeds which are identical to those of Pauline Christianity.
Other examples of syncretism include Semitic neopaganism, a loosely organized sect which incorporates pagan or Wiccan beliefs with some Jewish religious practices; Jewish Buddhists, another loosely organized group that incorporates elements of Asian spirituality in their faith; and some Renewal Jews who borrow freely and openly from Buddhism, Sufism, Native American religions, and other faiths.
The Kabbalah Centre, which employs teachers from multiple religions, is a New Age movement that claims to popularize the kabbalah, part of the Jewish esoteric tradition.
https://en.wikipedia.org/wiki?curid=15624
John Stuart Mill
John Stuart Mill (20 May 1806 – 7 May 1873), usually cited as J. S. Mill, was a British philosopher, political economist, and civil servant. One of the most influential thinkers in the history of classical liberalism, he contributed widely to social theory, political theory, and political economy. Dubbed "the most influential English-speaking philosopher of the nineteenth century", he conceived of liberty as justifying the freedom of the individual in opposition to unlimited state and social control.
Mill was a proponent of utilitarianism, an ethical theory developed by his predecessor Jeremy Bentham. He contributed to the investigation of scientific methodology, though his knowledge of the topic was based on the writings of others, notably William Whewell, John Herschel, and Auguste Comte, and research carried out for Mill by Alexander Bain. He engaged in written debate with Whewell.
A member of the Liberal Party and author of the early feminist work "The Subjection of Women", Mill was also the second Member of Parliament to call for women's suffrage after Henry Hunt in 1832.
John Stuart Mill was born at 13 Rodney Street in Pentonville, Middlesex, the eldest son of Harriet Barrow and the Scottish philosopher, historian, and economist James Mill. John Stuart was educated by his father, with the advice and assistance of Jeremy Bentham and Francis Place. He was given an extremely rigorous upbringing, and was deliberately shielded from association with children his own age other than his siblings. His father, a follower of Bentham and an adherent of associationism, had as his explicit aim to create a genius intellect that would carry on the cause of utilitarianism and its implementation after he and Bentham had died.
Mill was a notably precocious child. He describes his education in his autobiography. At the age of three he was taught Greek. By the age of eight, he had read "Aesop's Fables", Xenophon's "Anabasis", and the whole of Herodotus, and was acquainted with Lucian, Diogenes Laërtius, Isocrates and six dialogues of Plato. He had also read a great deal of history in English and had been taught arithmetic, physics and astronomy.
At the age of eight, Mill began studying Latin, the works of Euclid, and algebra, and was appointed schoolmaster to the younger children of the family. His main reading was still history, but he went through all the commonly taught Latin and Greek authors and by the age of ten could read Plato and Demosthenes with ease. His father also thought that it was important for Mill to study and compose poetry. One of his earliest poetic compositions was a continuation of the "Iliad". In his spare time he also enjoyed reading about natural sciences and popular novels, such as "Don Quixote" and "Robinson Crusoe".
His father's work, "The History of British India", was published in 1818; immediately thereafter, at about the age of twelve, Mill began a thorough study of scholastic logic, at the same time reading Aristotle's logical treatises in the original language. In the following year he was introduced to political economy and studied Adam Smith and David Ricardo with his father, ultimately completing their classical economic view of factors of production. Mill's "comptes rendus" of his daily economy lessons helped his father in writing "Elements of Political Economy" in 1821, a textbook to promote the ideas of Ricardian economics; however, the book lacked popular support. Ricardo, who was a close friend of his father, used to invite the young Mill to his house for a walk in order to talk about political economy.
At the age of fourteen, Mill stayed a year in France with the family of Sir Samuel Bentham, brother of Jeremy Bentham. The mountain scenery he saw led to a lifelong taste for mountain landscapes, and the lively and friendly way of life of the French also left a deep impression on him. In Montpellier, he attended the winter courses on chemistry, zoology, and logic at the "Faculté des Sciences", as well as taking a course in higher mathematics. While coming and going from France, he stayed in Paris for a few days in the house of the renowned economist Jean-Baptiste Say, a friend of Mill's father. There he met many leaders of the Liberal party, as well as other notable Parisians, including Henri Saint-Simon.
Mill went through months of sadness and contemplated suicide at twenty years of age. According to the opening paragraphs of Chapter V of his autobiography, he had asked himself whether the creation of a just society, his life's objective, would actually make him happy. His heart answered "no", and unsurprisingly he lost the happiness of striving towards this objective. Eventually, the poetry of William Wordsworth showed him that beauty generates compassion for others and stimulates joy. With renewed joy he continued to work towards a just society, but with more relish for the journey. He considered this one of the most pivotal shifts in his thinking. In fact, many of the differences between him and his father stemmed from this expanded source of joy.
Mill had been engaged in a pen-friendship with Auguste Comte, the founder of positivism and sociology, since Mill first contacted Comte in November 1841. Comte's "sociologie" was more an early philosophy of science than sociology as we know it today, and the "positive" philosophy aided in Mill's broad rejection of Benthamism.
As a nonconformist who refused to subscribe to the Thirty-Nine Articles of the Church of England, Mill was not eligible to study at the University of Oxford or the University of Cambridge. Instead he followed his father to work for the East India Company, and attended University College, London, to hear the lectures of John Austin, the first Professor of Jurisprudence. He was elected a Foreign Honorary Member of the American Academy of Arts and Sciences in 1856.
Mill's career as a colonial administrator at the British East India Company spanned from when he was 17 years old in 1823 until 1858, when the Company was abolished in favor of direct rule by the British crown over India. In 1836, he was promoted to the Company's Political Department, where he was responsible for correspondence pertaining to the Company's relations with the princely states, and in 1856, was finally promoted to the position of Examiner of Indian Correspondence. In "On Liberty", "A Few Words on Non-Intervention", and other works, he defended British imperialism by arguing that a fundamental distinction existed between civilized and barbarous peoples. Mill viewed countries such as India and China as having once been progressive, but that were now stagnant and barbarous, thus legitimizing British rule as benevolent despotism, "provided the end is [the barbarians'] improvement". When the crown proposed to take direct control over the colonies in India, he was tasked with defending Company rule, penning "Memorandum on the Improvements in the Administration of India during the Last Thirty Years" among other petitions. He was offered a seat on the Council of India, the body created to advise the new Secretary of State for India, but declined, citing his disapproval of the new system of rule.
In 1851, Mill married Harriet Taylor after 21 years of intimate friendship. Taylor was married when they met, and their relationship was close but generally believed to be chaste during the years before her first husband died in 1849. The couple waited two years before marrying in 1851. Brilliant in her own right, Taylor was a significant influence on Mill's work and ideas during both friendship and marriage. His relationship with Taylor reinforced Mill's advocacy of women's rights. He said that, in his stand against domestic violence and for women's rights, he was "chiefly an amanuensis to my wife". He called her mind a "perfect instrument" and said she was "the most eminently qualified of all those known to the author". He cites her influence in his final revision of "On Liberty", which was published shortly after her death. Taylor died in 1858 of severe lung congestion, after only seven years of marriage to Mill.
Between 1865 and 1868 Mill served as Lord Rector of the University of St Andrews. In his inaugural address, delivered to the University on 1 February 1867, he made the now famous (but often wrongly attributed) remark that "Bad men need nothing more to compass their ends, than that good men should look on and do nothing". That Mill included that sentence in the address is a matter of historical record, but it by no means follows that it expressed a wholly original insight. During the same period, 1865–68, he was also a Member of Parliament for City and Westminster, sitting for the Liberal Party. During his time as an MP, Mill advocated easing the burdens on Ireland. In 1866, he became the first person in the history of Parliament to call for women to be given the right to vote, vigorously defending this position in subsequent debate. He also became a strong advocate of such social reforms as labour unions and farm cooperatives. In "Considerations on Representative Government", he called for various reforms of Parliament and voting, especially proportional representation, the single transferable vote, and the extension of suffrage. In April 1868, he favoured in a Commons debate the retention of capital punishment for such crimes as aggravated murder, terming its abolition "an effeminacy in the general mind of the country".
He was godfather to the philosopher Bertrand Russell.
In his views on religion, Mill was an agnostic and a sceptic.
Mill died in 1873 of erysipelas in Avignon, France, where his body was buried alongside his wife's.
Mill joined the debate over scientific method which followed on from John Herschel's 1830 publication of "A Preliminary Discourse on the Study of Natural Philosophy", which incorporated inductive reasoning from the known to the unknown, discovering general laws in specific facts and verifying these laws empirically. William Whewell expanded on this in his 1837 "History of the Inductive Sciences, from the Earliest to the Present Time", followed in 1840 by "The Philosophy of the Inductive Sciences, Founded Upon their History", presenting induction as the mind superimposing concepts on facts. For Whewell, laws were self-evident truths, which could be known without need for empirical verification.
Mill countered this in 1843 in "A System of Logic" (fully titled "A System of Logic, Ratiocinative and Inductive, Being a Connected View of the Principles of Evidence, and the Methods of Scientific Investigation"). In "Mill's Methods" (of induction), as in Herschel's, laws were discovered through observation and induction, and required empirical verification.
Mill's "On Liberty" (1859) addresses the nature and limits of the power that can be legitimately exercised by society over the individual. However Mill is clear that his concern for liberty does not extend to all individuals and all societies. He states that "Despotism is a legitimate mode of government in dealing with barbarians."
Mill states that it is not a crime to harm oneself as long as the person doing so is not harming others. He favors the "harm principle": "The only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others." He excuses those who are "incapable of self-government" from this principle, such as young children or those living in "backward states of society".
Though this principle seems clear, there are a number of complications. For example, Mill explicitly states that "harms" may include acts of omission as well as acts of commission. Thus, failing to rescue a drowning child counts as a harmful act, as does failing to pay taxes, or failing to appear as a witness in court. All such harmful omissions may be regulated, according to Mill. By contrast, it does not count as harming someone if—without force or fraud—the affected individual consents to assume the risk: thus one may permissibly offer unsafe employment to others, provided there is no deception involved. (He does, however, recognise one limit to consent: society should not permit people to sell themselves into slavery.) In these and other cases, it is important to bear in mind that the arguments in "On Liberty" are grounded on the principle of "utility", and not on appeals to natural rights. Thus, his concept of the slavery into which individuals may not sell themselves applies to the simple, brutal form of complete unfreedom whose abolition was a burning political issue in his day, not to more recent extended notions of slavery by economic means, or "wage slavery".
The question of what counts as a self-regarding action and what actions, whether of omission or commission, constitute harmful actions subject to regulation, continues to exercise interpreters of Mill. He did not consider giving offence to constitute "harm"; an action could not be restricted because it violated the conventions or morals of a given society.
Mill believed that "the struggle between Liberty and Authority is the most conspicuous feature in the portions of history with which we are earliest familiar." For him, liberty in antiquity was a "contest…between subjects, or some classes of subjects, and the government."
Mill defined "social liberty" as protection from "the tyranny of political rulers". He introduced a number of different concepts of the form tyranny can take, referred to as social tyranny, and "tyranny of the majority". "Social liberty" for Mill meant putting limits on the ruler's power so that he would not be able to use that power to further his own wishes and thus make decisions that could harm society. In other words, people should have the right to have a say in the government's decisions. He said that "social liberty" was "the nature and limits of the power which can be legitimately exercised by society over the individual." It was attempted in two ways: first, by obtaining recognition of certain immunities (called "political liberties" or "rights"); and second, by establishment of a system of "constitutional checks".
However, in Mill's view, limiting the power of government was not enough: "Society can and does execute its own mandates: and if it issues wrong mandates instead of right, or any mandates at all in things with which it ought not to meddle, it practises a social tyranny more formidable than many kinds of political oppression, since, though not usually upheld by such extreme penalties, it leaves fewer means of escape, penetrating much more deeply into the details of life, and enslaving the soul itself."
Mill's view on liberty, which was influenced by Joseph Priestley and Josiah Warren, is that the individual ought to be free to do as he or she wishes unless he or she harms others. Individuals are rational enough to make decisions about their well-being. Government should interfere only when it is for the protection of society. Mill explained:
The sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is self-protection. That the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not sufficient warrant. He cannot rightfully be compelled to do or forbear because it will be better for him to do so, because it will make him happier, because, in the opinion of others, to do so would be wise, or even right.… The only part of the conduct of anyone, for which he is amenable to society, is that which concerns others. In the part which merely concerns him, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.
"On Liberty" involves an impassioned defense of free speech. Mill argues that free discourse is a necessary condition for intellectual and social progress. We can never be sure, he contends, that a silenced opinion does not contain some element of the truth. He also argues that allowing people to air false opinions is productive for two reasons. First, individuals are more likely to abandon erroneous beliefs if they are engaged in an open exchange of ideas. Second, by forcing other individuals to re-examine and re-affirm their beliefs in the process of debate, these beliefs are kept from declining into mere dogma. It is not enough for Mill that one simply has an unexamined belief that happens to be true; one must understand why the belief in question is the true one. Along those same lines Mill wrote, "unmeasured vituperation, employed on the side of prevailing opinion, really does deter people from expressing contrary opinions, and from listening to those who express them."
As an influential advocate of freedom of speech, Mill objected to censorship:
I choose, by preference, the cases which are least favourable to me – in which the argument opposing freedom of opinion, both on the score of truth and on that of utility, is considered the strongest. Let the opinions impugned be the belief in a God and in a future state, or any of the commonly received doctrines of morality ... But I must be permitted to observe that it is not the feeling sure of a doctrine (be it what it may) which I call an assumption of infallibility. It is the undertaking to decide that question "for others", without allowing them to hear what can be said on the contrary side. And I denounce and reprobate this pretension not the less if it is put forth on the side of my most solemn convictions. However positive anyone's persuasion may be, not only of the falsity but of the pernicious consequences – not only of the pernicious consequences, but (to adopt expressions which I altogether condemn) the immorality and impiety of an opinion – yet if, in pursuance of that private judgement, though backed by the public judgement of his country or contemporaries, he prevents the opinion from being heard in its defence, he assumes infallibility. And so far from the assumption being less objectionable or less dangerous because the opinion is called immoral or impious, this is the case of all others in which it is most fatal.
Mill outlines the benefits of 'searching for and discovering the truth' as a way to further knowledge. He argued that even if an opinion is false, the truth can be better understood by refuting the error. And as most opinions are neither completely true nor completely false, he points out that allowing free expression allows the airing of competing views as a way to preserve partial truth in various opinions. Worried about minority views being suppressed, he argued in support of freedom of speech on political grounds, stating that it is a critical component for a representative government to have in order to empower debate over public policy. He also argued that freedom of expression allows for personal growth and self-realization: freedom of speech was a vital way to develop talents and realise a person's potential and creativity. He repeatedly said that eccentricity was preferable to uniformity and stagnation.
The belief that freedom of speech would advance society presupposed a society sufficiently culturally and institutionally advanced to be capable of progressive improvement. If any argument is really wrong or harmful, the public will judge it as wrong or harmful, and those arguments cannot then be sustained and will be excluded. Mill argued that even arguments used to justify murder or rebellion against the government should not be politically suppressed or socially persecuted. According to him, if rebellion is really necessary, people should rebel; if murder is truly proper, it should be allowed. However, the way to express those arguments should be a public speech or writing, not a manner that causes actual harm to others. Such is the harm principle: "That the only purpose for which power can be rightfully exercised over any member of a civilised community, against his will, is to prevent harm to others."
At the beginning of the 20th century, Associate Justice Oliver Wendell Holmes Jr. established the standard of "clear and present danger" based on Mill's idea. In the majority opinion, Holmes writes:
Holmes suggested that shouting out "Fire!" in a dark theatre, which evokes panic and provokes injury, would be such a case of speech that creates an illegal danger. But if the situation allows people to reason by themselves and decide whether to accept it, no argument or theology should be blocked.
Nowadays, Mill's argument is generally accepted by many democratic countries, and they have laws at least guided by the harm principle. For example, in American law some exceptions limit free speech such as obscenity, defamation, breach of peace, and "fighting words".
Mill, an employee of the British East India Company from 1823 to 1858, argued in support of what he called a "benevolent despotism" with regard to the colonies. Mill argued: "To suppose that the same international customs, and the same rules of international morality, can obtain between one civilized nation and another, and between civilized nations and barbarians, is a grave error.… To characterize any conduct whatever towards a barbarous people as a violation of the law of nations, only shows that he who so speaks has never considered the subject." Mill justified the British colonization of India, but was concerned with the way in which British rule of India was conducted.
In 1850, Mill sent an anonymous letter (which came to be known under the title "The Negro Question"), in rebuttal to Thomas Carlyle's anonymous letter to "Fraser's Magazine for Town and Country" in which Carlyle argued for slavery. Mill supported abolition in the United States, expressing his opposition to slavery in his essay of 1869, "The Subjection of Women":
This absolutely extreme case of the law of force, condemned by those who can tolerate almost every other form of arbitrary power, and which, of all others, presents features the most revolting to the feeling of all who look at it from an impartial position, was the law of civilized and Christian England within the memory of persons now living: and in one half of Anglo-Saxon America three or four years ago, not only did slavery exist, but the slave trade, and the breeding of slaves expressly for it, was a general practice between slave states. Yet not only was there a greater strength of sentiment against it, but, in England at least, a less amount either of feeling or of interest in favour of it, than of any other of the customary abuses of force: for its motive was the love of gain, unmixed and undisguised: and those who profited by it were a very small numerical fraction of the country, while the natural feeling of all who were not personally interested in it, was unmitigated abhorrence.
Mill corresponded extensively with John Appleton, an American legal reformer from Maine, on the topic of racial equality. Appleton influenced Mill's work on the subject, especially swaying him on the optimal economic and social welfare plan for the Antebellum South. In a reply to Appleton, Mill expressed his view on antebellum integration: "I cannot look forward with satisfaction to any settlement but complete emancipation—land given to every negro family either separately or in organized communities under such rules as may be found temporarily necessary—the schoolmaster set to work in every village & the tide of free immigration turned on in those fertile regions from which slavery has hitherto excluded it. If this be done, the gentle & docile character which seems to distinguish the negroes will prevent any mischief on their side, while the proofs they are giving of fighting powers will do more in a year than all other things in a century to make the whites respect them & consent to their being politically & socially equals."
Mill's view of history was that right up until his time "the whole of the female" and "the great majority of the male sex" were simply "slaves". He countered arguments to the contrary by asserting that relations between the sexes simply amounted to "the legal subordination of one sex to the other – [which] is wrong itself, and now one of the chief hindrances to human improvement; and that it ought to be replaced by a principle of perfect equality." Here Mill uses "slavery" in an extended, rhetorical sense rather than in its fundamental sense of absolute unfreedom of person.
With this, Mill can be considered among the earliest male proponents of gender equality. His book "The Subjection of Women" (written in 1861 and published in 1869) is one of the earliest works on this subject by a male author. In "The Subjection of Women", Mill attempts to make a case for perfect equality.
He talks about the role of women in marriage and how it needed to be changed. Mill comments on three major facets of women's lives that he felt were hindering them: society and gender construction, education, and marriage.
He argues that the oppression of women was one of the few remaining relics from ancient times, a set of prejudices that severely impeded the progress of humanity. As a Member of Parliament, Mill introduced an unsuccessful amendment to the Reform Bill to substitute the word "person" in place of "man".
The canonical statement of Mill's utilitarianism can be found in his book, "Utilitarianism". Although this philosophy has a long tradition, Mill's account is primarily influenced by Jeremy Bentham and Mill's father James Mill.
John Stuart Mill believed in the philosophy of utilitarianism, which he described as the principle that "actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness". By "happiness" he means "intended pleasure, and the absence of pain; by unhappiness, pain, and the privation of pleasure". It is clear that we do not all value virtues as a path to happiness and that we sometimes only value them for selfish reasons. However, Mill asserts that upon reflection, even when we value virtues for selfish reasons we are in fact cherishing them as a part of our happiness.
Bentham's famous formulation of utilitarianism is known as the greatest-happiness principle. It holds that one must always act so as to produce the greatest aggregate happiness among all sentient beings, within reason. In a similar vein, Mill's method of determining the best utility is that a moral agent, when given the choice between two or more actions, ought to choose the action that contributes most to (maximizes) the total happiness in the world. "Happiness", in this context, is understood as the production of pleasure or privation of pain. Given that determining the action that produces the most utility is not always so clear-cut, Mill suggests that the utilitarian moral agent, when attempting to rank the utility of different actions, should refer to the general experience of persons. That is, if people generally experience more happiness following action "X" than they do following action "Y", the utilitarian should conclude that action "X" produces more utility than, and is thus preferable to, action "Y".
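Mill's ranking procedure can be read as a simple decision rule. The following toy sketch in Python (not from Mill or any source text; the action names and happiness scores are entirely hypothetical) picks whichever action has the greatest average reported happiness, standing in for an appeal to "the general experience of persons":

```python
# A toy decision rule illustrating the paragraph above: rank candidate
# actions by the average happiness people report after each one, then
# choose the action with the greatest average. All data are hypothetical.
from statistics import mean

# Hypothetical reports of happiness experienced after each action.
general_experience = {
    "action_x": [7, 8, 6, 9],
    "action_y": [5, 4, 6, 5],
}

def preferred_action(experience):
    """Return the action whose average reported happiness is greatest."""
    return max(experience, key=lambda action: mean(experience[action]))

print(preferred_action(general_experience))  # prints: action_x
```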
Is utilitarianism a species of "consequentialism", that is, are acts justified as means by their desirable outcomes? The overarching goal of utilitarianism—the ideal consequence—is to achieve the "greatest good for the greatest number as the end result of human action." Mill states in "Utilitarianism" that "happiness is the sole end of human action". This statement aroused some controversy, which is why Mill took it a step further, explaining how the very nature of humans wanting happiness, and who "take it to be reasonable under free consideration", demands that happiness is indeed desirable. In other words, free will leads everyone to act in pursuit of their own happiness, unless they reason that doing otherwise would improve the happiness of others, in which case the greatest utility is still being achieved. To that extent, the "utilitarianism" that Mill describes is a default lifestyle that he believes people who have not studied a specific opposing field of ethics would naturally and subconsciously employ when faced with a decision.
Utilitarianism is thought of by some of its advocates to be a more developed and overarching ethical theory than Immanuel Kant's belief in good will, and not just a default cognitive process of humans. Where Kant would argue that reason can only be used properly by a good will, Mill would say that the only way to universally create fair laws and systems would be to step back to the consequences, whereby Kant's ethical theories become based around the ultimate good—utility. By this logic, the only valid way to discern what is proper reason would be to view the consequences of any action and weigh the good and the bad, even if, on the surface, the ethical reasoning seems to indicate a different train of thought.
Mill's major contribution to utilitarianism is his argument for the qualitative separation of pleasures. Bentham treats all forms of happiness as equal, whereas Mill argues that intellectual and moral pleasures ("higher pleasures") are superior to more physical forms of pleasure ("lower pleasures"). He distinguishes between happiness and contentment, claiming that the former is of higher value than the latter, a belief wittily encapsulated in the statement that, "it is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question."
This made Mill believe that "our only ultimate end" is happiness. One unique part of his utilitarian view, not seen in others, is the idea of higher and lower pleasures. Mill explains the different pleasures as:
He defines higher pleasures as mental, moral, and aesthetic pleasures, and lower pleasures as more sensational ones. He believed that higher pleasures should be seen as preferable to lower pleasures since they have a greater quality in virtue. He holds that pleasures gained in activity are of a higher quality than those gained passively. Thus, to sing the leading role in a Verdi opera at La Scala would be in a qualitatively far higher category of pleasure than to loll on the sofa eating a beefburger, quaffing beer from a can, and watching a hardcore pornographic film.
Mill defines the difference between higher and lower forms of pleasure with the principle that those who have experienced both tend to prefer one over the other. This is, perhaps, in direct contrast with Bentham's statement that "Quantity of pleasure being equal, push-pin is as good as poetry"; that is, if a simple child's game like hopscotch causes more pleasure to more people than a night at the opera house, it is more incumbent upon a society to devote more resources to propagating hopscotch than to running opera houses. Mill's argument is that the "simple pleasures" tend to be preferred by people who have no experience with high art, and are therefore not in a proper position to judge. He also argues that people who, for example, are noble or practise philosophy benefit society more than those who engage in individualist practices for pleasure, which are lower forms of happiness. It is not the agent's own greatest happiness that matters "but the greatest amount of happiness altogether".
Mill separated his explanation of utilitarianism into five different sections: General Remarks; What Utilitarianism Is; Of the Ultimate Sanction of the Principle of Utility; Of What Sort of Proof the Principle of Utility Is Susceptible; and On the Connection between Justice and Utility.
In the General Remarks portion of his essay, he observes that next to no progress has been made in judging what is morally right and what is wrong, and whether there is such a thing as moral instinct (which he argues there may not be). However, he agrees that in general "our moral faculty, according to all those of its interpreters who are entitled to the name of thinkers, supplies us only with the general principles of moral judgments".
In What Utilitarianism Is, he focuses no longer on background information but on utilitarianism itself. He describes utilitarianism as "the greatest happiness principle", defining the theory by saying that pleasure and freedom from pain are the only inherently good things in the world, and expands on it by saying that "actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness. By happiness is intended pleasure, and the absence of pain; by unhappiness, pain, and the privation of pleasure." He views this not as an animalistic concept, because he sees seeking out pleasure as a way of using our higher faculties. He also says in this chapter that the happiness principle is based not exclusively on the individual but mainly on the community.
Mill also defends the idea of a "strong utilitarian conscience (i.e. a strong feeling of obligation to the general happiness)". He argued that humans have a desire to be happy and that that desire causes us to want to be in unity with other humans. This causes us to care about the happiness of others, as well as the happiness of complete strangers, but it also causes us to experience pain when we perceive harm to other people. He believes in internal sanctions that make us experience guilt and regulate our actions. These internal sanctions make us want to do good because we do not want to feel guilty for our actions. Happiness is our ultimate end because it is our duty. He argues that we do not need to be constantly motivated by concern for people's happiness, because most of the actions done by people are done out of good intention, and the good of the world is made up of the good of its people.
In Mill's fourth chapter, Of What Sort of Proof the Principle of Utility Is Susceptible, he considers what proof can be given for the principle of utility. He starts the chapter by conceding that his claims cannot be backed up by reasoning alone: the only proof that something brings one pleasure is that someone finds it pleasurable. Next he talks about how morality is the basic way to achieve happiness. He also discusses in this chapter how utilitarianism is beneficial for virtue, saying that it "maintains not only that virtue is to be desired, but that it is to be desired disinterestedly, for itself." In his final chapter, he looks at the connection between utilitarianism and justice. He contemplates the question of whether justice is something distinct from utility or not. He reasons through this question in several different ways and finally comes to the conclusion that in certain cases justice is essential for utility, but in others social duty is far more important than justice. Mill believes that "justice must give way to some other moral principle, but that what is just in ordinary cases is, by reason of that other principle, not just in the particular case."
The qualitative account of happiness that Mill advocates thus sheds light on his account presented in "On Liberty". As he suggests in that text, utility is to be conceived in relation to humanity "as a progressive being", which includes the development and exercise of rational capacities as we strive to achieve a "higher mode of existence". The rejection of censorship and paternalism is intended to provide the necessary social conditions for the achievement of knowledge and the greatest ability for the greatest number to develop and exercise their deliberative and rational capacities.
Mill defines happiness as "the ultimate end, for the sake of which all other things are desirable (whether we are considering our own good or that of other people)": "an existence as free as possible from pain and as rich as possible in enjoyments". He firmly believed that moral rules and obligations could be grounded in the promotion of happiness, which he connects with having a noble character. Mill is neither a standard act utilitarian, for whom the rightness of each individual act is judged directly by its consequences for happiness, nor a standard rule utilitarian, for whom acts are judged by their conformity to rules whose general acceptance would best promote happiness. Rather, he is a minimizing utilitarian, which "affirms that it would be "desirable" to maximize happiness for the greatest number, but not that we are morally "required" to do so".
Mill's early economic philosophy was one of free markets. However, he accepted interventions in the economy, such as a tax on alcohol, if there were sufficient utilitarian grounds. He also accepted the principle of legislative intervention for the purpose of animal welfare. He originally believed that "equality of taxation" meant "equality of sacrifice" and that progressive taxation penalised those who worked harder and saved more and was therefore "a mild form of robbery".
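Whether "equality of sacrifice" favours a flat or a progressive schedule turns on how quickly the marginal utility of income diminishes, which is why Mill could hold both that taxation should equalize sacrifice and that progression was robbery. The sketch below makes the point numerically; the two utility functions and the 20% rate are illustrative assumptions, not anything found in Mill:

```python
import math

# Two candidate utility-of-income curves. Mill specified no functional
# form, so both are modelling assumptions made for illustration.
def u_log(y: float) -> float:
    return math.log(y)

def u_steep(y: float) -> float:
    # More steeply diminishing marginal utility than log (CRRA, rho = 2).
    return -1.0 / y

def sacrifice(u, income: float, rate: float) -> float:
    """Utility lost when a proportional tax at `rate` is levied."""
    return u(income) - u(income * (1 - rate))

for label, u in (("log utility", u_log), ("steeper utility", u_steep)):
    print(label)
    for income in (10_000, 100_000):
        s = sacrifice(u, income, 0.20)
        print(f"  income {income:>7,}: sacrifice from a 20% flat tax = {s:.3g}")
```

Under log utility the flat rate already imposes an identical sacrifice on rich and poor, matching Mill's intuition; under the steeper curve the same rate burdens the poorer taxpayer far more, which is the standard argument for progression.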
Given an equal tax rate regardless of income, Mill agreed that inheritance should be taxed: a utilitarian society would hold that everyone should be equal in one way or another, so receiving an inheritance would put one ahead of the rest of society unless it were taxed. Those who donate should consider and choose carefully where their money goes – some charities are more deserving than others. A public charitable board such as a government, he reasoned, will disburse the money equally, whereas a private charitable board such as a church would disburse the monies fairly, to those in greater need than others.
Later he altered his views toward a more socialist bent, adding chapters to his Principles of Political Economy in defence of a socialist outlook and defending some socialist causes. Within this revised work he also made the radical proposal that the whole wage system be abolished in favour of a co-operative wage system. Nonetheless, some of his views on flat taxation remained, albeit altered in the third edition of the "Principles of Political Economy" to reflect a concern for differentiating restrictions on "unearned" incomes, which he favoured, from those on "earned" incomes, which he did not favour.
Mill's "Principles", first published in 1848, was one of the most widely read of all books on economics in the period. As Adam Smith's "Wealth of Nations" had during an earlier period, "Principles" came to dominate economics teaching. In the case of Oxford University it was the standard text until 1919, when it was replaced by Marshall's "Principles of Economics".
Mill's main objection to socialism focused on what he saw as its destruction of competition. He wrote, "I utterly dissent from the most conspicuous and vehement part of their teaching – their declamations against competition." He was an egalitarian, but he argued more for equal opportunity and placed meritocracy above all other ideals in this regard. According to Mill, a socialist society would only be attainable through the provision of basic education for all, promoting economic democracy instead of capitalism by substituting worker cooperatives for capitalist businesses.
Mill's major work on political democracy, "Considerations on Representative Government", defends two fundamental principles: extensive participation by citizens and enlightened competence of rulers. The two values are obviously in tension, and some readers have concluded that he is an elitist democrat, while others count him as an early participatory democrat. In one section, he appears to defend plural voting, in which more competent citizens are given extra votes (a view he later repudiated). However, in another chapter he argues cogently for the value of participation by all citizens. He believed that the incompetence of the masses could eventually be overcome if they were given a chance to take part in politics, especially at the local level.
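Because plural voting is a concrete counting rule, a toy tally makes the tension between competence and participation visible. The ballots and weights below are invented for illustration; Mill tied extra votes to education, not to these numbers:

```python
from collections import Counter

# Toy tally under a plural-voting rule: each ballot counts with a
# weight meant to track the voter's competence. Ballots and weights
# are invented for illustration.
ballots = [
    ("B", 1), ("B", 1), ("B", 1),  # three ordinary voters, one vote each
    ("A", 3), ("A", 3),            # two "instructed" voters, three votes each
]

tally = Counter()
for choice, weight in ballots:
    tally[choice] += weight

print(tally.most_common())  # [('A', 6), ('B', 3)]
```

The weighted minority outvotes the numerical majority, which is precisely why readers take the proposal as elitist even though Mill paired it with broad participation.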
Mill is one of the few political philosophers ever to serve in government as an elected official. In his three years in Parliament, he was more willing to compromise than the "radical" principles expressed in his writing would lead one to expect.
Mill was a major proponent of the diffusion and use of public education among the working class. He saw the value of the individual person, and believed that "man had the inherent capability of guiding his own destiny – but only if his faculties were developed and fulfilled", which could be achieved through education. He regarded education as a pathway to improve human nature, by which he meant "to encourage, among other characteristics, diversity and originality, the energy of character, initiative, autonomy, intellectual cultivation, aesthetic sensibility, non-self-regarding interests, prudence, responsibility, and self-control". Education allowed humans to develop into fully informed citizens who had the tools to improve their condition and make fully informed electoral decisions. Its power lay in its ability to serve as a great equalizer among the classes, allowing the working class to control their own destiny and compete with the upper classes. Mill recognized the paramount importance of public education in avoiding the tyranny of the majority, by ensuring that all voters and political participants were fully developed individuals. It was through education, he believed, that an individual could become a full participant within representative democracy.
In "Principles of Political Economy", Mill offered an analysis of two economic phenomena often linked together: the laws of production and wealth and the modes of its distribution. Regarding the former, he believed that it was not possible to alter to laws of production, "the ultimate properties of matter and mind... only to employ these properties to bring about events we are interested". The modes of distribution of wealth is a matter of human institutions solely, starting with what Mill believed to be the primary and fundamental institution: Individual Property. He believed that all individuals must start on equal terms, with division of the instruments of production fairly among all members of society. Once each member has an equal amount of individual property, they must be left to their own exertion not to be interfered with by the state. Regarding inequality of wealth, Mill believed that it was the role of the government to establish both social and economic policies that promote the equality of opportunity.
The government, according to Mill, should implement tax policies to help alleviate poverty, above all through the taxation of inheritance.
Inheritance of capital and wealth plays a large role in the development of inequality, because it provides greater opportunity for those who receive it. Mill's solution was a heavier tax on inheritances: he believed taxation to be the most important authoritative function of government, and held that taxation judiciously implemented could promote equality.
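As a rough illustration of the mechanism, the toy simulation below passes wealth through thirty generations of random multiplicative fortune plus a common earned income, with and without an inheritance tax. Every number in it (the shock range, the income, the rates, the population size) is an arbitrary modelling assumption, not a claim about Mill's proposal or any real economy:

```python
import random
import statistics

random.seed(0)

def dispersion_after(generations: int, inheritance_tax: float,
                     families: int = 10_000) -> float:
    """Coefficient of variation of wealth after `generations` bequests.
    Each heir receives the taxed bequest scaled by a random fortune,
    plus a common earned income of 100."""
    wealth = [100.0] * families
    for _ in range(generations):
        wealth = [(1 - inheritance_tax) * w * random.uniform(0.5, 1.5) + 100.0
                  for w in wealth]
    return statistics.stdev(wealth) / statistics.mean(wealth)

for tax in (0.0, 0.3, 0.6):
    print(f"inheritance tax {tax:.0%}: wealth dispersion = "
          f"{dispersion_after(30, tax):.3f}")
```

With no tax the model keeps amplifying early luck indefinitely, while even a moderate levy pulls wealth toward a stable and much narrower spread, the equalizing effect Mill attributed to judicious taxation.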
Mill demonstrated an early insight into the value of the natural world—in particular in Book IV, chapter VI of "Principles of Political Economy": "Of the Stationary State" in which Mill recognised wealth beyond the material, and argued that the logical conclusion of unlimited growth was destruction of the environment and a reduced quality of life. He concludes that a stationary state could be preferable to unending economic growth:
I cannot, therefore, regard the stationary state of capital and wealth with the unaffected aversion so generally manifested towards it by political economists of the old school.
If the earth must lose that great portion of its pleasantness which it owes to things that the unlimited increase of wealth and population would extirpate from it, for the mere purpose of enabling it to support a larger, but not a better or a happier population, I sincerely hope, for the sake of posterity, that they will be content to be stationary, long before necessity compels them to it.
According to Mill, the ultimate tendency in an economy is for the rate of profit to decline, owing to diminishing returns in agriculture and to population increasing at a Malthusian rate – that is, geometrically (doubling at regular intervals), while subsistence can grow only arithmetically.
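A minimal sketch of the Malthusian contrast, using Malthus's own doubling-versus-fixed-increment ratios (the starting values are arbitrary):

```python
# Malthus's contrast: population can double each generation (geometric
# growth) while the food supply gains only a fixed increment each
# generation (arithmetic growth). Starting values are arbitrary.
population, food = 100.0, 100.0

for generation in range(1, 9):
    population *= 2      # geometric: 1, 2, 4, 8, ...
    food += 100.0        # arithmetic: 1, 2, 3, 4, ...
    print(f"gen {generation}: population {population:>7.0f}, "
          f"food {food:>5.0f}, food per head {food / population:.3f}")
```

Food per head falls from parity toward nothing within a few generations; that squeeze on subsistence is the source of the diminishing returns Mill expected to drive the rate of profit down.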
|
https://en.wikipedia.org/wiki?curid=15626
|