Columns:
doc_id: int32 (values 18 – 2.25M)
text: string (lengths 245 – 2.96k)
source: string (lengths 38 – 44)
__index_level_0__: int64 (values 18 – 2.25M)
551,176
As early as 1895, Professor William H. (later Sir William) Bragg was working on wireless telegraphy, though public lectures and demonstrations focussed on his X-ray research, which would later lead to his Nobel Prize. During a hurried visit, Rutherford reported that he was working on a Hertzian oscillator. There were many common practical threads to the two technologies, and he was ably assisted in the laboratory by Arthur Lionel Rogers, who manufactured much of the equipment. On 21 September 1897 Bragg gave the first recorded public demonstration of the working of wireless telegraphy in Australia during a lecture meeting at the University of Adelaide as part of the Public Teachers' Union conference. Bragg departed Adelaide in December 1897 and spent all of 1898 on a 12-month leave of absence, touring Great Britain and Europe, during which time he visited Marconi and inspected his wireless facilities. He returned to Adelaide in early March 1899, and as early as 13 May 1899, Bragg and his father-in-law, Sir Charles Todd, were conducting preliminary tests of wireless telegraphy with a transmitter at the Observatory and a receiver on the South Road (about 200 metres away). Experiments continued throughout the southern winter of 1899 and the range was progressively extended to Henley Beach. In September the work was extended to two-way transmissions with the addition of a second induction coil loaned by Mr. Oddie of Ballarat. It was desired to extend the experiments across a sea path, and Todd was interested in connecting Cape Spencer and Althorpe Island, but local costs were considered prohibitive while the charges for patented equipment from the Marconi Company were exorbitant. At the same time Bragg's interests were leaning towards X-rays, and practical work in wireless in South Australia was largely dormant for the next decade.
https://en.wikipedia.org/wiki?curid=396459
550,888
1,578,757
The observatory was established due to the efforts of William Petrie, an amateur astronomer who had a small private observatory at Egmore in Madras. Petrie's original observatory was established in 1786 and was made of iron and timber. In 1789, Petrie gifted his instruments to the Madras Government before retiring to England. Sir Charles Oakley accepted Petrie's plea to establish an official observatory for the purpose of "promoting the knowledge of astronomy, geography and navigation in India". The building was designed by Michael Topping on the bank of the river Cooum at Nungambakkam. The building consisted of a single room 40 feet long and 20 feet wide with a 15-foot ceiling. At the centre a granite pillar of 10 tons supported a 12-inch azimuth transit circle instrument made by Troughton. These were used to make observations on the meridian that began on 9 January 1793. Topping died in 1796 and was succeeded by John Goldingham who was formerly Petrie's assistant, Government Architect and Editor of the "Government Gazette" apart from serving as first superintendent of the Engineering School. Goldingham determined the longitude as 80°18'30" based on eclipses of Jupiter's moons. This was the value used as a benchmark by William Lambton for the Great Trigonometrical Survey. When Goldingham went on leave between 1805 and 1810, the observatory was maintained by Lt. John Warren (born Jean-Baptiste Francois Joseph de Warren, 21 September 1769 – 9 February 1830, Pondicherry) who recalculated the longitude as 80°17'21"E. He recorded observations on the comet of September 1807 and computed the declinations of several stars. Goldingham returned in 1812 and served until 1830 when he was replaced by Thomas Glanville Taylor who measured the positions of 11,000 stars which were published in five volumes which came to be known as the "Madras Catalogue". Taylor's estimate of the longitude for Madras was 80°14'20"E. Taylor also made observations on the comet of 1831.
https://en.wikipedia.org/wiki?curid=6433541
1,577,867
471,115
The glucose-alanine cycle may also be of significant clinical relevance in oncological (cancer) pathogenesis. A recent study explored the role of the glucose-alanine cycle in the metabolic reprogramming of Hepatocellular Carcinoma (HCC). HCC is the most common form of liver cancer and the third most common cause of cancer-related deaths worldwide. The search for alternative treatment options remains an active area of research as currently available therapeutics (surgery, radiotherapy, chemotherapy) generally have severe side effects and/or low success rates with HCC. One common characteristic of many novel alternative and/or supplementary treatments is the targeting of the cellular metabolism of cancer cells, due to their general hyper-metabolic state which favors rapid growth and proliferation. In conjunction with consuming glucose at a much more rapid rate than healthy cells, cancer cells rely heavily on amino acid metabolism to satisfy their avid nutritional needs. The researchers involved in this study speculated that exogenous alanine, processed via the glucose-alanine cycle, is one of the alternative energy sources for HCC cells in a nutrient-deficient environment and that this dependency can be harnessed for targeted therapy. To demonstrate this experimentally, HCC cells were cultured in vitro in a nutrient-poor medium and then supplied with alanine. The alanine supplementation was enough to promote HCC cell growth under those conditions, a phenomenon called metabolic reprogramming. Next, they performed a series of overexpression and loss-of-function experiments and determined that Glutamic Pyruvate Transaminase 1 (GPT1) is the GPT isoform primarily involved in alanine turnover in HCC cells, consistent with previous findings that GPT1 tends to be found in the liver. They proceeded by treating the metabolically reprogrammed HCC cells with Berberine, a naturally occurring inhibitor of GPT1; the observed effect was to curb ATP production and subsequently the growth of the alanine-supplied cancer cells. Their study demonstrated that components of the glucose-alanine cycle, particularly GPT1, may be a good choice as a target for alternative HCC therapies and that Berberine, as a plant-derived selective GPT1 inhibitor, has potential for use in one of these novel medicines. The concept of alanine as an alternative fuel for cancer cells was similarly demonstrated in other studies performed on pancreatic cancer cells.
https://en.wikipedia.org/wiki?curid=3945314
470,879
1,338,629
Technological analysis is concerned with the examination of the production of knapped-stone artifacts. The study of the attributes of waste products (debitage) and tools is among the most important methods for the study of knapped-stone technology, backed up by experimental production. One such method of experimentation is to use steel balls dropped by an electromagnet onto a glass prism to test relationships such as platform thickness and flake length. Additionally, work by Patterson (1990) indicates that the process of bifacial reduction can be identified through analysis of debitage in the absence of an identifiable bifacial artefact by comparing the various proportions of an assemblage's flake sizes. A very wide range of attributes may be used to characterize and compare assemblages to isolate (and interpret) differences across time and space in the production of stone tools. Lithic analysts identify flake scarring on stone artifacts in order to understand the manufacturing process of flake production. There have been efforts to identify variables that predict the original size of discarded tool artifacts, but the results of these studies have not been uniform and research continues. Kuhn (1990) presents his Geometric Index of Unifacial Reduction, an equation for estimating the mass loss of retouched stone artefacts. This index attempts to use 2D measurements of a flake's reduced edge to find the lost mass. Discovering the amount a particular flake has been reduced can help archaeologists answer questions of tool maintainability, optimal resources, and knapping practices. Kuhn's GIUR method was recently reaffirmed as robust, with simulations and experiments yielding strong positive correlations with the flake mass removed from retouched flakes. The GIUR method is best used on flakes that have been lightly retouched and it can only be used on unifacially retouched flakes. 3D modelling is an increasingly important tool for lithic analysis.
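Kuhn's index reduces to a simple ratio calculation. Below is a minimal sketch in Python, assuming the commonly cited formulation in which the GIUR is the mean of scar-height-to-flake-thickness ratios (t/T) taken at a few points along the retouched edge; the function name and measurement values are illustrative and are not taken from Kuhn (1990) or the studies cited above.

```python
# Minimal sketch of Kuhn's (1990) Geometric Index of Unifacial Reduction (GIUR).
# Assumes the commonly cited form of the index: at several points along the
# retouched edge, t is the height the retouch scar reaches on the dorsal face
# and T is the flake thickness at that point; the index is the mean of t/T.
# Measurement values below are made up for illustration.

def giur(scar_heights, thicknesses):
    """Return the mean t/T ratio for paired measurements along a retouched edge."""
    if len(scar_heights) != len(thicknesses):
        raise ValueError("each scar height needs a matching thickness")
    ratios = [t / T for t, T in zip(scar_heights, thicknesses)]
    return sum(ratios) / len(ratios)

# Three hypothetical measurement points (millimetres).
t_values = [2.1, 2.6, 1.8]   # height of retouch scarring
T_values = [4.0, 4.2, 3.9]   # flake thickness at the same points

index = giur(t_values, T_values)
print(f"GIUR = {index:.2f}")  # 0 = unretouched edge, 1 = retouch spans full thickness
```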
https://en.wikipedia.org/wiki?curid=147620
1,337,896
383,019
Borderline elevated metanephrines present a diagnostic challenge to the physician: the first step is to repeat the labs, taking extra precautions to follow the gold standard diagnosis described above, including the conditions of collection, pharmaceutical interference, and any potential diet and lifestyle habits that could alter the results. If the offending medications cannot be discontinued or the repeat labs remain the same, consider administering a clonidine suppression test. In the 1970s, the drug clonidine hydrochloride swept the market as a novel agent for hypertension; however, the reported side effects (nausea, vomiting, drowsiness, dryness of the eyes and mouth, constipation, and generalized weakness) limit compliance and have vastly diminished prescriptions. While the adverse side effects of clonidine are inconvenient, the most dangerous aspect of clonidine is withdrawal rebound hypertension; that is, when the medicine is abruptly discontinued, blood pressure may rapidly return to or surpass the original value. However, a one-time, weight-based dose can be utilized in limited settings to help determine disease status. After fasting overnight, patients will present to their testing site for a baseline metanephrines blood draw and clonidine administration. They will remain supine for three hours, and a repeat blood draw will be taken. A positive result (indicating a pheochromocytoma) will occur if the plasma metanephrine levels remain elevated after clonidine is given. If the results fall, the test is negative and the patient does not have a pheochromocytoma. It is important to note that if a patient "does not" have a pheochromocytoma, they may become extremely hypotensive following clonidine. Patients should not depend on themselves for transport following this test.
https://en.wikipedia.org/wiki?curid=277088
382,824
1,285,694
Malawi's soil is depleted, like that of other countries in the region. Many of its farmers could not afford fertilizer at the then-current market prices. Bingu wa Mutharika declared he did not get elected to rule a nation of beggars. After initially failing to persuade the World Bank and other donors to help subsidize green revolution inputs, the president decided to spend $58 million from Malawi's own reserves to provide seeds and fertilizers to the poorest farmers. The World Bank eventually endorsed a scheme to allow the poorest 1.3 million farm families to buy three kilograms of hybrid maize and two 50-kilogram bags of fertilizer at a third of the market price. Following a bumper harvest in 2007, Malawi sold more maize to the World Food Program of the United Nations than any other southern African country, and exported hundreds of thousands of tons of corn to Zimbabwe. The success of these subsidies prompted some re-examination of the role of agriculture in helping the poor in Africa, and of government investment in basic components of farming, such as fertilizer, improved seed, farmer education, credit and agricultural research. Despite this, the UN Food and Agriculture Organization recorded that in the period 2010–12, 23.1% of the population were under-nourished, almost the same percentage as was recorded for the whole period from 2004 to 2009 and only a slight fall from the 26.8% recorded in 1999–2001.
https://en.wikipedia.org/wiki?curid=14543243
1,284,993
1,908,563
The 1965 Long Beach Sculpture Symposium was not only the United States' first international sculpture symposium but also the first to take place at a university, and the first to collaborate with industrial partners to create technologically advanced works. Those partners included Bethlehem Steel, Fellows and Stewart Shipyard, and North American Aviation. The symposium was organized by the Cal State Long Beach professor Kenneth Glenn and the Israeli artist Kosso Eloul. It lasted 12 weeks and included artists from Poland, Canada, Japan, Israel, and the United States. Nine artists participated, all producing large-scale abstract pieces made from concrete, steel, and earth. At the time, the symposium responded to the surrounding war and politics with themes centered on human cooperation and engagement. Each artist was matched with an industrial representative. For example, Piotr Kowallski worked with the North American Aviation Corporation, and Robert Murray worked with Bethlehem Steel. Murray's piece, "Duet", is a large-scale work made of three sheets of painted steel. The thick sheets were painted in a unique tangerine shade that was accurately restored in 2015 through the work of the Getty Conservation Institute. Beyond the individual pieces, California architect Edward Killingsworth used CSULB's 350-acre campus to connect the works and their sites to Modernist architecture in an effort to showcase monumental outdoor sculpture. Most of the campus is designed in a mid-century modern style with an emphasis on open landscaped areas to create an almost sprawling park feeling. Working with outdoor sculpture posed some challenges, such as damage from exposure to the elements. More importantly, the pieces worked toward a greater goal of conserving art in public places, in addition to creating consistent color ties across campus through modernist proportioning and close connections to the landscape.
https://en.wikipedia.org/wiki?curid=439256
1,907,466
624,952
In 1800, Alessandro Volta invented the electric battery (known as the voltaic pile) and thus improved the way electric currents could be studied. A year later, Thomas Young demonstrated the wave nature of light—which received strong experimental support from the work of Augustin-Jean Fresnel—and the principle of interference. In 1813, Peter Ewart supported the idea of the conservation of energy in his paper "On the measure of moving force". In 1820, Hans Christian Ørsted found that a current-carrying conductor gives rise to a magnetic force surrounding it, and within a week after Ørsted's discovery reached France, André-Marie Ampère discovered that two parallel electric currents will exert forces on each other. In 1821, William Hamilton began his analysis of Hamilton's characteristic function. In 1821, Michael Faraday built an electricity-powered motor, while Georg Ohm stated his law of electrical resistance in 1826, expressing the relationship between voltage, current, and resistance in an electric circuit. A year later, botanist Robert Brown discovered Brownian motion: pollen grains in water undergoing movement resulting from their bombardment by the fast-moving atoms or molecules in the liquid. In 1829, Gaspard Coriolis introduced the terms of work (force times distance) and kinetic energy with the meanings they have today.
https://en.wikipedia.org/wiki?curid=56661172
624,619
382,474
Electronic limitations to invasive BCIs have been an active area of research in recent decades. While intracellular recordings of neurons reveal action potential voltages on the scale of hundreds of millivolts, chronic invasive BCIs rely on recording extracellular voltages, which are typically three orders of magnitude smaller, at hundreds of microvolts. Further adding to the challenge of detecting signals on the scale of microvolts is the fact that the electrode-tissue interface has a high capacitance at small voltages. Due to the nature of these small signals, for BCI systems that incorporate functionality onto an integrated circuit, each electrode requires its own amplifier and ADC, which convert analog extracellular voltages into digital signals. Because a typical neuron action potential lasts for one millisecond, BCIs measuring spikes must have sampling rates ranging from 300 Hz to 5 kHz. Yet another concern is that invasive BCIs must be low-power, so as to dissipate less heat to surrounding tissue; at the most basic level, more power is traditionally needed to optimize signal-to-noise ratio. Optimal battery design is an active area of research in BCIs. Challenges in the area of materials science are central to the design of invasive BCIs. Variations in signal quality over time have been commonly observed with implantable microelectrodes. Optimal material and mechanical characteristics for long-term signal stability in invasive BCIs have been an active area of research. It has been proposed that the formation of glial scarring, secondary to damage at the electrode-tissue interface, is likely responsible for electrode failure and reduced recording performance. Research has suggested that blood-brain barrier leakage, either at the time of insertion or over time, may be responsible for the inflammatory and glial reaction to chronic microelectrodes implanted in the brain. As a result, flexible and tissue-like designs have been researched and developed to minimize foreign-body reaction by matching the Young's modulus of the electrode more closely to that of brain tissue.
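As an illustration of the signal chain described above (microvolt-scale extracellular voltages, digitization, and spike detection in roughly the 300 Hz–5 kHz band), here is a minimal, self-contained sketch of threshold-based spike detection on a simulated channel; the sampling rate, noise level, and threshold multiplier are assumptions chosen for illustration and do not describe any particular BCI.

```python
# Minimal sketch of extracellular spike detection on one digitized BCI channel.
# All parameters (30 kHz sampling, 300 Hz-5 kHz band, 4.5x noise threshold) are
# illustrative assumptions, not values from any particular implant.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30_000                    # sampling rate in Hz (assumed)
n_samples = fs                 # one second of data

# Simulate a noisy extracellular trace in microvolts with a few ~1 ms spikes.
rng = np.random.default_rng(0)
trace = rng.normal(0, 20, n_samples)                       # ~20 uV RMS noise
spike = -150 * np.exp(-((np.arange(30) - 10) ** 2) / 30)   # ~1 ms negative spike
for spike_time in (0.10, 0.40, 0.75):
    idx = int(spike_time * fs)
    trace[idx:idx + spike.size] += spike

# Band-pass to the spike band, then threshold at a multiple of a robust noise estimate.
b, a = butter(3, [300 / (fs / 2), 5_000 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, trace)
noise_sigma = np.median(np.abs(filtered)) / 0.6745
threshold = -4.5 * noise_sigma
crossings = np.flatnonzero((filtered[1:] < threshold) & (filtered[:-1] >= threshold))

print(f"detected {crossings.size} spikes near {crossings / fs} s")
```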
https://en.wikipedia.org/wiki?curid=623686
382,279
643,991
The close relationship between bibliometrics and commercial vendors of citation data and indicators has become more strained since the 1990s. Leading scientific publishers have diversified their activities beyond publishing and moved "from a content-provision to a data analytics business." By 2019, Elsevier had either acquired or built a large portfolio of platforms, tools, databases and indicators covering all aspects and stages of scientific research: "the largest supplier of academic journals is also in charge of evaluating and validating research quality and impact (e.g., Pure, Plum Analytics, Sci Val), identifying academic experts for potential employers (e.g., Expert Lookup), managing the research networking platforms through which to collaborate (e.g., SSRN, Hivebench, Mendeley), managing the tools through which to find funding (e.g., Plum X, Mendeley, Sci Val), and controlling the platforms through which to analyze and store researchers' data (e.g., Hivebench, Mendeley)." Metrics and indicators are key components of this vertical integration: "Elsevier's further move to offering metrics-based decision making is simultaneously a move to gain further influence in the entirety of the knowledge production process, as well as to further monetize its disproportionate ownership of content." The new market for scientific publication and scientific data has been compared with the business models of social networks, search engines and other forms of "platform capitalism". While content access is free, it is indirectly paid for through data extraction and surveillance. In 2020, Rafael Ball envisioned a bleak future for bibliometricians in which their research contributes to the emergence of a highly invasive form of "surveillance capitalism": scientists would "be given a whole series of scores which not only provide a more comprehensive picture of the academic performance, but also the perception, behaviour, demeanour, appearance and (subjective) credibility (…) In China, this kind of personal data analysis is already being implemented and used simultaneously as an incentive and penalty system."
https://en.wikipedia.org/wiki?curid=1223245
643,651
1,243,094
The microbial safety and stability of food are based on the application of preservative factors called hurdles. The delicate texture and flavor of seafood are very sensitive to decontamination technologies such as cooking, and to more recent mild technologies such as pulsed light, high pressure, ozone, and ultrasound. Chemical preservatives, which are ingredients rather than processes, are out of favor with consumers due to the demand for natural preservatives. An alternative solution that is gaining more and more attention is biopreservation technology. In fish processing, biopreservation is achieved by adding antimicrobials or by increasing the acidity of the fish muscle. Most bacteria stop multiplying when the pH is less than 4.5. Traditionally, acidity has been increased by fermentation, marination or by directly adding acetic, citric or lactic acid to food products. Other preservatives include nitrites, sulphites, sorbates, benzoates and essential oils. The main reason there are fewer documented studies on the application of protective microorganisms, bacteriophages or bacteriocins to seafood products, compared to dairy or meat products, is probably that the early stages of biopreservation occurred mainly in fermented foodstuffs, which are not as developed among seafood products. The selection of potential protective bacteria for seafood products is challenging because they need to be adapted to the seafood matrix (poor in sugar), their metabolic activities should not change the initial characteristics of the product (i.e. by acidification), and they should not induce spoilage that could lead to sensory rejection. Among the microbiota identified in fresh or processed seafood, LAB remains the category that offers the highest potential for direct application as a bioprotective culture or for bacteriocin production.
https://en.wikipedia.org/wiki?curid=31196402
1,242,421
1,894,288
The use of weather charts in a modern sense began in the middle portion of the 19th century. Weather map pioneers include William Charles Redfield, William Reid, Elias Loomis, and Sir Francis Galton, who created the first weather maps in order to devise a theory on storm systems. The invention of the telegraph in 1837 made it possible to gather weather information from multiple distant locations quickly enough to preserve its value for real-time applications. The Smithsonian Institution developed its network of observers over much of the central and eastern United States between the 1840s and 1860s once Joseph Henry took the helm. Beginning in 1849, the Smithsonian started producing surface analyses on a daily basis using the 150 stations in their network. The U.S. Army Signal Corps inherited this network between 1870 and 1874 by an act of Congress, and expanded it to the west coast soon afterwards. Three times daily, all stations would telegraph in their observations to the central office, which would then plot the information on a map upon which isobars, or lines of equal pressure, would be drawn to identify centers of high and low pressure, as well as squall lines. In the early days of these analyses, the data on a given map were not all taken at exactly the same time because of a lack of time standardization. The first attempts at time standardization took hold in Great Britain by 1855. However, in the United States, standard time did not come to pass until 1883, when time zones started to come into use across America for railroad use. The entire United States did not come under the influence of time zones until 1905, when Detroit finally established standard time.
https://en.wikipedia.org/wiki?curid=11165739
1,893,204
1,651,114
Shifting cultivation is a mode of production involving the low-intensity production of plant-based foods; this mode is also known as horticulture or 'slash and burn agriculture' in some texts. Horticultural societies are generally situated in semi-sedentary villages of a few hundred people that clear a field and burn the cleared vegetation in order to use the ashes to nourish the soil (hence the phrase slash and burn). Next, the group plants a crop or crops in this clearing and uses it for cultivation for several years. At the end of this period, the entire village relocates and starts the process anew, leaving the old clearing fallow for a period of decades in order to allow regeneration through the regrowth of wild vegetation. These food items can be supplemented by raising livestock, hunting wild game, and, in many cases, gathering wild plants (Miller 2005; Park 2006). Though periodic movement precludes absolute permanent ownership of land, some horticultural societies fiercely defend current territories and practice violence against neighboring groups. For instance, Napoleon Chagnon (1997) depicts the Yanomamö of Venezuela and Brazil as the “Fierce People”, though others have been highly critical of Chagnon's account of this society. Horticulture can also produce a broad diet, and in some cases more food per unit of land area than foraging. Though populations of horticulturalists tend to have greater density than those of foragers, they are generally less dense than those which practice other modes of production. If practiced on a small scale, over a large area, with long fallow periods, horticulture has less negative environmental impact than agriculture or industrialism, but more than foraging (Miller 2005). Generally, horticulture coincides with a subsistence type of economy in terms of production and distribution.
https://en.wikipedia.org/wiki?curid=17200325
1,650,182
1,368,580
Some updates included features promised as part of the long-term release plan, such as the competitive mode that was added in June 2016, and potential changes to how the Play of the Game is selected to showcase non-attack-based Heroes. Other updates come from monitoring the game and adjusting various attributes of the Heroes to better match the expectations the developers had in designing the game and to respond to player feedback. For example, one of the first planned updates was to change the strength of Cassidy's alternate fire "Fan the Hammer" ability, which could do a great deal of damage to most targets. Blizzard felt this attack should be lethal to most of the Heroes but should not be able to take out Tank-based characters in a single shot, and reduced the damage to address this. Similarly, Symmetra had a major character update in late 2016 that gave her additional abilities, including a second Ultimate ability, the changes made in response to Blizzard's observation that she had been infrequently selected by players. Kaplan said that while they do want to promote lesser-played heroes, the goal of these updates is not to strive for equal selection rates for all heroes, but to work with the fluctuations in the meta-game as new heroes are introduced and as players develop new strategies to counter new heroes or updates made to other heroes. Kaplan described their overall approach to these post-release adjustments as part of a "balance triangle", weighing the various statistics they collect, the feedback and general attitudes of the player base they monitor, and their own personal opinions and gut feelings about where the balance should be. Not all three points are equally weighted, but they try to avoid letting one of these areas overwhelm the other two.
https://en.wikipedia.org/wiki?curid=55762533
1,367,824
789,563
In Rome in 1864, Pope Pius IX in his "Syllabus of Errors" decreed that it was an "error" that "reason is the ultimate standard by which man can and ought to arrive at knowledge" and an "error" that "divine revelation is imperfect" in the Bible – and anyone maintaining those errors was to be "anathematized" – and in 1888 his successor, Pope Leo XIII, decreed as follows: "The fundamental doctrine of rationalism is the supremacy of the human reason, which, refusing due submission to the divine and eternal reason, proclaims its own independence... A doctrine of such character is most hurtful both to individuals and to the State... It follows that it is quite unlawful to demand, to defend, or to grant, unconditional [or promiscuous] freedom of thought, speech, writing, or religion." Those principles and Tyndall's principles were profoundly opposed. Fortunately for Tyndall, he did not need to get into a contest with them in Britain. Even in Italy, Huxley and Darwin were awarded honorary medals and most of the Italian governing class was hostile to the papacy. But in Ireland during Tyndall's lifetime the majority of the population grew increasingly doctrinaire and vigorous in its Roman Catholicism and also grew stronger politically. Between 1886 and 1893, Tyndall was active in the debate in England about whether to give the Catholics of Ireland more freedom to go their own way. Like the great majority of Irish-born scientists of the 19th century he opposed the Irish Home Rule Movement. He had ardent views about it, which were published in newspapers and pamphlets. For example, in an opinion piece in "The Times" on 27 December 1890 he saw priests and Catholicism as "the heart and soul of this movement" and wrote that placing the non-Catholic minority under the dominion of "the priestly horde" would be "an unspeakable crime". He tried unsuccessfully to get the UK's premier scientific society to denounce the Irish Home Rule proposal as contrary to the interests of science.
https://en.wikipedia.org/wiki?curid=256310
789,138
447,128
Carbon nanotubes are also a promising material as building blocks in hierarchical composite materials given their exceptional mechanical properties (~1 TPa in modulus and ~100 GPa in strength). Initial attempts to incorporate CNTs into hierarchical structures (such as yarns, fibres or films) have led to mechanical properties that were significantly lower than these potential limits. The hierarchical integration of multi-walled carbon nanotubes and metal/metal oxides within a single nanostructure can leverage the potential of carbon nanotube composites for water splitting and electrocatalysis. Windle "et al." have used an "in situ" chemical vapor deposition (CVD) spinning method to produce continuous CNT yarns from CVD-grown CNT aerogels. CNT yarns can also be manufactured by drawing out CNT bundles from a CNT forest and subsequently twisting to form the fibre (the draw-twist method). The Windle group have fabricated CNT yarns with strengths as high as ~9 GPa at small gage lengths of ~1 mm; however, strengths of only about ~1 GPa were reported at the longer gage length of 20 mm. The reason fibre strengths have been low compared to the strength of individual CNTs is a failure to effectively transfer load to the constituent (discontinuous) CNTs within the fibre. One potential route for alleviating this problem is via irradiation (or deposition) induced covalent inter-bundle and inter-CNT cross-linking to effectively 'join up' the CNTs, with higher dosage levels leading to the possibility of amorphous carbon/carbon nanotube composite fibres. Espinosa "et al." developed high-performance DWNT-polymer composite yarns by twisting and stretching ribbons of randomly oriented bundles of DWNTs thinly coated with polymeric organic compounds. These DWNT-polymer yarns exhibited an unusually high energy to failure of ~100 J·g⁻¹ (comparable to one of the toughest natural materials – spider silk), and strength as high as ~1.4 GPa. Effort is ongoing to produce CNT composites that incorporate tougher matrix materials, such as Kevlar, to further improve on the mechanical properties toward those of individual CNTs.
https://en.wikipedia.org/wiki?curid=7452926
446,911
1,583,408
Publius declared that he had successfully finished the purpose with his work. To the best of his knowledge, he had explained and defended the proposed government from every inquiry and objection set forth against it and succinctly dissected the reasons and purposes for its ratification. The argument then turned to the constitution's imperfect nature. The essay asks why an imperfect document should be adopted when it could be revised and amended first and then ratified at a later date. Publius responded by declaring it imprudent to prolong national affairs for the irrational pursuit of perfection. He stated: "I never expect to see a perfect work from imperfect man. The result of the deliberation of all collective bodies, must necessarily be a compound as well of the errors and prejudices, as of the good sense and wisdom of the individuals of whom they are composed. The compacts which are to embrace thirteen distinct states, in a common bond of amity and union, must as necessarily be a compromise of the dissimilar interests and inclinations. How can perfection spring from such materials?" For those with serious intent to amend the proposed constitution, it would be easier to amend the document after its ratification rather than before. With every amendment added before ratification, every state would have to revise and agree upon those changes. Little progress would be made, and compromise among the states would obscure the amendments' original intent. However, if any amendment were proposed after ratification, there would be no compromise on its language; it would either be accepted or rejected. Publius quoted the Scottish Enlightenment philosopher David Hume: "to balance a large state or society (says he) whether monarchical or republican, on general laws, is a work of so great difficulty, that no human genius, however comprehensive, is able by the mere dint of reason and reflection, to effect it. The judgments of many must unite in the work: EXPERIENCE must guide their labour: TIME must bring it to perfection: and the FEELING OF inconveniences must correct the mistakes which they inevitably fall into, in their first trials and experiments."
https://en.wikipedia.org/wiki?curid=2654114
1,582,518
1,183,859
The "Hot and Energetic Universe" science theme revolves around two fundamental questions in astrophysics: How does ordinary matter assemble into the large-scale structures that we see today? And how do black holes grow and shape the Universe? Both questions can only be answered using a sensitive X-ray space observatory. Its combination of scientific performance exceeds any existing or planned X-ray missions by over one order of magnitude on several parameter spaces: effective area, weak line sensitivity, survey speed, just to mention a few. "Athena" will perform very sensitive measurements on a wide range of celestial objects. It will investigate the chemical evolution of the hot plasma permeating the intergalactic space in cluster of galaxies, search for elusive observational features of the Warm-Hot Intergalactic Medium, investigate powerful outflows ejected from accreting black holes across their whole mass spectrum, and study their impact on the host galaxy, and identify sizeable samples of comparatively rare populations of Active Galactic Nuclei (AGN)  that are key to understanding the concurrent cosmological evolution of accreting black holes and galaxies. Among them are highly obscured and high-redshift (z≥6) AGN. Furthermore, "Athena" will be an X-ray observatory open to the whole astronomical community, poised to provide wide-ranging discoveries in almost all fields of modern astrophysics, with a large discovery potential of still unknown and unexpected phenomena. It represents the X-ray contribution to the fleet of large-scale observational facilities to be operational in the 2030s (incl. SKA, ELT, ALMA, LISA...).
https://en.wikipedia.org/wiki?curid=23423743
1,183,232
1,145,498
The nervous system and immune system require the appropriate degrees of cellular differentiation, organizational integrity, and neural network connectivity. These operational features of the brain and nervous system may make signaling difficult to duplicate in severely diseased scenarios. There are currently three classes of therapies that have been utilized in both animal models of disease and in human clinical trials. These three classes include DNA methylation inhibitors, HDAC inhibitors, and RNA-based approaches. DNA methylation inhibitors are used to activate previously silenced genes. HDACs are a class of enzymes that carry out a broad set of biochemical modifications; their inhibitors can affect DNA demethylation and synergize with other therapeutic agents. The final class uses RNA-based approaches to enhance stability, specificity, and efficacy, especially in diseases that are caused by RNA alterations. Emerging concepts concerning the complexity and versatility of the epigenome may suggest ways to target genome-wide cellular processes. Other studies suggest that key regulator targets may eventually be identified, allowing alterations to the massive epigenetic reprogramming that occurs during gametogenesis. Many future treatments may extend beyond being purely therapeutic and may be preventive, perhaps in the form of a vaccine. Newer high-throughput technologies, when combined with advances in imaging modalities such as in vivo optical nanotechnologies, may give rise to even greater knowledge of genomic architecture, nuclear organization, and the interplay between the immune and nervous systems.
https://en.wikipedia.org/wiki?curid=4052447
1,144,897
1,875,268
Clore's recent work has focused on developing new NMR methods (such as paramagnetic relaxation enhancement, dark state exchange saturation transfer spectroscopy and lifetime line broadening) to detect, characterize and visualize the structure and dynamics of sparsely populated states of macromolecules, which are important in macromolecular interactions but invisible to conventional structural and biophysical techniques. Examples include the direct demonstration of rotation-coupled sliding and intermolecular translocation as mechanisms whereby sequence-specific DNA binding proteins locate their target site(s) within an overwhelming sea of non-specific DNA sequences; the detection, visualization and characterization of encounter complexes in protein-protein association; the analysis of the synergistic effects of conformational selection and induced fit in protein-ligand interactions; and the uncovering of "dark", spectroscopically invisible states in interactions of NMR-visible proteins and polypeptides (including intrinsically disordered states) with very large megadalton macromolecular assemblies. The latter includes an atomic-resolution view of the dynamics of the amyloid-β aggregation process, and the demonstration of intrinsic unfoldase/foldase activity of the macromolecular machine GroEL. These various techniques have also been used to uncover the kinetic pathway of pre-nucleation transient oligomerization events and associated structures involving the protein encoded by huntingtin exon-1, which may provide a potential avenue for therapeutic intervention in Huntington's disease, a fatal autosomal dominant, neurodegenerative condition.
https://en.wikipedia.org/wiki?curid=45449941
1,874,191
643,979
The emerging computing technologies were immediately considered as a potential solution for making a larger amount of scientific output readable and searchable. During the 1950s and 1960s, an uncoordinated wave of experiments in indexing technologies resulted in the rapid development of key concepts of computerized information retrieval. In 1957, IBM engineer Hans Peter Luhn introduced an influential paradigm of statistical analysis of word frequencies, as "communication of ideas by means of words is carried out on the basis of statistical probability." Automated translation of non-English scientific work also contributed significantly to fundamental research on natural language processing of bibliographic references, as in this period a significant share of scientific publications was still not available in English, especially those coming from the Soviet bloc. Influential members of the National Science Foundation like Joshua Lederberg advocated for the creation of a "centralized information system", SCITEL, partly influenced by the ideas of John Desmond Bernal. This system would at first coexist with printed journals and gradually replace them altogether on account of its efficiency. In the plan laid out by Lederberg to Eugene Garfield in November 1961, a centralized deposit would index as many as 1,000,000 scientific articles per year. Beyond full-text searching, the infrastructure would also ensure the indexation of citations and other metadata, as well as the automated translation of foreign-language articles.
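As a small illustration of the word-frequency paradigm attributed to Luhn above, the following sketch ranks the content words of a toy abstract by frequency; the sample text, stop-word list, and function name are invented for illustration, and this is not Luhn's original 1957 procedure, only the counting idea behind it.

```python
# Minimal sketch in the spirit of Luhn's statistical indexing: rank the words of
# an abstract by frequency after discarding very common function words.
import re
from collections import Counter

STOP_WORDS = {"the", "of", "and", "a", "in", "to", "is", "by", "on", "for", "that"}

def significant_words(text, top_n=5):
    """Return the top_n most frequent non-stop-words in a piece of text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

abstract = ("Automatic indexing of scientific abstracts selects index terms "
            "by the statistical frequency of words, on the assumption that "
            "frequent content words signal the subject of the abstract.")
print(significant_words(abstract))
```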
https://en.wikipedia.org/wiki?curid=1223245
643,639
2,106,282
Irving Kaplan (1913–1997) was a chemist and a Massachusetts Institute of Technology professor, who was among the founders of the Department of Nuclear Engineering at the institution. Kaplan received a BA from Columbia University in 1933, an MA in 1934 and a PhD in chemistry in 1937. Before coming to MIT, he was a researcher in chemistry at the Michael Reese Hospital in Chicago from 1937 to 1941. He participated in the Manhattan Project, doing research on isotope separation. Kaplan was also a lead founding member of the Federation of American Scientists, and worked with other scientists to promote civilian control of atomic energy. This eventually led the way to the creation of the U.S. Atomic Energy Commission in 1947. From 1946 to 1957, he worked as a senior physicist at the Brookhaven National Laboratory on Long Island, and wrote a textbook titled "Nuclear Physics". Kaplan visited MIT in 1957, and became a professor in 1958 to participate in the new department. He took part in various projects, such as research on lattices of partially enriched uranium rods in heavy water, and the development of graduate and undergraduate courses such as the history of science and classical Greek. Professor Kaplan had a wife, two sons and one daughter, and four grandchildren. He died at the Massachusetts General Hospital on April 10 after heart surgery.
https://en.wikipedia.org/wiki?curid=8530659
2,105,069
1,395,266
Staats began studies to analyze cases of important human behaviors in basic and applied ways in 1954. In 1958 he analyzed dyslexia and introduced his token reinforcer system (later called the token economy) along with his teaching method and materials for treating the disorder. When his daughter Jenny was born in 1960 he began to study and to produce her language, emotional, and sensory-motor development. When she was a year and a half old he began teaching her number concepts, and then reading six months later, using his token reinforcer system, as he recorded on audiotape. Films made in 1966, presented in the first of three Arthur Staats YouTube videos, show Staats being interviewed about his conception of how variations in children's home learning prepared them differently for school. The second Staats YouTube video records him beginning to teach his three-year-old son with the reading (and counting) method he developed in 1962 with his daughter. This film also shows a graduate assistant working with a culturally deprived four-year-old who is voluntarily learning reading, writing numbers, and counting. The third Staats YouTube video has additional cases of these usually delayed children voluntarily learning, well ahead of the usual time, the cognitive repertoires that prepare them for school. This group of 11 children gained an average of 11 points in IQ and advanced significantly on a child development measure as they also learned to like the learning situation. Staats published the first study in this series in 1962 and describes his later studies and his more general conception in his 1963 book. This research, which included work with his own children from birth on, was the basis for Staats' books specifying the importance of the parents' early training of the child in language and other cognitive repertoires. He shows they are the foundations for being intelligent and doing well on entering school. Newer studies show that parents who talk to their children more have children with advanced language development, school success, and intelligence measures. These statistical studies should be joined with Staats' work with individual children, which shows the specifics of the learning involved and how best to produce it. The two together show powerfully the importance of early child learning.
https://en.wikipedia.org/wiki?curid=24006408
1,394,495
230,870
In 1920 she attended the sixth congress of the International Psychoanalytical Association in The Hague, where she gave a talk on the origins of language in childhood. The audience included Sigmund Freud, his daughter Anna Freud, Melanie Klein and Sandor Ferenczi. She also announced her intention to join the staff of the Rousseau Institute in Geneva, a pioneering clinical, training and research centre for child development. She remained there for three years, working alongside its founder Édouard Claparède, as well as other distinguished psychologists of the time including Pierre Bovet. While she was there, Jean Piaget also joined the staff: they collaborated closely, and in 1921 he went into an eight-month analysis with her. In 1922, she and Piaget both delivered papers at the seventh congress of the International Psychoanalytical Association in Berlin. This was one of the most productive periods of her life, and she published twenty papers between 1920 and 1923. The most important of these was a new version of the paper she had given at The Hague on the origins of language, drawing on her collaboration with the linguist Charles Bally. Entitled "The origins of the words 'Papa' and 'Mama'", it described how language develops on a substrate of genetic readiness, first through interactions between the child and the mother's breast, and then through family and social interactions. Her other papers from the time are mainly devoted to bringing psychoanalytic thought together with observational studies of child development. Her papers in the "Zeitschrift" and "Imago" from this time mainly focus on the importance of speech acquisition in early childhood and the sense of time. However, Otto Fenichel singled out for special mention her 1923 article on voyeurism, where “Sabina Spielrein described a peeping perversion in which the patient tried to overcome an early repression of genital and manual erotogeneity, provoked by an intense castration fear”. Overall, her work during this period is thought to have had considerable influence on Piaget's thought, and possibly on Klein's.
https://en.wikipedia.org/wiki?curid=2425513
230,752
592,168
Researchers argue that analyses "[...] of the environmental impacts of miscanthus cultivation on a range of factors, including greenhouse gas mitigation, show that the benefits outweigh the costs in most cases." Others argue that although there is room for more research, "[...] clear indications of environmental sustainability do emerge." In addition to the greenhouse gas mitigation potential, miscanthus' "[…] perennial nature and belowground biomass improves soil structure, increases water-holding capacity (up by ), and reduces run-off and erosion. Overwinter ripening increases landscape structural resources for wildlife. Reduced management intensity promotes earthworm diversity and abundance although poor litter palatability may reduce individual biomass. Chemical leaching into field boundaries is lower than comparable agriculture, improving soil and water habitat quality." A change from first generation to second generation energy crops like miscanthus is environmentally beneficial because of improved farm-scale biodiversity, predation and a net positive greenhouse gas mitigation effect. The benefits are primarily a consequence of low inputs and the longer management cycles associated with second generation (2G) crops. If land use tensions are mitigated, reasonable yields obtained, and low carbon soils targeted, there are many cases where low-input perennial crops like miscanthus "[...] can provide significant GHG [greenhouse gas] savings compared to fossil fuel alternatives [...]." In contrast to annual crops, miscanthus has low nitrogen input requirements and low GHG emissions, sequesters soil carbon due to reduced tillage, and can be economically viable on marginal land. Researchers agree that in recent years, "[...] a more nuanced understanding of the environmental benefits and risks of bioenergy has emerged, and it has become clear that perennial bioenergy crops have far greater potential to deliver significant GHG savings than the conventional crops currently being grown for biofuel production around the world (e.g. corn, palm oil and oilseed rape)." They also agree that "[...] the direct impacts of dedicated perennial bioenergy crops on soil carbon and N2O are increasingly well understood, and are often consistent with significant lifecycle GHG mitigation from bioenergy relative to conventional energy sources."
https://en.wikipedia.org/wiki?curid=8202316
591,866
2,181,309
In early 1949, Salmon went from the Dominion Museum to Victoria University College; he would lecture in zoology for nearly three decades. During that time, he became a world authority on collembola. He was president of the Entomological Society of New Zealand from 1955 to 1957. In 1966, he became the head of the zoological department and gave it a more modern outlook. He also became a noted conservationist and argued strongly, with the help of professional bodies, against power projects that flooded or permanently changed sites of significant beauty or ecological value. One key project that public protest could not prevent was the conversion of the Aratiatia Rapids as part of the Aratiatia Power Station. Salmon published the book "Heritage destroyed: the crisis in scenery preservation in New Zealand" in 1960, and it became an important text that helped shape the country's conservation movement, then in its infancy. In response, the government set up the Nature Conservation Council in 1962. He was then engaged with the Save Manapouri campaign and joined its national committee. His profile helped him become deputy president of the Royal Forest and Bird Protection Society in 1971, and he was instrumental in forming an alliance of environmental groups to oppose the Manapouri project. The Labour Party made support for the group's aims an election issue, and it helped them win the 1972 general election.
https://en.wikipedia.org/wiki?curid=70816676
2,180,063
138,115
One of the primary postwar users of the Avenger was the Royal Canadian Navy, which obtained 125 former US Navy TBM-3E Avengers from 1950 to 1952 to replace their venerable Fairey Fireflies. By the time the Avengers were delivered, the RCN was shifting its primary focus to anti-submarine warfare (ASW), and the aircraft was rapidly becoming obsolete as an attack platform. Consequently, 98 of the RCN Avengers were fitted with an extensive number of novel ASW modifications, including radar, electronic countermeasures (ECM) equipment, and sonobuoys, and the upper ball turret was replaced with a sloping glass canopy that was better suited for observation duties. The modified Avengers were designated AS 3. A number of these aircraft were later fitted with a large magnetic anomaly detector (MAD) boom on the rear left side of the fuselage and were redesignated AS 3M. However, RCN leaders soon realized the Avenger's shortcomings as an ASW aircraft, and in 1954 they elected to replace the AS 3 with the Grumman S-2 Tracker, which offered longer range, greater load-carrying capacity for electronics and armament, and a second engine, a great safety benefit when flying long-range ASW patrols over frigid North Atlantic waters. As delivery of the new license-built CS2F Trackers began in 1957, the Avengers were shifted to training duties, and were officially retired in July 1960.
https://en.wikipedia.org/wiki?curid=507536
138,059
122,559
Around £10 million of Brimstones from the RAF stock were sold to the Royal Saudi Air Force for use on its Tornados. In April 2011, the RAF's Assistant Chief of the Air Staff Air Vice-Marshal Baz North reported that the missiles were "being sought by both the United States of America and the French" in the light of Brimstone's success in Libya. France's DGA procurement agency held meetings in late May 2011 to discuss a lightweight air-to-surface weapon for the Dassault Rafale aircraft; Stéphane Reb of the DGA would merely say that "Brimstone is a solution, but it's not the only option". In early 2014 the US Congress' House Armed Services Committee again showed interest in the missile; high-ranking members of the US armed services stated they "like it". The French Air Force was still considering a purchase in March 2012, with a prime consideration being lower collateral damage compared to the AASM missile. India made a request for information about integrating Brimstone on their Sukhoi Su-30MKI fighter fleet. In July 2014, it was revealed that the United States Navy was studying the Dual Mode Brimstone for use on the Boeing F/A-18E/F Super Hornet aircraft. The United States Army also considered the Brimstone as "an option" in its Joint Air-to-Ground Missile (JAGM) program, but selected Lockheed Martin's dual-mode seeker upgrade for the Hellfire missile. In September 2015, MBDA displayed the dual-mode Brimstone for the first time as a helicopter-mounted weapon to fulfill the British Army Air Corps' need for a future attack helicopter weapon for the AgustaWestland Apache. In January 2016, it was reported that Germany was considering arming its newly leased Heron TP drones with Brimstone. In March 2016, France was reportedly considering Brimstone for its Tiger attack helicopters.
https://en.wikipedia.org/wiki?curid=780295
122,510
43,210
Y2K was also exploited by some fundamentalist and charismatic Christian leaders throughout the Western world, particularly in North America and Australia. Their promotion of the perceived risks of Y2K was combined with end times thinking and apocalyptic prophecies in an attempt to influence followers. The "New York Times" reported in late 1999, "The Rev. Jerry Falwell suggested that Y2K would be the confirmation of Christian prophecy — God's instrument to shake this nation, to humble this nation. The Y2K crisis might incite a worldwide revival that would lead to the rapture of the church. Along with many survivalists, Mr. Falwell advised stocking up on food and guns". Adherents in these movements were encouraged to engage in food hoarding and take lessons in self-sufficiency, and the more extreme elements planned for a total collapse of modern society. The "Chicago Tribune" reported that some large fundamentalist churches, motivated by Y2K, were the sites of flea-market-like sales of paraphernalia, ranging from gold coins to wood-burning stoves, designed to help people survive a crisis of the social order. Betsy Hart, writing for the "Deseret News", reported that many of the more extreme evangelicals used Y2K to promote a political agenda in which the downfall of the government was a desired outcome in order to usher in Christ's reign. She also noted that, "the cold truth is that preaching chaos is profitable and calm doesn't sell many tapes or books". These types of fears and conspiracies were described dramatically by New Zealand-based Christian prophetic author and preacher Barry Smith in his publication, "I Spy with my Little Eye", where he dedicated a whole chapter to Y2K. Some expected, at times through so-called prophecies, that Y2K would be the beginning of a worldwide Christian revival.
https://en.wikipedia.org/wiki?curid=25739013
43,194
355,899
Although prostheses in general supplement lost or damaged body parts with the integration of a mechanical artifice, bionic implants in medicine allow model organs or body parts to mimic the original function more closely. Michael Chorost wrote a memoir of his experience with cochlear implants, or bionic ears, titled "Rebuilt: How Becoming Part Computer Made Me More Human". Jesse Sullivan became one of the first people to operate a fully robotic limb through a nerve-muscle graft, giving him a complex range of motions beyond that of previous prosthetics. By 2004, a fully functioning artificial heart had been developed. The continued technological development of bionic and (bio-)nanotechnologies begins to raise the question of enhancement, and of the future possibilities for cyborgs which surpass the original functionality of the biological model. The ethics and desirability of "enhancement prosthetics" have been debated; their proponents include the transhumanist movement, with its belief that new technologies can assist the human race in developing beyond its present, normative limitations such as aging and disease, as well as other, more general inabilities, such as limitations on speed, strength, endurance, and intelligence. Opponents of the concept describe what they believe to be biases which propel the development and acceptance of such technologies; namely, a bias towards functionality and efficiency that may compel assent to a view of human people which de-emphasizes as defining characteristics actual manifestations of humanity and personhood, in favor of definition in terms of upgrades, versions, and utility.
https://en.wikipedia.org/wiki?curid=20756967
355,716
458,968
The PowerPC 601 was the first generation of microprocessors to support the basic 32-bit PowerPC instruction set. The design effort started in earnest in mid-1991 and the first prototype chips were available in October 1992. The first 601 processors were introduced in an IBM RS/6000 workstation in October 1993 (alongside its more powerful multichip cousin, the IBM POWER2 line of processors) and in the first Apple Power Macintoshes on March 14, 1994. The 601 was the first advanced single-chip implementation of the POWER/PowerPC architecture, designed on a crash schedule to establish PowerPC in the marketplace and cement the AIM alliance. To achieve this extremely aggressive schedule while including significant new functionality (such as substantial performance enhancements, new instructions and, importantly, POWER/PowerPC's first symmetric multiprocessing (SMP) implementation), the design leveraged a number of key technologies and project management strategies. The 601 team reused much of the basic structure and portions of the IBM RISC Single Chip (RSC) processor, while also adding support for the vast majority of the new PowerPC instructions not in the POWER instruction set. Nearly every portion of the RSC design was modified, and many design blocks were substantially reworked or completely redesigned to accommodate the completely different unified I/O bus structure and the SMP/memory coherency support; even so, leveraging the basic RSC structure was very beneficial in reducing the uncertainty in chip area/floorplanning and timing analysis/tuning. Worth noting is that the 601 not only implemented substantial new key functions such as SMP, but also acted as a bridge between POWER and the future PowerPC processors to assist IBM and software developers in their transitions to PowerPC. From the start of design to tape-out of the first 601 prototype took just 12 months, reflecting the push to establish PowerPC on the market early.
https://en.wikipedia.org/wiki?curid=8632684
458,744
533,733
Many people do not have a heating, ventilation, and air conditioning (HVAC) system in their homes because they believe it is too expensive. According to the article "Save Money Through Energy Efficiency", however, HVAC is not as expensive as one may think. The article advises that the main thing to look for when shopping for an HVAC system is that it runs efficiently, and in particular to check the EnergyGuide sticker on the machine. Although some buyers dismiss the sticker as a sales aid, many newer HVAC systems carrying the yellow EnergyGuide sticker have saved customers hundreds to thousands of dollars, depending on how much the system is used. The sticker on many newer systems displays the average cost of running that machine. Once a suitable HVAC system has been chosen, it should be run regularly even if it is only needed during specific times of year: a system that sits idle should be turned on and left running for ten to fifteen minutes each month. A system that is run frequently, on the other hand, needs regular maintenance, which includes changing the air filter, inspecting the areas where air intake takes place, and checking for leaks. These three steps are essential to keeping an HVAC system running for a long time and should be carried out every couple of months, or whenever a problem is suspected. Signs of a potential problem include air that is not cool enough, which can be due to a leak in the cooling fluid, and a bad smell in the supplied air, which often means the air filters need to be replaced. Changing the air filters is particularly important because, depending on where the system sits in the home, they are exposed to a great deal of dust, which builds up over time.
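As a rough illustration of how the EnergyGuide operating-cost figure can feed into a purchase decision, the short sketch below compares the total cost of owning two hypothetical units; the prices, yearly operating costs and 15-year service life are made-up assumptions, not values from the article.

# Rough sketch (hypothetical numbers): comparing lifetime cost of two HVAC units
# using the kind of estimated yearly operating cost shown on an EnergyGuide label.
def lifetime_cost(purchase_price, yearly_operating_cost, years=15):
    """Total cost of ownership over an assumed 15-year service life."""
    return purchase_price + yearly_operating_cost * years

standard_unit = lifetime_cost(purchase_price=3000, yearly_operating_cost=900)
efficient_unit = lifetime_cost(purchase_price=3800, yearly_operating_cost=650)

print(f"Standard unit:  ${standard_unit:,.0f}")
print(f"Efficient unit: ${efficient_unit:,.0f}")
print(f"Savings over 15 years: ${standard_unit - efficient_unit:,.0f}")

With these assumed numbers the more efficient unit comes out a few thousand dollars cheaper over its life, which is the order of magnitude of savings the article describes.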
https://en.wikipedia.org/wiki?curid=396508
533,454
148,560
The inhibition of protein synthesis is mediated through aminoglycosides' energy-dependent, sometimes irreversible binding, to the cytosolic, membrane-associated bacterial ribosome (image at right). (Aminoglycosides first cross bacterial cell walls—lipopolysaccharide in gram-negative bacteria—and cell membranes, where they are actively transported.) While specific steps in protein synthesis affected may vary somewhat between specific aminoglycoside agents, as can their affinity and degree of binding, aminoglycoside presence in the cytosol generally disturbs peptide elongation at the 30S ribosomal subunit, giving rise to inaccurate mRNA translation and therefore biosynthesis of proteins that are truncated, or bear altered amino acid compositions at particular points. Specifically, binding impairs translational proofreading leading to misreading of the RNA message, premature termination, or both, and so to inaccuracy of the translated protein product. The subset of aberrant proteins that are incorporated into the bacterial cell membrane may then lead to changes in its permeability and then to "further stimulation of aminoglycoside transport". The amino sugar portion of this class of molecules (e.g., the 2-deoxystreptamine in kanamycins, gentamicins, and tobramycin, see above) are implicated in the association of the small molecule with ribosomal structures that lead to the infidelities in translation (ibid.). Inhibition of ribosomal translocation—i.e., movement of the peptidyl-tRNA from the A- to the P-site—has also been suggested. Recent single-molecule tracking experiments in live "E. coli" showed an ongoing but slower protein synthesis upon treatment with different aminoglycoside drugs. (Spectinomycin, a related but distinct chemical structure class often discussed with aminoglycosides, does not induce mRNA misreading and is generally not bactericidal.)
https://en.wikipedia.org/wiki?curid=617210
148,499
1,651,965
Tropical peatlands occur extensively in Southeast Asia, mainland East Asia, the Caribbean and Central America, South America and southern Africa. Often located in lowlands, tropical peatlands are uniquely identified by rapid rates of peat soil formation under high precipitation and high temperature regimes. At the same time, a high temperature climate accelerates decomposition rates, causing degraded tropical peatlands to contribute more substantially to global greenhouse gas emissions. Although tropical peatlands cover only 587,000 km², they store 119.2 gigatonnes of carbon at a density per unit area of 203,066 tonnes C per km². For decades, these large carbon stores have succumbed to draining in order to cater for humanity's socio-economic needs. Between 1990 and 2015, cultivation (for management including industrial and small-holder agriculture) increased from 11 to 50% of forested peatlands in Peninsular Malaysia, Sumatra, and Borneo. In Malaysia and Indonesia in the last twenty years, peat swamp forests have retreated from covering 77% of peatlands to 36%, endangering many mammals and birds in the region. In 2010, industrial agriculture covered about 3-3.1 million hectares, with oil palm accounting for 2.15 million hectares of this area. The conversion of natural tropical peatlands into other land uses leads to peat fires and the associated health effects, soil subsidence that increases flood risks, substantial greenhouse gas emissions and loss of biodiversity. Today efforts are being made to restore degraded tropical peatlands through paludiculture. Paludiculture is researched as a sustainable solution to reduce and reverse the degradation of peat swamp forests, and includes traditional local agricultural practices which predate the use of the term. Commercial paludiculture has not been trialled to the extent that it has in northern peatlands. Below are examples of paludiculture practices in tropical peatlands.
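The quoted carbon density follows directly from the area and total stock given above; the snippet below is only an arithmetic cross-check of those two published numbers, not additional data.

# Cross-check: carbon stock per unit area for tropical peatlands
area_km2 = 587_000          # quoted extent of tropical peatlands, km^2
carbon_stock_t = 119.2e9    # 119.2 gigatonnes of carbon, in tonnes

density_t_per_km2 = carbon_stock_t / area_km2
print(f"{density_t_per_km2:,.0f} tonnes C per km^2")  # prints about 203,066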
https://en.wikipedia.org/wiki?curid=54076293
1,651,033
305,909
The distinctive four-flippered body-shape has caused considerable speculation about what kind of stroke plesiosaurs used. The only modern group with four flippers are the sea turtles, which only use the front pair for propulsion. Conybeare and Buckland had already compared the flippers with bird wings. However, such a comparison was not very informative, as the mechanics of bird flight in this period were poorly understood. By the middle of the nineteenth century, it was typically assumed that plesiosaurs employed a rowing movement. The flippers would have been moved forward in a horizontal position, to minimise friction, and then axially rotated to a vertical position in order to be pulled to the rear, causing the largest possible reactive force. In fact, such a method would be very inefficient: the recovery stroke in this case generates no thrust and the rear stroke generates an enormous turbulence. In the early twentieth century, the newly discovered principles of bird flight suggested to several researchers that plesiosaurs, like turtles and penguins, made a flying movement while swimming. This was e.g. proposed by Eberhard Fraas in 1905, and in 1908 by Othenio Abel. When flying, the flipper movement is more vertical, its point describing an oval or "8". Ideally, the flipper is first moved obliquely to the front and downwards and then, after a slight retraction and rotation, crosses this path from below to be pulled to the front and upwards. During both strokes, down and up, according to Bernoulli's principle, forward and upward thrust is generated by the convexly curved upper profile of the flipper, the front edge slightly inclined relative to the water flow, while turbulence is minimal. However, despite the evident advantages of such a swimming method, in 1924 the first systematic study on the musculature of plesiosaurs by David Meredith Seares Watson concluded they nevertheless performed a rowing movement.
https://en.wikipedia.org/wiki?curid=1398078
305,745
1,513,985
In 1946, The Arctic Research Laboratory was established under the contract of the Office of Naval Research in Point Barrow, Alaska for the purpose of investigating the physical and biological phenomena unique to the Arctic. In 1948, Dr. Laurence Irving was appointed as the Scientific Director of the Arctic Research Laboratory and put in charge of coordinating various projects. Scientists performed fieldwork to collect data that linked new observations to prior widely accepted knowledge. Through the processes of soil sampling, surveying and photographing landscapes and distributing salmon tags, scientists demonstrated the significance of historical case studies in the study of environmental science. The ability to compare past and present data allowed scientists to understand the causes and effects of ecological changes. Around this time, geographers from McGill University were developing new methods of studying geography in the North. As laboratory research was beginning to trump field research, McGill geographers implemented use of aviation in research, helping knowledge production to occur in the laboratory instead of in the field. Aviation allowed researchers to remould the way they studied the Northern landscape and indigenous people. Quick and easy travel using aircraft also promoted an integration of the Northern science with Southern community-based science, while changing the scale of ecology being studied. The ability to photograph, and observe the Arctic from an aircraft, provided researchers with a larger scope that allowed them to see a massive amount of space at one time, while also asserting objectivity. A photograph produces evidence, similar to laboratory data, yet it can be understood, circulated and accepted by the common people due to its aesthetic value.
https://en.wikipedia.org/wiki?curid=11180149
1,513,134
1,399,969
Pleural plaques are the most common manifestation of asbestos exposure, affecting up to 58% of asbestos-exposed workers. The prevalence among the general population exposed environmentally ranges from 0.53 to 8%. Pleural plaques are discrete circumscribed areas of hyaline fibrosis (patches of thickening) of the parietal pleura and rarely the visceral pleura that develop 20 to 40 years after first exposure. Over time, usually more than 30 years, they often become partly calcified. They consist of mature collagen fibers arranged in an open basket-weave pattern and are covered by flattened or cuboidal mesothelial cells. They have a white or pale yellow shaggy appearance and are typically distributed on the posterolateral chest wall, diaphragm, and mediastinal pleura. The number and size varies. Pleural plaques are typically asymptomatic, however, there is still some controversy on this topic. An association between pleural plaques and chest pain has been reported, but this has not been confirmed in more recent studies. Similarly, an association between pleural plaques and a restrictive impairment with diminished diffusing capacity on pulmonary function testing has been described. This has not been a consistent finding and it has been postulated that this might be related to undetected early fibrosis. The pathogenesis of pleural plaques remains uncertain. The most likely explanation is that asbestos fibres reach the parietal pleura by passage through lymphatic channels where they excite an inflammatory reaction. The chest X-ray is the usual tool for diagnosing pleural plaques but chest CT scan is more sensitive and specific in this regard. Pleural plaques are evidence of past asbestos exposure and indicate an increased risk for the future development of other asbestos-related diseases. Pleural plaques in themselves are not pre-malignant. Individuals with pleural plaques are usually not compensated in most compensation systems.
https://en.wikipedia.org/wiki?curid=37923694
1,399,184
1,516,803
Blessed José Gregorio Hernández (Isnotu, 1858-Caracas, 1919) graduated as a medical doctor at the "Universidad Central de Venezuela" in Caracas. The Venezuelan government awarded him a grant to continue his studies in Europe. Hernández traveled to Paris, France, where he studied other fields of medicine such as bacteriology, pathology, microbiology, histology, and physiology. Following his return to Venezuela, he became a leading doctor at the "Hospital José María Vargas". Between 1891 and 1916, Hernández dedicated himself to teaching, medicine, and religious practice. He sought the priesthood on two occasions, but his fragile physical condition ultimately prevented him from achieving that status. He studied at the "Monastery of Lucca" in Italy for ten months in 1908. In 1913, he enrolled at the "Latin American Pío School of Rome" to continue the priestly career, but had to return to Venezuela for health reasons. Among the scientific publications of this famous Venezuelan are "The Elements of Bacteriology" (1906), "About the Angina Pectoris of Malaric Origin" (1909) and "The Elements of Philosophy" (1912). Dr. Hernández treated the poor for free and even bought them medicines with his own money. One day in 1919, while bringing medicine to the home of one of his patients in Caracas, Hernández was struck by a car and killed. In 1986 Pope John Paul II solemnly declared his heroic virtues, for which he was granted the title of Venerable. After more than 80 years of investigation of the first miracle attributed to Hernández, he was beatified in Caracas, Venezuela, on 30 April 2021. He is currently in the process of canonization.
https://en.wikipedia.org/wiki?curid=29302481
1,515,951
346,466
Glyphosate has four ionizable sites, with pKa values of 2.0, 2.6, 5.6 and 10.6. Therefore, it is a zwitterion in aqueous solutions and is expected to exist almost entirely in zwitterionic forms in the environment. Zwitterions generally adsorb more strongly to soils containing organic carbon and clay than their neutral counterparts. Glyphosate strongly sorbs onto soil minerals, and, with the exception of colloid-facilitated transport, its soluble residues are expected to be poorly mobile in the free porewater of soils. The spatial extent of ground and surface water pollution is therefore considered to be relatively limited. Glyphosate is readily degraded by soil microbes to aminomethylphosphonic acid (AMPA, which like glyphosate strongly adsorbs to soil solids and is thus unlikely to leach to groundwater). Though both glyphosate and AMPA are commonly detected in water bodies, a portion of the AMPA detected may actually be the result of degradation of detergents rather than from glyphosate. Glyphosate does have the potential to contaminate surface waters due to its aquatic use patterns and through erosion, as it adsorbs to colloidal soil particles suspended in runoff. Detection in surface waters (particularly downstream from agricultural uses) has been reported as both broad and frequent by the United States Geological Survey (USGS) researchers, although other similar research found equal frequencies of detection in urban-dominated small streams. Rain events can trigger dissolved glyphosate loss in transport-prone soils. The mechanism of glyphosate sorption to soil is similar to that of phosphate fertilizers, the presence of which can reduce glyphosate sorption. Phosphate fertilizers are subject to release from sediments into water bodies under anaerobic conditions, and similar release can also occur with glyphosate, though significant impact of glyphosate release from sediments has not been established. Limited leaching can occur after high rainfall after application. If glyphosate reaches surface water, it is not broken down readily by water or sunlight.
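To illustrate what the four pKa values above imply for speciation in water, the sketch below evaluates the standard polyprotic acid distribution formula at a chosen pH. It is a generic textbook calculation, not a model from the cited environmental studies, and the example pH of 7 is an arbitrary choice.

# Sketch: fraction of each protonation state of a tetraprotic acid
# (e.g. glyphosate, pKa = 2.0, 2.6, 5.6, 10.6) at a given pH.
def speciation(pkas, ph):
    h = 10.0 ** (-ph)                       # proton concentration
    kas = [10.0 ** (-pka) for pka in pkas]  # acid dissociation constants
    n = len(kas)
    terms = []
    for i in range(n + 1):                  # i = number of protons removed
        prod_ka = 1.0
        for ka in kas[:i]:
            prod_ka *= ka
        terms.append(h ** (n - i) * prod_ka)
    total = sum(terms)
    return [t / total for t in terms]

fractions = speciation([2.0, 2.6, 5.6, 10.6], ph=7.0)
for i, f in enumerate(fractions):
    print(f"{i} protons removed: fraction {f:.3f}")
# At pH 7 the species with three protons removed dominates, since the pH lies
# between the third and fourth pKa values.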
https://en.wikipedia.org/wiki?curid=294295
346,285
1,857,485
Ulm joined MIT in January 1999, where he specialized in experimental nano- and micromechanics of cement-based materials, shales and bones. With his research group, he developed statistical nanoindentation techniques for hydrated nanocomposites; an indentation technique that permits via statistical analysis to link chemistry and mechanical behavior at 10s of nanometer length scales. In 2007, his group discovered that cementitious materials at nanoscale exhibit a nanogranular behavior driven by particle-to-particle contact forces. This nanogranular origin was found to drive much of the strength and durability performance of cementitious materials, such as the long-term creep behavior and fire resistance. A similar nanogranular nature was found for shales and bones. In 2008, he joined forces with a group of physicists and computational material scientists to combine experimental nanoindentation investigations with molecular and meso-scale simulations of cement hydrates. Specifically, the collaboration with Roland Pellenq, a CNRS Research Director and Senior Research Scientist at MIT, allowed identification of the molecular structure of calcium-silicate hydrates, the binding phase of concrete which lends its strength and durability to concrete at engineering scales. By incorporating concepts of glass physics and soft matter physics into cement science, Ulm and co-workers showed by means of a molecular combinatorial approach that concrete's fundamental strength could be elevated without changing the chemistry of its constituents, i.e. Calcium, Silica and Water. The handshake of molecular simulation techniques with experimental nanoscale experiments opened a novel way to address the concrete sustainability challenge to reduce the environmental impact while sustaining its role as the backbone material for housing, shelter and infrastructure worldwide. The challenge led, in 2009, to the foundation of the Concrete Sustainability Hub at MIT (CSHub@MIT). This novel industry–academia partnership between the Portland Cement Association (Skokie, Illinois), the National Ready Mixed Concrete Association (Silver Spring, Maryland) and MIT aims at reducing the environmental footprint of concrete, while enhancing its resilience from materials to structures and urban community scale.
https://en.wikipedia.org/wiki?curid=59593534
1,856,417
1,398,326
In humans, spatial cognition is closely related to how people talk about their environment, find their way in new surroundings, and plan routes. Thus a wide range of studies is based on participants’ reports, performance measures and the like, for example in order to determine the cognitive reference frames that allow subjects to perform such tasks. In this context the use of virtual reality is becoming more and more widespread among researchers, since it offers the opportunity to confront participants with unknown environments in a highly controlled manner. Spatial cognition can be seen from a psychological point of view, meaning that people’s behaviour within space is key. When people behave in space, they use cognitive maps, the most evolved form of spatial cognition. When using cognitive maps, information about landmarks and the routes between landmarks is stored and used. This knowledge can be built from various sources: from tightly coordinated vision and locomotion (movement), but also from map symbols, verbal descriptions, and computer-based pointing systems. According to Montello, space implicitly refers to a person’s body and its associated actions. He distinguishes different kinds of space: figural space, which is smaller than the body; vista space, which is more extended than the human body; environmental space, which is learned by locomotion; and geographical space, which is the largest and can only be learned through cartographic representation. However, since space is represented in the human brain, this representation can also be distorted. Distortions occur in the perception of space and distance. Distances are perceived differently depending on whether they are considered between a given location and a location of high cognitive salience, meaning one that stands out. Perceived locations and distances are often judged relative to “reference points”, which are better known than other places, more frequently visited and more visible. There are other kinds of distortion as well, notably distortion in distance estimation and distortion in angle alignment. Distortion in angle alignment means that one’s personal north is treated as “the north”: the map is mentally represented according to the orientation of the personal point of view at the time of learning. Since perceived distance is “subjective” and not necessarily correlated with “objective distance”, distortions arise here too: distances tend to be overestimated for downtown routes, routes with turns, curved routes, and routes that cross borders or obstacles.
https://en.wikipedia.org/wiki?curid=33429851
1,397,553
97,915
Two researchers, S. Christopher Bennett in 1996, and paleoartist David Peters in 2000, published analyses finding pterosaurs to be protorosaurs or closely related to them. However, Peters gathered novel anatomical data using an unverified technique called "Digital Graphic Segregation" (DGS), which involves digitally tracing over images of pterosaur fossils using photo editing software. Bennett only recovered pterosaurs as close relatives of the protorosaurs after removing characteristics of the hindlimb from his analysis, to test the possibility of locomotion-based convergent evolution between pterosaurs and dinosaurs. A 2007 reply by Dave Hone and Michael Benton could not reproduce this result, finding pterosaurs to be closely related to dinosaurs even without hindlimb characters. They also criticized David Peters for drawing conclusions without access to the primary evidence, that is, the pterosaur fossils themselves. Hone and Benton concluded that, although more basal pterosauromorphs are needed to clarify their relationships, current evidence indicates that pterosaurs are avemetatarsalians, as either the sister group of "Scleromochlus" or a branch between the latter and "Lagosuchus". An 2011 archosaur-focused phylogenetic analysis by Sterling Nesbitt benefited from far more data and found strong support for pterosaurs being avemetatarsalians, though "Scleromochlus" was not included due to its poor preservation. A 2016 archosauromorph-focused study by Martin Ezcurra included various proposed pterosaur relatives, yet also found pterosaurs to be closer to dinosaurs and unrelated to more basal taxa. Working from his 1996 analysis, Bennett published a 2020 study on "Scleromochlus" which argued that both "Scleromochlus" and pterosaurs were non-archosaur archosauromorphs, albeit not particularly closely related to each other. By contrast, a later 2020 study proposed that lagerpetid archosaurs were the sister clade to pterosauria. This was based on newly described fossil skulls and forelimbs showing various anatomical similarities with pterosaurs and reconstructions of lagerpetid brains and sensory systems based on CT scans also showing neuroanatomical similarities with pterosaurs. The results of the latter study were subsequently supported by an independent analysis of early pterosauromorph interrelationships.
https://en.wikipedia.org/wiki?curid=24824
97,874
98,319
The series’ creator Gene Roddenberry had first proposed a "Star Trek" feature at the 1968 World Science Fiction Convention. The movie was to have been set before the television series, showing how the crew of the "Enterprise" met. The popularity of the syndicated "Star Trek" caused Paramount and Roddenberry to begin developing the film in May 1975. Roddenberry was allocated $3 to $5 million to develop a script. By June 30, he had produced what he considered an acceptable script, but studio executives disagreed. This first draft, "", featured a grounded Admiral Kirk assembling the old crew on the refitted "Enterprise" to clash with a godlike entity many miles across, hurtling towards Earth. The object turns out to be a super-advanced computer, the remains of a scheming race who were cast out of their dimension. Kirk wins out, the entity returns to its dimension, and the "Enterprise" crew resumes their voyages. The basic premise and scenes such as a transporter accident and Spock's Vulcan ritual were discarded, but later returned to the script. The film was postponed until spring (March/April) 1976 while Paramount fielded new scripts for "Star Trek II" (the working title) from acclaimed writers such as Ray Bradbury, Theodore Sturgeon, and Harlan Ellison. Ellison's story had a snake-like alien race tampering with Earth's history to create a kindred race; Kirk reunites with his old crew, but they are faced with the dilemma of killing off the reptilian race in Earth's prehistory just to maintain humanity's dominance. When Ellison presented his idea, an executive suggested that Ellison read "Chariots of the Gods?" and include the Maya civilization into his story, which enraged the writer because he knew Maya did not exist at the dawn of time. By October 1976, Robert Silverberg had been signed to work on the screenplay along with a second writer, John D. F. Black, whose treatment featured a black hole that threatened to consume all of existence. Roddenberry teamed up with Jon Povill to write a new story that featured the "Enterprise" crew setting an altered universe right by time travel; like Black's idea, Paramount did not consider it epic enough.
https://en.wikipedia.org/wiki?curid=277006
98,278
706,890
While the defects were not prohibitive and the Mark 33 remained in production until fairly late in World War II, the Bureau started the development of an improved director in 1936, only 2 years after the first installation of a Mark 33. The objective of weight reduction was not met, since the resulting director system actually weighed more than the equipment it was slated to replace, but the Gun Director Mark 37 that emerged from the program possessed virtues that more than compensated for its extra weight. Though the gun orders it provided were the same as those of the Mark 33, it supplied them with greater reliability and gave generally improved performance with gun batteries, whether they were used for surface or antiaircraft fire. Moreover, the stable element and computer, instead of being contained in the director housing, were installed below deck, where they were less vulnerable to attack and less of a jeopardy to a ship's stability. The design provided for the ultimate addition of radar, which later permitted blind firing with the director. In fact, the Mark 37 system was almost continually improved. By the end of 1945 the equipment had run through 92 modifications—almost twice the total number of directors of that type which were in the fleet on December 7, 1941. Procurement ultimately totalled 841 units, representing an investment of well over $148,000,000. Destroyers, cruisers, battleships, carriers, and many auxiliaries used the directors, with individual installations varying from one aboard destroyers to four on each battleship. The development of the Gun Directors Mark 33 and 37 provided the United States Fleet with good long range fire control against attacking planes. But while that had seemed the most pressing problem at the time the equipment was placed under development, it was but one part of the total problem of air defense. At close-in ranges the accuracy of the directors fell off sharply; even at intermediate ranges they left much to be desired. The weight and size of the equipment militated against rapid movement, making the directors difficult to shift from one target to another. Their efficiency was thus in inverse proportion to the proximity of danger.
https://en.wikipedia.org/wiki?curid=21169396
706,521
929,504
Since its construction, the telescope was upgraded several times, following the transfer of the facility's oversight from the DoD to the National Science Foundation on October 1, 1969, and the subsequent renaming of the AIO to the National Astronomy and Ionosphere Center (NAIC) in September 1971. Initially, when the maximum expected operating frequency was about 500 MHz, the surface consisted of half-inch galvanized wire mesh laid directly on the support cables. In 1973, a high-precision surface consisting of 38,000 individually adjustable aluminum panels replaced the old wire mesh, and the highest usable frequency rose to about 5000 MHz. A Gregorian reflector system was installed in 1997, incorporating secondary and tertiary reflectors to focus radio waves at a single point. This allowed the installation of a suite of receivers, covering the full 1–10 GHz range, that could be easily moved to the focal point, giving Arecibo more flexibility. The additional instrumentation added weight to the platform, so six additional support cables were installed, two for each tower. A metal mesh screen was also installed around the perimeter to block the ground's thermal radiation from reaching the feed antennas. In 1997, a more powerful 2400 MHz transmitter was added. Finally, in 2013, grant-funded work began on adding the ionospheric modification HF facility, which was completed in 2015. On the transmitting side, the HF facility consisted of six foldable 100 kW crossed dipoles inside the main dish and a 100 m wide subreflector mesh hung between the dish and the platform.
https://en.wikipedia.org/wiki?curid=9013135
929,013
1,435,338
In 1941-45, during the battles at the Eastern Front of World War II, all research work of the Institute was focused on the development of the arms industry and on assistance to arms production, including at the industrial enterprises evacuated to Tomsk. All research work of the universities was coordinated by the Scientists’ Committee, formed in June 1941 on the initiative of a number of professors of Tomsk universities. The Committee included Tomsk scientists, among them professors of Kirov Tomsk Industrial Institute (Innokenty Butakov, Innokenty Gebler, Mikhail Korovin). One of the Vice-Chairmen of the Committee was Konstantin Shmargunov, the Rector of the Institute. TPU alumni headed many USSR plants producing tanks and weapons. Timofey Gorbachev, a 1928 alumnus of Dzerzhinsky Siberian Technological Institute, headed the department of the Kuzbassugol Building Complex. Nikolay Kamov, a 1923 alumnus of the Mechanical Faculty of Tomsk Technological Institute, became chief engineer at the Design Bureau producing helicopters. Vladimir Kozhevin, a 1934 alumnus of the Mining Department of the Siberian Institute of Mechanical Engineering, from 1941 held the position of head of the Engineering Office and deputy chief engineer at the OsinnikiUgol Trust of the Kuzbassugol Building Complex, where coal of the most valuable types, required for the metallurgical and defense industries of Russia, was produced. In 1942, he was appointed chief engineer and head of mine No. 10 of the OsinnikiUgol Trust. From 1943 until the end of the battles at the Eastern Front, the mine maintained the ideals of the USSR State Defense Committee. Valery Kuznetsov, a 1932 alumnus of the Geological Prospecting Faculty of the Siberian Institute of Mechanical Engineering, was in charge of drawing geologic maps at the West-Siberian Geology Administration during the war years. These maps were required for mineral exploration, the need for which rose sharply during that period. Over 700 people, including students, academic staff, employees and volunteers, went to war. They took part in many battles, and only a few of them reached Berlin, leaving their signatures on the walls of the Reichstag building.
https://en.wikipedia.org/wiki?curid=4563685
1,434,532
32,118
His research into weather systems and meteorological prediction led him to propose manipulating the environment by spreading colorants on the polar ice caps to enhance absorption of solar radiation (by reducing the albedo), thereby inducing global warming. Von Neumann proposed a theory of global warming as a result of the activity of humans, noting how little colder the Earth was during the last glacial period. He wrote in 1955: "Carbon dioxide released into the atmosphere by industry's burning of coal and oil - more than half of it during the last generation - may have changed the atmosphere's composition sufficiently to account for a general warming of the world by about one degree Fahrenheit." However, von Neumann urged a degree of caution in any program of intentional human weather manufacturing: "What "could" be done, of course, is no index to what "should" be done... In fact, to evaluate the ultimate consequences of either a general cooling or a general heating would be a complex matter. Changes would affect the level of the seas, and hence the habitability of the continental coastal shelves; the evaporation of the seas, and hence general precipitation and glaciation levels; and so on... But there is little doubt that one "could" carry out the necessary analyses needed to predict the results, intervene on any desired scale, and ultimately achieve rather fantastic results." He also warned that weather and climate control could have military uses, telling Congress in 1956 that they could pose an even bigger risk than ICBMs. Although he died the next year, this continued advocacy ensured that there would be sustained interest in and funding for such research during the Cold War.
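To see why lowering the albedo warms the planet in this line of argument, a zero-dimensional radiative balance is enough. The sketch below is a standard textbook estimate of the effective emission temperature, not a reconstruction of von Neumann's own calculation; the albedo values are illustrative assumptions.

# Zero-dimensional energy balance: sigma * T^4 = S * (1 - albedo) / 4
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0        # solar constant, W m^-2

def emission_temperature(albedo):
    """Effective emission temperature of a planet with the given albedo."""
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

t_now = emission_temperature(0.30)     # approximate present-day planetary albedo
t_darker = emission_temperature(0.29)  # slightly darkened (e.g. colored ice caps)
print(f"Warming of about {t_darker - t_now:.2f} K for a 0.01 drop in albedo")

Even a one-percentage-point reduction in albedo raises the effective emission temperature by roughly a degree in this toy model, which illustrates why darkening the ice caps was seen as a lever on global temperature.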
https://en.wikipedia.org/wiki?curid=15942
32,106
818,592
Hypogammaglobulinemia can be caused by either a primary or a secondary immunodeficiency. Primary immunodeficiencies are caused by a mutation or series of mutations in the genome. For example, a study from 2012 found that a compound heterozygous deleterious mutation in the CD21 gene is associated with hypogammaglobulinemia. Genetic analysis revealed the patient was heterozygous for CD21, with the paternally inherited allele (also shared with one sister) having a disrupted splicing donor site at exon 6, while the maternally inherited allele had a mutation resulting in a premature stop codon in exon 13. Neither mutation was found in 100 healthy control subjects, showing the rarity of the mutations. Around 300 different genes in total have been identified which account for different forms of primary immunodeficiency (PID). These different forms can affect different parts of the immune system, including immunoglobulin production. Primary immunodeficiencies usually have a delay of several years between initial clinical presentation and diagnosis. Some primary immune deficiencies include ataxia-telangiectasia (A-T), autosomal recessive agammaglobulinemia (ARA), common variable immunodeficiency (CVID), hyper-IgM syndromes, IgG subclass deficiency, isolated non-IgG immunoglobulin deficiencies, severe combined immunodeficiency (SCID), specific antibody deficiency (SAD), Wiskott-Aldrich syndrome, and X-linked agammaglobulinemia. CVID is the most common form of primary immunodeficiency. SCID is considered a medical emergency and suspected cases require immediate specialist center referral for diagnosis and treatment. More often, however, hypogammaglobulinemia develops as a result of another condition; these cases are called secondary or acquired immune deficiencies. Causes include blood cancers such as chronic lymphocytic leukemia (CLL), lymphoma, or myeloma, as well as HIV, nephrotic syndrome, poor nutrition, protein-losing enteropathy, organ transplantation, and radiation therapy. Secondary hypogammaglobulinemia can also be caused by medications such as corticosteroids, chemotherapy drugs, or antiseizure medication.
https://en.wikipedia.org/wiki?curid=2801308
818,153
1,591,629
The scientific purpose of the Aeronomy of Ice in the Mesosphere (AIM) mission is focused on the study of polar mesospheric clouds (PMCs) that form high in the mesosphere above the surface of Earth in summer, mostly in the polar regions. The overall goal is to resolve why PMCs form and why they vary. AIM's expected lifetime was at least two years. AIM measures PMCs and the thermal, chemical and dynamical environment in which they form. This will allow the connection to be made between these clouds and the meteorology of the polar mesospheric summer echoes. This connection is important because a significant variability in the yearly number of noctilucent ("glow in the dark") clouds (NLCs), one manifestation of PMCs, has been suggested as an indicator of global change. The body of data collected by AIM will provide the basis for a rigorous study of PMCs that can be reliably used to study past PMC changes, present trends and their relationship to global change. In the end, AIM will provide an expanded basis for the study of long-term variability in the climate of Earth. The AIM scientific objectives will be achieved by measuring near-simultaneous PMC abundances, PMC spatial distributions, cloud particle size distributions, gravity wave activity, the cosmic dust influx to the atmosphere (needed to study the role of these particles as nucleation sites), and precise vertical profiles of temperature, OH, NO, other trace gases, and aerosols. AIM carries three instruments: an infrared solar occultation differential absorption radiometer, built by the Space Dynamics Laboratory, Utah State University (Solar Occultation for Ice Experiment - SOFIE); a panoramic ultraviolet imager (Cloud Imaging and Particle Size Experiment - CIPS); and an in situ dust detector (Cosmic Dust Experiment - CDE), the latter two designed and built by the Laboratory for Atmospheric and Space Physics, University of Colorado. Ball Aerospace & Technologies Corporation constructed the spacecraft bus, and GATS, Inc., Newport News, Virginia, led the data management effort.
https://en.wikipedia.org/wiki?curid=5306982
1,590,733
469,440
There are two main sources of new research projects, namely ideas originating from the researchers themselves ("supply push") and those coming from customers ("demand pull"). Ideas for new processes typically originate from researchers, ideas for new products from customers or customer contacts. Particularly in custom manufacturing, "demand pull" prevails in industrial reality. The "new product committee" is the body of choice for evaluating new and monitoring ongoing research activities. Its assignment is to evaluate all new product ideas. It decides whether a new product idea should be taken up in research, reassesses a project at regular intervals and, last but not least, decides on the abandonment of a project once it becomes evident that the objectives cannot be reached. In a typical project the overall responsibility for the economic and technical success lies with the project champion, who is assisted by the project manager, responsible for the technical success. In custom manufacturing, a typical project starts with the acceptance of the product idea, which originates mainly from business development, by the new product committee, followed by the preparation of a laboratory process, and ends with the successful completion of demonstration runs on industrial scale and the signature of a multiyear supply contract. The input from the customer is contained in the "technology package". Its main constituents are (1) the reaction scheme, (2) the target of the project and deliverables (product, quantity, required dates, specifications), (3) a list of analytical methods, (4) process development opportunities (stepwise assessment), (5) a list of required reports, (6) Safety, Health and Environment (SHE) issues, (7) materials to be supplied by the customer and (8) packaging and shipping information. The technical part of a project usually determines its duration. Depending on the quality of the information contained in the "technology package" received from the customer and on the complexity of the project as such, particularly the number of steps that have to be performed, it can be anything between 12 and 24 months. Depending on the number of researchers involved, the total budget easily amounts to several million US dollars.
https://en.wikipedia.org/wiki?curid=3694845
469,204
1,029,435
Mapping of the outcrop, as well as core hole data and samples taken during the bench-making process, is taken into account to best project the panels that the highwall miner will cut. Obstacles that could potentially be damaged by subsidence, as well as the natural contour of the highwall mine, are taken into account, and a surveyor points the highwall miner along a line (the theoretical survey plot-line) mostly perpendicular to the highwall. Parallel lines represent the drives cut into the mountain, up to considerable depths (2015 records). Mining without heading control or corrective steering actuation on a navigation azimuth results in missing a portion of the coal seam and creates a potential danger of cutting into pillars from previously mined drives, due to horizontal drift (roll) of the pushbeam-cutter module string. Recently, highwall miners have penetrated record distances into the coal seam (2015 ongoing records), and today's models are capable of going farther with the support of gyro navigation, no longer limited by the amount of cable stored on the machine. The maximum depth is determined by the stress of further penetration and the associated specific power draw (torsion and tension in the screw-transporter string), but today's optimized screw-transporter conveying assemblies (called pushbeams), refined with visual product development and discrete element modeling (DEM) flow simulation software, show that smart-drive extended penetrations are possible, even at steep angles of more than 30 degrees downhole from horizontal. For such significantly steep mining, the method is better described as "directional mining"; directional drilling and directional mining are commonly categorized together, as a valuable synergy, under "surface to in-seam" (SIS) techniques. These drives can be mined dry or wet: either dewatering is developed, or cutting and dredging through the screw transporters is used, approaches that feature in the development roadmap of the leading global highwall mining engineering company.
https://en.wikipedia.org/wiki?curid=1702997
1,028,901
1,559,780
Poliomyelitis is a crippling disease which attacks the body's motor neurons. In the early 20th century, epidemics of polio began to hit the United States and other industrialized countries. As hospitals filled with patients in iron lungs, and tens of thousands were left crippled, fear of contracting polio grew rampant and led to the closing of many public facilities. By 1952, the polio epidemic reached new heights in the U.S. with 57,628 cases reported. Meanwhile, in 1947, Jonas Salk had been recruited to Pitt where he set up the University's Virus Research Lab in the basement of what is now Pitt's Salk Hall. By 1951, Salk and his team had begun immunization experiments in monkeys using dead polio virus. Soon, however, Salk began to test inoculations in paralyzed polio patients and by 1953 human trials among the general population were initiated (the majority were Allegheny County residents). By the spring of the following year, the largest controlled field trials in medical history were underway and by 1955 the "Pitt vaccine", developed by Salk and his team of Pitt researchers, was declared effective. By 1962, when Albert Sabin's oral live-virus vaccine was approved, Pitt's vaccine had reduced the incidence of polio in the United States by "95 percent". Together, these two vaccines eradicated naturally occurring poliomyelitis from North and South America, and Western Europe. In 1999 the U.S. Office of Public Health and Science recommended returning to the use of the Pitt dead-virus vaccine for routine inoculation. The breakthroughs in immunology and vaccine development at Pitt by Salk and his team are considered one of the most significant scientific and medical achievements in history.
https://en.wikipedia.org/wiki?curid=13999007
1,558,894
295,391
Though still largely based on pre-industrial era materials and designs, ships greatly improved during the early Industrial Revolution period (1760 to 1825), as "the risk of being wrecked for Atlantic shipping fell by one-third, and of foundering by two thirds, reflecting improvements in seaworthiness and navigation respectively." The improvements in seaworthiness have been credited to "replacing the traditional stepped deck ship with stronger flushed decked ones derived from Indian designs, and the increasing use of iron reinforcement." The design originated from Bengal rice ships, with Bengal being famous for its shipbuilding industry at the time. Iron was gradually adopted in ship construction, initially to provide stronger joints in a wooden hull e.g. as deck knees, hanging knees, knee riders and the other sharp joints, ones in which a curved, progressive joint could not be achieved. One study finds that there were considerable improvements in ship speed from 1750 to 1850: "we find that average sailing speeds of British ships in moderate to strong winds rose by nearly a third. Driving this steady progress seems to be the continuous evolution of sails and rigging, and improved hulls that allowed a greater area of sail to be set safely in a given wind. By contrast, looking at every voyage between the Netherlands and East Indies undertaken by the Dutch East India Company from 1595 to 1795, we find that journey time fell only by 10 percent, with no improvement in the heavy mortality, averaging six percent per voyage, of those aboard."
https://en.wikipedia.org/wiki?curid=187377
295,231
1,513,990
The rapid cooling that the earliest inhabitants experienced signaled the early onset of the Little Ice Age in the 1300s. This caused the sea ice to expand, making travel through Greenland and Iceland impossible to manage. It trapped the people in their homes and settlements, bringing trade to a stop (National Snow and Ice Data Center, 2020). Because of this, the people have adapted very well to these conditions, as seen in their use of animal hides in their clothing, of tree branches to make tepee shelters, and of ice blocks to make igloos that trap heat (Reference, 2020). In certain areas of the Arctic, many big-footed animals leave tracks and trails that lead to plants and trees bearing the roots, berries, nuts and other foods needed to survive, so hunters can more easily find their sources of food (Scholastic, 1961). It is extremely important for a hunter to be smart and really take in the surroundings: watching the way animals move and where prey may be located, and keeping an eye on the way the sea moves as well as the weather when planning to fish or hunt near the water (Vitebsky 2000). Because the temperature can be very extreme, it is very important to become accustomed to the climate: temperatures can drop to around 50 degrees below zero and may only rise to approximately 50 degrees during the summers (Business Insider, 2015).
https://en.wikipedia.org/wiki?curid=11180149
1,513,139
484,066
In fault detection and diagnosis, mathematical classification models, which belong to the supervised learning methods, are trained on the training set of a labeled dataset to accurately identify redundancies, faults and anomalous samples. Over the past decades, different classification and preprocessing models have been developed and proposed in this research area. The "k"-nearest-neighbors algorithm ("k"NN) is one of the oldest techniques used to solve fault detection and diagnosis problems. Despite the simple logic of this instance-based algorithm, it runs into problems with high dimensionality and processing time when used on large datasets. Since "k"NN is not able to automatically extract features to overcome the curse of dimensionality, it is often accompanied by data preprocessing techniques such as principal component analysis (PCA), linear discriminant analysis (LDA) or canonical correlation analysis (CCA) to reach better performance. In many industrial cases, the effectiveness of "k"NN has been compared with other methods, especially with more complex classification models such as support vector machines (SVMs), which are widely used in this field. Thanks to their nonlinear mapping using kernel methods, SVMs show impressive generalization performance, even with small training data. However, general SVMs do not perform automatic feature extraction themselves and, just like "k"NN, are often coupled with a data pre-processing technique. Another drawback of SVMs is that their performance is highly sensitive to the initial parameters, particularly to the kernel method, so a parameter tuning process has to be conducted first for each signal dataset. Therefore, the low speed of the training phase is a limitation of SVMs when it comes to their usage in fault detection and diagnosis cases.
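A minimal sketch of the kind of pipeline described above, assuming scikit-learn and a synthetic dataset in place of real labeled sensor signals; it compares PCA-preprocessed kNN against an RBF-kernel SVM and is illustrative only, not a benchmark from the cited literature.

# Illustrative comparison of PCA + kNN vs. SVM for a fault-classification task,
# using synthetic data as a stand-in for labeled process measurements.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic "sensor" data: 3 classes (normal operation plus two fault modes), 50 features.
X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "PCA + kNN": make_pipeline(StandardScaler(), PCA(n_components=10),
                               KNeighborsClassifier(n_neighbors=5)),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale")),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))

The PCA step in the first pipeline plays the dimensionality-reduction role the text describes for kNN, while the SVM pipeline leaves feature extraction to the kernel; in practice the SVM's C and kernel parameters would still need tuning per dataset.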
https://en.wikipedia.org/wiki?curid=21017316
483,817
1,645,309
Chemaxon's desktop-based cheminformatics product portfolio matured in the late 2000s. GlaxoSmithKline contracted with the company in late 2009. The company started to experiment with an agile software development approach in 2010, eventually adopting the scrum methodology in the following year. Chemaxon encourages an agile office environment with cross-functional teams, a philosophy supported by a publicly available online company culture guide. With the wave of change came new product development directions that mirrored industry trends towards the end of the 2000s. A demand for online services emerged in research, to allow lab colleagues to access chemical applications from any device, to enable contract research organizations and other suppliers to collaborate on chemical data, and to cut IT costs at the same time. Chemaxon started to build its cloud-based software systems in 2008 - the first one being Chemicalize - and has continuously expanded in this area. On the other hand, the rise of biologics among newly developed pharmaceutical drugs led Chemaxon to start developing its biopolymer informatics portfolio in 2015. In 2011, Chemaxon moved its office headquarters to Graphisoft Park, one of Budapest's tech hubs, where the company is currently located. More offices were established: in 2014 a US East Coast headquarters opened in Cambridge, Massachusetts, followed by an office opening in San Diego in 2018.
https://en.wikipedia.org/wiki?curid=10526566
1,644,381
87,323
Smallpox was eradicated by a massive international search for outbreaks, backed up with a vaccination program, starting in 1967. It was organised and co-ordinated by a World Health Organization (WHO) unit, set up and headed by Donald Henderson. The last case in the Americas occurred in 1971 (Brazil), south-east Asia (Indonesia) in 1972, and on the Indian subcontinent in 1975 (Bangladesh). After two years of intensive searches, what proved to be the last endemic case anywhere in the world occurred in Somalia, in October 1977. A Global Commission for the Certification of Smallpox Eradication chaired by Frank Fenner examined the evidence from, and visited where necessary, all countries where smallpox had been endemic. In December 1979 they concluded that smallpox had been eradicated; a conclusion endorsed by the WHO General Assembly in May 1980. However, even as the disease was being eradicated there still remained stocks of smallpox virus in many laboratories. Accelerated by two cases of smallpox in 1978, one fatal (Janet Parker), caused by an accidental and unexplained containment breach at a laboratory at the University of Birmingham Medical School, the WHO ensured that known stocks of smallpox virus were either destroyed or moved to safer laboratories. By 1979, only four laboratories were known to have smallpox virus. All English stocks held at St Mary's Hospital, London were transferred to more secure facilities at Porton Down and then to the U.S. at the Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia in 1982, and all South African stocks were destroyed in 1983. By 1984, the only known stocks were kept at the CDC in the U.S. and the State Research Center of Virology and Biotechnology (VECTOR) in Koltsovo, Russia. These states report that their repositories are for possible anti-bioweaponry research and insurance if some obscure reservoir of natural smallpox is discovered in the future.
https://en.wikipedia.org/wiki?curid=61088
87,288
2,203,381
Organic livestock are not immune to diseases, and are likely to share the same overall disease status with regard to contagious diseases at the national herd level. At all stages of the life cycle, animals may be exposed to disease, and in the case of organic swine this occurs without the availability of normal industry drugs and prophylactic medication; internal parasites are a large component of the risk. Cross-species exposure and contamination are direct concerns for many producers. Internal parasites and cohabitation of animals can lead to exposure, depending on local environmental conditions and herd population. In fact, in a 2003 study of 31 organic chicken flocks sampled for bacterial pathogens, 13% were positive for Salmonella and 35% for Campylobacter, with contact with other animals cited as the primary exposure route. Additionally, cross-contamination in integrated livestock operations can be problematic, but this depends on a multitude of variables, including the density of the livestock operations, the introduction of replacement breeding stock, the sharing of facilities, and so on. Pigs raised according to general organic certification requirements are as susceptible to diseases as conventional breeds, based on genetics, environment, and diet. Pork diseases vary widely by region and country. Generally, organic production requires extensive management and can be more labor-intensive than conventional systems. Many producers do give vaccines for E. coli, pseudorabies virus, Bordetella bronchiseptica, porcine parvovirus, and Erysipelothrix rhusiopathiae, among others. It is particularly important to tailor individual farm health security measures to the existing disease status of the swine stock, in order to guarantee the financial sustainability of the required investments.
https://en.wikipedia.org/wiki?curid=41194738
2,202,127
989,035
Buddhism rejects the concept of substance. Complex structures are comprehended as an aggregate of components without any essence. Just as the junction of parts is called cart, so the collections of elements are called things. All formations are unstable ("aniccā") and lacking any constant core or “self” ("anattā"). Physical objects have no metaphysical substrate. Arising entities hang on previous ones conditionally: in the notable teaching on interdependent origination, effects arise not as caused by agents but conditioned by former situations. Our senses, perception, feelings, wishes and consciousness are flowing, the view "satkāya-dṛṣṭi" of their permanent carrier is rejected as fallacious. The school of Madhyamaka, namely Nāgārjuna, introduced the idea of the ontological void ("śūnyatā"). The Buddhist metaphysics Abhidharma presumes particular forces which determine the origin, persistence, aging and decay of everything in the world. Vasubandhu added a special force making a human, called ""aprāpti"" or ""pṛthagjanatvam"". Because of the absence of a substantial soul, the belief in personal immortality loses foundation. Instead of deceased beings, new ones emerge whose fate is destined by the karmic law. The Buddha admitted the empirical identity of persons testified by their birth, name, and age. He approved the authorship of deeds and responsibility of performers. The disciplinary practice in the Sangha including reproaches, confession and expiation of transgressions, requires continuing personalities as its justification.
https://en.wikipedia.org/wiki?curid=27568
988,519
1,515,623
The US-based Chromebook programs may have shown changes in the economics of 1:1 programs. Although 1:1 programs require better WiFi than previous programs, they use ChromeOS, which automatically updates and patches in the background, lowering maintenance costs. The purchase prices of Chromebooks were substantially lower than those of competing devices, and according to IDC research the maintenance costs were significantly lower as well. Since the batteries of Chromebooks easily last a full day, schools experimented with having students charge them at home and only keeping a replacement stock ready in school (for defective, forgotten and out-of-charge Chromebooks), reducing the need for charging equipment and trolleys. GAM (Google Accounts Management) did charge for licensing, but it could remove the need for other MDM (Mobile Device Management) solutions and for other security solutions below the web-authentication level. Some schools also experimented with parents owning the devices (and paying for them). Having less costly equipment on site may also have saved on insurance, and rooms may have become available for other purposes. 1:1 also enabled going paperless (i.e. publishers supplied cheaper digital versions of teaching materials), reducing spending on paper and printers. No serious study of this is known at this point, but individual schools have published costs and savings. The use of Google Classroom software, and G Suite in general, was instrumental in going paperless: it allowed electronic hand-in, grading and returning of projects to groups of students. Microsoft is trying to replicate the classroom functionality in its O365 for education offering.
https://en.wikipedia.org/wiki?curid=6357454
1,514,771
116,215
Radioactive materials contained in RTGs are dangerous and can even be used for malicious purposes. They are barely useful for a genuine nuclear weapon, but can still serve in a "dirty bomb". The Soviet Union constructed many uncrewed lighthouses and navigation beacons powered by RTGs using strontium-90 (Sr). They are very reliable and provide a steady source of power. Most have no protection, not even fences or warning signs, and the locations of some of these facilities are no longer known due to poor record keeping. In one instance, the radioactive compartments were opened by a thief. In another case, three woodsmen in Tsalendzhikha Region, Georgia found two ceramic RTG orphan sources that had been stripped of their shielding; two of the woodsmen were later hospitalized with severe radiation burns after carrying the sources on their backs. The units were eventually recovered and isolated. There are approximately 1,000 such RTGs in Russia, all of which have long since exceeded their designed operational lives of ten years. Most of these RTGs likely no longer function, and may need to be dismantled. Some of their metal casings have been stripped by metal hunters, despite the risk of radioactive contamination. Transforming the radioactive material into an inert form reduces the danger of theft by people unaware of the radiation hazard (as happened in the Goiânia accident, in which the caesium in an abandoned Cs-137 source was present in the easily water-soluble caesium chloride form). However, a sufficiently chemically skilled malicious actor could extract a volatile species from inert material and/or achieve a similar effect of dispersion by physically grinding the inert matrix into a fine dust.
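The observation that these Sr-90 units have long outlived their ten-year design lives can be made concrete with a simple decay calculation. The sketch below is illustrative only: the initial thermal output assumed for a beacon-type RTG and the strontium-90 half-life of roughly 28.8 years are assumptions, not figures from the text.

```python
# Illustrative sketch: thermal output of an Sr-90 RTG decaying over time.
# The initial power is an assumed value for a beacon-type unit; the half-life
# of strontium-90 is approximately 28.8 years.
SR90_HALF_LIFE_YEARS = 28.8
INITIAL_THERMAL_WATTS = 230.0  # assumed initial thermal output

def thermal_output(years_elapsed: float) -> float:
    """Remaining thermal power (watts) after the given number of years."""
    return INITIAL_THERMAL_WATTS * 0.5 ** (years_elapsed / SR90_HALF_LIFE_YEARS)

for years in (0, 10, 30, 50):
    print(f"after {years:2d} years: {thermal_output(years):6.1f} W thermal")
```

Even several decades past its design life such a source still emits a large fraction of its original heat, which is one reason abandoned units remain a radiological hazard long after their electronics have degraded.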
https://en.wikipedia.org/wiki?curid=211485
116,170
420,667
An alternative is to enclose the electrostatic elements and operate them as "monopoles". This avoids the many disadvantages of dipole operation, most importantly by greatly reducing room reflections and thus the adulteration of the recorded ambiance. Since there is no attempt at making the speaker visually "see-through", it also allows materials to be applied to the rear of the panel to fully damp the membrane resonance, which improves transient response. In addition, using relatively small elements with a relatively high crossover frequency, such as 500 Hz, has a number of advantages. It reduces directivity to a degree that offers a reasonably wide sweet spot. It allows more of the 3 dB/octave increase in SPL with frequency to be used, increasing the sensitivity. It does not act as a true line array, so woofers are easier to integrate. Lastly, most of the remaining 3 dB roll-up can be counteracted by filtering the high frequencies out of the signal fed to half or more of the panel's width, which coincidentally widens the dispersion and thus the sweet spot. JansZen speakers incorporate all these alternative features. They also use acoustic suspension woofers (sealed enclosures), which have the lowest group delay of all configurations and thus the best chance of seamlessly integrating with the electrostatics. The panels are also well protected from collecting airborne contaminants, avoiding the need for periodic repairs.
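To put the quoted 3 dB/octave figure in perspective, the short sketch below tallies how much level rise that slope would accumulate across the panel's passband; the 500 Hz reference and 20 kHz upper limit are illustrative assumptions, not specifications from the text.

```python
# Illustrative sketch: cumulative SPL rise for a panel whose output climbs
# 3 dB per octave above an assumed 500 Hz crossover frequency.
import math

CROSSOVER_HZ = 500.0        # assumed crossover frequency
SLOPE_DB_PER_OCTAVE = 3.0   # slope quoted in the text

def spl_rise_db(freq_hz: float) -> float:
    """SPL rise (dB) relative to the crossover frequency."""
    return SLOPE_DB_PER_OCTAVE * math.log2(freq_hz / CROSSOVER_HZ)

for f in (500, 1000, 4000, 20000):
    print(f"{f:6d} Hz: +{spl_rise_db(f):4.1f} dB")
```

With these assumed band edges the rise amounts to roughly 16 dB from crossover to 20 kHz; whatever portion of that is not taken as extra sensitivity is what the described width-tapering of the high frequencies would have to counteract.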
https://en.wikipedia.org/wiki?curid=196048
420,462
321,393
A portion of the "Bacillus thuringiensis" genome was incorporated into corn (and cotton) crops. The resulting GMOs are resistant to some insect pests. "Bacillus subtilis" (natto) is the key microbial participant in the ongoing production of the soya-based traditional natto fermentation, and some "Bacillus" species are on the Food and Drug Administration's GRAS (generally regarded as safe) list. The capacity of selected "Bacillus" strains to produce and secrete large quantities (20–25 g/L) of extracellular enzymes has placed them among the most important industrial enzyme producers. The ability of different species to ferment in the acid, neutral, and alkaline pH ranges, combined with the presence of thermophiles in the genus, has led to the development of a variety of new commercial enzyme products with the desired temperature, pH activity, and stability properties to address a variety of specific applications. Classical mutation and (or) selection techniques, together with advanced cloning and protein engineering strategies, have been exploited to develop these products. Efforts to produce and secrete high yields of foreign recombinant proteins in "Bacillus" hosts initially appeared to be hampered by the degradation of the products by the host proteases. Recent studies have revealed that the slow folding of heterologous proteins at the membrane-cell wall interface of Gram-positive bacteria renders them vulnerable to attack by wall-associated proteases. In addition, the presence of thiol-disulphide oxidoreductases in "B. subtilis" may be beneficial in the secretion of disulphide-bond-containing proteins. Such developments from our understanding of the complex protein translocation machinery of Gram-positive bacteria should allow the resolution of current secretion challenges and make "Bacillus" species preeminent hosts for heterologous protein production. "Bacillus" strains have also been developed and engineered as industrial producers of nucleotides, the vitamin riboflavin, the flavor agent ribose, and the supplement poly-gamma-glutamic acid. With the recent characterization of the genome of "B. subtilis" 168 and of some related strains, "Bacillus" species are poised to become the preferred hosts for the production of many new and improved products as we move through the genomic and proteomic era.
https://en.wikipedia.org/wiki?curid=4751
321,221
2,214,871
The acquisition process for the first electron microscope of the Graz University of Technology was started by a donation from industry in 1949. The next year a research group, headed by , was established. Finally, the first electron microscope (the “Übermikroskop UEM100” by Siemens & Halske) was bought in 1951, and the opening ceremony was attended by Ernst Ruska, Werner Glaser and Otto Wolf. Although the Graz University of Technology provided rooms and infrastructure, from the very beginning supplementary income from research services provided for industry was necessary to help cover the high operating and investment costs. Owing to the large interest in measurements and the high utilization of the instrument, the group soon needed to expand its personnel and acquire new microscopes. In order to concentrate all funding sources, the non-profit association for the promotion of electron microscopy (Verein zur Förderung der Elektronenmikroskopie und Feinstrukturforschung) was founded in 1959 under the direction of the governor of . The Graz Centre of Electron Microscopy (ZFE) was attached to this non-profit. The combined institutions grew over the years, and the fact that one person held both the post of head of the university institute and that of head of the ZFE played a crucial role in developing a tight interconnection between fundamental research and application. In 2011 the most expensive and impressive acquisition to date became possible: a scanning transmission electron microscope (STEM) that was unique worldwide at the time. With the ASTEM (Austrian Scanning Transmission Electron Microscope), magnification of more than one million times became possible, enabling atomic resolution. In addition, with an investment volume of 4.5 million euros, the ASTEM was one of the largest scientific infrastructure investments in Austria.
https://en.wikipedia.org/wiki?curid=67190786
2,213,611
1,907,764
In 1991, George Kollias' group was the first to provide "in vivo" proof-of-principle studies showing that deregulated TNF production is causal to the development of chronic polyarthritis in a transgenic animal model, and to show originally that anti-huTNF antibody treatment was efficacious in treating the modeled disease (pg. 370). These studies were instrumental in mobilizing the interest of the anti-TNF industry and foreshadowed the success of the first clinical trials performed in RA in 1994. Further work in his lab provided insights into the function of TNF in host defense and into the structure and function of secondary lymphoid organs, work that more recently led to establishing that TNFRI and NFkB signals specifically in follicular dendritic cells (FDCs) are of pivotal significance in the regulation of humoral B cell responses and autoimmunity. Moreover, George Kollias' group demonstrated TNF's causal effect in the development of combined Crohn's disease and polyarthritis, and the contribution of transmembrane versus soluble TNF to the pathogenic processes. These studies offered a better understanding of the physiological function of TNF in health and disease and rationalized potential complications or optimizations of anti-TNF therapies in other diseases such as MS. More recently, George Kollias introduced a novel pathogenic principle to explain the cellular basis of TNF function in gut/joint axis diseases, including spondyloarthropathies, by showing that mesenchymal cells, namely synovial fibroblasts and intestinal subepithelial myofibroblasts, are common pathogenic targets of TNF sufficient to drive the chronic inflammatory and destructive disease process. Animal models developed in his lab have been distributed to numerous academic and industrial laboratories around the world (over 200 MTAs in the last ten years). In 2005 he founded the first CRO-biotech spin-off of BSRC Fleming, Biomedcode Hellas SA.
https://en.wikipedia.org/wiki?curid=13272704
1,906,668
332,123
In mid-2022, during the Russian invasion of Ukraine, the US supplied AGM-88 HARM missiles to Ukraine. This was only disclosed after Russian forces showed footage of a tail fin from one of these missiles in early August 2022. U.S. Under Secretary of Defense for Policy Colin Kahl said that recent aid packages had included a number of anti-radiation missiles that can be fired by Ukrainian aircraft. As built, Soviet-era aircraft do not have the computer architecture to accept NATO-standard weapons. Indeed, none of the former Warsaw Pact countries, even those that have had their Soviet-era aircraft updated, had been enabled to fire a HARM before. Integration seemed difficult without a "crude modification", such as adding an e-tablet to the cockpit as the interface, effectively building a nearly independent subsystem within the carrying aircraft. As suggested by Domenic Nicholis, defense correspondent for the Telegraph in the UK, the HARM is possibly being operated in one of its three modes, which enables it to find its target in flight after being released towards a suspected enemy air defense and electronic emission area. Before the mission or during flight, NATO signals-intelligence aircraft or other intelligence sources would provide a picture of the battlefield's electromagnetic emissions to locate the Russian radars, towards which the Ukrainian jets armed with HARMs would be directed to fire. This allows the missile to fly a very long-range attack profile, even though it may fail to find a target in flight and be wasted. A second possible use of the HARM is to operate it in a mode called “HARM as sensor”. As in the mode described before, the missile acts as both sensor and weapon, not requiring a sensor pod. A simple interface would show that the missile has a target and the pilot can launch it. In this way the range is shorter, and the jet could already be under threat, but the possibility of hitting the emitter is maximized.
https://en.wikipedia.org/wiki?curid=3149
331,946
939,819
Currently, speciation by vicariance is widely regarded as the most common form of speciation, and is the primary model of allopatric speciation. Vicariance is a process by which the geographical range of an individual taxon, or a whole biota, is split into discontinuous populations (disjunct distributions) by the formation of an extrinsic barrier to the exchange of genes: that is, a barrier arising externally to a species. These extrinsic barriers often arise from various geologically driven topographic changes such as: the formation of mountains (orogeny); the formation of rivers or bodies of water; glaciation; the formation or elimination of land bridges; the movement of continents over time (by tectonic plates); or island formation, including sky islands. Vicariant barriers can change the distribution of species populations. Suitable or unsuitable habitat may come into existence, expand, contract, or disappear as a result of global climate change or even large-scale human activities (for example, agricultural and civil engineering developments, and habitat fragmentation). Such factors can alter a region's geography in substantial ways, resulting in the separation of a species population into isolated subpopulations. The vicariant populations may then undergo genotypic or phenotypic divergence as: (a) different mutations arise in the gene pools of the populations, (b) they become subjected to different selective pressures, and/or (c) they independently undergo genetic drift. The extrinsic barriers prevent the exchange of genetic information between the two populations, potentially leading to differentiation due to the ecologically different habitats they experience; selective pressure then invariably leads to complete reproductive isolation. Furthermore, a species' proclivity to remain in its ecological niche (see phylogenetic niche conservatism) through changing environmental conditions may also play a role in isolating populations from one another, driving the evolution of new lineages.
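As an illustration of point (c), the following minimal sketch simulates independent genetic drift in two subpopulations after a vicariant barrier stops gene flow. The population size, number of generations, and starting allele frequency are arbitrary assumptions chosen only to show the two frequencies wandering apart.

```python
# Minimal sketch: independent genetic drift in two isolated subpopulations.
# All parameters (population size, generations, starting frequency) are
# illustrative assumptions, not values from the text.
import random

def drift(freq: float, pop_size: int, generations: int, rng: random.Random) -> float:
    """Wright-Fisher style drift of one allele's frequency with no migration."""
    for _ in range(generations):
        carriers = sum(rng.random() < freq for _ in range(2 * pop_size))
        freq = carriers / (2 * pop_size)
    return freq

rng = random.Random(42)
start_freq = 0.5
east = drift(start_freq, pop_size=100, generations=200, rng=rng)
west = drift(start_freq, pop_size=100, generations=200, rng=rng)
print(f"allele frequency after isolation: east {east:.2f}, west {west:.2f}")
```

Because the two calls share no state, each subpopulation's frequency follows its own random walk, which is the sense in which drift alone can make isolated populations diverge even without differing selective pressures.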
https://en.wikipedia.org/wiki?curid=334986
939,318
1,392,715
For some emerging contaminants, several advanced technologies (sonolysis, photocatalysis, Fenton-based oxidation and ozonation) have treated pollutants in laboratory experiments. Another technology is "enhanced coagulation", in which the treatment entity works to optimize filtration by removing precursors to contamination during treatment. In the case of THMs, this meant lowering the pH, increasing the feed rate of coagulants, and encouraging domestic systems to operate with activated carbon filters and apparatuses that can perform reverse osmosis. Although these methods are effective, they are costly, and there have been many instances of treatment plants being resistant to paying for the removal of pollution, especially if it was not created in the water treatment process, as many ECs originate from runoff, past pollution sources, and personal care products. It is also difficult to incentivize states to have their own policies surrounding contamination, because it can be burdensome for states to pay for screening and prevention processes. There is also an element of environmental injustice, in that lower-income communities with less purchasing and political power cannot buy their own filtration systems and are regularly exposed to harmful compounds in drinking water and food. However, recent trends in light-based systems show great potential for such applications: with the decreasing cost of UV-LED systems and the growing prevalence of solar-powered systems, this approach could remove CECs while keeping costs low.
https://en.wikipedia.org/wiki?curid=53082558
1,391,944
1,320,040
Patrick Manson was a son of Alexander Manson and Elizabeth Livingstone Blaikie, born at Oldmeldrum, eighteen miles north of Aberdeen. His father was manager of the local branch of the British Linen Bank and Laird of Fingask. His mother was a distant relative of the famed Christian missionary-explorer David Livingstone. He was the second son of a family of three boys and four girls. He developed a childhood passion for natural history, fishing, shooting, carpentry, mechanics and cricket. Within his Presbyterian Christian family, he showed an excellent memory for memorising church sermons at the age of 5. In 1857 his family moved to Aberdeen, where he entered the Gymnasium School. He later continued at West End Academy. In 1859 he was apprenticed to Blaikie Brothers, ironmasters based in Aberdeen. However, he was struck by a type of tuberculosis called Pott's disease of the spine, which forced him to take rest. In 1860 he entered the University of Aberdeen, where he completed his medicine course in 1865. He was only nineteen and was underage for graduation, so he visited hospitals, museums and medical schools in London. Finally of age, he formally graduated in October 1865, and was appointed Medical Officer at Durham Lunatic Asylum, where he worked for seven months. He performed 17 postmortem dissections on patients with psychiatric illnesses for his thesis. In 1866 he received the degrees of Master of Surgery, Doctor of Medicine and Doctor of Law.
https://en.wikipedia.org/wiki?curid=2995870
1,319,314
301,365
Determining the thermodynamic state of the stagnation point is more difficult under an equilibrium gas model than a perfect gas model. Under a perfect gas model, the "ratio of specific heats" (also called "isentropic exponent", adiabatic index, "gamma", or "kappa") is assumed to be constant along with the gas constant. For a real gas, the ratio of specific heats can wildly oscillate as a function of temperature. Under a perfect gas model there is an elegant set of equations for determining thermodynamic state along a constant entropy stream line called the "isentropic chain". For a real gas, the isentropic chain is unusable and a "Mollier diagram" would be used instead for manual calculation. However, graphical solution with a Mollier diagram is now considered obsolete with modern heat shield designers using computer programs based upon a digital lookup table (another form of Mollier diagram) or a chemistry based thermodynamics program. The chemical composition of a gas in equilibrium with fixed pressure and temperature can be determined through the "Gibbs free energy method". Gibbs free energy is simply the total enthalpy of the gas minus its total entropy times temperature. A chemical equilibrium program normally does not require chemical formulas or reaction-rate equations. The program works by preserving the original elemental abundances specified for the gas and varying the different molecular combinations of the elements through numerical iteration until the lowest possible Gibbs free energy is calculated (a Newton–Raphson method is the usual numerical scheme). The data base for a Gibbs free energy program comes from spectroscopic data used in defining partition functions. Among the best equilibrium codes in existence is the program "Chemical Equilibrium with Applications" (CEA) which was written by Bonnie J. McBride and Sanford Gordon at NASA Lewis (now renamed "NASA Glenn Research Center"). Other names for CEA are the "Gordon and McBride Code" and the "Lewis Code". CEA is quite accurate up to 10,000 K for planetary atmospheric gases, but unusable beyond 20,000 K (double ionization is not modelled). CEA can be downloaded from the Internet along with full documentation and will compile on Linux under the G77 Fortran compiler.
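For contrast with the equilibrium approach, the "isentropic chain" mentioned above can be written down in a few lines for a calorically perfect gas. The sketch below is only illustrative: it assumes a constant ratio of specific heats of 1.4 and a made-up Mach number, and it also spells out the stated definition of Gibbs free energy, G = H - TS, as a one-line helper.

```python
# Illustrative sketch (not the CEA code): the perfect-gas "isentropic chain"
# for stagnation conditions, plus the Gibbs free energy definition G = H - T*S.
# gamma = 1.4 and the Mach number below are assumed values.
GAMMA = 1.4  # constant ratio of specific heats assumed by the perfect gas model

def stagnation_ratios(mach: float, gamma: float = GAMMA) -> tuple[float, float]:
    """Return (T0/T, p0/p) along a constant-entropy streamline."""
    t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    p_ratio = t_ratio ** (gamma / (gamma - 1.0))
    return t_ratio, p_ratio

def gibbs_free_energy(enthalpy: float, entropy: float, temperature: float) -> float:
    """Total Gibbs free energy: enthalpy minus temperature times entropy."""
    return enthalpy - temperature * entropy

t_ratio, p_ratio = stagnation_ratios(mach=5.0)
print(f"T0/T = {t_ratio:.2f}, p0/p = {p_ratio:.1f}")
```

The point of the paragraph is precisely that these closed-form ratios stop being valid for a real gas at entry temperatures, where the ratio of specific heats varies and the equilibrium composition must instead be found by numerically minimizing the Gibbs free energy.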
https://en.wikipedia.org/wiki?curid=45294
301,204
201,007
Samuel Pufendorf (1632–1694) was a notable jurist and philosopher known for his natural law theories, influencing Adam Smith as well as Thomas Jefferson. Olof von Dalin (1708–1763) was an influential Swedish writer and historian of the late Enlightenment era. Peter Wieselgren (1800–1877) was a Swedish priest, literature critic and prominent leader of the Swedish temperance movement. Knut Wicksell (1851–1926) was an influential economist, sometimes considered one of the founders of modern macroeconomics. Oscar Olsson (1877–1950) was an important developer of self-education in Sweden and is known as the father of study circles. Bertil Ohlin (1899–1979) received the Nobel Prize in economic sciences in 1977 for theories concerning international trade and capital, and was the leader of the Liberal People's Party (Folkpartiet) for 23 years. Gunnar Jarring (1907–2002) was Sweden's ambassador to the UN 1956–1958, and Sweden's ambassador in Washington, DC, 1958–1964. Britta Holmström (1911–1992) was the founder of Individuell Människohjälp (IM), a human rights organization with activities in 12 countries. Torsten Hägerstrand (1916–2004) was an internationally renowned geographer, considered the father of 'time geography' and a recipient of the Prix International de Géographie Vautrin Lud in 1992. Judith Wallerstein (1921–2012) was a renowned psychologist and internationally recognized authority on the effects of marriage and divorce on children and their parents. The first person from Iceland to earn a degree in archaeology, Ólafía Einarsdóttir, studied for her MA and PhD at Lund.
https://en.wikipedia.org/wiki?curid=17843
200,904
27,818
Tolosani may have criticized the Copernican theory as scientifically unproven and unfounded, but the theory also conflicted with the theology of the time, as can be seen in a sample of the works of John Calvin. In his "Commentary on Genesis" he said that "We indeed are not ignorant that the circuit of the heavens is finite, and that the earth, like a little globe, is placed in the centre." In his commentary on Psalms 93:1 he states that "The heavens revolve daily, and, immense as is their fabric and inconceivable the rapidity of their revolutions, we experience no concussion... How could the earth hang suspended in the air were it not upheld by God's hand? By what means could it maintain itself unmoved, while the heavens above are in constant rapid motion, did not its Divine Maker fix and establish it." One sharp point of conflict between Copernicus's theory and the Bible concerned the story of the Battle of Gibeon in the Book of Joshua, in which the Hebrew forces were winning but their opponents were likely to escape once night fell. This is averted by Joshua's prayers causing the Sun and the Moon to stand still. Martin Luther once made a remark about Copernicus, although without mentioning his name. According to Anthony Lauterbach, while eating with Martin Luther the topic of Copernicus arose during dinner on 4 June 1539 (in the same year as professor George Joachim Rheticus of the local University had been granted leave to visit him). Luther is said to have remarked "So it goes now. Whoever wants to be clever must agree with nothing others esteem. He must do something of his own. This is what "that fellow" does who wishes to turn the whole of astronomy upside down. Even in these things that are thrown into disorder I believe the Holy Scriptures, for Joshua commanded the sun to stand still and not the earth." These remarks were made four years before the publication of "On the Revolutions of the Heavenly Spheres" and a year before Rheticus's "Narratio Prima". In John Aurifaber's account of the conversation Luther calls Copernicus "that fool" rather than "that fellow"; this version is viewed by historians as less reliably sourced.
https://en.wikipedia.org/wiki?curid=323592
27,808
1,002,900
Dawn redwoods thrive over a large, crescent-shaped region that encompasses the eastern and southern United States, as well as on the West Coast. Many institutions, such as the Arnold Arboretum of Harvard University, have fine specimens. The H. H. Hunnewell estate in Wellesley, Massachusetts, has two specimens (numbers 29 and 34) that date back to the initial distribution of seed by the Arnold Arboretum in 1949. There is a small grove of dawn redwoods at Bailey Arboretum in Locust Valley, New York, including one tree which is claimed to be the world's largest by diameter. The New York City Department of Parks and Recreation has begun planting dawn redwoods on sidewalks throughout Manhattan and Brooklyn. Washington, D.C.'s Urban Forestry Division has planted hundreds throughout that city, including all of the street trees in the 1800 block of Redwood Terrace, NW. A dawn redwood grows outside of the Rosicrucian Research Library at Rosicrucian Park in San Jose, California, as a memorial to H. Spencer Lewis. It was planted in 1950 from a seedling from the lot brought from China by Dr. Ralph Chaney, and was donated by an unnamed donor to H. Spencer Lewis's widow for this purpose. In North Carolina, a private endeavor is working to create a "Metasequoia" reserve on 50 acres of uplands in the Sauratown Mountains. Portland, Oregon, is home to some of the oldest dawn redwoods in the US. One specimen planted in the Hoyt Arboretum in 1948 was 103 feet tall at last measurement and in 1952 earned the distinction of being the first dawn redwood to bear cones in the Western Hemisphere in 6–8 million years. A dawn redwood was planted at Southern Illinois University Carbondale in Carbondale, Illinois by William Marberry in 1950 and remains there to this day.
https://en.wikipedia.org/wiki?curid=334782
1,002,382
1,905,395
Pahlavan received his BS/MS degree in Electrical Engineering from the University of Tehran in 1975, and his PhD degree from the Worcester Polytechnic Institute, Worcester, Massachusetts in 1979. He began his academic career as an assistant professor at Northeastern University, Boston, MA in 1979, then joined the faculty at the Worcester Polytechnic Institute in 1985. At WPI he founded the world's first academic research program in wireless local area networks (WLAN), commercially known as Wi-Fi (1985). He has been a visiting professor at the University of Oulu, Finland (1995-2007), where he also spent his sabbatical leave in 1999. He spent his other sabbatical leaves at Olin College and Harvard University in 2004 and 2011, respectively. He was the Chief Technical Adviser of Skyhook Wireless, Boston, MA between 2004 and 2014. WINDATA was one of the pioneers in the design of WLANs, and Skyhook is the pioneer of Wi-Fi positioning systems, which were adopted by Steve Jobs for the original iPhone (2007). Throughout several decades of research and scholarship, he has also served as a consultant to many key players in the wireless industry, including Nokia, Apple, DEC, Honeywell, Electrobit, JPL and NTT. He also led the US team for review of the National R&D Programs, sponsored by the Finnish Academy and TEKES (2000, 2003). Pahlavan is widely recognized by his friends and colleagues as the inventor of Wi-Fi and of body area networks. He was elected a fellow of the IEEE (1996), was selected as a member of the Evolution of Untethered Communications Committee of the National Research Council (1997), was the first Fulbright-Nokia scholar (2000), and was awarded WPI's Board of Trustees' Outstanding Research and Creative Scholarship Award (2011).
https://en.wikipedia.org/wiki?curid=10617186
1,904,300
870,498
Observations from testing the FPA10 confirmed that applications such as Resultz and PipeDream 4—both Colton Software products—and other spreadsheets, whilst ostensibly standing to benefit as number processing applications, exhibited "no noticeable speed improvements", this being attributed to these applications' avoidance of unnecessary calculation and the more significant overhead of servicing a graphical user interface. Other programs such as Draw and ArtWorks—a Computer Concepts product—used their own arithmetic routines instead of the floating-point emulator (FPE) and, as anticipated, were therefore unable to take advantage of the accelerated floating-point instructions. However, various free of charge or low-cost programs ported from other systems, such as POV-Ray, plus selected native applications such as Clares' Illusionist and Oak Solutions' WorraCAD, did exhibit substantial performance gains from the FPA with speed-ups of between five and ten times. The Basic64 interpreter bundled with RISC OS which was "much slower than Basic V normally", with the former using the FPE and the latter providing its own floating-point arithmetic routines, ended up "slightly faster" due to observed speed-ups of around four to around eleven times, with non-trigonometric operations benefiting the most. Programs compiled by Intelligent Interfaces' Fortran compiler were reported as running "some routines up to 20 times faster with the FPA10". The product was perceived as "good value" but having restricted usefulness with the general lack of support in many applications, these employing their own routines and techniques to attempt to provide performant arithmetic on the base hardware platform, and a lack of incentive amongst software producers to offer support without a large enough market of users having the FPA fitted.
https://en.wikipedia.org/wiki?curid=63145
870,038
1,426,419
Budding's mower was designed primarily to cut the lawn on sports grounds and extensive gardens, as a superior alternative to the scythe, and was granted a British patent on 31 August 1830. It took ten more years and further innovations to create a machine that could be worked by animals, and sixty years before a steam-powered lawn mower was built. The first machine produced was 19 inches in width with a frame made of wrought iron. The mower was pushed from behind, with motive power coming from the rear land roller, which drove gears to transfer the drive to the knives on the cutting cylinder; the ratio was 16:1. Another roller, placed between the cutting cylinder and the land roller, was adjustable to alter the height of cut. On cutting, the grass clippings were hurled forward into a tray-like box. It was soon realized, however, that an extra handle was needed in front of the machine to help pull it along. Two of the earliest Budding machines sold went to Regent's Park Zoological Gardens in London and the Oxford Colleges. In an agreement between John Ferrabee and Edwin Budding, dated 18 May 1830, Ferrabee paid the costs of development, obtained letters of patent and acquired rights to manufacture, sell and license other manufacturers in the production of lawn mowers. Budding had realised that a similar device could be used to cut grass if the mechanism was mounted in a wheeled frame to make the blades rotate close to the lawn's surface. He went into partnership with a local engineer, John Ferrabee, and together they made mowers in a factory at Thrupp near Stroud.
https://en.wikipedia.org/wiki?curid=43704
1,425,616
37,443
The T44 was tested in a competitive service rifle competition conducted by the Infantry Board at Fort Benning, Georgia against the Springfield T47 (a modified T25) and the T48, a variant of Fabrique Nationale's FN FAL (from "Fusil Automatique Leger", French for "light automatic rifle"). The T47, which did not have a bolt roller, performed worse than both the T44 and the T48 in dust and cold weather tests and was dropped from consideration in 1953. During 1952–53, testing proved the T48 and the T44 to be roughly comparable in performance, with the T48 holding an advantage in ease of field stripping and dust resistance, as well as a longer product development lead time. A "Newsweek" article in July 1953 speculated that the T48/FAL could have been selected over the T44. During the winter of 1953–54, both rifles competed in the winter rifle trials at U.S. Army facilities in the Arctic. Springfield Armory engineers, anxious to ensure the selection of the T44, had been specially preparing and modifying the test T44 rifles for weeks with the aid of the armory's cold chamber, including a redesign of the T44 gas regulator and custom modifications to magazines and other parts to reduce friction and seizing in extreme cold. The T48 rifles received no such special preparation and in the continued cold weather testing began to experience sluggish gas system functioning, aggravated by the T48's close-fitting surfaces between bolt and carrier, and carrier and receiver. FN engineers opened the gas ports in an attempt to improve functioning, but this caused early and violent extraction and broken parts as a result of the increased pressures. As a result, the T44 was ranked superior in cold weather operation to the T48. The Arctic Test Board report made it clear that the T48 needed improvement and that the U.S. would not adopt the T48 until it had successfully completed another round of Arctic tests the following winter.
https://en.wikipedia.org/wiki?curid=319543
37,430
754,580
Thorstein Veblen (1857–1929), who came from rural midwestern America and worked at the University of Chicago, is one of the best-known early critics of the "American Way". In "The Theory of the Leisure Class" (1899) he scorned materialistic culture and wealthy people who conspicuously consumed their riches as a way of demonstrating success. In "The Theory of Business Enterprise" (1904) Veblen distinguished between production for people to use and production for pure profit, arguing that the former is often hindered because businesses pursue the latter. Output and technological advance are restricted by business practices and the creation of monopolies. Businesses protect their existing capital investments and employ excessive credit, leading to depressions and increasing military expenditure and war through business control of political power. These two books, focusing on criticism of consumerism and profiteering, did not advocate change. However, in 1918 he moved to New York to begin work as an editor of a magazine called "The Dial", and then in 1919, along with Charles A. Beard, James Harvey Robinson, and John Dewey, he helped found the New School for Social Research (known today as The New School). He was also part of the Technical Alliance, created in 1919 by Howard Scott. From 1919 through 1926 Veblen continued to write and to be involved in various activities at The New School. During this period he wrote "The Engineers and the Price System" (1921).
https://en.wikipedia.org/wiki?curid=7881361
754,177
1,243,089
Bacteriophages (Greek for 'bacteria eater'), or simply phages, are viruses which infect bacteria. The majority of all known bacteriophages exhibit a double-stranded DNA genome inside the virion capsid and belong to the order of tailed phages, Caudovirales. The tailed phages can be further separated into three families: Podoviridae, which are characterized by very short tails; Myoviridae, which exhibit longer, straight and contractile tails; and Siphoviridae, which can be identified by their long and flexible tails. Another well-studied group of phages with many applications, although minor in terms of species diversity, is represented by the filamentous phages, which exhibit a single-stranded DNA genome decorated by a helical protein layer surrounding the DNA molecule. Bacteriophages are ubiquitously distributed in nature and can also be isolated from human- or animal-associated microflora. They outnumber their bacterial host species by a factor of ten, representing the most abundant self-replicating entities on earth, with an estimated 10^31 phages in total. The idea of using phages against unwanted bacteria developed shortly after their discovery. With the improvements in organic chemistry during the 1950s, the exploration and development of broad-spectrum antibiotics displaced interest in bacteriophage research. Several laboratories have been testing the suitability of bacteriophage isolates to control certain bacterial pathogens. Significant advancements in this research have been made at the Bacteriophage Institute in Tbilisi, Georgia, where phage therapy is routinely applied in the medical field. Today the treatment of antibiotic-resistant bacteria is a challenging task. Recently, research on bacteriophages has gained additional momentum in light of the identification of antibiotic-resistant pathogens of infectious diseases against which antibiotics are not working effectively; research on the application of bacteriophages is therefore being reviewed intensely.
https://en.wikipedia.org/wiki?curid=31196402
1,242,416
1,270,377
STS-81 (January 12–22, 1997) was a 10-day mission, the fifth to dock with Russia's Mir space station, and the second to exchange U.S. astronauts. The mission also carried the Spacehab double module providing additional middeck locker space for secondary experiments. In five days of docked operations more than three tons of food, water, experiment equipment and samples were moved back and forth between the two spacecraft. Grunsfeld served as the flight engineer on this flight. Following 160 orbits of the Earth the STS-81 mission concluded with a landing on Kennedy Space Center's Runway 33 ending a 3.9 million mile journey. Mission duration was 244 hours, 56 minutes. During this flight, Grunsfeld placed a phone call to NPR's auto-repair radio show, "Car Talk". In this call he complained about the performance of his serial-numbered, Rockwell-manufactured "government van". To wit, it would run very loud and rough for about two minutes, quieter and smoother for another six and a half, and then the engine would stop with a jolt. He went on to state that the brakes of the vehicle, when applied, would glow red-hot, and that the vehicle's odometer displayed "about 60 million miles". This created some consternation for the hosts, until they noticed the audio of Grunsfeld's voice, being relayed from Mir via TDRS satellite, sounded similar to that of Tom Hanks in the then-recent film "Apollo 13", after which they realized the call was from space and the government van in question was, in fact, the Space Shuttle.
https://en.wikipedia.org/wiki?curid=598981
1,269,686
303,798
Understanding the importance of Germany's radar sites, the Allies directed attacks against these installations and introduced new technology to counteract the effects of radar-directed AAA, including CARPET (US) and WINDOW (UK). A change in tactics saw bomber formations flying higher and more spread out to avoid the effects of flak. Bombing missions were also carried out to accomplish the physical destruction of AAA sites, using imagery intelligence to locate the weapons and employing both heavy bombers and fighter-bombers to destroy them. The P-47 Thunderbolt in particular was chosen for this task due to its ability to survive enemy fire. The effect of these missions varied, with losses suffered by fighter-bombers much higher—up to 40% in some cases—on account of their low-altitude attacks. Artillery also played a major role in suppressing air defenses, with the British Army the first to develop what became known as counterflak or "Apple Pie" missions. These missions were first employed to limited effect during the Battle of France but matured as the war progressed. The largest SEAD mission in history took place on March 24, 1945, when artillery forces of the British XII Corps attempted to knock out the local German air defense network in support of Operation Varsity. Although twenty-four thousand artillery shells were fired over the course of twenty-two minutes at some one hundred targets, the mission was unsuccessful due to inaccurate targeting data and insufficient firepower.
https://en.wikipedia.org/wiki?curid=573491
303,636
1,598,765
Amaza Lee Meredith's educational journey started with her attending the Lynchburg elementary public schools. After that she attended Jackson High School, from which she graduated at the top of her class in 1912. Determined to gain her teaching credentials, she attended Virginia Normal and Industrial Institute (now Virginia State University). It was in her first year that she met Dr. Edna Meade Colson, who influenced Meredith's education by supporting her with advice and recommending pedagogical literature. Meredith eventually earned her "Summer School Professional Certificate" for teachers. Now able to teach elementary school, she took her first teaching job and returned to Petersburg in 1922 to earn her teaching degree. This time she was the valedictorian of her class at Virginia Normal and Industrial Institute. In 1926 she moved to New York City, where she lived together with her sister in North Harlem. She attended the teachers college at Columbia University to study fine arts and art appreciation and received a bachelor's degree in 1930. Among others, she took classes in interior design and home decoration and attended lectures, discussions, practical work and trips to museums which shaped her understanding of a modern design approach. During this time she also came into contact with the New Negro movement and grew into it as a young, educated black woman who disagreed with the existing stereotypes. Moreover, she qualified as a member of the talented tenth, which, according to W.E.B. Du Bois, was a group of culturally, socially and scientifically educated black Americans who would serve as a sort of mediator between blacks of lower education and white society. She returned to New York to gain her master's degree at Columbia University in 1934.
https://en.wikipedia.org/wiki?curid=48314474
1,597,865
1,075,260
It is important to realize that EIT is only one of many diverse mechanisms which can produce slow light. The Kramers–Kronig relations dictate that a change in absorption (or gain) over a narrow spectral range must be accompanied by a change in refractive index over a similarly narrow region. This rapid and "positive" change in refractive index produces an extremely low group velocity. The first experimental observation of the low group velocity produced by EIT was by Boller, İmamoğlu, and Harris at Stanford University in 1991 in strontium. In 1999 Lene Hau reported slowing light in a medium of ultracold sodium atoms, achieving this by using quantum interference effects responsible for electromagnetically induced transparency (EIT). Her group performed copious research regarding EIT with Stephen E. Harris. "Using detailed numerical simulations, and analytical theory, we study properties of micro-cavities which incorporate materials that exhibit Electro-magnetically Induced Transparency (EIT) or Ultra Slow Light (USL). We find that such systems, while being miniature in size (order wavelength), and integrable, can have some outstanding properties. In particular, they could have lifetimes orders of magnitude longer than other existing systems, and could exhibit non-linear all-optical switching at single photon power levels. Potential applications include miniature atomic clocks, and all-optical quantum information processing." The current record for slow light in an EIT medium is held by Budker, Kimball, Rochester, and Yashchuk at U.C. Berkeley in 1999. Group velocities as low as 8 m/s were measured in a warm thermal rubidium vapor.
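The Kramers-Kronig argument above implies that the group velocity falls as c divided by the group index n + ω·dn/dω, so even a tiny index change across a narrow transparency window slows light dramatically. The sketch below is purely illustrative: the carrier frequency, window width, and index swing are invented numbers, not parameters from any of the experiments cited.

```python
# Illustrative sketch: group velocity from steep normal dispersion across a
# narrow transparency window. All numbers are made-up assumptions chosen only
# to show how a steep dn/domega produces an enormous group index.
import numpy as np

C = 2.998e8                         # vacuum speed of light, m/s
omega0 = 2 * np.pi * 5.1e14         # assumed optical carrier frequency, rad/s
window = 2 * np.pi * 1.0e6          # assumed ~1 MHz-wide transparency window, rad/s

omega = np.linspace(omega0 - window / 2, omega0 + window / 2, 1001)
# Tiny linear index ramp across the window (assumed swing of 1e-4).
n = 1.0 + 1.0e-4 * (omega - omega0) / (window / 2)

group_index = n + omega * np.gradient(n, omega)   # n_g = n + omega * dn/domega
v_group = C / group_index

print(f"group velocity in the window: about {v_group.min():.0f} m/s "
      f"(vacuum: {C:.1e} m/s)")
```

With these made-up numbers the group velocity comes out in the kilometres-per-second range; the experiments cited in the text reached metres per second by combining far narrower resonances with correspondingly steeper dispersion.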
https://en.wikipedia.org/wiki?curid=1244292
1,074,706
544,653
Double cohort brought challenges to the teaching, study and residence spaces at the campus due to increase in first-year enrollment. In response, the Academic Research Centre (ARC) and Joan Foley Hall were constructed. The ARC was built in 2003 as an extension of the Bladen Building with a copper finish. It allowed for the relocation and expansion of the library to its present state and introduced the campus's first 300-seat lecture theatre, which has since held the Watts Lecture series, after formerly being held in the Meeting Place. The Doris McCarthy Gallery, also found in the ARC, exhibits works by local artist and campus alumni, Doris McCarthy. The Student Centre was opened in 2004 through a project that was initiated and funded by students. Constructed using 18 tonnes of recycled steel from a demolished gallery at the Royal Ontario Museum, the three-storey Student Centre earned a Leadership in Energy and Environmental Design (LEED) certification as well as a Green Design Award from the City of Toronto. The Social Sciences Building, home of the Department of Social Sciences, also opened in 2004 as the Management Wing but took its present name after the completion of the Instructional Centre in 2011, which became the new home of the Department of Management, the Department of Computer and Mathematical Sciences, and offices of cooperative education programs. Brick and limestone were used to create the Arts and Administration Building, completed in 2005, which holds the principal's office. The Science Research Building, where post-graduate research facilities and a lecture hall are located, is an extension of the Science Wing that was completed in 2008.
https://en.wikipedia.org/wiki?curid=333619
544,371
487,358
Gifford said that Widlar and Talbert were actually the founders of National Semiconductor, and that Sporck joined them later. The duo started by setting up the epitaxial process at Santa Clara. Once the technology was in place, Widlar concentrated on voltage regulators and by the end of 1966 produced the industry's first integrated linear regulator. The LM100, a revolutionary new circuit, became another flagship product that surpassed expectations for sales and longevity. In 1967 Widlar designed the LM101, an operational amplifier with improved gain, decreased input current, and protection against short circuit. The LM101 featured another unorthodox input stage, employing NPN input transistors emitter coupled to PNP transistors in a common base arrangement. The high reverse breakdown voltage of the PNP transistors allowed the LM101 to withstand a differential input voltage of ±30 V. Its frequency compensation was simpler, more robust and more stable than that of μA709. It was followed by LM101A, a functionally identical IC that pioneered the use of a field-effect transistor to control internal current sources. Widlar's solution minimized die area and current drain, and enabled operation over a wide range of power supply voltages. Later he devised another new device, the super-beta transistor. It was created in silicon by Talbert and integrated in the LM108 precision operational amplifier, which was released in 1969. These high-gain, very-low-voltage devices were capable of operating at very low input currents within the full military range of operating conditions. The items in the linear circuit product line were user friendly, very useful, and very profitable.
https://en.wikipedia.org/wiki?curid=3483624
487,108
916,679
However, the battle was one-sided almost from the beginning. The reasons for this are the subject of continuing study by military strategists and academics. There is general agreement that US technological superiority was a crucial factor but the speed and scale of the Iraqi collapse has also been attributed to poor strategic and tactical leadership and low morale among Iraqi troops, which resulted from a history of incompetent leadership. After devastating initial strikes against Iraqi air defenses and command and control facilities on 17 January 1991, coalition forces achieved total air superiority almost immediately. The Iraqi Air Force was destroyed within a few days, with some planes fleeing to Iran, where they were interned for the duration of the conflict. The overwhelming technological advantages of the US, such as stealth aircraft and infrared sights, quickly turned the air war into a "turkey shoot". The heat signature of any tank which started its engine made an easy target. Air defense radars were quickly destroyed by radar-seeking missiles fired from wild weasel aircraft. Grainy video clips, shot from the nose cameras of missiles as they aimed at impossibly small targets, were a staple of US news coverage and revealed to the world a new kind of war, compared by some to a video game. Over 6 weeks of relentless pounding by planes and helicopters, the Iraqi Army was almost completely beaten but did not retreat, under orders from Iraqi President Saddam Hussein, and by the time the ground forces invaded on 24 February, many Iraqi troops quickly surrendered to forces much smaller than their own; in one instance, Iraqi forces attempted to surrender to a television camera crew that was advancing with coalition forces.
https://en.wikipedia.org/wiki?curid=161323
916,196
1,184,567
In October, the southern trade winds decreased the heat, and clear weather favored astronomical observations. Besides Bellingshausen, Simonov, Lazarev, and Zavadovsky, no one on board had the skills for navigation and for working with the sextant. Thus, taking into consideration the abundance of instruments on board, all officers started to study navigation. On November 2, 1819, at 5 pm, the expedition arrived in Rio, steering by the landmark of Pan de Azucar mountain, an image of which they had in the sailing directions. Since no one from the crew spoke Portuguese, there were some language difficulties. By that time, "Otkrytie" and "Blagonamerennyi" were already in the harbor, since they had not gone to the Canary Islands. On November 3, Consul General of Russia Georg von Langsdorff, who had also been a participant in the first Russian circumnavigation in 1803–1806, met the crew and escorted the officers to the ambassador, major general baron Diederik Tuyll van Serooskerken. The next day, the consul arranged for the astronomers to use a rocky island called Rados, where Simonov, guard-marine Adams and artilleryman Korniliev set up a transit instrument and started to reconcile the chronometers. Generally, Bellingshausen was not fond of the Brazilian capital, mentioning its "disgusting untidiness" and "abominable shops where they sell slaves". On the contrary, Simonov claimed that Rio, with its "meekness of morals, the luxury and courtesy of society and the magnificence of spiritual processions", reminded him of southern European cities. Officers visited the neighborhoods of the city, coffee plantations, and Trizhuk Falls. On November 9, the commanders of both divisions – Bellingshausen, Lazarev, Vasiliev, and Shishmaryov – received an audience with the Portuguese king John VI of Portugal, who at that time resided in Brazil. Before the ships departed, their crews replenished the stores and took for slaughter two bulls, 40 pigs and 20 piglets, several sheep, ducks and hens, along with rum and granulated sugar, lemons, pumpkins, onions, garlic, and other herbs. On November 20, the chronometers were put back on board. On November 22 at 6 am, the expedition headed south.
https://en.wikipedia.org/wiki?curid=40880361
1,183,940
890,032
Nietzsche was frequently associated with the Romantic movement. The assumption is correct inasmuch as many motives of Romantic anti-capitalism — e.g., the struggle against the capitalist division of labour and its consequences for bourgeois culture and morals — played a considerable part in his thinking. The setting up of a past age as an ideal for the present age to realize also belonged to the intellectual armoury of Romantic anti-capitalism. Nietzsche’s activity, however, fell within the period after the proletariat’s first seizure of power, after the Paris Commune. Crisis and dissolution, Romantic anti-capitalism’s development into capitalist apologetics, the fate of Carlyle during and after the 1848 revolution — these already lay far behind Nietzsche in the dusty past. Thus the young Carlyle had contrasted capitalism’s cruelty and inhumanity with the Middle Ages as an epoch of popular prosperity, a happy age for those who laboured; whereas Nietzsche began, as we have noted, by extolling as a model the ancient slave economy. And so the reactionary utopia which Carlyle envisioned after 1848 he also found naive and long outdated. Admittedly the aristocratic bias of both had similar social foundations: in the attempt to ensure the leading social position of the bourgeoisie and to account for that position philosophically. But the different conditions surrounding Nietzsche’s work lent to his aristocratic leanings a fundamentally different content and totally different colouring from that of Romantic anti-capitalism. True, remnants of Romanticism (from Schopenhauer, Richard Wagner) are still palpable in the young Nietzsche. But these he proceeded to overcome as he developed, even if — with regard to the crucially important method of indirect apologetics — he still remained a pupil of Schopenhauer and preserved as his basic concept the irrational one of the Dionysian principle (against reason, for instinct); but not without significant modifications, as we shall see. Hence an increasingly energetic dissociation from Romanticism is perceptible in the course of Nietzsche’s development. While the Romantic he identified more and more passionately with decadence (of the bad kind), the Dionysian became a concept increasingly antithetical to Romanticism, a parallel for the surmounting of decadence and a symbol of the ‘good’ kind of decadence, the kind he approved.
https://en.wikipedia.org/wiki?curid=176051
889,563
32,011
During World War II, von Neumann worked on the Manhattan Project with theoretical physicist Edward Teller, mathematician Stanislaw Ulam and others, problem-solving key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb. He developed the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon and coined the term "kiloton" (of TNT) as a measure of the explosive force generated. During this time and after the war, he consulted for a vast number of organizations including the Office of Scientific Research and Development, the Army's Ballistic Research Laboratory, the Armed Forces Special Weapons Project and the Oak Ridge National Laboratory. At the peak of his influence in the 1950s he was the chair for a number of critical Defense Department committees including the Nuclear Weapons Panel of the Air Force Scientific Advisory Board and the ICBM Scientific Advisory Committee as well as a member of the influential Atomic Energy Commission. He played a key role alongside Bernard Schriever and Trevor Gardner in contributing to the design and development of the United States' first ICBM programs. During this time he was considered the nation's foremost expert on nuclear weaponry and the leading defense scientist at the Pentagon. As a Hungarian émigré, concerned that the Soviets would achieve nuclear superiority, he designed and promoted the policy of mutually assured destruction to limit the arms race.
https://en.wikipedia.org/wiki?curid=15942
32,000
595,452
Biomineralization evolved multiple times, independently, and most animal lineages first expressed biomineralized components in the Cambrian period. Many of the same processes are used in unrelated lineages, which suggests that biomineralization machinery was assembled from pre-existing "off-the-shelf" components already used for other purposes in the organism. Although the biomachinery facilitating biomineralization is complex – involving signalling transmitters, inhibitors, and transcription factors – many elements of this 'toolkit' are shared between phyla as diverse as corals, molluscs, and vertebrates. The shared components tend to perform quite fundamental tasks, such as designating that cells will be used to create the minerals, whereas genes controlling more finely tuned aspects that occur later in the biomineralization process, such as the precise alignment and structure of the crystals produced, tend to be uniquely evolved in different lineages. This suggests that Precambrian organisms were employing the same elements, albeit for a different purpose — perhaps to avoid the inadvertent precipitation of calcium carbonate from the supersaturated Proterozoic oceans. Forms of mucus that are involved in inducing mineralization in most animal lineages appear to have performed such an anticalcifatory function in the ancestral state. Further, certain proteins that would originally have been involved in maintaining calcium concentrations within cells are homologous in all animals, and appear to have been co-opted into biomineralization after the divergence of the animal lineages. The galaxins are one probable example of a gene being co-opted from a different ancestral purpose into controlling biomineralization, in this case being 'switched' to this purpose in the Triassic scleractinian corals; the role performed appears to be functionally identical to that of the unrelated pearlin gene in molluscs. Carbonic anhydrase serves a role in mineralization broadly in the animal kingdom, including in sponges, implying an ancestral role. Far from being a rare trait that evolved a few times and remained stagnant, biomineralization pathways in fact evolved many times and are still evolving rapidly today; even within a single genus it is possible to detect great variation within a single gene family.
https://en.wikipedia.org/wiki?curid=1984187
595,147
1,569,569
Diagnosis is based on observation of behavior and development. Many, especially girls and those who have fewer social difficulties, may have been misdiagnosed with other conditions. Males are diagnosed with ASD four to five times more often than females. The reasons for this remain largely unclear, but current hypotheses include a higher testosterone level in utero, different presentations of symptoms in females (leading to misdiagnosis or underdiagnosis) compared to males, and gender bias. Clinical assessment of children can involve a variety of individuals, including the caregiver(s), the child, and a core team of professionals (pediatricians, child psychiatrists, speech-and-language therapists and clinical/educational psychologists). For adult diagnosis, clinicians identify neurodevelopmental history, behaviors, difficulties in communication, limited interests, and problems in education, employment, and social relationships. Challenging behaviors may be assessed with functional analysis to identify the triggers causing them. The sex and gender disparity in ASD diagnostics requires further research, including the addition of diagnostic specifiers and female-oriented examples, since female presentations may be masked by camouflaging behaviors. Camouflaging is a coping mechanism used in social situations, in which individuals mask their difficulties and present themselves as though they have no communication problems. Because of camouflaging and other societal factors, females with ASD are more likely to be diagnosed late or with a different mental health concern. The female ASD phenotype is generally less noticeable, especially in those who present as "higher functioning" than others with ASD. Lastly, because of the imbalance in sexes participating in ASD studies, the literature is potentially biased towards the ways the condition presents in male individuals.
https://en.wikipedia.org/wiki?curid=35773166
1,568,681
1,655,053
Frankland's analyses were both chemical and bacteriological, and his dissatisfaction with the processes in vogue for the former at the time of his appointment caused him to spend two years in devising new and more accurate methods. In 1859 Frankland passed a night on the very top of Mont Blanc in company with John Tyndall. One of the purposes of the expedition was to discover whether the rate of combustion of a candle varies with the density of the atmosphere in which it is burnt, a question which was answered in the negative. Other observations made by Frankland at the time formed the starting-point of a series of experiments which yielded far-reaching results. He noticed that at the summit the candle gave a very poor light, and was thereby led to investigate the effect produced on luminous flames by varying the pressure of the atmosphere in which they are burning. He found that pressure increases luminosity, so that hydrogen, for example, the flame of which gives no light in normal circumstances, burns with a luminous flame under a pressure of ten or twenty atmospheres, and the inference he drew was that the presence of solid particles is not the only factor that determines the light-giving power of a flame. Further, he showed that the spectrum of a dense ignited gas resembles that of an incandescent liquid or solid, and he traced a gradual change in the spectrum of an incandescent gas under increasing pressure: the sharp lines observable when it is extremely attenuated broaden out into nebulous bands as the pressure rises, until they merge into the continuous spectrum as the gas approaches a density comparable with that of the liquid state. An application of these results to solar physics, made in conjunction with Sir Norman Lockyer, led to the view that at least the external layers of the sun cannot consist of matter in the liquid or solid forms, but must be composed of gases or vapours.
https://en.wikipedia.org/wiki?curid=940583
1,654,121
1,788,163
The Fenians, a secret Irish Catholic militant organization, recruited heavily among Civil War veterans in preparation to invade Canada. The group's goal was to force Britain to grant Ireland its independence. The Fenians counted thousands of members, but they had a confused command structure, competing factions, unfamiliar new weapons, and British agents in their ranks who alerted the Canadians. Their invasion forces were too small and had poor leadership. Several attempts were organized, but they were either canceled at the last minute or failed in a matter of hours. The largest raid took place on May 31-June 2, 1866, when about 1000 Fenians crossed the Niagara River. The Canadians were forewarned, and over 20,000 Canadian militia and British regulars turned out. A few men on each side were killed and the Fenians soon retreated home. The Johnson administration at first quietly tolerated this violation of American neutrality, but, by 1867, dispatched the U.S. Army to prevent further Fenian raids. The Fenians organized a second attack on May 25, 1870, but it was broken up by the United States Marshal for Vermont. London realized that American tolerance for the Fenians showed the strong U.S. displeasure with the British record during the Civil War, and hastened to resolve the Alabama Claims issue. In the long run, the Fenians gave no help to independence movements in Ireland, but they did stimulate a new sense of Canadian nationalism.
https://en.wikipedia.org/wiki?curid=61305749
1,787,157
1,516,814
Rene Sotelo (Caracas, 1961) is one of the most experienced laparoscopic/robotic surgeons in the world. He received his medical degree from the Central University of Venezuela and has been in practice for more than 20 years at prestigious hospitals in Venezuela and Mexico. He is a pioneer in robotic surgery for complex urinary fistulae in females and males, benign prostate enlargement, and inguinal lymph node dissection for cancer. He has helped develop the novel concepts of single-port belly-button and natural orifice surgery. His experience with advanced robotic and laparoscopic surgery exceeds 2,300 personal cases, making him among the most experienced in the world. He has published more than 50 peer-reviewed scientific papers, three textbooks and 28 chapters in major urology books. He also serves on the editorial board of three urologic journals. In recognition of his work, Dr. Sotelo has been invited as an international guest lecturer at more than 35 universities in 19 countries. His unique skills and large personal experience in minimally invasive laparoscopic and robotic surgery are well known. He is a passionate teacher, having taught his techniques and best practices in 19 countries as an invited guest. To date, he has trained over 64 post-graduate fellows from 14 countries in the art and science of minimally invasive urology. It is because of these seminal achievements that Dr. Sotelo was selected to join the University of Southern California Keck School of Medicine. His passion for surgical innovation, advancing the field and worldwide teaching coincides with, and further strengthens, the philosophy at the USC Institute of Urology.
https://en.wikipedia.org/wiki?curid=29302481
1,515,962
133,314
Over the first half of the 19th century, geologists such as Charles Lyell, Adam Sedgwick, and Roderick Murchison applied the new technique to rocks throughout Europe and eastern North America, setting the stage for more detailed, government-funded mapping projects in later decades. Midway through the 19th century, the focus of geology shifted from description and classification to attempts to understand "how" the surface of the Earth had changed. The first comprehensive theories of mountain building were proposed during this period, as were the first modern theories of earthquakes and volcanoes. Louis Agassiz and others established the reality of continent-covering ice ages, and "fluvialists" like Andrew Crombie Ramsay argued that river valleys were formed, over millions of years, by the rivers that flow through them. After the discovery of radioactivity, radiometric dating methods were developed, starting in the 20th century. Alfred Wegener's theory of "continental drift" was widely dismissed when he proposed it in the 1910s, but new data gathered in the 1950s and 1960s led to the theory of plate tectonics, which provided a plausible mechanism for it. Plate tectonics also provided a unified explanation for a wide range of seemingly unrelated geological phenomena. Since 1970 it has served as the unifying principle in geology.
https://en.wikipedia.org/wiki?curid=14400
133,261
1,945,427
Although genomic phylostratigraphy is now used frequently by the scientific community, it has also been criticized as too inaccurate for its measurements to be trustworthy. First, according to some authors, the method's assumptions lack precision. It is erroneous to assume, for example, that all species beyond the organism of focus share the same protein evolutionary rate; this rate varies, depending among other things on cell-cycle speeds, which makes it difficult to set limits on BLAST error that cover all proteins originating from the same founder gene. Another point is that the BLAST search assumes that a protein's evolutionary rate is constant across all of its sites, which is also false. Lastly, the model does not account correctly for gene duplications or gene losses: changes in evolutionary rate caused by gene duplications that acquire new functions would increase BLAST error rates, and gene loss in taxa distant from the one studied could lead to large underestimates of the calculated gene age and of the phylostratum of founder genes compared with their true values. Rather than demanding that the method simply be abandoned, however, critics have tried to refine it from its original state by introducing alternative mathematical formulations and sequence-searching tools, although the Ruđer Bošković Institute has replied to such criticism claiming that its original approach was valid and did not need to be extensively revised. This debate is also part of the wider discussion on the importance of de novo gene births in creating genetic diversity, in which genomic phylostratigraphy suggests that they have a strong effect; the method can only be widely accepted or refuted once that broader question has been resolved.
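As a rough illustration of the kind of procedure these criticisms target, the following Python sketch (with purely hypothetical species lists, hit data and e-value cutoff, not taken from any published pipeline) dates each gene of a focal species to the most inclusive clade that contains a significant BLAST hit; the comments mark where the assumptions questioned above, uniform evolutionary rates and the absence of any correction for gene loss, enter the calculation.

from typing import Dict, List, Tuple

# Phylostrata ordered from youngest (species-specific) to oldest (most inclusive).
PHYLOSTRATA: List[str] = [
    "Homo sapiens", "Hominidae", "Mammalia", "Vertebrata", "Metazoa", "Eukaryota",
]

# Illustrative mapping of hit species to the phylostratum in which they fall.
SPECIES_TO_STRATUM: Dict[str, str] = {
    "Pan troglodytes": "Hominidae",
    "Mus musculus": "Mammalia",
    "Danio rerio": "Vertebrata",
    "Drosophila melanogaster": "Metazoa",
    "Saccharomyces cerevisiae": "Eukaryota",
}

def assign_phylostratum(hits: List[Tuple[str, float]], evalue_cutoff: float = 1e-3) -> str:
    """Date one gene to the oldest phylostratum containing a significant BLAST hit."""
    oldest_rank = 0  # rank 0: no significant hits outside the focal species
    for species, evalue in hits:
        if evalue > evalue_cutoff:
            # A single fixed cutoff implicitly treats evolutionary rates as uniform
            # across lineages and sites, one of the criticised assumptions.
            continue
        stratum = SPECIES_TO_STRATUM.get(species)
        if stratum is None:
            continue
        oldest_rank = max(oldest_rank, PHYLOSTRATA.index(stratum))
    # Gene loss in distant taxa simply removes hits, so the gene silently looks
    # younger than it really is; nothing here corrects for that.
    return PHYLOSTRATA[oldest_rank]

# Example: significant hits in mouse and fly date the gene to the Metazoa stratum.
print(assign_phylostratum([("Mus musculus", 1e-40), ("Drosophila melanogaster", 2e-5)]))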
https://en.wikipedia.org/wiki?curid=13106733
1,944,315
217,411
Research began in 1922 in western Montana, in the Bitterroot Valley around Hamilton, Montana, after the Governor's daughter and his son-in-law died of the fever. However, prior to that, in 1917, Dr. Lumford Fricks introduced herds of sheep into the Bitterroot Valley. His hypothesis was that the sheep would eat the tall grasses where ticks lived and bred. Past Assistant Surgeon R.R. Spencer of the Hygienic Laboratory of the U.S. Public Health Service was ordered to the region, and he led a research team at an abandoned schoolhouse through about 1924. Spencer was assisted by R. R. Parker, Bill Gettinger, Henry Cowan, Henry Greenup, Elmer Greenup, Gene Hughes, Salsbury, Kerlee, and others, of whom Gettinger, Cowan and Kerlee died of Rocky Mountain spotted fever. Through a series of discoveries, the team found that a previous blood meal was necessary to make the tick deadly to its hosts, as well as other facets of the disease. On May 19, 1924, Spencer put a large dose of mashed wood ticks, from lot 2351B, and some weak carbolic acid into his arm by injection. This vaccine worked, and for some years after it was used by people in that region to convert the illness from one with high fatality rate (albeit low incidence) to one that could be either prevented entirely (for many of them) or modified to a non-deadly form (for the rest). Today there is no commercially available vaccine for RMSF because, unlike in the 1920s when Spencer and colleagues developed one, antibiotics are now available to treat the disease, so prevention by vaccination is no longer the sole defense against likely death.
https://en.wikipedia.org/wiki?curid=83461
217,303
1,010,617
Jeffreys's DNA method was first put to use in 1985, when he was asked to help in a disputed immigration case by confirming the identity of a British boy whose family was originally from Ghana. The case was resolved when the DNA results proved that the boy was closely related to the other members of the family, and Jeffreys saw the relief on the mother's face when she heard the results. DNA fingerprinting was first used in a police forensic test to identify the killer of two teenagers, Lynda Mann and Dawn Ashworth, who had been raped and murdered in Narborough, Leicestershire, in 1983 and 1986 respectively. Colin Pitchfork was identified and convicted of their murders after samples taken from him matched semen samples taken from the two dead girls. This turned out to be a particularly important identification; British authorities believe that without it an innocent man would inevitably have been convicted. Not only did Jeffreys's work in this case prove who the real killer was, it also exonerated Richard Buckland, initially a prime suspect, who would likely have spent his life in prison otherwise. The story behind the investigations is told in Joseph Wambaugh's 1989 best-selling book "The Blooding: The True Story of the Narborough Village Murders", and the murders and the subsequent solving of the crimes were featured in episode 4 of the first season of the 1996 American TV series "Medical Detectives", in which Jeffreys himself also appears. A further television mini-series based on these events, "Code of a Killer", was released in 2015. In 1992, Jeffreys's methods were used to confirm for German prosecutors the identity of the body of Josef Mengele, who had died in 1979, by comparing DNA obtained from a femur bone of his exhumed skeleton with DNA from his mother and son, in a similar way to paternity testing.
https://en.wikipedia.org/wiki?curid=44292
1,010,096
1,042,484
Dr Theodor Stiebel founded the "ELTRON Dr. Theodor Stiebel" company at Reichenberger Strasse 143 in Berlin's Kreuzberg district with a base capital of 20,000 Reichsmark on 5 May 1924. According to the Commercial Register, the company began operating on that very day. Dr Stiebel was loaned the capital by his uncles, Hermann Stiebel, who ran a hotel in Hamburg, and Carl Reese, who owned a metalworking business (a canning factory) in Holzminden. With his patented invention of the first coil immersion heater, marvelled at by visitors to the 1924 Spring Trade Fair in Leipzig for its rapid heat-up time and short cooling period, Dr Stiebel laid the foundations for the large-scale production that would begin one year later. The TLn coil immersion heater was initially manufactured by 10 employees working in two leased buildings, at Reichenberger Strasse 143 and Oppelner Strasse 34 in Berlin. On 3 March 1925, production was moved to the third floor of Reichenberger Strasse 160. Initially, only immersion heaters under the "Eltro" brand were manufactured here, with an annual output of up to 60,000 units. In 1927, the company had 26 employees and generated an annual turnover of 184,745 Reichsmark. Its immersion heaters retailed for approximately three Reichsmark at the time. Additional space was leased in the building on Reichenberger Strasse from 1927 to 1932, and the first immersion heaters were exported to Australia, India, China and South America. The first foreign subsidiary began operating in London in 1927, and a branch was opened in Zurich in 1929. The first small two-stage 1,000-watt instantaneous water heaters with a porcelain casing went into production in 1928, with a daily output of 100 units. The instantaneous water cylinder was developed in 1931. It had a 3-litre capacity, and water heating was controlled by thermostats with two heating elements, each with a 500-watt output.
https://en.wikipedia.org/wiki?curid=7497784
1,041,941
860,500
The technologies referred to as Solid Freeform Fabrication are what we recognize today as rapid prototyping, 3D printing or additive manufacturing. Swainson (1977) and Schwerzel (1984) worked on polymerization of a photosensitive polymer at the intersection of two computer-controlled laser beams. Ciraud (1972) considered magnetostatic or electrostatic deposition with electron beam, laser or plasma for sintered surface cladding. These were all proposed, but it is unknown whether working machines were built. Hideo Kodama of the Nagoya Municipal Industrial Research Institute was the first to publish an account of a solid model fabricated using a photopolymer rapid prototyping system (1981). The very first 3D rapid prototyping system relying on Fused Deposition Modeling (FDM) was made in April 1992 by Stratasys, though the patent did not issue until June 9, 1992. Sanders Prototype, Inc introduced the first desktop inkjet 3D printer (3DP), the Modelmaker 6Pro, in late 1993, based on an invention of August 4, 1992 (Helinski), and then the larger industrial 3D printer, the Modelmaker 2, in 1997. Z-Corp's system, using the MIT 3DP powder-binding process for Direct Shell Casting (DSP), invented in 1993, was introduced to the market in 1995. Even at that early date the technology was seen as having a place in manufacturing practice. A low-resolution, low-strength output had value in design verification, mold making, production jigs and other areas. Outputs have steadily advanced toward higher-specification uses. Sanders Prototype, Inc. (Solidscape) started as a rapid prototyping 3D printer manufacturer with the Modelmaker 6Pro, which made sacrificial thermoplastic patterns of CAD models using drop-on-demand (DOD) single-nozzle inkjet technology.
https://en.wikipedia.org/wiki?curid=10579736
860,042