Thus, chemostratigraphy generally provides two useful types of information to the larger geological community. First, chemostratigraphy can be used to investigate environmental change on the local, regional, and global levels by relating variations in rock chemistry to changes in the environment in which the sediment was deposited. An extreme example of this type of investigation is the worldwide discovery of iridium-rich strata near the boundary between the Cretaceous and Tertiary systems. The high concentration of iridium, which is generally rare in the Earth's crust, is indicative of a large delivery of extraterrestrial material, presumably from a large asteroid impactor at this time. A more prosaic example of chemostratigraphic reconstruction of past conditions is the use of the carbon-13/carbon-12 ratio over geologic time as a proxy for changes in carbon-cycle processes at different stages of biological evolution. Second, regionally or globally correlatable chemostratigraphic signals can be found in rocks whose formation time is well constrained by radionuclide dating, either of the strata themselves or of strata easily correlated with them, such as a volcanic suite that interrupts nearby strata. Many sedimentary rocks, however, are much harder to date, because they lack minerals with high concentrations of radionuclides and cannot be correlated with readily datable sequences. Yet many of these rocks do possess chemostratigraphic signals. Correlating chemostratigraphic signals between conventionally datable and non-datable sequences has therefore greatly extended our understanding of the history of tectonically quiescent regions and of the organisms that lived in them. Chemostratigraphy has also acted as a check on other sub-fields of stratigraphy, such as biostratigraphy and magnetostratigraphy.
https://en.wikipedia.org/wiki?curid=4240819
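For reference, the carbon-13/carbon-12 proxy mentioned above is conventionally reported as δ13C, the per mil deviation of a sample's isotope ratio from a reference standard. The following is the standard definition (the choice of reference standard, usually VPDB, is a convention not stated in the passage itself):

```latex
\delta^{13}\mathrm{C} =
\left(
  \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{sample}}}
       {\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{standard}}}
  - 1
\right) \times 1000 \quad \text{(in per mil, ‰)}
```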
It has been argued that sound scientific support is lacking for the claim that the specified purely bedside tests have the power to diagnose true and total death of the brainstem, the necessary condition for the assumption of permanent loss of the intrinsically untestable consciousness-arousal function of those elements of the reticular formation which lie within the brainstem (there are elements also within the higher brain). Knowledge of this arousal system is based upon the findings from animal experiments as illuminated by pathological studies in humans. The current neurological consensus is that the arousal of consciousness depends upon reticular components which reside in the midbrain, diencephalon and pons. It is said that the midbrain reticular formation may be viewed as a driving centre for the higher structures, loss of which produces a state in which the cortex appears, on the basis of electroencephalographic (EEG) studies, to be awaiting the command or ability to function. The role of diencephalic (higher brain) involvement is stated to be uncertain and we are reminded that the arousal system is best regarded as a physiological rather than a precise anatomical entity. There should, perhaps, also be a caveat about possible arousal mechanisms involving the first and second cranial nerves (serving sight and smell) which are not tested when diagnosing brainstem death but which were described in cats in 1935 and 1938. In humans, light flashes have been observed to disturb the sleep-like EEG activity persisting after the loss of all brainstem reflexes and of spontaneous respiration.
https://en.wikipedia.org/wiki?curid=26700042
Indeed, even though Britain was far better off than the badly battered Continent, economic stagnation lasted the whole decade. Overall growth averaged 1.8% per annum during the 1920s, slightly weaker than, but comparable to, that of the United States. Slow growth was due in part to Britain's heavy dependence on exports, and world trade grew sluggishly through the 1920s. The economy was also overly dependent on so-called "staple" industries, which had brought huge prosperity in the 19th century but by the 1920s were experiencing faltering demand and strong competition from abroad. In 1922, for example, the volume of cotton exports was only about half of what it had been in 1913, while coal exports were only one third of their 1913 levels. The most skilled craftsmen were especially hard hit, because there were few alternative uses for their specialised skills. In depressed areas the main social indicators, such as poor health, bad housing, and long-term mass unemployment, pointed at best to terminal social and economic stagnation, or even a downward spiral. The heavy dependence on obsolescent heavy industry and mining was a central problem, and no one offered workable solutions. The despair reflected what Finlay (1994) describes as a widespread sense of hopelessness that prepared local business and political leaders to accept a new orthodoxy of centralised government economic planning when it arrived during the Second World War.
https://en.wikipedia.org/wiki?curid=33643110
The zamindari system was one of two principal revenue settlements undertaken by the Company in India. In southern India, Thomas Munro, who would later become Governor of Madras, promoted the "ryotwari" system, in which the government settled land-revenue directly with the peasant farmers, or "ryots". This was, in part, a consequence of the turmoil of the Anglo-Mysore Wars, which had prevented the emergence of a class of large landowners; in addition, Munro and others felt that "ryotwari" was closer to traditional practice in the region and ideologically more progressive, allowing the benefits of Company rule to reach the lowest levels of rural society. At the heart of the "ryotwari" system was a particular theory of economic rent, based on David Ricardo's Law of Rent, promoted by the utilitarian James Mill, who formulated Indian revenue policy between 1819 and 1830. "He believed that the government was the ultimate lord of the soil and should not renounce its right to 'rent', "i.e." the profit left over on richer soil when wages and other working expenses had been settled." Another keystone of the new system of temporary settlements was the classification of agricultural fields according to soil type and produce, with average rent rates fixed for the period of the settlement. According to Mill, taxation of land rent would promote efficient agriculture and simultaneously prevent the emergence of a "parasitic landlord class." Mill advocated "ryotwari" settlements, which consisted of government measurement and assessment of each plot (valid for 20 or 30 years) and subsequent taxation that depended on the fertility of the soil. The taxed amount was nine-tenths of the "rent" in the early 19th century and gradually fell afterwards. However, in spite of the appeal of the "ryotwari" system's abstract principles, class hierarchies in southern Indian villages had not entirely disappeared—for example village headmen continued to hold sway—and peasant cultivators sometimes came to experience revenue demands they could not meet. In the 1850s, a scandal erupted when it was discovered that some Indian revenue agents of the Company were using torture to meet the Company's revenue demands.
https://en.wikipedia.org/wiki?curid=21670823
The F-16XL featured a novel 'cranked-arrow' type of delta wing with more than twice the area of the standard F-16 wing. Developed under a program originally known as the Supersonic Cruise and Maneuvering Program (SCAMP), the design was intended to offer low drag at high subsonic or supersonic speeds without compromising low-speed maneuverability. As a result, the F-16XL was able to cruise efficiently at supersonic speeds without using afterburner, commonly known as supercruise. In late 1980, the USAF agreed to provide GD with the third and fifth FSD F-16s for modification into single-seat and twin-seat F-16XL prototypes. To accommodate the larger wing, the aircraft was lengthened 56 in (142 cm) by the addition of a 30-inch (76 cm) plug in the forward fuselage and a 26-inch (66 cm) section to the aft fuselage just behind the landing gear bulkhead. The rear fuselage was also canted up by three degrees to increase the angle of attack on takeoff and landing. The F-16XL could carry twice the payload of the F-16 on 27 hardpoints, and it had a 40% greater range due to an 82% increase in internal fuel carriage. The single-seat F-16XL first flew on 3 July 1982, followed by the two-seater on 29 October 1982. The F-16XL competed unsuccessfully with the F-15E Strike Eagle in the Enhanced Tactical Fighter (ETF) program; if it had won the competition, the production versions were to have been designated F-16E/F. Following the February 1984 selection announcement, both examples of the F-16XL were placed in flyable storage.
https://en.wikipedia.org/wiki?curid=18895385
Following Patton's resignation, Woodrow Wilson, an alumnus and popular professor, was elected the 13th president of the university. Noticing falling academic standards, Wilson orchestrated significant changes to the curriculum: freshmen and sophomores followed a unified curriculum, while juniors and seniors concentrated their study in one discipline. Ambitious seniors were allowed to undertake independent work, a practice that would eventually shape Princeton's future emphasis on independent study. Wilson further reformed the educational system by introducing the preceptorial system in 1905, a then-unique concept in the United States that augmented the standard lecture method of teaching with a more personal form in which small groups of students, or precepts, could interact with a single instructor, or preceptor, in their field of interest. The changes brought in many new faculty and cemented Princeton's academics for the first half of the 20th century. Due to the tightening of academic standards, enrollment declined severely until 1907. In 1906, the reservoir Lake Carnegie was created by Andrew Carnegie, and the university officially became nonsectarian. Before leaving office, Wilson strengthened the science program to focus on "pure" research and broke the Presbyterian lock on the board of trustees. However, he failed to win support for a permanent location for the Graduate School and for the elimination of the eating clubs, which he proposed replacing with quadrangles, a precursor to the residential college system. Wilson also kept Princeton closed to Black students. When an aspiring Black student wrote to him, Wilson had his secretary reply telling the student to attend a university where he would be more welcome.
https://en.wikipedia.org/wiki?curid=23922
A major concern in achieving the 2030 Agenda for Sustainable Development is the lack of representation for ecosystem health. The effects of a healthy ecosystem, particularly an ocean ecosystem, can trickle down and fulfill several initiatives within various SDG targets. A healthy, diverse ecosystem, particularly in coastal areas, can provide a variety of essential goods and services that benefit human wellbeing. For example, bioactive compounds found in marine flora and fauna, and the pharmaceuticals and nutraceuticals derived from them, are all important for human health. Properly conserving and sustainably using the ocean's resources may require rebuilding marine life-support systems. Past conservation efforts suggest that recovery of the abundance, structure, and function of marine life is not out of reach, assuming the effects of climate change are controlled. It has been suggested that putting ocean ecosystem health at the forefront of our concerns can help us achieve other goals across several SDGs. It is important to note, however, that achieving short-term objectives does not necessarily imply long-term sustainability. For example, short-term objectives such as easing hunger in developing nations can be undermined by large-scale concerns such as poor ocean health. Ensuring that the resources provided by the oceans are responsibly maintained could contribute towards Sustainable Development Goals 1, 2, 3, 5, 6, and more, especially for nations that are highly dependent on the ocean's resources. It is also important that international environmental laws are adjusted to accommodate marine environmental protection targets, in order to foster interconnections between ecosystems such as oceans, climates, and terrestrial environments.
https://en.wikipedia.org/wiki?curid=64523160
Dr. Hugenholtz and colleagues from the University of Queensland in Australia completed shotgun sequencing of lyophilized cells of a "V. chlorellavorus" strain in culture with "Chlorella vulgaris". Subsequently, Soo and Hugenholtz's team performed a genomic reconstruction in 2014 from a culture deposited in the NCBI collection in 1978 and were able to produce a general metabolic reconstruction of the genome. They found that "V. chlorellavorus" uses a type IV secretion system (T4SS), similar to that of "Agrobacterium tumefaciens", for host invasion, which is conserved in all three copies of the "V. chlorellavorus" genome. To locate its prey, "V. chlorellavorus" appears to be equipped with possible genes for aerotaxis and a light-activated kinase (moving towards light), suggesting that it might be motile, as was originally thought. To digest its algal prey, "V. chlorellavorus" has over 100 hydrolytic enzymes, including proteases and peptidases. From the results of Soo and her team's genomic analysis, the "V. chlorellavorus" assembly comprises approximately 26 contigs totalling 2.91 Mbp, with an average GC content of 51.4%, and includes 2 circular plasmids. In keeping with its description as a non-photosynthetic, parasitic microorganism, "V. chlorellavorus" does not have its own genes for photosynthesis or carbon fixation. It is, however, capable of synthesizing its own nucleotides, certain cofactors and vitamins, and 15 different amino acids. Its genome also encodes a complete glycolysis pathway as well as an electron transport chain.
https://en.wikipedia.org/wiki?curid=46543992
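As a minimal illustration of one of the assembly statistics quoted above, GC content is simply the fraction of G and C bases across the assembled contigs. The sketch below computes it for placeholder sequences (these are invented strings, not "V. chlorellavorus" data):

```python
# Minimal sketch: average GC content across assembled contigs.
# The contig sequences are invented placeholders, not real data.

def gc_content(contigs):
    """Return the overall GC fraction across all contigs."""
    total = sum(len(c) for c in contigs)
    gc = sum(c.upper().count("G") + c.upper().count("C") for c in contigs)
    return gc / total if total else 0.0

contigs = ["ATGCGCGTATTAGC", "GGCCATTACGCGAA"]
print(f"GC content: {gc_content(contigs):.1%}")
```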
On 17 December 2021, Jackson explained that some of the implications for New Zealand of the arrival of the Omicron variant included the need to bring boosters of the vaccine forward and possibly a review of the opening of the borders scheduled for January 2022, which could affect overseas travel plans. He said that the goal should be to keep the variant out of the country for as long as possible. Changes made by the New Zealand Government late in December 2021 to reduce the time for getting boosters, allowing for the vaccination of children and extending the date for quarantine-free entry for New Zealanders returning from Australia, were seen by Jackson as significant in managing the possibility of an Omicron outbreak in the country. He told Radio New Zealand that during the pandemic, every decision made by a Government was about balancing the risk to health and the risk to the economy, but New Zealand was in a good position because the population was recently vaccinated, it was summer and schools were on holiday. He later said that the holiday period for New Zealanders still required caution, despite high rates of vaccination in most areas. By the end of December 2021, after the first border-related case of Omicron was detected as having been active in the New Zealand community, Jackson said that it could be an historical case, but was likely to have been caught in managed quarantine, and the Government should be "very seriously considering requiring a rapid antigen test before people board a plane to NZ. I don't think a PCR within 72 hours works." Jackson expressed confidence in New Zealand's response to Omicron in February 2022 due to the high levels of vaccination, but expressed concern for those who were still unvaccinated. He noted that while Omicron was likely to be less severe than Delta, unvaccinated people were very vulnerable in terms of "mortality and hospitalisations."
https://en.wikipedia.org/wiki?curid=69753744
The Demeter Association also recommends that the individual design of the land "by the farmer, as determined by site conditions, is one of the basic tenets of biodynamic agriculture. This principle emphasizes that humans have a responsibility for the development of their ecological and social environment which goes beyond economic aims and the principles of descriptive ecology." Crops, livestock, and farmer, together with "the entire socioeconomic environment", form a unique interaction, which biodynamic farming tries to "actively shape ... through a variety of management practices. The prime objective is always to encourage healthy conditions for life": soil fertility, plant and animal health, and product quality. "The farmer seeks to enhance and support the forces of nature that lead to healthy crops, and rejects farm management practices that damage the environment, soil, plant, animal or human health ... the farm is conceived of as an organism, a self-contained entity with its own individuality," holistically conceived and self-sustaining. "Disease and insect control are addressed through botanical species diversity, predator habitat, balanced crop nutrition, and attention to light penetration and airflow. Weed control emphasizes prevention, including timing of planting, mulching, and identifying and avoiding the spread of invasive weed species."
https://en.wikipedia.org/wiki?curid=445231
The C6N originated from a 1942 Imperial Japanese Navy specification for a carrier-based reconnaissance plane with a top speed of 350 knots (650 km/h) at 6,000 m and a range of 2,500 nautical miles (4,960 km). Nakajima's initial proposal, designated N-50, was for a craft with two engines housed in tandem in the fuselage, driving two propellers mounted on the wings. With the development of the new Nakajima Homare engine, the dual-powerplant configuration was abandoned and Nakajima decided on a more conventional single-engine layout. Unfortunately, the new Homare's power output was less than expected, and the design had to be optimized in other areas. The resulting aircraft was designed around a long and extremely narrow cylindrical fuselage just large enough in diameter to accommodate the engine. The crew of three sat in tandem under a single canopy, while equipment was similarly arranged in a line along the fuselage. The C6N's low-mounted laminar-flow wing housed fuel tanks and was fitted with both Fowler and slit flaps and leading-edge slats, which lowered the aircraft's landing speed to ease use aboard aircraft carriers. Like that of Nakajima's earlier B6N "Tenzan" torpedo bomber, the vertical stabilizer was angled slightly forward to enable tighter packing on aircraft carrier decks.
https://en.wikipedia.org/wiki?curid=1160838
In the typical PAC spectrometer, a setup of four 90° and 180° planar arrayed detectors or six octahedrally arrayed detectors is placed around the radioactive source sample. The detectors used are scintillation crystals of BaF2 or NaI; modern instruments today mainly use LaBr3:Ce or CeBr3. Photomultipliers convert the weak flashes of light generated in the scintillator by gamma radiation into electrical signals. In classical instruments, these signals are amplified, processed in logical AND/OR circuits in combination with time windows, assigned to the different detector combinations (for 4 detectors: 12, 13, 14, 21, 23, 24, 31, 32, 34, 41, 42, 43), and counted. Modern digital spectrometers use digitizer cards that take the signal directly, convert it into energy and time values, and store these on hard drives. The stored data are then searched by software for coincidences. Whereas in classical instruments "windows" limiting the respective γ-energies must be set before processing, this is not necessary for digital PAC during the recording of the measurement; the analysis only takes place in a second step. In the case of probes with complex cascades, this makes it possible to perform data optimization or to evaluate several cascades in parallel, as well as to measure different probes simultaneously. The resulting data volumes can be between 60 and 300 GB per measurement.
https://en.wikipedia.org/wiki?curid=62421802
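A minimal sketch of the software coincidence search described above, assuming a simplified event format of (detector, timestamp, energy) tuples; real digitizer data, energy windowing, and the start/stop gamma distinction are considerably more involved:

```python
# Minimal sketch of a software coincidence search over digitized events.
# Events are (detector_id, time_ns, energy_keV); all values are invented.
from itertools import permutations

# For 4 detectors there are 12 ordered start/stop combinations:
# 12, 13, 14, 21, 23, 24, 31, 32, 34, 41, 42, 43
PAIRS = list(permutations([1, 2, 3, 4], 2))
WINDOW_NS = 1000.0  # coincidence time window (assumed value)

events = [
    (1, 100.0, 171.0),   # 'start' gamma
    (3, 152.3, 245.0),   # later gamma within the window -> coincidence
    (4, 5000.0, 245.0),  # outside the window relative to the first event
]

def find_coincidences(events, window):
    """Pair each gamma with any later gamma in a different detector
    that arrives within the time window."""
    hits = []
    for i, (d1, t1, _) in enumerate(events):
        for d2, t2, _ in events[i + 1:]:
            if (d1, d2) in PAIRS and 0 < t2 - t1 <= window:
                hits.append(((d1, d2), t2 - t1))
    return hits

for pair, dt in find_coincidences(events, WINDOW_NS):
    print(f"detector combination {pair}: delay {dt:.1f} ns")
```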
Labor governments in Britain and Australia were disastrous failures. In the United States, the New Deal liberalism of President Franklin D. Roosevelt won mass support and deprived socialists of any chance of gaining ground. In Germany, it was the fascists of Adolf Hitler's Nazi Party who successfully exploited the Depression to win power, in January 1933. Hitler's regime swiftly destroyed both the German Communist Party and the Social Democratic Party, the worst blow the world socialist movement had ever suffered. This forced Stalin to reassess his strategy, and from 1935 the Comintern began urging a popular front against fascism. The socialist parties were at first suspicious, given the bitter hostility of the 1920s, but eventually effective Popular Fronts were formed in both France and Spain. After the election of a Popular Front government in Spain in 1936 a fascist military revolt led to the Spanish Civil War. The crisis in Spain also brought down the Popular Front government in France under Léon Blum. Ultimately the Popular Fronts were not able to prevent the spread of fascism or the aggressive plans of the fascist powers. Trotskyists considered Popular Fronts a "strike breaking conspiracy" and considered them an impediment to successful resistance to fascism. When Stalin consolidated his power in the Soviet Union in the late 1920s, Trotsky was forced into exile, eventually residing in Mexico. He remained active in organising the Left Opposition internationally, which worked within the Comintern to gain new members. Some leaders of the Communist Parties sided with Trotsky, such as James P. Cannon in the United States. They found themselves expelled by the Stalinist Parties and persecuted by both GPU agents and the political police in Britain, France, the United States, China, and all over the world. Trotskyist parties had a large influence in Sri Lanka and Bolivia.
https://en.wikipedia.org/wiki?curid=47246185
Just authored two books, "Basic Methods for Experiments on Eggs of Marine Animals" (1939) and "The Biology of the Cell Surface" (1939), and he also published at least seventy papers in the areas of cytology, fertilization and early embryonic development. He discovered what is known as the fast block to polyspermy; he further elucidated the slow block, which had been discovered by Fol in the 1870s; and he showed that the adhesive properties of the cells of the early embryo are surface phenomena exquisitely dependent on developmental stage. He believed that the conditions used for experiments in the laboratory should closely match those in nature; in this sense, he can be considered to have been an early ecological developmental biologist. His work on experimental parthenogenesis informed Johannes Holtfreter's concept of "autoinduction" which, in turn, has broadly influenced modern evolutionary and developmental biology. His investigation of the movement of water into and out of living egg cells (all the while maintaining their full developmental potential) gave insights into internal cellular structure that is now being more fully elucidated using powerful biophysical tools and computational methods. These experiments anticipated the non-invasive imaging of live cells that is being developed today. Although Just's experimental work showed an important role for the cell surface and the layer below it, the "ectoplasm," in development, it was largely and unfortunately ignored. This was true even with respect to scientists who emphasized the cell surface in their work. It was especially true of the Americans; with the Europeans, he fared somewhat better.
https://en.wikipedia.org/wiki?curid=7575460
Probability is used to characterize the degree of belief in the truth of an assertion. The basis for such a belief can be a physical system that produces outcomes at a rate that is uniform over time, such as a gaming device like a roulette wheel or a die. With such a system, the observer does not influence the outcome; a fair six-sided die that is rolled enough times will land on each of its sides 1/6th of the time. An assertion of a probability based on a physical system is easily tested with sufficient randomized experimentation. Conversely, the basis for a high degree of belief in an asserted claim may be a personally held perspective that cannot be tested. This does not mean that the assertion is any less true than one that can be tested. As an example, one might truthfully assert that "if I eat a banana there is a high probability that it will make me nauseous" based upon experience unknown to anyone but oneself. It is difficult to test such assertions, which are instead evaluated through collateral evidence of plausibility and analogy, often based on similar personal experience. In forensic settings, assertions of belief are often characterized as probabilities, that is, "what is most likely" for a given set of facts. For circumstances in which a variety of conditions exist that may modify or "condition" the probability of a particular outcome or scenario, a method of quantifying the relationship between the modifying conditions and the probability of the outcome employs Bayesian reasoning, named for the Bayes' Theorem (or Law) upon which the approach is based. Most simply stated, Bayes' Law allows for a more precise quantification of the uncertainty in a given probability. As applied in a forensic setting, Bayes' Law tells us what we want to know given what we do know. Although Bayes' Law is known in forensic sciences primarily for its application to DNA evidence, a number of authors have described the use of Bayesian reasoning for other applications in forensic medicine, including identification and age estimation.
https://en.wikipedia.org/wiki?curid=50123287
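For reference, Bayes' Theorem itself, which the passage invokes but does not write out, relates the posterior probability of a hypothesis H given evidence E to the prior and the likelihood; the odds form with the likelihood ratio is the version most often quoted for DNA evidence:

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},
\qquad
\frac{P(H \mid E)}{P(\bar{H} \mid E)}
= \frac{P(E \mid H)}{P(E \mid \bar{H})}
  \times \frac{P(H)}{P(\bar{H})}
```

where the three factors in the second equation are the posterior odds, the likelihood ratio, and the prior odds, respectively.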
Research indicates that technological superiority and higher land productivity had significant positive effects on population density but insignificant effects on the standard of living during the time period 1–1500 AD. In addition, scholars have reported on the lack of a significant trend of wages in various places over the world for very long stretches of time. In Babylonia during the period 1800 to 1600 BC, for example, the daily wage for a common laborer was enough to buy about 15 pounds of wheat. In Classical Athens in about 328 BC, the corresponding wage could buy about 24 pounds of wheat. In England in 1800 AD the wage was about 13 pounds of wheat. In spite of the technological developments across these societies, the daily wage hardly varied. In Britain between 1200 and 1800, only relatively minor fluctuations from the mean (less than a factor of two) in real wages occurred. Following depopulation by the Black Death and other epidemics, real income in Britain peaked around 1450–1500 and began declining until the British Agricultural Revolution. Historian Walter Scheidel posits that waves of plague following the initial outbreak of the Black Death throughout Europe had a leveling effect that changed the ratio of land to labor, reducing the value of the former while boosting that of the latter, which lowered economic inequality by making employers and landowners less well off while improving the economic prospects and living standards of workers. He says that "the observed improvement in living standards of the laboring population was rooted in the suffering and premature death of tens of millions over the course of several generations." This leveling effect was reversed by a "demographic recovery that resulted in renewed population pressure."
https://en.wikipedia.org/wiki?curid=1454728
Another early theory of geomorphology was devised by the Song dynasty Chinese scientist and statesman Shen Kuo (1031–1095). It was based on his observation of marine fossil shells in a geological stratum of a mountain hundreds of miles from the Pacific Ocean. Noticing bivalve shells running in a horizontal span along the cut section of a cliffside, he theorized that the cliff was once the pre-historic location of a seashore that had shifted hundreds of miles over the centuries. He inferred that the land was reshaped and formed by soil erosion of the mountains and by deposition of silt, after observing strange natural erosion of the Taihang Mountains and the Yandang Mountain near Wenzhou. Furthermore, he promoted the theory of gradual climate change over centuries after ancient petrified bamboos were found preserved underground in the dry northern climate zone of "Yanzhou", which is modern-day Yan'an, Shaanxi province. Previous Chinese authors had also presented ideas about changing landforms. The scholar-official Du Yu (222–285) of the Western Jin dynasty predicted that two monumental stelae recording his achievements, one buried at the foot of a mountain and the other erected at the top, would eventually change their relative positions over time, as would hills and valleys. The Daoist alchemist Ge Hong (284–364) created a fictional dialogue in which the immortal Magu explained that the territory of the East China Sea was once a land filled with mulberry trees.
https://en.wikipedia.org/wiki?curid=78534
2013: Mercedes updated Pre-Safe on the W222 S-Class as Pre-Safe Plus with cross-traffic assist. Pre-Safe with pedestrian detection and City Brake function uses a combination of stereo camera and radar sensors to detect pedestrians in front of the vehicle. Visual and acoustic warnings are triggered when a hazard is spotted. If the driver then reacts by braking, the braking power is boosted as the situation requires, up to a full brake application. Should the driver fail to react, Pre-Safe Brake triggers autonomous vehicle braking. Pedestrian detection is active up to about , and is able to reduce collisions with pedestrians autonomously from an initial speed of up to . A radar sensor in the rear bumper monitors the traffic behind the vehicle. If the risk of an impact from the rear is detected, the rear hazard warning lights are activated to alert the driver of the vehicle behind (not on vehicles with USA/Canada coding). Anticipatory occupant protection measures, such as the reversible belt tensioners, are deployed. If the vehicle is stopped and the driver indicates a wish to remain stationary by depressing the brake pedal, activating the hold function, or moving the selector lever to "P", the system increases the brake pressure to keep the vehicle firmly braked during a possible rear-end collision. Pre-Safe Impulse works in an early phase of the crash: before the resulting deceleration starts to increase, the front occupants are pulled away from the direction of impact and deeper into their seats by their seat belts. By the time the accident enters the phase when loads peak, the extra retraction distance can be used to dissipate energy in a controlled fashion. Pre-acceleration and force limitation allow the occupants to be temporarily isolated from the effects of the crash, significantly reducing the risk and severity of injuries in a frontal collision.
https://en.wikipedia.org/wiki?curid=18012843
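The warn/assist/autonomous-brake escalation described above amounts to a small decision procedure. The sketch below is an illustrative reconstruction of that logic only; the function name, inputs, and values are invented, and it is in no way Mercedes' actual implementation:

```python
# Illustrative reconstruction of the escalation logic described above.
# All names, inputs, and numbers are invented for clarity; this is not
# Mercedes' actual Pre-Safe implementation.

def presafe_step(hazard: bool, driver_braking: bool,
                 driver_decel: float, required_decel: float) -> str:
    if not hazard:
        return "no action"
    if driver_braking:
        # Driver reacted: boost braking power up to what the situation requires.
        return f"warn + brake assist ({max(driver_decel, required_decel):.1f} m/s^2)"
    # Driver failed to react: trigger autonomous braking.
    return f"warn + autonomous braking ({required_decel:.1f} m/s^2)"

print(presafe_step(True, True, 4.0, 7.5))   # braking driver gets boosted
print(presafe_step(True, False, 0.0, 7.5))  # no reaction -> autonomous brake
```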
These rules have included the banning of such ideas as the "wing car" (ground effect) in 1983 (reintroduced in 2022); the turbocharger in 1989 (reintroduced for 2014); active suspension and ABS in 1994; slick tyres in 1998 (reintroduced for 2009); smaller front and rear wings and a reduction in engine capacity from 3.5 to 3.0 litres in 1995; reducing the width of the cars from over 2 metres to around 1.8 metres in 1998; a further reduction in engine capacity from 3.0 to 2.4 litres in 2006; and launch control and traction control in 1994, and again in 2004 and 2008, alongside engine braking, after electronic driver aids were reintroduced in 2001. Yet despite these changes, constructors continued to extract performance gains by increasing power and aerodynamic efficiency. As a result, pole position times at many circuits in comparable weather conditions dropped by between 1.5 and 3 seconds in 2004 over the prior year's times. The aerodynamic restrictions introduced in 2005 were meant to reduce downforce by about 30%; however, most teams were able to limit this to a mere 5 to 10% downforce loss. In 2006, engine power was reduced by shifting from the 3.0 L V10s, used for a decade, to 2.4 L V8s. Some of these new engines were capable of achieving 20,000 rpm during 2006, though for the 2007 season engine development was frozen and the FIA limited all engines to 19,000 rpm to increase reliability and control at increasing engine speeds.
https://en.wikipedia.org/wiki?curid=645083
The original XP-39 was built with a V-1710 augmented by a General Electric Type B-5 turbo-supercharger, as specified by Fighter Projects Officer Lieutenant Benjamin S. Kelsey and his colleague Gordon P. Saville. Numerous changes were made to the design during a period when Kelsey's attention was focused elsewhere, and Bell engineers, NACA aerodynamic specialists, and the substitute fighter project officer determined that dropping the turbocharger would be among the drag-reduction measures indicated by borderline wind tunnel test results; an unnecessary step, according to aviation engineer and historian Warren M. Bodie. The production P-39 was thus stuck with poor high-altitude performance and proved unsuitable for the air war in Western Europe, which was largely conducted at high altitudes. The P-39 was rejected by the British, but used by the U.S. in the Mediterranean and the early Pacific air war, as well as shipped to the Soviet Union in large numbers under the Lend-Lease program. The Soviets were able to make good use of the P-39 because of its excellent manoeuvrability and because the air war on the Eastern Front in Europe was primarily short-ranged, tactical, and conducted at lower altitudes. In the P-39, Soviet pilots scored the highest number of individual kills made on "any" American or British fighter type. The P-40, which also had only the single-stage, single-speed-supercharged V-1710, had similar problems with high-altitude performance.
https://en.wikipedia.org/wiki?curid=76309
The extensive usage of machine learning in various fields has led to a wide range of learning algorithms being applied. Which machine learning algorithm to apply to a given earth science problem is of much interest to researchers, as choosing the optimal algorithm for a specific purpose can lead to a significant boost in accuracy. For example, the lithological mapping of gold-bearing granite-greenstone rocks in Hutti, India with AVIRIS-NG hyperspectral data shows more than a 10% difference in overall accuracy between using a Support Vector Machine (SVM) and a random forest. Some algorithms can also reveal important information about the model itself. 'White-box' models are transparent models in which the results and methodologies can be easily explained, while 'black-box' models are the opposite. For example, although the support-vector machine (SVM) yielded the best accuracy in a landslide susceptibility assessment, the result cannot be rewritten in the form of expert rules that explain how and why an area was classified as a specific class. In contrast, a decision tree is a transparent model that can be understood easily, and the user can observe and fix any bias present in the model. If computational power is a concern, a more computationally demanding learning method such as an artificial neural network is less preferred, despite the fact that it may slightly outperform other algorithms, as in soil classification.
https://en.wikipedia.org/wiki?curid=68735447
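A minimal sketch of the kind of algorithm comparison described above, using scikit-learn; the synthetic features stand in for hyperspectral bands, and all sizes and hyperparameters are placeholders, not values from the Hutti study:

```python
# Compare SVM and random forest overall accuracy on synthetic data that
# stands in for hyperspectral bands. All parameters are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=30, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("SVM", SVC(kernel="rbf", C=10.0)),
                    ("Random forest", RandomForestClassifier(n_estimators=200))]:
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: overall accuracy {acc:.1%}")
```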
In a sense somewhat unrelated to its use in any biological discipline, the term "epigenetic" has also been used in developmental psychology to describe psychological development as the result of an ongoing, bi-directional interchange between heredity and the environment. Interactive ideas of development have been discussed in various forms and under various names throughout the 19th and 20th centuries. An early version was proposed, among the founding statements in embryology, by Karl Ernst von Baer and popularized by Ernst Haeckel. A radical epigenetic view, known as physiological epigenesis, was developed by Paul Wintrebert. Another variation, probabilistic epigenesis, was presented by Gilbert Gottlieb in 2003. This view encompasses all of the possible factors acting on a developing organism and how they not only influence the organism and each other but also how the organism influences its own development. Gottlieb gave the example of rhesus monkeys: infants that did not receive typical maternal care lacked serotonin, which in turn made them more aggressive as they got older. On another note, the long-standing notion "cells that fire together, wire together" derives from Hebbian theory, which asserts that synaptogenesis, a developmental process with great epigenetic precedence, depends on the activity of the respective synapses within a neural network. Where experience alters the excitability of neurons, increased neural activity has been linked to increased demethylation.
https://en.wikipedia.org/wiki?curid=49033
Vico began the third edition with a detailed close reading of a frontispiece portrait, examining the place of Gentile nations within the providential guidance of the Hebrew God. This portrait contains a number of images that are symbolically ascribed to the flow of human history. A triangle with the Eye of Providence appears in the top left. A beam of light from the eye shines upon a brooch attached to the breastplate of "the lady with the winged temples who surmounts the celestial globe or world of nature" (center right), which represents metaphysics. The beam reflects off the brooch onto the back of a robed figure standing upon a pedestal (bottom left), representing the poet Homer. All around these main figures reside a variety of objects that represent the stages of human history, which Vico categorizes into three epochs: the age of the gods, "in which the gentiles believed they lived under divine governments, and everything was commanded them by auspices and oracles, which are the oldest institutions in profane history"; the age of the heroes, "in which they reigned everywhere in aristocratic commonwealths, on account of a certain superiority of nature which they held themselves to have over the plebs (or peasants)"; and the age of men, "in which all men recognized themselves as equal in human nature, and therefore there were established first the popular commonwealths and then the monarchies, both of which are forms of human government." By viewing these principles as universal phenomena that combined nature and government with language and philology, Vico could insert the history of the Gentile nations into the supreme guidance of divine providence. According to Vico, the proper end for government is reached when society enters a state of universal equity: "The last type of jurisprudence was that of natural equity, which reigns naturally in the free commonwealths, in which the people, each for his own particular good (without understanding that it is the same for all), are led to command universal laws. They naturally desire these laws to bend benignly to the least details of matters calling for equal unity."
https://en.wikipedia.org/wiki?curid=28999214
As a physician, Malpighi's medical consultations with his patients, most of whom belonged to the social elite, proved useful in better understanding the links between human anatomy, disease pathology, and the treatments for those diseases. Furthermore, Malpighi conducted his consultations not only at the bedside but also by post, using letters to request and conduct them for various patients. These letters served as social connections for the medical practices he performed, allowing his ideas to reach the public even in the face of criticism. The connections that Malpighi created in his practice became even more widespread because he practiced in various countries. However, long distances complicated consultations for some of his patients. The manner in which Malpighi practiced medicine also reveals that it was customary in his time for Italian patients to have multiple attending physicians as well as consulting physicians. One of Malpighi's principles of medical practice was that he did not rely on anecdotes or experiences concerning remedies for various illnesses. Rather, he used his knowledge of human anatomy and disease pathology to practice what he called "rational" medicine (in contrast to "empirics"). Malpighi did not abandon traditional substances or treatments, but he did not employ them simply on the basis of past experiences that did not draw from the nature of the underlying anatomy and disease process. Specifically, in his treatments Malpighi's goal was to reset fluid imbalances by coaxing the body to correct them on its own. For example, fluid imbalances should be fixed over time by urination and not by artificial methods such as purgatives and vesicants. In addition to his "rational" approaches, Malpighi also believed in so-called "miraculous" or "supernatural" healing. For this to occur, though, he argued that the body could not have attempted to expel any malignant matter, such as vomit. Cases in which this did occur, and in which healing could therefore not be considered miraculous, were known as "crises."
https://en.wikipedia.org/wiki?curid=20428
Combined, Hsp90A and Hsp90B are predicted to interact with 10% of the eukaryotic proteome. In humans this represents a network of roughly 2,000 interacting proteins. Presently, over 725 interactions have been experimentally documented for Hsp90A and Hsp90B together. This connectivity allows Hsp90 to function as a network hub linking diverse protein interaction networks. Within these networks, Hsp90 primarily specializes in maintaining and regulating proteins involved in signal transduction or information processing. These include transcription factors that initiate gene expression, kinases that transmit information by post-translationally modifying other proteins, and E3 ligases that target proteins for degradation via the proteasome. Indeed, a recent study utilizing the LUMIER method has shown that human Hsp90B interacts with 7% of all transcription factors, 60% of all kinases, and 30% of all E3 ligases. Other studies have shown that Hsp90 interacts with various structural proteins, ribosomal components, and metabolic enzymes. Hsp90 has also been found to interact with a large number of viral proteins, including those from HIV and Ebola virus, not to mention the numerous co-chaperones that modulate and direct Hsp90 activity. Few studies have focused on discerning the unique protein interactions of Hsp90A versus Hsp90B. Work done in Xenopus eggs and yeast has shown that Hsp90A and Hsp90B differ in their co-chaperone and client interactions. However, little is understood concerning the unique functions delegated to each human paralog. The Picard lab has aggregated all available Hsp90 interaction data into the Hsp90Int.DB website. Gene ontology analysis of both the Hsp90A and Hsp90B interactomes indicates that each paralog is associated with unique biological processes, molecular functions, and cellular components.
https://en.wikipedia.org/wiki?curid=14087410
These examples of applications took place during the 2018 Advanced Naval Technology Exercise, held in August at the Naval Undersea Warfare Center Division Newport. The first example of unmanned underwater vehicles was displayed by Northrop Grumman, with air-dropped sonobuoys deployed from a Fire Scout aircraft. Throughout the demonstration the company used the Iver3-580 (a Northrop Grumman AUV) to display the vehicle's ability to sweep for mines, while also showcasing its real-time automated target recognition system. Another company, Huntington Ingalls Industries, presented its unmanned underwater vehicle, named Proteus. The Proteus is a dual-mode undersea vehicle developed by Huntington and Battelle; during the presentation the company displayed its unmanned underwater vehicle capabilities by conducting a full-kill demonstration of seabed warfare. During the demonstration the vehicle utilized a synthetic aperture sonar attached to both the port and starboard sides of the craft, which allowed the unmanned underwater vehicle to identify the targets placed underwater and ultimately eliminate them. Ross Lindman (director of operations at the company's technical solutions fleet support group) stated: "The big significance of this is that we ran the full kill chain. We ran a shortened version of an actual mission. We didn't say, 'Well we're doing this part and you have to imagine this or that.' We ran the whole thing to illustrate a capability that can be used in the near term." The final demonstration of unmanned underwater vehicles was given by General Dynamics, which showcased its cross-domain multi-platform UUV through a theater-level warfare planning and simulation tool. The simulation showed a littoral combat ship operating along with two unmanned underwater vehicles; the goal of this exercise was to demonstrate the communication speed between the operator and the UUV. James Langevin, D-R.I., ranking member on the House Armed Services Committee's subcommittee on emerging threats, stated in regard to this exercise: "What this is all driving to is for the warfare commander to be able to make the decisions that are based on what he thinks is high-confidence input quicker than his adversary can," he said. "That's the goal — we want to be able to … let them make warfare-related decisions quicker than anybody else out there." These exercises were conducted to showcase the applications of unmanned underwater vehicles within the military community, along with the innovations each company created to better suit these specific mission types.
https://en.wikipedia.org/wiki?curid=4778319
LOFAR consists of a vast array of omnidirectional radio antennas using a modern concept in which the signals from the separate antennas are not connected directly by electrical means to act as a single large antenna, as they are in most array antennas. Instead, the LOFAR dipole antennas (of two types) are distributed in stations, within which the antenna signals can be partly combined in analogue electronics, then digitised, then combined again across the full station. This step-wise approach provides great flexibility in setting, and rapidly changing, the directional sensitivity of an antenna station on the sky. The data from all stations are then transported over fiber to a central digital processor and combined in software to emulate a conventional radio telescope dish with a resolving power corresponding to the greatest distance between the antenna stations across Europe. LOFAR is thus an interferometric array, using about 20,000 small antennas concentrated in 52 stations as of 2019. Of these, 38 stations are distributed across the Netherlands, built with regional and national funding; six stations are in Germany, three in Poland, and one each in France, Great Britain, Ireland, Latvia, and Sweden, with various national, regional, and local funding and ownership. Italy officially joined the International LOFAR Telescope (ILT) in 2018; construction at the INAF observatory site in Medicina, near Bologna, is planned as soon as upgraded (so-called LOFAR2.0) hardware becomes available. Further stations in other European countries are in various stages of planning. The total effective collecting area is approximately 300,000 square meters, depending on frequency and antenna configuration. Until 2014, data processing was performed by a Blue Gene/P supercomputer situated in the Netherlands at the University of Groningen. Since 2014, LOFAR has used a GPU-based correlator and beamformer, COBALT, for that task. LOFAR is also a technology and science pathfinder for the Square Kilometre Array.
https://en.wikipedia.org/wiki?curid=1524766
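The resolving power mentioned above follows the standard interferometer relation θ ≈ λ/B, where λ is the observing wavelength and B the longest baseline. The numbers in the sketch below are illustrative choices, not official LOFAR specifications:

```python
import math

# Standard interferometer resolution: theta ~ lambda / B (radians).
# The frequency and baseline below are illustrative values only.
C = 299_792_458.0  # speed of light, m/s

def resolution_arcsec(freq_hz: float, baseline_m: float) -> float:
    wavelength = C / freq_hz
    return math.degrees(wavelength / baseline_m) * 3600

# e.g. a 150 MHz observation on a ~1000 km international baseline
print(f"{resolution_arcsec(150e6, 1.0e6):.2f} arcsec")  # ~0.41 arcsec
```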
Emotional prosody was considered by Charles Darwin in "The Descent of Man" to predate the evolution of human language: "Even monkeys express strong feelings in different tones – anger and impatience by low, – fear and pain by high notes." Native speakers listening to actors reading emotionally neutral text while projecting emotions correctly recognized happiness 62% of the time, anger 95%, surprise 91%, sadness 81%, and neutral tone 76%. When a database of this speech was processed by computer, segmental features allowed better than 90% recognition of happiness and anger, while suprasegmental prosodic features allowed only 44%–49% recognition. The reverse was true for surprise, which was recognized only 69% of the time by segmental features and 96% of the time by suprasegmental prosody. In typical conversation (no actor voice involved), the recognition of emotion may be quite low, of the order of 50%, hampering the complex interrelationship function of speech advocated by some authors. However, even if emotional expression through prosody cannot always be consciously recognized, tone of voice may continue to have subconscious effects in conversation. This sort of expression does not stem from linguistic or semantic effects, and can thus be isolated from traditional linguistic content. The aptitude of the average person to decode the conversational implicature of emotional prosody has been found to be slightly less accurate than traditional facial expression discrimination ability; however, the specific ability to decode varies by emotion. These emotional cues appear to be ubiquitous, as they are utilized and understood across cultures. Various emotions differ in their general experimental identification rates.
https://en.wikipedia.org/wiki?curid=1411106
A substantial source of funding, mainly for more basic research, comes from bi-national funds. The leading one is the United States-Israel Binational Agricultural Research and Development Fund (BARD). BARD is a competitive funding program for mutually beneficial, mission-oriented, strategic, and applied research on agricultural problems, jointly conducted by American and Israeli scientists. Since 1979, BARD has funded over 870 research projects, with awards of about $9.5 million annually for new research projects. Most of these are of three years' duration, the average award being $300,000. Budgets are distributed about equally between the two countries. Proposals are evaluated both in the US by the Agricultural Research Service (ARS) and in Israel by the Scientific Evaluation Committees (SEC), based on expert reviewers from different countries. The recommendations from ARS and SEC are brought before a Technical Advisory Committee (TAC) for final recommendations to the Board of BARD. Among the research areas funded were alleviating heat stress in dairy cattle, breeding for heat-tolerant wheat varieties, improving wheat-seed proteins by molecular approaches, algal culture, and improving cut-flower quality, to name only a few where significant results were obtained (BARD, 20-year external review). The success of BARD led to the establishment of additional bi-national funds, such as the Joint Dutch-Israeli Agricultural Science and Technology Program and bi-national programs with Queensland (Australia) and Canada. The latter are all on a much lower funding level than BARD. In addition, funding is also obtained from the EU, the US-Israel Binational Science Foundation (BSF), and others.
https://en.wikipedia.org/wiki?curid=6570637
Several attempts have been made to make ethics computable, or at least formal. Whereas Isaac Asimov's Three Laws of Robotics are usually not considered to be suitable for an artificial moral agent, it has been studied whether Kant's categorical imperative can be used. However, it has been pointed out that human value is, in some aspects, very complex. A way to explicitly surmount this difficulty is to receive human values directly from humans through some mechanism, for example by learning them. Another approach is to base current ethical considerations on previous similar situations. This is called casuistry, and it could be implemented through research on the Internet: the consensus from a million past decisions would lead to a new decision that is democracy-dependent. Prof. Bruce M. McLaren built an early (mid-1990s) computational model of casuistry, specifically a program called SIROCCO, built with AI and case-based reasoning techniques, that retrieves and analyzes ethical dilemmas. This approach could, however, lead to decisions that reflect the biases and unethical behaviors exhibited in society. The negative effects of this approach can be seen in Microsoft's Tay (bot), a chatterbot that learned to repeat racist and sexually charged messages sent by Twitter users.
https://en.wikipedia.org/wiki?curid=32237314
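At its core, the casuistic approach described above is similarity-based retrieval over past cases. The sketch below illustrates that idea with a toy nearest-neighbor lookup; the feature encoding and cases are invented for illustration, and this is not a reconstruction of SIROCCO's actual method:

```python
# Toy illustration of casuistry as similarity-based retrieval over past
# cases. Features and cases are invented; this is not SIROCCO's method.
from collections import Counter

past_cases = [
    # (feature vector: [harm, consent, deception], recorded decision)
    ([0.9, 0.1, 0.8], "impermissible"),
    ([0.1, 0.9, 0.0], "permissible"),
    ([0.2, 0.8, 0.1], "permissible"),
]

def decide(new_case, k=3):
    """Return the majority decision of the k most similar past cases."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(past_cases, key=lambda c: dist(c[0], new_case))[:k]
    return Counter(d for _, d in nearest).most_common(1)[0][0]

print(decide([0.15, 0.85, 0.05]))  # -> "permissible"
```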
Phylogenetics and sequence alignment are closely related fields due to the shared necessity of evaluating sequence relatedness. The field of phylogenetics makes extensive use of sequence alignments in the construction and interpretation of phylogenetic trees, which are used to classify the evolutionary relationships between homologous genes represented in the genomes of divergent species. The degree to which sequences in a query set differ is qualitatively related to the sequences' evolutionary distance from one another. Roughly speaking, high sequence identity suggests that the sequences in question have a comparatively young most recent common ancestor, while low identity suggests that the divergence is more ancient. This approximation, which reflects the "molecular clock" hypothesis that a roughly constant rate of evolutionary change can be used to extrapolate the elapsed time since two genes first diverged (that is, the coalescence time), assumes that the effects of mutation and selection are constant across sequence lineages. Therefore, it does not account for possible difference among organisms or species in the rates of DNA repair or the possible functional conservation of specific regions in a sequence. (In the case of nucleotide sequences, the molecular clock hypothesis in its most basic form also discounts the difference in acceptance rates between silent mutations that do not alter the meaning of a given codon and other mutations that result in a different amino acid being incorporated into the protein). More statistically accurate methods allow the evolutionary rate on each branch of the phylogenetic tree to vary, thus producing better estimates of coalescence times for genes.
https://en.wikipedia.org/wiki?curid=149289
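As a concrete illustration of the "sequence identity" used above, the sketch below computes percent identity over the aligned columns of two already-aligned sequences; the alignment itself and the example sequences are assumed, not taken from any real dataset:

```python
# Percent identity between two sequences that are already aligned
# (gaps shown as '-'). Example sequences are invented for illustration.

def percent_identity(a: str, b: str) -> float:
    assert len(a) == len(b), "sequences must be aligned to equal length"
    # Compare only columns where neither sequence has a gap.
    cols = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    matches = sum(x == y for x, y in cols)
    return 100.0 * matches / len(cols)

s1 = "ATGCC-GATTA"
s2 = "ATG-CTGCTTA"
print(f"{percent_identity(s1, s2):.1f}% identity")  # 88.9%
```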
The study of pathology, including the detailed examination of the body by dissection and inquiry into specific maladies, dates back to antiquity. A rudimentary understanding of many conditions was present in most early societies and is attested in the records of the earliest historical societies, including those of the Middle East, India, and China. By the Hellenic period of ancient Greece, a concerted causal study of disease was underway (see Medicine in ancient Greece), with many notable early physicians (such as Hippocrates, for whom the modern Hippocratic Oath is named) having developed methods of diagnosis and prognosis for a number of diseases. The medical practices of the Romans and the Byzantines continued from these Greek roots, but, as with many areas of scientific inquiry, growth in the understanding of medicine stagnated somewhat after the Classical Era, though it continued to develop slowly across numerous cultures. Notably, many advances were made in the medieval era of Islam (see Medicine in medieval Islam), during which numerous texts on complex pathologies were produced, also building on the Greek tradition. Even so, growth in the complex understanding of disease mostly languished until knowledge and experimentation again began to proliferate in the Renaissance, Enlightenment, and Baroque eras, following the resurgence of the empirical method at new centers of scholarship. By the 17th century, the study of rudimentary microscopy was underway, and examination of tissues had led British Royal Society member Robert Hooke to coin the word "cell", setting the stage for later germ theory.
https://en.wikipedia.org/wiki?curid=48791
329,725
2,146,306
In addition to her research, Warner was involved in many scientific organisations, often in a leadership role. She was a member of NERC, the Marine Biological Association of the United Kingdom, the Lister Institute of Preventive Medicine, the Roslin Institute, the editorial board of "The Journal of Physiology", the Committee of The Physiological Society, and many Medical Research Council boards and policy committees. In 1976, Warner returned to her alma mater, University College London, when she was appointed as a lecturer at the Royal Free Hospital School of Medicine. Throughout her years at the university, Warner held several positions, including Reader in the Department of Anatomy and Developmental Biology and Royal Society Foulerton Professor, an honour she received in 1986. In addition, Warner was elected a Fellow of the Royal Society in 1985. Of all her organisational and leadership roles, she is perhaps best known as Vice-President of the Marine Biological Association (MBA) council and Director of CoMPLEX (Centre of Mathematics, Physics, and Life Sciences) at University College London. Through her role in the MBA, she is partly responsible for the organisation's survival and legacy to this day. Among the many programs she initiated there, she founded the cell physiology workshop in 1984, which trained cohorts of cell physiologists across the world. As director of UCL's CoMPLEX in its infancy, Warner was a co-founder of the organisation and fostered its development during her many years as its leader, bringing together a variety of scientists to work towards the common goal of advancing the field of biology. The organisation became an example and model for similar organisations in other countries. Her work with these organisations created a lasting legacy through her many programs that are still in use today.
https://en.wikipedia.org/wiki?curid=18517854
2,145,075
1,750,915
The 'age' of a reconstructed sequence is determined using a molecular clock model, and often several are employed. This dating technique is often calibrated using geological time-points (such as ancient ocean constituents or banded iron formations), and while these clocks offer the only method of inferring a very ancient protein's age, they have sweeping error margins and are difficult to defend against contrary data. To this end, an ASR 'age' should really be treated only as an indicative feature, and it is often bypassed altogether in favour of a measurement of the number of substitutions between the ancestral and the modern sequences (the fundament on which the clock is calculated). That said, the use of a clock allows one to compare the observed biophysical data of an ASR protein to the geological or ecological environment at the time. For example, ASR studies on bacterial EF-Tus (proteins involved in translation that are rarely subject to horizontal gene transfer and typically exhibit melting temperatures roughly 2 °C greater than the host's environmental temperature) indicate a hotter Precambrian Earth, which fits closely with geological data on ancient ocean temperatures based on oxygen-18 isotope levels. ASR studies of yeast Adhs reveal that the emergence of subfunctionalized Adhs for ethanol metabolism (not just waste excretion) arose at a time similar to the dawn of fleshy fruit in the Cretaceous Period, and that before this emergence, Adh served to excrete ethanol as a byproduct of excess pyruvate. The use of a clock also perhaps indicates that the origin of life occurred earlier than the oldest molecular fossils indicate (>4.1 Ga), but given the debatable reliability of molecular clocks, such observations should be taken with caution.
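A toy illustration of those sweeping error margins, with every number invented: the same substitution count maps to ages differing severalfold across plausible rate bounds.

    # How clock-rate uncertainty propagates into an ASR age estimate.
    n_subs, n_sites = 120, 300          # substitutions inferred along the branch
    per_site = n_subs / n_sites         # 0.4 substitutions per site
    for rate in (0.5e-9, 1e-9, 2e-9):   # assumed rate bounds, subs/site/year
        age_ga = per_site / rate / 1e9
        print(f"rate {rate:.1e} /site/yr -> age {age_ga:.1f} Ga")
    # A factor-of-four spread in assumed rate gives 0.2-0.8 Ga for the same data.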
https://en.wikipedia.org/wiki?curid=39867144
1,749,929
270,942
Titchener responded in "Philosophical Review" (1898, 1899) by distinguishing his austere "structural" approach to psychology from what he termed the Chicago group's more applied "functional" approach, and thus began the first major theoretical rift in American psychology between Structuralism and Functionalism. The group at Columbia, led by James McKeen Cattell, Edward L. Thorndike, and Robert S. Woodworth, was often regarded as a second (after Chicago) "school" of American Functionalism (see, e.g., Heidbreder, 1933), although they never used that term themselves, because their research focused on the applied areas of mental testing, learning, and education. Dewey was elected president of the APA in 1899, while Titchener dropped his membership in the association. (In 1904, Titchener formed his own group, eventually known as the Society of Experimental Psychologists.) Jastrow promoted the functionalist approach in his APA presidential address of 1900, and Angell adopted Titchener's label explicitly in his influential textbook of 1904 and his APA presidential address of 1906. In reality, Structuralism was, more or less, confined to Titchener and his students. (It was Titchener's former student E. G. Boring, writing "A History of Experimental Psychology" [1929/1950, the most influential textbook of the 20th century about the discipline], who launched the common idea that the structuralism/functionalism debate was the primary fault line in American psychology at the turn of the 20th century.) Functionalism, broadly speaking, with its more practical emphasis on action and application, better suited the American cultural "style" and, perhaps more important, was more appealing to pragmatic university trustees and private funding agencies.
https://en.wikipedia.org/wiki?curid=1573230
270,794
1,971,087
Horton described the book as persuasive and credited LeVay, along with other researchers, with helping make a strong but not definitive case that biological influences play an important or even decisive role in "determining sexual preference among males", and with taking a "broad philosophical perspective" in his discussion of human sexuality by placing his research in the context of animal evolution. He wrote that LeVay supported his contentious view that there are separate centers in the hypothalamus responsible for generating "male-typical and female-typical sexual behavior and feelings" with a wide range of sources, notably evidence concerning women with congenital adrenal hyperplasia. Though noting that LeVay acknowledged the limitations of his research, he criticized LeVay for having an unsubtle view of the meaning of "biological influence" on sexual orientation that ignores the question of how genes produce an "unpredictable interplay of behavioral impulses", and for engaging in "overstretched speculations" about "why a gene for homosexuality still exists when it apparently has little apparent survival value in evolutionary terms." He concluded that while LeVay's work "presents technical and conceptual difficulties" and his "preliminary findings obviously need replication or refutation", it "represents a genuine epistemological break away from the past's rigid and withered conceptions of sexual preference."
https://en.wikipedia.org/wiki?curid=41143724
1,969,953
1,488,103
Many argue that the next human metasystem transition consists of a merger of biological metasystems with technological metasystems, especially information processing technology. Several cumulative major transitions of evolution have transformed life through key innovations in information storage and replication, including RNA, DNA, multicellularity, and also language and culture as inter-human information processing systems. In this sense it can be argued that the carbon-based biosphere has generated a cognitive system (humans) capable of creating technology that will result in a comparable evolutionary transition. "Digital information has reached a similar magnitude to information in the biosphere... Like previous evolutionary transitions, the potential symbiosis between biological and digital information will reach a critical point where these codes could compete via natural selection. Alternatively, this fusion could create a higher-level superorganism employing a low-conflict division of labor in performing informational tasks... humans already embrace fusions of biology and technology. We spend most of our waking time communicating through digitally mediated channels, ...most transactions on the stock market are executed by automated trading algorithms, and our electric grids are in the hands of artificial intelligence. With one in three marriages in America beginning online, digital algorithms are also taking a role in human pair bonding and reproduction".
https://en.wikipedia.org/wiki?curid=2466610
1,487,264
583,937
The general principle of seismic reflection is to send elastic waves (using an energy source such as a dynamite explosion or Vibroseis) into the Earth, where each layer within the Earth reflects a portion of the wave's energy back and allows the rest to refract through. These reflected energy waves are recorded over a predetermined time period (called the record length) by receivers that detect the motion of the ground in which they are placed. On land, the typical receiver used is a small, portable instrument known as a geophone, which converts ground motion into an analogue electrical signal. In water, hydrophones are used, which convert pressure changes into electrical signals. Each receiver's response to a single shot is known as a “trace” and is recorded onto a data storage device; the shot location is then moved along and the process is repeated. Typically, the recorded signals are subjected to significant amounts of signal processing before they are ready to be interpreted, and this is an area of significant active research within industry and academia. In general, the more complex the geology of the area under study, the more sophisticated the techniques required to remove noise and increase resolution. Modern seismic reflection surveys contain large amounts of data and so require heavy computer processing, often performed on supercomputers or computer clusters.
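As a minimal illustration of what a recorded trace looks like in the standard convolutional toy model (not a description of any particular survey), one can convolve an assumed reflectivity series with a Ricker wavelet; all parameters below are invented.

    # Synthetic seismic trace: reflectivity series convolved with a wavelet.
    import numpy as np

    def ricker(f, dt, n):
        # Ricker wavelet of peak frequency f (Hz), n samples at spacing dt (s).
        t = (np.arange(n) - n // 2) * dt
        a = (np.pi * f * t) ** 2
        return (1.0 - 2.0 * a) * np.exp(-a)

    dt = 0.002                               # 2 ms sampling interval
    reflectivity = np.zeros(500)             # a 1 s record length
    reflectivity[[100, 230, 400]] = [0.3, -0.2, 0.1]   # assumed reflectors
    trace = np.convolve(reflectivity, ricker(25.0, dt, 101), mode="same")
    print(trace.shape)                       # one synthetic "trace", ready to plot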
https://en.wikipedia.org/wiki?curid=676418
583,638
1,114,662
For launches from the Eastern Range, which includes Kennedy Space Center and Cape Canaveral Space Force Station, the Mission Flight Control Officer (MFCO) is responsible for ensuring public safety from the vehicle during its flight up to orbital insertion, or, in the event that the launch is of a ballistic type, until all pieces have fallen safely to Earth. Despite a common misconception, the MFCO is not part of the Safety Office, but is instead part of the Operations group of the Range Squadron of the Space Launch Delta 45 of the Space Force, and is considered a direct representative of the Delta Commander. The MFCO is guided in making destruct decisions by as many as three different types of computer display graphics, generated by the Flight Analysis section of Range Safety. One of the primary displays for most vehicles is a vacuum impact point display in which drag, vehicle turns, wind, and explosion parameters are built into the corresponding graphics. Another includes a vertical plane display with the vehicle's trajectory projected onto two planes. For the Space Shuttle, the primary display an MFCO used was a continuous real-time footprint, a moving simple closed curve indicating where most of the debris would fall if the MFCO were to destroy the Shuttle at that moment. This real-time footprint was developed in response to the Space Shuttle Challenger disaster in 1986, when stray solid rocket boosters unexpectedly broke off from the destroyed core vehicle and began traveling uprange, toward land.
https://en.wikipedia.org/wiki?curid=16537446
1,114,094
370,900
The complex composition and improper handling of e-waste adversely affect human health. A growing body of epidemiological and clinical evidence has led to increased concern about the potential threat of e-waste to human health, especially in developing countries such as India and China. For instance, in terms of health hazards, open burning of printed wiring boards increases the concentration of dioxins in the surrounding areas. These toxins cause an increased risk of cancer if inhaled by workers and local residents. Toxic metals and poisons can also enter the bloodstream during the manual extraction and collection of tiny quantities of precious metals, and workers are continuously exposed to poisonous chemicals and fumes of highly concentrated acids. Recovering resalable copper by burning insulated wires causes neurological disorders, and acute exposure to cadmium, found in semiconductors and chip resistors, can damage the kidneys and liver and cause bone loss. Long-term exposure to lead on printed circuit boards and computer and television screens can damage the central and peripheral nervous system and kidneys, and children are more susceptible to these harmful effects.
https://en.wikipedia.org/wiki?curid=3887690
370,706
705,480
By 1974, a team of Rothamsted Research scientists had discovered three pyrethroids (MoA 3a), suitable for use in agriculture, namely permethrin, cypermethrin and deltamethrin. These compounds were subsequently licensed by the NRDC, as NRDC 143, 149 and 161 respectively, to companies which could then develop them for sale in defined territories. Imperial Chemical Industries (ICI) obtained licenses to permethrin and cypermethrin but their agreement with the NRDC did not allow worldwide sales. Also, it was clear to ICI's own researchers at Jealott's Hill that future competition in the marketplace might be difficult owing to the greater potency of deltamethrin compared to the other compounds. For that reason, in the period 1974–1977, chemists there sought patentable analogues which might have advantages compared to the Rothamsted insecticides by having wider spectrum or greater cost-benefit. The first breakthrough was made when a trifluoromethyl group was used to replace one of the chlorines in cypermethrin, especially when the double bond was in its Z form. The resulting material was found to be more potent than cypermethrin, to which it is most closely related, but also with good activity against the spider mite "Tetranychus urticae", which added to its attractiveness as a potential new product. The second breakthrough occurred when ICI process chemists developed a practical manufacturing process for the Z-cis acid, by controlling the stereochemistry of the cyclopropane ring in addition to that of the double bond. This led to the initial commercialisation of cyhalothrin, under the trade name Grenade, but the resulting material was still a mixture of four isomers owing to the racemic nature of the Z-cis acid and because the alpha-cyano group was a 1:1 mixture of possible R and S configurations.
https://en.wikipedia.org/wiki?curid=13569091
705,111
1,822,404
Throughout its continued development, the Fido explosives detector underwent numerous modifications by the U.S. military so that it could be mounted on various types of platforms to detect traces of explosive vapors in dangerous environments and hard-to-reach areas. One prominent example was the plan to integrate the Fido explosives detector on robotic platforms in order to remotely detect improvised explosive devices (IEDs). Initially proposed by the then-Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASAALT), the project was spearheaded by the Joint IED Defeat Task Force (JIEDDTF) as part of a 90-day delivery schedule that promised to produce ten integrated systems for soldiers in combat. After much deliberation, iRobot's PackBot was selected as the robotic platform for the Fido explosives detector. However, due to cost and time restrictions, only half of the proposed ten prototype units were ultimately produced, tested, and fielded in Afghanistan and Iraq. While the fielded prototypes encountered technical problems that hindered performance, repairs by teams of scientists from different Army labs resolved most of the issues that arose. Other efforts included the development of Neural Robotics, Inc.'s AutoCopter, which had the detector mounted on a small, unmanned helicopter platform, as well as the integration of the detection system into the Foster-Miller TALON and the U.S. Marine Corps' Dragon Runner.
https://en.wikipedia.org/wiki?curid=10604784
1,821,366
631,630
In powder-fed directed-energy deposition, a high-power laser is used to melt metal powder supplied to the focus of the laser beam. The laser beam typically travels through the center of the deposition head and is focused to a small spot by one or more lenses. The build occurs on an X-Y table which is driven by a tool path created from a digital model to fabricate an object layer by layer. The deposition head is moved up vertically as each layer is completed. Some systems even make use of 5-axis or 6-axis systems ("i.e." articulated arms) capable of delivering material onto the substrate (a printing bed, or a pre-existing part) with few to no spatial access restrictions. Metal powder is delivered and distributed around the circumference of the head, or can be split by an internal manifold and delivered through nozzles arranged in various configurations around the deposition head. A hermetically sealed chamber filled with inert gas, or a local inert shroud gas (sometimes both combined), is often used to shield the melt pool from atmospheric oxygen, to limit oxidation and better control the material properties. The powder-fed directed-energy process is similar to selective laser sintering, but the metal powder is projected only where material is being added to the part at that moment. The laser beam is used to heat up and create a "melt pool" on the substrate, into which the new powder is injected quasi-simultaneously. The process supports a wide range of materials including titanium, stainless steel, aluminum, tungsten, and other specialty materials, as well as composites and functionally graded material. The process can not only fully build new metal parts but can also add material to existing parts, for example for coatings, repair, and hybrid manufacturing applications. LENS (Laser Engineered Net Shaping), which was developed by Sandia National Labs, is one example of the powder-fed directed-energy deposition process for 3D printing or restoring metal parts.
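To make the layer-by-layer tool-path idea concrete, here is a deliberately simplified sketch that emits motion targets for a square perimeter and raises the head after each completed layer; the command format and all dimensions are hypothetical, not any vendor's G-code dialect.

    # Toy tool-path generator for layer-by-layer deposition.
    def square_layers(side, layer_height, n_layers):
        # Yield (x, y, z) targets tracing a square perimeter per layer,
        # raising the deposition head (z) after each layer is finished.
        corners = [(0.0, 0.0), (side, 0.0), (side, side), (0.0, side), (0.0, 0.0)]
        for k in range(n_layers):
            z = k * layer_height
            for x, y in corners:
                yield x, y, z

    for x, y, z in square_layers(side=20.0, layer_height=0.5, n_layers=3):
        print(f"MOVE X{x:.1f} Y{y:.1f} Z{z:.2f}")   # made-up command format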
https://en.wikipedia.org/wiki?curid=53292993
631,292
2,136,586
The Plasma Wave Instrument (PWI) measured AC electric fields over the frequency range from 1 Hz to 2 MHz, and an amplitude range of 0.03 microvolts per meter to 100 mV per meter. Magnetic fields were measured from 1 Hz to 400 kHz over an approximately 100 dB range. The objectives of this investigation were to measure the spatial, temporal, spectral, and wave characteristics (particularly the Poynting vector component along the magnetic field line) and the wave polarization for extremely low frequency (ELF), very low frequency (VLF), and high frequency (HF) noise phenomena. Of special interest were the auroral kilometric radiation and VLF hiss, and a variety of electrostatic waves that may cause field-aligned acceleration of particles. The investigation made use of the long dipole antennas in the spin plane and along the Z-axis, and a magnetic loop antenna. A single-axis search coil magnetometer and a short electric antenna were included for low-frequency measurements and electrostatic noise measurements at short wavelengths. The electronics consisted of: (1) a wideband/long-baseline receiver with a bandwidth of 10 or 40 kHz in the range 0–2 MHz; (2) a sweep-frequency correlator, containing two sweep-frequency receivers and phase detectors, sweeping 100 Hz to 400 kHz in 32 seconds and giving the phase between the magnetic and electric components of the field; (3) a low-frequency correlator containing two filter receivers and phase detectors (eight filters in the range 1.78–100 Hz were swept in 8 seconds); (4) DC monitors that measured the voltage difference between the two sets of long dipole antennas; and (5) a linear wideband receiver, selectable from 1.5 to 3.0, 3 to 6, or 10 to 16 kHz bands. The wideband receiver was flown to transmit wideband waveform signals to the ground via an analog transmitter, so that detailed high-resolution frequency-time analysis could be performed. Since 23 June 1984, a malfunction in the spacecraft data-handling system has prevented access to some PWI data. Digital measurements from the sweep-frequency receiver system were no longer accessible.
https://en.wikipedia.org/wiki?curid=69341287
2,135,358
1,746,229
These films are made by incorporating an additive within normal polymers to provide first an oxidative and then a biological mechanism to degrade them. This typically takes 6 months to 1 year in the environment with adequate exposure to oxygen. Degradation is a two-stage process: first the plastic is converted by reaction with oxygen (light, heat and/or stress accelerates the process but is not essential) to hydrophilic low-molecular-weight materials, and then these smaller oxidized molecules are biodegraded, i.e. converted into carbon dioxide, water and biomass by naturally occurring microorganisms. Commercial competitors and their trade associations allege that the process of biodegradation stops at a certain point, leaving fragments, but they have never established why or at what point. In fact, oxo-biodegradation of polymer material has been studied in depth at the Technical Research Institute of Sweden and the Swedish University of Agricultural Sciences. A peer-reviewed report of the work was published in Vol. 96 of the journal Polymer Degradation & Stability (2011), pages 919–928. It shows 91% biodegradation in a soil environment within 24 months, when tested in accordance with ISO 17556.
https://en.wikipedia.org/wiki?curid=5652850
1,745,243
1,452,505
The U.S. National Institute for Occupational Safety and Health developed a Technical Report: Occupational Exposure Sampling for Engineered Nanomaterials which contains guidance for workplace sampling for three engineered nanomaterials: carbon nanotubes and nanofibers, silver, and titanium dioxide, each of which have an elemental mass-based NIOSH Recommended Exposure Limit (REL). In addition, NIOSH developed a practical approach to exposure sampling for other engineered nanomaterials that do not have exposure limits employing the Nanomaterial Exposure Assessment Technique (NEAT) 2.0, a sampling strategy that can be used to determine exposure potential for engineered nanoparticles. The NEAT 2.0 approach uses filter samples both in the worker's personal breathing zone and as area samples. Separate filter samples are used for elemental analysis, and to gather morphologic data from electron microscopy. The latter can provide an order of magnitude evaluation of the contribution of the nanoparticle of interest to the elemental mass load, as well as a qualitative assessment of the particle size, degree of agglomeration, and whether the nanoparticle is free or contained within a matrix. Hazard identification and characterization can then be performed based on a holistic assessment of the integrated filter samples. In addition, field-portable direct reading instruments can be used for continuous recording of normal fluctuations in particle count, size distribution, and mass. By documenting the workers' activities, data-logged results can then be used to identify workplace tasks or practices that contribute to any increase or spikes in the counts. The data need to be carefully interpreted, as direct reading instruments will identify the real-time quantity of all nanoparticles including any incidental background particles such as may occur from motor exhaust, pump exhaust, heating vessels, and other sources. Evaluation of worker practices, ventilation efficacy, and other engineering exposure control systems and risk management strategies serve to allow for a comprehensive exposure assessment.
https://en.wikipedia.org/wiki?curid=54992011
1,451,688
433,956
In the course of the generations, CPUs added more instructions and increased performance. The first three generations (G1 to G3) focused on low cost. The 4th generation was aimed at matching the performance of the last bipolar model, the 9021-9X2; this was to be accomplished by pursuing high clock frequencies. The G4 could reach a 70% higher frequency than the G3 at silicon process parity, but it suffered a 23% IPC reduction compared to the G3. The initial G4-based models became available in June 1997, but it wasn't until the 370 MHz model RY5 (with a "Modular Cooling Unit") became available at the end of the year that a 9672 would almost match the 141 MHz model 9X2's performance. At 370 MHz it was the second-highest-clocked microprocessor at the time, after DEC's Alpha 21164. The execution units in each G4 processor are duplicated for the purpose of error detection and correction. Arriving in late September 1998, the G5 more than doubled the performance over any previous IBM mainframe and restored IBM's performance lead that had been lost to Hitachi's Skyline mainframes in 1995. The G5 operated at up to 500 MHz, again second only to the DEC Alphas into early 1999. The G5 also added support for the IEEE 754 floating-point formats. The thousandth G5 system shipped less than 100 days after manufacturing began, the fastest production ramp in S/390's history. In late May 1999 the G6 arrived featuring copper interconnects, raising the frequency to 637 MHz, higher than the fastest DEC machines at the time.
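Taken at face value, the two G4 figures imply the net trade-off worked out below; this is a back-of-envelope reading of the numbers above, not a published benchmark.

    # Net effect of a 70% clock-frequency gain combined with a 23% IPC
    # (instructions-per-cycle) loss, relative to the G3 at process parity.
    relative_throughput = 1.70 * (1 - 0.23)
    print(relative_throughput)   # ~1.31, i.e. roughly 31% more work per second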
https://en.wikipedia.org/wiki?curid=53030665
433,742
155,694
Critics have analyzed the Evangelion units, noting how they are organic entities and not machines, and have classified them as cyborgs rather than robots. Anno said in 1996 that the Evangelions are not really robots but giants, describing them as modified humans who move via artificial muscles. During an event in 2021, he offered a description of "Evangelion" that surprised Shinji's voice actress Megumi Ogata. American writer Susan J. Napier has interpreted the Angels as a representation of the Other and the Evas as a metaphor for the Self that Shinji is called upon to confront in his own personal growth process. Critics and official publications have compared the red core in the center of the chest and the units' limit of operation to Ultraman. Timothy Donohoo of Comic Book Resources similarly compared the Evangelions to the Ideon from Yoshiyuki Tomino's series of the same name, noting how both mecha have a divine nature, seemingly unlimited energy sources, and the ability to read the emotions of their pilots, trying to protect them. Carl Gustav Horn traced influences to "Super Robot 28" and "Giant Robo". The Entry Plug and the Type D equipment, used by Asuka in the tenth episode of the series, have also been juxtaposed with the works of Kenji Yanobe. The Evangelion has been interpreted as a metaphor for the series itself, produced through emotional effort by Anno during a difficult time in his life; the constant questions of Shinji, the director's alter ego, who continually wonders why he should board Eva-01 and must make himself emotionally independent of the humanoid, have been interpreted as a symbolic representation of the author's relationship with his work. In the last two episodes Shinji wonders what the reason is for piloting the Eva. According to Sadamoto, those episodes focus not on Shinji but on Anno, the true protagonist of the episodes, while mangaka Kentaro Takekuma interpreted the act of boarding the Eva as a metaphor for the production of an anime.
https://en.wikipedia.org/wiki?curid=577372
155,623
400,905
Thirty minutes of moderate- to high-intensity physical exercise has been shown to induce an increase in urinary phenylacetic acid, the primary metabolite of phenethylamine. Two reviews noted a study where the mean 24-hour urinary phenylacetic acid concentration following just 30 minutes of intense exercise rose 77% above its base level; the reviews suggest that phenethylamine synthesis sharply increases during physical exercise, during which it is rapidly metabolized due to its short half-life of roughly 30 seconds. In a resting state, phenethylamine is synthesized in catecholamine neurons from L-phenylalanine by aromatic amino acid decarboxylase at approximately the same rate as dopamine is produced. Monoamine oxidase deaminates primary and secondary amines that are free in the neuronal cytoplasm but not those bound in storage vesicles of the sympathetic neurone. Similarly, β-PEA would not be deaminated in the gut as it is a selective substrate for MAO-B, which is not found in the gut. Brain levels of endogenous trace amines are several hundred-fold below those for the classical neurotransmitters noradrenaline, dopamine, and serotonin, but their rates of synthesis are equivalent to those of noradrenaline and dopamine and they have a very rapid turnover rate. Endogenous extracellular tissue levels of trace amines measured in the brain are in the low nanomolar range. These low concentrations arise because of their very short half-life. Because of the pharmacological relationship between phenethylamine and amphetamine, the original paper and both reviews suggest that phenethylamine plays a prominent role in mediating the mood-enhancing euphoric effects of a runner's high, as both phenethylamine and amphetamine are potent euphoriants.
https://en.wikipedia.org/wiki?curid=381184
400,706
988,061
Udet was assigned a new Fokker to fly to his new fighter unit—FFA 68—at Habsheim. Mechanically defective, the plane crashed into a hangar when he took off, so he was then given an older Fokker to fly. In this aircraft, he experienced his first aerial combat, which almost ended in disaster. While lining up on a French Caudron, Udet found he could not bring himself to fire on another person and was subsequently fired on by the Frenchman. A bullet grazed his cheek and smashed his flying goggles. Udet survived the encounter, but from then on learned to attack aggressively and began scoring victories, downing his first French opponent on 18 March 1916. On that occasion, he had scrambled to attack two French aircraft, but instead found himself facing a formation of 23 enemy aircraft. He dived from above and behind, giving his Fokker E.III full throttle, and opened fire on a Farman F.40 from close range. Udet pulled away, leaving the flaming bomber trailing smoke, only to see the observer fall from the rear seat of the stricken craft. He later described the incident: "The fuselage of the Farman dives down past me like a giant torch... A man, his arms and legs spread out like a frog's, falls past--the observer. At the moment, I don't think of them as human beings. I feel only one thing--victory, triumph, victory." The victory won Udet the Iron Cross First Class.
https://en.wikipedia.org/wiki?curid=370934
987,545
2,117,270
Researchers began to change tone in the late 1950s and early 1960s, when Lenneberg, Chomsky, and Halle co-founded the field of biolinguistics and explored the role of biology in language. Their ideas led others to give more consideration to the role of human development. In 1962, a turning point came from a study that emphasized the importance of controlling for factors like age, sex, and SES, as well as of having a standardized measure for bilingualism when selecting a sample of bilinguals to be studied. Researchers carefully matched bilingual to monolingual participants and found that the bilinguals appeared to have significant advantages over their monolingual peers, outperforming them in both verbal and non-verbal tests, and especially in the non-verbal tests. Following this study, research began to shift focus, investigating areas of cognitive development and aptitude such as perception and executive functioning. In 1967, the publication of Lenneberg's seminal book, "Biological Foundations of Language", first introduced the idea of a critical period of language acquisition, now better known as a sensitive period, and further influenced ideas about bilingualism. In 1977 the American Institutes for Research published an influential study which discussed bilingualism as it relates to education and how it affects a child's performance compared to peers. This study played a large role in our understanding of multilingualism and the effects that it has on the brain.
https://en.wikipedia.org/wiki?curid=3289570
2,116,052
1,201,752
NIFC-CA relies on the use of data-links to provide every aircraft and ship with a picture of the entire battlespace. Aircraft deploying weapons may not need to control missiles after releasing them, as an E-2D would guide them by a data-stream to the target. Other aircraft are also capable of guiding missiles from other aircraft to any target that is identified, as long as they are in range; work on weapons that are more survivable and longer-ranged is underway to increase their effectiveness in the data-link-centric battle strategy. This can allow forward-deployed Super Hornets or Lightning IIs to receive data and launch weapons without even needing to have their own radars active. E-2Ds act as the central node of NIFC-CA to connect the strike group with the carrier, but every aircraft is connected to all others through their own links. Two Advanced Hawkeyes would move data using the tactical targeting network technology (TTNT) waveform to share vast amounts of data over long distances with very low latency. Other aircraft would be connected to the E-2D through Link 16 or concurrent multi-netting-4 (CMN-4), a variant of four Link 16 radio receivers "stacked up" on top of each other. Growlers would coordinate with each other using data-links to locate hostile radar emitters on land or on the ocean surface. Having several sensors widely dispersed also hardens the system against electronic warfare; not all of them can be jammed, so the ones that are not can home in on the jamming energy and target it for destruction. The network is built with redundancy to make it difficult to jam over a broad geographic area. If an enemy tries to disrupt it by targeting space-based communications, a line-of-sight network can be created.
https://en.wikipedia.org/wiki?curid=41509737
1,201,111
260,628
Gastroesophageal reflux disease (GERD) is a condition that results from stomach contents consistently coming back up into the esophagus, causing troublesome symptoms or complications. Symptoms are considered troublesome based on how disruptive they are to a patient's daily life and well-being. This definition was standardized by the Montreal Consensus in 2006. Symptoms include a painful feeling in the middle of the chest and feeling stomach contents coming back up into the mouth. Other symptoms include chest pain, nausea, difficulty swallowing, painful swallowing, coughing, and hoarseness. Risk factors include obesity, pregnancy, smoking, hiatal hernia, certain medications, and certain foods. Diagnosis is usually based on symptoms and medical history, with further testing only after treatment has been ineffective. Further diagnosis can be achieved by measuring how much acid enters the esophagus or by examining the esophagus with a scope. Treatment and management options include lifestyle modifications, medications, and surgery if there is no improvement with other interventions. Lifestyle modifications include not lying down for three hours after eating, lying on the left side, elevating the head while lying down by raising the head of the bed or using extra pillows, losing weight, stopping smoking, and avoiding coffee, mint, alcohol, chocolate, fatty foods, acidic foods, and spicy foods. Medications include antacids, proton pump inhibitors, and H2 receptor blockers. The usual surgical procedure is a Nissen fundoplication. Complications of longstanding GERD can include inflammation of the esophagus that may cause bleeding or ulcer formation, narrowing of the esophagus leading to swallowing issues, a change in the lining of the esophagus that can increase the chances of developing cancer (Barrett's esophagus), chronic cough, asthma, inflammation of the larynx leading to hoarseness, and wearing away of tooth enamel leading to dental issues.
https://en.wikipedia.org/wiki?curid=12976
260,494
849,599
The discovery of bacterial invertebrate chemoautotrophic symbiosis, particularly in vestimentiferan tubeworms "R. pachyptila" and then in vesicomyid clams and mytilid mussels, revealed the chemoautotrophic potential of the hydrothermal vent tube worm. Scientists discovered a remarkable source of nutrition that helps to sustain the conspicuous biomass of invertebrates at vents. Many studies focusing on this type of symbiosis revealed the presence of chemoautotrophic, endosymbiotic, sulfur-oxidizing bacteria mainly in "R. pachyptila", which inhabits extreme environments and is adapted to the particular composition of the mixed volcanic and sea waters. This special environment is filled with inorganic metabolites, essentially carbon, nitrogen, oxygen, and sulfur. In its adult phase, "R. pachyptila" lacks a digestive system. To provide for its energetic needs, it retains those dissolved inorganic nutrients (sulfide, carbon dioxide, oxygen, nitrogen) in its plume and transports them through a vascular system to the trophosome, which is suspended in paired coelomic cavities and is where the intracellular symbiotic bacteria are found. The trophosome is a soft tissue that runs through almost the whole length of the worm's coelom. It retains a large number of bacteria, on the order of 10⁹ bacteria per gram of fresh weight. Bacteria in the trophosome are retained inside bacteriocytes, thereby having no contact with the external environment. Thus, they rely on "R. pachyptila" for the assimilation of nutrients needed for the array of metabolic reactions they employ and for the excretion of waste products of carbon fixation pathways. At the same time, the tube worm depends completely on the microorganisms for the byproducts of their carbon fixation cycles that are needed for its growth.
https://en.wikipedia.org/wiki?curid=550334
849,147
784,958
Between 1910 and 1939, the laboratory was the base of the Eugenics Record Office of biologist Charles B. Davenport and his assistant Harry H. Laughlin, two prominent American eugenicists of the period. Davenport was director of the Carnegie Station from its inception until his retirement in 1934. In 1935 the Carnegie Institution sent a team to review the ERO's work, and as a result the ERO was ordered to stop all work. In 1939 the Institution withdrew funding for the ERO entirely, leading to its closure. The ERO's reports, articles, charts, and pedigrees were considered scientific facts in their day, but have since been discredited. Its closure came 15 years after its findings were incorporated into the National Origins Act (Immigration Act of 1924), which severely reduced the number of immigrants to America from southern and eastern Europe who, Harry Laughlin testified, were racially inferior to the Nordic immigrants from England and Germany. Charles Davenport was also the founder and the first director of the International Federation of Eugenics Organizations in 1925. Today, Cold Spring Harbor Laboratory maintains the full historical records, communications and artifacts of the ERO for historical, teaching and research purposes. The documents are housed in a campus archive and can be accessed online and in a series of multimedia websites.
https://en.wikipedia.org/wiki?curid=441300
784,538
1,404,471
The secret Distantly Controlled Boat (D.C.B.) Section of the Royal Navy's Signals School, Portsmouth, was set up to develop aerial radio systems for the control of unmanned naval vessels from 'mother' aircraft. This D.C.B. Section was based at Calshot under the command of Eric Gascoigne Robinson VC. On 1 September 1917 George Callaghan, Commander-in-Chief, The Nore, was informed that significant shore and mooring facilities were to be made available for D.C.B. trials and rehearsals in the Thames Estuary. Airfield facilities were also requested in the area for the 'mother' aircraft. On 9 October 1917 the Deputy Director of Naval Construction, William Henry Gard, assessed HMS Carron for use as a D.C.B. blocking ship, but it was considered more suitable as a parent ship and floating repair depot for the D.C.B. Section. By then this D.C.B. Section had access to many vessels, including a submarine, and to the necessary support of aircraft, pilots and trained radio control operators. They had conducted trials guiding unmanned boats into the busy waters around Portsmouth Harbour. Then, between 28 May and 31 May 1918, trials were undertaken by the Royal Navy Dover Command, using operators in Armstrong Whitworth aircraft Nos. 5082 and 5117, in the charge of Captain Tate, R.A.F., to control the boats. During these trials Acting Vice-Admiral Sir Roger J. B. Keyes and Rear-Admiral Cecil Frederick Dampier were on board one of these D.C.B.s while it was remotely controlled. Trials included steering the boats through 'gates' created by motor launches anchored 60 yards apart. A significant number of high-ranking and senior Admiralty, naval and political officials are referenced in the surviving records.
https://en.wikipedia.org/wiki?curid=67154389
1,403,681
153,388
Historically, there was a trend not only to dismiss the vermiform appendix as being uselessly vestigial, but an anatomical hazard, a liability to dangerous inflammation. As late as the mid-20th century, many reputable authorities conceded it no beneficial function. This was a view supported, or perhaps inspired, by Darwin himself in the 1874 edition of his book "The Descent of Man, and Selection in Relation to Sex". The organ's patent liability to appendicitis and its poorly understood role left the appendix open to blame for a number of possibly unrelated conditions. For example, in 1916, a surgeon claimed that removal of the appendix had cured several cases of trifacial neuralgia and other nerve pain about the head and face, even though he stated that the evidence for appendicitis in those patients was inconclusive. The discovery of hormones and hormonal principles, notably by Bayliss and Starling, argued against these views, but in the early twentieth century, there remained a great deal of fundamental research to be done on the functions of large parts of the digestive tract. In 1916, an author found it necessary to argue against the idea that the colon had no important function and that "the ultimate disappearance of the appendix is a coordinate action and not necessarily associated with such frequent inflammations as we are witnessing in the human".
https://en.wikipedia.org/wiki?curid=12082283
153,318
789,396
A user's motor abilities, communication skills and needs, cognition and vision are assessed in order to determine the most appropriate match to a communication system. Depending on the individual's physical status, recommendations of an alternative access method, a change in seating/positioning, a mounting system and/or communication aid adaptations may be needed. For example, someone with spastic arm movements may require a key guard on top of the keyboard or touchscreen to reduce the selection of non-target items. The person's needs and abilities determine the symbols chosen and their organization, with the goal being that the communication system can be used as efficiently as possible in different contexts, with different communication partners, and for different social purposes. Researcher Janice Light identified four social purposes of communicative interaction in AAC: the expression of needs and wants to a listener, the transfer of information as in more general conversation, the development of social closeness through such things as jokes and cheering, and finally social etiquette practices such as "please" and "thank you". These four purposes vary in terms of the relative importance of the content, rate, duration and the focus of the interaction. It is important that the AAC systems selected also reflect the priorities of the individual and their family. In Western cultures, professionals may see a communication device as helping to promote an individual's self-determination, i.e., the ability to make one's own decisions and choices. However, cultural and religious factors may affect the degree to which individual autonomy is a valued construct, and influence family attitudes towards AAC.
https://en.wikipedia.org/wiki?curid=2106968
788,972
1,513,574
Mechanisms are conceptual elements of a networked communication system and are linked to specific functional units, appearing for example as a service or protocol component. In some cases, a mechanism can also comprise an entire protocol; for example, on the transmission layer, LTE can be regarded as such a mechanism. Following this definition, there exist numerous communication mechanisms that are partly equivalent in their basic functionality, such as Wi-Fi, Bluetooth and ZigBee for local wireless networks, and UMTS and LTE for broadband wireless connections. For example, LTE and Wi-Fi have equivalent basic functionality, but they are technologically significantly different in their design and operation. Mechanisms affected by transitions are often components of a protocol or service. For example, in the case of video streaming/transmission, different video data encodings can be used depending on the available data transmission rate. These changes are controlled and implemented by transitions; a research example is a context-aware video adaptation service to support mobile video applications. By analyzing the current processes in a communication system, it is possible to determine which transitions need to be executed at which communication layer in order to meet the quality requirements. In order for communication systems to adapt to the prevailing conditions, architectural approaches from self-organizing, adaptive systems can be used, such as the MAPE cycle (Monitor-Analyze-Plan-Execute). This central concept of Autonomic Computing can be used to determine the state of the communication system, to analyze the monitoring data, and to plan and execute the necessary transition(s). A central goal is that users do not consciously perceive a transition while running applications and that the functionality of the services used is perceived as smooth and fluid.
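A minimal sketch of such a MAPE-driven transition follows; the mechanisms, the monitored metric, and the threshold are all chosen purely for illustration and do not come from any specific system.

    # Toy MAPE (Monitor-Analyze-Plan-Execute) loop driving a mechanism transition.
    def monitor():
        # Stand-in for measuring the current link; values are invented.
        return {"bandwidth_mbps": 3.0}

    def analyze(state, required_mbps=5.0):
        # Decide whether the running service still meets its quality target.
        return state["bandwidth_mbps"] < required_mbps

    def plan(current):
        # Choose a functionally equivalent mechanism to switch to.
        return "LTE" if current == "WiFi" else "WiFi"

    def execute(target):
        print(f"executing transition to {target}")

    mechanism = "WiFi"
    if analyze(monitor()):
        mechanism = plan(mechanism)
        execute(mechanism)   # ideally imperceptible to the running application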
https://en.wikipedia.org/wiki?curid=53650506
1,512,723
1,920,489
The radar evidence can be difficult to understand due to scattering effects of the layers in the SPLD on radar reflections (according to an eLetter by Hecht et al. replying to the original publication, along with other sources). As a result, further work has focused on explaining how the freezing temperature at the base of the SPLD might be lowered due to a combination of perchlorate salt and enhanced regional geothermal flux. Following the detection of perchlorate in the northern plains of Mars by the Phoenix lander, it was predicted that perchlorate could allow a brine 1–3 meters deep to exist at the base of the northern ice cap of Mars. Perchlorate is a salt now considered to be widespread on Mars and is known to lower the freezing point of water. The studies in support of the subglacial lake hypothesis proposed that magnesium and calcium perchlorate at the base of the SPLD would lower the freezing point of water to temperatures as low as 204 and 198 K, thereby allowing the existence of briny liquid water. However, even taking into account perchlorate, computer simulations predict the temperature to still be too cold for liquid water to exist at the bottom of the southern ice cap. This is due to a small amount of pressure melting (Mars' gravity is about a third of Earth's) that would only lower the melting point by 0.3–0.5 K, and an estimated low geothermal heat flux of 14–30 mW/m². A geothermal heat flux greater than 72 mW/m² would support the subglacial lake, thus requiring a local enhancement in the heat flux, perhaps sourced by geologically recent (within hundreds of thousands of years) magmatism in the subsurface. Similarly, another study based on the surface topography and ice thickness found that the radar detection did not coincide with their predictions of locations for subglacial lakes based on hydrological potential, and as a result, they proposed the detection was due to a localized patch of basal melting rather than a lake.
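The quoted 0.3–0.5 K figure can be sanity-checked with a back-of-envelope pressure-melting estimate; the density, thickness, and melting-slope values below are assumptions for illustration, not values taken from the studies.

    # Rough pressure-melting check at the base of the south polar layered deposits.
    rho = 1000.0        # kg/m^3, assumed density of the overlying ice
    g_mars = 3.71       # m/s^2, Martian surface gravity
    h = 1500.0          # m, assumed SPLD thickness above the detection
    pressure_mpa = rho * g_mars * h / 1e6    # basal pressure, ~5.6 MPa
    depression_k = 0.074 * pressure_mpa      # ~0.41 K, using the standard
                                             # water-ice melting slope of ~0.074 K/MPa
    print(pressure_mpa, depression_k)        # lands inside the quoted 0.3-0.5 K range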
https://en.wikipedia.org/wiki?curid=70926421
1,919,386
299,137
The chromosomes of the "T. pallidum" subspecies are small, at about 1.14 Mbp. Their DNA sequences are more than 99.7% identical. "T. pallidum" subspecies "pallidum" was sequenced in 1998. This sequencing is significant because "T. pallidum" cannot be grown in pure culture, so the sequence played an important role in understanding the microbe's functions. It revealed that "T. pallidum" relies on its host for many molecules provided by biosynthetic pathways, and that it is missing genes responsible for encoding key enzymes in oxidative phosphorylation and the tricarboxylic acid cycle; consistent with this reliance on the host, about 5% of "T. pallidum"'s genes code for transport proteins. The recent sequencing of the genomes of several spirochetes permits a thorough analysis of the similarities and differences within this bacterial phylum and within the species. "T. p. pallidum" has one of the smallest bacterial genomes at 1.14 million base pairs, and has limited metabolic capabilities, reflecting its adaptation through genome reduction to the rich environment of mammalian tissue. The shape of "T. pallidum" is flat and wavy. To avoid attack by antibodies, the cell has few proteins exposed on the outer membrane sheath. Its chromosome of about 1000 kilobase pairs is circular, with an average G + C content of 52.8%. Sequencing has revealed that a bundle of twelve proteins and some putative hemolysins are potential virulence factors of "T. pallidum". 92.9% of the DNA was determined to consist of ORFs, 55% of which had predicted biological functions.
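For readers unfamiliar with the G + C statistic quoted above, here is a minimal sketch of how it is computed; the input string is a toy sequence, not the actual chromosome.

    # GC content: fraction of G and C bases in a DNA sequence.
    def gc_content(seq):
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    print(gc_content("ATGCGCGTATTCCGGA"))   # 0.5625 for this toy input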
https://en.wikipedia.org/wiki?curid=61836
298,976
2,110,241
The history of adaptive collaborative control began in 1999 through the efforts of Terrence Fong and Charles Thorpe of Carnegie Mellon University and Charles Baur of École Polytechnique Fédérale de Lausanne. Fong et al. believed existing telerobotic practices, which centered on a human point of view, while sufficient for some domains, were sub-optimal for operating multiple vehicles or controlling planetary rovers. The new approach devised by Fong et al. focused on a robot-centric teleoperation model that treated the human as a peer and made requests to them in the manner a person would seek advice from experts. In this initial work, Fong et al. implemented the collaborative control design using a PioneerAT mobile robot and a UNIX workstation with wireless communications and distributed message-based computing. Two years later, Fong utilized collaborative control for several more applications, including the collaboration of a single human operator with multiple mobile robots for surveillance and reconnaissance. Around this same time, Goldberg and Chen presented an adaptive collaborative control system that possessed malfunctioning sources. The control design proved to create a model that maintained robust performance when subjected to a sizeable fraction of malfunctioning sources. In this work, Goldberg and Chen expanded the definition of collaborative control to include multiple sensors and multiple control processes, in addition to human operators, as sources. A collaborative, cognitive workspace in the form of a three-dimensional representation, developed by Idaho National Laboratory to support understanding of tasks and environments for human operators, expounds on Fong's seminal work, which used textual dialogue as the mode of human-robot interaction. The success of the 3-D display provided evidence of the use of mental models for increased team success. During that same time, Fong et al. developed a three-dimensional display that was formed via a fusion of sensor data. A recent adaptation of adaptive collaborative control, in 2010, was used to design a fault-tolerant control system using a Lyapunov-function-based analysis.
https://en.wikipedia.org/wiki?curid=33882236
2,109,027
50,608
Other innovative technologies introduced in fourth-generation fighters included pulse-Doppler fire-control radars (providing a "look-down/shoot-down" capability), head-up displays (HUD), "hands on throttle-and-stick" (HOTAS) controls, and multi-function displays (MFD), all of which became essential equipment. Aircraft designers began to incorporate composite materials in the form of bonded-aluminum honeycomb structural elements and graphite epoxy laminate skins to reduce weight. Infrared search-and-track (IRST) sensors became widespread for air-to-ground weapons delivery, and appeared for air-to-air combat as well. "All-aspect" IR AAM became standard air superiority weapons, which permitted engagement of enemy aircraft from any angle (although the field of view remained relatively limited). The first long-range active-radar-homing RF AAM entered service with the AIM-54 Phoenix, which solely equipped the Grumman F-14 Tomcat, one of the few variable-sweep-wing fighter designs to enter production. Even with the tremendous advancement of air-to-air missiles in this era, internal guns were standard equipment.
https://en.wikipedia.org/wiki?curid=10929
50,588
1,838,001
Branched flow refers to a phenomenon in wave dynamics that produces a tree-like pattern involving successive, mostly forward-scattering events by smooth obstacles deflecting traveling rays or waves. Sudden and significant momentum or wavevector changes are absent, but accumulated small changes can lead to large momentum changes. The path of a single ray is less important than the environs around a ray, which rotate, compress, and stretch in an area-preserving way. Even more revealing are groups, or manifolds, of neighboring rays extending over significant zones. Starting rays out from a point but varying their direction over a range, one to the next, or starting them from different points along a line all with the same initial direction, are examples of a manifold. Waves have analogous launching conditions, such as a point source spraying in many directions, or an extended plane wave heading in one direction. The ray bending or refraction leads to characteristic structure in phase space and nonuniform distributions in coordinate space that look somewhat universal and resemble branches in trees or stream beds. The branches take on non-obvious paths through the refracting landscape that are indirect and nonlocal results of terrain already traversed. For a given refracting landscape, the branches will look completely different depending on the initial manifold.
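A compact numerical sketch of the effect, under assumed parameters: launch a manifold of initially parallel rays across a smooth random potential and let small transverse kicks accumulate; plotting the returned trajectories reveals the bunching into branches.

    # Trace a fan of rays through a smooth random potential (toy branched flow).
    import numpy as np

    rng = np.random.default_rng(0)
    # Smooth random "terrain": a sum of a few long-wavelength cosines.
    kx, ky = rng.normal(size=(2, 30)) * 2.0
    phase = rng.uniform(0.0, 2.0 * np.pi, 30)
    amp = 0.02

    def force_y(x, y):
        # Transverse force -dV/dy for V(x, y) = amp * sum cos(kx*x + ky*y + phase).
        return amp * np.sum(ky * np.sin(kx * x + ky * y + phase))

    def trace(y0, steps=2000, dt=0.01):
        # March one ray forward at unit horizontal speed, accumulating
        # small, smooth transverse deflections (no sudden scattering).
        x, y, vy = 0.0, y0, 0.0
        path = []
        for _ in range(steps):
            vy += force_y(x, y) * dt
            x += dt
            y += vy * dt
            path.append(y)
        return path

    # A manifold of rays: same initial direction, launch points along a line.
    fan = [trace(y0) for y0 in np.linspace(-1.0, 1.0, 50)]
    print(len(fan), len(fan[0]))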
https://en.wikipedia.org/wiki?curid=66113656
1,836,952
710,575
In 1980, when the members of the new SAE student branch at the University of Texas (Austin) learned that the Mini-Indy had died, they generated the concept for a new intercollegiate student engineering design competition that would allow students to apply what they were learning in the classroom to a complex, real-world engineering design problem: the design and development of a race car. UT SAE student branch members Robert Edwards and John Tellkamp led a discussion among UT SAE members and envisioned a competition that would involve designing and constructing a race car along the lines of the SCCA Formula 440 entry-level racing series that was popular at the time. Prof. Matthews came up with the "Formula SAE" name, following the format of Formula A and Formula Vee but emphasizing that this new series was an engineering competition rather than a driver's competition. Schools would meet after the end of the academic year to compete and determine who had built the best car. Edwards, Tellkamp, and fellow UT SAE students Joe Green, Dick Morton, Mike Best, and Carl Morris drafted a set of safety and competition rules and presented them to the SAE student branch membership and to UT SAE Faculty Advisor Prof. Ron Matthews. Prof. Matthews then contacted Bob Sechler of the SAE Educational Relations Department at SAE headquarters and asked for his permission both to establish the new intercollegiate student engineering design competition and to host the first Formula SAE competition during the summer of 1981, and Sechler agreed. The newly formed UT SAE branch, consisting mostly of automotive and motorcycle enthusiasts pursuing engineering degrees (including some experienced auto mechanics and several who had left careers in fields whose job markets had virtually disappeared in the depressed economy of the early 1980s), embraced and adopted the concept with little idea of what they were getting themselves into. SAE student branch officers Mike Best, Carl Morris, and Sylvia Obregon, along with Dr. Matthews, began planning and organizing the event to be held the following year.
https://en.wikipedia.org/wiki?curid=278148
710,204
1,209,967
Vico’s humanism (his returning to a pre-modern form of reasoning), his interest in classical rhetoric and philology, and his response to Descartes contribute to the philosophical foundations for the second "Scienza Nuova". Through an elaborate Latin etymology, Vico establishes not only the distinguishing features of first humans, but also how early civilization developed out of a "sensus communis" or common (not collective) sense. Beginning with the first form of authority intuited by the "giganti" or early humans and transposed in their first "mute" or "sign" language, Vico concludes that “first, or vulgar, wisdom was poetic in nature.” This observation is not an aesthetic one, but rather points to the capacity inherent in all men to imagine meaning via comparison and to reach a communal "conscience" or "prejudice" about their surroundings. The metaphors that define the poetic age gradually yield to the first civic discourse, finally leading to a time characterized by "full-fledged reason" ("ragione tutta spiegata"), in which reason and right are exposed to the point that they vanish into their own superficial appearance. At this point, speech returns to its primitive condition, and with it men. Hence the "recurring" ("ricorso") of life to "barbarism" ("barbarie"). It is by way of warning his age and those stemming from it of the danger of seeking truth in clear and distinct ideas blinding us to the real depths of life, that Vico calls our attention back to a classical art of moderating the course of human things, lest the liberty enjoyed in the "Republic" be supplanted by the anarchic tyranny of the senses.
https://en.wikipedia.org/wiki?curid=28999214
1,209,320
1,083,062
Interpreting the vibration signal obtained is an elaborate procedure that requires specialized training and experience. It is simplified by the use of state-of-the-art technologies that perform most of the data analysis automatically and deliver information rather than raw data. One commonly employed technique is to examine the individual frequencies present in the signal. These frequencies correspond to certain mechanical components (for example, the various pieces that make up a rolling-element bearing) or certain malfunctions (such as shaft unbalance or misalignment). By examining these frequencies and their harmonics, the CM specialist can often identify the location and type of problem, and sometimes the root cause as well. For example, high vibration at the frequency corresponding to the speed of rotation is most often due to residual imbalance and is corrected by balancing the machine. A degrading rolling-element bearing, on the other hand, will usually exhibit vibration signals at specific frequencies that increase in intensity as it wears. Special analysis instruments can detect this wear weeks or even months before failure, giving ample warning to schedule replacement before a failure that could cause a much longer downtime. Despite all the sensors and data analysis, it is important to keep in mind that more than 80% of complex mechanical equipment fails randomly, without any relation to its life-cycle stage.
https://en.wikipedia.org/wiki?curid=2890021
1,082,505
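As a concrete illustration of the frequency-domain approach just described, the Python sketch below builds a synthetic vibration record (the shaft speed, amplitudes, and bearing-tone multiplier are all invented for the example) and locates the dominant spectral peak at 1x running speed, the signature of residual imbalance:

```python
import numpy as np

# Synthetic vibration record: shaft at 29.5 Hz (1770 rpm), 5 kHz sampling.
fs = 5000
t = np.arange(0, 2.0, 1 / fs)                 # 2-second record
run_speed = 29.5                              # shaft rotation frequency, Hz
rng = np.random.default_rng(1)
signal = (1.0 * np.sin(2 * np.pi * run_speed * t)           # 1x: imbalance
          + 0.2 * np.sin(2 * np.pi * 3.57 * run_speed * t)  # non-integer
          #   multiple, as is typical of a bearing defect tone (illustrative)
          + 0.05 * rng.standard_normal(t.size))             # broadband noise

# One-sided amplitude spectrum: faults appear as discrete peaks.
spectrum = np.abs(np.fft.rfft(signal)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(spectrum)]
print(f"dominant peak: {peak:.1f} Hz = {peak / run_speed:.2f}x running speed")
# -> 29.5 Hz = 1.00x: points to imbalance, corrected by balancing the rotor.
```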
1,847,585
Historically, a presumed gender binary has associated men with the “public” sphere and women with the “private” sphere. This is often explained by the historical division of labor in hunter-gatherer societies, where men were responsible for procuring large game and women were responsible for child-rearing and gathering plant food sources. Friedrich Engels argued that the sexual division of labor was an "outgrowth of nature" because men went to war and hunted while women cared for the house and prepared food. This argument is no longer accepted today: women are engaged in productive, active work inside and outside the home all over the world. Anthropological inquiry into the sexual division of labor in the domestic domain began with Ester Boserup's work on women and economic development. A universally assumed Western division between the “domestic/house/private” space and the “public/exterior male world” exists, and it launched a structured division between gendered social worlds. Women have made a substantial contribution to subsistence activities and household production. The structured system of social relationships also implies an “idealized” gendered identity associated with a public vs. private dichotomy. Political actions and socioeconomic motivations shape male and female actions in domestic groups. The mother-child relationship is often regarded as the nucleus of all family groups. Reproduction involves physical reproduction, child-rearing, and socialization, and households can be organized to achieve these tasks collectively; gender themes emerge as a result. Household studies of space and place often involve function and activity: site layouts and conceptions of households provide information about gender roles in the past.
https://en.wikipedia.org/wiki?curid=29970881
1,846,528
893,019
On 25 November 2009, a debate was held in the Australian Senate concerning the alleged involvement of the CSIRO and the Labor government in censorship. The debate was called for by opposition parties after evidence came to light that a paper critical of carbon emissions trading was being suppressed. At the time, the Labor government was trying to get such a scheme through the Senate. After the debate, the Science Minister, Kim Carr, was forced to release the paper, but when doing so in the Senate he also delivered a letter from the CEO of the CSIRO, Megan Clark, which attacked the report's author and threatened him with unspecified punishment. The author of the paper, Clive Spash, was cited in the press as having been bullied and harassed, and later gave a radio interview about this. In the midst of the affair, CSIRO management had considered releasing the paper with edits that Nature reported would be "tiny". Spash claimed the changes actually demanded amounted to censorship and resigned. He later posted on his website a document detailing the text that CSIRO management demanded be deleted; by itself, this document forms a coherent set of statements criticising emissions trading without any additional wording needed. In subsequent Senate Estimates hearings during 2010, Senator Carr and Clark went on record claiming the paper was originally stopped from publication solely due to its low quality not meeting CSIRO standards. At the time of its attempted suppression, the paper had been accepted for publication in an academic journal, New Political Economy, which in 2010 had been ranked by the Australian Research Council as an 'A class' publication. In an ABC radio interview, Spash called for a Senate enquiry into the affair and the role played by senior management and the Science Minister. After these events, the "Sydney Morning Herald" reported that "Questions are being raised about the closeness of BHP Billiton and the CSIRO under its chief executive, Megan Clark". After his resignation, an unedited version of the paper was released by Spash as a discussion paper, and later published as an academic journal article.
https://en.wikipedia.org/wiki?curid=288262
892,549
1,920,731
The study also made conclusions about what hadrosaurids ate, although Purnell cautioned that the conclusions about the hadrosaur's diet were "a little less secure than the very good evidence we have for the motions of the teeth relative to each other." The scratches found on each individual tooth were so uniform that measuring an area of just one square millimeter was enough to sample the whole jaw. The team concluded that the evenness of the scratches suggested the hadrosaur used the same series of jaw motions over and over again. As a result, the study determined that the hadrosaur diet was probably made up of leaves and lacked bulkier items such as twigs or stems, which might have required a different chewing method and created different wear patterns. The lack of pit marks on the teeth also supported these conclusions, suggesting the hadrosaurs likely grazed on low-lying vegetation rather than browsing on higher-growing vegetation whose twigs would have left pits. The scratches also indicated the hadrosaur's food contained either small particles of grit, which is normal for vegetation cropped close to the ground, or microscopic granules of silica, which are common in grass. Grasses had evolved by the Late Cretaceous period but were not particularly common, so the study concluded that grass probably did not form a major component of the hadrosaur's diet. Instead, the team believed horsetails, a common plant at the time with the above characteristics, were probably an important food for the dinosaur. The results of the study were published online on June 30, 2009, in "Proceedings of the National Academy of Sciences", the official journal of the United States National Academy of Sciences. The study was published under the title "Quantitative analysis of dental microwear in hadrosaurid dinosaurs, and the implications for hypotheses of jaw mechanics and feeding".
https://en.wikipedia.org/wiki?curid=23481302
1,919,628
1,073,103
In May 2019, Maxar Technologies was contracted by NASA to manufacture this module, which will also supply the station with electrical power and is based on Maxar's SSL 1300 series satellite bus. The PPE will use Busek 6 kW Hall-effect thrusters and NASA Advanced Electric Propulsion System (AEPS) Hall-effect thrusters. Maxar was awarded a firm-fixed-price contract of US$375 million to build the PPE. Maxar's SSL business unit, previously known as Space Systems/Loral, will lead the project. Maxar stated it will receive help from Blue Origin and Draper Laboratory on the project, with Blue Origin assisting in human-rating and safety aspects while Draper will work on trajectory and navigation development. NASA is supplying the PPE with an S-band communications system to provide a radio link with nearby vehicles and a passive docking adapter to receive the Gateway's future utilization module. Maxar stated it is experienced in dealing with high-power components from its satellite work. The company noted that its satellites run at around 20 to 30 kilowatts while the PPE will be about 60 kilowatts, but said much of the technology it has already developed will still be applicable. After a one-year demonstration period, NASA would then "exercise a contract option to take over control of the spacecraft". Its expected service time is about 15 years.
https://en.wikipedia.org/wiki?curid=66112244
1,072,549
1,558,064
This phenomenon is generally stimulated to help improve the mobility of people with akinesia. Akinesia consists of changes in walking pattern, freezing of gait (FOG), and losses of balance (LOBs). FOG occurs in the middle of a stride, cutting off walking and making it fairly difficult for a person to re-initiate movement; kinesia paradoxa can be used as a management strategy to overcome this. LOBs occur when a person has difficulty maintaining an upright position and loses their balance, eventually leading to a fall. Since Parkinson's disease is progressive, patients' symptoms continue to worsen with time, and they often develop visible differences in their walking that greatly affect their quality of life. These differences include shuffling of steps, decreased stride length, and a decrease in overall movement. Kinesia paradoxa cannot be stimulated in everyone with movement disorders; people who can stimulate this phenomenon demonstrate visible improvements in mobility, including increased stride length, more fluidity in strides, fewer FOG incidents, and fewer LOBs, and those who appeared to be completely frozen previously can regain their movement. More recently, kinesia paradoxa is also being used to treat children with Asperger's syndrome. Children with Asperger's demonstrate excellent skills in drawing, modeling, building, and computer games, but often struggle with everyday motor tasks such as walking or catching a ball. Kinesia paradoxa is currently being explored to help these individuals focus their attention and improve their efficiency in such simple motor tasks.
https://en.wikipedia.org/wiki?curid=33816120
1,557,179
634,047
Between December 1985 and July 1986, four Lakshya PTA prototypes powered by Microturbo TRI-60-5 engines were launched for trials. While the first two launches were successful, with planned flight times of 20 and 38 minutes respectively, the next two launches failed. By June 1994, 18 Lakshya PTA prototypes had been fabricated by ADE itself and 43 trials conducted, 24 of them between December 1985 and February 1992. Due to rigorous evaluation and stringent quality control, a total of 10 prototypes were lost during the testing phase between 1985 and 1990. The project was formally closed in June 1994, and a final closure report was issued in April 1995 after incurring a total expenditure of . The first 6 Lakshya drones were given to the Indian Air Force in 1998. Lakshya units are manufactured and overhauled at HAL's Aircraft Division, Bangalore. The Lakshya was formally inducted into the services by CAS AY Tipnis on 9 November 2000 at the Interim Test Range (ITR), Chandipur. On 9 May 2002, an upgraded version of the Lakshya featuring the new engine from HAL was flown from ITR Chandipur, bringing user trials to a close. On 6 November 2002, HAL announced that it had received an initial order for 25 Lakshya drones and that limited series production to satisfy the order for all three services had already begun. By 16 January 2003, the drone had completed over 100 flights.
https://en.wikipedia.org/wiki?curid=13152409
633,709
1,158,951
In December 1986, Võ Văn Kiệt, vice chairman of the Council of Ministers and member of the Political Bureau, highlighted most of the major problems of Vietnamese agriculture in his speech to the Twelfth Session of the Seventh National Assembly. While mentioning gains in fisheries and forestry, he noted that nearly all farming subsectors, constituting 80 percent of the agricultural sector, had failed to achieve plan targets for 1986. Kiệt blamed state agencies, such as the Council of Ministers, the State Planning Commission, and the Ministry of Foreign Trade, for their failure to ensure appropriate "material conditions" (chiefly sufficient quantities of chemical fertilizers and pesticides) for the growth of agricultural production. Kiệt also blamed the state price system for underproduction of key "industrial crops" that Vietnam exported, including jute, sugar, groundnut, coffee, tea, and rubber. Production levels of subsidiary food crops, such as sweet potatoes, corn, and manioc, had been declining for several years, both in relation to plan targets and in actual output. By contrast, livestock output, including that of cattle, poultry, buffalo, and hogs, was reported by the government to have continued its growth and to have met or exceeded targets, despite unstable prices and shortages of state-provided animal feed.
https://en.wikipedia.org/wiki?curid=10903239
1,158,337
61,456
While in developed countries, nontyphoidal serotypes present mostly as gastrointestinal disease, in sub-Saharan Africa, these serotypes can create a major problem in bloodstream infections, and are the most commonly isolated bacteria from the blood of those presenting with fever. Bloodstream infections caused by nontyphoidal salmonellae in Africa were reported in 2012 to have a case fatality rate of 20–25%. Most cases of invasive nontyphoidal "Salmonella" infection (iNTS) are caused by "Salmonella enterica" Typhimurium or "Salmonella enterica" Enteritidis. A new form of "Salmonella" Typhimurium (ST313) emerged in the southeast of the African continent 75 years ago, followed by a second wave which came out of central Africa 18 years later. This second wave of iNTS possibly originated in the Congo Basin, and early in the event picked up a gene that made it resistant to the antibiotic chloramphenicol. This created the need to use expensive antimicrobial drugs in areas of Africa that were very poor, making treatment difficult. The increased prevalence of iNTS in sub-Saharan Africa compared to other regions is thought to be due to the large proportion of the African population with some degree of immune suppression or impairment due to the burden of HIV, malaria, and malnutrition, especially in children. The genetic makeup of iNTS is evolving into a more typhoid-like bacterium, able to efficiently spread around the human body. Symptoms are reported to be diverse, including fever, hepatosplenomegaly, and respiratory symptoms, often with an absence of gastrointestinal symptoms.
https://en.wikipedia.org/wiki?curid=42114
61,431
772,928
Charcot first began studying hysteria after creating a special ward for non-insane females with "hystero-epilepsy". He discovered two distinct forms of hysteria among these women: minor hysteria and major hysteria. His interest in hysteria and hypnotism "developed at a time when the general public was fascinated in 'animal magnetism' and 'mesmerization'", which was later revealed to be a method of inducing hypnosis. His study of hysteria "attract[ed] both scientific and social notoriety". Bogousslavsky, Walusinski, and Veyrunes write: "Charcot and his school considered the ability to be hypnotized as a clinical feature of hysteria ... For the members of the Salpêtrière School, susceptibility to hypnotism was synonymous with disease, i.e. hysteria, although they later recognized ... that 'grand hypnotisme' (in hysterics) should be differentiated 'from petit hypnotisme', which corresponded to the hypnosis of ordinary people." Charcot argued vehemently against the widespread medical and popular prejudice that hysteria was rarely found in men, presenting several cases of traumatic male hysteria. He taught that due to this prejudice these "cases often went unrecognised, even by distinguished doctors" and could occur in such models of masculinity as railway engineers or soldiers. Charcot's analysis, in particular his view of hysteria as an organic condition which could be caused by trauma, paved the way for understanding neurological symptoms arising from industrial-accident or war-related traumas.
https://en.wikipedia.org/wiki?curid=932831
772,512
1,909,732
The Oregon Graduate Center for Study and Research (OGC) was incorporated on 2 April 1963 as a university at the behest of Gov. Mark O. Hatfield, Tektronix co-founder Howard Vollum and the City Club of Portland, with the help of a $2M grant from the Tektronix Foundation. Retired physician Samuel L. Diack of the Oregon Medical Research Foundation was named the first chairman of OGC's board of trustees, and Vollum was a board member. Diack is also noted as a founder of the Oregon Museum of Science and Industry. Physicist Donald L. Benedict of the Stanford Research Institute (SRI) was hired as the first president of OGC in 1966. The original campus, a former Martin Marietta building, was located at 9430 SW Barnes Road near the intersection of Oregon Route 217 and U.S. Route 26 in an unincorporated area just north of Beaverton next to Tek's Sunset facility. Hatfield was unsuccessful in his attempt to get $1.5M in seed funding for OGC from the state legislature. Financial support was an ongoing problem for OGC, as demonstrated by the brief terms of several of its presidents. Funding in the late 1960s was received from Pacific Northwest Bell Telephone Company, and sought from the U.S. Department of Health, Education and Welfare and the National Institutes of Health. Other early backers and board members included Douglas Strain of Electro Scientific Industries (ESI), John Gray of Omark Industries Inc. and Ira Keller of Western Kraft Corporation.
https://en.wikipedia.org/wiki?curid=48411848
1,908,634
247,892
After that, he returned to his hometown, Nagoya, and joined Toyota Boshoku, which had been founded by his father, Sakichi, in 1918 (Sakichi served as company president from then on). From July 1921 to February 1922, Kiichirō visited San Francisco, London, Oldham (a large town in Greater Manchester, England), and elsewhere to learn about the spinning and weaving industry, then returned from Marseille via Shanghai. After he returned to Japan, in December 1922 he married Hatako Iida, the daughter of the Takashimaya department store chain co-founder, Shinshichi Iida. In 1926, Kiichirō established Toyota Industries Corporation and became its managing director. He also became interested in automatic looms, setting up a pilot plant in Kariya, Aichi to begin developing them even though his father, Sakichi, disagreed. He travelled to Europe and America from September 1929 to April 1930, and came to believe that the automobile industry, then in its infancy, would develop greatly in the future. Therefore, in 1933, an automobile manufacturing department (later the automobile department) was newly established within Toyota Industries Corporation. In 1936, it was designated a licensed company under the Automotive Manufacturing Act, and in 1937 it became independent as Toyota Motor Corporation. Kiichirō became vice president the same year (the president was Rizaburo Toyoda), and in 1941 he took office as president. At the height of the Pacific theater of World War II, the Toyoda family was affected on both the business and home fronts: his children's education was delayed by civil disruptions, and his business was compelled to manufacture trucks for the Imperial Japanese Army. The family firms were spared destruction in the days before the Japanese government's surrender.
https://en.wikipedia.org/wiki?curid=4670679
247,764
928,708
In December 2014, Beretta announced the M9A3, which was submitted via an Engineering Change Proposal (ECP) in accordance with the terms of the current M9 contract. A modified version of the existing M9A1, the new model features a thinner grip, a MIL-STD-1913 accessory rail, removable tritium sights, a threaded barrel, and a sand-resistant 17-round magazine, produced in a dark earth tone color. Beretta claimed the M9A3 offered likely cost savings over the standard M9 model and met almost all of the enhanced handgun requirements. Later that month, the Army decided not to evaluate the M9A3 in favor of pursuing the MHS program, without asking any questions about the upgraded pistol or requesting a sample. Army weapons officials maintain that the M9 design does not meet requirements, and a cost-benefit analysis determined the old fleet would cost more to replace and repair than buying a new service pistol. Beretta claims the M9A3's upgraded features fix most of the complaints and that it could be sold for less than previous M9 versions; the company has suggested a dual-path strategy of evaluating commercially available options while simultaneously evaluating improvements.
https://en.wikipedia.org/wiki?curid=29098840
928,217
71,165
Mustard gases are extremely toxic and have powerful blistering effects on victims. Their alkylating capabilities make them strongly carcinogenic and mutagenic. Furthermore, they are highly lipophilic, which accelerates their absorption into the body. Because people exposed to mustard agents rarely suffer immediate symptoms, and contaminated areas may appear completely normal, victims can unknowingly receive high doses. Within 24 hours of exposure, victims experience intense itching and skin irritation. If this irritation goes untreated, blisters filled with yellow fluid can start to form wherever the agent contacted the skin. These are chemical burns and are very debilitating. Mustard gases easily penetrate clothing fabrics such as wool or cotton, so it is not only exposed skin that gets burned. If the victim's eyes were exposed, then they become sore, starting with conjunctivitis (also known as pink eye), after which the eyelids swell, resulting in temporary blindness. Extreme ocular exposure to mustard gas vapors may result in corneal ulceration, anterior chamber scarring, and neovascularization. In these severe and infrequent cases, corneal transplantation has been used as a treatment option. Miosis, when the pupil constricts more than usual, may also occur, which is probably the result of the cholinomimetic activity of mustard. At very high concentrations, if inhaled, mustard agents cause bleeding and blistering within the respiratory system, damaging mucous membranes and causing pulmonary edema. Depending on the level of contamination, mustard agent burns can vary between first- and second-degree burns, though they can also be every bit as severe, disfiguring, and dangerous as third-degree burns. Burns that are severe (i.e. covering more than 50% of the victim's skin) are often fatal, with death occurring after only days or weeks. Mild or moderate exposure to mustard gases is unlikely to kill, though victims still require lengthy periods of medical treatment and convalescence before recovery is complete.
https://en.wikipedia.org/wiki?curid=46126
71,138
173,436
As early as 1914, the possible existence of superheavy elements with atomic numbers well beyond that of uranium—then the heaviest known element—was suggested, when German physicist Richard Swinne proposed that superheavy elements around "Z" = 108 were a source of radiation in cosmic rays. Although he did not make any definitive observations, he hypothesized in 1931 that transuranium elements around "Z" = 100 or "Z" = 108 may be relatively long-lived and possibly exist in nature. In 1955, American physicist John Archibald Wheeler also proposed the existence of these elements; he is credited with the first usage of the term "superheavy element" in a 1958 paper published with Frederick Werner. This idea did not attract wide interest until a decade later, after improvements in the nuclear shell model. In this model, the atomic nucleus is built up in "shells", analogous to electron shells in atoms. Independently of each other, neutrons and protons have energy levels that are normally close together, but after a given shell is filled, it takes substantially more energy to start filling the next. Thus, the binding energy per nucleon reaches a local maximum and nuclei with filled shells are more stable than those without. This theory of a nuclear shell model originates in the 1930s, but it was not until 1949 that German physicists Maria Goeppert Mayer and Johannes Hans Daniel Jensen et al. independently devised the correct formulation.
https://en.wikipedia.org/wiki?curid=66394
173,345
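To make the shell-model argument concrete, the sketch below computes the smooth liquid-drop baseline that shell corrections modify. The Bethe-Weizsaecker coefficients are one common textbook set, not taken from this article; because the formula deliberately omits shell effects, real nuclei with filled shells at the magic numbers (2, 8, 20, 28, 50, 82, 126) are more bound than this smooth estimate, which is exactly the extra stability described above:

```python
def binding_energy(Z, A):
    """Semi-empirical (liquid-drop) binding energy in MeV.

    Textbook Bethe-Weizsaecker coefficients (exact values vary by fit);
    the smooth formula has NO shell term, so real nuclei at the magic
    numbers are more tightly bound than this baseline predicts.
    """
    aV, aS, aC, aA, aP = 15.75, 17.8, 0.711, 23.7, 11.18
    N = A - Z
    B = aV*A - aS*A**(2/3) - aC*Z*(Z-1)/A**(1/3) - aA*(N-Z)**2/A
    if Z % 2 == 0 and N % 2 == 0:
        B += aP / A**0.5      # even-even pairing bonus
    elif Z % 2 == 1 and N % 2 == 1:
        B -= aP / A**0.5      # odd-odd pairing penalty
    return B

# Binding energy per nucleon peaks near iron at roughly 8.8 MeV and
# declines toward the heaviest nuclei:
for Z, A in [(26, 56), (50, 120), (82, 208), (114, 298)]:
    print(Z, A, round(binding_energy(Z, A) / A, 2), "MeV/nucleon")
```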
2,162,213
After the Japanese attack on Pearl Harbor, at a December 10, 1941 assembly President Whitley advised ETSTC students to continue "getting an education until duty calls to something else". Students and faculty did everything from joining the military (including the WAVES for women) and coordinating Red Cross drives and morale committees to teaching courses on subjects such as first aid, maximizing food production, and Morse code. Numerous social fixtures at ETSTC were suspended during World War II, from Sadie Hawkins Day to intramural sports to the "Locust" student yearbook, although women's clubs remained active, often by buying war bonds and donating blood. In March 1942, the college was selected as a "war information center" in its region of the state. In 1943, ETSTC hosted approximately 300 women of the Women's Army Auxiliary Corps (WAAC), and later that year it did the same for a company of the Army Specialized Training Program (ASTP), which numbered 200 men in the 1943–44 term. ETSTC also participated in the Civilian Pilot Training Program beginning in 1940; 63 of its students went on to serve in the Army, and another 28 in the Navy. By the time the war ended in August 1945, 63 former students had been killed in the line of duty.
https://en.wikipedia.org/wiki?curid=49869717
2,160,978
1,255,805
The article continued to quote the U.S. Nuclear Energy Foundation to the effect that too much stock was being put into the fear surrounding the Fukushima power plant. On a policy level, United States officials were wary. U.S. Energy Secretary Steven Chu told Congress that the Obama administration intended to hold the course on underwriting new nuclear power plants. "The people in the United States, U.S. territories, are in no danger", Chu said during a "Fox News Sunday" broadcast. "It's unlikely they will be exposed to danger." In his ARPA-E Energy Innovation Summit 2011 keynote presentation, he skirted the nuclear issue and argued for a "longer term more measured approach". He emphasized lithium-ion batteries, high-speed rail, computerized design for streamlining long-haul trucks, carbon capture, and other technologies, noting that Europe and China may be surpassing the US in clean energy and roboticized manufacturing. The slides and a video of the presentation are available online. The US is in the lead in venture capital financing and technology adaptation and deployment, but in many areas is neck and neck with China. Many of his comments seem broadly applicable to nuclear policy, such as that "just because we've lost a lead doesn't mean we can't recover it." ARPA-E is the Advanced Research Projects Agency-Energy, a relatively new United States government agency set up to promote and fund research and development.
https://en.wikipedia.org/wiki?curid=31284652
1,255,121
1,077,959
Mass spectrometry is an analytical technique that measures the mass-to-charge ratio of molecules. Molecules or molecular fragments are typically charged or ionized by spraying them through a charged field (electrospray ionization), bombarding them with electrons from a hot filament (electron ionization) or blasting them with a laser when they are placed on specially coated plates (matrix assisted laser desorption ionization). The charged molecules are then propelled through space using electrodes or magnets and their speed, rate of curvature, or other physical characteristics are measured to determine their mass-to-charge ratio. From these data the mass of the parent molecule can be determined. Further fragmentation of the molecule through controlled collisions with gas molecules or with electrons can help determine the structure of molecules. Very accurate mass measurements can also be used to determine the elemental formulas or elemental composition of compounds. Most forms of mass spectrometry require some form of separation using liquid chromatography or gas chromatography. This separation step is required to simplify the resulting mass spectra and to permit more accurate compound identification. Some mass spectrometry methods also require that the molecules be derivatized or chemically modified so that they are more amenable for chromatographic separation (this is particularly true for GC-MS). As an analytical technique, MS is a very sensitive method that requires very little sample (<1 ng of material or <10 μL of a biofluid) and can generate signals for thousands of metabolites from a single sample. MS instruments can also be configured for very high throughput metabolome analyses (hundreds to thousands of samples a day). Quantification of metabolites and the characterization of novel compound structures is more difficult by MS than by NMR. LC-MS is particularly amenable to detecting hydrophobic molecules (lipids, fatty acids) and peptides while GC-MS is best for detecting small molecules (<500 Da) and highly volatile compounds (esters, amines, ketones, alkanes, thiols).
https://en.wikipedia.org/wiki?curid=1075211
1,077,404
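A small sketch of the accurate-mass idea mentioned above: from an elemental formula, the monoisotopic mass and the m/z of a singly protonated ion (the kind electrospray typically produces) follow directly. The element masses are standard values; the glucose example is illustrative, not from the article:

```python
# Monoisotopic masses of common elements (u), standard values.
MONO = {"C": 12.0, "H": 1.007825, "N": 14.003074,
        "O": 15.994915, "S": 31.972071}

def monoisotopic_mass(formula: dict) -> float:
    """Exact (monoisotopic) mass of a neutral molecule."""
    return sum(MONO[el] * n for el, n in formula.items())

def mz(formula: dict, charge: int = 1, proton: float = 1.007276) -> float:
    """m/z of the [M + zH]z+ ion, as electrospray ionization produces."""
    return (monoisotopic_mass(formula) + charge * proton) / charge

# Glucose, C6H12O6: exact mass 180.0634 u, [M+H]+ observed at m/z 181.0707
glucose = {"C": 6, "H": 12, "O": 6}
print(round(monoisotopic_mass(glucose), 4), round(mz(glucose), 4))
```

Because few elemental formulas share the same exact mass to four decimal places, a sufficiently accurate m/z measurement narrows the candidate compositions dramatically, which is the point made in the passage.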
383,863
The earliest European references to gunpowder are found in Roger Bacon's "Opus Majus" from 1267, in which he mentions a firecracker toy found in various parts of the world. The passage reads: "We have an example of these things (that act on the senses) in [the sound and fire of] that children's toy which is made in many [diverse] parts of the world; i.e., a device no bigger than one's thumb. From the violence of that salt called saltpeter [together with sulfur and willow charcoal, combined into a powder] so horrible a sound is made by the bursting of a thing so small, no more than a bit of parchment [containing it], that we find [the ear assaulted by a noise] exceeding the roar of strong thunder, and a flash brighter than the most brilliant lightning." In the early 20th century, British artillery officer Henry William Lovett Hime proposed that another work tentatively attributed to Bacon, "Epistola de Secretis Operibus Artis et Naturae, et de Nullitate Magiae" contained an encrypted formula for gunpowder. This claim has been disputed by historians of science including Lynn Thorndike, John Maxson Stillman and George Sarton and by Bacon's editor Robert Steele, both in terms of authenticity of the work, and with respect to the decryption method. In any case, the formula claimed to have been decrypted (7:5:5 saltpeter:charcoal:sulfur) is not useful for firearms use or even firecrackers, burning slowly and producing mostly smoke. However, if Bacon's recipe is taken as measurements by volume rather than weight, a far more potent and serviceable explosive powder is created suitable for firing hand-cannons, albeit less consistent due to the inherent inaccuracies of measurements by volume. One example of this composition resulted in 100 parts saltpeter, 27 parts charcoal, and 45 parts sulfur, by weight.
https://en.wikipedia.org/wiki?curid=12063194
383,668
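For concreteness, the competing readings of the recipe can be compared as weight percentages. The 7:5:5 and 100:27:45 figures come from the passage above; the roughly 75:15:10 saltpeter:charcoal:sulfur composition cited for modern black powder is common knowledge, not from this article:

```python
def percent_by_weight(parts):
    """Normalize a dict of parts-by-weight to percentages."""
    total = sum(parts.values())
    return {k: round(100 * v / total, 1) for k, v in parts.items()}

# Bacon's 7:5:5 recipe read directly as parts by weight (slow, smoky):
print(percent_by_weight({"saltpeter": 7, "charcoal": 5, "sulfur": 5}))
# The volume reading quoted above, re-expressed as parts by weight:
print(percent_by_weight({"saltpeter": 100, "charcoal": 27, "sulfur": 45}))
# Modern black powder, for comparison (common figure, not from the text):
print(percent_by_weight({"saltpeter": 75, "charcoal": 15, "sulfur": 10}))
```

Read by weight, the mix is barely two-fifths saltpeter; the volume reading shifts it well toward the oxidizer-rich proportions of a serviceable powder, which is the substance of the dispute described above.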
1,222,415
Cu6Sn5 is an intermetallic alloy with a defect NiAs-type structure. In NiAs-type nomenclature it would have the stoichiometry Cu0.2CuSn, with 0.2 Cu atoms occupying a usually unoccupied crystallographic position in the lattice. These copper atoms are displaced to the grain boundaries when charged to form Li2CuSn. With retention of most of the metal-metal bonding down to 0.5 V, Cu6Sn5 has become an attractive potential anode material due to its high theoretical specific capacity, resistance against Li metal plating especially when compared to carbon-based anodes, and ambient stability. In this and related NiAs-type materials, lithium intercalation occurs through an insertion process to fill the two crystallographic vacancies in the lattice, at the same time as the 0.2 extra coppers are displaced to the grain boundaries. Efforts to charge-compensate the main-group metal lattice to remove the excess copper have had limited success. Although significant retention of structure is noted down to the ternary lithium compound Li2CuSn, over-discharging the material results in disproportionation with formation of Li4.4Sn and elemental copper. This complete lithiation is accompanied by volume expansion of approximately 250%. Current research focuses on investigating alloying and low-dimensional geometries to mitigate mechanical stress during lithiation. Alloying tin with elements that do not react with lithium, such as copper, has been shown to reduce stress. As for low-dimensional applications, thin films have been produced with discharge capacities of 1127 mAh/g, with excess capacity assigned to lithium-ion storage at grain boundaries and associated with defect sites. Other approaches include making nanocomposites with Cu6Sn5 at the core and a nonreactive outer shell (SnO2-C hybrids have been shown to be effective) to accommodate volume changes and improve overall stability over cycles.
https://en.wikipedia.org/wiki?curid=42601555
1,221,756
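A sketch of where a theoretical specific capacity figure for such an anode comes from, via Faraday's law and the lithiation steps described above; the arithmetic is standard, and the printed values should be read as illustrative:

```python
F = 96485.3                   # Faraday constant, C/mol
M_CU, M_SN = 63.546, 118.71   # atomic weights, g/mol

def specific_capacity_mAh_per_g(n_li: float, molar_mass: float) -> float:
    """Theoretical capacity: n*F coulombs per mole of host,
    converted to mAh (1 mAh = 3.6 C) and normalized by host mass."""
    return n_li * F / 3.6 / molar_mass

m_cu6sn5 = 6 * M_CU + 5 * M_SN   # ~974.8 g/mol

# Lithiation to the ternary phase: Cu6Sn5 + 10 Li -> 5 Li2CuSn + Cu
print(round(specific_capacity_mAh_per_g(10, m_cu6sn5)))       # ~275 mAh/g

# Full disproportionation of all five Sn to Li4.4Sn raises the figure:
print(round(specific_capacity_mAh_per_g(5 * 4.4, m_cu6sn5)))  # ~605 mAh/g
```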
1,774,075
The Ramblin' Wreck, the iconic 1930 Ford Model A Sport coupe that serves as the official mascot of the student body, was acquired in this era. The Wreck is present at all major sporting events and student body functions, and leads the football team into Bobby Dodd Stadium at Historic Grant Field, a duty it has performed since 1961. Dean of Student Affairs Jim Dull recognized a need for an official Ramblin' Wreck when he observed the student body's fascination with classic cars. Fraternities, in particular, would parade around their House Wrecks as displays of school spirit and enthusiasm. It was considered a rite of passage to own a broken-down vehicle. In 1960, Dull began a search for a new official symbol to represent the institute. He specifically wanted a classic pre-war Ford. Dull's search employed newspaper ads, radio commercials, and other means to locate this vehicle. The search took him throughout the state and country, but no suitable vehicle was found until the autumn of 1960 when Dull spotted a polished 1930 Ford Model A outside of his apartment located in Towers Dormitory. The owner was Captain Ted J. Johnson, Atlanta's chief Delta Air Lines pilot. When Johnson returned to his car, he found a note from Dull attached to his windshield. Dull's note offered to purchase the car to serve as Georgia Tech's official mascot. Johnson, after great deliberation, agreed to take $1,000; he eventually returned the money in 1984 so that the car would be remembered as an official donation to Georgia Tech and the Alexander-Tharpe Fund. The Ramblin' Wreck was officially transferred to the Athletic Association on May 26, 1961.
https://en.wikipedia.org/wiki?curid=8753550
1,773,078
1,741,559
²H Site-specific Natural Isotope Fractionation Nuclear Magnetic Resonance (²H-SNIF-NMR) is a type of NMR specialized in measuring the deuterium concentration of organic molecules at natural abundance. The NMR spectrum distinguishes hydrogen atoms in different chemical environments (e.g. the order of the carbon that hydrogen binds to, adjacent functional groups, and even geminal positions of methylene groups), making it a powerful tool for position-specific isotope analysis. The chemical shift (in frequency units) of ²H is 6.5 times lower than that of ¹H. Thus, it is difficult to resolve ²H peaks. To provide high enough resolution to separate ²H peaks, high-strength magnetic field instruments (~11.4 T) are applied. Application of NMR to study hydrogen isotopes of natural products was pioneered by Gérard Martin and his co-workers in the 1980s. For several decades it has been developed and expanded. The D/H NMR measurement is sometimes coupled with IRMS measurement to create a referential standard. The sensitivity of SNIF-NMR is relatively low, typically requiring ~1 mmol of sample for each measurement. The precision with respect to isotope ratio is also relatively poor compared with mass spectrometry: even state-of-the-art instruments can only measure D/H ratios with around 50–200‰ error, depending on the compound. Therefore, so far the technique can only distinguish large D/H variations in preserved materials. In 2007, Philippe Lesot and his colleagues advanced this technique with 2-dimensional NMR using chiral liquid crystals (CLCs) instead of isotropic solvents to dissolve organic molecules. This enables the measurement of quadrupolar doublets for each nonequivalent deuterium atom, which reduces peak overlaps and provides more detailed information on the hydrogen chemical environment.
https://en.wikipedia.org/wiki?curid=50525886
1,740,577
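D/H ratios of the kind discussed above are conventionally reported in δ notation against the VSMOW reference; that convention and reference value are standard isotope practice rather than something stated in this passage. A minimal sketch:

```python
R_VSMOW = 155.76e-6   # D/H ratio of the VSMOW reference water (standard value)

def delta_D(ratio_sample: float) -> float:
    """delta-D in per mil (permil) relative to VSMOW."""
    return (ratio_sample / R_VSMOW - 1) * 1000

# A molecular site with D/H = 140 ppm is strongly depleted relative to VSMOW:
print(round(delta_D(140e-6), 1))   # -> about -101.2 permil
```

Against a 50–200‰ measurement error, only site-to-site differences of this magnitude or larger are resolvable, which is the precision limitation the passage describes.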
710,244
After being defeated repeatedly by Japan and Western nations in the 19th century, Chinese reformers began promoting modern science and technology as part of the Self-Strengthening Movement. After the Communist victory in 1949, science and technology research was organized on the model of the Soviet Union. It was characterized by a bureaucratic organization led by non-scientists, research according to the goals of central plans, separation of research from production, specialized research institutes, concentration on practical applications, and restrictions on information flows. Researchers were expected to work as collectives for society rather than as individuals seeking recognition. Many studied in the Soviet Union, which also transferred technology. The Cultural Revolution, which sought to remove perceived "bourgeois" influences and attitudes, caused large negative effects and disruptions. Among other measures, the scientific community and formal education were attacked, intellectuals were sent to do manual labor, universities and academic journals were closed, most research ceased, and for nearly a decade China trained no new scientists and engineers.
https://en.wikipedia.org/wiki?curid=277914
709,873
728,024
The writing is generously scaled – letter forms vary between in height, lines are spaced approximately apart, and there is a margin of at the top. It can be determined that there were 18 lines to a page. C. H. Roberts commented: "... to judge from the spacing and the size of the text, it is unlikely that the format was affected by considerations of economy". There are no apparent punctuation marks or breathings shown in the fragment; but the diaeresis is applied to an initial iota at both the second line of the recto and the second line of the verso, and possibly too on the first line of the recto. Taken together with the over-scaled writing, this suggests that the manuscript may have been intended for congregational reading. If the original codex did indeed contain the entire text of the canonical Gospel of John, it would have constituted a single-quire book of around 130 pages (i.e. 33 large folded papyrus sheets written on both sides), measuring approximately when closed. Roberts noted a glued vertical join in the papyrus slightly inside the inner margin and visible on the verso, indicating that the large sheets used for the codex were likely to have been specially prepared for the purpose, each having been constructed from two standard-sized sheets measuring approximately , with a central narrower sheet approximately constituting the spine. Roberts describes the handwriting as "heavy, rounded and rather elaborate", but nevertheless not the work of "a practised scribe" (i.e. not a professional bookhand). Roberts notes comments that had recently been made by the editors of the Egerton Gospel (P.Egerton 2), and says that similarly it could be said of 𝔓52 that it "has a somewhat informal air about it and with no claims to fine writing is yet a careful piece of work".
https://en.wikipedia.org/wiki?curid=848618
727,641
913,379
Optogenetics provides millisecond-scale temporal precision which allows the experimenter to keep pace with fast biological information processing (for example, in probing the causal role of specific action potential patterns in defined neurons). Indeed, to probe the neural code, optogenetics by definition must operate on the millisecond timescale to allow addition or deletion of precise activity patterns within specific cells in the brains of intact animals, including mammals (see Figure 1). By comparison, the temporal precision of traditional genetic manipulations (employed to probe the causal role of specific genes within cells, via "loss-of-function" or "gain of function" changes in these genes) is rather slow, from hours or days to months. It is important to also have fast readouts in optogenetics that can keep pace with the optical control. This can be done with electrical recordings ("optrodes") or with reporter proteins that are biosensors, where scientists have fused fluorescent proteins to detector proteins. Additionally, beyond its scientific impact optogenetics represents an important case study in the value of both ecological conservation (as many of the key tools of optogenetics arise from microbial organisms occupying specialized environmental niches), and in the importance of pure basic science as these opsins were studied over decades for their own sake by biophysicists and microbiologists, without involving consideration of their potential value in delivering insights into neuroscience and neuropsychiatric disease.
https://en.wikipedia.org/wiki?curid=14958673
912,900
1,468,818
When the World Health Organization declared that the Western African Ebola virus epidemic was a public health emergency of international concern in August 2014, LSHTM coordinated various response efforts. More than 500 academic and professional services staff volunteered to respond, with many volunteers deployed via Save the Children, Public Health England, Médecins Sans Frontières and the WHO. LSHTM continued to pay the salary of anyone who volunteered to work on Ebola care and control in Guinea, Liberia and Sierra Leone, or backfill posts in WHO offices. Staff and students carried out mathematical modelling and other research to support Ebola response planning. Experts, including LSHTM faculty, established an Ebola Response Anthropology Platform to help health workers develop culturally sensitive interventions and developed free online education programmes to combat the spread of the disease. Researchers, including researchers from LSHTM, carried out accelerated clinical trials in the field, including the EBOVAC Ebola vaccine trials which is still ongoing. LSHTM was part of an independent panel advising on major reforms targeted at prevention of future global outbreaks and now runs the UK Public Health Rapid Support Team in partnership with Public Health England, funded by the UK Government. Former LSHTM Director Peter Piot was named among ‘the Ebola fighters’ as Time Person of the Year and LSHTM itself also won various awards for its response.
https://en.wikipedia.org/wiki?curid=531365
1,467,994
1,893,812
At a functional level, increased aperture area and longer rhabdomere length both serve to increase photon capture efficiency in the male dorsal eye. Also, the smaller ommatidial angles and different rhabdomere arrangement observed in dorsal eyes are central to the function of the dorsal eye because they are neural superposition eyes, meaning that neural pooling of information from neighboring ommatidia is used to enhance sensitivity. Using a model that takes longer rhabdomeres, larger facet diameters, smaller ommatidial angles, and neural superposition into account, Zeil shows that the dorsal eye of males is able to detect small objects against a homogeneous background at a much greater distance than ventral or female eyes. The optical properties of longer rhabdomeres, increased facet diameter, and smaller ommatidial angles also aid in detection of small objects by increasing the resolution up to six times that of ventral or female eyes when neural pooling resulting from superposition is taken into account. Greater sensitivity to small light changes due to longer rhabdomeres and increased facet diameter, in combination with the ability to detect females at farther distances with higher resolution, allows male Bibionid flies to search for females at lower light levels (greater portion of the day) and to respond quickly to the presence of a female in order to catch her and initiate the "marriage by capture" that occurs in this family of flies. Unlike other fly families, the extreme dimorphism seen in Bibionids may be particularly relevant because these species do not swarm under a landmark, causing the course of females to be relatively unpredictable. As in the olfactory example above, the functional consequences of sex differences in Bibionids are linked to sex-specific behavior for which these sex differences play a key adaptive role.
https://en.wikipedia.org/wiki?curid=25348242
1,892,729
1,120,126
To place multiferroic materials in their appropriate historical context, one also needs to consider magnetoelectric materials, in which an electric field modifies the magnetic properties and vice versa. While magnetoelectric materials are not necessarily multiferroic, all ferromagnetic ferroelectric multiferroics are linear magnetoelectrics, with an applied electric field inducing a change in magnetization linearly proportional to its magnitude. Magnetoelectric materials and the corresponding magnetoelectric effect have a longer history than multiferroics, shown in blue in the graph to the right. The first known mention of magnetoelectricity is in the 1959 Edition of Landau & Lifshitz' "Electrodynamics of Continuous Media" which has the following comment at the end of the section on piezoelectricity: ""Let us point out two more phenomena, which, in principle, could exist. One is piezomagnetism, which consists of linear coupling between a magnetic field in a solid and a deformation (analogous to piezoelectricity). The other is a linear coupling between magnetic and electric fields in a media, which would cause, for example, a magnetization proportional to an electric field. Both these phenomena could exist for certain classes of magnetocrystalline symmetry. We will not however discuss these phenomena in more detail because it seems that till present, presumably, they have not been observed in any substance."" One year later, I. E. Dzyaloshinskii showed using symmetry arguments that the material Cr2O3 should have linear magnetoelectric behavior, and his prediction was rapidly verified by D. Astrov. Over the next decades, research on magnetoelectric materials continued steadily in a number of groups in Europe, in particular in the former Soviet Union and in the group of H. Schmid at U. Geneva. A series of East-West conferences entitled Magnetoelectric Interaction Phenomena in Crystals (MEIPIC) was held between 1973 (in Seattle) and 2009 (in Santa Barbara), and indeed the term "multi-ferroic magnetoelectric" was first used by H. Schmid in the proceedings of the 1993 MEIPIC conference (in Ascona).
https://en.wikipedia.org/wiki?curid=2522070
1,119,553
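For reference, the linear magnetoelectric effect described here is conventionally written in terms of a magnetoelectric tensor α; the following is one common SI textbook form, not a formula quoted from this article:

```latex
% Linear magnetoelectric response: a magnetic field induces polarization,
% and an electric field induces magnetization, through the same tensor.
P_i = \alpha_{ij} H_j , \qquad \mu_0 M_i = \alpha_{ji} E_j .
```

The statement in the passage that the induced magnetization is linearly proportional to the applied electric field corresponds to the second relation.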
485,171
A plasma consists of a fluid of positive and negative charged particles, generally created by heating or photo-ionizing (direct / tunneling / multi-photon / barrier-suppression) a dilute gas. Under normal conditions the plasma will be macroscopically neutral (or quasi-neutral), an equal mix of electrons and ions in equilibrium. However, if a strong enough external electric or electromagnetic field is applied, the plasma electrons, which are very light in comparison to the background ions (by a factor of at least 1836), will separate spatially from the massive ions, creating a charge imbalance in the perturbed region. A particle injected into such a plasma would be accelerated by the charge separation field, but since the magnitude of this separation is generally similar to that of the external field, apparently nothing is gained in comparison to a conventional system that simply applies the field directly to the particle. But the plasma medium acts as the most efficient transformer (currently known) of the transverse field of an electromagnetic wave into the longitudinal fields of a plasma wave. In existing accelerator technology, various appropriately designed materials are used to convert transverse propagating extremely intense fields into longitudinal fields that the particles can get a kick from. This process is achieved using two approaches: standing-wave structures (such as resonant cavities) or traveling-wave structures such as disc-loaded waveguides. But the limitation of materials interacting with higher and higher fields is that they are eventually destroyed through ionization and breakdown. Here plasma accelerator science provides the breakthrough: plasmas can generate, sustain, and exploit the highest accelerating fields ever produced in the laboratory.
https://en.wikipedia.org/wiki?curid=1948002
484,922
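The scale of those fields can be estimated from the textbook cold, nonrelativistic wave-breaking limit, E_wb = m_e c ω_p / e, which is standard plasma-accelerator material though not stated explicitly in the passage:

```python
import math

# SI constants
e, m_e, c, eps0 = 1.602e-19, 9.109e-31, 2.998e8, 8.854e-12

def wave_breaking_field(n_cm3: float) -> float:
    """Cold nonrelativistic wave-breaking field E = m_e*c*w_p/e in V/m,
    for an electron density given in cm^-3."""
    n = n_cm3 * 1e6                               # convert to m^-3
    w_p = math.sqrt(n * e**2 / (eps0 * m_e))      # plasma frequency, rad/s
    return m_e * c * w_p / e

print(f"{wave_breaking_field(1e18):.2e} V/m")     # ~9.6e10 V/m at 1e18 cm^-3
```

At a typical density of 10^18 cm^-3 this gives roughly 96 GV/m, about a thousand times the ~100 MV/m breakdown-limited gradients of conventional RF structures, which is the advantage the passage describes.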
754,639
Paul Samuelson's (1915–2009) "Foundations of Economic Analysis", published in 1947, was an attempt to show that mathematical methods could represent a core of testable economic theory. Samuelson started with two assumptions: first, people and firms will act to maximize their self-interested goals; second, markets tend towards an equilibrium of prices, where demand matches supply. He extended the mathematics to describe the equilibrating behavior of economic systems, including that of the then-new macroeconomic theory of John Maynard Keynes. Whereas Richard Cantillon had imitated Isaac Newton's mechanical physics of inertia and gravity in competition and the market, the physiocrats had copied the body's blood system into circular flow of income models, and William Jevons had found growth cycles to match the periodicity of sunspots, Samuelson adapted thermodynamics formulae to economic theory. Reasserting economics as a hard science was under way in the United Kingdom as well, and one celebrated "discovery", that of A. W. Phillips, was a correlative relationship between inflation and unemployment. The workable policy conclusion was that securing full employment could be traded off against higher inflation. Samuelson incorporated the idea of the Phillips curve into his work. His introductory textbook "Economics" was influential and widely adopted, becoming the most successful economics text ever. Paul Samuelson was awarded the new Nobel Prize in Economics in 1970 for his merging of mathematics and political economy.
https://en.wikipedia.org/wiki?curid=7881361
754,236
788,731
Cubene (1,2-dehydrocubane) and 1,4-cubanediyl (1,4-dehydrocubane) are enormously strained compounds which both undergo nucleophilic addition very rapidly, and this has enabled chemists to synthesize cubylcubane. X-ray diffraction structure solution has shown that the central cubylcubane bond is exceedingly short (1.458 Å), much shorter than the typical C-C single bond (1.578 Å). This is attributed to the fact that the exocyclic orbitals of cubane are s-rich and close to the nucleus. Chemists at the University of Chicago extended and modified the sequence in a way that permits the preparation of a host of [n]cubylcubane oligomers. The [n]cubylcubanes are rigid molecular rods that at the time held particular promise for making liquid crystals with exceptional UV transparency. As the number of linked cubane units increases, the solubility of [n]cubylcubanes plunges; as a result, only limited chain lengths (up to 40 units) have been successfully synthesized in solution. The skeleton of [n]cubylcubanes is still composed of enormously strained carbon cubes, which therefore limits its stability. In contrast, researchers at Penn State University showed that poly-cubane synthesized by solid-state reaction is 100% sp3-carbon bonded, with tetrahedral angles (109.5°), and exhibits exceptional optical properties (a high refractive index).
https://en.wikipedia.org/wiki?curid=417665
788,307
952,171
The perceived threat of invasion led to a major expansion of the Australian military. By mid-1942 the Army had a strength of ten infantry divisions, three armoured divisions and hundreds of other units. The RAAF and RAN were also greatly expanded, though it took years for these services to build up to their peak strengths. Due to the increased need for manpower, the restrictions which prohibited non-Europeans from joining the military ceased to be enforced from late 1941, and about 3,000 Indigenous Australians eventually enlisted. Most of these personnel were integrated into existing formations, but a small number of racially segregated units such as the Torres Strait Light Infantry Battalion were formed. A number of small units made up of Indigenous Australians were also established to patrol northern Australia and harass any Japanese forces which landed there; the members of these units did not receive pay or awards for their service until 1992. Thousands of Australians who were ineligible for service in the military responded to the threat of attack by joining auxiliary organisations such as the Volunteer Defence Corps and Volunteer Air Observers Corps, which were modelled on the British Home Guard and Royal Observer Corps respectively. Australia's population and industrial base were not sufficient to maintain the expanded military after the threat of invasion had passed, and the Army was progressively reduced in size from 1943 while only 53 of the 73 RAAF squadrons approved by the government were ever raised.
https://en.wikipedia.org/wiki?curid=4578255
951,666
450,924
Soligenix Inc. and Emergent agreed to establish a "commercially viable production technology" for the development of RiVax, a potential vaccine intended to protect against ricin exposure. Currently, there are no treatments for ricin poisoning that have been proven effective. Soligenix is a late-stage biopharmaceutical company that specializes in the development of treatments for rare diseases. A by-product of castor oil production, the ricin toxin can serve as a biological weapon due to its extreme potency, stability, and accessibility. The National Institute of Allergy and Infectious Diseases funded the development of RiVax at an estimated cost of $24.7 million; the organization also financially backed the contract between Emergent and Soligenix. Most of the work was conducted in Baltimore, Maryland, at Emergent's manufacturing facility. An expansion of the Baltimore plant, finished in 2017, received $163 million in funding from the U.S. government. In January 2020, Emergent informed Soligenix of manufacturing issues, having provided doses of RiVax that were "out of specification", causing the study to be suspended even after two trial participants had received doses. In April 2020, the Department of Health and Human Services announced that it would not provide further funding for RiVax clinical trials, although the agency did not say whether this was related to the previous issues. In subsequent securities filings, Soligenix stated that it was pursuing $19 million in damages from Emergent in arbitration proceedings.
https://en.wikipedia.org/wiki?curid=6262709
450,705
1,212,349
The history of earth science and the history of astrophysics were also closely tied to military purposes and funding throughout the Cold War. American geodesy, oceanography, and seismology grew from small sub-disciplines into full-fledged independent disciplines, as for several decades virtually all funding in these fields came from the Department of Defense. A central goal that tied these disciplines together (even while providing the means for intellectual independence) was the figure of the Earth, the model of the earth's geography and gravitation that was essential for accurate ballistic missiles. In the 1960s, geodesy was the ostensible goal of the satellite program CORONA, while military reconnaissance was in fact a driving force. Even for geodetic data, new secrecy guidelines worked to restrict collaboration in a field that had formerly been fundamentally international; the figure of the Earth had geopolitical significance beyond questions of pure geoscience. Still, geodesists were able to retain enough autonomy, and to subvert secrecy limitations enough, to make use of the findings of their military research to overturn some of the fundamental theories of geodesy. Like geodesy and satellite photography research, the advent of radio astronomy had a military purpose hidden beneath the official astrophysical research agenda. Quantum electronics permitted both revolutionary new methods of analyzing the universe and, using the same equipment and technology, the monitoring of Soviet electronic signals.
https://en.wikipedia.org/wiki?curid=4999816
1,211,697