doc_id (int32, 18–2.25M) | text (string, lengths 245–2.96k) | source (string, lengths 38–44) | __index_level_0__ (int64, 18–2.25M)
---|---|---|---|
645,116 |
Relative to other methods of analysis, powder diffraction allows for rapid, non-destructive analysis of multi-component mixtures without the need for extensive sample preparation. This gives laboratories around the world the ability to quickly analyze unknown materials and perform materials characterization in such fields as metallurgy, mineralogy, chemistry, forensic science, archeology, condensed matter physics, and the biological and pharmaceutical sciences. Identification is performed by comparison of the diffraction pattern to a known standard or to a database such as the International Centre for Diffraction Data's Powder Diffraction File (PDF) or the Cambridge Structural Database (CSD). Advances in hardware and software, particularly improved optics and fast detectors, have dramatically improved the analytical capability of the technique, especially relative to the speed of the analysis. The fundamental physics upon which the technique is based provides high precision and accuracy in the measurement of interplanar spacings, sometimes to fractions of an Ångström, resulting in authoritative identification frequently used in patents, criminal cases and other areas of law enforcement. The ability to analyze multiphase materials also allows analysis of how materials interact in a particular matrix such as a pharmaceutical tablet, a circuit board, a mechanical weld, a geologic core sampling, cement and concrete, or a pigment found in an historic painting. The method has been historically used for the identification and classification of minerals, but it can be used for nearly any material, even amorphous ones, so long as a suitable reference pattern is known or can be constructed.
|
https://en.wikipedia.org/wiki?curid=2296159
| 644,776 |
1,663,954 |
Effects-based operations (EBO) is a United States military concept that emerged during the Persian Gulf War for the planning and conduct of operations combining military and non-military methods to achieve a particular effect. An effects-based approach to operations was first applied in modern times in the design and execution of the Desert Storm air campaign of 1991. The principal author of the daily attack plans—then Lt Colonel, now retired Lt General David A. Deptula—used an effects-based approach in building the actual Desert Storm air campaign targeting plan. Deptula describes the background and rationale, and provides an example of how an effects-based approach to targeting was conducted in Desert Storm, in the publication "Effects-Based Operations: Change in the Nature of Warfare." The doctrine was developed with the aim of putting desired strategic effects first and then planning backward from the desired strategic objective to the possible tactical-level actions that could be taken to achieve the desired effect. Contrary to conventional military approaches of force-on-force application that focused on attrition and annihilation, EBO focused on desired outcomes, attempting to use a minimum of force. The approach was enabled by advancements in weaponry—particularly stealth and precision weapons—in conjunction with a planning approach based on specific effects rather than absolute destruction. Deptula, speaking at the Gulf War Air Campaign Tenth Anniversary Retrospective on 17 January 2001 at One Massachusetts Avenue, NW, Washington, DC, defined the goal of EBO: "If we focus on effects, the end of strategy, rather than force-on-force the traditional means to achieve it militarily, that enables us to consider different and perhaps more effective ways to accomplish the same goal quicker than in the past, with fewer resources and most importantly with fewer casualties." Others have postulated that EBO could be interpreted as an emerging understanding that attacking a second-order target may have first-order consequences for a variety of objectives, wherein the commander's intent can be satisfied with a minimum of collateral damage or risk to his own forces.
|
https://en.wikipedia.org/wiki?curid=9203715
| 1,663,017 |
312,516 |
From the neurological perspective, the well-known tendencies of teenagers to be emotional, impulsive, and to take high risks are due to the fact that the limbic system (responsible for emotional thought) develops faster than the prefrontal cortex (logical reasoning). From the evolutionary viewpoint, this mismatch is adaptive in that it helps young people connect with other people (by being emotional) and learn to negotiate the complexities of life (by taking risks yet being more sensitive to rewards). As a result, teenagers are more prone to feelings of fear, anxiety, and depression than adults. In order to attract potential mates, males are especially prone to take risks and showcase their athleticism, whereas females tend to direct attention to their beauty. Young males (who have the highest reproductive variance) take more risks than any other group in both experiments and observations. By undertaking risky endeavors, males are thought to signal qualities that may be directly related to one's ability to provision and protect one's family, namely physical skill, good judgment, or bravery. Social dominance, confidence, and ambition could help in competition with other males, while social dominance, ambition, and wealth might alleviate the costs of failure. In addition, traits like bravery and physical prowess may also be valued by cooperative partners due to their benefits in group hunting and warfare, thereby increasing the potential audience for risk takers. The tendency of adolescent and young-adult males to engage in risky and aggressive behaviors is known as the 'young male syndrome'. The young male's self-worth is tied to being perceived as a 'real man', and his likelihood of committing or falling victim to a violent crime peaks between his late teens and late twenties. Young females, on the other hand, are under strong peer pressure to be physically attractive, potentially leading to problems with their body image. A teenage girl or young woman's bond with her first sexual partner is often deep. In both sexes, intense adolescent intrasexual competition, amorous infatuations, and sexual experimentation are common.
|
https://en.wikipedia.org/wiki?curid=41162624
| 312,348 |
771,206 |
The P-38 was the only fighter to make it into combat during World War II with turbo-supercharged V-1710s. The operating conditions of the Western European air war – flying for long hours in intensely cold weather at – revealed several problems with these engines. They had poor manifold fuel-air distribution and poor temperature regulation of the turbo-supercharger air, which resulted in frequent engine failures (detonation occurred as the result of persistent uneven fuel-air mixture across the cylinders caused by the poor manifold design). Specially formulated fuels were a necessity for the P-38, as were specific spark plugs for particular cylinders. The turbo-supercharger had additional problems with getting stuck in the freezing air in either high or low boost mode; the high boost mode could cause detonation in the engine, while the low boost mode would manifest as power loss in one engine, resulting in sudden fishtailing in flight. These problems were aggravated by sub-optimal engine management techniques taught to many pilots during the first part of WWII, including a cruise setting that ran the engine at high RPM and low manifold pressure with a rich mixture. These settings can contribute to over-cooling of the engine, fuel condensation problems, accelerated mechanical wear, and the likelihood of components binding or "freezing up." Details of the failure patterns were described in a report by General Doolittle to General Spaatz in January 1944. In March 1944, the first Allison engines appearing over Berlin belonged to a group of P-38Hs of the 55th Fighter Group, engine troubles contributing to a reduction of the force to half strength over the target. It was too late to correct these problems in the production lines of Allison or GE, and as the numbers of Merlin-engined P-51 Mustangs based in England mounted through the end of 1943 and into 1944, the P-38s were steadily withdrawn from Europe until October 1944, when they were no longer used for bomber escort duty with the Eighth Air Force. A few P-38s would remain in the European theater as the F-5 for photo reconnaissance.
|
https://en.wikipedia.org/wiki?curid=76309
| 770,792 |
1,167,139 |
In the 1940s, geologist Hans Suess speculated that the regularity observed in the abundances of elements may be related to structural properties of the atomic nucleus. These considerations were seeded by the discovery of radioactivity by Becquerel in 1896, an offshoot of advances in chemistry aimed at the production of gold. This remarkable possibility for the transformation of matter created much excitement among physicists for the next decades, culminating in the discovery of the atomic nucleus, with milestones in Ernest Rutherford's scattering experiments in 1911 and the discovery of the neutron by James Chadwick (1932). After Aston demonstrated that the mass of helium is less than four times that of the proton, Eddington proposed that, through an unknown process in the Sun's core, hydrogen is transmuted into helium, liberating energy. Twenty years later, Bethe and von Weizsäcker independently derived the CN cycle, the first known nuclear reaction that accomplishes this transmutation. The interval between Eddington's proposal and the derivation of the CN cycle can mainly be attributed to an incomplete understanding of nuclear structure. The basic principles for explaining the origin of elements and energy generation in stars appear in the concepts describing nucleosynthesis, which arose in the 1940s, led by George Gamow and presented in a two-page paper in 1948 known as the Alpher–Bethe–Gamow paper. A complete concept of the processes that make up cosmic nucleosynthesis was presented in the late 1950s by Burbidge, Burbidge, Fowler, and Hoyle, and by Cameron. Fowler is largely credited with initiating collaboration between astronomers, astrophysicists, and theoretical and experimental nuclear physicists, in a field that we now know as nuclear astrophysics (for which he won the 1983 Nobel Prize). During these same decades, Arthur Eddington and others were able to link the liberation of nuclear binding energy through such nuclear reactions to the structural equations of stars.
|
https://en.wikipedia.org/wiki?curid=5470137
| 1,166,521 |
200,013 |
In the 19th century, experimenters began to detect unexpected forms of radiation: Wilhelm Röntgen caused a sensation with his discovery of X-rays in 1895; in 1896 Henri Becquerel discovered that certain kinds of matter emit radiation of their own accord. In 1897, J. J. Thomson discovered the electron, and new radioactive elements found by Marie and Pierre Curie raised questions about the supposedly indestructible atom and the nature of matter. Marie and Pierre coined the term "radioactivity" to describe this property of matter, and isolated the radioactive elements radium and polonium. Ernest Rutherford and Frederick Soddy identified two of Becquerel's forms of radiation with electrons and the element helium. Rutherford identified and named two types of radioactivity and in 1911 interpreted experimental evidence as showing that the atom consists of a dense, positively charged nucleus surrounded by negatively charged electrons. Classical theory, however, predicted that this structure should be unstable. Classical theory had also failed to explain successfully two other experimental results that appeared in the late 19th century. One of these was the demonstration by Albert A. Michelson and Edward W. Morley—known as the Michelson–Morley experiment—which showed there did not seem to be a preferred frame of reference, at rest with respect to the hypothetical luminiferous ether, for describing electromagnetic phenomena. Studies of radiation and radioactive decay continued to be a preeminent focus for physical and chemical research through the 1930s, when the discovery of nuclear fission by Lise Meitner and Otto Frisch opened the way to the practical exploitation of what came to be called "atomic" energy.
|
https://en.wikipedia.org/wiki?curid=13758
| 199,910 |
1,729,761 |
During the initial phase of the first experiment station, Atwater expanded his fertilizer program and began to study and experiment with the growth and composition of field crops. The field crop research continued on a nearby farm even after the appropriation ceased; Atwater became particularly interested in plant metabolism and was one of the first researchers to provide proof that legumes assimilate nitrogen from the air. As his experiments and accomplishments became known, Atwater's assistance was requested for a variety of projects. From 1879 to 1882, he conducted extensive human food studies on behalf of the United States Fish Commission and Smithsonian Institution. In 1879, the U.S. Fish Commission offered Atwater funds to study the composition and nutritional value of North American species of fish and invertebrates. For the 1882–1883 school year, Atwater took a leave of absence from Wesleyan to study the digestibility of lean fish with von Voit in Germany. Together, they found fish comparable to lean beef; during this time he became aware of how German scientists were studying nutrition and hoped to bring similar research to the United States upon his return. In 1885, Atwater's first series of studies on peas grown in nutrient solution were published in the American Chemical Journal. That same year, the Massachusetts Bureau of Statistics of Labor requested a study of data that had been collected by the bureau concerning family food purchases. In the study, Atwater calculated the daily per capita supplies of carbohydrates, fat, and protein provided within the data and, taking into account the included cost data, made recommendations on how more economical diets of adequate nutritional value could be chosen. The report he prepared was included in the Bureau's 1886 Annual Report.
|
https://en.wikipedia.org/wiki?curid=10465237
| 1,728,787 |
37,354 |
In late 2014, the US Navy began early preparation work on the SSN(X). It was planned that the first submarine would be procured in 2025. However, their introduction (i.e., procurement of the first submarine) has been pushed back to 2033/2034. The long-range shipbuilding plan is for the new SSN to be authorized in 2034 and become operational by 2044, after the last Block VII "Virginia" is built. Roughly a decade will be spent identifying, designing, and demonstrating new technologies before an analysis of alternatives is issued in 2024. An initial small team has been formed to consult with industry and identify the threat environment and technologies the submarine will need to operate against in the 2050-plus timeframe. One area already identified is the need to integrate with off-board systems so future "Virginia" boats and the SSN(X) can employ networked, extremely long-ranged weapons. A torpedo propulsion system concept from the Pennsylvania State University could allow a torpedo to hit a target away and be guided by another asset during the terminal phase. Targeting information might also come from another platform such as a patrol aircraft or an unmanned aerial vehicle (UAV) launched from the submarine. Researchers have identified a quieter advanced propulsion system and the ability to control multiple unmanned underwater vehicles (UUVs) at once as key SSN(X) components. The future submarines will operate through the end of the 21st century, and potentially into the 22nd century. New propulsion technology, moving beyond the use of a rotating mechanical device to push the boat through the water, could come in the form of a biomimetic propulsion system that would eliminate noise-generating moving parts like the drive shaft and the spinning blades of the propulsor.
|
https://en.wikipedia.org/wiki?curid=32726
| 37,341 |
1,554,781 |
Many new and innovative techniques and ideas have been implemented globally as countries gain momentum in the realm of green energy. Mexico gained major momentum shortly after the Rio Conference in 1992, also known as the Earth Summit, a conference whose profile has grown rapidly, especially within the last 20 years. It was held in Brazil and included 172 participating governments, 108 of them represented at the head-of-state or government level. The Earth Summit's goal is to transform people's attitudes and minds toward a more global focus on change for the future preservation of the world. One of the first major changes for renewable energy was the revision of Mexico's public electricity service law. The revision allowed private entities to participate in electricity generation. According to the International Energy Agency, this change affected "self-supply of electricity, co-generation (production of electricity from waste heat for self-supply), small electricity production (under 30 MW for sale to the national electric utility CFE), and independent power production for exclusive sale to CFE". In the past, a total of 4000 MN in permits has been awarded by the Energy Regulatory Commission; these permits covered bio-gas to electricity, small hydro, solar, and wind farms. Furthermore, a grid connection contract for renewable energy was created in 2001, in essence a set of rules detailing transmission charges for transmitting or feeding energy into the national grid. This change affected solar, wind, and small hydroelectric installations, and allowed the grid to absorb enough clean, green energy to supply power during local peak times.
|
https://en.wikipedia.org/wiki?curid=41447629
| 1,553,900 |
1,935,799 |
The macroflora of the Posidonia Shale can be described as extremely poor in species. Apart from the remains of horsetails, it consists almost without exception of coarse branches and fronds from gymnosperms, for which a certain resistance to transport can be assumed. Remains of ferns are completely missing, except for tall arboreal forms (Peltaspermales). Most of the flora was reported from the area of Braunschweig. The main explanation is that the plants in question formed mono- or oligotypic stands at the edge of the waters flowing into the Posidonienschiefer sea, were probably torn away in the course of flood events, and were easily fragmented during transport and wave action, possibly above all during the occasional storm events that have been postulated. In terms of taphonomy, this invites comparison with today's reed "Phragmites", which can form extensive stands ("reed belts") at the edge of shallower and slowly flowing waters. The wood remains clearly indicate a higher diversity of conifer flora in the source area than the remains of leafy branches do. This is likely related to the frequent occurrence of charcoalified or coalified trunks, most of which are believed to be driftwood that spent a long time adrift, as also suggested by their frequent colonization by mussels and full-grown sea lilies. The depositional setting lay at a large distance from the nearest coastline (about 100 kilometres for southern Germany), so that only plant material resistant enough to transport was able to be deposited. At Irlbach and Kheleim, NE of Regensburg, where the Posidonienschiefer has its nearest near-mainland deposits with abundant sand, a rich deposit of plant remains of different kinds (seeds, reproductive organs, leaves, stems, cuticles, and wood) with traces of coal was recovered; however, it was never studied in depth. Of all the plant material expected, only a few bennettite leaves and two conifer branches with leaves were cited, and none were studied.
|
https://en.wikipedia.org/wiki?curid=62853123
| 1,934,691 |
244,474 |
On 1 June 1894, at a meeting of the British Association for the Advancement of Science at Oxford University, Lodge gave a memorial lecture on the work of Hertz (recently deceased) and the German physicist's proof of the existence of electromagnetic waves 6 years earlier. Lodge set up a demonstration of the quasi-optical nature of "Hertzian waves" (radio waves) and demonstrated their similarity to light and vision, including reflection and transmission. Later in June and on 14 August 1894 he did similar experiments, increasing the distance of transmission up to 55 meters. In these lectures Lodge demonstrated a detector that would become standard in radio work, an improved version of Branly's detector which Lodge dubbed the "coherer". It consisted of a glass tube containing metal filings between two electrodes. When the small electrical charge from waves picked up by an antenna was applied to the electrodes, the metal particles would cling together or "cohere", causing the device to become conductive and allowing the current from a battery to pass through it. In Lodge's setup the slight impulses from the coherer were picked up by a mirror galvanometer, which would deflect a beam of light being projected on it, giving a visual signal that the impulse was received. After receiving a signal, the metal filings in the coherer were broken apart or "decohered" by a manually operated vibrator or by the vibrations of a bell placed on the table nearby that rang every time a transmission was received. Lodge also demonstrated tuning using a pair of Leyden jars that could be brought into resonance. Lodge's lectures were widely publicized and his techniques influenced and were expanded on by other radio pioneers including Augusto Righi and his student Guglielmo Marconi, Alexander Popov, Lee de Forest, and Jagadish Chandra Bose.
|
https://en.wikipedia.org/wiki?curid=3800477
| 244,347 |
385,309 |
Tesla went on to develop a wireless power distribution system that he hoped would be capable of transmitting power long distance directly into homes and factories. Early on he seemed to borrow from the ideas of Mahlon Loomis, proposing a system composed of balloons to suspend transmitting and receiving electrodes in the air above in altitude, where he thought the pressure would allow him to send high voltages (millions of volts) long distances. To further study the conductive nature of low-pressure air he set up a test facility at high altitude in Colorado Springs during 1899. Experiments he conducted there with a large coil operating in the megavolts range, as well as observations he made of the electronic noise of lightning strikes, led him to conclude incorrectly that he could use the entire globe of the Earth to conduct electrical energy. The theory included driving alternating current pulses into the Earth at its resonant frequency from a grounded Tesla coil working against an elevated capacitance to make the potential of the Earth oscillate. Tesla thought this would allow alternating current to be received with a similar capacitive antenna tuned to resonance with it at any point on Earth with very little power loss. His observations also led him to believe a high voltage used in a coil at an elevation of a few hundred feet would "break the air stratum down", eliminating the need for miles of cable hanging on balloons to create his atmospheric return circuit. Tesla went on the next year to propose a "World Wireless System" that was to broadcast both information and power worldwide. In 1901, at Shoreham, New York, he attempted to construct a large high-voltage wireless power station, now called Wardenclyffe Tower, but by 1904 investment dried up and the facility was never completed.
|
https://en.wikipedia.org/wiki?curid=570662
| 385,114 |
53,844 |
The first modern report of sickle cell disease may have been in 1846, where the autopsy of an executed runaway slave was discussed; the key finding was the absence of the spleen. Reportedly, African slaves in the United States exhibited resistance to malaria, but were prone to leg ulcers. The abnormal characteristics of the red blood cells, which later lent their name to the condition, were first described by Ernest E. Irons (1877–1959), intern to Chicago cardiologist and professor of medicine James B. Herrick (1861–1954), in 1910. Irons saw "peculiar elongated and sickle-shaped" cells in the blood of a man named Walter Clement Noel, a 20-year-old first-year dental student from Grenada. Noel had been admitted to the Chicago Presbyterian Hospital in December 1904 with anaemia. Noel was readmitted several times over the next three years for "muscular rheumatism" and "bilious attacks" but completed his studies and returned to the capital of Grenada (St. George's) to practice dentistry. He died of pneumonia in 1916 and is buried in the Catholic cemetery at Sauteurs in the north of Grenada. Shortly after the report by Herrick, another case appeared in the "Virginia Medical Semi-Monthly" with the same title, "Peculiar Elongated and Sickle-Shaped Red Blood Corpuscles in a Case of Severe Anemia." This article is based on a patient admitted to the University of Virginia Hospital on 15 November 1910. In the later description by Verne Mason in 1922, the name "sickle cell anemia" is first used. Childhood problems related to sickle cell disease were not reported until the 1930s, despite the fact that this cannot have been uncommon in African-American populations.
|
https://en.wikipedia.org/wiki?curid=21010263
| 53,824 |
1,726,291 |
There is significant debate about who is responsible for the regulation of nanotechnology. While some non-nanotechnology specific regulatory agencies currently cover some products and processes (to varying degrees) – by "bolting on" nanotechnology to existing regulations – there are clear gaps in these regimes. This enables some nanotechnology applications to figuratively "slip through the cracks" without being covered by any regulations. An example of this has occurred in the US, and involves nanoparticles of titanium dioxide (TiO2) for use in sunscreen where they create a clearer cosmetic appearance. In this case, the US Food and Drug Administration (FDA) reviewed the immediate health effects of exposure to nanoparticles of TiO2 for consumers. However, they did not review its impacts for aquatic ecosystems when the sunscreen rubs off, nor did the EPA, or any other agency. Similarly the Australian equivalent of the FDA, the Therapeutic Goods Administration (TGA) approved the use of nanoparticles in sunscreens (without the requirement for package labelling) after a thorough review of the literature, on the basis that although nanoparticles of TiO2 and zinc oxide (ZnO) in sunscreens do produce free radicals and oxidative DNA damage "in vitro", such particles were unlikely to pass the dead outer cells of the stratum corneum of human skin; a finding which some academics have argued seemed not to apply the precautionary principle in relation to prolonged use on children with cut skin, the elderly with thin skin, people with diseased skin or use over flexural creases. Doubts over the TGA's decision were raised with publication of a paper showing that the uncoated anatase form of TiO2 used in some Australian sunscreens caused a photocatalytic reaction that degraded the surface of newly installed prepainted steel roofs in places where they came in contact with the sunscreen coated hands of workmen. Such gaps in regulation are likely to continue alongside the development and commercialization of increasingly complex second and third generation nanotechnologies.
|
https://en.wikipedia.org/wiki?curid=18069384
| 1,725,320 |
1,744,211 |
By contrast, the study was well received by Muller's peers in climate science research. James Hansen, a leading climate scientist and head of the NASA Goddard Institute for Space Studies, commented that he had not yet read the research papers but was glad Muller was looking at the issue. He said "It should help inform those who have honest scepticism about global warming." Phil Jones, the director of the Climatic Research Unit (CRU) at the University of East Anglia, said: "I look forward to reading the finalised paper once it has been reviewed and published. These initial findings are very encouraging and echo our own results and our conclusion that the impact of urban heat islands on the overall global temperature is minimal." Michael Mann, director of the Earth System Science Center at Pennsylvania State University, commented that "...they get the same result that everyone else has gotten," and "that said, I think it's at least useful to see that even a critic like Muller, when he takes an honest look, finds that climate science is robust." Peter Thorne, from the Cooperative Institute for Climate and Satellites in North Carolina and chair of the International Surface Temperature Initiative, said: "This takes a very distinct approach to the problem and comes up with the same answer, and that builds confidence that pre-existing estimates are in the right ballpark. There is very substantial value in having multiple groups looking at the same problem in different ways." The ice core research scientist Eric Steig wrote at RealClimate.org that it was unsurprising that Berkeley Earth's results matched previous results so well: "Any of various simple statistical analyses of the freely available data ...show... that it was very very unlikely that the results would change".
|
https://en.wikipedia.org/wiki?curid=31299401
| 1,743,227 |
772,515 |
Regarding the first point, nanoscale MOF (NMOF) synthesis has been mentioned in an earlier section. The latter obstacle addresses the limitation of the antenna effect. Smaller linkers tend to improve MOF stability, but have higher energy absorptions, predominantly in the ultraviolet (UV) and high-energy visible regions. A design strategy for MOFs with redshifted absorption properties has been accomplished by using large, chromophoric linkers. These linkers are often composed of polyaromatic species, leading to large pore sizes and thus decreased stability. To circumvent the use of large linkers, other methods are required to redshift the absorbance of the MOF so lower energy excitation sources can be used. Post-synthetic modification (PSM) is one promising strategy. Luo et al. introduced a new family of lanthanide MOFs with functionalized organic linkers. The MOFs, named MOF-1114, MOF-1115, MOF-1130, and MOF-1131, are composed of octahedral SBUs bridged by amino-functionalized dicarboxylate linkers. The amino groups on the linkers served as sites for covalent PSM reactions with either salicylaldehyde or 3-hydroxynaphthalene-2-carboxaldehyde. Both of these reactions extend the π-conjugation of the linker, causing a redshift in the absorbance wavelength from 450 nm to 650 nm. The authors also propose that this technique could be adapted to similar MOF systems and, by increasing pore volumes with increasing linker lengths, larger π-conjugated reactants can be used to further redshift the absorption wavelengths. Biological imaging using MOFs has been realized by several groups, namely Foucault-Collet and co-workers. In 2013, they synthesized a NIR-emitting Yb-NMOF using phenylenevinylene dicarboxylate (PVDC) linkers. They observed cellular uptake in both HeLa cells and NIH-3T3 cells using confocal, visible, and NIR spectroscopy. Although low quantum yields persist in water and HEPES buffer solution, the luminescence intensity is still strong enough to image cellular uptake in both the visible and NIR regimes.
|
https://en.wikipedia.org/wiki?curid=9821563
| 772,100 |
1,925,198 |
By 1980, complementary bipolar ICs became possible and Allison Research released the first monolithic Blackmer gain cell IC. The ECG-101, which was designed by Paul Buff, contained only the core of a modified Blackmer cell – a set of eight matched transistors – and was intended for pure class A operation. It had a unique sonic signature that had almost no undesirable, odd-order harmonics and was easier to stabilize than the original Blackmer cell. In 1981 dbx, Inc. released their own monolithic IC, the dbx2150/2151/2155, which was designed by Dave Welland, the future co-founder of Silicon Labs. The three numeric designations denoted three grades of the same chip; the 2151 being the best, the 2155 the worst; the middle-of-the-line 2150 was the most widely used version. The eight-pin single-in-line package (SIP8) assured good isolation between inputs and outputs, and became the industry standard that was used in the later dbx2100, THAT2150 and THAT2181 ICs. These circuits, like the original hybrid dbx ICs, were a small-volume niche product used exclusively in professional analogue audio. Typical applications include mixing consoles, compressors, noise gates, duckers, de-essers and state variable filters. The dbx noise reduction system, which used the Blackmer cell, had limited success in the semi-professional market and failed in consumer markets, losing to Dolby C. The only mass market where dbx achieved substantial use was the North American Multichannel Television Sound system, which was introduced in 1984 and operated until the end of analogue television broadcasting in 2009.
|
https://en.wikipedia.org/wiki?curid=63925564
| 1,924,094 |
844,030 |
The Wasp III is the result of a multi-year joint development effort between AeroVironment and the Defense Advanced Research Projects Agency (DARPA) to create a small, portable, reliable, and rugged unmanned aerial platform designed for front-line day or night reconnaissance and surveillance. The Wasp weighs only , is 16 in (38 cm) long, and has a wingspan of 29 in (72 cm); it can be broken down and re-assembled to fit in a backpack. It can be controlled manually or programmed for GPS-based autonomous navigation and can carry interchangeable targeting payload modules, including forward and side-looking infrared and color cameras that transmit streaming video directly to the hand-held ground controller, the same controller used for the larger RQ-11B Raven and RQ-20 Puma. The aircraft can fly for 45 minutes out to at an altitude of 1,000 ft (300 m) with a top speed of . The Air Force Special Operations Command (AFSOC) selected the Wasp III for the Battlefield Air Targeting Micro Air Vehicle (BATMAV) program in December 2006 to allow battlefield airmen to look for enemy targets beyond their line of sight; AFSOC began testing the tiny UAV in October 2007 and approved full-rate production in January 2008. In November 2007, the U.S. Marine Corps also awarded AeroVironment a $19.3 million contract to deliver Wasp III systems under the Air Force BATMAV contract to equip Marines at platoon level, complementing Raven UAVs deployed at company and battalion levels.
|
https://en.wikipedia.org/wiki?curid=24740113
| 843,580 |
1,659,009 |
Chinese scientists first discovered the severe acute respiratory syndrome (SARS) coronavirus in February 2003, but due to initial misinterpretation of the data, information about the correct agent associated with SARS was suppressed and the outbreak investigation had a delayed start. Advanced hospital facilities were at the greatest risk as they were most susceptible to virus transmission, so it was the "classical gumshoe epidemiology" of "contact tracing and isolation" that brought swift action against the epidemic. Lipkin was requested to assist with the investigation by Chen Zhou, vice president of the Chinese Academy of Sciences, and Xu Guanhua, minister of the Ministry of Science and Technology in China, to "assess the state of the epidemic, identify the gaps in science, and develop a strategy for containing the virus and reducing morbidity and mortality." This brought the development of real-time polymerase chain reaction (qPCR) technology, which essentially allowed for the detection of infection at earlier time points, as the process, in this instance, targets the N gene sequence and amplifies it in a closed system. This markedly reduces the risk of contamination during processing. Test kits were developed with this PCR-based assay and 10,000 were hand-delivered to Beijing during the height of the outbreak by Lipkin, whereupon he trained local clinical microbiologists in their proper usage. He became ill upon his return to the U.S. and was quarantined.
|
https://en.wikipedia.org/wiki?curid=20485252
| 1,658,076 |
272,158 |
Whichever the form of the crystallizer, to achieve effective process control it is important to control the retention time and the crystal mass, to obtain the optimum conditions in terms of crystal specific surface and the fastest possible growth. This is achieved by a separation – to put it simply – of the crystals from the liquid mass, in order to manage the two flows in a different way. The practical way is to perform a gravity settling to be able to extract (and possibly recycle separately) the (almost) clear liquid, while managing the mass flow around the crystallizer to obtain a precise slurry density elsewhere. A typical example is the DTB ("Draft Tube and Baffle") crystallizer, an idea of Richard Chisum Bennett (a Swenson engineer and later President of Swenson) at the end of the 1950s. The DTB crystallizer (see images) has an internal circulator, typically an axial flow mixer – yellow – pushing upwards in a draft tube while outside the crystallizer there is a settling area in an annulus; in it the exhaust solution moves upwards at a very low velocity, so that large crystals settle – and return to the main circulation – while only the fines, below a given grain size, are extracted and eventually destroyed by increasing or decreasing temperature, thus creating additional supersaturation. A quasi-perfect control of all parameters is achieved, as DTB crystallizers offer superior control over crystal size and characteristics. This crystallizer, and the derivative models (Krystal, CSC, etc.), could be the ultimate solution if not for a major limitation in the evaporative capacity, due to the limited diameter of the vapor head and the relatively low external circulation not allowing large amounts of energy to be supplied to the system.
|
https://en.wikipedia.org/wiki?curid=1266658
| 272,010 |
1,164,479 |
Visitors to "The Art of Video Games" at the Smithsonian American Art Museum were greeted by a 12-foot projection that included excerpts from most of the 80 games featured in the exhibition with a chipmusic soundtrack written and recorded by 8 Bit Weapon and ComputeHer. An interior gallery included a series of short videos showing the range of emotional responses players of all ages have while interacting with games. Five themed videos addressing the themes of Beginnings, Inspiration, Narrative, Experience and The Future showcased excerpts from interviews with 20 influential figures in the gaming world—Nolan Bushnell, David Cage, Steve Cartwright, Jenova Chen, Don Daglow, Noah Falstein, Ed Fries, Ron Gilbert, Robin Hunicke, Henry Jenkins, Jennifer MacLean, RJ Mical, Mike Mika, David Perry, Jane Pinckard, George L. Rose, Kellee Santiago, Tim Schafer, Jesse Schell, Warren Spector and Tommy Tallarico. The videos are also available on the museum's website. A five-channel installation displaying advances in core mechanics illustrated how home video games have evolved dramatically since their introduction in the 1970s through elements like avatars, jumping, running, climbing, flying, cutscenes and landscapes. The room also held a selection of concept art from several games of different eras. Five playable games, one from each era, showed how players interact with diverse virtual worlds, highlighting innovative techniques that set the standard for many subsequent games. The playable games were "Pac-Man", "Super Mario Brothers", "The Secret of Monkey Island", "Myst", and "Flower". Interactive kiosks in the final gallery covered five eras of game technology, from early pioneers to contemporary designers, and 20 gaming systems from Atari VCS to PlayStation 3. Each kiosk featured a game from each of four genres—action, target, adventure and tactics—that visitors could select to listen to commentary, game dialogue and music.
|
https://en.wikipedia.org/wiki?curid=32507866
| 1,163,862 |
1,936,124 |
In 2003, she joined the advisory board of the United States Environmental Protection Agency on PM2.5 Clean Air. Between 2003 and 2006 Prather studied whether ATOFMS could be used to measure the carbonaceous components of aerosols (including PAHs) and help to understand atmospheric processes, distinguishing between organic (OC) and elemental carbon (EC). Prather showed it was possible to distinguish EC and OC on a single-particle level, and investigated their chemical associations with ammonium, nitrate, and sulfate. Her group explored ways to calibrate the ATOFMS data, making real-time apportionment of ambient particles possible. They did this by classifying particles using an artificial neural network (ART-2a). In 2008 she became the co-lead scientist of CalWater, in collaboration with F. Martin Ralph; a multi-year interdisciplinary research effort focusing on how aerosols are impacting the water supply in the West Coast of the United States. Her PhD student Kerri Pratt led the Ice in Clouds Experiment – Layer Clouds (ICE-L) study. ICE-L included the first aircraft ATOFMS, named "Shirley." Pratt and Prather studied ice crystals "in situ" on high-speed aircraft flying above Wyoming, and found that the particles were mainly composed of dust or biological particles (bacteria, fungal spores or plants). Understanding the composition of airborne particles is imperative to properly evaluate their impact on climate change, as well as to provide insight into how aerosols impact cloud formation and precipitation.
|
https://en.wikipedia.org/wiki?curid=59545995
| 1,935,016 |
1,468,812 |
LSHTM served a crucial role in reshaping scientific research and public health in the post-war period. Sir Austin Bradford Hill, a professor of Medical Statistics at the school, was the statistician on the Medical Research Council Streptomycin in Tuberculosis Trials Committee and their 1948 study evaluating the use of streptomycin in treating tuberculosis, which is generally accepted as the first randomized controlled trial to have been conducted. Two years later Bradford Hill and Sir Richard Doll, a member of the MRC Statistical Research Unit based at LSHTM, were the first to demonstrate the association between cigarette smoking and lung cancer. They also set up the well-known British Doctors Study to provide evidence for a causal relationship. In 1951, alumnus Max Theiler was awarded the Nobel Prize in Physiology or Medicine "for his discoveries concerning yellow fever and how to combat it". George Macdonald, Professor of Tropical Hygiene and Director of the Ross Institute at LSHTM, was the first to propose the basic reproduction number (R0) in his 1952 study of malaria, which remains a key statistic in the study of infectious diseases to this day. The following year, alumnus Jerry Morris was the first to establish the role of physical exercise in preventing heart disease. Richard Doll established the connection between asbestos and lung cancer in 1955, while two Readers at the school established the connection between air pollution and respiratory diseases in 1958.
|
https://en.wikipedia.org/wiki?curid=531365
| 1,467,988 |
90,302 |
Two federal agencies have supported various weather modification research projects, which began in the early 1960s: the United States Bureau of Reclamation (Reclamation; Department of the Interior) and the National Oceanic and Atmospheric Administration (NOAA; Department of Commerce). Reclamation sponsored several cloud seeding research projects under the umbrella of Project Skywater from 1964 to 1988, and NOAA conducted the Atmospheric Modification Program from 1979 to 1993. The sponsored projects were carried out in several states and two countries (Thailand and Morocco), studying both winter and summer cloud seeding. From 1962 to 1988 Reclamation developed cloud seeding applied research to augment water supplies in the western US. The research focused on winter orographic seeding to enhance snowfall in the Rocky Mountains and Sierra Nevada, and precipitation in coast ranges of southern California. In California, Reclamation partnered with the California Department of Water Resources (CDWR) to sponsor the Sierra Cooperative Pilot Project (SCPP), based in Auburn, CA, to conduct seeding experiments in the central Sierra. The University of Nevada and the Desert Research Institute provided cloud physics, physical chemistry, and other field support. The High Plains Cooperative Pilot Project (HIPLEX) focused on convective cloud seeding to increase rainfall during the growing season in Montana, Kansas, and Texas from 1974 to 1979. In 1979, the World Meteorological Organization and other member states, led by the Government of Spain, conducted a Precipitation Enhancement Project (PEP) in Spain, with inconclusive results due probably to location selection issues. Reclamation sponsored research at several universities, including Colorado State University, the Universities of Wyoming, Washington, UCLA, Utah, Chicago, NYU, Montana, and Colorado, and research teams at Stanford, Meteorology Research Inc., Penn State University, the South Dakota School of Mines and Technology, North Dakota, Texas A&M, Texas Tech, and Oklahoma. Cooperative efforts with state water resources agencies in California, Colorado, Montana, Kansas, Oklahoma, Texas, and Arizona assured that the applied research met state water management needs. The High Plains Cooperative Pilot Project also engaged in partnerships with NASA, Environment Canada, and the National Center for Atmospheric Research (NCAR). More recently, in cooperation with six western states, Reclamation sponsored a small cooperative research program called the Weather Damage Modification Program from 2002 to 2006.
|
https://en.wikipedia.org/wiki?curid=449660
| 90,262 |
152,276 |
As climate change becomes a bigger issue, it has moved from the social and natural sciences into political debate. Carrying capacity currently tends to be thought of as a natural and normal balance between nature and humans. Carrying capacity depends on the amount of natural resources available to a population and how much of the resource is needed. When it began to be used, it looked at human impacts on the environment or on specific species. Anthropological criticisms of the concept of carrying capacity are that it does not successfully capture the nuances of how multilayered human–environment relationships are. Discussions of carrying capacity often utilize a framework that places undue blame on populations that often experience the worst effects of climate change and environmental degradation. The Gwembe Tonga Research Project (GTRP) is a long-term study in Africa that uses the building of the Kariba Dam on the Zambezi River as a case study to explore the effects of large-scale development on populations. The building of this dam and the subsequent flooding in the area displaced 57,000 people. Increasing drought cycles, along with displaced people joining land that was already populated, caused a great deal of precarity for the displaced population, and kinship networks and famine foods were utilized to deal with scarcity. The study was started in 1956. It originally wrapped up in 1962, but the researchers chose to continue indefinitely to better understand the community and how it changes over time. The population was resettled because of development on Lake Kariba. Some of the villages were forced to settle below the new dam. Six thousand people settled in Lusitu, alongside an ethnically very different group of around one thousand people, in a new environment. Droughts in the area are becoming more frequent, and there are certainly some environmental costs. However, with GTRP, it has been found that there is no inevitable permanent damage to the ecology. In Lusitu, there was a terrible drought between 1994 and 1995, which resulted in no harvest. However, the next year, the people saw a good harvest. It was not enough for the whole population, but it was better than in other years. The drought allowed the soil to rest, and led to a bigger harvest than in recent years. The economy has been struggling since the copper industry collapsed in the 1970s.
|
https://en.wikipedia.org/wiki?curid=47544
| 152,208 |
1,311,158 |
According to Irrational's Ken Levine, the name "Columbia", in reference to the female figure that personifies the United States, and the idea of American exceptionalism did not come about until six to eight months before the game's reveal. An early concept was to depict a group of technology geeks against a band of luddites, but Levine found that such conflict exists "only in shades of reality" and wasn't compelling enough. Instead, the Irrational team recentered on the idea of American exceptionalism, a tangible concept that continues to be repeated throughout history. The idea came to Levine after watching a PBS documentary, "America 1900", about the late 19th century, which quickly caught on with the rest of the team. In particular, Levine pointed to one quote of U.S. President William McKinley on the eve of the Philippine–American War, which spoke to the need of America to "uplift and civilize and Christianize" the natives of the Philippines. Though the accuracy of the quote is disputed, Irrational's lead artist Shawn Robinson noted that "BioShock Infinite"s goal is "not to teach any history", but felt such historical aspects helped to ground the work's fiction. Levine stated that in the same manner that "BioShock" was not built specifically around Objectivism, "Infinite" is not built around jingoism, but only uses the concepts to help set the stage to tell the story of individuals caught up in the conflicts. Another work that Levine took inspiration from was Erik Larson's "The Devil in the White City" about Dr. H. H. Holmes, the first recorded serial killer at the 1893 World's Fair in Chicago; Levine considered how the work gave "a great optimism and excitement for the future and one of this ominous feeling at the same time". Levine noted that in contrast to the character of Andrew Ryan from the first "BioShock", where history had influenced some of his decisions, Booker and other characters have been directly involved with some of the aforementioned history, reflected in how these characters react to certain scenarios.
|
https://en.wikipedia.org/wiki?curid=38977195
| 1,310,440 |
663,161 |
Zooplankton are tiny animals suspended in the water column. Like phytoplankton, these species have developed mechanisms that keep them from sinking to deeper waters, including drag-inducing body forms, and the active flicking of appendages (such as antennae or spines). Remaining in the water column may have its advantages in terms of feeding, but this zone's lack of refugia leaves zooplankton vulnerable to predation. In response, some species, especially Daphnia sp., make daily vertical migrations in the water column by passively sinking to the darker lower depths during the day, and actively moving towards the surface during the night. Also, because conditions in a lentic system can be quite variable across seasons, zooplankton have the ability to switch from laying regular eggs to resting eggs when there is a lack of food, temperatures fall below 2 °C, or if predator abundance is high. These resting eggs have a diapause, or dormancy period, that should allow the zooplankton to encounter conditions that are more favorable to survival when they finally hatch. The invertebrates that inhabit the benthic zone are numerically dominated by small species, and are species-rich compared to the zooplankton of the open water. They include: Crustaceans (e.g. crabs, crayfish, and shrimp), molluscs (e.g. clams and snails), and numerous types of insects. These organisms are mostly found in the areas of macrophyte growth, where the richest resources, highly-oxygenated water, and warmest portion of the ecosystem are found. The structurally diverse macrophyte beds are important sites for the accumulation of organic matter, and provide an ideal area for colonization. The sediments and plants also offer a great deal of protection from predatory fishes.
|
https://en.wikipedia.org/wiki?curid=4479734
| 662,816 |
1,235,090 |
In May 2017, SwissMicros released pre-production samples of an RPN calculator closely resembling the HP-42S, the "DM42". The final product was released on the 9 December 2017. Even though slightly smaller (144×77×13 mm, 170 g) than the original HP-42S (148×80×15 mm, 170 g), the calculator comes with an additional top row of keys for soft menus, a keyboard layout supporting direct alpha character input, a much larger high-contrast display (Sharp low power transflective memory LCD with a resolution of 400×240, protected by Gorilla Glass) showing all four stack levels at once (configurable), ca. 75 KB usable RAM, a beeper, a callable real-time clock as well as an infrared port for HP 82240A/HP 82240B printer support and a USB interface (with Micro-B connector) emulating a FAT16-formatted USB mass storage device for easy program transfer and state backup / transfer as well as for firmware updates. The calculator, which comes in a stainless steel case with matte black physical vapor deposition (PVD) coating, supports keyboard overlays and is based on a modified version of Thomas Okken's GPLed Free42 simulator with Intel's decimal floating-point math library for higher precision (decimal128) running on an STM32L476RG processor (ARM Cortex-M4 core, 128 KB RAM, 1 MB internal flash) with another 8 MB of external QSPI flash (of which ca. 6 MB are available to users). It is powered by a CR2032 coin cell or via USB and clocked dynamically at 24-80 MHz. The "DM42" is also the hardware basis for the community-developed WP 43S calculator, a successor to the WP 34S.
|
https://en.wikipedia.org/wiki?curid=730770
| 1,234,427 |
1,044,346 |
Biological structuralism objects to an exclusively Darwinian explanation of natural selection, arguing that other mechanisms also guide evolution, and sometimes implying that these supersede selection altogether. Structuralists have proposed different mechanisms that might have guided the formation of body plans. Before Darwin, Étienne Geoffroy Saint-Hilaire argued that animals shared homologous parts, and that if one was enlarged, the others would be reduced in compensation. After Darwin, D'Arcy Thompson hinted at vitalism and offered geometric explanations in his classic 1917 book "On Growth and Form". Adolf Seilacher suggested mechanical inflation for "pneu" structures in Ediacaran biota fossils such as "Dickinsonia". Günter P. Wagner argued for developmental bias, structural constraints on embryonic development. Stuart Kauffman favoured self-organisation, the idea that complex structure emerges holistically and spontaneously from the dynamic interaction of all parts of an organism. Michael Denton argued for laws of form by which Platonic universals or "Types" are self-organised. In 1979 Stephen J. Gould and Richard Lewontin proposed biological "spandrels", features created as a byproduct of the adaptation of nearby structures. Gerd Müller and Stuart Newman argued that the appearance in the fossil record of most of the current phyla in the Cambrian explosion was "pre-Mendelian" evolution caused by plastic responses of morphogenetic systems that were partly organized by physical mechanisms. Brian Goodwin, described by Wagner as part of "a fringe movement in evolutionary biology", denied that biological complexity can be reduced to natural selection, and argued that pattern formation is driven by morphogenetic fields. Darwinian biologists have criticised structuralism, emphasising that there is plentiful evidence from deep homology that genes have been involved in shaping organisms throughout evolutionary history. They accept that some structures such as the cell membrane self-assemble, but question the ability of self-organisation to drive large-scale evolution.
|
https://en.wikipedia.org/wiki?curid=53955838
| 1,043,802 |
1,779,309 |
The SWAM experiments will require two main inputs. The first is platforms to access the nooks and corners of the undersea domain; the second is the signal processing capability to pre-process the data and undertake effective processing to derive meaningful inputs. The conventional ship-borne deployment of sensors has not yielded the desired results and is significantly resource-intensive given the massive area to be studied. Underwater gliders have proven to be the platform best suited to acoustic surveys in the underwater domain, specifically for oceanographic data collection. Buoyancy-engine-driven underwater gliders are slow, relatively cheap, have long endurance, and are less noisy, making them ideally suited for data collection in acoustic surveys. These platforms can be deployed in large numbers to cover huge areas and their output stitched together for data analysis. They are among the recent advancements in autonomous underwater vehicles (AUVs), but since they are not propeller-driven they can be used for acoustic data collection, as they have low noise and long endurance. Acoustic analysis capabilities have also remained limited to a small group of countries, including the US, France, Japan, Australia, and members of the Nordic Acoustics Association (NAA). Tropical littoral ASW is a recent phenomenon, and some of these countries have successfully invested in and developed capabilities to overcome its challenges. The Americans recognized Chinese expansion of influence in the maritime domain, particularly in the South China Sea (SCS), only towards the end of the 20th century. The Asian Seas International Acoustics Experiment (ASIAEX) was a massive SWAM experiment they planned right at the beginning of this century. Initially, six US universities led by the University of Washington planned phase 1 of the project, and in phase 2, 20 other universities from China, Taiwan, and elsewhere were included.
|
https://en.wikipedia.org/wiki?curid=60878831
| 1,778,306 |
293,993 |
One of the most prevalent types of work-related injuries is musculoskeletal disorder. Work-related musculoskeletal disorders (WRMDs) result in persistent pain, loss of functional capacity and work disability, but their initial diagnosis is difficult because they are mainly based on complaints of pain and other symptoms. Every year, 1.8 million U.S. workers experience WRMDs and nearly 600,000 of the injuries are serious enough to cause workers to miss work. Certain jobs or work conditions cause a higher rate of worker complaints of undue strain, localized fatigue, discomfort, or pain that does not go away after overnight rest. These types of jobs are often those involving activities such as repetitive and forceful exertions; frequent, heavy, or overhead lifts; awkward work positions; or use of vibrating equipment. The Occupational Safety and Health Administration (OSHA) has found substantial evidence that ergonomics programs can cut workers' compensation costs, increase productivity and decrease employee turnover. Mitigation solutions can include both short term and long-term solutions. Short and long-term solutions involve awareness training, positioning of the body, furniture and equipment and ergonomic exercises. Sit-stand stations and computer accessories that provide soft surfaces for resting the palm as well as split keyboards are recommended. Additionally, resources within the HR department can be allocated to provide assessments to employees to ensure the above criteria are met. Therefore, it is important to gather data to identify jobs or work conditions that are most problematic, using sources such as injury and illness logs, medical records, and job analyses.
|
https://en.wikipedia.org/wiki?curid=36479878
| 293,834 |
2,112,924 |
One of the most significant yet uncertain components of predictive climate change models is the impact of aerosols on the climate system. Aerosols affect Earth's radiation balance directly and indirectly. The direct effect occurs when aerosol particles scatter, absorb, or exhibit a combination of these two optical properties when interacting with incoming solar and infrared radiation in the atmosphere. Aerosols that typically scatter light include sulfates, nitrates, and some organic particles, while those that tend to exhibit a net absorption include mineral dust and black carbon (or soot). The second mechanism by which aerosols alter the planet's temperature is called the indirect effect, which occurs when a cloud's microphysical properties are altered, causing either an increase in the reflection of incoming solar radiation or an inhibited ability of clouds to develop precipitation. The first indirect effect is an increase in the number of cloud droplets, which leads to clouds that reflect more solar radiation and therefore cool the planet's surface. The second indirect effect (also called the cloud lifetime effect) is that the increase in droplet numbers simultaneously causes a decrease in droplet size, and therefore less potential for precipitation. That is, smaller droplets mean clouds live longer and retain higher liquid water content, which is associated with lower precipitation rates and higher cloud albedo. This highlights the importance of aerosol size as one of the primary determinants of aerosol quantity in the atmosphere, how aerosols are removed from the atmosphere, and the implications of these processes for climate. Fine particles are generally those below 2 micrometers (μm) in diameter. Within this category, the range of particles that accumulate in the atmosphere (due to low volatility or condensation growth of nuclei) is from 0.1-1 μm, and these are usually removed from the air through wet deposition. Wet deposition can be precipitation, snow or hail. On the other hand, coarse particles, such as old sea-spray and plant-derived particles, are removed from the atmosphere via dry deposition. This process is sometimes also called sedimentation. However, different types of biogenic organic aerosols exhibit different microphysical properties, and therefore their removal mechanisms from the air will depend on humidity. Without a better understanding of aerosol sizes and composition in the North Atlantic Ocean, climate models have limited ability to predict the magnitude of the cooling effect of aerosols on global climate.
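The size-dependent removal pathways sketched above can be summarised as a rule of thumb. The following Python snippet is a minimal illustration based only on the size ranges quoted in this passage; the cut-offs and labels are simplifications for orientation, not a microphysical model, and real removal also depends on composition and humidity as noted above.

```python
def likely_removal_pathway(diameter_um: float) -> str:
    """Rough rule of thumb for how an aerosol particle leaves the atmosphere,
    using the size ranges quoted above (a simplification: real behaviour also
    depends on composition, hygroscopicity and humidity)."""
    if diameter_um < 0.1:
        # Freshly formed particles tend to grow (condensation/coagulation)
        # into the accumulation mode before being removed.
        return "growth into accumulation mode, then wet deposition"
    if diameter_um <= 1.0:
        # Accumulation-mode fine particles (0.1-1 um): removed mainly by
        # wet deposition (scavenging by rain, snow or hail).
        return "wet deposition"
    if diameter_um <= 2.0:
        # Upper end of the 'fine' range: either pathway can dominate.
        return "wet or dry deposition"
    # Coarse particles (e.g. aged sea spray, plant debris): gravitational
    # settling, i.e. dry deposition / sedimentation.
    return "dry deposition (sedimentation)"

for d in (0.05, 0.5, 1.5, 10.0):
    print(f"{d:5.2f} um -> {likely_removal_pathway(d)}")
```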
|
https://en.wikipedia.org/wiki?curid=62264747
| 2,111,709 |
757,766 |
By the mid-1980s, computers had outgrown these video systems and needed better displays. Most home and basic office computers suffered from the use of the old scanning method, with the highest display resolution being around 640x200 (or sometimes 640x256 in 625-line/50 Hz regions), resulting in a severely distorted tall narrow pixel shape, making the display of high resolution text alongside realistic proportioned images difficult (logical "square pixel" modes were possible but only at low resolutions of 320x200 or less). Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 MHz of bandwidth that NTSC and PAL signals were confined to. IBM's Monochrome Display Adapter and Enhanced Graphics Adapter as well as the Hercules Graphics Card and the original Macintosh computer generated video signals of 342 to 350p, at 50 to 60 Hz, with approximately 16 MHz of bandwidth, some enhanced PC clones such as the AT&T 6300 (aka Olivetti M24) as well as computers made for the Japanese home market managed 400p instead at around 24 MHz, and the Atari ST pushed that to 71 Hz with 32 MHz bandwidth - all of which required dedicated high-frequency (and usually single-mode, i.e. not "video"-compatible) monitors due to their increased line rates. The Commodore Amiga instead created a true interlaced 480i60/576i50 RGB signal at broadcast video rates (and with a 7 or 14 MHz bandwidth), suitable for NTSC/PAL encoding (where it was smoothly decimated to 3.5~4.5 MHz). This ability (plus built-in genlocking) resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required, with "flicker-fixer" scan-doubler peripherals plus high-frequency RGB monitors (or Commodore's own specialist scan-conversion A2024 monitor) being popular, if expensive, purchases amongst power users. 1987 saw the introduction of VGA, on which PCs soon standardized, as well as Apple's Macintosh II range which offered displays of similar, then superior resolution and color depth, with rivalry between the two standards (and later PC quasi-standards such as XGA and SVGA) rapidly pushing up the quality of display available to both professional and home users.
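To see why the broadcast-derived modes gave such "tall narrow" pixels, one can compute the pixel aspect ratio implied by a given resolution on an idealized 4:3 display. The short Python sketch below is illustrative only: it ignores overscan and border areas, which is why low-resolution modes that looked roughly square on real televisions do not come out exactly square here.

```python
def pixel_aspect_ratio(width: int, height: int, display_ratio: float = 4 / 3) -> float:
    """Width-to-height ratio of one pixel when width x height logical pixels
    fill an idealized display with the given aspect ratio (no overscan)."""
    return display_ratio / (width / height)

for w, h in ((640, 200), (640, 256), (320, 200), (640, 480)):
    par = pixel_aspect_ratio(w, h)
    print(f"{w}x{h}: pixel is {1 / par:.2f}x taller than wide (PAR = {par:.2f})")
```

On this idealized geometry a 640x200 mode gives pixels about 2.4 times taller than they are wide, while VGA's 640x480 comes out exactly square.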
|
https://en.wikipedia.org/wiki?curid=56189
| 757,362 |
1,386,513 |
Struve was a highly successful administrator who brought fame to Yerkes Observatory and rebuilt the astronomy department of the University of Chicago. In particular, he gradually renewed the scientific staff, dismissing stagnating permanent researchers who were not making significant contributions to science but were occupying the faculty positions. The process was difficult. Struve used to arrive first and leave last at the observatory, taking notes on the working hours of staff which he then used in his bureaucratic moves. In replacement, he hired several young and talented researchers who later became world-famous scientists. Those included Subrahmanyan Chandrasekhar (Nobel Prize in Physics in 1983), Gerard Kuiper (namesake of the Kuiper Prize), Bengt Strömgren, Gerhard Herzberg (Nobel Prize in Chemistry in 1971), William Wilson Morgan and Jesse L. Greenstein. After World War II, he also invited a number of leading European researchers, such as Pol Swings, Jan Oort (father of radio astronomy), Marcel Minnaert, H. C. van de Hulst and Albrecht Unsöld. As most of them were foreigners, their appointment met strong opposition from the science officials for various reasons, such as taking jobs from Americans during the Great Depression. India-born Chandrasekhar, who spent a month in the Soviet Union in 1934, was also suspected of Communist connections. Struve spent extraordinary efforts defending and justifying each case, and those efforts paid off in building the scientific school at Yerkes and Chicago University. For example, Chandrasekhar spent his entire career as a scientist and administrator at Chicago University, assisting Struve and eventually replacing him as president of the American Astronomical Society (from 1949) and as the Editor in Chief of the "Astrophysical Journal".
|
https://en.wikipedia.org/wiki?curid=584584
| 1,385,746 |
2,087,734 |
Sushil Atreya's research addresses crosscutting themes of the origin and evolution of the atmospheres of the planets and moons of the Solar System, climate evolution, and planetary habitability. Atreya combines numerical modeling, spacecraft and ground-based observations, and data analysis in his study. He has also been developing the concepts for future planetary exploration missions, especially in situ entry probe missions at Jupiter, Saturn, Uranus, Neptune, and Venus. Atreya has published widely in his field of research. Atreya and colleagues made the first highly precise measurements of the primordial argon isotopic ratio on Mars using the mass spectrometer on the Curiosity Rover. It showed that Mars has lost much of its atmosphere in the past 4 billion years, and the so-called rocks from Mars are indeed Martian meteorites. Atreya was amongst the first to discover from orbital observations the presence of methane on Mars — a gas that is largely associated with life on Earth — followed by precise measurements from the surface over a decade with the tunable laser spectrometer on the Curiosity Rover at Gale Crater. With colleagues on the Galileo probe mass spectrometer team, Atreya found that the elements heavier than helium are enriched in Jupiter relative to their solar ratios which was a paradigm shifting constraint on the models of the formation of Jupiter and the other giant planets. On Juno, Atreya is involved in the determination of the global abundance of water using microwave radiometry. Atreya's photochemical models showed how a massive Earth-like atmosphere of nitrogen could evolve on Saturn's largest moon Titan, before the gas was detected on the satellite by Voyager. Using the data from the Cassini-Huygens mass spectrometer, Atreya was amongst the first to reveal the existence of a cycle of methane on Titan that is akin to the hydrologic cycle on Earth. Sushil Atreya is author of the book "Atmospheres and the Ionospheres of the Outer Planets and their Satellites" (Springer) and editor of "Origin and Evolution of Planetary and Satellite Atmospheres" (University of Arizona Press).
|
https://en.wikipedia.org/wiki?curid=69895415
| 2,086,532 |
1,064,348 |
The principal architects for the buildings were Rex Savidge and John Gelsthorpe of Architects' Design Group (ADG) from Baker Street, Nottingham. Heliodon modelling was used to determine the visual and psychological impact of the structures to be used on site due to their scale. This created a Zone of Visual Influence (ZVI), a system pioneered by the CEGB for the creation of future 2000 MW power stations in the 1960s. The tower layouts exploit line and lozenge formations. The opposite pairs of the lozenge group were coloured light and dark to avoid the tendency of the shapes to coalesce when viewed at a moderate distance. The offset tower of the line group has a light yellow colour with an intense hue, which acts as a nodal point. This was decided via a technical committee, the architects and William Holford in 1962.<ref name="86/2016"></ref> However, 10 years after construction the towers were indistinguishable from one another, as the yellow tint had much faded. The main building colours are limited to black, white and yellow. The ancillary buildings are grouped around two courts through which the approach road passes. The executive partner for ADG was Rex Savidge and the architect in charge was John Gelsthorpe, assisted by Norman Simpson. West Burton was granted an award by the Civic Trust for its "outstanding contribution to the surrounding scene". The Civic Trust, announcing the 82 awards it made in 1968 from more than 1,400 entries from 94 counties in the United Kingdom, described West Burton as "an immense engineering work of great style which, far from detracting from the visual scene, acts as a magnet to the eye from many parts of the Trent Valley".
|
https://en.wikipedia.org/wiki?curid=19273099
| 1,063,794 |
568,768 |
The 431-cubic-inch displacement 1938-40 Cadillac V16 was one of the last new American auto engine designs prior to World War Two. As such, it incorporated some of the latest thinking. Nine main bearings provided a crankshaft main bearing support between each 135 degree opposing pair of cylinders. The square bore and stroke lowered piston speed and promoted crankshaft rigidity, no small matter for an engine with eight cylinders in line per cylinder bank. The side valve engine design was no handicap for the time because the era's typical top engine speed of 3400-3700 rpm provided little opportunity to exploit the high speed breathing efficiency of overhead valves. Luxury car drivers presumably valued smoothness and silence more than high speed power. Hydraulic valve lifters promoted silent running and an absence of periodic adjustment. Unlike most cars of the era, an external oil filter safeguarded the precision valve lifters. Despite the use of side valves, the engine produced as much power as the prior 45 degree V16, and with much less complexity. The earliest engines produced featured an innovative friction wheel drive to the generator. This was soon replaced by a conventional V belt drive. Cadillac claimed that the 1938, 1939, and 1940 Series 90 Sixteen had the best performance of any production car in the world at the time and would accelerate 10-60 in high gear only in 16 seconds. The definitive engineering report on the 135 degree Cadillac V16 engine is "The Evolution of the Cadillac Sixteen engine," by E.W. Seaholm, in charge of Cadillac engine design. It was published by the industry journal "Automotive Industries," November 27, 1937.
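The benefit of the "square" bore and stroke can be made concrete with a quick calculation. The sketch below is illustrative only: it assumes an exactly square bore and stroke derived from the quoted 431 cubic inches spread over 16 cylinders and applies the standard mean-piston-speed formula at the era's quoted engine speeds; the rounded figures it produces are not taken from the Seaholm report.

```python
import math

DISPLACEMENT_CU_IN = 431.0   # total displacement quoted above
CYLINDERS = 16

# Assume an exactly "square" engine: bore == stroke.
# Per-cylinder volume V = (pi/4) * bore^2 * stroke = (pi/4) * stroke^3
per_cyl = DISPLACEMENT_CU_IN / CYLINDERS
stroke_in = (4 * per_cyl / math.pi) ** (1 / 3)
print(f"implied bore = stroke = {stroke_in:.2f} in")

for rpm in (3400, 3700):
    # Mean piston speed: the piston travels one stroke up and one down per revolution.
    speed_ft_min = 2 * stroke_in * rpm / 12
    print(f"mean piston speed at {rpm} rpm = {speed_ft_min:.0f} ft/min")
```

Holding displacement fixed, a shorter stroke directly lowers this figure, which is the sense in which the square layout "lowered piston speed".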
|
https://en.wikipedia.org/wiki?curid=1129179
| 568,478 |
293,967 |
In July 2008, the NRO declassified the existence of its Synthetic Aperture Radar satellites, citing difficulty in discussing the creation of the Space-Based Radar with the United States Air Force and other entities. In August 2009, FOIA archives were queried for a copy of the NRO video, "Satellite Reconnaissance: Secret Eyes in Space." The seven-minute video chronicles the early days of the NRO and many of its early programs. It was proposed that the NRO share the imagery of the United States itself with the National Applications Office for domestic law enforcement purposes. The NAO was disestablished in 2009. The NRO is a non-voting associate member of the Civil Applications Committee (CAC). The CAC is an inter-agency committee that coordinates and oversees the Federal-Civil use of classified collections. The CAC was officially chartered in 1975 by the Office of the President to provide Federal-Civil agencies access to National Systems data in support of mission responsibilities. According to "Asia Times Online", one important mission of NRO satellites is the tracking of non-US submarines on patrol or on training missions in the world's oceans and seas. At the National Space Symposium in April 2010, NRO director General Bruce Carlson, United States Air Force (Retired) announced that until the end of 2011, NRO is embarking on "the most aggressive launch schedule that this organization has undertaken in the last twenty-five years. There are a number of very large and very critical reconnaissance satellites that will go into orbit in the next year to a year and a half."
|
https://en.wikipedia.org/wiki?curid=377048
| 293,808 |
1,060,682 |
Between 1877 and 1931, Albert A. Michelson made multiple measurements of the speed of light. His 1877–79 measurements were performed under the auspices of Simon Newcomb, who was also working on measuring the speed of light. Michelson's setup incorporated several refinements on Foucault's original arrangement. As seen in Figure 5, Michelson placed the rotating mirror R near the principal focus of lens L ("i.e." the focal point given incident parallel rays of light). If the rotating mirror R were exactly at the principal focus, the moving image of the slit would remain upon the distant plane mirror M (equal in diameter to lens L) as long as the axis of the pencil of light remained on the lens, this being true regardless of the RM distance. Michelson was thus able to increase the RM distance to nearly 2000 feet. To achieve a reasonable value for the RS distance, Michelson used an extremely long focal length lens (150 feet) and compromised on the design by placing R about 15 feet closer to L than the principal focus. This allowed an RS distance of between 28.5 to 33.3 feet. He used carefully calibrated tuning forks to monitor the rotation rate of the air-turbine-powered mirror R, and he would typically measure displacements of the slit image on the order of 115 mm. His 1879 figure for the speed of light, 299944±51 km/s, was within about 0.05% of the modern value. His 1926 repeat of the experiment incorporated still further refinements such as the use of polygonal prism-shaped rotating mirrors (enabling a brighter image) having from eight through sixteen facets and a 22 mile baseline surveyed to fractional parts-per-million accuracy. His figure of 299,796±4 km/s was only about 4 km/s higher than the current accepted value. Michelson's final 1931 attempt to measure the speed of light in vacuum was interrupted by his death. Although his experiment was completed posthumously by F. G. Pease and F. Pearson, various factors militated against a measurement of highest accuracy, including an earthquake which disturbed the baseline measurement.
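The geometry of the rotating-mirror method reduces to a single relation between the rotation rate, the two distances and the observed image displacement. The Python sketch below is a rough illustration using round numbers in the neighbourhood of those quoted above (about 2000 ft for the RM distance and about 30 ft for RS); the 250 rev/s rotation rate is an assumed figure, not one taken from the text, and the calculation is not a reconstruction of Michelson's actual data reduction.

```python
import math

c = 299_792_458.0   # modern value of the speed of light, m/s
f_rot = 250.0       # assumed mirror rotation rate, rev/s (illustrative only)
D_rm = 610.0        # rotating mirror R to fixed mirror M, metres (~2000 ft)
L_rs = 9.1          # effective R-to-image radius arm, metres (~30 ft)

# Light takes t = 2*D/c for the round trip R -> M -> R.  In that time the
# mirror turns by dphi = 2*pi*f*t, the returned beam is deviated by 2*dphi,
# and the slit image shifts by roughly d = 2*dphi * L.
t = 2 * D_rm / c
displacement = 2 * (2 * math.pi * f_rot * t) * L_rs
print(f"predicted image displacement ≈ {displacement * 1e3:.0f} mm")

# Inverting the same relation turns a measured displacement into a value of c
# (here it simply returns the value we started from, by construction).
c_from_d = 8 * math.pi * f_rot * D_rm * L_rs / displacement
print(f"speed of light from displacement ≈ {c_from_d / 1e3:.0f} km/s")
```

With these round numbers the predicted displacement is on the order of 100 mm, consistent with the roughly 115 mm readings mentioned above.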
|
https://en.wikipedia.org/wiki?curid=148819
| 1,060,130 |
148,655 |
Beginning in mid-1962 and continuing over the next few years, members of the PDP-1 programming community at MIT, including Russell and the other Hingham Institute members, began to spread out to other schools and employers such as Stanford University and DEC, and as they did they spread the game to other universities and institutions with a PDP-1 computer. As a result, "Spacewar!" was perhaps the first video game to be available outside a single research institute. Over the next decade, programmers at these other institutions began coding their own variants, including features such as allowing more ships and players at once, replacing the hyperspace feature with a cloaking device, space mines, and even a first-person perspective version played on two screens that simulates each pilot's view out of the cockpit. Some of these "Spacewar!" installations also replicated Saunders' gamepad. DEC learned about the game soon after its creation, and gave demonstrations of it running on their PDP-1, as well as publishing a brochure about the game and the computer in 1963. According to a second-hand account heard by Russell while working at DEC, "Spacewar!" was reportedly used as a smoke test by DEC technicians on new PDP-1 systems before shipping because it was the only available program that exercised every aspect of the hardware. Although the game was widespread for the era, it was still very limited in its direct reach: while less expensive than most mainframe computers, the PDP-1 was priced at and only 53 were ever sold, most without a monitor and many of the remainder to secure military locations or research labs with no free computer time, which prevented the original "Spacewar!" from reaching beyond a narrow, academic audience. Though some later DEC models, such as the PDP-6, came with "Spacewar!" pre-loaded, the audience for the game remained very limited; the PDP-6, for example, sold only 23 units.
|
https://en.wikipedia.org/wiki?curid=5482854
| 148,594 |
1,754,556 |
Traditional science is representative of two distinct philosophical traditions within the history of science, but e-Science, it is being argued, requires a paradigm shift, and the addition of a third branch of the sciences. "The idea of open data is not a new one; indeed, when studying the history and philosophy of science, Robert Boyle is credited with stressing the concepts of skepticism, transparency, and reproducibility for independent verification in scholarly publishing in the 1660s. The scientific method later was divided into two major branches, deductive and empirical approaches. Today, a theoretical revision in the scientific method should include a new branch, Victoria Stodden advocate[s], that of the computational approach, where like the other two methods, all of the computational steps by which scientists draw conclusions are revealed. This is because within the last 20 years, people have been grappling with how to handle changes in high performance computing and simulation." As such, e-science aims at combining both empirical and theoretical traditions, while computer simulations can create artificial data, and real-time big data can be used to calibrate theoretical simulation models. Conceptually, e-Science revolves around developing new methods to support scientists in conducting scientific research with the aim of making new scientific discoveries by analyzing vast amounts of data accessible over the internet using vast amounts of computational resources. However, discoveries of value cannot be made simply by providing computational tools, a cyberinfrastructure or by performing a pre-defined set of steps to produce a result. Rather, there needs to be an original, creative aspect to the activity that by its nature cannot be automated. This has led to various research that attempts to define the properties that e-Science platforms should provide in order to support a new paradigm of doing science, and new rules to fulfill the requirements of preserving and making computational data results available in a manner such that they are reproducible in traceable, logical steps, as an intrinsic requirement for the maintenance of modern scientific integrity that allows an extenuation of "Boyle's tradition in the computational age".
|
https://en.wikipedia.org/wiki?curid=1001976
| 1,753,566 |
835,013 |
Horrigan Hall is named after the university's first president, Msgr. Alfred Horrigan, and serves as the campus center. Architects Thomas J. Nolan & sons designed the facility in "modern" 1950s style and Al J. Schneider Company was the general contractor. The exterior is of rough textured brick with limestone trim. The three-story building sits atop a hill on campus featuring a six-story tower. Originally known as the Administration Building, Horrigan Hall was constructed in February 1953, costing $1 million. The facility was completely funded by private donations. Horrigan Hall has gone through a few remodeling and upgrades over the years. In December 1961 a new sound system was added, with central air following in 1970. In 1986–87 an elevator was installed and a new 2001 Newburg Road entrance was added as an alternative to the original 2000 Norris Place street entrance. Further remodeling and expansion is planned. Three new buildings have been proposed in front of and connected to the existing hall. The project is dubbed "Bellarmine Centro" and calls for the addition of more than of new space and approximately of remodeled space in the existing building. There will be space for a new Graduate School of Management, bookstore, admissions, registrar, bursar and financial aid offices. Classrooms will be added and expanded and a new space dedicated to triple the size of the Thomas Merton Center, the official repository of Merton's manuscripts, which hosts approximately 3,000 research international scholars and visitors annually. A garden and green space will be added, including a green roof accessible to students and faculty. Bellarmine Centro is estimated to cost $38 million and will be funded entirely by private sources.
|
https://en.wikipedia.org/wiki?curid=394938
| 834,564 |
840,937 |
Version 2, released a few months later on 17 April 1984, was an incremental improvement to the original Turbo Pascal, to the point that the reference manual was at first identical to version 1's, down to having 1983 as the copyright date on some of the compiler's sample output, but had a separate "Addendum to Reference Manual: Version 2.0 and 8087 Supplement" manual with separate page numbering. Additions included an overlay system, where separate overlay procedures would be automatically swapped from disk into a reserved space in memory. This memory was part of the 64kB RAM used by the program's code, and was automatically the size of the largest overlay procedure. Overlay procedures could include overlay sections themselves, but unless a RAM disk was used, the resulting disk swapping could be slow. 2.0 also added the Dispose procedure to manage the heap, allowing individual dynamic variables to be freed, as an alternative to the more primitive 'Mark/Release' system, and increased compatibility with WordStar commands plus use of the numeric keypad on the IBM PC and compatibles. Such PCs also had new text window and CGA graphics mode commands as well as being able to use the PC's speaker for tones. Finally, DOS and CP/M-86 machines with an 8087 maths coprocessor (or later compatible) had an alternative TURBO-87 compiler available to purchase. It supported the 8087's "long real" data type with a range of 1.67E-307 to 1.67E+308 and 14 significant figures of precision, with much greater processing speed. The manual notes that although source code written for Turbo Pascal's software real data type, offering a range of 1E-63 to 1E+63 to 11 significant figures, remained compatible, the two formats were incompatible at a binary level: as well as covering a much larger range, the 8087 reals took eight bytes in memory where the software reals took six.
|
https://en.wikipedia.org/wiki?curid=38273
| 840,487 |
1,359,686 |
Researchers often use heterologous expression techniques to study protein interactions. For example, bacteria have been optimized for the heterologous expression and biosynthesis of nitrogenase through NifEN, which can be expressed and engineered in E. coli. Even in this host, it remains exceedingly challenging to heterologously express a complex, heteromultimeric metalloprotein like NifEN with a full complement of subunits, metalloclusters, and functionality. The NifEN variant engineered in this bacterial host retains its cofactor efficacy at analogous cofactor-binding sites, which provides proof of heterologous expression and encourages future investigation of this metalloenzyme. Additionally, there have been recent reports of the utility of new filamentous fungal systems in the production of industrial proteins. Advantages include high transformation frequencies, the production of proteins at neutral pH, low viscosity of the fermentation broth due to strain selection for a nonfilamentous format, and short fermentation times. Many human gene products, such as albumin, IgG, and interleukin 6, have been expressed in heterologous systems with varying degrees of success. Inconsistent results have hinted at a shift from gene-by-gene studies to a whole-organism approach to post-translational modification. Oocytes are well suited to such work because of their large size and translational capacity, which make it possible to observe integrated cell responses. This applies to work ranging from studies of single molecules within single cells to medium-throughput drug-screening applications. By screening oocytes for the expression of injected cDNA, the application of microinjection as a model for heterologous expression can be studied further in terms of cell signaling, transport, architecture, and protein function.
|
https://en.wikipedia.org/wiki?curid=29071957
| 1,358,935 |
1,807,120 |
Electric signals from living tissues have been investigated since the early 18th century. This research has driven many innovations in healthcare, especially in medical diagnostics. Examples based on the electrical signals produced by human tissues include the electrocardiogram (ECG), electroencephalography (EEG) and the electromyogram (EMG). In addition, with the development of technology, biomagnetic measurements of the human body, comprising the magnetocardiogram (MCG), magnetoencephalography (MEG) and the magnetomyogram (MMG), provided clear evidence that the magnetic fields produced by ionic action currents in electrically active tissues can be used to record their activity. In the first such attempt, David Cohen used a point-contact superconducting quantum interference device (SQUID) magnetometer in a shielded room to measure the MCG, and reported that the sensitivity of the recorded MCG was orders of magnitude higher than that of previously recorded MCGs. The same researcher continued with MEG measurements using a more sensitive SQUID magnetometer without noise averaging, comparing the EEG and the alpha-rhythm MEG recorded from both normal and abnormal subjects. It was shown that the MEG produced new information different from that provided by the EEG. Because the heart produces a relatively large magnetic field compared to the brain and other organs, early biomagnetic field research originated in the mathematical modelling of the MCG, and early experimental studies also focused on the MCG. These experimental studies suffered from unavoidably low spatial resolution and low sensitivity due to the lack of sophisticated detection methods. With advances in technology, research has expanded into brain function, and preliminary studies of evoked MEGs began in the 1980s.
|
https://en.wikipedia.org/wiki?curid=30883500
| 1,806,101 |
464,112 |
In the early 1990s, an ALZA-funded research program began to develop a new dosage form of methylphenidate for the treatment of children with attention deficit hyperactivity disorder (ADHD). Methylphenidate's short half-life required multiple doses to be administered each day to attain long-lasting coverage, which made it an ideal candidate for the OROS technology. Multiple candidate pharmacokinetic profiles were evaluated and tested in an attempt to determine the optimal way to deliver the drug, which was especially important given the puzzling failure of an existing extended-release formulation of methylphenidate (Ritalin SR) to act as expected. The zero-order (flat) release profile that the PPOP was optimal at delivering failed to maintain its efficacy over time, which suggested that acute tolerance to methylphenidate formed over the course of the day. This explained why Ritalin SR was inferior to twice-daily Ritalin IR, and led to the hypothesis that an ascending pattern of drug delivery was necessary to maintain clinical effect. Trials designed to test this hypothesis were successful, and ALZA subsequently developed a modified PPOP design that utilized an overcoat of methylphenidate designed to release immediately and rapidly raise serum levels, followed by 10 hours of first-order (ascending) drug delivery from the modified PPOP design. This design was called the Push-Stick Osmotic Pump (PSOP), and utilized two separate drug layers with different concentrations of methylphenidate in addition to the (now quite robust) push layer.
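The difference between a flat and an ascending delivery pattern can be illustrated with a toy model. The sketch below is purely illustrative and is not a model of the actual OROS/PSOP formulation: the milligram totals, the linear "ascending" rate and the 10-hour window are assumptions chosen only to show the contrasting shapes of the two profiles.

```python
def flat_rate(t_h: float, total_mg: float = 18.0, duration_h: float = 10.0) -> float:
    """Zero-order (flat) profile: constant release rate across the window."""
    return total_mg / duration_h if 0.0 <= t_h <= duration_h else 0.0

def ascending_rate(t_h: float, total_mg: float = 18.0, duration_h: float = 10.0) -> float:
    """Toy ascending profile: rate grows linearly with time over the same window.
    An immediate-release overcoat would add a bolus at t = 0 on top of this."""
    if not 0.0 <= t_h <= duration_h:
        return 0.0
    k = 2.0 * total_mg / duration_h ** 2   # chosen so the window integrates to total_mg
    return k * t_h

for t in range(0, 11, 2):
    print(f"t = {t:2d} h: flat {flat_rate(t):.2f} mg/h, ascending {ascending_rate(t):.2f} mg/h")
```

The flat profile delivers the same rate all day, while the ascending profile delivers progressively more drug as the day goes on, which is the pattern the acute-tolerance hypothesis called for.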
|
https://en.wikipedia.org/wiki?curid=8031252
| 463,882 |
2,139,813 |
SensorDynamics was founded in 2003 by Herbert Gartner, Hubertus Christ, Jürgen Tittel and Volker Kempe. Financed by national and international venture capital investors, the company successfully closed four equity rounds in 2004, 2007, 2009 and 2011. In 2005 SensorDynamics was ranked among the top 100 European high-tech companies by Tornado Insider and awarded the Fast Forward Award 2005 in Austria. From the beginning the company cooperated closely with the Fraunhofer Institute for Silicon Technology in Itzehoe and expanded this cooperation with long-term agreements in 2007. In October 2007 SensorDynamics started a close cooperation with US consumer electronics supplier Kionix. In August 2008 EnOcean and SensorDynamics announced the launch of the world's first energy-harvesting system-on-chip (SoC) product. In March 2009 SensorDynamics launched MEMS gyroscopes for industrial, medical and consumer applications. In November 2009 Chipworks selected SensorDynamics' MEMS product SD755 as product of the year 2009. In November 2010 SensorDynamics announced XY angular-rate and XYZ angular-rate devices in 6x6x1.8 mm QFN40 packages. In December 2010 SensorDynamics announced the world's first fully characterized and specified 6 x 6 x 1.2 mm 6DoF IMU (six degrees of freedom inertial measurement unit), including evaluation boards. In July 2011 SensorDynamics was acquired by Maxim Integrated, a recognized leader in analog and mixed-signal semiconductor products. Maxim Integrated, headquartered in San Jose, California, is in the Fortune 1000, and is included in the NASDAQ-100, the Russell 1000, and MSCI USA indices. Maxim paid $164 million to acquire SensorDynamics.
|
https://en.wikipedia.org/wiki?curid=23979212
| 2,138,583 |
181,355 |
"L. acidophilus" grows naturally in the oral, intestinal, and vaginal cavities of mammals. Nearly all Lactobacillus species have special mechanisms for heat resistance which involves enhancing the activity of chaperones. Chaperones are highly conserved stress proteins that allow for enhanced resistance to elevated temperatures, ribosome stability, temperature sensing, and control of ribosomal function at high temperatures. This ability to function at high temperatures is extremely important to cell yield during the fermentation process, and genetic testing on "L. acidophilus" in order to increase its temperature tolerance is currently being done. When being considered as a probiotic, it is important for "L. acidophilus" to have traits suitable for life in the gastrointestinal tract. Tolerance of low pH and high toxicity levels are often required. These traits vary and are strain specific. Mechanisms by which these tolerances are expressed include differences in cell wall structure, along with other changes is protein expression. Changes in salt concentration have been shown to affect "L. acidophilus" viability, but only after exposure to higher salt concentrations. In another experiment highlighted by the American Dairy Science Association, viable cell counts only showed a significant reduction after exposure to NaCl concentrations of 7.5% or higher. Cells were also observed to distinctly elongate when grown in conditions of 10% NaCl concentration or higher. "L. acidophilus" is also very well suited for living in a dairy medium, as fermented milk is the ideal method of delivery for introducing "L. acidophilus" into a gut microbiome. The viability of "L. acidophilus" cells encapsulated by spray drying technology stored at refrigerated condition (4 °C) is higher than the viability of cells stored at room temperature (25 °C).
|
https://en.wikipedia.org/wiki?curid=965861
| 181,260 |
1,516,783 |
Alfredo Jahn (Caracas, 1867 – Caracas, 1942) finished his studies at the Central University of Venezuela in 1886. The following year, he participated in the preliminary studies for the construction of a major railroad between Caracas and Valencia with an extension to San Carlos. As a civil engineer, he worked with civil engineer and lawyer German Jimenez on the National Plan of Highways and Railroads of Venezuela by order of the National Government. He was responsible for the construction of the railroad from Caracas to Valencia. He also built the highway from Caracas to El Junquito. In 1887 he accompanied the Venezuelan chemist Vicente Marcano on a scientific expedition to the upper Orinoco river, sent by President Antonio Guzmán Blanco. The trip provided geographical positions and a collection of plants and archaeological objects found today in the United States and Germany. As a geographer he determined the levels of Lake Valencia and its tributary rivers, and the heights of the Coastal Range. He lived with the indigenous people of the Orinoco basin and wrote books on their customs and dialects. As a botanist he classified many plants in Venezuela, donated rare specimen samples to the Smithsonian Institution, and wrote a book on the palms of Venezuela ("The Palms of the Flora Venezuelana" – Caracas, 1908). In 1911 he became the first person to ascend Pico Humboldt in the Sierra Nevada de Mérida in Venezuela. A founding member of the Sociedad Venezolana de Ciencias Naturales (Venezuelan Society of Natural Sciences), he was its president in 1935 and 1937. He received an honorary doctorate from the University of Hamburg, and the Medal of the Berlin Geographical Society. He received the Order of the Liberator. The Alfredo Jahn Cave in Miranda is named for him; it is the sixth largest in the country.
|
https://en.wikipedia.org/wiki?curid=29302481
| 1,515,931 |
162,122 |
Big business grew much more slowly in Britain than in the United States, with the result that by the late 19th century the much larger American corporations were growing faster and could undersell their smaller competitors in Britain. A key factor was vertical integration. In the United States, the typical firm expanded by reaching backward into the supply chain and forward into the distribution system: at the Ford Motor Company, raw iron and coal went in one end, and Model Ts were delivered by local dealers at the other. British firms did not try to own the sources of raw materials; they bought on the market. Instead of setting up their own distribution systems they worked with well-established wholesalers. British businessmen were much more comfortable in a smaller niche, even though this made it much harder to lower costs and prices. Furthermore, the Americans had a rapidly growing home market, and investment capital was much more readily available. British businessmen typically used their savings not to expand their businesses but to purchase highly prestigious country estates – they looked to the landed country gentry for their role models, where the Americans looked to the multi-millionaires. The Crystal Palace hosted a world-class industrial exhibition in London in 1851. It was a marvelous display of the latest achievements in material progress, clearly demonstrating British superiority. The Americans were impressed, and repeatedly opened world-class industrial exhibits. By contrast, the British never repeated their success. In 1886 the British sociologist Herbert Spencer commented: "Absorbed by his activities, and spurred on by his unrestricted ambitions, the American is a less happy being than the inhabitant of a country where the possibilities of success are very much smaller."
|
https://en.wikipedia.org/wiki?curid=33643110
| 162,037 |
1,417,501 |
Life at air–water interfaces has never been considered easy, mainly because of the harsh environmental conditions that influence the SML. However, high abundances of microorganisms, especially of bacteria and picophytoplankton, accumulating in the SML compared to the underlying water were frequently reported, accompanied by a predominant heterotrophic activity. This is because primary production at the immediate air–water interface is often hindered by photoinhibition. However, some exceptions of photosynthetic organisms, e.g., Trichodesmium, Synechococcus, or Sargassum, show more tolerance towards high light intensities and, hence, can become enriched in the SML. Previous research has provided evidence that neustonic organisms can cope with wind and wave energy, solar and ultraviolet (UV) radiation, fluctuations in temperature and salinity, and a higher potential predation risk by the zooneuston. Furthermore, wind action promoting sea spray formation and bubbles rising from deeper water and bursting at the surface release SML-associated microbes into the atmosphere. In addition to being more concentrated compared to planktonic counterparts, the bacterioneuston, algae, and protists display distinctive community compositions compared to the underlying water, in both marine and freshwater habitats. Furthermore, the bacterial community composition was often dependent on the SML sampling device being used. While being well defined with respect to bacterial community composition, little is known about viruses in the SML, i.e., the virioneuston. This review has its focus on virus–bacterium dynamics at air–water interfaces, even if viruses likely interact with other SML microbes, including archaea and the phytoneuston, as can be deduced from viral interference with their planktonic counterparts. Although viruses were briefly mentioned as pivotal SML components in a recent review on this unique habitat, a synopsis of the emerging knowledge and the major research gaps regarding bacteriophages at air–water interfaces is still missing in the literature.
|
https://en.wikipedia.org/wiki?curid=12264442
| 1,416,702 |
162,140 |
It has been suggested that the industrial sector was slow to adjust to global changes, and that there was a striking preference for leisure over industrial entrepreneurship among the elite. In 1910, the British Empire's share of world industrial capacity stood at 15%, just behind Germany's 16%, and less than half of the United States' 35%. Despite signs of relative weakness in certain sectors of the UK economy, the major achievements of the Edwardian years should be underlined. The city was the financial centre of the world—far more efficient and wide-ranging than New York, Paris or Berlin. British investment abroad doubled in the Edwardian years, from £2 billion in 1900 to £4 billion in 1913. Britain had built up a vast reserve of overseas credits in its formal Empire, as well as in its informal empire in Latin America and other nations. The British held huge financial holdings in the United States, especially in railways. These assets proved vital in paying for supplies in the first years of the World War. Social amenities, especially in urban centres, were accumulating – prosperity was highly visible. Among the working class there was a growing demand for access to a greater say in government, but the level of industrial unrest on economic issues only became significant about 1908. In large part, it was the demands of the coal miners and railway workers, as articulated by their trade unions, that prompted a high level of strike activity in the years immediately before the First World War.
|
https://en.wikipedia.org/wiki?curid=33643110
| 162,055 |
270,924 |
The 19th century was also the period in which physiology, including neurophysiology, professionalized and saw some of its most significant discoveries. Among its leaders were Charles Bell (1774–1843) and François Magendie (1783–1855) who independently discovered the distinction between sensory and motor nerves in the spinal column, Johannes Müller (1801–1855) who proposed the doctrine of specific nerve energies, Emil du Bois-Reymond (1818–1896) who studied the electrical basis of muscle contraction, Pierre Paul Broca (1824–1880) and Carl Wernicke (1848–1905) who identified areas of the brain responsible for different aspects of language, as well as Gustav Fritsch (1837–1927), Eduard Hitzig (1839–1907), and David Ferrier (1843–1924) who localized sensory and motor areas of the brain. One of the principal founders of experimental physiology, Hermann Helmholtz (1821–1894), conducted studies of a wide range of topics that would later be of interest to psychologists – the speed of neural transmission, the natures of sound and color, and of our perceptions of them, etc. In the 1860s, while he held a position in Heidelberg, Helmholtz engaged as an assistant a young physician named Wilhelm Wundt. Wundt employed the equipment of the physiology laboratory – chronoscope, kymograph, and various peripheral devices – to address more complicated psychological questions than had, until then, been investigated experimentally. In particular he was interested in the nature of apperception – the point at which a perception occupies the central focus of conscious awareness.
|
https://en.wikipedia.org/wiki?curid=1573230
| 270,776 |
325,405 |
In September 2019 there were three Boxer-related announcements. On 10 September it was revealed that the target date for the UK's MIV programme to receive its main gate approval was 22 October 2019. It was reported that the business case for the purchase of an initial batch of 508 vehicles, valued at about GBP1.2 billion (US$1.48 billion), was currently under scrutiny by financial, commercial, and technical experts before receiving final approval by ministers. UK MoD officials submitted their final business case for the purchase of the Boxer MIVs on 9 September 2019 to meet the British Army's target of getting its first Boxer in service by 2023. At the 2019 Defence and Security Equipment International exhibition (DSEI 2019) in London, Germany's Flensburger Fahrzeugbau Gesellschaft (FFG) presented an armoured recovery mission module (ARM) for the Boxer. Christoph Jehn, FFG's project manager, stated that the ARM had been developed as a private venture from 2017. The company noticed Boxer users struggling to recover stranded vehicles with the aid of other Boxers and so decided to develop the bespoke mission module for the purpose. The ARM has an approximate weight of 13 tonnes, is manned by two personnel and connects to the Boxer using standard mechanical interfaces. On 24 September 2019 it was announced that the first Boxer for the Australian Army had formally been handed over. The turretless vehicle was the first of 25 Boxers – 13 multipurpose and 12 reconnaissance variants – that are being manufactured in Germany through to 2021 to meet an early Australian capability requirement for familiarisation and training purposes. Production of the other 186 platforms will begin in late 2020/early 2021 at a military vehicle centre of excellence constructed by Rheinmetall at Ipswich, southwest of Brisbane, which formally opened in October 2020. This is the company's largest facility outside Germany.<ref name="Jane's Armoured Fighting Vehicles 2019-2020 GTK/MRAV/PWV (Boxer) Wheeled Armoured Vehicle Programme"></ref> Also in September 2019 reports emerged that Algeria had selected the Boxer and that production would commence shortly. As of Q1 2022 this had not been confirmed by ARTEC.
|
https://en.wikipedia.org/wiki?curid=656230
| 325,232 |
2,034,961 |
The International Committee for Standardization of Hematology (ICSH) was established at the initiative of the European Society of Hematology (ESH) in 1963. From the 1970s to about 2000, the Management Committee met and worked at least once a year, and as a result members of its committees published many important guidelines and recommendations on the standardization of laboratory haematological procedures in major review journals. In 1965 a commentary entitled "Recommendations and Requirements for the Measurement of Haemoglobin in Human Blood" was published, and haemoglobin measurement has remained a standardization issue ever since. In all, more than 100 normative recommendations and comments on standardization were produced, and a complete list of publications is maintained. ICSH cooperated with various international organizations, especially those associated with the World Health Organization, and held frequent expert group meetings, often as meetings or special sessions arranged for the international community. In time, however, ICSH withdrew from the International Hematological Society (IHS) meetings in North America and Western Europe in favour of these frequent expert group meetings, and as the founding committee members retired from active laboratory practice, the organization's effectiveness and productivity went into a steady decline after 2000, which was very unfortunate for ICSH. The International Society of Laboratory Hematology (ISLH), by contrast, made great progress in the 1990s and itself took a lead in the standardization of laboratory haematology, which ICSH began to implement. ISLH working groups developed international guidelines for a platelet count reference method based on flow cytometry and for the immature reticulocyte fraction (Davis, 1997; ICSH, 2001), and further ISLH consensus documents followed, notably the consensus rules document on blood tests (Barnes et al., 2005). As a result, discussions were held between the ISLH board and the remaining senior ICSH officers, and members of the old ICSH board returned between 2002 and 2005. In April 2007, ICSH was re-registered as an independent foundation in the Netherlands and as a non-profit company in the United States.
|
https://en.wikipedia.org/wiki?curid=60779324
| 2,033,788 |
915,974 |
The SIMD vector processor (VMX128) was modified for the Xbox 360 to include a dot-product instruction. The dot-product instruction had far lower latency than performing the same operation with discrete instructions. The VMX128 was also modified by the addition of a Direct3D (D3D) compressed data format. This led to an approximately 50 percent saving in required bandwidth and memory footprint. The CPU had a theoretical peak performance of 115.2 GFLOPS and was capable of 9.6 billion dot products per second. Each core of the CPU was capable of simultaneous multithreading and was clocked at 3.2 GHz. However, to reduce CPU die size, complexity, cost, and power demands, the processor used in-order execution in contrast to the Intel Coppermine 128-based Pentium III used in the original Xbox, which used more complex out-of-order execution. The original chip used a 90 nm process, although a newer 65 nm process SOI revision was implemented on later models, which was in turn superseded by a 45 nm combined CPU and GPU chip. A 21.6 GB/s front side bus, aggregated 10.8 GB/s upstream and downstream, connected Xenon with the graphics processor/northbridge. Xenon was equipped with an eight-way set-associative 1 MB Level 2 cache on-die running at half the CPU clock speed. This cache was shared amongst the three CPU cores. Each core had separate L1 caches, each containing a two-way set-associative 32-Kbyte L1 instruction cache and a four-way set-associative 32-Kbyte L1 data cache. The write-through data cache did not allocate cache lines on writes. The CPU also contained ROM storing Microsoft private encryption keys, used to decrypt game data. The heat sink implemented to cool the Xenon CPU was composed of aluminum fins with a copper base, and a heat pipe. Newer revisions, which had a smaller core, did not feature the heat pipe or copper base. The heat sink was cooled by two 70 mm fans at the rear of the console on original-style consoles, while a single fan mounted on the side of the consoles was used in Xbox 360 S consoles. There were several types of fan used in Xbox 360s, which were produced by Nidec, Sunon, and Delta Electronics.
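The headline figures quoted above are consistent with a simple per-cycle accounting. The sketch below is only a back-of-the-envelope check: the 12 floating-point operations per core per cycle is the figure implied by the quoted 115.2 GFLOPS peak, not a number stated in the text.

```python
cores = 3
clock_hz = 3.2e9   # 3.2 GHz per core

# One vector dot product retired per core per cycle gives the quoted rate.
dot_products_per_sec = cores * clock_hz * 1
print(f"dot products per second ≈ {dot_products_per_sec / 1e9:.1f} billion")   # 9.6

# The quoted 115.2 GFLOPS peak implies 12 floating-point operations
# per core per cycle (e.g. vector multiply-add work on the VMX128 unit).
flops_per_core_per_cycle = 12
peak_gflops = cores * clock_hz * flops_per_core_per_cycle / 1e9
print(f"theoretical peak ≈ {peak_gflops:.1f} GFLOPS")                          # 115.2
```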
|
https://en.wikipedia.org/wiki?curid=12366110
| 915,493 |
565,700 |
On 16 August 2010, Iranian officials withdrew Mohammed Soleimani from the men's under-48-kilogram category taekwondo final against Israel's Gili Haimovitz, citing an ankle injury. According to the officials, Soleimani had first hurt his ankle at the World Junior Championships in Mexico earlier in the year, and the injury had flared up again during his semifinal contest against the US's Gregory English. Soleimani was sent to hospital for an X-ray, and his ankle was put into a cast. Haimovitz was awarded the gold medal by default at a victory ceremony at the Suntec International Convention Centre from which Soleimani, the silver medallist, was absent. Interviewed later on, Soleimani said he was "very sad" to have missed the bout as he was "sure [he] was going to get the gold medal". Israel's chef-de-mission Daniel Oren claimed that the pullout had been politically motivated. He said: "It's not the first time this has happened at the Olympics. But this is a first for a medal match. To be honest, once our boy got into the final, we knew that this is going to happen. I spoke to our boy after the final and he, of course, was disappointed that he did not have a chance to win his gold through an actual fight. I feel more sorry for the Iranian boy. He must have trained hard to get to this stage and was not given a chance to fight. We are dealing with sports here, youth sports, in fact. It's a pity that politics got involved." However, IOC spokesman Mark Adams said: "As far as the IOC is concerned, there is no sinister intent here. What we know factually is that the athlete injured his ankle and was sent to the hospital for an X-ray. Tests revealed he did not suffer anything broken, and he is all right now. So unless more factual information is available, it [the controversy] is mere speculation." This was reiterated by IOC President Jacques Rogge on 17 August: "He [Soleimani] was driven to the hospital, was examined by a Singaporean doctor, totally independent, not belonging to the organisation and he diagnosed an ankle sprain. For us, that's the end of the story." Previously, Iran has stated that since its existing government does not recognize Israel as a state, its policy is to withdraw from competing against the country.
|
https://en.wikipedia.org/wiki?curid=13765908
| 565,410 |
2,195,499 |
In May 1912, in a President's Address to the University of Sydney Engineering Society, JPVM set out his ideas on the manner in which engineers should be trained. He believed the teaching of engineering was about more than just technical matters & to his mind included character, clearness of thought & expression, general knowledge, tastes, discipline & contact with men of vastly different interests. There should be a liberal arrangement of the syllabus, doing away with individual years & placing only a minimum limit of four years on the course, so that the student could take as long as he pleased, with certain essential subjects & a certain number left as options. He did not believe that the Matriculation test in languages & literature was essential for entrance to an Engineering course. The course of instruction needed to be geared to Australian needs, with more emphasis on applications than on the finer points of design. A sound quantitative knowledge of the portions of science which directly concern the engineer's professional work should be provided. The system of training was to consist of a clear exposition of the fundamental principles of science & then a study of the methods of application of these principles. As soon as the principles are understood in a general way the student must be shown the practical application - there is a tendency to remember the applications & forget the principles. Such a defect can arise if students are given principles in one course & it is then left to them to deal with applications at a later stage. There is also a need for short graduate courses of 6-10 lectures. The need for an Australian physical measurements laboratory, such as those in Germany, London, Washington & Japan, was also raised at this time.
|
https://en.wikipedia.org/wiki?curid=52676648
| 2,194,249 |
1,649,337 |
Since 2004, different types of nanotube and nanowire based motors have been developed, in addition to nano- and micromotors of different shapes. Most of these motors use hydrogen peroxide as fuel, but some notable exceptions exist. These silver halide and silver-platinum nanomotors are powered by halide fuels, which can be regenerated by exposure to ambient light. Some nanomotors can even be propelled by multiple stimuli, with varying responses. These multi-functional nanowires move in different directions depending on the stimulus (e.g. chemical fuel or ultrasonic power) applied. For example, bimetallic nanomotors have been shown to undergo rheotaxis to move with or against fluid flow by a combination of chemical and acoustic stimuli. In Dresden, Germany, rolled-up microtube nanomotors produced motion by harnessing the bubbles from catalytic reactions. Without the reliance on electrostatic interactions, bubble-induced propulsion enables motor movement in relevant biological fluids, but typically still requires toxic fuels such as hydrogen peroxide. This has limited nanomotors' in vitro applications. One in vivo application, however, of microtube motors has been described for the first time by Joseph Wang and Liangfang Zhang using gastric acid as fuel. Recently titanium dioxide has also been identified as a potential candidate for nanomotors due to its corrosion resistance and biocompatibility. Future research into catalytical nanomotors holds major promise for important cargo-towing applications, ranging from cell sorting microchip devices to directed drug delivery.
|
https://en.wikipedia.org/wiki?curid=1390297
| 1,648,405 |
1,237,348 |
Entrusted with the management and expansion of this metropolitan network is the TASK Computer Centre, which was set up at Gdańsk University of Technology in 1994. Today it is based in the new Faculty of Electronics, Telecommunications and Informatics building. As one of the five supercomputer centres in Poland, it provides the scientific community with processing resources in the form of high-speed computers and specialist software. These resources help in the development of various fields of knowledge, for example chemistry, physics, engineering, electronics and oceanography. Presently, over 50 projects are being carried out using the supercomputers at the centre, concerning among other things: molecular modeling of proteins and nucleic acids; quantum chemistry calculations; research into the properties of nanomaterials; modeling of wave motion, currents and storm surges in the Baltic Sea and the Bay of Gdańsk; research into Nordic Seas dynamics; and modeling of the behavior of skeletal muscles. The TASK Computer Centre is co-creating the national PIONIER fibre-optic network for the scientific and information community, and is also actively participating in six projects of the so-called innovative economy: MAYDAY, Pl-Grid, PLATON, Pomeranian Digital Library, Integrated Oceanographic Data System and NEWMAN. Virtually all these projects help to create new jobs; their total value is in the region of 300 million zlotys. At the start of 2008 the Centre installed Galera, a computer cluster with a theoretical computing power of over 50 Tflops. Thanks to Galera, the TASK Computer Centre became one of the world's leading supercomputing sites, and in 2014 Galera was still listed among the world's 200 fastest computers in the prestigious TOP 500 chart.
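To put the quoted figure in context, a cluster's theoretical peak performance is conventionally estimated as total cores multiplied by clock rate and floating-point operations per cycle. The short Python sketch below illustrates that arithmetic; the node and core counts are illustrative assumptions chosen to land near the quoted ~50 Tflops, not Galera's published specification.

# Illustrative peak-performance estimate; the figures below are placeholders,
# not the actual Galera hardware specification.
def theoretical_peak_tflops(nodes, cpus_per_node, cores_per_cpu,
                            clock_ghz, flops_per_cycle):
    """Return theoretical peak performance in teraflops (Tflops)."""
    total_cores = nodes * cpus_per_node * cores_per_cpu
    peak_gflops = total_cores * clock_ghz * flops_per_cycle  # GFLOP/s
    return peak_gflops / 1000.0                              # -> TFLOP/s

if __name__ == "__main__":
    # Assumed: 672 nodes x 2 quad-core CPUs at 2.33 GHz, 4 FLOPs per cycle
    print(f"{theoretical_peak_tflops(672, 2, 4, 2.33, 4):.1f} Tflops")  # ~50.1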
|
https://en.wikipedia.org/wiki?curid=785196
| 1,236,684 |
1,392,224 |
The first is that we are led to a theory with differential wave propagation. The field functions are continuous functions of continuous parameters x and t, and the changes in the fields at a point x are determined by properties of the fields infinitesimally close to the point x. For most wave fields (for example, sound waves and the vibrations of strings and membranes) such a description is an idealization which is valid for distances larger than the characteristic length which measures the granularity of the medium. For smaller distances these theories are modified in a profound way. The electromagnetic field is a notable exception. Indeed, until the special theory of relativity obviated the necessity of a mechanistic interpretation, physicists made great efforts to discover evidence for such a mechanical description of the radiation field. After the requirement of an “ether” which propagates light waves had been abandoned, there was considerably less difficulty in accepting this same idea when the observed wave properties of the electron suggested the introduction of a new field. Indeed there is no evidence of an ether which underlies the electron wave. However, it is a gross and profound extrapolation of present experimental knowledge to assume that a wave description successful at “large” distances (that is, atomic lengths ≈10⁻⁸ cm) may be extended to distances an indefinite number of orders of magnitude smaller (for example, to less than nuclear lengths ≈10⁻¹³ cm). In the relativistic theory, we have seen that the assumption that the field description is correct in arbitrarily small space-time intervals has led—in perturbation theory—to divergent expressions for the electron self-energy and the bare charge. Renormalization theory has sidestepped these divergence difficulties, which may be indicative of the failure of the perturbation expansion. However, it is widely felt that the divergences are symptomatic of a chronic disorder in the small-distance behaviour of the theory. We might then ask why local field theories, that is, theories of fields which can be described by differential laws of wave propagation, have been so extensively used and accepted. There are several reasons, including the important one that with their aid a significant region of agreement with observations has been found. But the foremost reason is brutally simple: there exists no convincing form of a theory which avoids differential field equations.
|
https://en.wikipedia.org/wiki?curid=701496
| 1,391,453 |
1,466,242 |
Right after the end of the war, in 1947, the Law on the Serbian Academy of Sciences brought certain changes to its structure: instead of the expert academies, six departments were formed, as well as a certain number of institutes. With the enlarged scope of activities, the need for additional space increased significantly, so the primary goal became the conversion of the entire building in Knez Mihailova Street into office space, which required an extensive adaptation. The design was assigned to the architect Grigorij Samojlov, who, together with the architect Đorđe Smiljanić, carried out the transformation of the inner part of the building, at the same time completing one of his most important interior designs. Samojlov showed extraordinary skill in reshaping the existing building into an almost compact academic complex with a central atrium: he formed a two-tract office system and eliminated the passages, except for the central one, which was partially redesigned into the main entrance hall, while the ground floor kept its commercial character. The creation of the entrance from Knez Mihailova Street and the design of the access to the conference hall contributed to the realization of a representative space. According to the new concept, Samojlov designed the exterior in a modernized academic style with a purified, geometrized decorative repertoire. At the same time, the Congress Hall was adapted, gaining a gallery and, in the arched niches, two paintings: "The Science" by Petar Lubarda and "The Art" by Milo Milunović. Along with the adaptation, certain changes were made to the facade itself: the glass marquise was removed from the mezzanine, the mezzanine windows and the ground-floor shop windows were changed, a decorative dome was reconstructed, and a cornice and all decorative elements were removed.
|
https://en.wikipedia.org/wiki?curid=4705234
| 1,465,419 |
1,096,440 |
"Orientia tsutsugamushi" causes a complex and potentially life-threatening disease known as scrub typhus. Infection starts when chiggers bite on the skin during their feeding. The bacteria are deposited at the site of feeding (inoculation), where they multiply. They cause progressive tissue damage (necrosis), which leads to formation of an eschar on the skin. Necrosis progresses to inflammation of the blood vessels, called vasculitis. This in turn causes inflammation of the lymph nodes, called lymphadenopathy. Within a few days, vasculitis extends to various organs including the liver, brain, kidney, meninges and lungs. The disease is responsible for nearly a quarter of all the febrile (high fever) illness in endemic areas. Mortality in severe cases or due to improper treatment or misdiagnosis may be as high as 30–70%. About 6% of infected people die untreated, and 1.4% of the patients die even with medical treatment. Moreover, the death rate can be as high as 14% with neurological problems and 24% with multi-organ dysfunction among treated patients. In cases of misdiagnosis and failure of treatment, systemic complications rapidly develop including acute respiratory distress syndrome, acute kidney failure, encephalitis, gastrointestinal bleeding, hepatitis, meningitis, myocarditis, pancreatitis, pneumonia, septic shock, subacute thyroiditis, and multiple organ dysfunction syndrome. Harmful effects involving multiple organ failure and neurological impairment are difficult to treat, and can cause lifelong debilitation or be directly fatal. The central nervous system is often affected and results in various complications including cerebellitis, cranial nerve palsies, meningoencephalitis, plexopathy, transverse myelitis, and Guillan-Barré syndrome. Death rates due to complications can be up to 14% in brain infections, and 24% with multiple organ failure. In India, scrub typhus has become the major cause of acute encephalitis syndrome, which was earlier caused mainly by a viral infection, Japanese encephalitis.
|
https://en.wikipedia.org/wiki?curid=12300557
| 1,095,880 |
272,687 |
It remains unclear who first discovered the organism. David Douglas Cunningham, Surgeon Major of the British Indian army, may have seen it in 1885 without being able to relate it to the disease. Peter Borovsky, a Russian military surgeon working in Tashkent, conducted research into the etiology of "oriental sore", locally known as "sart" sore, and in 1898 published the first accurate description of the causative agent, correctly described the parasite's relation to host tissues and correctly referred it to the protozoa. However, because his results were published in Russian in a journal with low circulation, his results were not internationally acknowledged during his lifetime. In 1901, William Boog Leishman identified certain organisms in smears taken from the spleen of a patient who had died from "dum-dum fever" (Dum Dum is an area close to Calcutta) and proposed them to be trypanosomes, found for the first time in India. A few months later, Captain Charles Donovan (1863–1951) confirmed the finding of what became known as Leishman-Donovan bodies in smears taken from people in Madras in southern India. But it was Ronald Ross who proposed that Leishman-Donovan bodies were the intracellular stages of a new parasite, which he named "Leishmania donovani". The link with the disease "kala-azar" was first suggested by Charles Donovan, and was conclusively demonstrated by Charles Bentley's discovery of "L. donovani" in patients with "kala-azar". Transmission by the sandfly was hypothesized by Lionel Napier and Ernest Struthers at the School of Tropical Medicine at Calcutta and later proven by his colleagues. The disease became a major problem for Allied troops fighting in Sicily during the Second World War; research by Leonard Goodwin then showed pentostam was an effective treatment.
|
https://en.wikipedia.org/wiki?curid=198491
| 272,539 |
1,820,377 |
In later years Forrester became Professor of Medicine at the University of California, Los Angeles, School of Medicine, the George Burns and Gracie Allen Professor of Cardiovascular Research, and Chief of the Division of Cardiology at Cedars-Sinai Medical Center. In these roles, he served as mentor for several hundred cardiologists, a number of whom are currently leaders in cardiovascular medicine. Forrester has published over 400 full-length scientific manuscripts and book chapters dealing with these topics. In 2009, he was the second-ever recipient of the American College of Cardiology's highest honor, the Lifetime Achievement Award. Additional honors include Cable News Network's 10 Medical Scientists to Watch (1989), the Leon Goodman Award for excellence in laser research, the Distinguished Scientific Achievement Award of the American Heart Association in Los Angeles, the Jan Kellerman Award for research in preventive cardiology, and the 2011 Simon Dack Award for Outstanding Scholarship from the Journal of the American College of Cardiology. The following year he was the annual awardee for the Camp Hill High School Wall of Honor. In 2013 he received the annual Pioneer of Medicine Award from Cedars-Sinai Medical Center. At the award ceremony, his colleagues created a 10-minute video describing his career and his contributions to their own careers. In 2014 he was chosen to deliver the lecture celebrating the American College of Cardiology's 65th Anniversary. In "1949-2014: 65 Years of Cardiovascular History", he described the people and events that changed the care of heart disease in his lifetime. He is the author of "The Heart Healers: The Misfits, Mavericks, and Rebels Who Created the Greatest Medical Breakthrough of Our Lives", published in September 2015 by St. Martin's Press. The book, which appeared as an Amazon best seller in its category, is a historical memoir of his personal relationships with the pioneers who created heart surgery, defibrillators, pacemakers, coronary care units, heart imaging, and angioplasty, and tells stories about the emotional impact of these lifesaving advances on his individual patients. In 2019 he received the annual Cedars-Sinai Medical Center Lifetime Achievement Award.
|
https://en.wikipedia.org/wiki?curid=44523992
| 1,819,340 |
1,173,564 |
As of 2012, there are devices built for medical professionals to analyze specific diseases or take specific health measurements, but there is no single all-purpose consumer device to diagnose a variety of conditions. Numerous accounts speculate that the advent of high-power computer chips, cell-phone technology, and improved scanners means that such a device will likely be invented in the next few years. There are devices now which can perform a single-function analysis, such as a thermometer measuring body temperature, but the idea of a medical tricorder is that it should be able to perform a variety of basic yet important tasks. For example, it may be possible to combine a high-power microscope with a cellphone and use it to analyze swab samples electronically. Two electrodes on a device may measure heart action and serve as a portable electrocardiogram. Glucose levels can be measured from tiny blood samples. It may analyze polarized light coming from a person's skin to reveal information about cancer or the healing of a wound. Sensors may pick up on abnormalities with DNA as well as the presence of antibodies. An ultrasonic probe can plug into a smartphone, allowing it to be used to create ultrasound images. Medical tricorders may work by sensing "volatile organic compounds our bodies secrete" by some means of smell. A second report confirms that sensitive electronic "noses" may detect infections such as pneumonia from a person's exhaled breath.
|
https://en.wikipedia.org/wiki?curid=37804721
| 1,172,944 |
1,835,621 |
Development of CICE began in 1994 by Elizabeth Hunke at Los Alamos National Laboratory (LANL). Since its initial release in 1998 following development of the Elastic-Viscous-Plastic (EVP) sea ice rheology within the model, it has been substantially developed by an international community of model users and developers. Enthalpy-conserving thermodynamics and improvements to the sea ice thickness distribution were added to the model between 1998 and 2005. The first institutional user outside of LANL was Naval Postgraduate School in the late-1990s, where it was subsequently incorporated into the Regional Arctic System Model (RASM) in 2011. The National Center for Atmospheric Research (NCAR) was the first to incorporate CICE into a global climate model in 2002, and developers of the NCAR Community Earth System Model (CESM) have continued to contribute to CICE innovations and have used it to investigate polar variability in Earth's climate system. The United States Navy began using CICE shortly after 2000 for polar research and sea ice forecasting and it continues to do so today. Since 2000, CICE development or coupling to oceanic and atmospheric models for weather and climate prediction has occurred at the University of Reading, University College London, the U.K. Met Office Hadley Centre, Environment and Climate Change Canada, the Danish Meteorological Institute, the Commonwealth Science and Industrial Research Organisation, and Beijing Normal University, among other institutions. As a result of model development in the global community of CICE users, the model's computer code now includes a comprehensive saline ice physics and biogeochemistry library that incorporates mushy-layer thermodynamics, anisotropic continuum mechanics, Delta-Eddington radiative transfer, melt-pond physics and land-fast ice. CICE version 6 is open-source software and was released in 2018 on GitHub.
|
https://en.wikipedia.org/wiki?curid=58881336
| 1,834,572 |
1,868,220 |
The AVU relied heavily on World Bank grants and did not become self-financing as it had originally planned to do. For this reason, the AVU changed into an intergovernmental organization in 2003 via an intergovernmental charter signed by 15 SSA countries. Beyond its failure to meet initial expectations, critics argue that the AVU continues to distribute educational services to the African continent inefficiently. Their main critique is that higher education enrollment has increased from 200,000 in 1970 to 4.5 million in 2008, yet the AVU claims to have trained only 63,823 students since its inception in 1997. Additionally, others have pointed out the lack of training received by AVU administrators, in a study of Kenya specifically. These administrators were unable to use the digital equipment issued by the World Bank, which significantly stunted the program's progress. Moreover, the AVU did not set its own admission requirements for students, but rather relied on the existing requirements of overseas universities. However, the biggest obstacle to the AVU's success was the lack of infrastructure on the African continent. Since the program was administered through telephone lines that transmitted a satellite signal connecting Western partner universities to AVU-partnering universities on the African continent, the program had a difficult time spreading its resources. Lastly, the cost of improving the continent's technological infrastructure pushed the cost of AVU programs beyond that of previous programs provided by local institutions. In 2002, a year-long AVU course at Kenyatta University in Kenya cost US$6,400, which was nearly twice as much as class costs at local institutions. Not only did the AVU raise costs beyond those of similar programs provided locally, but it also competed with local universities. Nevertheless, others advocate for the AVU. The African continent is now a dynamic e-learning environment, and some attribute this success to the World Bank's initial AVU project. Overall, the continent showed a 15.2% increase in the self-paced growth rate of e-learning from 2011 to 2016, earning nearly $250 million from e-learning in 2011. Additionally, this increase in e-learning is largely responsible for the spread of fiber-optic cables across the continent.
|
https://en.wikipedia.org/wiki?curid=37796710
| 1,867,144 |
1,002,369 |
US Army M728A1s were deployed in December 1995 in support of the United Nations-mandated, NATO-led Implementation Force (IFOR). Their initial mission was to assist in protecting and demining the international airport at Sarajevo. By September 1996 their mission had expanded to include road clearance, bunker demolition, protecting humanitarian aid and assisting relief delivery in the whole of Bosnia and Herzegovina, as well as helping to protect civilian refugees when required by the Red Cross. Task Force Eagle, consisting of elements of the 1st Armored Division and supporting elements from the U.S. V Corps, joined by forces from twelve other nations, assumed control of its area of responsibility during a ceremony with United Nations forces at Eagle Base in Tuzla on December 20. During the campaign in Bosnia, at least three M60 Panther MCDVs were used in conjunction with the M728. The Panther would lead the convoy, followed by the M728. The Panther operator would control the vehicle from the M728 via a remote control system during road-clearing operations. A Closed Circuit Television (CCTV) camera system was attached to the front of the Panther so the remote operator could see where the tank was going through a screen on the remote control unit. The radio control signal was received by a long antenna protruding from the engine deck. The M728 also provided a good secondary clearing action by use of its bulldozer blade as it followed the Panther: it would skim the trail cleared by the Panther, pushing away debris, keeping the route clear for following vehicles and smoothing out the road surface, and could be used for filling in craters left by any exploding mines or ordnance. The CEV was also useful for quickly recovering the Panther should it become stuck, and its crane allowed easy loading and unloading of the mine roller onto transport vehicles. The 1st Armored Division was relieved by the 1st Infantry Division and returned to Germany in November 1996.
|
https://en.wikipedia.org/wiki?curid=559429
| 1,001,851 |
152,575 |
A contract for two aircraft, now designated the XP-89, and a full-scale mock-up was approved on 13 June, although construction of the mock-up had begun immediately after the USAAF announced that the N-24 had been selected. It was inspected on 25 September and the USAAF had some reservations. The inspectors believed that the radar operator needed to be moved forward, closer to the pilot, with both crewmen under a single canopy, the magnesium alloy components of the wing replaced by aluminum alloy, and the fuel tankage directly above the engines moved. Other changes had to be made as wind tunnel and other aerodynamic tests were conducted. The swept wings proved to be less satisfactory at low speeds, and a thin straight wing was selected instead. Delivery of the first prototype was scheduled for November 1947, 14 months after the inspection. The position of the horizontal stabilizer also proved to be unsatisfactory, as it was affected by the engine exhaust, and it would be "blanked-out" by airflow from the wing at high angles of attack. It was moved halfway up the tail, but its position flush with the leading edge of the vertical stabilizer proved to cause extra drag through turbulence and reduced the effectiveness of the elevators and rudder. Moving the horizontal stabilizer forward solved the problem. Another major change occurred when USAAF revised its specification to delete the rear gun installation on 8 October. Another inspection of the mock-up was held on 17 December, and the inspectors suggested only minor changes, even though the fuselage fuel tanks were still above the engines. Northrop's efforts to protect the fuel tanks were considered sufficient, as the only alternative was to redesign the entire aircraft.
|
https://en.wikipedia.org/wiki?curid=458875
| 152,507 |
455,154 |
All the units are equipped with emergency response systems, which can prevent the release of radioactive material into the environment even in the case of a serious accident, for example a breakage of pipes in the reactor cooling circuit. The reactor cooling circuit is housed in hermetic reinforced concrete boxes that can withstand a pressure of . Also part of the emergency response system are special steam condensers which have a 3000 m2 capacity. Smolensk NPP has special systems that are able to remove heat from the reactor even in the case of the plant's complete loss of electrical power. The environmental radiation control sub-system conducts round-the-clock monitoring of a 30-km zone around the station with the help of sensors installed all over the area, which can detect abnormal radioactivity levels in water, soil, plants and agricultural produce consumed by the local population. The findings of independent experts from the State Sanitary Inspection and the State Hydro-meteorological Committee say that the radiation background at Smolensk Nuclear Power Plant and in the nearby area is consistent with natural background radiation. Hence, Smolensk NPP is an ecologically friendly facility having no significant radiological or chemical effect on the population and the environment. The key goal set by the Smolensk NPP is to reduce the risk of incidents that may have a negative effect on the personnel and the environment. All the measures provided for by the safety enhancement concept of "Rosenergoatom" Concern are aimed at reducing the risk of incidents at the REA NPPs. The safety declaration of Smolensk NPP says that the plant seeks to become the safest NPP in Russia.
|
https://en.wikipedia.org/wiki?curid=20003674
| 454,932 |
1,044,341 |
Jean-Baptiste Lamarck's 1809 evolutionary theory, transmutation of species, was based on a progressive (orthogenetic) drive toward greater complexity. Lamarck also shared the belief, common at the time, that characteristics acquired during an organism's life could be inherited by the next generation, producing adaptation to the environment. Such characteristics were caused by the use or disuse of the affected part of the body. This minor component of Lamarck's theory became known, much later, as Lamarckism. Darwin included "Effects of the increased Use and Disuse of Parts, as controlled by Natural Selection" in "On the Origin of Species", giving examples such as large ground feeding birds getting stronger legs through exercise, and weaker wings from not flying until, like the ostrich, they could not fly at all. In the late 19th century, neo-Lamarckism was supported by the German biologist Ernst Haeckel, the American paleontologists Edward Drinker Cope and Alpheus Hyatt, and the American entomologist Alpheus Packard. Butler and Cope believed that this allowed organisms to effectively drive their own evolution. Packard argued that the loss of vision in the blind cave insects he studied was best explained through a Lamarckian process of atrophy through disuse combined with inheritance of acquired characteristics. Meanwhile, the English botanist George Henslow studied how environmental stress affected the development of plants, and he wrote that the variations induced by such environmental factors could largely explain evolution; he did not see the need to demonstrate that such variations could actually be inherited. Critics pointed out that there was no solid evidence for the inheritance of acquired characteristics. Instead, the experimental work of the German biologist August Weismann resulted in the germ plasm theory of inheritance, which Weismann said made the inheritance of acquired characteristics impossible, since the Weismann barrier would prevent any changes that occurred to the body after birth from being inherited by the next generation.
|
https://en.wikipedia.org/wiki?curid=53955838
| 1,043,797 |
954,959 |
Georges Cuvier's analysis of fossils and discovery of extinction disrupted static views of nature in the early 19th century, confirming geology as showing a historical sequence of life. British natural theology, which sought examples of adaptation to show design by a benevolent Creator, adopted catastrophism to show earlier organisms being replaced in a series of creations by new organisms better adapted to a changed environment. Charles Lyell (1797–1875) also saw adaptation to changing environments as a sign of a benevolent Creator, but his uniformitarianism envisaged continuing extinctions, leaving unanswered the problem of providing replacements. As seen in correspondence between Lyell and John Herschel, scientists were looking for creation by laws rather than by miraculous interventions. In continental Europe, the idealism of philosophers including Lorenz Oken (1779–1851) developed a "Naturphilosophie" in which patterns of development from archetypes were a purposeful divine plan aimed at forming humanity. These scientists rejected transmutation of species as materialist radicalism threatening the established hierarchies of society. The idealist Louis Agassiz (1807–1873), a persistent opponent of transmutation, saw mankind as the goal of a sequence of creations, but his concepts were the first to be adapted into a scheme of theistic evolutionism, when in "Vestiges of the Natural History of Creation" published in 1844, its anonymous author (Robert Chambers) set out goal-centred progressive development as the Creator's divine plan, programmed to unfold without direct intervention or miracles. The book became a best-seller and popularised the idea of transmutation in a designed "law of progression". The scientific establishment strongly attacked "Vestiges" at the time, but later more sophisticated theistic evolutionists followed the same approach of looking for patterns of development as evidence of design.
|
https://en.wikipedia.org/wiki?curid=328815
| 954,454 |
616,508 |
During April and again in June 1993, Coast Guard Forces St. Louis (CGF) was activated for flooding in the Mississippi, Missouri and Illinois River basins. The '500 year' flooding closed over of river to navigation and claimed 47 lives. Historic levels of rainfall in the river tributaries caused many levee breaks along the Missouri and Mississippi Rivers, displacing thousands of people from their homes and businesses. The commander of CGF St. Louis set into motion a preconceived operations plan to deal with the many requests from state and local governments for law enforcement assistance, help with sandbagging, water rescues, evacuation of flood victims, and aerial surveillance of levee conditions. The unprecedented duration of the flood also caused Coast Guard personnel to assume some humanitarian services not normally a part of flood operations. Food, water and sandbags were transported to work sites to assist sandbagging efforts by local governments. Red Cross and Salvation Army relief workers were given transportation assistance. Many homeless animals displaced by the flood waters were rescued and turned over to local animal shelters. Utility repair crews were assisted with transportation of personnel and repair parts. Disaster Response Units (DRUs) were formed from active duty and reserve units throughout the Second Coast Guard District; each consisted of eight members equipped with three 16-foot flood punts powered by 25-horsepower outboard motors. The DRUs accounted for 1,517 boat sorties and 3,342 hours of underway operations. Coast Guard helicopters from CG Air Stations in Traverse City and Detroit, Michigan; Chicago, Illinois; Elizabeth City, North Carolina; and Mobile, Alabama provided search and rescue, logistical support and aerial survey intelligence. The Coast Guard Auxiliary provided three fixed-wing aircraft. There were 473 aircraft sorties with 570 hours of aircraft operations. CGF St. Louis stood down from the alert phase of operations on 27 August. A total of 380 Active Duty, 352 Reserve, 179 Auxiliary, and 5 Coast Guard civilians were involved in the operation.
|
https://en.wikipedia.org/wiki?curid=6204230
| 616,194 |
1,857,055 |
Attenborough starts the second programme by looking at potential future events, before warning that what happens over the next few years is crucial. A BBC weather forecast for the year 2050 shows that summer temperatures of 38 °C for the UK are "par for the course". The probable range by which the planet will warm over the next century is between 1.4 °C and 5.8 °C. Or, says Attenborough, "to put it another way, the impact of global warming will be somewhere between severe and catastrophic." The naturalist is invited to watch a film that illustrates regional change over the next 100 years. A 2 °C rise for the south of England, for example, may not seem to be much but that is not all there is to it. Rainfall is also predicted to be more intense and storms could be five times more frequent than they are at the moment. This makes extreme events, such as the 2004 Boscastle flood, much more likely. Current defences for severe wind or rain will shortly become inadequate. Even Hurricane Katrina, with the devastation it caused, is described as "not particularly powerful". In Australia, a new approach is needed to combat brush fires after the hottest year on record. If the Amazon tropical rainforest were to disappear, not only would an entire ecosystem vanish, but a valuable way of cooling the planet would go as well. Meanwhile, the glaciers continue to melt: one scientist reveals that an area the size of Texas has been lost over the last 20 years. Attenborough is told that a warming of 2 °C is inevitable, as a consequence of our actions over the last 25 years, but whether or not we end up at 6 °C is still very much within our control.
|
https://en.wikipedia.org/wiki?curid=7564445
| 1,855,987 |
133,331 |
In terms of applications, a massive number of new technologies were developed in the 20th century. Technologies such as electricity, the incandescent light bulb, the automobile and the phonograph, first developed at the end of the 19th century, were perfected and universally deployed. The first car was introduced by Karl Benz in 1885. The first airplane flight occurred in 1903, and by the end of the century airliners flew thousands of miles in a matter of hours. The development of the radio, television and computers caused massive changes in the dissemination of information. Advances in biology also led to large increases in food production, as well as the elimination of diseases such as polio by Dr. Jonas Salk. Gene mapping and gene sequencing, invented by Drs. Mark Skolnik and Walter Gilbert, respectively, are the two technologies that made the Human Genome Project feasible. Computer science, built upon a foundation of theoretical linguistics, discrete mathematics, and electrical engineering, studies the nature and limits of computation. Subfields include computability, computational complexity, database design, computer networking, artificial intelligence, and the design of computer hardware. One area in which advances in computing have contributed to more general scientific development is by facilitating large-scale archiving of scientific data. Contemporary computer science typically distinguishes itself by emphasizing mathematical 'theory' in contrast to the practical emphasis of software engineering.
|
https://en.wikipedia.org/wiki?curid=14400
| 133,278 |
736,883 |
Immersive technology has grown immensely in the past few decades, and is continuing to progress. VR has even been described as the learning aid of the 21st century. Head-mounted displays (HMDs) are what allow users to get the full immersive experience. The HMD market is expected to be worth over 25 billion USD by the year 2022. The technologies of VR and AR received a boost in attention when Mark Zuckerberg, founder of Facebook, bought Oculus for 2 billion USD in 2014. Recently, the Oculus Quest was released, which is wireless and allows users to move more freely. It costs around 400 USD, roughly the same price as the previous generation of tethered headsets. Other massive corporations, such as Sony, Samsung and HTC, are also making huge investments in VR/AR. With regard to education, many researchers are currently exploring the benefits and applications of virtual reality in the classroom. However, little systematic work currently exists regarding how researchers have applied immersive VR for higher education purposes using HMDs. The most popular use of immersive technology comes in the world of videogames. By completely immersing users in their favorite game, HMDs have allowed individuals to experience the realm of videogames in an entirely new light. Current videogames such as Star Wars: Squadrons, Half-Life: Alyx, and No Man's Sky give users the ability to experience every aspect of the digital world in their game. While there is still a lot to learn about immersive technology and what it has to offer, it has come a long way from its beginnings in the early 1800s.
|
https://en.wikipedia.org/wiki?curid=10499965
| 736,495 |
2,199,946 |
An only child, Dallos was born in 1934 in Budapest, Hungary. He attended the Technical University of Budapest from 1953 to 1956, majoring in electrical engineering. After participating in the 1956 anti-Soviet revolution, he escaped and immigrated to the United States. He finished his undergraduate work at the Illinois Institute of Technology (1958), followed by MS (1959) and Ph.D (1962) degrees from Northwestern University. He was one of the first doctoral students to specialize in biomedical engineering (adviser R.W. Jones) under the aegis of the Electrical Engineering department. His thesis work on modeling predictive eye movements is still being cited... Upon completing his degree, he accepted a position with Raymond Carhart in Audiology at Northwestern and became a full professor seven years later. His entire faculty career, which spanned fifty years, was at Northwestern University. In 1977-78 he spent a sabbatical year at the Karolinska Institutet, Stockholm, Sweden, working with Åke Flock. In 1991 he was recruited to be the founding chair of the new Department of Neurobiology and Physiology. Later he served terms as Associate Dean in the College of Arts and Sciences and as Vice President for Research. He was the founding Editor-in-Chief of the journal Auditory Neuroscience (1994–97), served on the Council of Neurology Institute of the NIH (1984–87) and was President of the Association for Research in Otolaryngology (ARO; 1992–93), while also serving on numerous other advisory committees and boards and holding various editorships.
|
https://en.wikipedia.org/wiki?curid=47027926
| 2,198,694 |
2,057,624 |
David Anderson (born August 4, 1952) is a former college professor. He was trained as a literary historian at Princeton University (Ph.D. 1980), where he studied with D. W. Robertson, and at universities in Italy (beginning with support from a Fulbright Fellowship, 1978), especially the scholarly circle around Giuseppe Billanvich at the Catholic University of Milan. It was as a visiting scholar at Milan that he completed his study of the post-classical interpretations of Statius' epic poem "Thebaid" and their influence on Boccaccio and Chaucer, published as "Before the Knight's Tale" (University of Pennsylvania Press, 1988), and identified in scattered manuscript sources a previously unrecognized commentary that was shown to be present in Boccaccio's library and used by that author in his works: "Boccaccio's Glosses on Statius," "Studi sul Boccaccio (1996)". At the University of Pennsylvania (1980–1988), he was the first male member of the faculty to take an extended parenting leave. At the time, the request was unusual, and his act occasioned public discussion of gender bias in employment policies and was the subject of a front-page article in the "Daily Pennsylvanian," "Mr. Mom: Paternity Leave Allows English Prof to Raise Son". In 1986 he curated an exhibition and catalogue of manuscripts and early printed books illustrating Chaucer's works and their cultural influences, "Sixty Books Old and New" (New Chaucer Society, 1986). The exhibition was held at the University of Pennsylvania and the Rosenbach Foundation and was supported by several Philadelphia-area charitable organizations, which made possible the distribution of copies of the catalogue to English teachers in the Philadelphia public school district. Anderson's other publications include "Pound's Cavalcanti" (Princeton University Press, 1983), which remains "invaluable" to Ezra Pound scholars.
|
https://en.wikipedia.org/wiki?curid=51095438
| 2,056,439 |
746,666 |
NDT methods rely upon use of electromagnetic radiation, sound and other signal conversions to examine a wide variety of articles (metallic and non-metallic, food-product, artifacts and antiquities, infrastructure) for integrity, composition, or condition with no alteration of the article undergoing examination. Visual inspection (VT), the most commonly applied NDT method, is quite often enhanced by the use of magnification, borescopes, cameras, or other optical arrangements for direct or remote viewing. The internal structure of a sample can be examined for a volumetric inspection with penetrating radiation (RT), such as X-rays, neutrons or gamma radiation. Sound waves are utilized in the case of ultrasonic testing (UT), another volumetric NDT method – the mechanical signal (sound) being reflected by conditions in the test article and evaluated for amplitude and distance from the search unit (transducer). Another commonly used NDT method used on ferrous materials involves the application of fine iron particles (either suspended in liquid or dry powder – fluorescent or colored) that are applied to a part while it is magnetized, either continually or residually. The particles will be attracted to leakage fields of magnetism on or in the test object, and form indications (particle collection) on the object's surface, which are evaluated visually. Contrast and probability of detection for a visual examination by the unaided eye is often enhanced by using liquids to penetrate the test article surface, allowing for visualization of flaws or other surface conditions. This method (liquid penetrant testing) (PT) involves using dyes, fluorescent or colored (typically red), suspended in fluids and is used for non-magnetic materials, usually metals.
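As an illustration of how a pulse-echo ultrasonic measurement is evaluated for "distance from the search unit", the short Python sketch below converts a round-trip time of flight into reflector depth using an assumed longitudinal sound velocity; the velocity used is a typical figure for steel and is an assumption for illustration, not a value taken from the text.

# Minimal pulse-echo depth estimate for ultrasonic testing (UT).
# Assumes a normal-incidence longitudinal wave; 5920 m/s is a typical
# velocity for carbon steel, used here purely for illustration.
def reflector_depth_mm(time_of_flight_us, velocity_m_per_s=5920.0):
    """Depth of a reflector from round-trip time of flight (microseconds)."""
    one_way_time_s = (time_of_flight_us * 1e-6) / 2.0   # echo travels there and back
    return one_way_time_s * velocity_m_per_s * 1000.0   # metres -> millimetres

if __name__ == "__main__":
    # A 10 microsecond round trip in steel corresponds to roughly 29.6 mm depth.
    print(f"{reflector_depth_mm(10.0):.1f} mm")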
|
https://en.wikipedia.org/wiki?curid=255047
| 746,271 |
576,835 |
Stereotactic radiosurgery was first developed in 1949 by the Swedish neurosurgeon Lars Leksell to treat small targets in the brain that were not amenable to conventional surgery. The initial stereotactic instrument he conceived used probes and electrodes. The first attempt to supplant the electrodes with radiation was made in the early fifties, with x-rays. The principle of this instrument was to hit the intra-cranial target with narrow beams of radiation from multiple directions. The beam paths converge in the target volume, delivering a lethal cumulative dose of radiation there, while limiting the dose to the adjacent healthy tissue. Ten years later significant progress had been made, due in considerable measure to the contribution of the physicists Kurt Liden and Börje Larsson. At this time, stereotactic proton beams had replaced the x-rays. The heavy particle beam presented as an excellent replacement for the surgical knife, but the synchrocyclotron was too clumsy. Leksell proceeded to develop a practical, compact, precise and simple tool which could be handled by the surgeon himself. In 1968 this resulted in the Gamma Knife, which was installed at the Karolinska Institute and consisted of several cobalt-60 radioactive sources placed in a kind of helmet with central channels for irradiation with gamma rays. This prototype was designed to produce slit-like radiation lesions for functional neurosurgical procedures to treat pain, movement disorders, or behavioral disorders that did not respond to conventional treatment. The success of this first unit led to the construction of a second device, containing 179 cobalt-60 sources. This second Gamma Knife unit was designed to produce spherical lesions to treat brain tumors and intracranial arteriovenous malformations (AVMs). Additional units were installed in the 1980s all with 201 cobalt-60 sources.
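A minimal numerical illustration of the converging-beam principle described above: each individual beam delivers only a small dose along its path, but all beams intersect in the target volume, so the cumulative dose there is many times higher than anywhere in the surrounding tissue. The beam count echoes the 201 sources mentioned for later units, while the per-beam dose is an arbitrary illustrative value, not a clinical figure.

# Toy sketch of the converging-beam principle: each beam alone is weak,
# but all beams overlap only at the target. Doses are in arbitrary units.
num_beams = 201          # e.g. the number of cobalt-60 sources in later units
dose_per_beam = 0.1      # illustrative dose contributed by a single beam

dose_at_target = num_beams * dose_per_beam     # all beams intersect here
dose_along_one_path = 1 * dose_per_beam        # healthy tissue crossed by one beam

print(f"target: {dose_at_target:.1f}, healthy tissue on one beam path: {dose_along_one_path:.1f}")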
|
https://en.wikipedia.org/wiki?curid=1172094
| 576,539 |
935,287 |
The X-31 design was essentially an all-new airframe design, although it borrowed heavily on design elements and sometimes actual parts of previous production, prototype, and conceptual aircraft designs, including the British Aerospace Experimental Airplane Programme (choice of wing type with canards, plus underfuselage intake), the German TKF-90 (wing planform concepts and underfuselage intake), F/A-18 Hornet (forebody, including cockpit, ejection seat, and canopy; electrical generators), F-16 Fighting Falcon (landing gear, fuel pump, rudder pedals, nosewheel tires, and emergency power unit), F-16XL (leading-edge flap drives), V-22 Osprey (control surface actuators), Cessna Citation (main landing gear's wheels and brakes), F-20 Tigershark (hydrazine emergency air-start system, later replaced) and B-1 Lancer (spindles from its control vanes used for the canards). This was done on purpose, so that development time and risk would be reduced by using flight-qualified components. To reduce the cost of tooling for a production run of only two aircraft, Rockwell developed the "fly-away tooling" concept (perhaps the most successful spinoff of the program), whereby 15 fuselage frames were manufactured via CNC, tied together with a holding fixture, and attached to the factory floor with survey equipment. That assembly then became the tooling for the plane, which was built around it, thus "flying away" with its own tooling.
|
https://en.wikipedia.org/wiki?curid=579826
| 934,793 |
551,235 |
Substantial evidence in Tibetan highlanders suggests that variation in haemoglobin and blood-oxygen levels is adaptive in terms of Darwinian fitness. It has been documented that Tibetan women with a high likelihood of possessing one or two alleles for high blood-oxygen content (which is rare in other women) had more surviving children; the higher the oxygen capacity, the lower the infant mortality. In 2010, for the first time, the genes responsible for the unique adaptive traits were identified following genome sequencing of 50 Tibetans and 40 Han Chinese from Beijing. Initially, the strongest signal of natural selection detected was in the gene for a transcription factor involved in the response to hypoxia, called endothelial Per-Arnt-Sim (PAS) domain protein 1 ("EPAS1"). It was found that one single-nucleotide polymorphism (SNP) at "EPAS1" shows a 78% frequency difference between Tibetan and mainland Chinese samples, representing the fastest genetic change observed in any human gene to date. Tibetan adaptation to high altitude is thus the fastest process of phenotypically observable evolution in humans, estimated to have occurred a few thousand years ago, when the Tibetans split from the mainland Chinese population. The time of genetic divergence has been variously estimated as 2,750 (original estimate), 4,725, 8,000, or 9,000 years ago. Mutations in "EPAS1", at higher frequency in Tibetans than in their Han neighbours, correlate with decreased haemoglobin concentrations among the Tibetans, which is the hallmark of their adaptation to hypoxia. Simultaneously, two genes, egl nine homolog 1 ("EGLN1") (which inhibits haemoglobin production under high oxygen concentration) and peroxisome proliferator-activated receptor alpha ("PPARA"), were also identified as positively selected in relation to the decreased haemoglobin levels of the Tibetans.
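For readers unfamiliar with how an allele "frequency difference" such as the 78% figure is computed, the following Python sketch derives allele frequencies from genotype counts and takes their difference; the genotype counts shown are hypothetical placeholders, not the published Tibetan or Han data.

# Illustrative allele-frequency-difference calculation for a single SNP.
# Genotype counts below are made-up placeholders for demonstration only.
def allele_frequency(hom_ref, het, hom_alt):
    """Frequency of the alternate allele from genotype counts."""
    n_alleles = 2 * (hom_ref + het + hom_alt)          # each person carries two alleles
    return (het + 2 * hom_alt) / n_alleles

tibetan_freq = allele_frequency(hom_ref=2, het=10, hom_alt=38)   # ~0.86 (hypothetical)
han_freq = allele_frequency(hom_ref=33, het=6, hom_alt=1)        # ~0.10 (hypothetical)
print(f"frequency difference: {tibetan_freq - han_freq:.2f}")    # ~0.76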
|
https://en.wikipedia.org/wiki?curid=39127332
| 550,947 |
87,308 |
Jenner sent a paper reporting his observations to the Royal Society in April 1797. It was not submitted formally and there is no mention of it in the Society's records. Jenner had sent the paper informally to Sir Joseph Banks, the Society's president, who asked Everard Home for his views. Reviews of his rejected report, published for the first time in 1999, were skeptical and called for further vaccinations. Additional vaccinations were performed and in 1798 Jenner published his work entitled "An Inquiry into the Causes and Effects of the Variolae Vaccinae, a disease discovered in some of the western counties of England, particularly Gloucestershire and Known by the Name of Cow Pox." It was an analysis of 23 cases including several individuals who had resisted natural exposure after previous cowpox. It is not known how many Jenner vaccinated or challenged by inoculation with smallpox virus; e.g. Case 21 included 'several children and adults'. Crucially all of at least four whom Jenner deliberately inoculated with smallpox virus resisted it. These included the first and last patients in a series of arm-to-arm transfers. He concluded that cowpox inoculation was a safe alternative to smallpox inoculation, but rashly claimed that the protective effect was lifelong. This last proved to be incorrect. Jenner also tried to distinguish between 'True' cowpox which produced the desired result and 'Spurious' cowpox which was ineffective and/or produced severe reaction. Modern research suggests Jenner was trying to distinguish between effects caused by what would now be recognised as non-infectious vaccine, a different virus (e.g. paravaccinia/milker's nodes), or contaminating bacterial pathogens. This caused confusion at the time, but would become important criteria in vaccine development. A further source of confusion was Jenner's belief that fully effective vaccine obtained from cows originated in an equine disease, which he mistakenly referred to as "grease". This was criticised at the time but vaccines derived from horsepox were soon introduced and later contributed to the complicated problem of the origin of vaccinia virus, the virus in present-day vaccine.
|
https://en.wikipedia.org/wiki?curid=61088
| 87,273 |
209,649 |
After World War II, the hypothesis that Peking Man inhabited the cave once again became the mainstay, modeled around Jiǎ's 1975 book "The Cave Home of Peking Man". In 1985, American archaeologist Lewis Binford and Chinese palaeoanthropologist Ho Chuan Kun instead hypothesised that Zhoukoudian was a "trap" which humans and animals fell into. They further proposed deer remains, earlier assumed to have been Peking Man's prey, were, in fact, predominantly carried in by the giant hyaena "Pachycrocuta", and ash was deposited by naturally occurring wildfires, fueled by bat guano, as they did not believe any human species had yet mastered hunting or fire at this time. In 2001, American geologist Paul Goldberg, Israeli archaeologist Steve Weiner, and colleagues determined Layer 4 was primarily deposited with loess (wind-blown dust), and Layer 3 with travertine (water-lain limestone). They also concluded that supposed evidence of fire is actually a result of completely different depositional circumstances related to water. In 2000, American anthropologist Noel T. Boaz and colleagues argued the state of the bones is consistent with general hyena biting, gnawing, and bone-crunching, and suggested "Pachycrocuta", the largest known hyena to have ever lived, was more than capable of splitting robust bones, contrary to Weidenreich. They identified bite marks on 67% of the Peking Man fossils (28 specimens), and attributed this and all other perimortem (around the time of death) damage to hyenas. Boaz and colleagues conceded that stone tools must indicate human activity in (or at least near) the cave, but, with few exceptions, tools were randomly scattered across the layers (as mentioned by several previous scientists), which Goldberg and colleagues ascribed to bioturbation . This means that the distribution of the tools gives no indication of the duration of human habitation. In 2016, Shuangquan Zhang and colleagues were unable to detect significant evidence of animal, human, or water damage to the few deer bones collected from Layer 3, and concluded they simply fell into the cave from above. They noted taphonomic debates are nonetheless still ongoing. Indeed, the fire debate is still heated, with Chinese palaeoanthropologist Xing Gao and colleagues declaring "clear-cut evidence for intentional fire use" in 2017.
|
https://en.wikipedia.org/wiki?curid=253340
| 209,542 |
983,085 |
At the 2014 Davos meeting, Thomas Friedman reported that the link between technology and unemployment seemed to have been the dominant theme of that year's discussions. A survey at Davos 2014 found that 80% of 147 respondents agreed that technology was driving jobless growth. At the 2015 Davos, Gillian Tett found that almost all delegates attending a discussion on inequality and technology expected an increase in inequality over the next five years, and gives the reason for this as the technological displacement of jobs. 2015 saw Martin Ford win the Financial Times and McKinsey Business Book of the Year Award for his "Rise of the Robots: Technology and the Threat of a Jobless Future", and saw the first world summit on technological unemployment, held in New York. In late 2015, further warnings of potential worsening for technological unemployment came from Andy Haldane, the Bank of England's chief economist, and from Ignazio Visco, the governor of the Bank of Italy. In an October 2016 interview, US President Barack Obama said that due to the growth of artificial intelligence, society would be debating "unconditional free money for everyone" within 10 to 20 years. In 2019, computer scientist and artificial intelligence expert Stuart J. Russell stated that "in the long run nearly all current jobs will go away, so we need fairly radical policy changes to prepare for a very different future economy." In a book he authored, Russell claims that "One rapidly emerging picture is that of an economy where far fewer people work because work is unnecessary." However, he predicted that employment in healthcare, home care, and construction would increase.
|
https://en.wikipedia.org/wiki?curid=32040137
| 982,572 |
2,056,098 |
Field trials have proven that the device produces usable images (image analysis is a separate topic covered in the broader literature). The technology is substantially more cost-effective than other existing SPI devices and able to be deployed from small vessels (ca. 5 m) by two persons operating a light frame or davit. Development of the device continues with better penetration geometries and technologies, more hydrodynamic housings, and extra sensor options. Koenig et al. (2001) reviewed some exciting developments in optical sensors (also known as optodes or reactive foils) capable of resolving sub-centimetre oxygen distribution (using the non-consumptive ruthenium fluorescence method) and pH. Very small redox (Eh) probes have also been available for quite some time. Vopel et al. (2003) demonstrated the utility of combining such instruments in studying animal-sediment interactions. These instruments can be integrated into the sediment imager relatively easily and would allow absolute quantification of sediment geochemical profiles at a small number of sites to inform the analysis of the surrounding SP images. Adding UV illumination is only a manufacturing issue. UV capabilities could extend the role of SPI in direct pollution monitoring of harbours or assessing the effects of petrochemical spills. SP image resolution is high enough to permit sediment tracer studies without expensive dyeing if the tracer mineral presents unique colour or fluorescence characteristics.
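Ruthenium-based oxygen optodes of the kind cited above generally work by fluorescence quenching, which to a first approximation follows the Stern–Volmer relation I0/I = 1 + Ksv·[O2]. The Python sketch below inverts that relation to estimate oxygen from a measured intensity ratio; the quenching constant and intensities are assumed calibration values for illustration, not figures given in the text.

# Minimal Stern-Volmer sketch for a ruthenium-based oxygen optode.
# i0 is the fluorescence intensity in the absence of oxygen; ksv is a
# calibration constant -- the default below is an assumed placeholder.
def oxygen_concentration(i0, i, ksv=0.02):
    """Estimate O2 (arbitrary concentration units) from quenched intensity."""
    return (i0 / i - 1.0) / ksv   # inverts I0/I = 1 + Ksv*[O2]

if __name__ == "__main__":
    print(f"{oxygen_concentration(i0=1000.0, i=800.0):.1f}")  # ~12.5 units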
|
https://en.wikipedia.org/wiki?curid=14917968
| 2,054,914 |
1,528,655 |
Because they ride on a cushion of air generated between their wings and the water's surface and do not actually fly as such, surface-effect vehicles (SEVs) are able to sustain payloads approximately three times the weight of those carried by equivalent-sized airplanes. This fact - taken together with their small size, quick acceleration, high speed, close proximity to the water and ability to blend in with small, stationary boats on radar while loitering - makes small SEVs like the Bavar ideal platforms for carrying out asymmetric approaches to conventional naval surface forces, especially at night and within the confines of an area as restricted (and often congested) as the Persian Gulf. The Bavar II exhibits a small radar signature and is therefore difficult to pick up and track, especially while lying passive and motionless, when set against a cluttered backdrop, while merely trolling, and/or at longer ranges. Its reduced cross-section is intended to allow the Bavar to remain undetected while carrying out reconnaissance, patrol and attack missions, but its tellingly quick approach is often given away by its employment of a (radar-reflecting) high propeller and a (heat-emitting) high, exposed engine. For these reasons, future versions are expected to incorporate a smaller and lower, enclosed turbofan, as well as to emphasize the more extensive use of carbon-fibre, facetted surfaces and radar-absorbing paint to further minimize their profiles.
|
https://en.wikipedia.org/wiki?curid=37738505
| 1,527,791 |
2,018,260 |
Schroeder commanded at all levels from platoon through Joint Task Force and has extensive experience in strategic planning, resource management and allocation, and operations research and systems analysis. As the Deputy Commander of the U.S. Army in Europe, Schroeder supervised 14 separate organizations with an operating budget of over $1.5 billion and 29,467 personnel. Schroeder was the principal executive at the Department of the Army responsible for developing options and priorities to govern resource allocation decisions. From 1988 to 1991, he commanded the U.S. Army Engineer School at Ft. Leonard Wood, Missouri, training 32,000 soldiers annually and providing curriculum and professional schooling for Army Engineer officers and non-commissioned officers. As the Army's Program Manager for the development of Fort Drum, he developed the program for, and obtained approval of, the Army's $1.4 billion expansion and modernization of Fort Drum. He conceived innovative and unprecedented approaches to capitalize the facilities, involving the private sector as well as state and federal agencies. In addition to being Chief of Staff of the XVIIIth Airborne Corps and the 24th Infantry Division, his earlier assignments include being a Resident Engineer in Germany, a Special Forces Detachment Commander in Southeast Asia, and a senior staff officer in major commands in the United States and overseas. Schroeder served in Rwanda prior to retiring from the military. Subsequent to retirement from the Army, Schroeder was the Senior Vice President and General Manager for Modeling, Simulation, and Training for LORAL Federal Systems and later Lockheed Martin Information Systems. He is currently a consultant with several American corporations.
|
https://en.wikipedia.org/wiki?curid=46785140
| 2,017,097 |
741,256 |
One of his most important works, the "Missa Papae Marcelli" (Pope Marcellus Mass), has been historically associated with erroneous information involving the Council of Trent. According to this tale (which forms the basis of Hans Pfitzner's opera "Palestrina"), it was composed in order to persuade the Council of Trent that a draconian ban on the polyphonic treatment of text in sacred music (as opposed, that is, to a more directly intelligible homophonic treatment) was unnecessary. However, more recent scholarship shows that this mass was in fact composed before the cardinals convened to discuss the ban (possibly as much as 10 years before). Historical data indicates that the Council of Trent, as an official body, never actually banned any church music and failed to make any ruling or official statement on the subject. These stories originated from the unofficial points of view of some Council attendees who discussed their ideas with those not privy to the Council's deliberations. Those opinions and rumors have, over centuries, been transmuted into fictional accounts, put into print, and often incorrectly taught as historical fact. While Palestrina's compositional motivations are not known, he may have been quite conscious of the need for intelligible text; however, this was not to conform with any doctrine of the Counter-Reformation, because no such doctrine exists. His characteristic style remained consistent from the 1560s until the end of his life. Roche hypothesized that Palestrina's seemingly dispassionate approach to expressive or emotive texts could have resulted from his having to produce many works to order, or from a deliberate decision that any intensity of expression was unbecoming in church music. That hypothesis, however, reflects modern expectations about expressive freedom; it underestimates the extent to which the mood of Palestrina's settings is adapted to the liturgical occasions for which the texts were set, rather than to the line-by-line meaning of the text, and depends on the distinctive characters of the church modes and on variations in vocal grouping for expressive effect. Performing editions and recordings of Palestrina have tended to favour his works in the more familiar modes and standard (SATB) voicings, under-representing the expressive variety of his settings.
|
https://en.wikipedia.org/wiki?curid=12776
| 740,864 |
663,581 |
In 2000, controversial and self-described "transgenic artist" Eduardo Kac appropriated standard laboratory work by biotechnology and genetics researchers in order to both utilize and critique such scientific techniques. In the only putative work of transgenic art by Kac, the artist claimed to have collaborated with a French laboratory (belonging to the Institut National de la Recherche Agronomique) to procure a green-fluorescent rabbit: a rabbit implanted with a green fluorescent protein gene from a type of jellyfish ["Aequorea victoria"] in order for the rabbit to fluoresce green under ultraviolet light. The claimed work came to be known as the "GFP bunny", which Kac called "Alba". This claim by Kac has been disputed by the scientists at the lab, who noted that they had performed exactly the same experiment (i.e., the insertion of the jellyfish GFP protein-coding gene) on numerous other animals (cats, dogs, etc.) previously and did not create "Alba" (known to the researchers only as "Rabbit Number 5256") under the direction of Kac. The laboratory consequently kept possession of the transgenic rabbit which it had created and funded, and the "transgenic art" was never exhibited at the Digital Avignon festival [2000] as intended. Kac, claiming that his rabbit was the first GFP bunny created in the name of Art, used this dispute to popularize the issue as one of disguised censorship by launching a "Free Alba" campaign. A doctored photo of the artist holding a day-glow-green tinted rabbit appears on his website. The members of the Critical Art Ensemble have written books and staged multimedia performance interventions around this issue, including "The Flesh Machine" (focusing on in vitro fertilisation, surveillance of the body, and liberal eugenics) and "Cult of the New Eve" (in order to analyze how, in their words, "Science is the institution of authority regarding the production of knowledge, and tends to replace this particular social function of conventional Christianity in the west").
|
https://en.wikipedia.org/wiki?curid=22562859
| 663,236 |
596,993 |
The origin and invention of the Bowden cable is open to some dispute, confusion and myth. The invention of the Bowden cable has been popularly attributed to Sir Frank Bowden, founder and owner of the Raleigh Bicycle Company who, circa 1902, was reputed to have started replacing the rigid rods used for brakes with a flexible wound cable, but no evidence for this exists. The "Bowden mechanism" was invented by Irishman Ernest Monnington Bowden (1860 to April 3, 1904) of 35 Bedford Place, London, W.C. The first patent was granted in 1896 (English Patent 25,325 and U.S. Pat. No. 609,570), and the invention was reported in the Automotor Journal of 1897, where Bowden's address was given as 9 Fopstone Rd, Earls Court. The two Bowdens are not known to be closely related. The principal element of this was a flexible tube (made from hard wound wire and fixed at each end) containing a length of fine wire rope that could slide within the tube, directly transmitting pulling, pushing or turning movements on the wire rope from one end to the other without the need of pulleys or flexible joints. The cable was particularly intended for use in conjunction with bicycle brakes. The Bowden Brake was launched amidst a flurry of enthusiasm in the cycle press in 1896. It consisted of a stirrup, pulled up by the cable from a handlebar-mounted lever, with rubber pads acting against the rear wheel rim. At this date bicycles were fixed-wheel (no freewheel), additional braking being offered by a 'plunger' brake pressing on the front tyre. The Bowden brake offered still more braking power, and was novel enough to appeal to riders who scorned the plunger arrangement, which was heavy and potentially damaging to the (expensive) pneumatic tyre. The problem for Bowden was his failure to develop effective distribution networks; the brake was often incorrectly or inappropriately fitted, resulting in a good number of complaints being aired in the press. Its most effective application was on those machines fitted with Westwood pattern steel rims, which offered flat bearing surfaces for the brake pads.
|
https://en.wikipedia.org/wiki?curid=295477
| 596,688 |
547,569 |
The first DNA sequencing methods were developed by Gilbert (1973) and Sanger (1975). Gilbert introduced a sequencing method based on chemical modification of DNA followed by cleavage at specific bases, whereas Sanger's technique is based on dideoxynucleotide chain termination. The Sanger method became popular due to its greater efficiency and lower radioactivity. The first automated DNA sequencer was the AB370A, introduced in 1986 by Applied Biosystems. The AB370A could sequence 96 samples simultaneously, process 500 kilobases per day, and reach read lengths of up to 600 bases. This was the beginning of the "first generation" of DNA sequencers, which implemented Sanger sequencing with fluorescent dideoxynucleotides and polyacrylamide gels sandwiched between glass plates (slab gels). The next major advance was the release in 1995 of the AB310, which utilized a linear polymer in a capillary in place of the slab gel for DNA strand separation by electrophoresis. These techniques formed the basis for the completion of the human genome project in 2001. The human genome project spurred the development of cheaper, high-throughput and more accurate platforms known as Next Generation Sequencers (NGS). In 2005, 454 Life Sciences released the 454 sequencer, followed by the Solexa Genome Analyzer and SOLiD (Supported Oligo Ligation Detection) from Agencourt in 2006. Applied Biosystems acquired Agencourt in 2006, and in 2007, Roche bought 454 Life Sciences, while Illumina purchased Solexa. Ion Torrent entered the market in 2010 and was acquired by Life Technologies (now Thermo Fisher Scientific), and BGI started manufacturing sequencers in China under its MGI arm after acquiring Complete Genomics. These are still the most common NGS systems due to their competitive cost, accuracy, and performance.
|
https://en.wikipedia.org/wiki?curid=127511
| 547,282 |
1,741,364 |
After Landau moved to the Kapitza institute in Moscow (to avoid arrest for comparing Stalinism to Nazism), Pomeranchuk also moved there, working for the tanning industry. He returned to Leningrad in 1938, lecturing, completing his Ph.D. and becoming employed as a junior scientist. He joined the Lebedev Institute of the Academy of Sciences of the USSR in Moscow as a senior scientist in 1940. In 1941 the institute was evacuated to Kazan. Under Abraham Alikhanov, he studied cosmic rays in Armenia from 1942. In 1943, he transferred to Laboratory No.2 under Igor Kurchatov as part of the Soviet project to develop nuclear weapons. Alikhanov founded Laboratory No.3 (which became the Institute for Theoretical and Experimental Physics (ITEP)) and Pomeranchuk worked there from 1946 (and for the rest of his life), founding and leading the Theoretical department, as well as being Professor of Theoretical Physics at the Moscow Mechanical Institute where students admired his infectious enthusiasm for his subject. Rudolf Peierls was consoled by the fact that it was "very clever Pomeranchuk" - and no-one else - who corrected his 1/"T" law for heat conduction in high-temperature condensed matter physics. His work in the 1940s was dominated by neutron research and his manuscript with Akhiezer was the basic guide for Soviet nuclear reactor construction. In 1950, he published a paper suggesting that the entropy of helium-3 as a liquid was less than as a solid. In 1950, Pomeranchuk received an order from Josef Stalin to go to Arzamas-16, located in the closed city of Sarov, Nizhny Novgorod region, to work on Soviet nuclear weaponry. Missing his family and his 'hobby physics' problems, he was advised not to apply for a revocation but wait until the order was "forgotten." He returned to ITEP within a year. He continued enthusiastically with work on quantum field theory and S-matrix theory, particle collisions and Regge theory, the latter in vigorous collaboration with Vladimir Gribov. His last paper on Regge theory was published posthumously. For his work, Pomeranchuk was twice awarded the Stalin Prize (1950, 1952). He was elected a corresponding member of the Academy of Sciences of the USSR in 1953 and full member in 1964.
|
https://en.wikipedia.org/wiki?curid=3581465
| 1,740,382 |
57,843 |
In 1983, Robert Howard started R.H. Research, renamed Howtek, Inc. in February 1984, to develop a color inkjet 2D printer, the Pixelmaster, commercialized in 1986, which used thermoplastic (hot-melt) ink. A team was put together: six members from Exxon Office Systems' Danbury Systems Division, an inkjet printer startup, and some members of the Howtek, Inc. group who became well-known figures in the 3D printing industry. One Howtek member, Richard Helinski (patent US5136515A, "Method and Means for Constructing Three-Dimensional Articles by Particle Deposition", filed 11/07/1989, granted 8/04/1992), formed a New Hampshire company, C.A.D-Cast, Inc, whose name was later changed to Visual Impact Corporation (VIC) on 8/22/1991. A prototype of the VIC 3D printer for this company is available with a video presentation showing a 3D model printed with a single nozzle inkjet. Another employee, Herbert Menhennett, formed a New Hampshire company, HM Research, in 1991 and introduced the Howtek, Inc. inkjet technology and thermoplastic materials to Royden Sanders of SDI and Bill Masters of Ballistic Particle Manufacturing (BPM), where he worked for a number of years. Both BPM 3D printers and SPI 3D printers use Howtek, Inc. style inkjets and Howtek, Inc. style materials. Royden Sanders licensed the Helinski patent prior to manufacturing the Modelmaker 6 Pro at Sanders Prototype, Inc. (SPI) in 1993. James K. McMahon, who was hired by Howtek, Inc. to help develop the inkjet, later worked at Sanders Prototype and now operates Layer Grown Model Technology, a 3D service provider specializing in Howtek single nozzle inkjet and SDI printer support. James K. McMahon worked with Steven Zoltan, the 1972 drop-on-demand inkjet inventor, at Exxon and holds a 1978 patent that expanded the understanding of single nozzle design inkjets (Alpha jets) and helped perfect the Howtek, Inc. hot-melt inkjets. This Howtek hot-melt thermoplastic technology is popular with metal investment casting, especially in the 3D printing jewelry industry. Sanders' (SDI) first Modelmaker 6Pro customer was Hitchner Corporation's Metal Casting Technology, Inc. in Milford, NH, a mile from the SDI facility, which from late 1993 to 1995 cast golf clubs and auto engine parts.
|
https://en.wikipedia.org/wiki?curid=1305947
| 57,819 |
1,450,182 |
The wars in Cuba at the turn of the 20th century spurred Congress to authorize an increase in the size of the Corps of Cadets to 481 in 1900. The period between 1900 and 1915 saw a construction boom as much of West Point's old infrastructure was torn down and rebuilt. A new administration building, barracks, academic building, riding hall, gymnasium, and a cadet chapel were all completed by 1914. In 1916, Congress increased the size of the Corps of Cadets to 1,332. Many of the most famous graduates in the 20th century graduated during the 15-year period between 1900 and 1915: Douglas MacArthur (1903), Joseph Stilwell (1904), Henry "Hap" Arnold (1907), George S. Patton (1909), Dwight D. Eisenhower & Omar Bradley (both 1915) all graduated during this time. The Class of 1915 is known as the "Class the Stars Fell Upon" for the exceptionally high percentage of general officers (59 of the 164) that rose from that class. This period also saw the infancy of intercollegiate athletics at the academy. The Army-Navy football rivalry was born the decade before in 1890 with a victory by Navy at West Point, followed with Army's avenging that loss in Annapolis the following year. The academy's other major sports teams began play during this period. The outbreak of America's involvement in World War I caused a sharp increase in the demand for army officers, and the academy accelerated graduation of all four classes then in attendance to meet this requirement, beginning with the early graduation of the First Class on 20 April 1917, followed by the Second Class in August 1917, and graduation of both the Third and Fourth Classes just before the Armistice of 11 November 1918, when only freshman cadets remained (those who had entered in the summer of 1918). In all, wartime contingencies and post-war adjustments resulted in ten classes, varying in length of study from two to four years, within a seven-year period before the regular course of study was fully resumed.
|
https://en.wikipedia.org/wiki?curid=20793344
| 1,449,366 |
1,566,661 |
Acoustic telemetry is based on the principles of sonar, which was developed to detect submarines during World War I. The properties of acoustic systems favour their use in deep waters with high conductivity and low turbulence. The first acoustic telemetry equipment was developed for studying fish in 1956 by the U.S. Bureau of Commercial Fisheries and the Minneapolis-Honeywell Regulator Corporation. Researchers who want to track marine wildlife in salt water face unique challenges. Radio waves are highly absorbed by salt water, making them a poor choice for sending messages through the ocean. Sound waves, on the other hand, are not similarly impeded by seawater. Because sound travels more than four times faster in water than in air, suitable acoustic telemetry equipment allows near real-time listening over long distances. Acoustic signals are therefore the preferred communication tool for researchers who wish to track fish and wildlife in marine habitats in real time. As with radio, acoustic telemetry requires transmitters to send signals and receivers to hear them. The transmitters are electronic tags that emit a series of sound pulses into the surroundings. They can be surgically implanted or attached externally to an organism. The range of signal reception can vary from a few meters to more than a thousand meters. A tag typically transmits once every minute or two in order to conserve battery life. Receivers are small, data-logging computers that “listen” for tagged individuals. When a signal is identified, the tag's unique ID code is saved with the date and time. The data from any single receiver thus provide a record of each detection of a tagged individual at that location. Researchers may deploy many receivers over large regions to understand the movement patterns of tagged individuals. Hydrophones, a type of underwater microphone, receive acoustic signals and then either store them or convert them into radio signals for rapid transmission through the air to receivers on shore.
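As an illustration of how such detection logs become movement information, here is a minimal Python sketch; the tag IDs, station names, and record layout are invented for the example, since real receivers export vendor-specific formats.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical detection records as a receiver would log them: the tag's
# unique ID code saved with the date and time, plus the receiver's station
# name added when the logger is downloaded. All values are illustrative.
detections = [
    ("A69-1303-501", "2023-06-01 02:14:55", "reef_north"),
    ("A69-1303-501", "2023-06-01 02:16:02", "reef_north"),
    ("A69-1303-501", "2023-06-03 21:40:11", "reef_south"),
    ("A69-1303-777", "2023-06-02 10:05:33", "estuary_mouth"),
]

def movement_track(records):
    """Group detections by tag and return each tag's time-ordered station visits."""
    tracks = defaultdict(list)
    for tag_id, timestamp, station in records:
        tracks[tag_id].append((datetime.fromisoformat(timestamp), station))
    return {tag: sorted(visits) for tag, visits in tracks.items()}

for tag, visits in movement_track(detections).items():
    stations = [station for _, station in visits]
    print(f"{tag}: {len(visits)} detections, path {' -> '.join(stations)}")
```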
|
https://en.wikipedia.org/wiki?curid=42307707
| 1,565,774 |
1,643,654 |
Researchers originally assumed that perfect conservation of these long stretches of DNA implied evolutionary importance, as these regions appear to have experienced strong negative (purifying) selection for 300-400 million years. More recently, this assumption has been replaced by two main hypotheses: that UCEs are created through a reduced negative selection rate, or through reduced mutation rates, also known as a “cold spot” of evolution. Many studies have examined the validity of each hypothesis. The probability of finding ultra-conserved elements by chance (under neutral evolution) in 2.9 billion bases has been estimated at less than 10^-22. In support of the cold spot hypothesis, UCEs were found to be mutating 20-fold less than expected under conservative models of neutral mutation rates. This fold-change difference in mutation rates was consistent between humans, chimpanzees, and chickens. Ultra-conserved elements are not exempt from mutations, as exemplified by the presence of 29,983 polymorphisms in the UCE regions of the human genome assembly GRCh38. However, affected phenotypes were caused by only 112 of these polymorphisms, most of which were located in coding regions of the UCEs. A study performed in mice determined that deleting UCEs from the genome did not create obvious deleterious phenotypes, despite deletion of UCEs in proximity to promoters and protein-coding genes. Affected mice were fertile, and targeted screens of the nearby coding genes showed no altered phenotype. A separate mouse study demonstrated that ultra-conserved enhancers were robust to mutagenesis, concluding that perfect conservation of UCE sequences is not required for their function, which would suggest a reason for the sequence consistency other than evolutionary importance. Computational analysis of human ultra-conserved noncoding elements (UCNEs) found that the regions are enriched for A-T sequences and are generally GC-poor. However, the UCNEs were found to be enriched for CpG dinucleotides, which are highly methylated. This may indicate that there is some change to DNA structure in these regions favoring their precise retention, but this possibility has not been validated through testing.
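As an aside, the composition statistics cited in that analysis (overall GC content and CpG enrichment) are simple to compute; the sketch below uses the standard observed/expected CpG ratio and an invented AT-rich fragment in place of a real UCNE sequence.

```python
def gc_fraction(seq):
    """Fraction of G or C bases; UCNEs were reported to be GC-poor overall."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def cpg_obs_exp(seq):
    """Observed/expected CpG ratio, a common measure of CpG enrichment.

    The expected CpG count is (#C * #G) / length; values near or above 1
    indicate enrichment relative to the base composition.
    """
    seq = seq.upper()
    c, g = seq.count("C"), seq.count("G")
    observed = seq.count("CG")
    expected = (c * g) / len(seq)
    return observed / expected if expected else float("nan")

# Illustrative, invented AT-rich fragment, not a real UCNE sequence.
fragment = "ATTTACGATTAAACGTTTAATCGATATTTAAACGTTATTTCGAATATTTAACGATTTAA"
print(f"GC fraction: {gc_fraction(fragment):.2f}")
print(f"CpG observed/expected: {cpg_obs_exp(fragment):.2f}")
```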
|
https://en.wikipedia.org/wiki?curid=32778957
| 1,642,727 |
1,084,455 |
In 2009, Zanno and colleagues stated therizinosaurs were the most-widely regarded candidates for herbivory among theropods and listed features associated with this diet. These included small, densely packed, coarse serrations; lanceolate (lance-shaped) teeth with a low replacement rate; a beak at the front of the jaws; an inset tooth row that suggests fleshy cheeks; an elongated neck; a small skull; a very large gut capacity as indicated by the rib circumference at the trunk and the outwards flaring processes of the ilia; and the loss of cursorial (related to running) adaptations in the hind limbs, including development of functionally tetradactyl feet. Zanno and colleagues found the clades at the base of Maniraptora—Ornithomimosauria, Therizinosauria, and Oviraptorosauria—had either direct or morphological evidence for herbivory, which would mean either this diet evolved independently multiple times in coelurosaurian theropods or that the primitive condition of the group was at least facultative herbivory with carnivory only emerging in more derived maniraptorans. Zanno and Peter J. Makovicky found, in 2011, therizinosaurs and some other groups of herbivorous dinosaurs that had beaks and retained teeth were unable to lose their teeth completely because they lacked gastric mills (gizzards) and needed the teeth to process food, and that the high-fiber folivorous (leaf-based) diet of therizinosaurs and other archosaurs may also have precluded the evolution of a complete beak. Lautenschlager found in 2014 the hands of therizinosaurs would have had to be able to extend the range of the animal to a point that could not be reached by the head if they were used for browsing and pulling down vegetation. In genera where both neck and forelimb elements are preserved, however, the necks were equal in length or longer than the forelimbs, so pulling vegetation would only make sense if lower parts of long branches were pulled down to access out-of-reach parts of trees.
|
https://en.wikipedia.org/wiki?curid=636977
| 1,083,898 |
55,814 |
However, this conclusion was questioned by several researchers following the publication of the paper. J. M. Kale Sniderman used the same methodology as Head and colleagues on the Pleistocene monitor lizard "Varanus priscus", comparing it to the extant Komodo dragon. Sniderman calculates that following this method, the modern tropics should be able to support lizards much larger than what is observed today, or in the reverse, that "Varanus priscus" is much larger than what would be implied by the ambient temperature of its native range. In conclusion it is argued that Paleocene rainforests may not have been any hotter than those today and that the massive size of "Titanoboa" and "Varanus priscus" may instead be the results of lacking significant mammalian competition. Mark W. Denny, Brent L. Lockwood and George N. Somero also disagree with Head's conclusion. They note that although this method first employed by Makarieva is applicable to smaller poikilotherms, it is not constant across all size ranges. As thermal equilibrium is achieved through the relation between volume and surface area, they argue that the large size of "Titanoboa" coupled with the high temperatures proposed by Head "et al." would mean that the animal would overheat easily if resting in a coiled up state. The authors conclude that several key factors influence the relationship between "Titanoboa" and the temperature of the area it inhabited. Varying posture could help cool down if needed, basking behavior or heat absorption through the substrate are both unknown and the potentially semi-aquatic nature of the animal creates additional factors to consider. Ultimately, Denny and colleagues argue that the nature of the giant snake renders it a poor indicator for the climate of the Paleocene and that the mean annual temperature must have been 4 to 6° C (7 to 11 °F) cooler than the current estimate.
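The surface-area-to-volume point in that critique can be illustrated with idealized geometry. The sketch below treats the body as a simple cylinder and uses round, illustrative dimensions; it is a heuristic for the scaling argument only, not a reconstruction of Denny and colleagues' heat-budget analysis.

```python
import math

def cylinder_sa_to_vol(diameter_m, length_m):
    """Surface-area-to-volume ratio of a cylinder used as a crude snake-body proxy.

    Lateral surface area / volume = (pi*d*L) / (pi*(d/2)**2*L) = 4/d, so the
    ratio depends only on body diameter: thicker bodies shed metabolic heat
    across proportionally less skin, which is the geometric core of the
    overheating argument (end caps, coiling and posture are ignored here).
    """
    radius = diameter_m / 2.0
    area = math.pi * diameter_m * length_m
    volume = math.pi * radius ** 2 * length_m
    return area / volume

# Illustrative body plans (round numbers, not measured values): a ~2 m extant
# boa versus a Titanoboa-scale animal.
for name, diameter, length in [("small boa", 0.10, 2.0), ("Titanoboa-scale", 0.90, 13.0)]:
    print(f"{name}: SA/V = {cylinder_sa_to_vol(diameter, length):.1f} per metre")
```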
|
https://en.wikipedia.org/wiki?curid=21396352
| 55,790 |