doc_id (int32): 18 to 2.25M
text (string): lengths 245 to 2.96k
source (string): lengths 38 to 44
__index_level_0__ (int64): 18 to 2.25M
1,747,386
Research advanced so quickly that the need for an up-to-date checklist became apparent. Mary Parke (1902–1981), a founder member of the British Phycological Society, produced a preliminary checklist of British marine algae in 1953; corrections and additions to it were published in 1956, 1957 and 1959. In 1964 M. Parke and Peter Stanley Dixon (1929–1993) published a revised check-list; a second revision was produced in 1968 and a third in 1976. Distribution was added to the checklist in 1986 with G. R. South and I. Tittley's "A Checklist and Distributional Index of the Benthic Marine Algae of the North Atlantic Ocean". In 2003 "A Check-list and Atlas of the Seaweeds of Britain and Ireland" was published by Gavin Hardy and Michael Guiry, with a revised edition in 2006. This shows how rapidly knowledge of algae, at least in the British Isles, advanced. The first efforts had been made by interested biologists and people capable of identifying the algae, which required books using the botanical names. Botanical keys to identify the plants then developed, followed by checklists. As more information was brought to light by interested workers, some of them volunteers, the checklists were improved and eventually a mapping scheme brought together all this information. The same pattern of knowledge developed with birds, mammals and flowering plants, though on a different time-scale, and knowledge in other parts of the world has not everywhere developed to this degree.
https://en.wikipedia.org/wiki?curid=7587938
1,746,400
1,330,443
As attention focused on the combat capabilities of attack drones, the USN and USAF were looking for drones that could pull 6 G turns and quickly roll into tight turns, while US designers wondered whether dog-fights between robot planes were just around the corner. From 25 January to 28 April 1971, a batch of Maneuverability Augmentation System for Tactical Air Combat Simulation (MASTACS) systems was fitted to existing US Navy BQM-34A drones. These UAVs were test flown to evaluate their maneuvering characteristics, which were deemed good. On 10 May 1971, the MASTACS exercise was ready to commence off the coast of California, against two USN F4 Phantoms flown by pilots with Vietnam combat experience. The F4s were equipped with both the infrared-homing Sidewinder and the radar-guided Sparrow air-to-air missiles. As the two F4s approached Santa Catalina Island, a MASTACS-equipped Firebee was ground-launched. The F4s were vectored towards the interception and the air-to-air battle was on. No restrictions were placed on the F4 pilots; the air battle was to be a "no holds barred contest", with the very real possibility of a Phantom being rammed by the UAV as it maneuvered during the dogfight. The first action was a head-on pass: as the Phantom lined up for the kill, the UAV (drone) pulled a high-G turn and flew over the F4's canopy. The Firebee was banking into 100-degree maneuvers and making 180-degree reversal turns within 12 seconds. The Phantoms were no longer attacking the UAV; they were now the targets. The UAV had been able to pull and hold 6 Gs within three seconds of receiving the command, and still maintain altitude. The Phantoms were unable to maintain track on the UAV but fired their air-to-air missiles anyway, scoring no hits.
https://en.wikipedia.org/wiki?curid=3525512
1,329,714
959,104
When an endotracheal tube (ET) is placed, one of the key advantages is that a direct air-tight passageway is provided from the output of the manual resuscitator to the lungs, thus eliminating the possibility of inadvertent stomach inflation or lung injury from gastric acid aspiration. However, this places the lungs at increased risk from separate lung injury patterns caused by accidental forced over-inflation (called volutrauma or barotrauma). Sponge-like lung tissue is delicate, and over-stretching can lead to adult respiratory distress syndrome – a condition that requires prolonged mechanical ventilator support in the ICU, is associated with poor survival (e.g., 50%), and significantly increases care costs, up to $30,000 per day. Lung volutrauma, which can be caused even by "careful" delivery of large, slow breaths, can also lead to a "popped" or collapsed lung (called a pneumothorax), with at least one published report describing "a patient in whom a sudden tension pneumothorax developed during ventilation with a bag-valve device." Additionally, there is at least one report of manual resuscitator use in which the lungs were accidentally over-inflated to the point where "the heart contained a large volume of air," and the "aorta and pulmonary arteries were filled with air" – a condition called an air embolism which "is almost uniformly fatal". That case, however, involved a 95-year-old woman, and the authors point out that this type of complication had previously been reported only in premature infants.
https://en.wikipedia.org/wiki?curid=1862226
958,598
1,841,086
Since animals are far less plastic than plants, ecophenotypic variation is noteworthy. When encountered, it can cause confusion in identification if it is not anticipated. The most obvious examples are again common observations, such as the dwarfing of aquarium fish living in a restricted environment. In asexual reproduction, the parent passes on the entire genome to the next generation; mutations to the genes are the only source of genetic variation. In sexual reproduction, each parent contributes half of his or her genome to the offspring; thus the offspring contain a mixture of genetic material. Adaptations are traits that increase fitness, the driving force for natural selection. The level of fitness associated with an allele can only be ascertained by comparison with alternate alleles. Traits that increase the survival rate of a species contribute to an animal's fitness, but selection will only favor such traits insofar as survival improves the reproductive success of the organism. More interesting are examples where causation is less clear. Among mollusks, examples include the muricid snail species "Nucella lamellosa", which in rough, shallow waters is generally less spiny than in deeper, quiet waters. In unionid freshwater bivalves, there are lake, small-river, and large-river forms of several species. In vertebrates, experiments on mice show reduced length of ears and tails in response to being reared at a lower temperature, a phenomenon known as Allen's rule.
https://en.wikipedia.org/wiki?curid=33712524
1,840,034
1,506,834
Patents for the therapeutic use of ONYX-015 are held by ONYX Pharmaceuticals, and it was used in combination with the standard chemotherapeutic agents cisplatin and 5-fluorouracil to combat head and neck tumours. ONYX-015 has been extensively tested in clinical trials, with the data indicating that it is safe and selective for cancer. However, only limited therapeutic effect was demonstrated following injection, and systemic spread of the virus was not detected. When combined with chemotherapy, however, "ONYX-015" proved reasonably effective in a proportion of cases. During these trials a plethora of reports emerged challenging the underlying p53-selectivity, with some showing that ONYX-015 actually did better in some cancers with wild-type p53 than in their mutant-p53 counterparts. These reports slowed the advancement through Phase III trials in the US; more recently, however, China licensed "ONYX-015" for therapeutic use as "H101". Further development of ONYX-015 was abandoned in the early 2000s, the exclusive rights being licensed to the Chinese company Shanghai Sunway Biotech. On November 17, 2005, the Chinese State Food and Drug Administration approved H101, an oncolytic adenovirus similar to ONYX-015 (E1B-55K/E3B-deleted), for use in combination with chemotherapy for the treatment of late-stage refractory nasopharyngeal cancer. Outside of China, the push to the clinic for "ONYX-015" has largely been discontinued for financial reasons and until a "real" mechanism can be found.
https://en.wikipedia.org/wiki?curid=39032553
1,505,988
1,218,057
In further attempting to understand human–environmental relationships, disciplines of environmental social science have begun to explore relationships between humans and non-humans, to understand how both interact with each other within the natural world. These interactions force a reconceptualization of identity as ecocultural. Ideas related to exploring human and animal interactions within the natural world have become prominent in environmental ethics. Shoreman-Ouimet and Kopnina define environmental ethics as "a sub-discipline of philosophy that deals with the ethical problems surrounding the environment, in some cases providing ethical justification and moral motivation for the cause of environmental protection or for considerations of animal welfare". This has culminated in debates regarding environmental value and moral rights, and who within the larger ecosystem should be assigned these rights. Environmental ethics explores the dialectic between humans and nature, examining how the human configuration of nature may in turn reshape humans, their relationships and their conditions. Ideas that have emerged from the questions seeking to examine this dialectic include those of "post-domesticity and domesticity". Domesticity refers to the societal dynamics produced in societies in which humans have daily contact with animals other than pets; in post-domesticity, by contrast, people are quite distant from the animals they consume. Referencing the ideas of Bulliet (2005), Emel and Neo convey that distance from witnessing the processes that govern animal life, including births and deaths, while consuming animals as food impacts people differently than if they were to interact continuously with animals. They mention that post-domesticity may produce feelings of guilt, but the continued distance from animal life brought by interacting with animals as a commodity may cause people to relate to them only distantly, or to think of them as packages in a store, disassociating them from the life-cycles they embody. Therefore, environmental social science has paved the way to multiple concepts, ideas and paradigms that differ among each other but all seek to intertwine issues related to the environment with other fields and issues in the social sciences.
https://en.wikipedia.org/wiki?curid=24842835
1,217,403
2,098,967
Pierce left Georgetown College as a sophomore for New York to serve in the gas defense section of the United States Army Signal Corps. He was in the army from April to December 1918. He subsequently graduated from Georgetown College in 1920. In that same year, he began teaching at the University of Kentucky and later at the University of South Dakota. He stayed on at the University of Chicago after receiving his Ph.D. to teach quantitative analysis. During this time he co-authored the seminal chemistry textbook "Quantitative Analysis" with Edward Lauth Haenisch. "Quantitative Analysis" would go through 21 editions until 1963. During World War II, Pierce worked for the Office of Scientific Research and Development's National Defense Research Committee. Pierce had been recruited by his doctoral advisor, W. Albert Noyes, Jr., to join the "division 10" central laboratory at Northwestern University. Pierce's lab focused on chemical warfare defense and developed carbon filtering for use in chemical protective masks. Pierce's work included an assignment in Australia and the South Pacific. He was awarded the President's Certificate of Merit in 1948 for his services.
https://en.wikipedia.org/wiki?curid=44332849
2,097,759
803,912
McDougall (1923) first made the distinction between explicit and implicit memory. In the 1970s, procedural and declarative knowledge were distinguished in the literature on artificial intelligence. Studies in the 1970s divided into two areas of work: one focusing on animal studies and the other on amnesic patients. The first convincing experimental evidence for a dissociation between declarative memory ("knowing what") and non-declarative or procedural ("knowing how") memory came from Milner (1962), who demonstrated that a severely amnesic patient, Henry Molaison, known as patient H.M., could learn a hand–eye coordination skill (mirror drawing) in the absence of any memory of having practiced the task before. Although this finding indicated that memory was not made up of a single system positioned in one place in the brain, at the time others held that motor skills were likely a special case representing a less cognitive form of memory. However, by refining and improving experimental measures, there has been extensive research using amnesic patients with varying locations and degrees of structural damage. Increased work with amnesic patients led to the finding that they were able to retain and learn tasks other than motor skills. These findings had shortcomings in how they were perceived, however, as amnesic patients sometimes fell short of normal levels of performance, and amnesia was therefore viewed as strictly a retrieval deficit. Further studies with amnesic patients found a larger domain of normally functioning memory for skill abilities. For example, using a mirror reading task, amnesic patients showed performance at a normal rate, even though they were unable to remember some of the words that they were reading. In the 1980s much was discovered about the anatomy and physiology of the mechanisms involved in procedural memory. The cerebellum, hippocampus, neostriatum, and basal ganglia were identified as being involved in memory acquisition tasks.
https://en.wikipedia.org/wiki?curid=21312313
803,483
1,628,248
Studies using animals genetically engineered to lack the EP1 receptor, supplemented by studies using treatment with EP1 receptor antagonists and agonists, indicate that this receptor serves several functions. 1) It mediates hyperalgesia through EP1 receptors located in the central nervous system but suppresses pain perception through EP1 receptors located on dorsal root ganglia neurons in rats; thus, PGE2 causes increased pain perception when administered into the central nervous system but inhibits pain perception when administered systemically. 2) It promotes colon cancer development in azoxymethane-induced and APC gene knockout mice. 3) It promotes hypertension in diabetic mice and spontaneously hypertensive rats. 4) It suppresses stress-induced impulsive behavior and social dysfunction in mice by suppressing the activation of dopamine receptor D1 and dopamine receptor D2 signaling. 5) It enhances the differentiation of uncommitted T cell lymphocytes to the Th1 cell phenotype and may thereby favor the development of inflammatory rather than allergic responses to immune stimulation in rodents; studies with human cells indicate that EP1 serves a similar function on T cells. 6) It may reduce expression of sodium–glucose transport proteins in the apical membrane of cells of the intestinal mucosa in rodents. 7) It may be differentially involved in the etiology of acute brain injuries: pharmacological inhibition or genetic deletion of the EP1 receptor produces either beneficial or deleterious effects in rodent models of neurological disorders such as ischemic stroke, epileptic seizure, surgically induced brain injury and traumatic brain injury.
https://en.wikipedia.org/wiki?curid=14522485
1,627,329
1,226,526
The goal of implantable microelectrodes is to interface with the body's nervous system for recording and sending bioelectrical signals to study disease, improve prostheses, and monitor clinical parameters. Microfabrication has led to the development of Michigan probes and the Utah electrode array, which have increased electrodes per unit volume, while addressing problems of thick substrates causing damage during implantation and triggering foreign-body reaction and electrode encapsulation via silicon and metals in the electrodes. Michigan probes have been used in large-scale recordings and network analysis of neuronal assemblies, and the Utah electrode array has been used as a brain–computer interface for the paralyzed. Extracellular microelectrodes have been patterned onto an inflatable helix-shaped plastic in cochlear implants to improve deeper insertion and better electrode-tissue contact for transduction of high-fidelity sounds. Integrating microelectronics onto thin, flexible substrates has led to the development of a cardiac patch that adheres to the curvilinear surface of the heart by surface tension alone for measuring cardiac electrophysiology, and electronic tattoos for measuring skin temperature and bioelectricity. Wireless recording of electrophysiological signals is possible through addition of a piezocrystal to a circuit of two recording electrodes and a single transistor on an implanted micro-device. An external transducer emits pulses of ultrasonic energy which impinge on the piezocrystal, and extracellular voltage changes are backscattered ultrasonically by the piezocrystal, allowing for measurement. A network of so-called "neural dust" motes can map signals throughout a region of the body where the micro-sensors are implanted.
https://en.wikipedia.org/wiki?curid=19081158
1,225,865
671,226
He was an early investigator into X-ray imaging, but his claim to have made the first X-ray image in the United States is incorrect. He learned of Röntgen's discovery of unknown rays passing through wood, paper, insulators, and thin metals leaving traces on a photographic plate, and attempted this himself. Using a vacuum tube, which he had previously used to study the passage of electricity through rarefied gases, he made successful images on 2 January 1896. Edison provided Pupin with a calcium tungstate fluoroscopic screen which, when placed in front of the film, shortened the exposure time twentyfold, from one hour to a few minutes. Based on the results of experiments, Pupin concluded that the impact of primary X-rays generated secondary X-rays. On his work in the field of X-rays, Pupin gave a lecture at the New York Academy of Sciences. He was the first person to use a fluorescent screen to enhance X-rays for medical purposes. A New York surgeon, Dr. Bull, sent Pupin a patient to obtain an X-ray image of his left hand prior to an operation to remove lead shot from a shotgun injury. The first attempt at imaging failed because the patient, a well-known lawyer, was "too weak and nervous to be stood still nearly an hour", which is the time it took to get an X-ray photo at the time. In another attempt, the Edison fluorescent screen was placed on a photographic plate and the patient's hand on the screen. X-rays passed through the patient's hand and caused the screen to fluoresce, which then exposed the photographic plate. A fairly good image was obtained with an exposure of only a few seconds and showed the shot as if "drawn with pen and ink." Dr. Bull was able to take out all of the lead balls in a very short time.
https://en.wikipedia.org/wiki?curid=215430
670,874
13,051
The accident was triggered by the Tōhoku earthquake and tsunami, which occurred in the Pacific Ocean about east of the Japanese mainland at 14:46 JST on Friday, 11 March 2011. On detecting the earthquake, the active reactors automatically shut down their normal power-generating fission reactions. Because of these shutdowns and other electrical grid supply problems, the reactors' electricity supply failed, and their emergency diesel generators automatically started. Critically, these were required to provide electrical power to the pumps that circulated coolant through the reactors' cores. This continued circulation was vital to remove residual decay heat, which continues to be produced after fission has ceased. However, the earthquake had also generated a tsunami high that arrived shortly afterwards, swept over the plant's seawall and then flooded the lower parts of the reactor buildings at units 1–4. This flooding caused the failure of the emergency generators and loss of power to the circulating pumps. The resultant loss of reactor core cooling led to three nuclear meltdowns, three hydrogen explosions, and the release of radioactive contamination in Units 1, 2 and 3 between 12 and 15 March. The spent fuel pool of the previously shut down Reactor 4 increased in temperature on 15 March due to decay heat from newly added spent fuel rods, but did not boil down sufficiently to expose the fuel.
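To give a sense of why removing decay heat is so critical, here is a minimal Python sketch using the textbook Way–Wigner approximation for decay heat as a fraction of pre-shutdown thermal power. It is a rough illustration only, not the model used for the Fukushima reactors; the assumed one year of prior operation and the function name are illustrative.

```python
def decay_heat_fraction(t_after_shutdown_s: float, t_operating_s: float) -> float:
    """Way-Wigner approximation: decay heat as a fraction of the reactor's
    pre-shutdown thermal power, t_after_shutdown_s seconds after shutdown
    following t_operating_s seconds of operation. Rough textbook estimate only."""
    return 0.0622 * (t_after_shutdown_s ** -0.2
                     - (t_after_shutdown_s + t_operating_s) ** -0.2)

one_year = 365 * 24 * 3600  # assumed prior operating time (illustrative)
for label, t in [("1 minute", 60), ("1 hour", 3600), ("1 day", 86400)]:
    # A multi-gigawatt thermal core still produces tens of megawatts hours after shutdown.
    print(f"{label:>8}: {decay_heat_fraction(t, one_year):.2%} of full thermal power")
```

Even roughly one percent of a reactor's full thermal output is a large heat load, which is why the loss of the circulating pumps led so quickly to core damage.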
https://en.wikipedia.org/wiki?curid=31162817
13,046
106,499
In the 1980s, France pressed for an independent European crew launch vehicle. Around 1978, it was decided to pursue a reusable spacecraft model, and starting in November 1987 a project to create a mini-shuttle by the name of Hermes was introduced. The craft was comparable to early proposals for the Space Shuttle and consisted of a small reusable spaceship that would carry 3 to 5 astronauts and 3 to 4 metric tons of payload for scientific experiments. With a total maximum weight of 21 metric tons it would have been launched on the Ariane 5 rocket, which was being developed at that time. It was planned solely for use in low Earth orbit space flights. The planning and pre-development phase concluded in 1991; the production phase was never fully implemented because by that time the political landscape had changed significantly. With the fall of the Soviet Union, ESA looked towards co-operation with Russia to build a next-generation space vehicle. Thus the Hermes programme was cancelled in 1995 after about 3 billion dollars had been spent. The Columbus space station programme met a similar fate.
https://en.wikipedia.org/wiki?curid=10363
106,454
1,311,159
Levine considered how the founding documents of the United States can be interpreted in several ways, leading to conflict between those who hold differing interpretations of those ideals, and in turn to "Infinite"'s different factions. Figureheads of the powers-that-be like Saltonstall are based on both historical and present-day nationalistic personalities seeking to put the needs of America before others. One example given by Levine is President Theodore Roosevelt, whose ideals were highly influential during America's transformation in the early 20th century; Levine considered how Roosevelt willingly gave up office to fight during the Spanish–American War. On the other hand, the Vox Populi were based on historical factions that often splintered into small, independent groups that undertook violent actions, such as the Red Army Faction of the 1970s and the Animal Liberation Front and Earth Liberation Front of the present day. During the course of the game's development period, the series of "Occupy" protests occurred across several cities; Levine, comparing these protests to other historical ones already incorporated into Columbia's history, used the real-time events to refine the game's story. Specifically, because of the decentralized nature of the various groups involved with the "Occupy" protests, Levine was able to define how the Vox Populi group would grow from its haphazard beginnings. Levine reflected that despite the game's earlier setting, much of the modern-day political turmoil calls back to similar tactics and behavior used in the early days of America's democracy, and thus provided a means to flesh out these aspects within the game.
https://en.wikipedia.org/wiki?curid=38977195
1,310,441
1,134,462
Since the early 1930s, single-beam sounders have been used to make bathymetry maps. More recently, multibeam echosounders (MBES) are typically used, employing hundreds of very narrow adjacent beams arranged in a fan-like swath typically 90 to 170 degrees across. The tightly packed array of narrow individual beams provides very high angular resolution. The width of the swath is depth dependent and allows a boat to map more seafloor per transect than a single-beam echosounder, making fewer passes. The beams update many times per second (typically 0.1–50 Hz depending on water depth), allowing faster boat speed while maintaining high coverage of the seafloor. Attitude sensors provide data for the correction of the boat's roll and pitch, and a gyrocompass provides accurate heading information to correct for vessel yaw. A Global Navigation Satellite System (GNSS) positions the soundings with respect to the surface of the earth. Sound speed profiles (speed of sound in water as a function of depth) of the water column correct for refraction or "ray-bending" of the sound waves owing to non-uniform water column characteristics such as temperature, conductivity, and pressure. A computer system processes all the data, correcting for all of the above factors as well as for the angle of each individual beam. The resulting sounding measurements are then processed to produce a map.
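As a concrete illustration of the depth-dependent swath described above, the following Python sketch (not taken from any survey package; names and numbers are illustrative) computes the across-track swath width on a flat seafloor and reduces a single beam's slant range to depth and across-track distance, assuming straight rays with no refraction and a vessel already corrected for roll, pitch and heave.

```python
import math

def swath_width(depth_m: float, swath_angle_deg: float) -> float:
    """Across-track width of coverage on a flat seafloor:
    width = 2 * depth * tan(swath_angle / 2)."""
    return 2.0 * depth_m * math.tan(math.radians(swath_angle_deg / 2.0))

def reduce_beam(slant_range_m: float, beam_angle_deg: float) -> tuple[float, float]:
    """Convert one beam's slant range into (depth, across-track distance),
    with the beam angle measured from vertical (nadir = 0 degrees)."""
    theta = math.radians(beam_angle_deg)
    return slant_range_m * math.cos(theta), slant_range_m * math.sin(theta)

# A 120-degree swath in 50 m of water covers roughly 173 m of seafloor per ping,
# while the same swath in 500 m of water covers roughly 1.7 km.
print(round(swath_width(50.0, 120.0), 1), round(swath_width(500.0, 120.0), 1))
print(reduce_beam(60.0, 45.0))  # outer beam: ~42.4 m deep, ~42.4 m off-track
```

In practice the straight-ray assumption is replaced by ray tracing through the measured sound speed profile, which is the refraction correction mentioned above.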
https://en.wikipedia.org/wiki?curid=463112
1,133,869
1,569,637
The programme that would develop into the current organisation was initially developed by an Essex general practitioner, Dr Alan Dean, to facilitate day-to-day management of his own general practice, in collaboration with IT staff at the BATA shoe factory in East Tilbury, near his practice. Early programmers included the Czech programmer Jan Boda. In 1987 a venture capital company was set up, named Value Added Information Medical Products Ltd (VAMP), to recruit other practices and form an information base. The early development team of three, Marcella Devenish (formerly a midwife) and Philip Lee-Warren (formerly a Senior Medical Laboratory Scientific Officer) under the leadership of Kieran O'Mally, further developed the computerised system using the BOS Microcobol development environment. During this period the number of UK practices using the software product IGP (Integrated General Practice) expanded from a few hundred to over two and a half thousand. In return for anonymous healthcare data, VAMP Ltd offered GPs the cash equivalent of GBP 500 a month in order to build the VAMP research databank for research purposes. As of 1988, the VAMP research databank comprised 57 practices and 543,100 patients. This figure had doubled to 1.2 million patients by 1990. One year later, 970 practices allowed VAMP to access their data, while about 1,000 practices had a straightforward maintenance agreement with VAMP.
https://en.wikipedia.org/wiki?curid=7874514
1,568,749
486,774
In a second direction, beginning in the late 1960s and early 1970s, Mischel pioneered work illuminating the ability to delay gratification and to exert self-control in the face of strong situational pressures and emotionally "hot" temptations. His studies with preschoolers in the late 1960s, often referred to as "the marshmallow experiment", examined the processes and mental mechanisms that enable a young child to forgo immediate gratification and to wait instead for a larger desired but delayed reward. The test was simple: give the child a choice between an immediate treat and a larger, delayed treat. For example, the proctor would give the child the option to eat one marshmallow immediately or to wait ten minutes and receive not one but two marshmallows to eat. The test did not have to be conducted with marshmallows specifically; it could be done with Oreo cookies, M&Ms, or other desirable treats. As Mischel followed up with the parents of the children who took the test years later, he found a staggering correlation between children's difficulty in delaying gratification and their outcomes in life as adults. Kids who had trouble waiting for the delectable delight tended to have higher rates of obesity and below-average levels of academic achievement later in life. Their counterparts who were able to wait longer for the treat had starkly different outcomes down the road, including lower body mass index and higher standardized test scores. A stark contrast also appeared when comparing children raised by parents below the poverty line with children whose parents were college-educated: a significantly larger portion of the low-income children ate the treat immediately, in contrast to their counterparts, who more often waited.
https://en.wikipedia.org/wiki?curid=4392099
486,525
1,029,388
While modern ciphers like AES and the higher-quality asymmetric ciphers are widely considered unbreakable, poor designs and implementations are still sometimes adopted, and there have been important cryptanalytic breaks of deployed crypto systems in recent years. Notable examples of broken crypto designs include the first Wi-Fi encryption scheme WEP, the Content Scrambling System used for encrypting and controlling DVD use, the A5/1 and A5/2 ciphers used in GSM cell phones, and the CRYPTO1 cipher used in the widely deployed MIFARE Classic smart cards from NXP Semiconductors, a spun-off division of Philips Electronics. All of these are symmetric ciphers. Thus far, not one of the mathematical ideas underlying public key cryptography has been proven to be 'unbreakable', and so some future advance in mathematical analysis might render systems relying on them insecure. While few informed observers foresee such a breakthrough, the key size recommended as security best practice keeps increasing as the computing power required for breaking codes becomes cheaper and more available. Quantum computers, if ever constructed with enough capacity, could break existing public key algorithms, and efforts are underway to develop and standardize post-quantum cryptography.
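As a back-of-the-envelope illustration of why recommended key sizes keep growing, the Python sketch below estimates how long an exhaustive key search would take at an assumed rate of 10^12 guesses per second. The rate and the key lengths chosen are arbitrary illustrations, not benchmarks of any real attacker or an official recommendation.

```python
def years_to_brute_force(key_bits: int, guesses_per_second: float = 1e12) -> float:
    """Expected time to find a symmetric key by exhaustive search,
    assuming on average half the keyspace must be tried."""
    average_tries = 2 ** (key_bits - 1)
    return average_tries / guesses_per_second / (365 * 24 * 3600)

for bits in (40, 56, 80, 128):
    # Each extra bit doubles the work, so modest increases in key length
    # offset large gains in attacker computing power.
    print(f"{bits:>3}-bit key: {years_to_brute_force(bits):.3g} years")
```

This kind of arithmetic is also why the breaks of deployed systems mentioned above came largely from design flaws rather than raw exhaustive search.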
https://en.wikipedia.org/wiki?curid=520066
1,028,854
1,866,324
According to McDonald, the final movement, "The Call of the Mountains", is "music that epitomizes Ives's transcendental ideal of unity in diversity, four musical layers that are rhythmically and melodically independent but together forge a continuous tonality that evokes timelessness and eternity." He suggests that "the apotheosis of 'The Call of the Mountains' is Ives’s own music of the future." In keeping with the program ("walking up the mountain side to view the firmament"), the instruments slowly unite in their efforts to achieve their goal, culminating in a "transcendental" ostinato passage during which the cello plays a slow, majestic descending whole tone scale beginning and ending on D, strongly reminiscent of the ending of the last movement of Ives' Symphony No. 4, completed several years after the composition of the quartet. Jan Swafford described the ending as follows: "Over a downstriding whole-tone scale in the cello, the violins and viola chime like great bells in the heavens, in patterns like bell ringing. The struggle and the arguments have prepared the way for a revelation." McDonald wrote: "The communal act with which the program ends is signified by the balance between instrumental independence and cohesion. The four debaters maintain their individuality yet form some sort of consensus as a means of peaceful coexistence; the opposition between individual and community is resolved. The four men’s communion with nature on the mountaintop is evoked by the first violin, which reaches higher on its E-string than most composers or players of the time would have dared to attempt, as it paraphrases the hymn 'Nearer, My God, to Thee'..., the title suggesting the ultimate object of the ascent up the mountain. God's eternity is rendered by the circularity of each of the lower three instrumental layers, whose ostinati repeat at different periodicities."
https://en.wikipedia.org/wiki?curid=60526195
1,865,250
1,312,876
For most of the early twenty-first century, China was the main destination for the world's scrap material. This resulted from a combination of factors, including the growing need for metal, paper and plastics in China's expanding industry, lax environmental regulations, cheap labor, and inexpensive shipping using containers that would otherwise be returned to China empty. In the United States, this resulted in a strong market for many scrap commodities and allowed local recycling programs to come close to breaking even economically or even to turn a small profit. That situation abruptly changed in 2017 when China announced its National Sword program, which banned many scrap imports and imposed strict quality standards on others, starting in February 2018. Acceptable limits for contamination in imported waste were cut from 5–10 percent to 0.5 percent. Since then almost no plastic scrap has been exported to China from the United States, and shipments of metal and paper scrap have been sharply reduced. Scrap plastic imports dropped from 3.5 million metric tons in 2017 to 21,300 metric tons in the first half of 2018. As a result, scrap prices in the US have plummeted. The US recycling industry is responding by investing in better sorting equipment to attempt to meet the National Sword standards and by developing new markets for waste materials both within the US and in other countries, including Southeast Asia and India. However, several countries in Southeast Asia have announced their own restrictions on scrap imports.
https://en.wikipedia.org/wiki?curid=7709125
1,312,156
1,154,805
Under Stark and, after 1939, under his successor Abraham Esau, the PTR dedicated itself strongly to armament research. A newly founded laboratory for acoustics was to investigate not only general but mainly military fields of application. This included, among other things, the acoustic location of artillery, the military utilization of ultrasound and the development of decoding procedures. In addition, researchers at the PTR developed acoustic mines and a steering system for torpedoes which oriented itself to the sound field of passing ships. Due to its classical metrological tasks, the PTR was also closely connected with the armament industry of the Third Reich. Since exact measures are a basic requirement for the manufacture of military equipment, the PTR gained a key role in armament production and defense. The extent to which the PTR was also involved in the German nuclear weapons project is controversial. It is, however, known that prior to his time as PTR president, Abraham Esau led, until 1939, a group of researchers dealing with nuclear fission. Later, he took over the specialist area of "nuclear fission" in the Reich Research Council, which supervised the German uranium project from spring 1942 onwards. Shortly after that, Hermann Göring subordinated the working group under the former PTR physicist Kurt Diebner to Division V for atomic physics at the PTR. Esau received the title "Authorized Representative of the Reichsmarschall for Nuclear Physics", a post which, however, he ceded to Walther Gerlach at the end of 1943.
https://en.wikipedia.org/wiki?curid=336112
1,154,195
341,637
On 23 September 2014, the U.S. Air Force performed the first-ever drop of a precision-guided aerial mine, consisting of a Quickstrike mine equipped with a JDAM kit. The Quickstrike is a Mark 80-series general-purpose bomb with the fuze replaced by a target detection device (TDD) to detonate it when a ship passes within lethal range, a safe/arm device in the nose, and a parachute-retarder tailkit in the back. Dropping naval mines has historically been challenging, as the delivery aircraft has to fly low and slow, at , making it vulnerable to hostile fire; the first aerial mining mission of Operation Desert Storm resulted in the loss of an aircraft, and the U.S. has not flown any combat aerial mining missions since. The Quickstrike-J is a JDAM-equipped 1,000 lb or 2,000 lb version, and the GBU-62B(V-1)/B Quickstrike-ER is a 500 lb or 2,000 lb gliding version based on the JDAM-ER, which has a range of when launched from . Precision airdropping of naval mines is the first advance in aerial mine delivery techniques since World War II. It can increase the survivability of the delivery aircraft, since instead of making multiple slow passes at low altitude directly over the area, an aircraft can release all of its mines in a single pass from a standoff distance and altitude. It also increases the mines' effectiveness: instead of laying a random pattern of mines in a loosely defined area, they can be laid directly into harbor mouths, shipping channels, canals, rivers, and inland waterways, reducing the number of mines required and enhancing the possibility of blocking ship transit corridors. Enemy naval ports can also be blockaded, and a defensive minefield quickly planted to protect areas threatened by amphibious assault. A bomb version called "Quicksink" was tested in 2022.
https://en.wikipedia.org/wiki?curid=201690
341,456
1,418,666
After marrying Barbara, he returned to the University of Cape Town in early 1950 to lecture. Following a sabbatical at Harvard in 1956–57, the couple agreed to move to the United States, and Cormack became a professor at Tufts University in the fall of 1957. Cormack became a naturalized citizen of the United States in 1966. Although he was mainly working on particle physics, Cormack's side interest in x-ray technology led him to develop the theoretical underpinnings of CT scanning. This work was initiated at the University of Cape Town and Groote Schuur Hospital in early 1956 and continued briefly in mid-1957 after he returned from his sabbatical. His results were subsequently published in two papers in the Journal of Applied Physics in 1963 and 1964. These papers generated little interest until Hounsfield and colleagues built the first CT scanner in 1971, turning Cormack's theoretical calculations into a real application. For their independent efforts, Cormack and Hounsfield shared the 1979 Nobel Prize in Physiology or Medicine. It is notable that the two built a very similar type of device without collaboration, in different parts of the world [3]. He was a member of the International Academy of Science, Munich. In 1990, he was awarded the National Medal of Science.
https://en.wikipedia.org/wiki?curid=883598
1,417,867
774,451
According to communist doctrine, planning had the stated purpose of enabling the people - through the communist party and state institutions - to undertake activities that would have been frustrated by a market economy, including the rapid expansion of universal education and health care, urban development with mass good-quality housing, and industrial development of all regions of the country. Nevertheless, markets continued to exist in Soviet-type planned economies. Even after the collectivization of agriculture in the Soviet Union in the 1930s, members of collective farms and anyone with a private garden plot were free to sell their own produce (farm workers were often paid in kind). Licensed markets operated in every town and city borough where non-state-owned enterprises (such as cooperatives and collective farms) were able to offer their products and services. From 1956 and 1959 onwards, all wartime controls over manpower were removed and people could apply for and quit jobs freely in the Soviet Union. The use of market mechanisms went furthest in Yugoslavia, Czechoslovakia and Hungary. From 1975, Soviet citizens had the right to engage in private handicraft; collective farmers could raise and sell livestock privately from 1981. Note that households were free to dispose of their income as they chose, and incomes were lightly taxed.
https://en.wikipedia.org/wiki?curid=43069513
774,035
1,539,396
While Ethiopia is Kenya's neighboring long-distance running rival, it does not have the same successful track record in the steeplechase, though it was encouraged by Getnet Wale winning the 2019 IAAF Diamond League. Here the Ethiopians went to the front. Chala Beyo took the point first, with Wale and Kipruto pushing the pace out front. After two laps, Wale took over. Beyo would not finish. Starting slower, Lamecha Girma ran in the middle of the pack for a while before moving forward to take over leading duties for the team. The fast pace dropped many of the runners, the lead pack dwindling to the entire Kenyan team, Hillary Bor, Djilali Bedrani, returning silver medalist Soufiane El Bakkali, Wale and Girma. With two laps to go, Kipruto moved out to the lead and looked around for his teammates to join him, but help did not come forward. Instead, Wale moved ahead again and El Bakkali planted himself on Kipruto's shoulder. As the pace increased, the other three Kenyans fell off the back of the pack. Bedrani and Bor were the next to go. It was a four-man group at the bell, with Girma on the tail end. Through the penultimate turn, El Bakkali took the lead. For most of the last decade, the steeplechase had been decided by a devastating move off the first barrier on the backstretch, usually by Ezekiel Kemboi. It was where Kipruto had won the race in 2017 and the Olympics in 2016. Here, coming off the barrier, Kipruto gained a couple of feet on Wale, but El Bakkali remained in command. Instead, Girma ran around the group and into the lead. Kipruto tried to react, passing El Bakkali over the water jump. Wale had no answer for the speed, and the medalists were decided. Going into the final barrier, Girma opened up two metres on Kipruto. Coming off the barrier, Kipruto launched into a sprint, gaining slightly on Girma. Desperately looking for the finish, Girma dipped a little early; Kipruto dipped like a seasoned professional hurdler. In the photo finish, Kipruto took the gold by 0.01 seconds. The 18-year-old Girma got the consolation prize of the Ethiopian national record, which the 19-year-old Wale had already improved twice in 2019.
https://en.wikipedia.org/wiki?curid=61854351
1,538,523
589,672
The Lancer Evolution WRC is powered by the same 1996 cc "4G63" engine that has been used in its sports and rally cars since the 1980s, in this iteration producing at 5500 rpm and at 3500 rpm. The car debuted at the 2001 Rallye Sanremo, after a relatively short development (Ralliart could not introduce the Lancer WRC later because of a contract they made with the FIA in 1999, which allowed them to run the old-specification Lancers). The model was based on the then-new eighth-generation Cedia model, on which the road-going Evolution VII is also based. The WRC rules allowed more freedom in most areas of the car, therefore the engineers were able to make changes that they could not make to the older Group A Lancers. These changes included modifications to the engine and its surroundings (lighter internal parts, more rearward tilt to optimize the front weight distribution, new turbo and new exhaust system), but the most significant change was made to the suspension: both the front and rear suspensions were now MacPherson struts, and bigger wheel arches were also implemented, allowing more suspension travel. However, the drivetrain remained the same as before, and when Tommi Mäkinen left the team at the end of 2001, the new drivers could not get on with this special transmission, which required an aggressive left-foot braking approach. The original Evo WRC (sometimes referred to as Step1) was replaced with Step2 from 2002 Finland onwards.
https://en.wikipedia.org/wiki?curid=30876895
589,370
1,618,768
The rainforests are subjects of heightened attention due to the excessive deforestation and logging that occurs in those ecosystems. Deforestation contributes to carbon emissions, which are a cause of climate change. Much of the deforestation in the tropics is the result of agricultural land use. In the 1980s, the United Nations Food and Agriculture Organization conducted a study that concluded that 15.4 million hectares (about 38 million acres) of tropical forest were lost per year. In addition, 5.6 million hectares were logged each year. This landmark study sparked widespread interest in the tropical ecosystem, and a great number of non-profits and outspoken ecologists engaged in an extended fight to "save the rainforest" that continues today. This battle has manifested itself in a number of ways, one of which is the emergence of biodiversity institutes in tropical locations dedicated to stopping the excessive deforestation of the landscape, one of the most notable of which was established in Costa Rica. The work of the Costa Rican National Biodiversity Institute (INBio) has served as a model for other biodiversity institutes. Rainforests harbor the most alkaloid-producing plants of any biome; alkaloids are compounds that are crucial to the production of Western drugs. Due to the abundance of these compounds, pharmaceutical companies all over the world look to the rainforests for new medicinal treatments. In the early 1990s, the heads of INBio signed a deal with the pharmaceutical behemoth Merck that called for cooperation between the two entities in discovering and exploring new natural treatments in the Costa Rican rainforests. Ecologists, government officials, and corporations alike praised this decision as decisive progress in an ongoing struggle to work cooperatively in utilizing tropical biodiversity while ensuring the stability of tropical ecosystems.
https://en.wikipedia.org/wiki?curid=19826061
1,617,853
1,716,516
Because the BESSY components were used only as an injector system, the construction of a new main ring was still needed for SESAME. The estimated cost of the ring was $10 million, so additional sources of funding were required. On 23 July 2001 a formal proposal, supported by the German and French Ministers of Research and later by the Commissioner for Research Philippe Busquin, was submitted to European Commissioner Chris Patten. In October 2001 the "chef de cabinet" of Commissioner Patten, Anthony Cary, informed Schopper that an independent evaluation by a panel of international experts was needed. The "Techno-Economic Feasibility Study" was carried out under the guidance of Professor Guy Le Lay of the University of Marseille. The report concluded that the project was promising and would "effectively stimulate scientific activity and cooperation in the Middle East". However, in August 2003, Commissioner Patten stated that "the Commission is not in a position at this stage to provide Community funding to SESAME". In a subsequent meeting arranged between the Jordanian government, SESAME representatives and the EU Commission, the main point of contention was the project's energy level. It was claimed that a competitive facility needed a higher energy level. A compromise was reached that the machine should start at 2 GeV, with 2.5 GeV available at a later stage. This would have increased the cost of the ring by another $2 million.
https://en.wikipedia.org/wiki?curid=12260613
1,715,547
1,641,505
The first true instance of neurofeedback occurred in 1963, when University of Chicago professor Joseph Kamiya trained a volunteer to recognize and alter alpha brain wave activity. Just five years later, Barry Sterman conducted a revolutionary study on cats at the behest of NASA that proved that cats trained to consciously alter their sensorimotor rhythm were resistant to doses of hydrazine that typically induce seizures. This finding was applied to humans in 1971, when Sterman trained an epileptic to control her seizures through a combination of sensorimotor rhythm and EEG neurotherapy, to the extent that she obtained a driver's license after only three months of treatment. Around the same time, Hershel Toomim was founding Toomim Biofeedback Laboratories and the Biocomp Research Institute on the basis of a device known as the Alpha Pacer that measured brain waves. After decades of work with various biofeedback mechanisms, Toomim accidentally stumbled upon conscious control of cerebral blood flow in 1994. He developed a device specific to this measure, which he called a Near Infrared Spectrophotometry Hemoencephalography system, coining the term "hemoencephalography" in 1997. A clinician user of NIR HEG, Jeffrey Carmen, adapted Toomim's system for migraines in 2002 by integrating peripheral thermal biofeedback into the design. Since then, both techniques have been applied to numerous disorders of frontal and prefrontal lobe function.
https://en.wikipedia.org/wiki?curid=1806538
1,640,578
2,033,476
Overall the exhibition design attempted to minimize architectural structures, such as walls, between artworks so that the works would visually interact with each other. However the exhibition was divided into a lit section and a darkened corridor that featured more self-contained works. Notable works in the exhibition included Takamatsu Jirō's "Chairs and Table in Perspective" (遠近法の椅子とテーブル, 1966), a sculptural set of chairs and table overlaid with a grid that are formed in extreme forced perspective, with the chairs near the front of the sculpture literally larger than those at the back; Arata Isozaki's boldly-colored sculptural maquette for Fukuoka Sōgo Bank (建築空間(福岡相互銀行大分支店), 1966) suspended on the wall; a site-specific installation on the wall of Ay-O's "Finger Boxes" (フィンガー・ボックス, 1963–66), which invite viewers to insert their fingers to experience different tactile sensations; photographer Shōmei Tōmatsu's "No. 24" (1966), an installation that invited viewers to place their feet on footprints to stand within an off-kilter enclosing white space; and Yamaguchi's "" (港, 1966), a set of larger-than-human sized colored transparent acrylic J-shapes and cubes, positioned in different arrangements with lights inside, that was shown again at the Solomon R. Guggenheim Museum in 1967. A number of works included mobile elements, such as industrial designer Hiroshi Tomura's spiral-shaped mobile sculptures made out of thin plastic that hung from the ceiling, Shintarō Tanaka's delicately balanced arching "Heart Mobile" (ハートのモビール, 1966), Takamichi Itō's noise-making viewer-rotated cylinders, and a display of sliding posters, which viewers could manipulate, by designers Kazumasa Nagai, Ikkō Tanaka, and Tadanori Yokoo. Other works featured reflective surfaces, such as designer Awazu's stainless water basin that reflected the surrounding sculptures and visitors against text characters inside the basin, Shin’ya Izumi's sculpture of stacked glass bottles, and 's "Compose" (1966), a set of reflective acrylic concave half-spheres set into boxes at eye level.
https://en.wikipedia.org/wiki?curid=59143735
2,032,304
195,764
UCR is classified among "R1: Doctoral Universities – Very high research activity." The 2022 "U.S. News & World Report" Best Colleges rankings places UCR 83rd nationwide and tied for 33rd among public universities. Over 27 of UCR's academic programs, including the Graduate School of Education and the Bourns College of Engineering, are highly ranked nationally based on peer assessment, student selectivity, financial resources, and other factors. "Washington Monthly" ranked UCR 2nd in the United States in terms of social mobility, research and community service, while "U.S. News" ranks UCR as the fifth most ethnically diverse and, by the number of undergraduates receiving Pell Grants (42 percent), the 15th most economically diverse student body in the nation. Over 70% of all UCR students graduate within six years without regard to economic disparity. UCR's extensive outreach and retention programs have contributed to its reputation as a "university of choice" for minority students. In 2005, UCR became the first public university campus in the nation to offer a gender-neutral housing option.
https://en.wikipedia.org/wiki?curid=230311
195,664
2,113,005
NAAMES scientists developed several novel measurement techniques during the project. For example, sorting flow cytometry combined with bioluminescent detection of ATP and NADH provides relatively precise determination of phytoplankton net primary productivity, growth rate, and biomass. Both laboratory and field tests validated this approach, which does not require traditional carbon-14 isotope incubation techniques. Other NAAMES investigators employed new techniques to measure particle size distribution, which is an important metric of biogeochemistry and ecosystem dynamics. By coupling a submersible laser diffraction particle sizer with a continuously flowing seawater system, scientists were able to accurately measure particle size distribution just as well as more established (but more time- and effort-intensive) methods such as Coulter counter and flow-cytobot. In addition to new oceanographic techniques, the NAAMES team also developed a novel method of collecting cloud water. An aircraft-mounted probe used inertial separation to collect cloud droplets from the atmosphere. Their axial cyclone technique was reported to collect cloud water at a rate of 4.5 ml per minute, which was stored and later analyzed in the lab.
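To make the particle size distribution metric concrete, here is a minimal Python sketch (not NAAMES code; the bin edges, counts, and sample volume are made-up numbers) showing how binned particle counts from a sizing instrument are typically normalized into a number size distribution n(D) = dN/dD per unit sample volume.

```python
# Hypothetical binned counts from a particle sizer; all values are illustrative.
bin_edges_um = [2, 5, 10, 20, 50, 100]     # particle-diameter bin edges (micrometres)
counts = [5200, 1800, 640, 150, 22]        # particles counted in each bin
sample_volume_ml = 250.0                   # volume of seawater sampled

for lo, hi, n in zip(bin_edges_um[:-1], bin_edges_um[1:], counts):
    bin_width_um = hi - lo
    concentration = n / sample_volume_ml   # particles per mL in this size bin
    n_of_d = concentration / bin_width_um  # particles per mL per micrometre
    print(f"{lo:>3}-{hi:<3} um: n(D) = {n_of_d:.3f} per mL per um")
```

A distribution like this, rather than a single mean size, is the kind of product that instruments such as the laser diffraction sizer report.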
https://en.wikipedia.org/wiki?curid=62264747
2,111,790
1,079,974
In 2016, Panchanathan was promoted to Executive Vice President, ASU Knowledge Enterprise Development and Chief Research and Innovation Officer at ASU. In this role, Dr. Panchanathan leads the advancement of research, innovation, entrepreneurship, corporate engagement and strategic partnerships, and international development. Under his leadership, ASU's research has grown exponentially, with annual research expenditures quadrupling to more than half a billion dollars over the past 15 years. Continuing on its path as a rapidly growing research enterprise, Arizona State University reported $635 million in research expenditures for fiscal year 2018, up from $545 million in FY17, according to a recent report by the U.S. National Science Foundation. At that time, ASU was holding its rank at No. 44 for total research expenditures in the U.S., remaining ahead of the California Institute of Technology and the University of Chicago. Among institutions without a medical school, ASU ranked No. 8, ahead of Princeton University and Carnegie Mellon University. In a 2017 Brazilian Congress of Industry Innovation panel discussion, Panchanathan highlighted how universities like ASU ought to work hand-in-hand with businesses to create curriculum that fosters the entrepreneurial traits employers look for today, in order to produce a future of innovation ecosystems. On October 22, 2019, Panchanathan testified before the U.S. Senate Subcommittee on Science, Oceans, Fisheries, and Weather in a hearing titled, "Research and Innovation: Ensuring America's Economic and Strategic Leadership," examining the role that research and innovation play in ensuring U.S. leadership in the global economy.
https://en.wikipedia.org/wiki?curid=40985346
1,079,419
242,880
The use of archetypes in psychology was advanced by Jung in an essay entitled "Instinct and the Unconscious" in 1919. The first element, Greek 'arche', signifies 'beginning, origin, cause, primal source principle'; by extension it can signify 'position of a leader, supreme rule and government'. The second element, 'type', means 'blow or what is produced by a blow, the imprint of a coin ...form, image, prototype, model, order, and norm', ...in the figurative, modern sense, 'pattern underlying form, primordial form'. In his psychological framework, archetypes are innate, universal or personal prototypes for ideas and may be used to interpret observations. The method he favoured was hermeneutics, which was central in his practice of psychology from the start. He made explicit references to hermeneutics in the Collected Works and during his theoretical development of the notion of archetypes. Although he lacks consistency in his formulations, his theoretical development of archetypes is rich in hermeneutic implications, as noted by Smythe and Baydala (2012). A group of memories and attitudes associated with an archetype can become a complex; e.g., a mother complex may be associated with a particular mother archetype. Jung treated the archetypes as psychological organs, analogous to physical ones in that both are morphological givens which probably arose through evolution.
https://en.wikipedia.org/wiki?curid=448370
242,753
1,412,054
HNF-1α is a transcription factor expressed in organs of endoderm origin, including the liver, kidneys, pancreas, intestines, stomach, spleen, thymus, and testis, and in keratinocytes and melanocytes in human skin. It has been shown to affect intestinal epithelial cell growth and cell lineage differentiation. For instance, HNF1A is an important cell-intrinsic transcription factor in adult B lymphopoiesis. The participation of HNF-1α in glucose metabolism and diabetes has been reported, including involvement in GLUT1 and GLUT2 transporter expression in pancreatic β-cells and in angiotensin-converting enzyme 2 gene expression in pancreatic islets. HNF-1α can promote the transcription of several proteins involved in the management of type II diabetes, including dipeptidyl peptidase-IV (DPP-IV/CD26). HNF-1α is also involved in various metabolic pathways of other organs, for example as a transcriptional regulator of bile acid transporters in the intestine and kidneys. HNF-1α is involved in the promotion of hepatic organic cation transporters, which take up certain classes of pharmaceuticals; hence, the loss of its function can lead to drug metabolism problems. In addition, HNF-1α regulates the expression of acute phase proteins, such as fibrinogen, C-reactive protein, and the interleukin 1 receptor, which are involved in inflammation. Moreover, significantly lower levels of HNF-1α have been observed in pancreatic tumors and hepatocellular adenomas than in normal adjacent tissues, suggesting that HNF-1α might play a possible tumor suppressor role.
https://en.wikipedia.org/wiki?curid=14075925
1,411,259
2,028,729
Fankuchen was born in Brooklyn in a family of modest means and went to Cooper Union where he received a BS in 1926. He then studied at Cornell with a Hecksher fellowship and received a PhD in 1933 after working under C.C. Murdock. He then went to Manchester University on a post-doctoral fellowship from the Schweinburg Foundation and worked with Sir Lawrence Bragg (1933–34) followed by two years at Birkbeck College with J.D. Bernal during which time they examined the structure of chymotrypsin and haemoglobin. He also collaborated with Bernal's student Dorothy Hodgkin (then Dorothy Crowfoot) in studying steroids. He then returned to the United States and worked on protein chemistry at the Massachusetts Institute of Technology. He then joined the Anderson Institute for Biological Research, Red Wing, Minnesota serving as an assistant director (1941–42). In 1941 he was elected a Fellow of the American Physical Society. From 1942 until his death in 1964 he was a faculty member at the Polytechnic Institute of Brooklyn. He was influential in training a generation of crystallographers and organized a monthly seminar group called the "Point Group". His courses produced so-called "Fan's two-week wonders" who learned crystallography and knew what could be done with it.
https://en.wikipedia.org/wiki?curid=63885122
2,027,561
1,843,065
In 1985, Gulf Oil was acquired by Chevron Corporation, which maintained its own research facilities in Richmond, California. By then the complex had grown to 54 multi-story lab buildings and employed nearly 2,000 scientists and engineers with an annual operating budget of over $100 million. The University of Pittsburgh proposed that it would be able to maintain and operate the facility in order to keep the center open for the benefit of the region. Gulf and Chevron agreed to the university's proposal and donated the site, valued at $100 million including the fully furnished and equipped laboratories, a computer telecommunications center, an executive office building, and unique facilities such as a large cold room containing a wind tunnel. Chevron also added a $3 million start-up grant, and the Commonwealth of Pennsylvania added a $3 million matching grant for economic development. The donation was announced by university Chancellor Wesley Posvar at a press conference in April 1985. The university took over the facility in early 1986 and renamed it the University of Pittsburgh Applied Research Center. On March 17, 1986, the university signed its first major tenant, General Motors Corporation, to a four-year, $13 million contract and in two years was sheltering 80 small businesses.
https://en.wikipedia.org/wiki?curid=22828647
1,842,012
333,068
The Accuracy International receiver is bolted with four screws and permanently bonded with epoxy material to the aluminium chassis, and was designed for ruggedness, simplicity and ease of operation. To this end, the heavy-walled, flat-bottomed, flat-sided receiver is a stressed part, machined in-house by AI from a solid piece of forged carbon steel. AW rifles are supplied in two action lengths—standard AW (short) and long SM (magnum). The six bolt lugs, arranged in two rows of three, engage a heat-treated steel locking ring insert pinned inside the front bridge of the action. The ring can be removed and replaced to refresh headspace control on older actions. The AW system's cast steel bolt has gas relief holes in the bolt body and front action bridge, allowing high-pressure gases a channel of escape in the event of a cartridge-case head failure. To guard against penetrating water or dirt, the bolt has milled slots, which also prevent freezing or similar disturbances. Unlike conventional bolt-action rifles, the bolt handle is bent to the rear, which eases the repeating procedure for the operator and reduces the contour of the weapon. The action cocks on opening with a short, 60 degree bolt throw and has a non-rotating (fixed) external extractor and an internal ejector. Firing pin travel is kept short to keep lock times to a minimum. Finally, an integral dovetail rail located above the receiver is designed to accommodate different types of optical or electro-optical sights. As an option, a MIL-STD-1913 rail (Picatinny rail) can be permanently pinned, bonded and bolted to the action, providing a standard interface for many optical systems.
https://en.wikipedia.org/wiki?curid=737928
332,890
418,451
Psychophysiology measures exist in multiple domains: reports, electrophysiological studies, studies in neurochemistry, neuroimaging and behavioral methods. Evaluative reports involve participant introspection and self-ratings of internal psychological states or physiological sensations, such as self-report of arousal levels on the self-assessment manikin, or measures of interoceptive visceral awareness such as heartbeat detection. Merits of self-report include an emphasis on accurately understanding the participants' subjective experience and their perception; however, its pitfalls include the possibility of participants misunderstanding a scale or incorrectly recalling events. Physiological responses also can be measured via instruments that read bodily events such as heart rate change, electrodermal activity (EDA), muscle tension, and cardiac output. Many indices are part of modern psychophysiology, including brain waves (electroencephalography, EEG), fMRI (functional magnetic resonance imaging), electrodermal activity (a standardized term encompassing skin conductance response, SCR, and galvanic skin response, GSR), cardiovascular measures (heart rate, HR; beats per minute, BPM; heart rate variability, HRV; vasomotor activity), muscle activity (electromyography, EMG), electrogastrogram (EGG), changes in pupil diameter with thought and emotion (pupillometry), eye movements, recorded via the electro-oculogram (EOG) and direction-of-gaze methods, and cardiodynamics, recorded via impedance cardiography. These measures are beneficial because they provide accurate and perceiver-independent objective data recorded by machinery. The downsides, however, are that any physical activity or motion can alter responses, and basal levels of arousal and responsiveness can differ among individuals and even between situations.
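As a concrete illustration of one of the cardiovascular indices listed above, the short sketch below computes a mean heart rate and RMSSD (a common time-domain heart rate variability statistic) from a series of R-R inter-beat intervals; the interval values are invented for demonstration and are not taken from any study.

```python
# Illustrative sketch: two common heart metrics from a hypothetical
# series of R-R (inter-beat) intervals in milliseconds.
import statistics

rr_intervals_ms = [812, 798, 840, 825, 810, 795, 830, 845, 820, 805]  # invented data

# Mean heart rate in beats per minute (BPM).
mean_hr_bpm = 60_000 / statistics.mean(rr_intervals_ms)

# RMSSD: root mean square of successive differences, a standard
# time-domain index of heart rate variability (HRV).
diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5

print(f"Mean HR: {mean_hr_bpm:.1f} BPM, RMSSD: {rmssd:.1f} ms")
```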
https://en.wikipedia.org/wiki?curid=23732
418,246
104,002
In late 2004, Indonesia faced a 'mini-crisis' due to rises in international oil prices and imports. The currency exchange rate reached Rp 12,000 per US dollar before stabilizing. Under President Susilo Bambang Yudhoyono (SBY), the government was forced to cut its massive fuel subsidies, which were planned to cost $14 billion, in October 2005. This more than doubled the price of consumer fuels, resulting in double-digit inflation. The situation had stabilized but the economy continued to struggle with inflation at 17% in late 2005. The economic outlook became more positive as the 2000s progressed. Growth accelerated to 5.1% in 2004 and reached 5.6% in 2005. Real per capita income returned to its 1996–1997 levels. Growth was driven primarily by domestic consumption, which accounts for roughly three-fourths of Indonesia's gross domestic product (GDP). The Jakarta Stock Exchange was the best-performing market in Asia in 2004, up by 42%. Problems that continue to put a drag on growth include low foreign investment levels, bureaucratic red tape, and widespread corruption, which costs Rp 51.4 trillion (US$5.6 billion) or approximately 1.4% of GDP annually. However, there was robust economic optimism following the conclusion of the peaceful 2004 elections.
https://en.wikipedia.org/wiki?curid=14647
103,957
957,910
The argument from intelligent design appears to have begun with Socrates, although the concept of a cosmic intelligence is older and David Sedley has argued that Socrates was developing an older idea, citing Anaxagoras of Clazomenae, born about 500 BC, as a possible earlier proponent. The proposal that the order of nature showed evidence of having its own human-like "intelligence" goes back to the origins of Greek natural philosophy and science, and its attention to the orderliness of nature, often with special reference to the revolving of the heavens. Anaxagoras is the first person who is definitely known to have explained such a concept using the word ""nous"" (which is the original Greek term that leads to modern English "intelligence" via its Latin and French translations). Aristotle reports an earlier philosopher from Clazomenae named Hermotimus who had taken a similar position. Amongst Pre-Socratic philosophers before Anaxagoras, other philosophers had proposed a similar intelligent ordering principle causing life and the rotation of the heavens. For example Empedocles, like Hesiod much earlier, described cosmic order and living things as caused by a cosmic version of love, and Pythagoras and Heraclitus attributed the cosmos with "reason" ("logos"). In his "Philebus" 28c Plato has Socrates speak of this as a tradition, saying that "all philosophers agree—whereby they really exalt themselves—that mind ("nous") is king of heaven and earth. Perhaps they are right." and later states that the ensuing discussion "confirms the utterances of those who declared of old that mind ("nous") always rules the universe".
https://en.wikipedia.org/wiki?curid=30731
957,404
647,053
Structure-from-motion photogrammetry with multi-view stereo provides hyperscale landform models using images acquired from a range of digital cameras and optionally a network of ground control points. The technique is not limited in temporal frequency and can provide point cloud data comparable in density and accuracy to those generated by terrestrial and airborne laser scanning at a fraction of the cost. Structure from motion is also useful in remote or rugged environments where terrestrial laser scanning is limited by equipment portability and airborne laser scanning is limited by terrain roughness causing loss of data and image foreshortening. The technique has been applied in many settings such as rivers, badlands, sandy coastlines, fault zones, landslides, and coral reefs. SfM has also been successfully applied for the assessment of large wood accumulation volume and porosity in fluvial systems, as well as for the characterization of rock masses through the determination of properties such as the orientation and persistence of discontinuities. A full range of digital cameras can be utilized, including digital SLRs, compact digital cameras and even smart phones. Generally, though, higher-accuracy data will be achieved with more expensive cameras, which include lenses of higher optical quality. The technique therefore offers exciting opportunities to characterize surface topography in unprecedented detail and, with multi-temporal data, to detect elevation, position and volumetric changes that are symptomatic of earth surface processes. Structure from motion can be placed in the context of other digital surveying methods.
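To make the pipeline concrete, here is a minimal two-view structure-from-motion sketch using OpenCV: feature matching, relative pose estimation, and triangulation into a sparse point cloud. The image file names and the camera intrinsic matrix are placeholder assumptions, and a production SfM/multi-view-stereo system would add many more views, bundle adjustment, ground control and dense matching on top of this.

```python
# Two-view SfM sketch (assumed inputs: overlapping photos and rough intrinsics).
import cv2
import numpy as np

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[2000.0, 0, 960], [0, 2000.0, 540], [0, 0, 1]])  # assumed intrinsics

# 1. Detect and match local features between the two overlapping photos.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]   # Lowe ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 2. Recover the relative camera motion (the "motion" in structure from motion).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate matched points into a sparse 3-D point cloud (the "structure").
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T
print(f"{len(good)} matches -> {cloud.shape[0]} triangulated points")
```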
https://en.wikipedia.org/wiki?curid=5386671
646,713
829,379
As a concrete example of a classic screen scraper, consider a hypothetical legacy system dating from the 1960s—the dawn of computerized data processing. Computer-to-user interfaces from that era were often simply text-based dumb terminals which were not much more than virtual teleprinters (such systems are still in use, for various reasons). The desire to interface such a system to more modern systems is common. A robust solution will often require things no longer available, such as source code, system documentation, APIs, or programmers with experience in a 50-year-old computer system. In such cases, the only feasible solution may be to write a screen scraper that "pretends" to be a user at a terminal. The screen scraper might connect to the legacy system via Telnet, emulate the keystrokes needed to navigate the old user interface, process the resulting display output, extract the desired data, and pass it on to the modern system. A sophisticated and resilient implementation of this kind, built on a platform providing the governance and control required by a major enterprise—e.g. change control, security, user management, data protection, operational audit, load balancing, and queue management—could be said to be an example of robotic process automation software, called RPA or RPAAI for self-guided RPA 2.0 based on artificial intelligence.
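A minimal sketch of that "pretend to be a user at a terminal" approach is shown below: drive a hypothetical text-based application over Telnet, send the keystrokes an operator would type, and parse a field out of the returned screen text. The host name, prompts, commands and field layout are all invented for illustration; Python's telnetlib module (removed from the standard library in 3.13, so a third-party client may be needed on newer versions) is used only because it is the simplest way to show the idea.

```python
# Hypothetical Telnet screen-scraping sketch; host, prompts and fields are invented.
import re
import telnetlib

HOST = "legacy-mainframe.example.com"

with telnetlib.Telnet(HOST, 23, timeout=10) as tn:
    tn.read_until(b"LOGIN:")           # wait for the sign-on prompt
    tn.write(b"OPERATOR\r\n")
    tn.read_until(b"PASSWORD:")
    tn.write(b"SECRET\r\n")

    tn.read_until(b"READY>")           # main menu prompt of the old system
    tn.write(b"DISPLAY CUSTOMER 00042\r\n")
    screen = tn.read_until(b"READY>").decode("ascii", errors="replace")

# "Scrape" the desired value out of the fixed-format screen text.
match = re.search(r"BALANCE:\s+([\d.,-]+)", screen)
balance = match.group(1) if match else None
print("Extracted balance:", balance)
```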
https://en.wikipedia.org/wiki?curid=46471245
828,933
126,834
Crystal structures of proteins (which are irregular and hundreds of times larger than cholesterol) began to be solved in the late 1950s, beginning with the structure of sperm whale myoglobin by Sir John Cowdery Kendrew, for which he shared the Nobel Prize in Chemistry with Max Perutz in 1962. Since that success, over 130,000 X-ray crystal structures of proteins, nucleic acids and other biological molecules have been determined. The nearest competing method in number of structures analyzed is nuclear magnetic resonance (NMR) spectroscopy, which has resolved less than one tenth as many. Crystallography can solve structures of arbitrarily large molecules, whereas solution-state NMR is restricted to relatively small ones (less than 70 kDa). X-ray crystallography is used routinely to determine how a pharmaceutical drug interacts with its protein target and what changes might improve it. However, intrinsic membrane proteins remain challenging to crystallize because they require detergents or other denaturants to solubilize them in isolation, and such detergents often interfere with crystallization. Membrane proteins are a large component of the genome, and include many proteins of great physiological importance, such as ion channels and receptors. Helium cryogenics are used to prevent radiation damage in protein crystals.
https://en.wikipedia.org/wiki?curid=34151
126,782
825,271
Surface plasmons are those plasmons that are confined to surfaces and that interact strongly with light, resulting in a polariton. They occur at the interface of a material exhibiting a positive real part of its relative permittivity, i.e. dielectric constant (e.g. vacuum, air, glass and other dielectrics), and a material whose real part of permittivity is negative at the given frequency of light, typically a metal or a heavily doped semiconductor. In addition to the opposite sign of the real part of the permittivity, the magnitude of the real part of the permittivity in the negative permittivity region should typically be larger than the magnitude of the permittivity in the positive permittivity region, otherwise the light is not bound to the surface (i.e. the surface plasmons do not exist), as shown in the famous book by Heinz Raether. At visible wavelengths of light, e.g. the 632.8 nm wavelength provided by a He-Ne laser, interfaces supporting surface plasmons are often formed by metals like silver or gold (negative real part permittivity) in contact with dielectrics such as air or silicon dioxide. The particular choice of materials can have a drastic effect on the degree of light confinement and propagation distance due to losses. Surface plasmons can also exist on interfaces other than flat surfaces, such as particles, rectangular strips, v-grooves, cylinders, and other structures. Many structures have been investigated due to the capability of surface plasmons to confine light below the diffraction limit of light. One simple structure that was investigated was a multilayer system of copper and nickel. Mladenovic "et al." report the use of the multilayers as if they were a single plasmonic material. Oxidation of the copper is prevented by the addition of the nickel layers. Using copper as the plasmonic material offers an easy path to plasmonic integration because, along with nickel, it is the most common choice for metallic plating. The multilayers serve as a diffraction grating for the incident light. Up to 40 percent transmission can be achieved at normal incidence with the multilayer system, depending on the thickness ratio of copper to nickel. Therefore, the use of already popular metals in a multilayer structure proves to be a solution for plasmonic integration.
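The binding condition described above follows from the textbook surface-plasmon-polariton dispersion relation for a single metal/dielectric interface, k_spp = (ω/c)·√(ε_m ε_d/(ε_m + ε_d)). The sketch below evaluates it at the He-Ne wavelength of 632.8 nm for a gold/air interface; the complex permittivity used for gold is only an approximate literature value, quoted for illustration.

```python
# SPP wavevector, mode wavelength and propagation length at a gold/air interface.
import numpy as np

c = 2.998e8                      # speed of light, m/s
wavelength = 632.8e-9            # He-Ne laser line, m
omega = 2 * np.pi * c / wavelength

eps_metal = -11.6 + 1.2j         # approximate permittivity of gold near 633 nm (assumed)
eps_dielectric = 1.0             # air

# Bound SPP requires Re(eps_metal) < 0 and |Re(eps_metal)| > eps_dielectric.
k_spp = (omega / c) * np.sqrt(eps_metal * eps_dielectric /
                              (eps_metal + eps_dielectric))

spp_wavelength = 2 * np.pi / k_spp.real       # shorter than the free-space wavelength
propagation_length = 1 / (2 * k_spp.imag)     # 1/e intensity decay length

print(f"SPP wavelength: {spp_wavelength*1e9:.1f} nm")
print(f"Propagation length: {propagation_length*1e6:.1f} um")
```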
https://en.wikipedia.org/wiki?curid=231630
824,828
586,533
Various genetic discoveries have been essential in the development of genetic engineering. Genetic inheritance was first discovered by Gregor Mendel in 1865 following experiments crossing peas. Although largely ignored for 34 years, he provided the first evidence of hereditary segregation and independent assortment. In 1889 Hugo de Vries came up with the name "(pan)gene" after postulating that particles are responsible for inheritance of characteristics, and the term "genetics" was coined by William Bateson in 1905. In 1928 Frederick Griffith proved the existence of a "transforming principle" involved in inheritance, which Avery, MacLeod and McCarty later (1944) identified as DNA. Edward Lawrie Tatum and George Wells Beadle developed the central dogma that genes code for proteins in 1941. The double helix structure of DNA was identified by James Watson and Francis Crick in 1953. As well as discovering how DNA works, tools had to be developed that allowed it to be manipulated. In 1970 Hamilton Smith's lab discovered restriction enzymes that allowed DNA to be cut at specific places and separated out on an electrophoresis gel. This enabled scientists to isolate genes from an organism's genome. DNA ligases, which join broken DNA together, had been discovered earlier in 1967, and by combining the two enzymes it was possible to "cut and paste" DNA sequences to create recombinant DNA. Plasmids, discovered in 1952, became important tools for transferring information between cells and replicating DNA sequences. Frederick Sanger developed a method for sequencing DNA in 1977, greatly increasing the genetic information available to researchers. Polymerase chain reaction (PCR), developed by Kary Mullis in 1983, allowed small sections of DNA to be amplified and aided identification and isolation of genetic material.
https://en.wikipedia.org/wiki?curid=37214939
586,233
1,312,653
Larval mites of the genus "Arrenurus" are also common ectoparasites of many mosquito species. In contrast to "P. barbigera", "Arrenurus" mites are fully aquatic and prefer permanent habitats, such as swamps and marshes. Females lay eggs in protected areas hidden among the abundant vegetation of these habitats, and upon hatching, larvae can be found swimming throughout the upper water column in search of hosts. Once an immature host is located, "Arrenurus" larvae loosely bind to their integument, and monitor them until the adult emerges. Host muscle contractions just prior to emergence stimulate mite larvae to move towards the ecdysial opening and attach to the host along intersegmental sutures on their thorax and abdomen. Differences in preferred attachment site between mite species appear to be related to differences in host emergence behavior. Full larval engorgement takes approximately three days, during which they have the potential to significantly impact the health of their host. In laboratory settings, the survival of "Anopheles crucians" mosquitoes parasitized by "Arrenurus (Meg.) pseudotenuicollis" was found to decrease from 23.32 to 6.25 days between those harboring the least and greatest numbers of attached mites, respectively. Under similar conditions, infection intensities equalling 17-32 mites decreased the number of eggs laid by gravid "An. crucians" by nearly 100%. High mite loads also significantly decreased the fecundity of field-collected "An. crucians", but to a lesser extent than those infected in the lab. Similar consequences of high "Arrenurus" mite infection intensities were observed in other host-mite relationships. For example, Smith and McIver (1984) found that "Arrenurus danbyensis" loads of greater than 5 mites decreased the fecundity of "Coquillettidia perturbans" females by approximately 3.5 eggs per additional mite. Even though "Arrenurus" mite larvae have been considered potential biocontrol agents, unrealistic numbers would need to be released in order to prove effective on their own.
https://en.wikipedia.org/wiki?curid=26945207
1,311,934
1,303,843
Leonard Hayflick (born 20 May 1928) is a Professor of Anatomy at the UCSF School of Medicine, and was Professor of Medical Microbiology at Stanford University School of Medicine. He is a past president of the Gerontological Society of America and was a founding member of the council of the National Institute on Aging (NIA). The recipient of a number of research prizes and awards, including the 1991 Sandoz Prize for Gerontological Research, he has studied the aging process for more than fifty years. He is known for discovering that normal human cells divide a limited number of times "in vitro" (refuting the contention by Alexis Carrel that normal body cells are immortal). This is known as the Hayflick limit. His discoveries overturned a 60-year-old dogma that all cultured cells are immortal. Hayflick demonstrated that normal cells have a memory and can remember what doubling level they have reached. He demonstrated that his normal human cell strains were free from contaminating viruses. His cell strain WI-38 soon replaced primary monkey kidney cells and became the substrate for the production of most of the world's human virus vaccines. Hayflick discovered that the etiological agent of primary atypical pneumonia (also called "walking pneumonia") was not a virus as previously believed. He was the first to cultivate the causative organism called a mycoplasma, the smallest free-living organism, which Hayflick isolated on a unique culture medium that bears his name. He named the organism "Mycoplasma pneumoniae".
https://en.wikipedia.org/wiki?curid=482990
1,303,127
473,040
The engine's displacement is and it has a bore/stroke ratio of , attaining cylinder pressures of and a compression ratio of 16.0:1. It uses an aluminum engine block, die-cast aluminum bedplate, and an aluminum cylinder head. A chain driven dual overhead camshaft, employing weight-saving hollow sections and lobes, operates four valves per cylinder with low-friction, hydraulic roller finger followers. The pistons are made from aluminum for reduced reciprocating mass, feature a concave, shallow-bowl profile to facilitate efficient combustion, and are cooled by under-skirt oil spraying. The crankshaft employs four counterweights to minimize mass, and both it and the con-rods are made of forged steel. The engine features multiple improvements to reduce NVH, such as a cam cover made of GRP and fully decoupled from the engine to reduce noise and vibration, while also saving weight compared to aluminum; a composite intake manifold encapsulated in acoustic padding as well as an external plastic shield that both significantly reduce noise emissions; a mechanical crankshaft isolator which reduces radiated noise and torsional vibrations in the accessory drive system; and scissor gears for the timing drive system, incorporating tooth profiles ground with a Low Noise Shifting (LNS) process for optimal noise reduction. More than 150 patented diesel control functions are deployed by the engine's ECU, which was developed in-house by General Motors and jointly engineered in Italy (by GM Powertrain Torino), Germany, and the United States, and will be used in all future GM four-cylinder diesel engines.
https://en.wikipedia.org/wiki?curid=41614721
472,804
1,664,686
The technology used today to extract bitumen from the Athabasca oil sands is the water-intensive steam-assisted gravity drainage (SAGD) method. In the late 1990s former nuclear engineer Bill Heins of General Electric Company's RCC Thermal Products conceived an evaporator technology called falling film or mechanical vapor compression evaporation. In 1999 and 2002, Petro-Canada's MacKay River facility was the first to install GE SAGD zero-liquid discharge (ZLD) systems, using a combination of the new evaporative technology and a crystallizer system in which all the water was recycled and only solids were discharged off site. This new evaporative technology began to replace older water treatment techniques employed by SAGD facilities, which involved the use of warm lime softening to remove silica and magnesium and weak acid cation ion exchange used to remove calcium. The vapor-compression evaporation process replaced the once-through steam generators (OTSG) traditionally used for steam production. OTSGs generally ran on natural gas, which by 2008 had become increasingly valuable. The water quality from evaporators is four times better, which is needed for the drum boilers. The evaporators, when coupled with standard drum boilers, produce steam which is more "reliable, less costly to operate, and less water-intensive." By 2008 about 85 per cent of SAGD facilities in the Alberta oil sands had adopted evaporative technology. "SAGD, unlike other thermal processes such as cyclic steam stimulation (CSS), requires 100 per cent quality steam."
https://en.wikipedia.org/wiki?curid=5634651
1,663,749
327,737
"On the Origin of Species" was first published on Thursday 24 November 1859, priced at fifteen shillings with a first printing of 1250 copies. The book had been offered to booksellers at Murray's autumn sale on Tuesday 22 November, and all available copies had been taken up immediately. In total, 1,250 copies were printed but after deducting presentation and review copies, and five for Stationers' Hall copyright, around 1,170 copies were available for sale. Significantly, 500 were taken by Mudie's Library, ensuring that the book promptly reached a large number of subscribers to the library. The second edition of 3,000 copies was quickly brought out on 7 January 1860, and incorporated numerous corrections as well as a response to religious objections by the addition of a new epigraph on page ii, a quotation from Charles Kingsley, and the phrase "by the Creator" added to the closing sentence. During Darwin's lifetime the book went through six editions, with cumulative changes and revisions to deal with counter-arguments raised. The third edition came out in 1861, with a number of sentences rewritten or added and an introductory appendix, "An Historical Sketch of the Recent Progress of Opinion on the Origin of Species". In response to objections that the origin of life was unexplained, Darwin pointed to acceptance of Newton's law even though the cause of gravity was unknown, and Leibnitz had accused Newton of introducing "occult qualities & miracles". The fourth edition in 1866 had further revisions. The fifth edition, published on 10 February 1869, incorporated more changes and for the first time included the phrase "survival of the fittest", which had been coined by the philosopher Herbert Spencer in his "Principles of Biology" (1864).
https://en.wikipedia.org/wiki?curid=29932
327,563
1,489,614
Born on 2 January 1962 at Hili, a border town near Balurghat in Dakshin Dinajpur district in the Indian state of West Bengal, to the couple Kalachand Kundu and Dipali, Tapas Kumar Kundu graduated in agriculture (BSc hons) from Bidhan Chandra Krishi Viswavidyalaya in 1986 and completed his master's degree in biochemistry at the University of Agricultural Sciences, Bangalore in 1989, winning a gold medal for standing first in the master's degree examination. He enrolled for doctoral studies at the Indian Institute of Science in 1990 and secured a PhD under the guidance of M. R. S. Rao in 1995 for his thesis, "Zinc-Metalloprotein Nature of Rat Spermatidal Protein TP2 and its Interactions with DNA". After a short stint at the National Institute of Genetics, Japan during 1995–96 as a visiting foreign research associate, he did his post-doctoral studies at the Laboratory of Biochemistry and Molecular Biology of Rockefeller University during 1996–99. Returning to India the same year, he joined the Jawaharlal Nehru Centre for Advanced Scientific Research at its Molecular Biology and Genetics Unit, where he became an assistant professor in 2005 and has headed the Transcription and Disease Laboratory of the institution as a JNCASR Silver Jubilee Professor since 2015.
https://en.wikipedia.org/wiki?curid=52066021
1,488,775
1,399,819
What all primary rodent prey species share is that they are either nocturnal or crepuscular and are often common to abundant in openings of woods and rocky environments. The reason Eurasian eagle-owls take rats, which associate rather closely with humans, much more regularly than they take house mice ("Mus musculus") is not their larger size (many other rodents regularly taken by eagle-owls are as small or smaller than a house mouse) but the rat's ability to flourish along the edges of crop fields, refuse dumps, roadsides and abandoned buildings and fields, whereas the mice are more closely tied to active human habitations. A single refuse dump can host several thousand rats, and eagle-owl territories that include dumps can locally expect higher productivity due to the abundance of this food source. Squirrels are readily predated but, being diurnal, are largely unavailable to this and other owls. When they are captured, it is presumably around dawn or dusk from their tree hole, in species such as the widespread red squirrel ("Sciurus vulgaris"), or burrow entrance, in the case of ground squirrels, most regularly recorded as prey in Asia Minor and Central Asia. A few specimens of the woolly flying squirrel ("Eupetaurus cinereus"), the largest of the world's flying squirrels and one of the rarest, were the only identified foods for Eurasian eagle-owls in northern Pakistan alongside some European hares. Although not common in the diet, a few species of marmot have been taken and may be an important aspect of the prey biomass. In Europe, adult-sized alpine marmots ("Marmota marmota") have been killed. However, the average weight of Tarbagan marmots ("Marmota sibirica") taken in Mongolia was only and that of alpine marmots in Engadin, Switzerland was only . These were obviously young marmots, as adults range in average weight from spring lows to pre-hibernation weights during autumn hyperphagia. Other than marmots, the largest rodents known to be hunted by Eurasian eagle-owls are the introduced aquatic rodents, the muskrat ("Ondatra zibethica"), which one study estimated to average when taken, and the nutria ("Myocastor coypus"), equal in size to the large marmots.
https://en.wikipedia.org/wiki?curid=45597481
1,399,041
669,331
The processing of the chosen material for constructing submersible research vehicles guides much of the rest of the construction process. For example, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) employs several Autonomous Underwater Vehicles (AUVs) with varied construction. The most commonly used metals for constructing the high-pressure vessels of these craft are wrought alloys of aluminum, steel, and titanium. Aluminum is chosen for medium-depth operations where extremely high strength is not necessary. Steel is an extremely well-understood material which can be tuned to have incredible yield strength and yield stress. It is an excellent material for resisting the extreme pressures of the sea but has a very high density that limits the size of steel pressure vessels due to weight concerns. Titanium is nearly as strong as steel and three times as light. It seems like the obvious choice to use but has several issues of its own. Firstly, it is much more costly and difficult to work with titanium, and improper processing can lead to substantial flaws. To add features such as viewports to a pressure vessel, delicate machining operations must be used, which carry a risk in titanium. The Deepsea Challenger, for example, used a sphere of steel to house its pilot. This sphere is estimated to be able to withstand 23,100 psi of hydrostatic pressure, which is roughly equivalent to an ocean depth of 52,000 feet, far deeper than Challenger Deep. Smaller titanium spheres were used to house many of the vessel’s electronics, as the smaller size lowered the risk of catastrophic failure.
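As a quick sanity check of the figures quoted for the Deepsea Challenger sphere, the snippet below converts 23,100 psi to pascals and solves the hydrostatic relation P = ρgh for depth, using a nominal seawater density; the result lands close to the roughly 52,000 ft stated above.

```python
# Hydrostatic depth corresponding to 23,100 psi (nominal seawater density assumed).
PSI_TO_PA = 6894.76
RHO_SEAWATER = 1025.0   # kg/m^3, nominal
G = 9.81                # m/s^2

pressure_pa = 23_100 * PSI_TO_PA
depth_m = pressure_pa / (RHO_SEAWATER * G)   # P = rho * g * h  ->  h = P / (rho * g)
depth_ft = depth_m / 0.3048

print(f"{pressure_pa/1e6:.1f} MPa ~ {depth_m:.0f} m ~ {depth_ft:.0f} ft of seawater")
```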
https://en.wikipedia.org/wiki?curid=11553912
668,981
1,800,292
In a number of applications, there is a so-called King effect where the top-ranked item(s) have a significantly greater frequency or size than the model predicts on the basis of the other items. The Laherrère/Deheuvels paper shows the example of Paris, when sorting the sizes of towns in France. When the paper was written Paris was the largest city with about ten million inhabitants, but the next largest town had only about 1.5 million. Towns in France excluding Paris closely follow a parabolic distribution, well enough that the 56 largest gave a very good estimate of the population of the country. But that distribution would predict the largest city to have about two million inhabitants, not 10 million. The King Effect is named after the notion that a King must defeat all rivals for the throne and takes their wealth, estates and power, thereby creating a buffer between himself and the next-richest of his subjects. That specific effect (intentionally created) may apply to corporate sizes, where the largest businesses use their wealth to buy up smaller rivals. Absent intent, the King Effect may occur as a result of some persistent growth advantage due to scale, or to some unique advantage. Larger cities are more efficient connectors of people, talent and other resources. Unique advantages might include being a port city, or a Capital city where law is made, or a center of activity where physical proximity increases opportunity and creates a feedback loop. An example is the motion picture industry; where actors, writers and other workers move to where the most studios are, and new studios are founded in the same place because that is where the most talent resides.
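A simple way to make the King effect visible numerically is to fit a rank-size (log-log) relationship to every city except the largest and compare the extrapolated rank-1 prediction with the actual value, as sketched below. The population figures are rough, invented round numbers in the spirit of the French example, not real census data.

```python
# Rank-size fit excluding the "king", then compare prediction vs. reality.
import numpy as np

populations = np.array([10_000_000, 1_500_000, 1_000_000, 800_000, 600_000,
                        500_000, 450_000, 400_000, 350_000, 300_000])   # invented
ranks = np.arange(1, len(populations) + 1)

# Fit log(size) = a + b*log(rank) on ranks 2..N, i.e. excluding the largest city.
b, a = np.polyfit(np.log(ranks[1:]), np.log(populations[1:]), 1)
predicted_rank1 = np.exp(a)   # extrapolate the fitted line back to rank 1

print(f"Predicted largest city: {predicted_rank1:,.0f}")
print(f"Actual largest city:    {populations[0]:,}")
print(f"King ratio (actual/predicted): {populations[0] / predicted_rank1:.1f}x")
```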
https://en.wikipedia.org/wiki?curid=1096151
1,799,283
702,159
1988 also saw the debut of Telenet Japan's "Exile", a series of action-platform RPGs, beginning with "XZR: Idols of Apostate". The series was controversial for its plot, which revolves around a time-traveling Crusades-era Syrian Islamic Assassin who assassinates various religious/historical figures as well as modern-day political leaders, with similarities to the present-day "Assassin's Creed" action game series. The gameplay of "Exile" included both overhead exploration and side-scrolling combat, featured a heart monitor to represent the player's Attack Power and Armour Class statistics, and another controversial aspect of the game involved taking drugs (instead of potions) that increase/decrease attributes but with side-effects such as affecting the heart-rate or causing death. An early attempt at incorporating a point-and-click interface in a real-time overhead action RPG was "Silver Ghost", a 1988 NEC PC-8801 game by Kure Software Koubou. It was an action-strategy RPG where characters could be controlled using a cursor. It was cited by Camelot Software Planning's Hiroyuki Takahashi as inspiration for the "Shining" series of tactical RPGs. According to Takahashi, "Silver Ghost" was "a simulation action type of game where you had to direct, oversee and command multiple characters." Unlike later tactical RPGs, however, "Silver Ghost" was not turn-based, but instead used real-time strategy and action role-playing game elements. A similar game released by Kure Software Koubou that same year was "First Queen", a unique hybrid between a real-time strategy, action RPG, and strategy RPG. Like an RPG, the player can explore the world, purchase items, and level up, and like a strategy video game, it focuses on recruiting soldiers and fighting against large armies rather than small parties. The game's "Gochyakyara" ("Multiple Characters") system let the player control one character at a time while the others are controlled by computer AI that follow the leader, and where battles are large-scale with characters sometimes filling an entire screen.
https://en.wikipedia.org/wiki?curid=32408675
701,794
277,471
Though diverse in chemical properties and functions, neurotoxins share the common property that they act by some mechanism leading to either the disruption or destruction of necessary components within the nervous system. Neurotoxins, however, by their very design can be very useful in the field of neuroscience. As the nervous system in most organisms is both highly complex and necessary for survival, it has naturally become a target for attack by both predators and prey. As venomous organisms often use their neurotoxins to subdue a predator or prey very rapidly, toxins have evolved to become highly specific to their target channels such that the toxin does not readily bind other targets. As such, neurotoxins provide an effective means by which certain elements of the nervous system may be accurately and efficiently targeted. An early example of neurotoxin-based targeting used radiolabeled tetrodotoxin to assay sodium channels and obtain precise measurements about their concentration along nerve membranes. Likewise, through isolation of certain channel activities, neurotoxins have provided the ability to improve the original Hodgkin-Huxley model of the neuron, in which it was theorized that single generic sodium and potassium channels could account for most nervous tissue function. From this basic understanding, the use of common compounds such as tetrodotoxin, tetraethylammonium, and bungarotoxins has led to a much deeper understanding of the distinct ways in which individual neurons may behave.
https://en.wikipedia.org/wiki?curid=326357
277,321
916,625
By far the largest military action in which the United States engaged during this era was the War of 1812. With Britain locked in a major war with Napoleonic France, its policy was to block American shipments to France. The United States sought to remain neutral while pursuing overseas trade. Britain cut the trade and impressed seamen on American ships into the Royal Navy, despite intense protests. Britain supported an Indian insurrection in the American Midwest, with the goal of creating an Indian barrier state there that would block American expansion. The United States finally declared war on the United Kingdom in 1812, the first time the U.S. had officially declared war. Not hopeful of defeating the Royal Navy, the U.S. attacked the British Empire by invading British Canada, hoping to use captured territory as a bargaining chip. The invasion of Canada was a debacle, though concurrent wars with Native Americans on the western front (Tecumseh's War and the Creek War) were more successful. After defeating Napoleon in 1814, Britain sent large veteran armies to invade New York, raid Washington and capture the key control of the Mississippi River at New Orleans. The New York invasion was a fiasco after the much larger British Army retreated to Canada. The raiders succeeded in the burning of Washington on 25 August 1814, but were repulsed in their Chesapeake Bay Campaign at the Battle of Baltimore and the British commander killed. The major invasion in Louisiana was stopped by a one-sided military battle that killed the top three British generals and thousands of soldiers. The winners were the commanding general of the Battle of New Orleans, Major General Andrew Jackson, who later became president, and the Americans, who basked in a victory over a much more powerful nation. The peace treaty proved successful, and the U.S. and Britain never again went to war. The losers were the Indians, who never gained the independent territory in the Midwest promised by Britain.
https://en.wikipedia.org/wiki?curid=161323
916,142
536,285
In 1948, Valentine Bargmann and Eugene Wigner proposed the relativistic Bargmann–Wigner equations to describe free particles and the equations are in the form of multi-component spinor field wavefunctions. In 1950, William Vallance Douglas Hodge presented "The topological invariants of algebraic varieties" at the Proceedings of the International Congress of Mathematicians. Between 1954 and 1957, Eugenio Calabi worked on the Calabi conjecture for Kähler metrics and the development of Calabi–Yau manifolds. In 1957, Tullio Regge formulated the mathematical property of potential scattering in the Schrödinger equation. Stanley Mandelstam, along with Regge, did the initial development of the Regge theory of strong interaction phenomenology. In 1958, Murray Gell-Mann and Richard Feynman, along with George Sudarshan and Robert Marshak, deduced the chiral structures of the weak interaction in physics. Geoffrey Chew, along with others, would promote matrix notation for the strong interaction, and the associated bootstrap principle, in 1960. In the 1960s, set-builder notation was developed for describing a set by stating the properties that its members must satisfy. Also in the 1960s, tensors are abstracted within category theory by means of the concept of monoidal category. Later, multi-index notation eliminates conventional notions used in multivariable calculus, partial differential equations, and the theory of distributions, by abstracting the concept of an integer index to an ordered tuple of indices.
https://en.wikipedia.org/wiki?curid=6134187
536,006
1,223,382
As an avid reader with a strong interest in the budding field of astrophysics, Hale was drawn to the writings of William Huggins, Norman Lockyer, and Ernest Rutherford. His fascination with science, however, did not preclude interests more typical of a normal boy, such as fishing, boating, swimming, skating, tennis, and bicycling. He was an enthusiastic reader of the stories of Jules Verne—particularly drawn to the tales of adventure set in the mountains of California. Hale spent summers at his grandmother's house in the old New England village of Madison, Connecticut, where he met his future wife, Evelina Conklin. After graduating from Oakland Public School in Chicago, Hale attended the Allen Academy, where he studied chemistry, physics, and astronomy. He supplemented his practical home experience by attending a course in shop-work at the Chicago Manual Training School. During these years, Hale developed a knowledge of the principles of architecture and city planning with the help of his father's friend, well-known architect Daniel Burnham. Upon Burnham's advice and encouragement, Hale decided at the age of seventeen to continue his education at the Massachusetts Institute of Technology (MIT).
https://en.wikipedia.org/wiki?curid=623538
1,222,722
787,315
In contexts where the application requires extremely tight (narrow) tolerance ranges, the requirement may push slightly past the limit of the ability of the machining and other processes (stamping, rolling, bending, etc.) to stay within the range. In such cases, selective assembly is used to compensate for a lack of "total" interchangeability among the parts. Thus, for a pin that must have a sliding fit in its hole (free but not sloppy), the dimension may be spec'd as 12.00 +0 −0.01 mm for the pin, and 12.00 +.01 −0 for the hole. Pins that came out oversize (say a pin at 12.003mm diameter) are not necessarily scrap, "but" they can only be mated with counterparts that "also" came out oversize (say a hole at 12.013mm). The same is then true for matching "under"size parts with "under"size counterparts. Inherent in this example is that for this product's application, the 12 mm dimension does not require extreme "accuracy", "but" the desired fit between the parts does require good "precision" (see the article on accuracy and precision). This allows the makers to "cheat a little" on "total" interchangeability in order to get more value out of the manufacturing effort by reducing the rejection rate (scrap rate). This is a sound engineering decision as long as the application and context support it. For example, for machines for which there is no intention for any future field service of a parts-replacing nature (but rather only simple replacement of the whole unit), this makes good economic sense. It lowers the unit cost of the products, and it does not impede future service work.
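The pairing logic described above can be sketched in a few lines: measure every pin and hole, sort each population, and mate them rank-for-rank so that oversize pins only ever meet correspondingly oversize holes. The measured dimensions and the target clearance below are invented for illustration.

```python
# Selective assembly sketch: rank-matching pins and holes to preserve a sliding fit.
TARGET_CLEARANCE = 0.005   # desired sliding-fit clearance in mm (assumed)

pins  = sorted([11.998, 12.003, 12.001, 12.007, 11.996])   # measured pins, mm (invented)
holes = sorted([12.004, 12.009, 12.002, 12.013, 11.999])   # measured holes, mm (invented)

# Pair the i-th smallest pin with the i-th smallest hole.
for pin, hole in zip(pins, holes):
    clearance = hole - pin
    ok = "OK" if 0.0 < clearance <= 2 * TARGET_CLEARANCE else "re-sort"
    print(f"pin {pin:.3f} mm + hole {hole:.3f} mm -> "
          f"clearance {clearance:+.3f} mm [{ok}]")
```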
https://en.wikipedia.org/wiki?curid=908518
786,892
357,240
The first fossil, Mauer 1 (a jawbone), was discovered by a worker in Mauer, southeast of Heidelberg, Germany, in 1907. It was formally described the next year by German anthropologist Otto Schoetensack, who made it the type specimen of a new species, "Homo heidelbergensis". He split this off as a new species primarily because of the mandible's archaicness—in particular its enormous size—and it was the then-oldest human jaw in the European fossil record at 640,000 years old. The mandible is well preserved, missing only the left premolars, part of the 1st left molar, the tip of the left coronoid process (at the jaw hinge), and fragments of the mid-section as the jaw was found in 2 pieces and had to be glued together. It may have belonged to a young adult based on slight wearing on the 3rd molar. In 1921, the skull Kabwe 1 was discovered by Swiss miner Tom Zwiglaar in Kabwe, Zambia (at the time Broken Hill, Northern Rhodesia), and was assigned to a new species, ""H. rhodesiensis"", by English palaeontologist Arthur Smith Woodward. These were two of the many putative species of Middle Pleistocene "Homo" which were described throughout the first half of the 20th century. In the 1950s, Ernst Mayr had entered the field of anthropology, and, surveying a "bewildering diversity of names," decided to define only three species of "Homo": ""H. transvaalensis"" (the australopithecines), "H. erectus" (including the Mauer mandible, and various putative African and Asian taxa) and "Homo sapiens" (including anything younger than "H. erectus", such as modern humans and Neanderthals). Mayr defined them as a sequential lineage, with each species evolving into the next (chronospecies). Though later Mayr changed his opinion on the australopithecines (recognising "Australopithecus"), his more conservative view of archaic human diversity became widely adopted in the subsequent decades.
https://en.wikipedia.org/wiki?curid=442638
357,054
1,515,113
Progressive education, a pedagogy promoting learning through real-life experiences, was at its zenith in the United States in the 1920s and 30s, and the Avery Coonley School was a widely known model of these theories in action. Avery Coonley was featured regularly in "Progressive Education" and other professional journals, and in 1938, the editor of "Progressive Education," Gertrude Hartman, published a profile of the Avery Coonley School in her book "Finding Wisdom: Chronicles of a School of Today". She noted, "[v]isitors from all parts of the United States and from foreign countries [came] to see the school—sometimes as many as thirty in a day." Hartman described in great detail the curriculum and daily life of the school, which had by then added a seventh and eighth grade, commenting that the most noticeable thing about it is the spirit of earnestness and joy which pervades it ... There is a fundamental philosophy of life and education underlying the work of the school, which gives clear direction to all its activities. Broad areas are planned, which form the framework of the curriculum. These however are flexible and subject to modification depending upon conditions. The book described the progress of the students from their first year of kindergarten through their ten years of study, capturing in photos, stories, and examples of students' work their encounters with nature, history, art and other subjects through creative play and collaborative projects. She concluded with observations about the importance of the daily life of the school in developing social responsibility: It is the belief of those concerned with the school that out of the kind of education described here will emerge more socially enlightened members of society than the education of our generation has produced ... The method in which those in the school place their faith is the building into the very nature of growing boys and girls, through their growing years, those qualities of mind and spirit from which alone a new attitude towards human relations can evolve. "Finding Wisdom" became a classic in the education field and solidified the Avery Coonley School's national reputation as a model of progressive education.
https://en.wikipedia.org/wiki?curid=8052357
1,514,262
1,515,832
Extremophiles are organisms that thrive in the most volatile environments on the planet, and it is due to their talents that they have begun playing a large role in biotechnology. These organisms live everywhere: environments of high acidity or salinity and areas with limited or no oxygen are places they call home. Scientists show keen interest in organisms with rare or strange talents, and in the past 20-30 years extremophiles have been at the forefront, with thousands of researchers delving into their abilities. The area in which there has been the most talk, research, and development in relation to these organisms is biotechnology. Scientists around the globe are either extracting DNA to modify genomes or directly using extremophiles to complete tasks. Thanks to the discovery of and interest in these organisms, the enzymes used in PCR were found, making the rapid replication of DNA in the lab possible. Since they gained the spotlight, researchers have been amassing databases of genome data in the hope that new traits and abilities can be used to further biotechnical advancements. Everything from the biodegradation of waste to the production of new fuels is on the horizon with the developments made in the field of biotechnology. There are many different kinds of extremophiles, with each kind favoring a different environment. These organisms have become more and more important to biotechnology as their genomes have been uncovered, revealing a plethora of genetic potential. Currently the main uses of extremophiles lie in processes such as PCR, biofuel generation and biomining, but there are many other smaller-scale operations at play. There are also labs that have identified what they wish to do with extremophiles, but haven't been able to fully achieve their goals. While these large-scale goals have not yet been met, the scientific community is working towards their completion in hope of creating new technologies and processes.
https://en.wikipedia.org/wiki?curid=59675902
1,514,980
1,675,848
Low frequency earthquakes do not exhibit the same seismic character as regular earthquakes, namely because they lack distinct, impulsive body waves. P-wave arrivals from LFEs have amplitudes so small that they are often difficult to detect, so when the JMA first distinguished the unique class of earthquake it was primarily through the detection of S-wave arrivals, which were emergent. Because of this, detecting LFEs is nearly impossible using classical techniques. Despite their lack of important seismic identifiers, LFEs can be detected at low signal-to-noise ratio (SNR) thresholds using advanced seismic correlation methods. The most common method for identifying LFEs involves the correlation of the seismic record with a template constructed from confirmed LFE waveforms. Since LFEs are such subtle events and have amplitudes that are frequently drowned out by background noise, templates are built by stacking similar LFE waveforms to improve the SNR. Noise is reduced to such an extent that a relatively clean waveform can be searched for in the seismic record, and when correlation coefficients are deemed high enough an LFE is detected. Determination of the slip orientation responsible for LFEs and earthquakes in general is done by the P-wave first-motion method. LFE P-waves, when successfully detected, have first motions indicative of compressional stress, indicating that thrust-sense slip is responsible for their generation. Extracting high-quality P-wave data out of LFE waveforms can be quite difficult, however, and is furthermore important for accurate hypocentral depth determinations. The detection of high-quality P-wave arrivals is a recent development thanks to the deployment of highly sensitive seismic monitoring networks. The depth occurrence of LFEs is generally determined by P-wave arrivals but has also been determined by mapping LFE epicenters against subducting plate geometries. This method does not discriminate whether the observed LFE was triggered at the plate interface or within the down-going slab itself, so additional geophysical analysis is required to determine where exactly the focus is located. Both methods find that LFEs are indeed triggered at the plate contact.
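A bare-bones version of the template-correlation detector described above is sketched below: a template is slid along a continuous trace, the normalized cross-correlation is computed at every lag, and lags exceeding a threshold are flagged. The traces are synthetic and the threshold rule is purely illustrative; real LFE detectors correlate stacked templates against multi-station, multi-component data with carefully tuned thresholds.

```python
# Matched-filter style detection of a weak signal buried in noise (toy example).
import numpy as np

rng = np.random.default_rng(0)

template = np.sin(2 * np.pi * 2.0 * np.linspace(0, 2, 200))   # toy 2 Hz wavelet
record = rng.normal(0, 1.0, 5000)                              # background noise
record[1200:1400] += template                                  # buried LFE-like signal

def normalized_xcorr(trace, tmpl):
    """Correlation coefficient of the template at every sliding position."""
    n = len(tmpl)
    tmpl = (tmpl - tmpl.mean()) / tmpl.std()
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        win = trace[i:i + n]
        cc[i] = np.dot(tmpl, win - win.mean()) / (n * win.std())
    return cc

cc = normalized_xcorr(record, template)
threshold = 8 * np.median(np.abs(cc))      # simple MAD-style threshold (illustrative)
detections = np.flatnonzero(cc > threshold)
print("Candidate onsets near samples:", detections[:10], "...")
```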
https://en.wikipedia.org/wiki?curid=7674011
1,674,906
420,665
Typical disadvantages include sensitivity to ambient humidity levels and a lack of bass response, due to phase cancellation from a lack of enclosure, but these are not shared by all designs. The bass rolloff 3db point occurs when the narrowest panel dimension equals a quarter wavelength of the radiated frequency for dipole radiators, so for a Quad ESL-63, which is 0.66 meters wide, this occurs at around 129 Hz, comparable to many box speakers (calculated with the speed of sound taken as 343 m/s). There is also the difficult physical challenge of reproducing low frequencies with a vibrating taut film with little excursion amplitude; however, as most diaphragms have a very large surface area compared to cone drivers, only small amplitude excursions are required to put relatively large amounts of energy out. While bass is lacking quantitatively (due to lower excursion than cone drivers) it can be of better quality ('tighter' and without 'booming') than that of electrodynamic (cone) systems. Phase cancellation can be somewhat compensated for by electronic equalization (a so-called shelving circuit that boosts the region inside the audio band where the generated sound pressure drops because of phase cancellation). Nevertheless "maximum" bass levels cannot be augmented because they are ultimately limited by the membrane's maximum permissible excursion before it comes too close to the high-voltage stators, which may produce electrical arcing and burn holes through it. Recent, technically more advanced solutions for perceived lack of bass include the use of large, curved panels (Sound-Lab, MartinLogan CLS), electrostatic subwoofer panels (Audiostatic, Quad), and long-throw electrostatic elements allowing large diaphragm excursions (Audiostatic). Another trick often practiced is to step up the bass (20–80 Hz) with a higher transformation ratio than the mid and treble.
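The quarter-wavelength estimate quoted above for the Quad ESL-63 can be reproduced directly, taking the -3 dB dipole rolloff to occur where the narrowest panel dimension equals a quarter of the radiated wavelength and the speed of sound as 343 m/s, as in the text.

```python
# Dipole bass rolloff estimate from the quarter-wavelength rule used in the text.
SPEED_OF_SOUND = 343.0   # m/s, as assumed above

def dipole_rolloff_hz(narrowest_dimension_m):
    # dimension = wavelength / 4  ->  wavelength = 4 * dimension  ->  f = c / wavelength
    return SPEED_OF_SOUND / (4 * narrowest_dimension_m)

print(f"Quad ESL-63 (0.66 m wide): ~{dipole_rolloff_hz(0.66):.0f} Hz")   # ~130 Hz
```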
https://en.wikipedia.org/wiki?curid=196048
420,460
1,429,340
In his PhD thesis, Changeux suggested that the recognition and transmission of signals by membrane, and in particular by synapses, could use the same mechanisms as the allosteric regulation of enzymes. More than forty years of research would follow, mainly focussed on nicotinic acetylcholine receptors (see below). In 1967, Changeux extended the MWC model to bi-dimensional lattice of receptors (an idea that would also be developed three decades afterward by Dennis Bray). He then applied this idea to the post-synaptic membrane of electric organs (analog to striated muscle). His team demonstrated the existence of several interconvertible states for the nicotinic receptor, resting, open and desensitized, displaying different affinities for the ligands, such as the endogenous agonist acetylcholine. The transitions between the states followed different kinetics, and those kinetics plus the differential affinities sufficed to explain the shape of the post-synaptic potential. A full mechanistic model of the nicotinic receptor from striated muscle (or electric organ) was to be provided much later, when Changeux collaborated with Stuart Edelstein, another specialist of allostery, who worked decades on hemoglobin. In addition to the allosteric modulation of the channel gating by the agonists, many other regulations of the ligand-gated ion channels activity have since been discovered. The modulators bind to a variety of allosteric sites, whether on the agonist binding sites, other binding sites at the subunit interfaces, on the cytoplasmic part of the protein or in the transmembrane domain.
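For readers unfamiliar with the MWC framework mentioned above, the sketch below implements its standard two-state state function: n identical sites, a resting/active (T/R) conformational equilibrium L, and a ligand that binds the R state more tightly (c < 1). The parameter values are arbitrary illustrative choices, not measured nicotinic-receptor constants; only the choice n = 2 (two agonist sites, as on the muscle-type receptor) is drawn from the biology.

```python
# Monod-Wyman-Changeux two-state model: fraction of molecules in the active (R) state.
def mwc_active_fraction(ligand, K_R, L, c, n):
    """alpha = [ligand]/K_R; L = [T0]/[R0]; c = K_R/K_T; n = number of sites."""
    alpha = ligand / K_R
    r = (1 + alpha) ** n
    t = L * (1 + c * alpha) ** n
    return r / (r + t)

# Illustrative dose-response with invented parameters (concentrations in units of K_R).
for conc in [0.01, 0.1, 1.0, 10.0, 100.0]:
    frac = mwc_active_fraction(conc, K_R=1.0, L=1000.0, c=0.01, n=2)
    print(f"[agonist] = {conc:>6}: active fraction = {frac:.3f}")
```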
https://en.wikipedia.org/wiki?curid=1798741
1,428,536
2,061,968
In late September 1912, it was announced that guard Lenoir Chambers had been voted team captain for the upcoming season. In addition, it was announced baskets would be placed that week on outside tennis courts and hopefully practice would begin in the fall. However, the goals had not been placed by mid-October. In early November, "The Tar Heel" published a column where it attributed the team's poor performance the previous year to starting practice after Christmas, stating that successful programs practiced in October. The writer commented that at this point captain Chambers had the players "report" and the team managers had not organized a try-out. The writer further mentioned that several students had been playing on the outdoor courts that were set up and would provide good opportunities to find new members as well as good competition. On Monday, December 2, at 8 PM local time, the try-outs started. Several men showed up, including Tillett, Carrington, and Smith, who returned from the previous year, along with Meb Long, who had played two years earlier. At this point, the schedule was rumored to feature 18 to 20 games, including six at home and a road trip into Virginia for several contests against the likes of Randolph–Macon College, University of Virginia, Washington and Lee University, and more.
https://en.wikipedia.org/wiki?curid=62183290
2,060,778
896,875
It is difficult to judge the degree to which ordinary people change to using metric in their daily lives. In countries that have recently changed, older segments of the population tend to still use the older units. Also, local variations abound as to which quantities are round metric values and which are not. In Canada, for example, ovens and cooking temperatures are usually measured in degrees Fahrenheit and Celsius. Except in the case of imported items, all recipes and packaging include both Celsius and Fahrenheit, so Canadians are typically comfortable with both systems of measurement. This extends to manufacturing, where companies are able to use both imperial and metric units, since the major export market is the US, but metric is required for both domestic use and for nearly all other exports. This may be due to the overwhelming influence of the neighbouring United States; similarly, many Canadians still often use non-metric measurements in day-to-day discussions of height and weight, though most driver's licences and other official government documents record weight and height only in metric (Saskatchewan driver licences, prior to the introduction of the current one-piece licence, indicated height in feet and inches but have switched to centimetres following the new licence format). In Canadian schools, however, metric is the standard, except when it comes up in recipes, where both are included, or in practical lessons involving measuring wood or other materials for manufacturing. In the United Kingdom, degrees Fahrenheit are seldom encountered (except when some people talk about hot summer weather), metric units are often used in conjunction with older measurements, and road signs use miles rather than kilometres. Another example is "hard" and "soft" metric: Canada converted liquid dairy products to litre, 500 mL, and 250 mL sizes, which caused some complaining at the time of the conversion, as a litre of milk is slightly over 35 imperial fluid ounces, while the former imperial quart used in Canada was 40 ounces. This is an example of a "hard" metric conversion. Conversely, butter in Canada is sold primarily in a 454 g package, which converts to one imperial pound. Similarly, one of the standard sizes of liquor bottles is 1.14 litres, which translates to 40 ounces (one Imperial/Canadian quart). These are considered "soft" metric conversions.
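As a quick check of the package-size arithmetic above, the short sketch below converts the quoted sizes using standard factors (1 lb ≈ 453.6 g, 1 imperial fluid ounce ≈ 28.41 mL); the figures and products are the ones named in the text, nothing more.

```python
# Quick check of the package-size figures quoted above, using standard
# conversion factors (1 lb = 453.59 g, 1 imperial fluid ounce = 28.413 mL).

GRAMS_PER_POUND = 453.59237
ML_PER_IMPERIAL_FL_OZ = 28.4130625

def imperial_fl_oz(litres: float) -> float:
    return litres * 1000.0 / ML_PER_IMPERIAL_FL_OZ

if __name__ == "__main__":
    print(f"1 L of milk          = {imperial_fl_oz(1.0):.1f} imp fl oz")   # ~35.2, vs the old 40 oz quart
    print(f"1.14 L liquor bottle = {imperial_fl_oz(1.14):.1f} imp fl oz")  # ~40.1, a 'soft' conversion
    print(f"454 g of butter      = {454 / GRAMS_PER_POUND:.3f} lb")        # ~1.001, a 'soft' conversion
```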
https://en.wikipedia.org/wiki?curid=20353
896,403
479,987
Japanese capital flooded into Taiwan, and large Japanese-owned firms would overshadow Taiwanese firms, paving the way for Taiwan to achieve full-blown capitalism. As a result, Japanese colonial policy was directed at modernizing the island's economy, industry, and public works rather than at exploitation, subjugation and oppression. New and advanced ideas, concepts, inputs and values were introduced to the Taiwanese by the Japanese through Japan's own process of modern industrialization. Taiwan would soon modernize its infrastructure through several public works projects, such as establishing railway and shipping lines, telegraph and telephone systems, shipyards, and public education, to prepare the country for further development. Small and medium-sized Taiwanese manufacturing companies flourished as some 310,000 to 410,000 Taiwanese farms and landlords grew and sold paddy rice, along with some 3,300 local Taiwanese firms serving as refined millers. Despite the previous domination of the import-export trade by native Taiwanese merchants, production, distribution, and the import-export trade came almost entirely under Japanese control as new developments of modern capitalism began to take root in Taiwan. From the early to mid-twentieth century, Taiwan was predominantly an agricultural economy, despite its limited arable land and its location in a subtropical zone prone to plant diseases and pests, which did not make the island conducive to agriculture. To cope with Taiwan's natural disadvantages, Japan began investing in intensive research and development and established rural institutions to create new methods of agricultural cultivation, such as modern irrigation systems and newer, improved breeds of plants and crops that resisted changes in weather patterns, diseases, and pests. Through modern irrigation and agricultural cultivation techniques, Taiwan would become the most advanced rice-producing country in East Asia between the 1930s and the 1950s. In 1940, Taiwan produced more than 50 times its fair share of rice in terms of total proportion and 3.3 times its share relative to the total world population at that time. Taiwan was a formidable agricultural exporting economy, exporting a myriad of crops in large quantities.
https://en.wikipedia.org/wiki?curid=16234875
479,744
165,870
The first documented experiments dealing with EMG started with Francesco Redi's work in 1666. Redi discovered that a highly specialized muscle of the electric ray fish generated electricity. By 1773, Walsh had been able to demonstrate that the eel's muscle tissue could generate a spark of electricity. In 1792, a publication entitled "De Viribus Electricitatis in Motu Musculari Commentarius" appeared, written by Luigi Galvani, in which the author demonstrated that electricity could initiate muscle contraction. Six decades later, in 1849, Emil du Bois-Reymond discovered that it was also possible to record electrical activity during a voluntary muscle contraction. The first actual recording of this activity was made by Marey in 1890, who also introduced the term electromyography. In 1922, Gasser and Erlanger used an oscilloscope to show the electrical signals from muscles. Because of the stochastic nature of the myoelectric signal, only rough information could be obtained from its observation. The capability of detecting electromyographic signals improved steadily from the 1930s through the 1950s, and researchers began to use improved electrodes more widely for the study of muscles. The AANEM was formed in 1953 as one of several currently active medical societies with a special interest in advancing the science and clinical use of the technique. Clinical use of surface EMG (sEMG) for the treatment of more specific disorders began in the 1960s. Hardyck and his researchers were the first (1966) practitioners to use sEMG. In the early 1980s, Cram and Steger introduced a clinical method for scanning a variety of muscles using an EMG sensing device.
https://en.wikipedia.org/wiki?curid=997173
165,785
1,056,209
One recent advance in meshfree methods aims at the development of computational tools for automation in modeling and simulation. This is enabled by the so-called weakened weak (W2) formulation based on the G space theory. The W2 formulation offers possibilities to formulate various (uniformly) "soft" models that work well with triangular meshes. Because a triangular mesh can be generated automatically, re-meshing becomes much easier, which in turn enables automation in modeling and simulation. In addition, W2 models can be made soft enough (in a uniform fashion) to produce upper bound solutions (for force-driven problems). Together with stiff models (such as the fully compatible FEM models), one can conveniently bound the solution from both sides. This allows easy error estimation for generally complicated problems, as long as a triangular mesh can be generated. Typical W2 models are the Smoothed Point Interpolation Methods (or S-PIM). The S-PIM can be node-based (known as NS-PIM or LC-PIM), edge-based (ES-PIM), or cell-based (CS-PIM). The NS-PIM was developed using the so-called SCNI technique. It was then discovered that the NS-PIM is capable of producing upper bound solutions and is free of volumetric locking. The ES-PIM is found superior in accuracy, and the CS-PIM behaves in between the NS-PIM and ES-PIM. Moreover, W2 formulations allow the use of polynomial and radial basis functions in the creation of shape functions (they accommodate discontinuous displacement functions, as long as these are in the G1 space), which opens further room for future developments. The W2 formulation has also led to the combination of meshfree techniques with well-developed FEM techniques, and one can now use a triangular mesh with excellent accuracy and the desired softness. A typical such formulation is the so-called smoothed finite element method (or S-FEM). The S-FEM is the linear version of the S-PIM, but it retains most of the properties of the S-PIM while being much simpler.
https://en.wikipedia.org/wiki?curid=6243282
1,055,661
27,211
Development is the process by which a multicellular organism (plant or animal) goes through a series of changes, starting from a single cell, and taking on various forms that are characteristic of its life cycle. There are four key processes that underlie development: determination, differentiation, morphogenesis, and growth. Determination sets the developmental fate of a cell, which becomes more restrictive during development. Differentiation is the process by which specialized cells form from less specialized cells such as stem cells. Stem cells are undifferentiated or partially differentiated cells that can differentiate into various types of cells and proliferate indefinitely to produce more of the same stem cell. Cellular differentiation dramatically changes a cell's size, shape, membrane potential, metabolic activity, and responsiveness to signals, changes that are largely due to highly controlled modifications in gene expression and epigenetics. With a few exceptions, cellular differentiation almost never involves a change in the DNA sequence itself. Thus, different cells can have very different physical characteristics despite having the same genome. Morphogenesis, or the development of body form, is the result of spatial differences in gene expression. Specifically, the organization of differentiated tissues into specific structures such as arms or wings, which is known as pattern formation, is governed by morphogens, signaling molecules that move from one group of cells to surrounding cells, creating a morphogen gradient as described by the French flag model. Apoptosis, or programmed cell death, also occurs during morphogenesis, such as the death of cells between digits in human embryonic development, which frees up individual fingers and toes. Expression of transcription factor genes can determine organ placement in a plant, and a cascade of transcription factors themselves can establish body segmentation in a fruit fly.
https://en.wikipedia.org/wiki?curid=9127632
27,201
479,177
The area that is now modern-day northern Nigeria was once dominated during prehistoric times by the Nok civilization. When the British arrived in what became Northern Nigeria, two peoples predominated in the region: the Fulani and the indigenous Hausa, whose kingdoms were constantly being attacked by the Fulani. To keep a balance, the British assisted the Hausa and, with their help, confronted the Fulani. When the 19th century opened, the Fulani appeared to be the predominant race in the Sudan. Fulani is the Hausa name for the people who call themselves Fulbe. From 1900 to 1946, the regional British government in Northern Nigeria made exportation a high priority and set up a system of regulation to control disease and maintain quality. In terms of exports, one of the main agricultural industries involved the production and export of hides and skins. This trade had been developed long before the British arrived, for the pre-colonial caravan trade, and had reached an international market. World demand soared during the two world wars. The regional government imposed new, more efficient procedures in flaying, trimming, and drying hides and skins. It imposed new rules regarding minimum standards and compulsory inspection, which had the effect of raising quality and obtaining higher prices. European merchants and industrialists had unlimited access to, and sometimes prevailed on, the colonial state, but the latter exercised autonomous decision-making power on matters affecting the economy and commerce of the colony.
https://en.wikipedia.org/wiki?curid=4305089
478,937
1,358,646
Biological systems exist as a complex interplay of countless cellular components interacting across four dimensions to produce the phenomenon called life. While it is common to reduce living organisms to non-living samples to accommodate traditional static imaging tools, the further the sample deviates from its native conditions, the more likely the delicate processes in question will exhibit perturbations. The onerous task of capturing the true physiological identity of living tissue therefore requires high-resolution visualization across both space and time within the parent organism. The technological advances of live-cell imaging, designed to provide spatiotemporal images of subcellular events in real time, serve an important role in corroborating the biological relevance of physiological changes observed during experimentation. Due to their contiguous relationship with physiological conditions, live-cell assays are considered the standard for probing complex and dynamic cellular events. As dynamic processes such as migration, cell development, and intracellular trafficking increasingly become the focus of biological research, techniques capable of capturing 3-dimensional data in real time for cellular networks ("in situ") and entire organisms ("in vivo") will become indispensable tools in understanding biological systems. The general acceptance of live-cell imaging has led to a rapid expansion in the number of practitioners and established a need for increased spatial and temporal resolution without compromising the health of the cell.
https://en.wikipedia.org/wiki?curid=37587408
1,357,896
1,898,595
Being perfectly aware that bacteria and viruses are among the most variable elements in nature, prone to unlimited mutational events, and taking for granted that it is impossible to manage all the external factors that can influence the development of a pathogenic virus, nobody is talking about defeating a possible new outbreak of plague or of any other infective agent of the past: the aim here is to define a strategy, a "guideline", so as to be better prepared when a new dangerous pathogen emerges. The contribution of the environment to infection remains to be defined, and factors such as human migration, climate change, overcrowding in cities and animal domestication are among the major causes that contribute to the emergence and spread of disease. Of course, these factors are unpredictable, and this is one reason why researchers are trying to recover relevant information from the past that can be useful today and tomorrow. While they continue to develop strategies to defeat emerging threats using diagnostic, molecular and other advanced tools, they are still looking back at how ancient pathogens have evolved and adapted through historical events. The more that is known about the genomic basis of virulence in historical diseases, the more can be understood about the emergence and re-emergence of infectious diseases today and in the future.
https://en.wikipedia.org/wiki?curid=62084336
1,897,511
1,879,544
Transparent exopolymer particles (TEPs) are extracellular acidic polysaccharides produced by phytoplankton and bacteria in saltwater, freshwater, and wastewater. They are incredibly abundant and play a significant role in biogeochemical cycling of carbon and other elements in water. Through this, they also play a role in the structure of food webs and trophic levels. TEP production and overall concentration has been observed to be higher in the Pacific Ocean compared to the Atlantic, and is more related to solar radiation in the Pacific. TEP concentration has been found to decrease with depth, having the highest concentration at the surface, especially associated with the SML, either by upward flux or sea surface production. Chlorophyll a has been found to be the best indicator of TEP concentration, rather than heterotrophic grazing abundance, further emphasizing the role of phytoplankton in TEP production. TEP concentration is especially enhanced by haptophyte phytoplanktonic dominance, solar radiation exposure, and close proximity to sea ice. TEPs also do not seem to show any diel cycles. High concentrations of TEPs in the surface ocean slow the sinking of solid particle aggregations, prolonging pelagic residence time. TEPs may provide an upward flux of materials such as bacteria, phytoplankton, carbon, and trace nutrients. High TEP concentrations were found under arctic sea ice, probably released by sympagic algae. TEP is efficiently recycled in the ocean, as heterotrophic grazers such as zooplankton and protists consume TEP and produce new TEP precursors to be reused, further emphasizing the importance of TEPs in marine carbon cycling. TEP abundance tends to be higher in coastal, shallow waters compared to deeper, oceanic waters. Diatom-dominated phytoplankton colonies produce larger, and stickier, TEPs, which may indicate that TEP size distribution and composition may be a useful tool in determining aggregate planktonic community structure.
https://en.wikipedia.org/wiki?curid=65452829
1,878,463
1,565,555
"Aspergillus clavatus" undergoes rapid growth, resulting in the formation of a velvety and fairly dense felt that is observed to be bluish-grey green in colour. The emerging conidial heads are large and clavate when very young, quickly splitting into conspicuous and compact divergent columns. The conidia bearing conidiophores are generally coarse, smooth walled, uncoloured, hyaline and can grow to be very long. Elongated club-shaped vesicles clavate, and bear phialides (singular: phialide) over their entire-surface, contributing to its short and densely packed structure. The sterigmata are usually found to be uniseriate, numerous and crowded. Conidia formed in them are elliptical, smooth and comparatively thick-walled. "A. clavatus" usually express conidiophores 1.5–3.00 mm in length, which arises from specialized and widened hyphal cells that eventually become the branching foot cells. The conidia on "A. clavatus" has been measured up to 3.0 – 4.5 X 2.5 – 3.5 μm. Cleistothecia are produced in crosses after approximately 4–10 weeks of incubation on suitable growth media at 25 °C. Cleistothecia are yellowish-brown (fawn) to dark brown in colour and range in diameter from 315-700 µm in diameter and have a relatively hard outer wall (peridium). At maturity the cleistothecia contain asci that themselves contain ascospores, which are clear, lenticular (with ridges evident) and between 6.0-7.0 µm in diameter.
https://en.wikipedia.org/wiki?curid=51666001
1,564,668
1,872,766
The Model 700 was available in several different versions. In its most basic incarnation, it had two line-level, balanced inputs. One popular upgrade was the addition of one or two microphone preamps, which were installed on removable cards into slots in the machine. These allowed stereo recording directly into the Model 700, bypassing a mixing console. Since the recorder had a significantly lower noise floor than most mixers of the same era, this method made the best use of the system's available dynamic range. Another, much more rare accessory was the "Model 700D Disc Mastering Delay". This was a device used for mastering vinyl records, and attached to a proprietary 25-pin digital output on the back of the Model 700 recorder. Because of the nature of vinyl records (which rotate at a constant angular velocity but at a changing linear velocity with respect to the needle as it moves inwards), it is necessary to send the audio signal to the computer controlling the movement (pitch) of the cutting lathe. In this way, during quiet passages the pitch is decreased thus enabling more grooves per inch. The computer needs an audio signal that is exactly one rotation ahead of the actual audio fed to the cutting lathe, so that it can move the cutting head rapidly forward before a loud passage is grooved. The disc mastering delay achieved this just like the analog mastering tape decks that used an extra playback head for this purpose.
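The preview-delay idea described above comes down to simple timing arithmetic: the lathe computer needs the audio exactly one revolution early. The sketch below works through that arithmetic; the 33⅓ rpm speed and 44.1 kHz sample rate are illustrative assumptions for the example, not documented parameters of the Model 700D.

```python
# Illustrative arithmetic for a one-revolution preview delay as used in disc
# mastering. The 33 1/3 rpm speed and 44.1 kHz sample rate are assumptions for
# the example, not documented parameters of the Model 700D.

def one_revolution_delay_seconds(rpm: float) -> float:
    return 60.0 / rpm

def delay_in_samples(rpm: float, sample_rate_hz: float) -> int:
    return round(one_revolution_delay_seconds(rpm) * sample_rate_hz)

if __name__ == "__main__":
    rpm = 100.0 / 3.0            # 33 1/3 rpm LP
    fs = 44100.0                 # Hz, assumed sample rate
    print(f"One revolution = {one_revolution_delay_seconds(rpm):.3f} s")   # 1.800 s
    print(f"Delay buffer   = {delay_in_samples(rpm, fs)} samples")         # 79380 samples
```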
https://en.wikipedia.org/wiki?curid=4786858
1,871,689
411,645
Vygotsky was a pioneering psychologist and his major works span six separate volumes, written over roughly ten years, from "Psychology of Art" (1925) to "Thought and Language" [or "Thinking and Speech"] (1934). Vygotsky's interests in the fields of developmental psychology, child development, and education were extremely diverse. His philosophical framework includes interpretations of the cognitive role of mediation tools, as well as the re-interpretation of well-known concepts in psychology such as internalization of knowledge. Vygotsky introduced the notion of zone of proximal development, a metaphor capable of describing the potential of human cognitive development. His work covered topics such as the origin and the psychology of art, development of higher mental functions, philosophy of science and the methodology of psychological research, the relation between learning and human development, concept formation, interrelation between language and thought development, play as a psychological phenomenon, learning disabilities, and abnormal human development (aka "defectology"). His scientific thinking underwent several major transformations throughout his career, but generally Vygotsky's legacy may be divided into two fairly distinct periods, and the transitional phase between the two during which Vygotsky experienced the crisis in his theory and personal life. These are the mechanistic "instrumental" period of the 1920s, integrative "holistic" period of the 1930s, and the transitional years of, roughly, 1929–1931. Each of these periods is characterized by its distinct themes and theoretical innovations.
https://en.wikipedia.org/wiki?curid=95176
411,443
120,781
With expanded techniques and growing awareness of the field at the close of the war, operational research was no longer limited to operational matters, but was extended to encompass equipment procurement, training, logistics and infrastructure. Operations research also grew in many areas other than the military once scientists learned to apply its principles to the civilian sector. With the development of the simplex algorithm for linear programming in 1947 and the development of computers over the next three decades, operations research can now solve problems with hundreds of thousands of variables and constraints. Moreover, the large volumes of data required for such problems can be stored and manipulated very efficiently. Much of operations research (known in modern terms as 'analytics') relies upon stochastic variables and therefore upon access to truly random numbers. Fortunately, the cybernetics field also required the same level of randomness. The development of increasingly better random number generators has been a boon to both disciplines. Modern applications of operations research include city planning, football strategies, emergency planning, optimizing all facets of industry and economy, and, in all likelihood, the planning of terrorist attacks and certainly the planning of counter-terrorist operations. More recently, the research approach of operations research, which dates back to the 1950s, has been criticized for being a collection of mathematical models that lacks an empirical basis of data collection for applications. How to collect data is not presented in the textbooks. Because of the lack of data, there are also no computer applications in the textbooks.
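To illustrate the kind of problem the simplex algorithm was developed for, here is a toy linear program solved with SciPy's linprog; the product-mix numbers are invented purely for illustration and are not drawn from any study mentioned above.

```python
# A toy linear program of the kind the simplex algorithm addresses, solved
# here with SciPy. The "product mix" numbers are invented for illustration.
from scipy.optimize import linprog

# Maximize 3x + 5y (profit), subject to resource limits:
#   2x + 1y <= 100   (machine hours)
#   1x + 3y <= 90    (labour hours)
#   x, y >= 0
# linprog minimizes, so the objective is negated.
result = linprog(
    c=[-3, -5],
    A_ub=[[2, 1], [1, 3]],
    b_ub=[100, 90],
    bounds=[(0, None), (0, None)],
    method="highs",
)

print("x, y =", result.x)          # optimal mix, here (42, 16)
print("max profit =", -result.fun)  # 206
```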
https://en.wikipedia.org/wiki?curid=43476
120,732
338,179
"H. habilis" is associated with the Early Stone Age Oldowan stone tool industry. Individuals likely used these tools primarily to butcher and skin animals and crush bones, but also sometimes to saw and scrape wood and cut soft plants. Knappers - individuals shaping stones, appear to have carefully selected lithic cores and knew that certain rocks would break in a specific way when struck hard enough and on the right spot, and they produced several different types, including choppers, polyhedrons, and discoids. Nonetheless, specific shapes were likely not thought of in advance, and probably stem from a lack of standardisation in producing such tools as well as the types of raw materials at the knappers' disposal. For example, spheroids are common at Olduvai which features an abundance of large and soft quartz and quartzite pieces, whereas Koobi Fora lacks spheroids and provides predominantly hard basalt lava rocks. Unlike the later Acheulean culture invented by "H. ergaster" / "H. erectus", Oldowan technology does not require planning and foresight to manufacture, and thus does not indicate high cognition in Oldowan knappers, though it does require a degree of coordination and some knowledge of mechanics. Oldowan tools infrequently exhibit retouching and were probably discarded immediately after use most of the time.
https://en.wikipedia.org/wiki?curid=14348
337,999
1,695,910
During the development cycle of the 3.8 release, changes were made to the malloc() memory management functions. In traditional Unix operating systems, malloc() allocates more memory by extending the Unix data segment, a practice that has made it difficult to implement strong protection against security problems. The malloc() implementation now in OpenBSD makes use of the mmap() system call, which was modified so that it returns random memory addresses and ensures that different areas are not mapped next to each other. In addition, allocation of small blocks in shared areas is now randomized, and the free() function was changed to return memory to the kernel immediately rather than leaving it mapped into the process. A number of additional, optional checks were also added to aid in development. These features make program bugs easier to detect and harder to exploit: instead of memory being corrupted or an invalid access being ignored, they often result in a segmentation fault and abortion of the process. This has brought to light several issues with software running on OpenBSD 3.8, particularly with programs reading beyond the start or end of a buffer, a type of bug that would previously not be detected directly but can now cause an error. These abilities took more than three years to implement without considerable performance loss.
https://en.wikipedia.org/wiki?curid=3648704
1,694,957
1,184,730
Other types of nanosensors, including quantum dots and gold nanoparticles, are currently being developed to detect pollutants and toxins in the environment. These take advantage of the localized surface plasmon resonance (LSPR) that arises at the nanoscale, which results in wavelength-specific absorption. This LSPR spectrum is particularly sensitive, and its dependence on nanoparticle size and environment can be used in various ways to design optical sensors. To take advantage of the LSPR spectrum shift that occurs when molecules bind to the nanoparticle, their surfaces can be functionalized to dictate which molecules will bind and trigger a response. For environmental applications, quantum dot surfaces can be modified with antibodies that bind specifically to microorganisms or other pollutants. Spectroscopy can then be used to observe and quantify this spectrum shift, enabling precise detection, potentially on the order of molecules. Similarly, fluorescent semiconducting nanosensors may take advantage of fluorescence resonance energy transfer (FRET) to achieve optical detection. Quantum dots can be used as donors, and will transfer electronic excitation energy when positioned near acceptor molecules, thus losing their fluorescence. These quantum dots can be functionalized to determine which molecules will bind, upon which fluorescence will be restored. Gold nanoparticle-based optical sensors can be used to detect heavy metals very precisely; for example, mercury at levels as low as 0.49 nanomolar. This sensing modality takes advantage of FRET, in which the presence of metals inhibits the interaction between quantum dots and gold nanoparticles, and quenches the FRET response. Another potential implementation takes advantage of the size dependence of the LSPR spectrum to achieve ion sensing. In one study, Liu et al. functionalized gold nanoparticles with a Pb-sensitive enzyme to produce a lead sensor. Generally, the gold nanoparticles would aggregate as they approached each other, and the change in size would result in a color change. Interactions between the enzyme and Pb ions would inhibit this aggregation, and thus the presence of ions could be detected.
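The FRET-based schemes described above rely on the strong distance dependence of Förster transfer. The sketch below evaluates the standard Förster efficiency relation; the Förster radius used is an arbitrary illustrative value, not a measured one for any particular quantum-dot/gold-nanoparticle pair.

```python
# The standard Foerster relation behind FRET-based sensing: transfer efficiency
# falls off with the sixth power of donor-acceptor separation. The Foerster
# radius below is an arbitrary illustrative value, not one for a specific
# quantum-dot / gold-nanoparticle pair.

def fret_efficiency(distance_nm: float, foerster_radius_nm: float) -> float:
    return 1.0 / (1.0 + (distance_nm / foerster_radius_nm) ** 6)

if __name__ == "__main__":
    R0 = 5.0  # nm, assumed Foerster radius for the example
    for r in (2.5, 5.0, 7.5, 10.0):
        print(f"r = {r:4.1f} nm -> E = {fret_efficiency(r, R0):.3f}")
```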
https://en.wikipedia.org/wiki?curid=603278
1,184,102
965,946
The first important work on element 103 was done at Berkeley by the nuclear-physics team of Albert Ghiorso, Torbjørn Sikkeland, Almon Larsh, Robert M. Latimer, and their co-workers on February 14, 1961. The first atoms of lawrencium were reportedly made by bombarding a three-milligram target consisting of three isotopes of californium with boron-10 and boron-11 nuclei from the Heavy Ion Linear Accelerator (HILAC). The Berkeley team reported that the isotope of element 103 with mass number 257 was detected in this manner, and that it decayed by emitting an 8.6 MeV alpha particle with a half-life of about 8 seconds. This identification was later corrected to mass number 258, as later work proved that lawrencium-257 did not have the properties detected, but lawrencium-258 did. This was considered at the time to be convincing proof of synthesis of element 103: while the mass assignment was less certain and proved to be mistaken, it did not affect the arguments in favor of element 103 having been synthesized. Scientists at the Joint Institute for Nuclear Research in Dubna (then in the Soviet Union) raised several criticisms: all but one were answered adequately. The exception was that californium-252 was the most common isotope in the target, and in the reactions with boron-10, lawrencium-258 could only have been produced by emitting four neutrons, and emitting three neutrons was expected to be much less likely than emitting four or five. This would lead to a narrow yield curve, not the broad one reported by the Berkeley team. A possible explanation was that there was a low number of events attributed to element 103. This was an important intermediate step to the unquestioned discovery of element 103, although the evidence was not completely convincing. The Berkeley team proposed the name "lawrencium" with symbol "Lw", after Ernest Lawrence, inventor of the cyclotron. The IUPAC Commission on Nomenclature of Inorganic Chemistry accepted the name, but changed the symbol to "Lr". This acceptance of the discovery was later characterized as being hasty by the Dubna team.
https://en.wikipedia.org/wiki?curid=17746
965,437
1,742,830
Starting with a liquid crystal in the nematic phase, the desired helical pitch (the distance along the helical axis for one complete rotation of the nematic plane subunits) can be achieved by doping the liquid crystal with a chiral molecule. For light circularly polarized with the same handedness as the helix, this regular modulation of the refractive index yields selective reflection of the wavelength given by the helical pitch, allowing the liquid-crystal laser to serve as its own resonator cavity. Photonic crystals are amenable to band theory methods, with the periodic dielectric structure playing the role of the periodic electric potential and a photonic band gap (reflection notch) corresponding to forbidden frequencies. The lower photon group velocity and higher density of states near the photonic bandgap suppress spontaneous emission and enhance stimulated emission, providing favorable conditions for lasing. If the electronic band edge falls in the photonic bandgap, electron-hole recombination is strictly suppressed. This allows for devices with high lasing efficiency, low lasing threshold, and stable frequency, in which the liquid-crystal laser acts as its own waveguide. A "colossal" nonlinear change in refractive index is achievable in doped nematic-phase liquid crystals; that is, the refractive index can change with illumination intensity at a rate of about 10 cm² per watt. Most systems use a semiconductor pumping laser to achieve population inversion, though flash-lamp and electrical pumping systems are possible.
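The selective reflection mentioned above is commonly estimated from the helical pitch and the ordinary and extraordinary refractive indices: the band is centred at the average index times the pitch, with a width set by the birefringence. The sketch below evaluates that standard relation with illustrative values; it is not tied to any specific device in the text.

```python
# Standard relation for the selective-reflection band of a chiral nematic
# (cholesteric) liquid crystal: centre wavelength = average refractive index
# times helical pitch; bandwidth = birefringence times pitch.
# The pitch and index values below are illustrative assumptions only.

def reflection_band_nm(pitch_nm: float, n_ordinary: float, n_extraordinary: float):
    n_avg = (n_ordinary + n_extraordinary) / 2.0
    centre = n_avg * pitch_nm
    width = (n_extraordinary - n_ordinary) * pitch_nm
    return centre, width

if __name__ == "__main__":
    centre, width = reflection_band_nm(pitch_nm=350.0, n_ordinary=1.5, n_extraordinary=1.7)
    print(f"Reflection band centred at {centre:.0f} nm, width {width:.0f} nm")  # ~560 nm, ~70 nm
```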
https://en.wikipedia.org/wiki?curid=31580571
1,741,846
1,798,435
The epipterygoid, a small, triangular structure, separates the pterygoid from the maxilla. Instead of being vertical or even slightly overturned as seen in most ankylosaurids, the main body of the pterygoids is nearly horizontal, which, as a result, makes the interpterygoid vacuity visible in palatal view. The occipital condyle lacks a neck and is heart-shaped. The occiput is low and rectangular in shape. The paroccipital processes fall well short of the medial edge of the squamosal horn. The basisphenoid and basioccipital are fused together, with the sutural area being expanded as a ridge. This ridge marks the insertion for the rectus capitis and longus capitis muscles. Both the left and right jugal horns thrust more towards the sides than towards the underside. Towards the sides of the tooth row is a broad maxillary shelf that extends beneath the middle of the orbit. A long, narrow osteoderm is partially fused along each side of the mandible but does not extend dorsally onto the lateral surface. The tooth row is positioned along the margins of the dentary. The ventral half of the mandible has a rough texture on the lateral surface, while the dorsal half of the mandible has a smooth texture. The position of the cheeks on the lower jaws is marked by the boundary between the smooth and the textured surfaces, as during occlusion this boundary lies opposite the lateral edge of the maxillary shelf. The coronoid process is small and low, and is present towards the front of the base of the process. The predentary is subtriangular in cross-section and bears numerous nutrient foramina on the dorsal surface to serve the rhamphotheca. There are 15 teeth and alveoli preserved in the left dentary and 16 in the right dentary.
https://en.wikipedia.org/wiki?curid=21100859
1,797,426
1,695,307
Paper-based microfluidic devices have several applications outside of the medical field. For example, paper-based biosensors have been used extensively in environmental monitoring. Two recent devices were developed for the detection of "Salmonella" and "E. coli""." The latter device was specifically used to detect "E. coli" in seven field water samples from Tucson, Arizona. Antibody-conjugated polystyrene particles were loaded in the middle of the microfluidic channel, after the sample inlet. Immunoagglutination occurs when samples containing "Salmonella" or "E. coli", respectively, come into contact with these particles. The amount of immunoagglutination can be correlated with increased Mie scattering of light, which was detected with a specialized smartphone application under ambient light. Paper-based microfluidics has also been used to detect pesticides in food products, such as apple juice and milk. A recent design used piezoelectric inkjet printing to imprint paper with the enzyme acetylcholinesterase (AChE) and the substrate indophenyl acetate (IPA), and this paper-based microfluidic device was used to detect organophosphate pesticides (AChE inhibitors) via a decrease in blue-purple color. This device is distinguished by its use of bioactive paper instead of compartments with pre-stored reagents, and it was demonstrated to have good long-term stability, making it ideal for field use. A more recent paper-based microfluidic design utilized a sensor, consisting of fluorescently labeled single-stranded DNA (ssDNA) coupled with graphene oxide, on its surface to simultaneously detect heavy metals and antibiotics in food products. Heavy metals increased fluorescence intensity, whereas antibiotics decreased fluorescence intensity. Recently, paper-based devices have become very attractive for making inexpensive, disposable and convenient analytical devices for the determination of reactive phosphate in water. These devices utilize the molybdenum blue protocol for phosphate detection.
https://en.wikipedia.org/wiki?curid=56210030
1,694,355
1,154,912
The most important group of FDCA conversions is undoubtedly polymerization. The potential of furan-based building blocks for polymer applications has been extensively reviewed by Gandini. A notable example is polyethylene 2,5-furandicarboxylate, but other polyesters and various polyamides and polyurethanes have also been described in the literature. Among others such as DuPont and Corbion, the company Avantium claims to have developed a cost-effective route to produce FDCA and the derived polyesters. FDCA has also been applied in pharmacology. It was demonstrated that its diethyl ester had a strong anaesthetic action similar to that of cocaine. Dicalcium 2,5-furandicarboxylate was shown to inhibit the growth of "Bacillus megaterium". Screening studies on FDCA-derived anilides showed their important anti-bacterial action. The diacid itself is a strong complexing agent, chelating ions such as calcium, copper and lead; it is utilized in medicine to remove kidney stones. HMF is metabolized via FDCA in mammals, including humans. A very dilute solution of FDCA in tetrahydrofuran is utilized for preparing artificial veins for transplantation. As was mentioned earlier, FDCA is a chemically stable compound. This property has been put to good use in industry: like most polycarboxylic acids, FDCA can be an ingredient of fire-fighting foams. Such foams help to extinguish, in a short time, fires caused by polar and non-polar solvents.
https://en.wikipedia.org/wiki?curid=20480829
1,154,302
161,271
Competition is developing among exchanges for the fastest processing times for completing trades. For example, in June 2007, the London Stock Exchange launched a new system called TradElect that promises an average 10 millisecond turnaround time from placing an order to final confirmation and can process 3,000 orders per second. Since then, competitive exchanges have continued to reduce latency with turnaround times of 3 milliseconds available. This is of great importance to high-frequency traders, because they have to attempt to pinpoint the consistent and probable performance ranges of given financial instruments. These professionals are often dealing in versions of stock index funds like the E-mini S&Ps, because they seek consistency and risk-mitigation along with top performance. They must filter market data to work into their software programming so that there is the lowest latency and highest liquidity at the time for placing stop-losses and/or taking profits. With high volatility in these markets, this becomes a complex and potentially nerve-wracking endeavor, where a small mistake can lead to a large loss. Absolute frequency data play into the development of the trader's pre-programmed instructions.
https://en.wikipedia.org/wiki?curid=2484768
161,186
334,166
In May 1997, Russia and Israel agreed to jointly fulfill an order from China to develop and deliver an early warning system. China reportedly ordered one Phalcon for $250 million, which entailed retrofitting a Russian-made Ilyushin-76 cargo plane [also incorrectly reported as a Beriev A-50 Mainstay] with advanced Elta electronic, computer, radar and communications systems. Beijing was expected to acquire several Phalcon AEW systems, and reportedly could buy at least three more [and possibly up to eight] of these systems, the prototype of which was planned for testing beginning in 2000. In July 2000, the US pressured Israel to back out of the $1 billion agreement to sell China four Phalcon phased-array radar systems. Following the cancelled A-50I/Phalcon deal, China turned to indigenous solutions. The Phalcon radar and other electronic systems were removed from the unfinished Il-76, and the airframe was handed over to China via Russia in 2002. The Chinese AWACS has a unique phased array radar (PAR) carried in a round radome. Unlike US AWACS aircraft, which rotate their rotodomes to give 360-degree coverage, the radar antenna of the Chinese AWACS does not rotate. Instead, three PAR antenna modules are placed in a triangular configuration inside the round radome to provide 360-degree coverage. Installation of the equipment on the Il-76 aircraft began in late 2002 at Xian Aircraft Industry Co. The first KJ-2000 made its maiden flight in November 2003. All four aircraft were to be equipped with this system, with the last to be introduced into service with the Chinese Air Force by the end of 2007. China is also developing a carrier-based AEW&C, the Xian KJ-600, via the Y-7-derived Xian JZY-01 testbed.
https://en.wikipedia.org/wiki?curid=236466
333,988
1,184,286
The remainder of technological innovation applicable to the cylindrical grinder is almost identical and entangled in a sense, to the rest of machine tools. The innovation of the last 70 years can be characterized by three waves of change. The first wave was the creation of numerical control by John T. Parsons in the 1940s. The U.S. Air Force, looking for a faster, cheaper, and more efficient means of part and tool production for airplanes, played a large role in developing NC both politically and financially. The first implementation of NC in machine tools occurred in the 1950s and continued through the 1960s. The second wave of innovation, occurring during the 1970s and 1980s, is marked by the massive demand for microcomputers to be used to direct NC. The joining of computers marked the birth of Computer Numerical Control which once again revolutionized the ability of the cylindrical grinder. Now the machine was able to receive instruction from a computer which would give it precise directions on every imaginable dimension and measurement needed to produce the desired product. This was a completely different work environment in comparison to mid-century production where a worker had to direct the machine at every point on how to manipulate the work. The third wave of change came in the 1990s with the advent of the Personal Computer. Integrating CNC and the PC into one dynamic system allowed for even further control of the manufacturing process that required little to no human supervision.
https://en.wikipedia.org/wiki?curid=10333611
1,183,659
1,292,940
Returning in time to 1920, Slater had gone to Harvard to work for a Ph.D. with Percy Bridgman, who studied the behaviour of substances under very high pressures. Slater measured the compressibility of common salt and ten other alkali halides—compounds of lithium, sodium, potassium and rubidium, with fluorine, chlorine and bromine. He described the results as "exactly in accord with Bohr's recent views of the relation between electron structure and the periodic table". This brought Slater's observation concerning the mechanical properties of ionic crystals into line with the theory that Bohr had based on the spectroscopy of gaseous elements. He wrote the alkali halide paper in 1923, having "by the summer of 1922" been "thoroughly indoctrinated ... with quantum theory", in part by the courses of Edwin Kemble, following a fascination with Bohr's work during his undergraduate days. In 1924, Slater went to Europe on a Harvard Sheldon Fellowship. After a brief stay at the University of Cambridge, he went on to the University of Copenhagen, where "he explained to Bohr and Kramers his idea (that was) a sort of forerunner of the duality principle, (hence) the celebrated paper" on the work that others dubbed the Bohr-Kramers-Slater (BKS) theory. "Slater suddenly became an internationally known name." Interest in this "old-quantum-theory" paper subsided with the arrival of full quantum mechanics, but Philip M. Morse's biography states that "in recent years it has been recognized that the correct ideas in the article are those of Slater." Slater discusses his early life through the trip to Europe in a transcribed interview.
https://en.wikipedia.org/wiki?curid=405835
1,292,229
338,173
Typically, "H. ergaster" / "H. erectus" is considered to have been the first human to have lived in a monogamous society, and all preceding hominins were polygynous. However, it is highly difficult to speculate with any confidence the group dynamics of early hominins. The degree of sexual dimorphism and the size disparity between males and females is often used to correlate between polygyny with high disparity and monogamy with low disparity based on general trends (though not without exceptions) seen in modern primates. Rates of sexual dimorphism are difficult to determine as early hominin anatomy is poorly known, and are largely based on few specimens like. In some cases, sex is arbitrarily determined in large part based on perceived size and apparent robustness in the absence of more reliable elements in sex identification (namely the pelvis). Mating systems are also based on dental anatomy, but early hominins possess a mosaic anatomy of different traits not seen together in modern primates; the enlarged cheek teeth would suggest marked size-related dimorphism and thus intense male–male conflict over mates and a polygynous society, but the small canines should indicate the opposite. Other selective pressures, including diet, can also dramatically impact dental anatomy. The spatial distribution of tools and processed animal bones at the FLK Zinj and PTK sites in Olduvai Gorge indicate the inhabitants used this area as a communal butchering and eating grounds, as opposed to the nuclear family system of modern hunter gatherers where the group is subdivided into smaller units each with their own butchering and eating grounds.
https://en.wikipedia.org/wiki?curid=14348
337,993
586,620
As the Chief of the Bureau of Naval Ordnance stated in his 1897 "Small Arms" report to the Secretary of the Navy, "In making what may appear a radical departure in its selection of a caliber smaller than as yet adopted elsewhere...the Bureau is convinced it is looking to the future … it is surely wise to attempt to advance with one stride as far as existing conditions allow toward the goal to which the world is moving with slow steps." The report went on to list the advantages of the smaller 6 mm caliber: greatly increased velocity, the flatness of bullet trajectory, reduced recoil, a 100% increase in penetration compared to the former .45-70 Government cartridge, and the ability to carry twice the number of cartridges per individual sailor or Marine. The report also acknowledged that the 6 mm round had two principal disadvantages: first, as a small-caliber round, the 6 mm bullet would not sufficiently wound an enemy to put him out of action, and second, the "shock" or stopping power of the smaller bullet would not "stop the onset of excited men at short range". In answer to these objections, the report gave three responses: first, "the battle of the future will be fought at long range, and men will not live to come to close quarters with an enemy who stands his ground"; second, "99 percent of wounded enemy soldiers were unlikely to investigate the severity of their wound, but simply retire to the rear", and last, "the explosive effect of a small-caliber, high-velocity bullet against the human body—the bullet tumbles or fragments to produce devastating wounds against bone or fluid-filled organs—would be more incapacitating at all ranges than wounds made by a slow-moving bullet of large caliber".
https://en.wikipedia.org/wiki?curid=18509293
586,320
1,250,623
MICP treatment may be limited at depth due to limitations on bacterial growth and movement in the subsoil. MICP may also be limited to soils containing limited amounts of fines, because pore spaces are reduced in fine soils. Based on the size of microorganisms, the applicability of biocementation is limited to GW, GP, SW, SP, ML, and organic soils. Bacteria are not expected to enter through pore throats smaller than approximately 0.4 µm. In general, microbial abundance has been found to increase with increasing particle size. On the other hand, fine particles may provide more favorable nucleation sites for calcium carbonate precipitation, because the mineralogy of the grains can directly influence the thermodynamics of the precipitation reaction in the system. Habitable pores and traversable pore throats are found in coarse sediments and in some clayey sediments at shallow depth. In clayey soil, bacteria are capable of reorienting and moving clay particles under low confining stress (at shallow depths). However, the inability to make these rearrangements under high confining stresses limits bacterial activity at larger depths. Furthermore, sediment-cell interaction may cause puncture or tensile failure of the cell membrane. Similarly, at larger depths, silt and sand particles may crush and cause a reduction in pore spaces, reducing biological activity. Bacterial activity is also impacted by challenges such as predation, competition, pH, temperature, and nutrient availability. These factors can contribute to a decline in the bacterial population. Many of these limitations can be overcome through the use of MICP via bio-stimulation, a process through which indigenous ureolytic soil bacteria are enriched in situ. This method is not always possible, as not all indigenous soils have enough ureolytic bacteria to achieve successful MICP.
https://en.wikipedia.org/wiki?curid=39190731
1,249,947
842,775
As a preliminary test indicating infection, plucked hairs and skin and nail scrapings can be directly viewed under a microscope for detection of fungal elements. "T. rubrum" cannot be distinguished from other dermatophytes in this direct examination. It can be distinguished "in vitro" from other dermatophytes by means of characteristic micromorphology in culture, usually consisting of small, tear-drop-shaped microconidia, as well as its usual blood-red colony reverse pigmentation on most growth media. In addition, the Bromocresol purple (BCP) milk solid glucose agar test can be used to distinguish it. Different "Trichophyton" species release different amounts of ammonium ion, altering the pH of this medium. In this test, medium supporting "T. rubrum" remains sky blue, indicating neutral pH, until 7 to 10 days after inoculation. In primary outgrowth on Sabouraud dextrose agar with cycloheximide and antibacterials, contaminating organisms may cause confusion, as "T. rubrum" colonies deprived of glucose by competing contaminants may grow without forming the species' distinctive red pigment. Both antibiotic-resistant bacteria and saprotrophic fungi may outcompete "T. rubrum" for glucose if they contaminate the sample. Red pigment production can be restored in such contaminated isolates using casamino acids erythritol albumin agar (CEA). "T. rubrum" cultures can be isolated on both cycloheximide-containing media and cycloheximide-free media. The latter are conventionally used for the detection of nail infections caused by non-dermatophytes such as "Neoscytalidium dimidiatum". A skin test is ineffective in diagnosing active infection and often yields false negative results.
https://en.wikipedia.org/wiki?curid=9784397
842,325
1,071,897
Another advantage of an HDR reservoir is that its confined nature makes it highly suitable for load-following operations, whereby the rate of energy production is varied to meet the varying demand for electric power—a process that can greatly increase the economic competitiveness of the technology. This concept was evaluated near the end of the Phase II testing period, when energy production was increased by 60% for 4 hours each day, by a programmed vent-down of the high-pressure reservoir regions surrounding the production borehole. Within two days it became possible to computerize the process, such that production was automatically increased and decreased according to the desired schedule for the rest of the test period. The transitions between the two production levels took less than 5 minutes, and at each level steady-state production was consistently maintained. Such load-following operations could not be implemented in a natural hydrothermal system or even in an EGS system because of the unconfined volume and boundary conditions. Load following almost never improves economics for geothermal development because the fuel cost is effectively paid up front, so delaying use just hurts the economics. Normal geothermal systems have also (by necessity) been applied to follow loads but this kind of generation increases maintenance costs and generally reduces revenue (in spite of the higher prices for some of the load).
https://en.wikipedia.org/wiki?curid=12951705
1,071,343
758,142
While attending (and later leading) the Freewill Baptist Parsonsfield Seminary, Bates founder Oren Burbank Cheney worked for racial and gender equality, religious freedom, and temperance. In 1836, Cheney enrolled in Dartmouth College (after briefly attending Brown), due to Dartmouth's significant support of the abolitionist cause against slavery. After graduating, Cheney was ordained a Baptist minister and began to establish himself as an educational and religious scholar. Parsonsfield mysteriously burned down in 1854, allegedly due to arson by opponents of abolition. The event caused Cheney to advocate for the building of a new seminary in a more central part of Maine. With Cheney's influence in the state legislature, the Maine State Seminary was chartered in 1855 and implemented a liberal arts and theological curriculum, making it the first coeducational college in New England. Soon after its establishment, several donors stepped forward to finance portions of the school, such as Seth Hathorn, who donated the first library and academic building, which was renamed Hathorn Hall. The Cobb Divinity School became affiliated with the college in 1866. Four years later, in 1870, Bates sponsored a college preparatory school, called the Nichols Latin School. The college was affected by the financial panic of the later 1850s and required additional funding to remain operational. Cheney's impact in Maine was noted by Boston business magnate Benjamin Bates, who developed an interest in the college. Bates gave $100,000 in personal donations and overall contributions valued at $250,000 to the college. The school was renamed Bates College in his honor in 1863 and was chartered to offer a liberal arts curriculum beyond its original theological focus. Two years later the college would graduate the first woman to receive a college degree in New England, Mary Mitchel. The college began instruction with a six-person faculty tasked with the teaching of moral philosophy and the classics. From its inception, Bates College served as an alternative to the more traditional and historically conservative Bowdoin College. There is a complex relationship between the two colleges, revolving around socioeconomic class, academic quality, and collegiate athletics.
https://en.wikipedia.org/wiki?curid=319763
757,737
1,968,285
In 1989 Clayton accepted a professorship at Clemson University to develop a graduate research program in astrophysics there. He began this academic segment (1989–present) by hiring three talented young astrophysicists to vitalize joint research with the Compton Gamma Ray Observatory (launched in 1991 after several delays). Its four instruments successfully detected gamma-ray lines identifying several of the radioactive nuclei that Clayton had predicted to be present in supernova remnants. Clayton had been designated ten years earlier Co-Investigator on the NASA proposal submitted by James Kurfess for the Oriented Scintillation Spectrometer Experiment OSSE, one of the four successful instruments carried into orbit by Space Shuttle "Atlantis", and he carried that research contract to Clemson. Simultaneously Clayton developed at Clemson his stardust research, introducing annual workshops for its researchers. The initial NASA-sponsored workshop at Clemson in 1990 was so lively that it was repeated the following year jointly with Washington University in St. Louis cosponsorship, and in later years cosponsored also by the University of Chicago and by the Carnegie Institution of Washington. These workshops featured the excitement of new isotopic discoveries, and also helped participants focus their ideas for submission of abstracts to NASA's Lunar and Planetary Science Conference. Otherwise participants' workshop discussions were not shared or publicized.
https://en.wikipedia.org/wiki?curid=36061850
1,967,155