doc_id | text | source | __index_level_0__ |
---|---|---|---|
1,857,051 |
"…Man has an unprecedented control over the world and everything in it. And so, whether he likes it or not, what happens next is very largely up to him." He notes that when he spoke those words he could have had no idea that man might have unleashed forces that are now altering the Earth's climate. The naturalist highlights several meteorological and climatological catastrophes: Hurricane Katrina, the collapse of glaciers in Greenland, drought in the Amazon, forest fires in Australia, and one of Europe's hottest summers (which caused 27,000 deaths). He wonders whether, somehow, there is a connection between these events. Scientists all over the world are linking the changes in the Earth's weather to a global rise in temperatures. The actual figure is just 0.6 °C since 1900, but this is only an average. For example, the Arctic has warmed by up to 3 °C, and this threatens its whole ecosystem. A team has been surveying polar bears in the region for the last 25 years, and over that time the animals have declined in number by a quarter. Each year the Arctic ice is also now melting three weeks earlier. The overall rate of glacier melt is accelerating: in southern Greenland, the amount of ice flowing into the sea has doubled in a decade, resulting in a rise in sea levels. This is exacerbated by the increase in temperatures, which causes the oceans to expand. When Hurricane Katrina struck New Orleans, the sea temperatures of the Gulf of Mexico and the Atlantic Ocean were the highest ever recorded. In addition, the 2005 hurricane season was the worst ever. Scientists who have studied such severe weather warn that from now on hurricanes in the area will be more intense, more destructive and possibly more frequent. Also in 2005, the Amazon region suffered its worst drought in 60 years, decimating local fish populations. Six months later, trees had still not recovered.
The abnormally warm seas in the Atlantic had disrupted rainfall over the forest, and for similar reasons coral reefs are also at risk, leading to the phenomenon of coral bleaching.
|
https://en.wikipedia.org/wiki?curid=7564445
| 1,855,983 |
933,189 |
Likewise, railroads changed the style of transportation. For the common person in the early 1800s, travel was usually by horse or stagecoach. The network of trails along which coaches navigated was riddled with ditches, potholes, and stones. This made travel fairly uncomfortable. Adding insult to injury, coaches were cramped, with little leg room. Travel by train offered a new style. Locomotives proved themselves a smooth, headache-free ride with plenty of room to move around. Some passenger trains offered meals in the spacious dining car followed by a good night's sleep in the private sleeping quarters. Railroad companies in the North and Midwest constructed networks that linked nearly every major city by 1860. In the heavily settled Corn Belt (from Ohio to Iowa), over 80 percent of farms were within of a railway. A large number of short lines were built, but due to a fast-developing financial system based on Wall Street and oriented to railway securities, the majority were consolidated into 20 trunk lines by 1890. Most of these railroads made money, and ones that didn't were soon bought up and incorporated into a larger system or "rationalized". Although the transcontinental railroads dominated the media, with the completion of the First transcontinental railroad in 1869 dramatically symbolizing the nation's unification after the divisiveness of the Civil War, most construction actually took place in the industrial Northeast and agricultural Midwest, and was designed to minimize shipping times and costs. The railroads in the South were repaired and expanded and then, after a lot of preparation, changed from a 5-foot gauge to the standard gauge of 4 feet 8½ inches over two days in May 1886.
|
https://en.wikipedia.org/wiki?curid=587997
| 932,697 |
225,620 |
Nanomaterials are used in a variety of manufacturing processes, products and healthcare applications, including paints, filters, insulation and lubricant additives. In healthcare, nanozymes are nanomaterials with enzyme-like characteristics. They are an emerging type of artificial enzyme that has been used in a wide range of applications such as biosensing, bioimaging, tumor diagnosis, antibiofouling and more. High-quality filters may be produced using nanostructures; these filters are capable of removing particulates as small as a virus, as seen in a water filter created by Seldon Technologies. Nanomaterials membrane bioreactors (NMs-MBR), the next generation of the conventional MBR, have recently been proposed for the advanced treatment of wastewater. In the air purification field, nanotechnology was used to combat the spread of MERS in Saudi Arabian hospitals in 2012. Nanomaterials are being used in modern, human-safe insulation technologies; in the past they were found in asbestos-based insulation. As a lubricant additive, nanomaterials can reduce friction in moving parts. Worn and corroded parts can also be repaired with self-assembling anisotropic nanoparticles called TriboTEX. Nanomaterials have also been applied in a range of industries and consumer products. Mineral nanoparticles such as titanium oxide have been used to improve UV protection in sunscreen. In the sports industry, lighter bats have been produced with carbon nanotubes to improve performance. Another application is in the military, where mobile pigment nanoparticles have been used to create more effective camouflage. Nanomaterials can also be used in three-way catalyst (TWC) applications. TWC converters have the advantage of controlling the emission of nitrogen oxides (NOx), which are precursors to acid rain and smog. In the core-shell structure, nanomaterials form the shell, acting as a catalyst support to protect noble metals such as palladium and rhodium. 
The primary function of the supports is to carry the catalysts' active components, keeping them highly dispersed, reducing the use of noble metals, enhancing catalyst activity, and improving mechanical strength.
|
https://en.wikipedia.org/wiki?curid=868108
| 225,504 |
852,449 |
Soldiers of an Infantry Brigade Combat Team received the first APMI cartridges in March 2011, with plans to field them in all seven deployed IBCTs within six months. The first shell was fired on 26 March by Company C, 1-506th Infantry, 4th Brigade Combat Team, 101st Airborne Division, and landed four meters from the target. Heavy mortars are traditionally employed at battalion level for immediate fire suppression and support, but they were the primary indirect fire weapon available to remote forward operating bases, so guided mortars gave a battalion commander accurate artillery fire without needing to request an M982 Excalibur from a brigade-level howitzer. The ATK XM395 PGMM cartridge uses a standard M934 high-explosive 120 mm projectile body with a GPS receiver in the nose and computer-controlled aerodynamic directional fins for stability and to keep it on the programmed trajectory. It has a multi-mode fuse offering airburst, point-detonation, and delay settings. Unguided 120 mm mortars have accuracy of at maximum range, which can be reduced to with precision position and pointing systems. The PGMM can hit within 10 meters of a target, and often hits within four, making it seven times more accurate. Although not designed to replace unguided mortars, the PGMM allows mortar teams to eliminate point targets that would otherwise require 8-10 rounds using only one or two. This lengthens the amount of time a typical team with 25 rounds can operate and increases the number of targets it can engage without needing resupply. It also expands the potential target zones: areas that previously had to be cleared by soldiers on foot, because inaccurate artillery would cause collateral damage, can now be engaged; insurgents deliberately attacked from populated areas hoping troops wouldn't risk civilian casualties. The XM395 kit costs $10,000 each, much less than the guided Excalibur 155 mm shell. There is currently no requirement for guidance for 60 mm or 81 mm mortars.
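As a rough sketch of the ammunition arithmetic above (using only the figures quoted in the text: a 25-round loadout, 8-10 unguided rounds per point target versus 1-2 guided rounds; the helper function is purely illustrative):

```python
# Illustrative ammunition arithmetic for a mortar team with a 25-round
# loadout, using the per-target figures quoted above.

LOADOUT_ROUNDS = 25

def targets_engageable(rounds_per_target: int, loadout: int = LOADOUT_ROUNDS) -> int:
    """Whole point targets a team can service before needing resupply."""
    return loadout // rounds_per_target

# Unguided fire (8-10 rounds per target): only 2-3 targets per loadout.
print([targets_engageable(n) for n in (10, 8)])   # [2, 3]
# Guided fire (1-2 rounds per target): 12-25 targets per loadout.
print([targets_engageable(n) for n in (2, 1)])    # [12, 25]
```

Even in the worst guided case, a single loadout covers several times as many targets, which is the operational point the passage makes about resupply.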
|
https://en.wikipedia.org/wiki?curid=6805466
| 851,995 |
2,053,521 |
Over the last few decades, scientists have begun to study how the billions of bacteria, fungi and other microbes living in the human stomach and intestines affect our health. What we eat determines which of these microorganisms thrive, and the composition of the intestinal flora has a great influence on other parts of our body. For example, some intestinal bacteria cause inflammation in our immune system, while other bacteria secrete substances that penetrate the blood or block arteries, which helps explain why heart disease patients have different microbes and health conditions. Grains of paradise, a member of the ginger family, grows in vine-choked swampy areas of West Africa. It is a plant that gorillas like to eat, and it contains a powerful anti-inflammatory compound. It grows up to 1.5 meters tall, with trumpet-shaped flowers and reddish-brown seeds. Gorillas use the plant to make nests on the ground and beds that they sleep in overnight; they also use the seeds to treat coughs, toothaches and measles. The plant also provides comfort and warmth to the weak and cold bodies of gorillas. The invention of processed high-calorie biscuits containing vitamins and nutrients, supplemented with several fruits and vegetables, ultimately helped to standardize the diet of captive gorillas. The biscuit diet began to prolong their lives; gorillas on it looked healthier and could sometimes survive for 50 years. Researchers found, however, that the biscuit diet has many shortcomings. Although gorillas are genetically similar to humans, their digestive systems are very different and more like those of horses. Like a horse, a gorilla processes food primarily in its very long large intestine, not in the stomach. This means gorillas are good at breaking down fiber, but not very good at handling sugar or grain. If zookeepers feed them sweet potatoes or commercially grown fruit, they will eat them, but this does not give them much energy.
|
https://en.wikipedia.org/wiki?curid=60770250
| 2,052,339 |
1,884,483 |
The entry of the United States into the war in late 1941, however, brought a significant period of change to the College, with eighty-five acres of its land and the majority of its buildings being transferred in March 1942 to the United States Army for hospital purposes. The 153rd Station Hospital occupied the site briefly until July 1942, when it was replaced by the 105th General Hospital Unit. Only twenty-four students and a drastically reduced staff remained on campus. The College administration moved to the newly completed Cooper Laboratory, and Riddell Dormitory was retained until September 1942, by which time temporary buildings had been constructed for the College by the Department of Public Works in the northeast corner of the campus. The College also occupied the nearby College View State School as a laboratory from March 1942 to April 1943. In January 1943 more temporary buildings were erected for the College, which re-opened for enrolments in February 1943. College wartime work included the testing of alternative fuels and growing crops of opium poppy, urgently needed in wartime for the production of morphine. Extensive temporary facilities were erected by the Civil Construction Corps for the military hospital, including nearly two dozen large timber hospital wards, interconnected by covered walkways, on the eastern side of the campus core. A large "tent city" was established to the south of the core, serving as living quarters for soldiers undergoing rehabilitation. Existing buildings were altered to serve a variety of wartime purposes. The Foundation Building was used as both the administrative headquarters for the US Army and as a laboratory and pharmacy, its verandahs enclosed to provide more space. Shelton Hall (now Morrison Hall) was used as the hospital, its dormitories well suited for use as hospital wards, with dental services, X-ray facilities and operating theatres located on the ground floor. 
In 1943 a 'U'-shaped morgue was constructed, used for the examination and preparation of deceased soldiers for transportation back to their families in the United States. In 1944 the two most northerly wings of the building were removed prior to the Americans leaving the College, and from 1945 the remaining section was utilised as a residence and later a girls' change room before being converted into a small chapel in 1959. 19,000 patients from the battlefields of the Pacific and New Guinea were treated at Gatton during the period of occupation by the US Army.
|
https://en.wikipedia.org/wiki?curid=44321024
| 1,883,402 |
228,316 |
Researchers have used OCT to produce detailed images of mouse brains through a "window" made of zirconia that has been modified to be transparent and implanted in the skull. Optical coherence tomography is also applicable, and increasingly used, in industrial applications such as nondestructive testing (NDT), material thickness measurements (in particular of thin silicon wafers and compound semiconductor wafers), surface roughness characterization, surface and cross-section imaging, and volume loss measurements. OCT systems with feedback can be used to control manufacturing processes. With high-speed data acquisition and sub-micron resolution, OCT is adaptable to both inline and off-line use. Due to the high volume of pills produced, an interesting field of application is in the pharmaceutical industry, to control the coating of tablets. Fiber-based OCT systems are particularly adaptable to industrial environments. They can access and scan the interiors of hard-to-reach spaces, and are able to operate in hostile environments, whether radioactive, cryogenic, or very hot. Novel optical biomedical diagnostic and imaging technologies are currently being developed to solve problems in biology and medicine. As of 2014, attempts have been made to use optical coherence tomography to identify root canals in teeth, specifically canals in the maxillary molar; however, it shows no difference from the current method of using a dental operating microscope. Research conducted in 2015 was successful in utilizing a smartphone as an OCT platform, although much work remains to be done before such a platform would be commercially viable. Photonic integrated circuits may be a promising option for miniaturized OCT. As with integrated circuits, silicon-based fabrication techniques can be used to produce miniaturized photonic systems. The first in vivo human retinal imaging with such a system has been reported recently.
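The wafer-thickness application mentioned above rests on a simple relationship: OCT measures optical path length, i.e. physical thickness multiplied by the material's group refractive index, so the thickness is recovered by dividing the reading by that index. A minimal sketch; the silicon group index used below (about 3.6 near 1310 nm) is an assumed round figure for illustration, not a calibrated constant:

```python
# Convert an OCT optical-thickness reading into physical thickness.
# OCT measures optical path length n_g * d, so d = optical / n_g.
# The group index here (~3.6 for silicon near 1310 nm) is an assumed,
# illustrative value, not a calibrated constant.

def physical_thickness_um(optical_thickness_um: float, group_index: float) -> float:
    """Physical thickness (um) from an OCT optical-thickness measurement."""
    return optical_thickness_um / group_index

# A 2520 um optical reading through silicon (n_g ~ 3.6) -> a 700 um wafer.
print(physical_thickness_um(2520.0, 3.6))  # 700.0
```

The same division is why an accurate group index matters more than raw axial resolution when OCT is used as a metrology tool.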
|
https://en.wikipedia.org/wiki?curid=628583
| 228,199 |
1,591,304 |
In the second round, Georgetown met the region's tenth seed, No. 23 Davidson, the winner of 23 straight games, led by junior point guard Stephen Curry, a future National Basketball Association star. Before defeating Gonzaga in the first round two days earlier, Davidson had not won an NCAA tournament game in 39 years, but when asked about the match-up with Davidson after the win over UMBC, John Thompson III said, "Watching Davidson makes me feel worried...At this point, you just need to figure out how to get a win and move on." Despite Thompson's qualms, heavily favored Georgetown seemed to have the game under control in the first half: After a 2-0 Davidson lead, the Hoyas – featuring the country's top defense entering the game, allowing opponents to shoot only 37 percent from the field and allowing an average of only 57.6 points per game – built a 38-27 lead by halftime. Shooting 71 percent from the floor early in the second half and with Curry missing 10 of his first 12 shots of the game, Georgetown extended the lead to 46-29 with 17:56 left to play. But then Curry came alive, and the entire character of the game changed. Led by Curry, Davidson went on a 16-2 run that cut Georgetown's lead to 50-48 with 8:47 left to play. Thanks largely to Curry, for whom the Hoyas suddenly had no answer, the Wildcats went on to take a 60-58 lead – their first lead since 2-0 – with 4:40 left, and Curry then scored a three-pointer and a two-pointer to stretch the lead to 65-60. Curry made six of his last nine shots from the field and five of six free throws in the final 23 seconds and scored 30 points – 25 of them in the second half – and Davidson upset Georgetown 74-70 to extend its winning streak, the longest active streak in the country at the time, to 24 games and bring Georgetown's season to a stunning end. 
The Hoyas shot 63 percent from the field as a team – Jessie Sapp scored 14 points, Jonathan Wallace had 12, and DaJuan Summers finished with 10 – but Roy Hibbert managed only six points and a single rebound and Georgetown committed 20 turnovers. The loss to Davidson was the first in what became a pattern of upsets in, and early exits from, the NCAA tournament over the next several years.
|
https://en.wikipedia.org/wiki?curid=14518554
| 1,590,409 |
1,747,905 |
Science fiction prototyping has a number of applications. The most obvious is product innovation, of which the two earliest examples are Intel's 21st Century Robot (an open innovation project to develop a domestic robot) and Essex University's eDesk (a mixed-reality immersive education desk), both of which were introduced in the previous section. Beyond product innovation, science fiction prototyping is being applied to many diverse areas. For example, researchers at the University of Washington (USA) have used it to facilitate broader contextual and societal thinking about computers, computer security risks, and security defense as part of an optional senior-level course in computer security. In 2014, these ideas were refined into an SFP methodology called Threatcasting, with early adopters including the United States Air Force Academy, the Government of California, and the Army Cyber Institute at West Point Military Academy. An earlier variation called Futurcasting was used by governments as a tool to influence the direction of society and politics. It did this by using stories about possible futures as a medium to engage the population in conversations about futures they would like to encourage or avoid. Science fiction prototyping is also being used in business environments. For example, at the Canterbury Christ Church University (UK) Business School it is being used as a vehicle to introduce creative thinking in support of entrepreneurship courses. At National Taiwan University (Taiwan), it is used to increase business school students' interest in science and technology for business innovation. Elsewhere, the business schools of the universities of Leeds and Manchester (UK) are exploring its use in community development projects. Finally, it is being applied to education. 
For example, the Department of Learning Design and Technology at San Diego State University (USA) has explored it as a means of motivating pre-university students to take up STEM studies and careers. Further afield, in China, a novel use has been identified for the methodology: addressing the mandatory requirement for all science and engineering students to take a course in English language. In particular, Shijiazhuang University (China) is exploring the potential of science fiction prototyping to overcome the dullness that some science students experience in language learning by using it as an integrated platform for teaching "Computer English", combining language and science learning. China is also keen to improve the creative and innovative capabilities of its graduates, which this approach supports.
|
https://en.wikipedia.org/wiki?curid=42519513
| 1,746,919 |
92,357 |
Current trends in treating the disorder include medications for symptom-based treatment that aim to minimize the secondary characteristics associated with the disorder. If an individual is diagnosed with FXS, genetic counseling for testing family members at risk of carrying the full mutation or premutation is a critical first step. Due to the higher prevalence of FXS in boys, the most commonly used medications are stimulants that target hyperactivity, impulsivity, and attentional problems. For disorders co-morbid with FXS, antidepressants such as selective serotonin reuptake inhibitors (SSRIs) are utilized to treat the underlying anxiety, obsessive-compulsive behaviors, and mood disorders. After antidepressants, antipsychotics such as risperidone and quetiapine are used to treat the high rates of self-injurious, aggressive and aberrant behaviors in this population (Bailey Jr et al., 2012). Anticonvulsants are another set of pharmacological treatments, used to control seizures as well as mood swings in 13%–18% of individuals with FXS. Drugs targeting mGluR5 (metabotropic glutamate receptor 5), which is linked with synaptic plasticity, are especially beneficial for the targeted symptoms of FXS. Lithium is also currently being used in clinical trials with humans, showing significant improvements in behavioral functioning, adaptive behavior, and verbal memory. A few studies have suggested using folic acid, but more research is needed due to the low quality of that evidence. Alongside pharmacological treatments, environmental influences such as the home environment and parental abilities, as well as behavioral interventions such as speech therapy and sensory integration, all factor in together to promote adaptive functioning for individuals with FXS. While metformin may reduce body weight in persons with fragile X syndrome, it is uncertain whether it improves neurological or psychiatric symptoms.
|
https://en.wikipedia.org/wiki?curid=53142
| 92,316 |
1,405,721 |
In the past, physicians traditionally hand-wrote or verbally communicated orders for patient care, which were then transcribed by various individuals (such as unit clerks, nurses, and ancillary staff) before being carried out. Handwritten reports or notes, manual order entry, non-standard abbreviations and poor legibility lead to errors and injuries to patients. A follow-up IOM report in 2001 advised use of electronic medication ordering, with computer- and internet-based information systems to support clinical decisions. Prescribing errors are the largest identified source of preventable hospital medical error. A 2006 report by the Institute of Medicine estimated that a hospitalized patient is exposed to a medication error each day of his or her stay. Further studies have estimated that CPOE implementation at all nonrural hospitals in the United States could prevent over 500,000 serious medication errors each year. Studies of computerized physician order entry (CPOE) have yielded evidence suggesting that the medication error rate can be reduced by 80%, and that errors with the potential for serious harm or death can be reduced by 55%; other studies have also suggested benefits. Further, in 2005, CMS and CDC released a report showing that only 41 percent of prophylactic antibacterials were correctly stopped within 24 hours of completed surgery. The researchers conducted an analysis over an eight-month period, implementing a CPOE system designed to stop the administration of prophylactic antibacterials. Results showed CPOE significantly improved the timely discontinuation of antibacterials, from 38.8 percent of surgeries to 55.7 percent, in the intervention hospital. CPOE/e-prescribing systems can provide automatic dosing alerts (for example, letting the user know that the dose is too high and thus dangerous) and interaction checking (for example, telling the user that two medicines ordered together can cause health problems). 
In this way, specialists in pharmacy informatics work with the medical and nursing staffs at hospitals to improve the safety and effectiveness of medication use by utilizing CPOE systems.
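The dosing-alert and interaction-check behaviour described above can be sketched as a pair of table lookups against a reference of dose limits and interacting pairs. All drug names, limits, and the interacting pair below are invented placeholders for illustration, not clinical reference data:

```python
# Minimal sketch of CPOE-style safety checks: a dose-range alert and a
# drug-drug interaction check. All drugs, limits, and pairs below are
# hypothetical placeholders, not clinical reference data.

MAX_DAILY_DOSE_MG = {"drug_a": 4000, "drug_b": 40}       # assumed dose limits
INTERACTING_PAIRS = {frozenset({"drug_a", "drug_c"})}    # assumed bad pair

def check_order(drug: str, daily_dose_mg: float, current_meds: list[str]) -> list[str]:
    """Return alert messages for a new order; an empty list means no alerts."""
    alerts = []
    limit = MAX_DAILY_DOSE_MG.get(drug)
    if limit is not None and daily_dose_mg > limit:
        alerts.append(f"dose alert: {drug} {daily_dose_mg} mg/day exceeds {limit} mg/day")
    for med in current_meds:
        if frozenset({drug, med}) in INTERACTING_PAIRS:
            alerts.append(f"interaction alert: {drug} + {med}")
    return alerts

# An excessive dose of drug_a for a patient already on drug_c fires both alerts.
print(check_order("drug_a", 5000, ["drug_c"]))
```

A production CPOE system layers patient-specific factors (weight, renal function, allergies) on top of such lookups, but the alerting pattern is the same: check each new order against reference tables and the patient's active medication list.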
|
https://en.wikipedia.org/wiki?curid=2021968
| 1,404,931 |
2,055,996 |
Humans are strongly visually oriented. We like information in the form of pictures and are able to integrate many different kinds of data when they are presented in one or more images. It seems natural to seek a way of directly imaging the sediment-water interface in order to investigate animal-sediment interactions in the marine benthos. Rhoads and Cande (1971) took pictures of the sediment-water interface at high resolution (sub-millimetre) over small spatial scales (centimetres) in order to examine benthic patterns through time or over large spatial scales (kilometres) rapidly. Slicing into seabeds and taking pictures instead of physical cores, they analysed images of the vertical sediment profile in a technique that came to be known as SPI. This technique advanced in subsequent decades through a number of mechanical improvements and digital imaging and analysis technology. SPI is now a well-established approach accepted as standard practice in several parts of the world, though its wider adoption has been hampered partly because of equipment cost, deployment, and interpretation difficulties. It has also suffered some paradigm setbacks. The amount of information that a person can extract from imagery, in general, is not easily and repeatedly reduced to quantifiable and interpretable values (but see Pech et al. 2004; Tkachenko 2005). Sulston and Ferry (2002) wrote about this difficulty in relation to the study of the human genome. Electron microscope images of their model organism ("Caenorhabditis elegans") carried a lot of information but were ignored by many scientists because they were not readily quantified, yet that pictorial information ultimately resulted in a deep, and quantifiable, understanding of underlying principles and mechanisms. In the same way, SPI has been used successfully by focusing on the integration of visual data and a few objectively quantifiable parameters in site reconnaissance and monitoring.
|
https://en.wikipedia.org/wiki?curid=14917968
| 2,054,813 |
1,516,158 |
She continued her education intending to study mathematics, astronomy and physics at Swarthmore College. However, Richardson instead graduated Phi Beta Kappa with a bachelor's degree in philosophy and a minor in physics in 1962 before pursuing graduate work in philosophy at Harvard University. Meanwhile, she was able to enroll in plant taxonomy and evolution courses at Harvard that would later contribute to her big-picture approach to studying protein structure. Since Harvard's philosophy program focused on modern philosophy rather than her interest, classical philosophy, Richardson left Harvard with her master's degree in 1966. After graduation, Richardson tried teaching high school, but soon realized that this career path was not for her. She subsequently rejoined the scientific world, working as a technician at the Massachusetts Institute of Technology in the same laboratory as her husband, David Richardson, whom she had met at Swarthmore College. At MIT, David Richardson was pursuing his doctorate in Al Cotton's lab, using X-ray crystallography to study the structure of staphylococcal nuclease. Jane Richardson learned the necessary technical skills and scientific background in biochemistry and biophysics through her work at the lab alongside her husband, with whom she still works today. Richardson later began drawing her eponymous diagrams as a method of interpreting the structures of protein molecules. Over the course of her career, Richardson has been recognized by many prestigious institutions in the scientific community. In July 1985 she was awarded a MacArthur Fellowship for her work in biochemistry. She was elected to the National Academy of Sciences and the American Academy of Arts and Sciences in 1991, and to the Institute of Medicine in 2006. As part of her role in the National Academy of Sciences, Richardson serves on panels that advise the White House and the Pentagon on nationally important scientific matters. 
Richardson was elected president of the Biophysical Society for the 2012-2013 year, and she became a fellow of the American Crystallographic Association in 2012. Richardson is currently a James B. Duke Professor of Biochemistry at Duke University.
|
https://en.wikipedia.org/wiki?curid=11502437
| 1,515,306 |
1,731,619 |
The central thesis is that harm reduction is not only a social concept but also a biological one: evolution makes no moral distinctions in the selection process, but employs harm reduction to promote survival of the fittest, with cannabis as the central example. The oldest pollen indicates that "Cannabis" and "Humulus" diverged between 18.23 and 27.8 million years ago (Ma), consistent with Cannabis dated to 19.6 Ma in northwestern China. The evidence converges on the northeastern Tibetan Plateau, in the general vicinity of Qinghai Lake, which is deduced as the Cannabis centre of origin and co-localizes with the first steppe community that evolved in Asia. An alternative candidate region is Yunnan, in the southwest of China, also identified as "the birthplace of tea ... the first area where humans figured out that eating tea leaves or brewing a cup could be pleasant", and helpful, through the cannabimimetic bioactivity of the catechin derivatives occurring in tea leaves. Evidence from the peer-reviewed scientific literature supports the hypothesis that humans, and all animals, make and use internally produced cannabis-like products (endocannabinoids) as part of this evolutionary harm reduction program; the primordial CB receptor evolved at least 600 million years ago, a date broadly consistent with the Cambrian explosion. More specifically, endocannabinoids homeostatically regulate all body systems (cardiovascular, digestive, endocrine, excretory, immune, nervous, musculo-skeletal, reproductive), and modulating endocannabinoid activity has therapeutic potential in almost all diseases affecting humans. 
Therefore, the health of each individual depends on this system working appropriately. One can imagine what could be achieved if signaling through these receptors could be controlled: happy, slim, and healthy people who remain pain-free by forgetting and ignoring pain, achieved through cannabis, the evolutionary byproduct of a plant that evolved to affect the ECS and become its natural key, a system stemming back to aquatic species 400 million years before the arrival of plants and trees.
|
https://en.wikipedia.org/wiki?curid=20853174
| 1,730,643 |
1,917,001 |
The International Medical Faculty of Osh State University is a higher educational establishment in the south of Kyrgyzstan. More than 2,202 foreign students study at the faculty, and a teaching staff of 151 works at the 6 departments of the IMF. 10 Doctors of Medical Science, 29 Candidates of Sciences, 2 PhDs, 2 Honored Doctors, 11 senior teachers, 91 teachers, and more than 30 Excellent Workers of Public Health and Education are successfully working at the faculty. Teachers of the IMF actively participate in international conferences and workshops, and they also organize students' research clubs. The main directions of scientific work at the International Medical Faculty are: • Histology and embryology: morphological features of the placenta in women with full-term pregnancy, considering age, constitutional causes, and ethnic factors. • Pathophysiology: investigation of the morpho-functional features of the mucous membranes of the upper respiratory tract. • Biochemistry and chemistry: the development of drugs based on natural compounds and the synthesis of active substances (against HIV, cancer cells, and tuberculosis). • Rheumatology: clinical and immunological features of pulmonary disease in rheumatoid arthritis. • Pediatrics: clinical and functional characteristics of non-rheumatic heart damage in children in the southern region of the republic. • Surgery: modern possibilities of combined low-invasive correction of the bile duct; surgical tactics for hemorrhoids complicated by anemia. • Gynecology: surface activity of amniotic fluid and placental microstructures in low-mountain, mid-mountain, and high-mountain areas of Kyrgyzstan. • Pharmacology: pharmacoepidemiology of drugs for community-acquired pneumonia in the elderly in the Kyrgyz Republic. • Challenges of global civilization: social and humanitarian aspects.
|
https://en.wikipedia.org/wiki?curid=12800364
| 1,915,902 |
1,603,243 |
Current biological anthropology suggests that similarities in brain structures can, to an extent, be compared with certain aspects of behavior as their roots. However, it is difficult to quantify exactly which neuron connections are required for advanced function as opposed to the basic reactionary cognitive operations identified in small insects or other small-brained organisms. Regardless, circuitry common to a wide range of organisms has been identified, suggesting at least a convergent evolution of common neural architectures that allow for common functions and trends of inherited behavior. It is possible that this is because brain size correlates directly with degree of function. However, experiments carried out on insects by Martin Giurfa in 2015, observing honey bees and fruit flies, suggest that structures in the brain, regardless of size, can relate to functions and explain behavioral skills far better than gross size can: "As in larger brains, two basic neural architectural principles of many invertebrate brains are the existence of specialized brain structures and circuits, which refer to specific sensory domains, and of higher-order integration centres, in which information pertaining to these different domains converges and is integrated, thus allowing cross-talking and information transfer. These characteristics may allow positive transfer from a set of stimulus to novel ones, even if these belong to different sensory modalities. This principle appears crucial for certain tasks such as rule learning." To this end, recent years have instead been dedicated to mapping signals and pathways of the brain in order to compare across species, as opposed to using brain size. Further studies in this field are ongoing, especially as techniques for tracking and stimulating neuron development change.
|
https://en.wikipedia.org/wiki?curid=12185843
| 1,602,342 |
23,593 |
Kurt Lewin taught at Cornell from 1933 to 1935 and is considered the "father of social psychology". Norman Borlaug taught at the university from 1982 to 1988 and is considered the "father of the Green Revolution", being awarded the Nobel Peace Prize, the Presidential Medal of Freedom, the Congressional Gold Medal, and 49 honorary doctorates. Frances Perkins joined the Cornell faculty in 1952 after serving as the first female member of the United States Cabinet and served until her death in 1965. Perkins was a witness to the Triangle Shirtwaist Factory fire in her adolescence and went on to champion the National Labor Relations Act, the Fair Labor Standards Act, and the Social Security Act while United States Secretary of Labor. Buckminster Fuller was a visiting professor at Cornell for one year (1952), and Henry Louis Gates, African American Studies scholar and subject of an arrest controversy and White House "Beer Summit", taught at Cornell from 1985 to 1989. Plant genetics pioneer Ray Wu invented the first method for sequencing DNA, considered a major breakthrough in genetics as it has enabled researchers to more closely understand how genes work. Emmy Award-winning actor John Cleese, known for his roles in "Monty Python", "James Bond", "Harry Potter" and "Shrek", has taught at Cornell since 1999. Charles Evans Hughes taught in the law school from 1893 to 1895 before becoming Governor of New York, United States Secretary of State, and Chief Justice of the United States. Georgios Papanikolaou, who taught at Cornell's medical school from 1913 to 1961, invented the Pap smear test for cervical cancer. Robert C. Baker ('43), widely credited for inventing the chicken nugget, taught at Cornell from 1957 to 1989. Carl Sagan was a professor at the university from 1968 to 1996. He narrated and co-wrote the PBS series "", the Emmy Award- and Peabody Award-winning show that became the most watched series in public-television history. 
He also wrote the novel "Contact", the basis for a 1997 film of the same name, and he won a Pulitzer Prize for his novel "The Dragons of Eden: Speculations on the Evolution of Human Intelligence". M. H. Abrams was a professor emeritus of English and was the founding editor of "The Norton Anthology of English Literature". James L. Hoard, a scientist who worked on the Manhattan Project and an expert in crystallography, was a professor emeritus of chemistry and taught from 1936 to 1971.
|
https://en.wikipedia.org/wiki?curid=7954422
| 23,584 |
1,885,820 |
In 1908, Chamberlin was hired to lead the Biology Department at Brigham Young University (BYU), a university owned and operated by the Church of Jesus Christ of Latter-day Saints (LDS Church), during a period in which BYU president George H. Brimhall sought to increase its academic standing. LDS College professor J. H. Paul, in a letter to Brimhall, had written that Chamberlin was "one of the world's foremost naturalists, though, I think, he is only about 28 years of age. I have not met his equal ... We must not let him drift away". Chamberlin oversaw expanded biology course offerings and led insect-collecting trips with students. Chamberlin joined a pair of newly hired brothers on the faculty, Joseph and Henry Peterson, who taught psychology and education. Chamberlin and the two Petersons worked to increase the intellectual standing of the university. In 1909 Chamberlin's own brother William H. Chamberlin was hired to teach philosophy. The four academics, all active members of the Church, were known for teaching modern scientific and philosophic ideas and encouraging lively debate and discussion. The Chamberlins and Petersons held the belief that the theory of evolution was compatible with religious views, and promoted historical criticism of the Bible, the view that the writings it contains should be viewed in the context of their time: Ralph Chamberlin published essays in the "White and Blue", BYU's student newspaper, arguing that Hebrew legends and historical writings were not to be taken literally. In an essay titled "" Chamberlin concluded: "Only the childish and immature mind can lose by learning that much in the Old Testament is poetical and that some of the stories are not true historically." Chamberlin believed that evolution explained not only the origin of organisms but of human theological beliefs as well.
|
https://en.wikipedia.org/wiki?curid=3997448
| 1,884,738 |
159,100 |
The first Titan II launch, Missile N-2, was carried out on 16 March 1962 from LC-16 at Cape Canaveral and performed extremely well, flying downrange and depositing its reentry vehicle in the Ascension splash net. There was only one problem: a high rate of longitudinal vibrations during first stage burn. While this did not affect missile launches for the Air Force, NASA officials were concerned that this phenomenon would be harmful to astronauts on a crewed Gemini flight. The second launch, Missile N-1, lifted from LC-15 on 7 June. First stage performance was near-nominal, but the second stage developed low thrust due to a restriction in the gas generator feed. The Range Safety officer sent a manual shutdown command to the second stage, causing premature RV separation and impact well short of the intended target point. The third launch, Missile N-6 on 11 July, was completely successful. Aside from pogo oscillation (the nickname NASA engineers invented for the Titan's vibration problem since it was thought to resemble the action of a pogo stick), the Titan II was experiencing other teething problems that were expected of a new launch vehicle. The 25 July test (Vehicle N-4) had been scheduled for 27 June, but was delayed by a month when the Titan's right engine experienced severe combustion instability at ignition that caused the entire thrust chamber to break off of the booster and fall down the flame deflector pit, landing about 20 feet from the pad (the Titan's onboard computer shut the engines down the moment loss of thrust occurred). The problem was traced to a bit of cleaning alcohol carelessly left in the engine. A new set of engines had to be ordered from Aerojet, and the missile lifted off from LC-16 on the morning of 25 July. The flight went entirely according to plan up to first stage burn, but the second stage malfunctioned again when the hydraulic pump failed and thrust dropped nearly 50%. 
The computer system compensated by running the engine for an additional 111 seconds, until propellant depletion occurred. Because the computer had not sent a cutoff command, reentry vehicle separation and the vernier solo phase did not occur. Impact occurred downrange at half the planned distance.
|
https://en.wikipedia.org/wiki?curid=841594
| 159,018 |
746,886 |
Ostrom's early work emphasized the role of public choice on decisions influencing the production of public goods and services. Among her better known works in this area is her study on the polycentricity of police functions in Indianapolis. Caring for the commons had to be a multiple task, organised from the ground up and shaped to cultural norms. It had to be discussed face to face, and based on trust. Dr. Ostrom, besides poring over satellite data and quizzing lobstermen herself, enjoyed employing game theory to try to predict the behaviour of people faced with limited resources. In her Workshop in Political Theory and Policy Analysis at Indiana University—set up with her husband Vincent, a political scientist, in 1973—her students were given shares in a national common. When they discussed what they should do before they did it, their rate of return from their "investments" more than doubled. Her later, and more famous, work focused on how humans interact with ecosystems to maintain long-term sustainable resource yields. Common pool resources include many forests, fisheries, oil fields, grazing lands, and irrigation systems. She conducted her field studies on the management of pasture by locals in Africa and irrigation systems management in villages of western Nepal (e.g., Dang Deukhuri). Her work has considered how societies have developed diverse institutional arrangements for managing natural resources and avoiding ecosystem collapse in many cases, even though some arrangements have failed to prevent resource exhaustion. Her work emphasized the multifaceted nature of human–ecosystem interaction and argues against any singular "panacea" for individual social-ecological system problems.
|
https://en.wikipedia.org/wiki?curid=5033761
| 746,490 |
1,852,637 |
Sutherland's home life meant a lot to him; it was a home of affection and culture, and every member of it excelled in literature, music, or art. In July 1882 Sutherland was offered the position of superintendent of the School of Mines, Ballarat, but it was too far from his home and the public library, and the offer was declined. For many years he earned just enough to pay his way by acting as an examiner and contributing articles to the press; the rest of his time was given to scientific research. In 1884 he applied without success for the chair of chemistry at the University of Adelaide, and in 1888, when the professor of natural philosophy Henry Martyn Andrew died, Sutherland was appointed lecturer at the University of Melbourne until the chair could be filled. Sutherland had applied for this position through the Victorian agent-general in London, but the application was reportedly mis-filed and was not considered. Professor Thomas Ranken Lyle was appointed, and in 1897, when Lyle was away on leave, Sutherland was again made lecturer. Sutherland had begun contributing to the Philosophical Magazine in 1885, and on average about two articles a year from his pen appeared in it for the next 25 years. For the last 10 years of his life he was a regular contributor and leader writer for the Melbourne "Age", particularly on scientific subjects. Sutherland declined an offer of an appointment on the staff of the paper. Sutherland wrote on such topics as the surface tension of liquids, diffusion, the rigidity of solids, the properties of solutions (including an influential analysis of the structure of water), the origin of spectra and the source of the Earth's magnetic field. Sutherland devoted most of his time to scientific research. A list of 69 of his contributions to scientific magazines appears in W. A. Osborne's "William Sutherland: a Biography". Sutherland died quietly in his sleep on 5 October 1911 from a ruptured heart.
|
https://en.wikipedia.org/wiki?curid=2422389
| 1,851,575 |
1,890,763 |
In her later years, Stoney suffered from ill health, largely attributed to her over-exposure to radiation in her work. It was reported that she had X-ray dermatitis of her left hand, a painful skin condition associated in modern times with radiation therapy as a treatment for cancer. Stoney moved to the south coastal town of Bournemouth in England; here she was on the staff of two hospitals, practicing radiology part-time. She occupied the position of Honorary Medical Officer to the Electrical Department of the Royal Victoria and West Hants Hospital in Bournemouth. Stoney was the founder and president of the Wessex branch of the British Institute of Radiology. She served as the consulting actinotherapist at the Victoria Cripples Home. During retirement she penned a number of articles in contribution to the medical literature of the time. She published research on topics such as fibroids, goitre, Graves' disease, soldier's heart, rickets and osteomalacia. Stoney retired from all of her hospital positions in 1928 at the age of 58. She, along with her older sister Edith, travelled in retirement. One trip was to India, where Stoney wrote her final scientific paper, the subject of which was osteomalacia (bone softening), in particular in relation to pelvic deformities in childbirth. She studied and investigated this topic overseas, and specifically the association between UV exposure, vitamin D and skeletal development. In India, she also used her expertise to advise on the use of UV light in hospitals. Stoney died at the age of 62, on 7 October 1932. She was suffering from a long and painful illness, vertebral cancer, again largely attributed to her work in the presence of high levels of radiation. The "British Journal of Radiology" published her official obituary which spanned five pages, containing many warm personal testimonials. After her sister's death, Edith Stoney continued to travel and research.
|
https://en.wikipedia.org/wiki?curid=43034967
| 1,889,680 |
646,915 |
In 1938 the institute, despite its relative poverty, built a biochemical division and another one dedicated to cellular pathology, whose direction was entrusted to the hands of Boivin (who went on to discover endotoxins that are contained in the germ's body and are freed after its death). During the same period, Andre Lwoff assumed the direction of a new microbial physiology branch built on rue Dutot. The general mobilization after France's declaration of war against Germany, in September 1939, emptied the Institute and significantly reduced its activities, as members of appropriate age and condition were recruited into the army, but the almost total absence of battles during the first months of the conflict helped maintain the sanitary situation on the front. After the occupation of France, the Germans never tried to gather information from the institute's research; their confidence in Germany's advantage in this field decreased their curiosity, and their only interest was in the serums and vaccines that it could provide to their troops or the European auxiliaries they recruited. This relative freedom allowed the institute to become, during the two years after the occupation, a pharmacy for the Resistance thanks to the initiative of Vallery-Radot, Pasteur's nephew. The Germans became suspicious of the institute's staff only after an outbreak of typhoid in a Wehrmacht division that was stationed near Paris before being sent to the Russian front. The cause of the epidemic was later found to be due to a member of the Institute stealing a culture of the germ responsible for the disease and, with the collaboration of an accomplice, infecting a large quantity of butter used to feed German troops. The fact that the epidemic spread after the Germans sold some of the butter to civilians was proof that the illness's breakout was not caused by local water quality. 
Afterward, the German authorities ordered that the institute's stores containing microbial cultures could be opened only by authorized members; similar security concerns also induced them to demand complete lists of the staff's names and functions. Missing names caused the Germans to send two biologists, Dr. Wolmann and his wife, as well as three other lab assistants, to a concentration camp. The institute was not a location for German entrenchment even during the battles for Paris's liberation because of the honor and respect it commanded, as well as out of fear that involving it in any type of conflict might "free the ghosts of long defeated diseases".
|
https://en.wikipedia.org/wiki?curid=866902
| 646,575 |
1,580,257 |
American Science and Engineering, Inc. (AS&E) is an American manufacturer of advanced X-ray equipment and related technologies, founded in 1958 by Martin Annis, PhD. Annis asked George W. Clark to join him in starting the company. Their primary work in the beginning was as a developer for NASA. Annis brought on Bruno Rossi, PhD, of MIT as Chairman of the Board of Directors to help guide their efforts. Rossi had earlier confirmed the existence of cosmic rays, and postulated that black holes would emit tremendous bursts of cosmic radiation as they swallowed celestial objects. At the urging of Rossi, Annis brought on board Riccardo Giacconi, from Italy, to work on the effort to develop a detector. As a consultant to American Science and Engineering, Inc., Rossi initiated the rocket experiments that discovered the first extra-solar source of X-rays, Scorpius X-1. Despite Rossi's pivotal discoveries and work in this area, in 2002 Riccardo Giacconi alone won the Nobel Prize in Physics for this discovery. The AS&E team made possible the Einstein Observatory (equipped with the first full imaging X-ray telescope on board a satellite, launched in 1978). Throughout his tenure as president of AS&E (from its inception in 1958 until he left in 1993), Annis was a leading inventor/scientist for the company, inventing the backscatter technology that enabled the detection of plastic explosives. He also invented the body scanner, originally developed as a machine to detect contraband in or on people in prisons, but later adopted for use in airports. Body scanners are now standard in pre-boarding security screenings at airports around the world. AS&E also produced the first 4th-generation CT scanner for commercial use in 1976.
|
https://en.wikipedia.org/wiki?curid=29027538
| 1,579,367 |
708,216 |
The distinctive forms of the flakes were originally thought to indicate a wide-ranging Levallois culture resulting from the expansion of archaic "Homo sapiens" out of Africa. However, the wide geographical and temporal spread of the technique has rendered this interpretation obsolete. Adler "et al." further argue that Levallois technology evolved independently in different populations and thus cannot be used as a reliable indicator of Paleolithic human population change and expansion. Aside from technique, the overarching commonality in Levallois complexes is the attention given to maximizing core efficiency. Lycett and von Cramon-Taubadel (2013) measured variability in shape and geometric relationships between cores over multiple regions, with an outcome that suggests a tendency for knappers to choose planforms with a specific surface morphology. In other words, they conclude that Levallois knappers cared less about the overall outline or shape of their core and more about the striking surface, evidence of complex pre-planning and recognition of an "ideal form" of Levallois core. A recent article by Lycett and Eren (2013) statistically demonstrates the efficiency of the Levallois technique, which at times has been called into question. Lycett and Eren created 75 Levallois flakes from 25 Texas chert nodules. They counted the 3957 flakes and separated them into four stages in order to show efficiency, which grew with each successive stage. Based on a comparative study of 567 debitage flakes and 75 preferential Levallois flakes, Lycett and Eren found that thickness is more evenly distributed and less variable across preferential Levallois flakes, which indicates that thickness is an important factor for efficiency and retouch potential. The experiment also shows that the Levallois core represents an economically optimal strategy of raw material (lithic) usage, meaning it can generate the longest cutting edge per unit weight of raw material.
This result also implies that the mobility of prehistoric people was higher when applying Levallois technology: prehistoric people could explore more area with Levallois cores, which yield a longer cutting edge than other flake-making techniques from the same amount of core material, without needing to worry about a lack of raw material for making tools.
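The "cutting edge per unit weight" efficiency measure described above can be illustrated with a toy calculation; the edge lengths and masses below are invented for illustration, not Lycett and Eren's measurements.

```python
# Toy illustration of cutting edge generated per unit weight of raw
# material; all numbers are invented, not measured archaeological data.

def edge_per_mass(flakes):
    """Total cutting-edge length (mm) per gram of raw material consumed."""
    total_edge = sum(edge for edge, mass in flakes)
    total_mass = sum(mass for edge, mass in flakes)
    return total_edge / total_mass

# (edge length in mm, mass in g) for flakes struck from one core
levallois_flakes = [(120.0, 40.0), (110.0, 35.0)]
ad_hoc_flakes = [(70.0, 45.0), (65.0, 50.0)]

print(edge_per_mass(levallois_flakes))  # ~3.07 mm/g
print(edge_per_mass(ad_hoc_flakes))     # ~1.42 mm/g
```

On these hypothetical numbers, the same mass of stone yields roughly twice the cutting edge under the more economical strategy, which is the sense in which a core-reduction method can be called an optimal use of raw material.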
|
https://en.wikipedia.org/wiki?curid=1323044
| 707,847 |
1,813,307 |
In addition to the MCI index defined above, there are two other variations of the MCI: the QMCI (Quantitative Macroinvertebrate Community Index) and the SQMCI (Semi-Quantitative Macroinvertebrate Community Index). Both the MCI and QMCI are widely used in countries like New Zealand. The combination of widespread use and good performance of the MCI and the QMCI in detecting water quality in aquatic ecosystems has sparked interest in further refinement of the methods in New Zealand. The QMCI, just like the MCI, was initially designed to evaluate organic enrichment in aquatic ecosystems. The third index, the SQMCI, was created to reduce the sampling and processing effort required for the QMCI. The SQMCI responds to changes in community dominance in a similar manner to the QMCI, but requires fewer samples to achieve the same precision. The SQMCI gives an appraisal comparable to the QMCI with under 40% of the effort in circumstances where macroinvertebrate densities are not required. This reduces costs and also improves the logistical feasibility of biomonitoring projects. Both the QMCI and SQMCI are similar to the MCI in that taxa are graded on a scale from 1 (extremely tolerant) to 10 (highly intolerant). However, they differ in that the MCI is calculated using presence-absence data, whereas the QMCI uses quantitative or percentage data. Having qualitative, quantitative, and semi-quantitative versions of the same index has raised the question of whether this is a good thing. All three indices have the same purpose, measuring the quality of an aquatic ecosystem, yet there are no clear recommendations about when each one is most appropriate. In a study of 88 rivers, Scarsbrook et al. (2000) concluded that the MCI is more useful than the QMCI for recognizing changes in stream water quality over time.
Having three forms of a similar index may lead to differing conclusions, and also opens the way for selective use of a particular index to support a position taken by a practitioner. In August 2019, the Ministry for the Environment released a draft National Policy Statement for Freshwater Management, and a report from the Scientific and Technical Advisory Group recommended including three different measures: MCI, QMCI and Average Score Per Metric (ASPM).
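The difference between the presence-absence MCI and the abundance-weighted QMCI can be sketched as follows; the formulas used here are the commonly cited Stark-style definitions, and the taxon tolerance scores and abundances are invented example data, not figures from this text.

```python
# Hedged sketch: presence-absence MCI vs abundance-weighted QMCI.
# Formulas follow the commonly cited Stark-style definitions; the
# tolerance scores (1 = extremely tolerant, 10 = highly intolerant)
# and abundances below are invented for illustration.

def mci(scores):
    """MCI: 20 times the mean tolerance score of the taxa present."""
    return 20.0 * sum(scores) / len(scores)

def qmci(scores, abundances):
    """QMCI: abundance-weighted mean tolerance score."""
    total = sum(abundances)
    return sum(s * n for s, n in zip(scores, abundances)) / total

scores = [8, 9, 5, 3]            # tolerance scores of four taxa found
abundances = [10, 5, 50, 100]    # individuals counted per taxon

print(mci(scores))               # presence-absence: 125.0
print(qmci(scores, abundances))  # abundance-weighted: ~4.09
```

With these invented numbers the two indices diverge sharply because tolerant taxa dominate the counts, which is exactly the community-dominance signal the quantitative variants are designed to capture and the presence-absence MCI cannot see.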
|
https://en.wikipedia.org/wiki?curid=51406305
| 1,812,273 |
185,262 |
There have been many studies linking race to increased proximity to particulate matter, and thus susceptibility to the adverse health effects that accompany long-term exposure. In a study analyzing the effects of air pollution on racially segregated neighborhoods in the United States, results show that "the proportions of Black residents in a tract was linked to higher asthma rates". Many scholars link this disproportionality to racial housing segregation and its attendant inequalities in "toxic exposures". This reality is made worse by the finding that "health care occurs in the context of broader historic and contemporary social and economic inequality and persistent racial and ethnic discrimination in many sectors of American life". Residential proximity to particulate-emitting facilities increases exposure to PM 2.5, which is linked to increased morbidity and mortality rates. Multiple studies confirm that the burden of PM emissions is higher among non-White and poverty-ridden populations, though some say that income does not drive these differences. This correlation between race and housing-related health repercussions stems from a longstanding environmental justice problem linked to the practice of historic redlining. An example of these factors in context is an area of Southeastern Louisiana, colloquially dubbed 'Cancer Alley' for its high concentration of cancer-related deaths due to neighboring chemical plants. Cancer Alley, a majority-African American community in which the neighborhood nearest the plants is 90% Black, reinforces the scientific finding that Black populations are located disproportionately closer to areas of high PM output than White populations. A 2020 article relates the long-term health effects of living in high PM concentrations to increased risk, spread, and mortality rates from SARS-CoV-2, or COVID-19, and faults a history of racism for this outcome.
|
https://en.wikipedia.org/wiki?curid=30876688
| 185,165 |
774,000 |
Melioidosis is found in all age groups. For Australia and Thailand, the median age of infection is 50 years; 5 to 10% of patients are under 15 years. The single most important risk factor for developing melioidosis is diabetes mellitus, followed by hazardous alcohol use, chronic kidney disease, and chronic lung disease. More than 50% of people with melioidosis have diabetes; diabetics have a 12-fold increased risk of contracting melioidosis. Diabetes decreases the ability of macrophages to fight the bacteria and reduces T helper cell production. Excessive release of tumor necrosis factor alpha and interleukin 12 by mononuclear cells increases the risk of septic shock. Other risk factors include thalassaemia, occupational exposure (e.g. rice paddy farmers), recreational exposure to soil or water, being male, age greater than 45 years, and prolonged steroid use/immunosuppression. However, 8% of children and 20% of adults with melioidosis have no risk factors. HIV infection does not appear to predispose to melioidosis, although several other co-infections have been reported. Those who are well may also be infected with "B. pseudomallei". For example, 25% of children staying in endemic areas started producing antibodies against "B. pseudomallei" between 6 months and 4 years of residence although they did not experience any melioidosis symptoms, suggesting they were exposed to it over this time. This means that many people without symptoms will test positive in serology tests in endemic areas. In Thailand, the seropositivity rate exceeds 50%, while in Australia the seropositivity rate is only 5%. The disease is clearly associated with increased rainfall, with the number of cases rising following increased precipitation. 
Severe rainfall increases the concentration of the bacteria in the topsoil, thus increasing the chance of the bacteria being transmitted through the air. A recent CDC advisory indicated that the detection of the organism in the environment in Mississippi, following the occurrence of two indigenous cases of melioidosis, confirms that parts of the southern USA should now be regarded as melioidosis-endemic.
|
https://en.wikipedia.org/wiki?curid=471444
| 773,584 |
1,968,282 |
During 1977–84 Clayton resided part-time each year at the Max Planck Institute for Nuclear Physics in Heidelberg as a Humboldt Prize awardee, sponsored by Till Kirsten. Annual academic leaves from Rice University facilitated this. There he joined the Meteoritical Society, seeking an audience for his newly published theoretical picture of a new type of isotopic astronomy based on the relative abundances of the isotopes of the chemical elements within interstellar dust grains. He hoped that such interstellar grains could be discovered within meteorites; and he also advanced a related theory that he called "cosmic chemical memory", by which the effects of stardust can be measured in meteoritic minerals even if stardust itself no longer exists there. Clayton designated the crystalline component of interstellar dust that had condensed thermally from hot and cooling stellar gases by a new scientific name, "stardust". Stardust became an important component of cosmic dust. Clayton has described the stiff resistance encountered from meteoriticist referees of his early papers advancing this new theory. He nonetheless established that research program at Rice University, where he continued guiding graduate-student research on that topic. He and student Kurt Liffman computed a pathbreaking history of survival rates of refractory stardust in the interstellar medium after its ejection from stars; and with student Mark D. Leising he computed a propagation model of positron annihilation lines within nova explosions and of the angular distribution of gamma-ray lines from radioactive ²⁶Al in the galaxy. Following the laboratory discovery in 1987 of meteoritic "stardust" bearing unequivocal isotopic markers of stars, Clayton was awarded the 1991 Leonard Medal, the highest honor of the Meteoritical Society. Feeling vindicated, Clayton exulted in "Nature" that "the human race holds solid samples of supernovae in its hands and studies them in terrestrial laboratories".
|
https://en.wikipedia.org/wiki?curid=36061850
| 1,967,152 |
1,183,789 |
According to Robert W. Scharstein from the Department of Electrical Engineering at the University of Alabama, the mathematics used in the third edition is just enough to convey the subject, and the problems are valuable teaching tools that do not involve the "plug and chug disease." Although students of electrical engineering are not expected to encounter complicated boundary-value problems in their careers, this book is useful to them as well because of its emphasis on conceptual rather than mathematical issues. He argued that with this book it is possible to skip the more mathematically involved sections in favor of the more conceptually interesting topics, such as antennas. Moreover, the tone is clear and entertaining. Using this book "rejuvenated" his enthusiasm for teaching the subject.
Colin Inglefield, an associate professor of physics at Weber State University (Utah), commented that the third edition is notable for its informal and conversational style that may appeal to a large class of students. The ordering of its chapters and its contents are fairly standard and are similar to texts at the same level. The first chapter offers a valuable review of vector calculus, which is essential for understanding this subject. While most other authors, including those of texts aimed at a more advanced audience, denote the distance from the source point to the field point by formula_1, Griffiths uses a script formula_2 (see figure). Unlike some comparable books, the level of mathematical sophistication is not particularly high. For example, Green's functions are not mentioned anywhere. Instead, physical intuition and conceptual understanding are emphasized. In fact, care is taken to address common misconceptions and pitfalls. It contains no computer exercises. Nevertheless, it is perfectly adequate for undergraduate instruction in physics. As of June 2005, Inglefield had taught three semesters using this book.
|
https://en.wikipedia.org/wiki?curid=58203281
| 1,183,163 |
1,900,107 |
During his career, Seyfarth's work appeared in magazines and journals and in the advertisements of various architectural supply firms. The extent to which this was done is not entirely known, but articles by Eleanor Jewett (1892–1968), art critic for the "Chicago Tribune" ("Cape Cod Architecture seen in B.L.T.'s Home", discussing the Taylor house at 92 Dell Place in Glencoe, Illinois) and Herbert Croly of the "Architectural Record" ("The Local Feeling in Western Country Houses", October 1914, which discusses the Kozminski and McBride houses in Highland Park at 521 Sheridan Road and 2130 Linden, respectively) survive to give us some idea of how Seyfarth's work was received during the time he was practicing. (Croly would later go on to become the founding editor of "The New Republic" magazine.) Additionally, photographs of houses he designed appeared in "The Western Architect" magazine a number of times in the 1920s. Also surviving are copies of advertisements from the Arkansas Soft Pine Bureau (see image, right), the California Redwood Association (again with the McBride House), the Pacific Lumber Company (featuring the Churchill house in Highland Park at 1375 Sheridan Road), The Creo-Dipt Company (see image, left), the White Pine Bureau, the American Face Brick Association and the Stewart Iron Works Company of Cincinnati (with a picture of the H. C. Dickinson house at 7150 S. Yale in Chicago). In 1908, his Prairie-style house for Dickinson was published in "House Beautiful" magazine. In 1918, the Arkansas Soft Pine Bureau released a 32-page portfolio featuring houses built from Seyfarth's designs. Included were photographs, floor plans, bills of material, an estimate of the costs, and a brief description of important features. Entitled "The Home You Longed For", the booklet was announced in "Building Age Magazine" under the heading "New Catalogs of Interest to the Trade". Although generally unavailable today, it must have enjoyed a wide circulation in its time. 
It was acquired the following year for the collection of the Carnegie Library of Pittsburgh, and referred to that same year in an illustrated article that appeared in "Printers' Ink Monthly" entitled "The Loose-Leaf Portfolio - an Aid to Reader Interest", where it was described as "...a very attractive loose-leaf portfolio" and as a successful example of its type of publication.
|
https://en.wikipedia.org/wiki?curid=24940492
| 1,899,019 |
1,285,682 |
Seasonal hunger was common in pre-colonial and early colonial times, and gave rise to several coping strategies such as growing secondary crops like millet or sweet potatoes in case the maize crop failed, gathering wild food or relying on support from family or friends. In a purely peasant economy, farmers grow food primarily for their families' needs. They normally have only small surpluses to store or for sale and little money to buy food in a time of shortage, even if it was available in any market. There were no significant markets, as any surplus grain not stored would be bartered for livestock or passed to dependents. If drought coincided with warfare, famine could be catastrophic, as in the great 1861–63 famine in southern Malawi, when 90% of the population of some villages died of starvation or disease, or through war. However, seasonal shortages occurred in most years, and droughts every six years on average. The imposition of colonial rule itself caused local food shortages, sometimes amounting to famine, where villages were burned and cattle killed. There were several significant famines in the first half of the 20th century, including one in 1903 in the lower valley of the Shire River, an area which frequently experienced shortages. Low rainfall in 1900–01, 1918, 1920–21 and 1922 caused severe drought in the south and centre of the country, while in 1926 crops were destroyed by flooding. There was also distress in the north near Kasungu in 1924–25 and around Mzimba in 1938, and the shores of Lake Malawi suffered food shortages almost annually in the 1930s. However, for the first 50 years of colonial rule, much of the country fared better than the drier areas of southern Tanganyika, eastern Northern Rhodesia or Mozambique, where famine was endemic. 
The colonial authorities also provided some famine relief by moving maize from districts with surpluses to those with shortages and making free issues to children, the old and destitute, but they were reluctant to issue free relief to the able-bodied. After the great famine ended in 1863, despite regular seasonal hunger and high levels of chronic malnutrition, as well as acute episodes of food shortage and famine, there was no "famine that kills" until 1949.
|
https://en.wikipedia.org/wiki?curid=14543243
| 1,284,982 |
915,083 |
In 2017, researchers at the Children's Hospital of Philadelphia were able to further develop the extra-uterine system. The study used fetal lambs, which were placed in a plastic bag filled with artificial amniotic fluid. The system consists of three main components: a pumpless arteriovenous circuit, a closed sterile fluid environment and umbilical vascular access. In the pumpless arteriovenous circuit, blood flow is driven exclusively by the fetal heart, combined with a very low resistance oxygenator, to most closely mimic the normal fetal/placental circulation. The closed fluid environment is important to ensure sterility. Scientists developed a technique for umbilical cord vessel cannulation that maintains a length of native umbilical cord (5–10 cm) between the cannula tips and the abdominal wall, to minimize decannulation events and the risk of mechanical obstruction. The umbilical cords of the lambs are attached to a machine outside of the bag designed to act like a placenta: it provides oxygen and nutrients and also removes waste. The researchers kept the machine "in a dark, warm room where researchers can play the sounds of the mother's heart for the lamb fetus." The system succeeded in helping the premature lamb fetuses develop normally for a month. Indeed, scientists have run 8 lambs with maintenance of stable levels of circuit flow equivalent to the normal flow to the placenta. Specifically, they have run 5 fetuses from 105 to 108 days of gestation for 25–28 days, and 3 fetuses from 115 to 120 days of gestation for 20–28 days. The longest runs were terminated at 28 days due to animal protocol limitations rather than any instability, suggesting that support of these early gestational animals could be maintained beyond 4 weeks. Alan Flake, a fetal surgeon at the Children's Hospital of Philadelphia, hopes to move testing to premature human fetuses, but this could take anywhere from three to five years to become a reality. 
Flake, who led the study, calls the possibility of their technology recreating a full pregnancy a "pipe dream at this point" and does not personally intend to create the technology to do so.
|
https://en.wikipedia.org/wiki?curid=1473331
| 914,602 |
282,662 |
The John Calhoun Baker University Center, which opened in January 2007, is named after the 14th president of the university. The facility replaced the original Baker Center located on East Union Street across from College Green and serves as the hub of campus activity. Electronic maps and virtual university e-tours, available at center information desks and online, direct visitors across campus. The building features a modified Federal architecture and large windows that admit natural light and afford expansive views of the campus. In contrast to the exterior's red brick and white columns, the interior has a more contemporary style with high domed ceilings. Terrazzo mosaics of aspects of the earth's globe are embedded in the atrium of the main entrance to the building. Baker Center contains a large food court called West 82; a pub bistro called Latitude 39; a Grand Ballroom; the Honors Collegium; the Wall of Presidents; the Bobcat Student Lounge; a shop called Bobcat Depot that sells apparel, computers, and accessories; a theater seating 400; study areas; computer labs; administrative offices; and numerous conference rooms. The Front Room, a large coffee house named after a former popular university rathskeller, features a stage, artwork and a community fireplace. It serves Starbucks products and university bakery items and is housed on the fourth floor, which opens onto its own outside terrace as well as onto the intersection of Park Place and Court Streets, making it a hot spot for students between classes. Other amenities include a United States Post Office and the Trisolini Art Gallery, named after a prominent fine arts faculty member.
|
https://en.wikipedia.org/wiki?curid=483329
| 282,509 |
1,660,868 |
Possible causes include changes in palaeogeography or tectonic activity, a modified nutrient supply, or global cooling. The dispersed positions of the continents, high level of tectonic/volcanic activity, warm climate, and high CO₂ levels would have created a large, nutrient-rich ecospace, favoring diversification. There seems to be an association between orogeny and the evolutionary radiation, with the Taconic orogeny in particular being singled out as a driver of the GOBE by enabling greater erosion of nutrients such as iron and phosphorus and their delivery to the oceans around Laurentia. In addition, the changing geography led to a more diverse landscape, with more different and isolated environments; this no doubt facilitated the emergence of bioprovinciality, and speciation by isolation of populations. On the other hand, global cooling has also been offered as a cause of the radiation, with an uptick in fossil diversity correlating with the increasing abundance of cool-water carbonates over the course of this time interval. Another alternative is that the breakup of an asteroid led to the Earth being consistently pummelled by meteorites, although the proposed Ordovician meteor event occurred 467.5±0.28 million years ago. Another effect of a collision between two asteroids, possibly beyond the orbit of Mars, is a reduction in sunlight reaching the Earth's surface due to the vast dust clouds created. Evidence for this geological event comes from the relative abundance of the isotope helium-3, found in ocean sediments laid down at the time of the biodiversification event. The most likely cause of the production of high levels of helium-3 is the bombardment of lithium by cosmic rays, something which could only have happened to material which travelled through space. The volcanic activity that created the Flat Landing Brook Formation in New Brunswick, Canada may have caused rapid climatic cooling and biodiversification.
|
https://en.wikipedia.org/wiki?curid=18288753
| 1,659,934 |
653,995 |
Theory of mind requires the collaboration of functionally related regions of the brain to form the distinction between self and other mental states and to create a comprehensive understanding of those mental states so that we may recognize, understand, and predict behavior. In general, the theory of mind process is mediated by the dopaminergic-serotonergic system, which involves the TPJ as well as other associative regions necessary for mentalizing. Recent studies suggest that both the left TPJ, working in conjunction with the frontal cortex, and the right TPJ are involved in the representation of mental states; furthermore, they suggest that the TPJ is particularly active in making the distinction between the mental states of self and others. A study in "Nature Neuroscience" from 2004 describes how the TPJ is involved in processing socially relevant cues, including gaze direction and goal-directed action, and also explains that results from the study show that lesions to this area of the brain result in an impaired ability to detect another person's beliefs. Moreover, studies have reported an increase in activity in the TPJ when patients are absorbing information through reading or images regarding other people's beliefs but not while observing information about physical control stimuli. Some studies, however, have shown that the TPJ, along with the cingulate cortex, is more specifically involved with attributing beliefs, whereas the process of mentalizing more generally is associated more with the medial prefrontal cortex. Another study in "Current Biology" from 2012 identifies the importance of the TPJ in both low-level (such as simple discrimination) and high-level (such as the ability to empathize) sociocognitive operations. In July 2011, a review from "Neuropsychologia" presented a model of the mentalizing network that established that mental states are first detected in the TPJ. 
The TPJ is composed of two discrete anatomical regions, the inferior parietal lobule (IPL) and the caudal parts of the superior temporal sulcus (pSTS), and both are active in the process of distinction between mental states of different individuals; thus, it is probable that this detection is the outcome of the combination and coordination of these two parts. Additionally, the right TPJ is involved in the ventral attention stream and contributes to the ability to focus attention on a particular stimulus or objective. It has also been observed that the interaction and communication between the dorsal and ventral streams involves the TPJ.
|
https://en.wikipedia.org/wiki?curid=14158261
| 653,651 |
262,799 |
The science of conserving fossil remains was in its infancy, and new techniques had to be improvised to deal with what soon became known as "pyrite disease". Crystalline pyrite in the bones was being oxidized to iron sulphate, accompanied by an increase in volume that caused the remains to crack and crumble. When in the ground, the bones were isolated by anoxic moist clay that prevented this from happening, but when removed into the drier open air, the natural chemical conversion began to occur. To limit this effect, De Pauw immediately, in the mine-gallery, re-covered the dug-out fossils with wet clay, sealing them with paper and plaster reinforced by iron rings, forming in total about six hundred transportable blocks with a combined weight of a hundred and thirty tons. In Brussels after opening the plaster he impregnated the bones with boiling gelatine mixed with oil of cloves as a preservative. Removing most of the visible pyrite he then hardened them with hide glue, finishing with a final layer of tin foil. Damage was repaired with papier-mâché. This treatment had the unintended effect of sealing in moisture and extending the period of damage. In 1932 museum director Victor van Straelen decided that the specimens had to be completely restored again to safeguard their preservation. From December 1935 to August 1936 the staff at the museum in Brussels treated the problem with a combination of alcohol, arsenic, and 390 kilograms of shellac. This combination was intended to simultaneously penetrate the fossils (with alcohol), prevent the development of mold (with arsenic), and harden them (with shellac). The fossils entered a third round of conservation from 2003 until May 2007, when the shellac, hide glue and gelatine were removed and impregnated with polyvinyl acetate and cyanoacrylate and epoxy glues. 
Modern treatments of this problem typically involve either monitoring the humidity of fossil storage, or, for fresh specimens, preparing a special coating of polyethylene glycol that is then heated in a vacuum pump, so that moisture is immediately removed and pore spaces are infiltrated with polyethylene glycol to seal and strengthen the fossil.
|
https://en.wikipedia.org/wiki?curid=228798
| 262,660 |
439,745 |
Linux on IBM Z gives the flexibility of running Linux with the advantages of fault-tolerant mainframe hardware capable of over 90,000 I/O operations per second and with a mean time between failures (MTBF) measured in decades. Using virtualization, numerous smaller servers can be combined onto one mainframe, gaining some benefits of centralization and cost reduction, while still allowing specialized servers. Instead of paravirtualization, IBM mainframes use full virtualization, which permits workload density far greater than paravirtualization does. Combining full virtualization of the hardware with lightweight virtual machine containers that run Linux in isolation (somewhat similar in concept to Docker) results in a platform that supports more virtual servers than any other in a single footprint, which can also lower operating costs. Additional savings can be seen from the reduced need for floor space, power, cooling, networking hardware, and the other infrastructure needed to support a data center. IBM mainframes allow transparent use of redundant processor execution steps and integrity checking, which is important for critical applications in certain industries such as banking. Mainframes typically allow hot-swapping of hardware, such as processors and memory. IBM Z provides fault tolerance for all key components, including processors, memory, I/O interconnect, power supply, channel paths, network cards, and others. Through internal monitoring, possible problems are detected and problem components are designed to be switched over without failing even a single transaction. In the rare event of failure, firmware will automatically enable a spare component, disable the failing component, and notify IBM to dispatch a service representative. This is transparent to the operating system, allowing routine repairs to be performed without shutting down the system. 
Many industries continue to rely on mainframes where they are considered to be the best option in terms of reliability, security, or cost.
|
https://en.wikipedia.org/wiki?curid=33589774
| 439,531 |
1,329,555 |
Presently, NEIST is a full-fledged multidisciplinary research institution with research areas including Advanced Computation and Data Science, Medicinal Chemistry, Natural Products Chemistry, Synthetic Organic Chemistry, Biotechnology, Infectious disease, Communicable and Non-communicable disease, Medicinal, Aromatic and Economic plants, Geoscience, Petroleum and Natural Gas, Applied Civil Engineering, Chemical Engineering, General Engineering, Cellulose, Membrane technologies, Pulp and Paper, Material Science, Coal, etc. Over the years, the laboratory has produced more than 117 technologies in the areas of Agrotechnology and Biological and Oil Field Chemicals. In the last couple of decades, the institute has also produced more than 200 Ph.D.s to meet the skilled human resource needs of this region. Many Ph.D. graduates of this institute are doing well in industry and academia in India and abroad. The institute also supports economically disadvantaged bright students through a Mentorship Program. Currently the institution is equipped with state-of-the-art instruments and infrastructure to carry out research in frontier areas of science and technology. A Common Facility Center (CFC) under the STINER (Science and Technological Intervention in North East India) project, funded by the Ministry of Development of North Eastern Region (MDoNER), Govt of India, has been established at NEIST. The main goal of the project is to bring all the relevant proven technologies to the people of the North Eastern Region (NER), particularly the farmer and artisan communities, so that the quality of their work can be boosted through science and technology interventions. During the unprecedented situation of COVID-19 across the globe, the institute organized a unique Summer Research Training Program 2020 (SRTP) in online mode to provide training to over 16,000 participants. The training program for Drug Discovery Hackathon 2020, an initiative of MHRD's Innovation Cell, was conducted by the institute. 
A COVID-19 research laboratory has also been set up in the institute premises to facilitate COVID-19 testing facility in the entire region. Under the CSIR-AROMA mission, the institute has planned to set up "Multilocational Trial & Regional Research Experimental Farm" across the North East.
|
https://en.wikipedia.org/wiki?curid=17938085
| 1,328,827 |
82,657 |
The Navy committed to the $15 billion (2003) program in advance of rigorous analysis or clearly defined purpose, appearance, or survivability. Proponents typically pointed to its speed, asymmetric littoral threats, and impact on the U.S. shipbuilding industry. The LCS suffered from requirements creep, adding more missions and equipment, potentially rendering it too complex and expensive to use. When it was decided the ship would not be expendable, the original concept of a small, cheap, simple coastal warship became bigger, more expensive, and more complicated, with a smaller crew due to automation. The task force assigned six different missions which had previously been performed by individual ships: submarine and mine hunting; combating small boats; intelligence gathering; transporting special forces; and counter-drug and piracy patrols. Each ship would be big enough to sail across the Pacific alone, embark a helicopter, have a minimum 40-knot top speed, and cost $220 million. The Navy was only willing to build one type of ship; the task force, realizing that it was virtually impossible for one vessel to fill all roles, advocated a large hull to cover the mission range through modularity, organic combat power, and unmanned systems. Empty space was left for weapon and sensor mission modules costing $150 million. When the first production contracts were awarded in 2004, no mission module worked outside of a laboratory. Fast, cheap construction was emphasized, with technology expected to solve the remaining problems.
|
https://en.wikipedia.org/wiki?curid=460005
| 82,623 |
503,103 |
Once the idea was initially accepted, a number of people went to work. Human factors specialist Frances Mount began to develop the rationale and operational scenarios for the "Cupola", and got considerable support from Chief Astronaut John Young and Shuttle Commander Gordon Fullerton. Charles Wheelwright, who had defined the specifications for every window on every prior United States crewed spacecraft, began to define the design specifications of the "Cupola" windows. Laurie Weaver, who had just started with NASA as a student intern, began to work on a series of different configurations for the "Cupola". She started with Kitmacher's idea, based on the Shuttle Aft Flight Deck, in this case two Aft Flight Decks mounted back to back, placed atop a short cylinder. An inexpensive mock-up made of PVC tubes was built and tested underwater, where critical dimensions could be measured to ensure that two crew members in zero-g would have adequate access. Then she built a series of small cardboard models, looking at a variety of different alternative shapes. The different configurations and their positive and negative attributes were presented at a series of Crew Station Reviews over the next year in which participants rated each. The "Cupola" that evolved was octagonal in shape, with eight similar windows around the periphery, four quadrant windows overhead, and mounted on a cylinder. The module was designed to fully contain at least two crewmembers "floating" side by side in zero-g neutral body posture. About this time, Kitmacher and designer Jay Cory applied the term "Cupola" for the first time. Kitmacher wrote the requirements and the name into the Man-Systems Architectural Control Document and into the requests for proposals for Work Package 1 at MSFC and Work Package 2 at JSC. Later, Kitmacher went on to lead the Man-Systems group, leading the first lunar outpost and moonbase studies and the "Cupola" reappeared on several of his rover and module designs.
|
https://en.wikipedia.org/wiki?curid=739132
| 502,845 |
1,815,033 |
Distributed Operations is a form of maneuver warfare in which small, highly capable units spread across a large area of operations create an advantage over an adversary through the deliberate use of separation and coordinated, independent tactical actions. DO units will use close combat or supporting arms to disrupt the enemy's access to key terrain and avenues of approach. Positioning numerous smaller ships over a vast geographic area and swiftly aggregating them would better support counterinsurgency operations, allowing a fast response to threats while maintaining the ability to overmatch adversaries through a well-integrated and enabled network with highly precise and coordinated fires. A February 2021 Marine Corps tentative manual on Expeditionary Advanced Base Operations (EABO) states that littoral maneuver will rely heavily on surface platforms such as the light amphibious warship (LAW) and a range of surface connectors, as well as aviation assets. The LAW is envisioned as the principal littoral maneuver vessel of the littoral force. A November 9, 2020, press report stated that, as part of its LAW industry studies, the Navy had received nine LAW concept designs from 16 design firms and shipyards, some of which have paired into teams. The report quoted a Navy official as stating that the following firms were participating in the industry studies: Austal USA, BMT Designers, Bollinger Shipyards, Crescere Marine Engineering, Damen, Hyak Marine, Independent Maritime Assessment Associates, Nichols Brothers Boat Builders, Sea Transport, Serco, St. John Shipbuilding, Swiftships, Technology Associates Inc., Thoma-Sea, VT Halter Marine and Fincantieri. Light Amphibious Warships (LAWs) would be instrumental to these operations, with LAWs embarking, transporting, landing, and subsequently reembarking these small Marine Corps units. 
This type of warfare will be dependent on well trained and professional small unit leaders, focused and energetic training of small units and more robust communications and tactical mobility assets for those smaller units. A greater focus will also be placed on language and cultural training.
|
https://en.wikipedia.org/wiki?curid=5757585
| 1,813,999 |
1,582,081 |
In Fourth National Climate Assessment (NCA4) Volume 1, released in October 2017, entitled "Climate Science Special Report" (CSSR), researchers reported that "it is extremely likely that human activities, especially emissions of greenhouse gases, are the dominant cause of the observed warming since the mid-20th century. For the warming over the last century, there is no convincing alternative explanation supported by the extent of the observational evidence." A 2018 CRS cited the October 2017 CSSR: "Detection and attribution studies, climate models, observations, paleoclimate data, and physical understanding lead to high confidence (extremely likely) that more than half of the observed global mean warming since 1951 was caused by humans, and high confidence that internal climate variability played only a minor role (and possibly even a negative contribution) in the observed warming since 1951. The key message and supporting text summarizes extensive evidence documented in the peer-reviewed detection and attribution literature, including in the IPCC Fifth Assessment Report." Volume 2 entitled "Impacts, Risks, and Adaptation in the United States" was released on November 23, 2018. According to Volume II, "Without substantial and sustained global mitigation and regional adaptation efforts, climate change is expected to cause growing losses to American infrastructure and property and impede the rate of economic growth over this century." The National Oceanic and Atmospheric Administration (NOAA) was "administrative lead agency" in the preparation of the Fourth National Climate Assessment. According to NOAA, "human health and safety" and American "quality of life" is "increasingly vulnerable to the impacts of climate change". 
The USGCRP team that produced the report included thirteen federal agencies— NOAA, the DOA, DOC, DOD, DOE, HHS, DOI, DOS, DOT, EPA, NASA, NSF, Smithsonian Institution, and the USAID—with the assistance of "1,000 people, including 300 leading scientists, roughly half from outside the government."
|
https://en.wikipedia.org/wiki?curid=352597
| 1,581,191 |
1,926,371 |
To survive and infect other hosts, "E. invadens" must form a carbohydrate-rich cyst. The chemical makeup and formation of the cyst are of paramount importance to scientists around the world because detailing them could lead to treatment development. Studies investigating cyst wall composition have shown that the wall specifically contains chitin, chitosan fibrils, and chitin-binding proteins. As opposed to the multilayered walls of plants and fungi, the "Entamoeba" cyst wall is homogeneous, containing only one layer. The combination of these elements confers resistance to extreme environmental conditions such as desiccation, heat, and detergent. Despite the amount of research conducted to date, the formation of the cyst wall during encystation has not yet been clearly defined. Scientists do know that during this process the level of cytoplasmic vesicles is significantly reduced, which is thought to be caused by vesicles fusing with the plasma membrane in order to deposit the cyst wall on the exterior of the cell. These cyst wall properties allow the infective form of the parasite to persist for prolonged periods of time in extreme environments before it is ingested by the next host. Once ingested, the cyst travels through the digestive system unaffected until it reaches the small intestine, where it encounters several triggers that have been found to induce excystation. Such triggers include low glucose, osmotic shock, and a combination of water, bicarbonate, and bile. Eight pathogenic trophozoites emerge from each cyst (Brewer, 2008) and begin to feed on the bacteria naturally found in the reptilian gut, in addition to the mucin cells that make up the mucosal layer of the large intestine. The parasite will also secrete enzymes that continue to destroy the mucosal layer. This degradation recruits more bacteria to the site of invasion, further fueling trophozoite replication. 
In addition to feeding the trophozoites, this excess flow of bacteria can also result in secondary bacterial infections, which may in turn assist in the systemic distribution of the parasite, causing liver or brain abscesses. The increased population of trophozoites, through various cell signaling pathways, initiates trophozoite aggregation, the first step in encystation. Once encystation occurs, the infective cysts are excreted into the environment.
|
https://en.wikipedia.org/wiki?curid=44779596
| 1,925,267 |
484,917 |
Under the supervision of Paul Weiss while earning his Ph.D. at the University of Chicago, Sperry became interested in neuronal specificity and brain circuitry and began questioning the existing concepts about these two topics. He asked the simple question first posed in his Introduction to Psychology class at Oberlin: nature or nurture? He began a series of experiments in an attempt to answer it. Sperry crosswired the motor nerves of rats' legs so that the left nerve controlled the right leg and vice versa. He then placed the rats in a cage whose floor was an electric grid separated into four sections, with each leg of the rat resting on one section. A shock was administered to a specific section of the grid; for example, the section under the rat's left back leg would receive a shock. Every time the left paw was shocked, the rat would lift its right paw, and vice versa. Sperry wanted to know how long it would take the rat to realize it was lifting the wrong paw. After repeated tests, Sperry found that the rats never learned to lift the correct paw, leading him to the conclusion that some things are simply hardwired and cannot be relearned. In Sperry's words, "no adaptive functioning of the nervous system took place." During Sperry's postdoctoral years with Karl Lashley at Harvard and at the Yerkes Laboratories of Primate Biology in Orange Park, Florida, he continued the work on neuronal specificity that he had begun as a doctoral student and initiated a new series of studies involving salamanders. The optic nerves were sectioned and the eyes rotated 180 degrees. The question was whether vision would be normal after regeneration or whether the animal would forever view the world as "upside down" and right-left reversed. Should the latter prove to be the case, it would mean that the nerves were somehow "guided" back to their original sites of termination. 
Restoration of normal vision (i.e., "seeing" the world in a "right-side-up" orientation) would mean that the regenerating nerves had terminated in new sites, quite different from the original ones. The animals reacted as though the world was upside down and reversed from right to left. Furthermore, no amount of training could change the response. These studies, which provided strong evidence for nerve guidance by "intricate chemical codes under genetic control" (1963), culminated in Sperry's chemoaffinity hypothesis (1951).
|
https://en.wikipedia.org/wiki?curid=102958
| 484,668 |
2,056,041 |
Even with these limitations, SPI can be an extremely powerful analytical, reconnaissance, and monitoring tool. Sediment-type maps have often been constructed by retrieving grab or core samples followed by days or weeks of laboratory-based processing. After an SPI device is lowered into the sediment and the image recorded, it can be hauled up and lowered repeatedly without fully recovering the device. A vessel 'stitching' an SPI device along a prescribed route in this way can survey an area with unprecedented economy compared to physical sample recovery. There is, of course, a trade-off between sampling data quality and quantity. SPI allows much greater spatial coverage for a given amount of field time, at the cost of the detailed sediment descriptors typically produced from physical cores (half-phi interval texture analysis, carbon content, etc.). Managing this balance is the essence of good SPI use and highlights its strengths. For example, Hewitt et al. (2002), Thrush et al. (1999), and Zajac (1999) call attention to the value of integrating macrofaunal community observations collected at different scales and their application in describing processes occurring at different scales within a heterogeneous benthic landscape. When evaluating landscape-scale questions, it is rarely feasible to sample the total spatial extent simply and comprehensively with dense, equally detailed sampling points. The researcher must compromise between data-collection grain, the dimensions of the actual sampling unit (typically a 0.1 m² grab or similar), and the lag distance between sample units over which results will be interpolated (often tens to hundreds of metres for grab samples). Sediment profile imagery can be an efficient monitoring tool when coupled with more detailed sampling techniques such as macrofaunal core sampling or continuous sediment survey transects (Gowing et al. 1997). 
It offers point data that can be economically collected at sufficient frequency to connect more resource-intensive samples in an ecologically meaningful way. A study can therefore operate at nested spatio-temporal scales with SPI providing overall maps and connectivity while other sampling techniques are used to characterise assemblages and variability within habitat types. This type of integration is necessary for developing our understanding and predictability of soft-sediment processes (Thrush et al. 1999; Noda 2004).
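As a sketch of this nested-scale approach, the following illustration shows how dense, inexpensive SPI observation points could fill the lag distance between sparse, detailed grab-sample stations along a transect. The figures and the simple linear-interpolation scheme are assumptions for demonstration only, not taken from the cited studies:

```python
# Illustrative sketch (hypothetical figures): linear interpolation of a
# sediment parameter between sparse, detailed grab-sample stations,
# with dense SPI image points filling the gaps along a transect.

def interpolate_transect(stations, spi_positions):
    """stations: list of (distance_m, value) from detailed grab samples,
    sorted by distance. Returns interpolated values at SPI positions."""
    values = []
    for x in spi_positions:
        # Clamp positions outside the surveyed range to the end stations.
        if x <= stations[0][0]:
            values.append(stations[0][1])
            continue
        if x >= stations[-1][0]:
            values.append(stations[-1][1])
            continue
        # Find the pair of stations bracketing this SPI position.
        for (x0, y0), (x1, y1) in zip(stations, stations[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                values.append(y0 + t * (y1 - y0))
                break
    return values

# Two grab stations 100 m apart; SPI drops every 25 m between them.
stations = [(0.0, 2.0), (100.0, 6.0)]  # e.g. a hypothetical carbon content (%)
print(interpolate_transect(stations, [0.0, 25.0, 50.0, 75.0, 100.0]))
# → [2.0, 3.0, 4.0, 5.0, 6.0]
```

In practice the interpolated values would be checked against the SPI imagery itself; the point is simply that many low-cost observation points can connect a few resource-intensive ones in an ecologically meaningful way.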
|
https://en.wikipedia.org/wiki?curid=14917968
| 2,054,858 |
1,618,665 |
S-nitrosylation is precisely targeted, reversible, spatiotemporally restricted and necessary for a wide range of cellular responses, including the prototypic example of red blood cell mediated autoregulation of blood flow that is essential for vertebrate life. Although originally thought to involve multiple chemical routes in vivo, accumulating evidence suggests that S-nitrosylation depends on enzymatic activity, entailing three classes of enzymes (S-nitrosylases) that operate in concert to conjugate NO to proteins, drawing an analogy to ubiquitinylation. S-nitrosylation was first described by Stamler et al. and proposed as a general mechanism for control of protein function, including examples of both active and allosteric regulation of proteins by endogenous and exogenous sources of NO. The redox-based chemical mechanisms for S-nitrosylation in biological systems were described concomitantly. Important examples of proteins whose activities were subsequently shown to be regulated by S-nitrosylation include the NMDA-type glutamate receptor in the brain. Aberrant S-nitrosylation following stimulation of the NMDA receptor would come to serve as a prototypic example of the involvement of S-nitrosylation in disease. S-nitrosylation similarly contributes to the physiology and dysfunction of cardiac, airway and skeletal muscle and the immune system, reflecting wide-ranging functions in cells and tissues. It is estimated that ~70% of the proteome is subject to S-nitrosylation, and the majority of those sites are conserved. S-nitrosylation is thus established as ubiquitous in biology, having been demonstrated in all phylogenetic kingdoms, and has been described as the prototypic redox-based signalling mechanism, hypothesized to have evolved on primordial Earth.
|
https://en.wikipedia.org/wiki?curid=31710978
| 1,617,750 |
2,056,071 |
The goal was to obtain the greatest imaging area in the smallest cylindrical volume using a consumer flatbed scanner. Typical flatbed scanners image an area of about 220 x 300 mm (660 cm²), so a system had to be found which could be reconfigured to fit inside a sealed transparent capsule. There are two basic imaging methods in modern flatbed scanners. From the 1980s to the late 1990s the market was dominated by systems that could capture an image from any depth of field. Most such digital imaging devices used a [[charge-coupled device]] (CCD) array. In a CCD, discrete dots of photosensitive material produce a specific charge based on the intensity of light hitting them. A CCD does not detect colour. In this technology, a scene is illuminated, and a narrow band of reflected light from the scene passes through a slit (to eliminate light coming from other directions) and is then concentrated by an array of mirrors (typically folded into a box) onto a prism, typically a few centimetres in length. The prism splits the light into its constituent colours. Small CCD arrays are carefully placed at the points where the primary colours are sharply focused. The separate colour intensities are combined into composite values and recorded by the computer (or scanner electronic assemblies) as a line of pixels. The moving scan head then advances a short distance to gather the next line of the scene. Thus resolution in one axis is determined by the CCD array size and the focusing optics, while resolution in the other axis is determined by the smallest reliable step the scan-head advancing motor can make. The optical assemblies of this type of scanner are fairly robust to vibration, but the traditional light source (a cold cathode tube of balanced colour temperature) is not. It was therefore replaced with an array of solid-state white light emitting diodes (LEDs). Another advantage of this replacement is that the sources could be alternated between white light and ultraviolet (UV) of about 370 nm wavelength. 
This UV light source allowed detection of visibly fluorescing materials (typically tracer minerals or hydrocarbons) by the prototype.
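The two resolution axes described above (CCD array size versus motor step size) can be sketched numerically. All figures here are hypothetical, chosen only to illustrate the relationship; they are not the specifications of the prototype:

```python
# Illustrative sketch (all figures are assumptions, not the prototype's
# specifications): effective resolution of a CCD flatbed scanner along
# its two axes.

def scan_resolution_dpi(ccd_elements, scan_width_mm, step_mm):
    """CCD axis: array elements focused across the scan width.
    Travel axis: one image line per motor step."""
    mm_per_inch = 25.4
    ccd_axis_dpi = ccd_elements / (scan_width_mm / mm_per_inch)
    travel_axis_dpi = mm_per_inch / step_mm
    return ccd_axis_dpi, travel_axis_dpi

# A hypothetical 5,200-element array over a 220 mm scan width,
# with a 1/600-inch motor step:
x_dpi, y_dpi = scan_resolution_dpi(5200, 220.0, 25.4 / 600)
print(round(x_dpi), round(y_dpi))  # → 600 600
```

The sketch makes the text's point concrete: improving one axis requires a denser CCD array and sharper optics, while improving the other requires only a finer motor step.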
|
https://en.wikipedia.org/wiki?curid=14917968
| 2,054,887 |
2,067,466 |
"Percina jenkinsi" has been federally listed as endangered throughout its range with critical habitat on August 5, 1986. A recovery plan was completed on June 20, 1986. At their Knoxville nonprofit, Conservation Fisheries, INC. (CFI) J.R. Shutes and Pat Rakes are trying to keep this rare species alive. The Conasauga River might hold a limited 200 individuals of this species and CFI holds three, the only ones in captivity. The goal is to have seed stock ready to restore the fish to the river, if and when society restores that river to its clean, free-flowing state. The Tennessee Aquarium in Chattanooga, and other private facilities, and state and federal wildlife agencies have efforts under way as well. The Southeastern Fishes Council put together a list they call the Desperate Dozen, "the twelve fish most likely to become extinct soon," and this list includes "Percina jenkinsi." Current management includes preserving the Conasauga river populations and presently used habitat, utilize existing legislation of the Federal Endangered Species Act for water quality regulations, stream alteration regulations, etc., conduct life history research on the species to include reproduction, food habits, age and growth, and mortality factors, determine the number of individuals required to maintain a viable population, and searching for additional populations and habitats suitable for reintroduction efforts. The Forest Service is playing a lead role in conservation efforts in the upper watershed. The Conasauga River Alliance—a partnership of local citizens, businesses, conservation groups, and government agencies—is coordinating conservation activities in the middle section of the watershed. Both are active and assist each other throughout these parts of the watershed. Other partnerships include U.S. 
Fish and Wildlife Service, NC State University, Regional Solid Waste Management Authority, local industries and utility companies, Georgia DNR, private and public landowners, volunteers, and local city and county governments.
|
https://en.wikipedia.org/wiki?curid=12620539
| 2,066,275 |
774,453 |
On the other side of the argument, economists contend that no surplus value was generated from labour activity or from commodity markets in the socialist planned economies; they therefore claim that there was no exploiting class, even if inequalities existed. Since prices were controlled and set below market-clearing levels, there was no element of value added at the point of sale - as occurs in capitalist market economies. Prices were built up from the average cost of inputs, including wages, taxes, interest on stocks and working capital as well as allowances to cover the recoupment of investment and for depreciation, so there was no profit margin in the price charged to customers. Wages did not reflect the purchase price of labour, since labour was not a commodity traded in a market and the employing organizations did not own the means of production. Wages were set at a level that permitted a decent standard of living; they rewarded specialist skills and educational qualifications. In macroeconomic terms, the plan allocated the whole national product to workers in the form of wages for the workers' own use, with a fraction withheld for investment and for imports from abroad. The difference between the average value of wages and the value of national output per worker did not imply the existence of surplus value since it was part of a consciously formulated plan for the development of society. The presence of inequality in the socialist planned economies did not imply that an exploiting class existed. In the Soviet Union, communist-party members were able to buy scarce goods in special shops and the leadership elite took advantage of state property to live in more spacious accommodation - and sometimes in luxury. Although they received privileges not commonly available and some additional income in kind, there was no difference in their official remuneration in comparison to their non-party peers. 
Enterprise managers and workers received only the wages and bonuses related to the production targets that the planning authorities had set. Outside of the cooperative sector - which enjoyed greater economic freedoms and whose profits were shared among all members of the cooperative - there was no profit-taking class. Other analysts maintain that workers in the Soviet Union and in other Marxist–Leninist states had genuine control over the means of production through institutions such as trade unions.
|
https://en.wikipedia.org/wiki?curid=43069513
| 774,037 |
944,293 |
The qualifications for prospective astronauts had been a point of contention since the creation of NASA in 1958. Requiring astronauts to have a background as pilots was a logical choice, specifically test pilots with a disposition to train on and learn to fly new craft designs. The consensus sought jet test pilots from the military, a field from which women were barred at the time and therefore excluded from consideration by default. However, NASA also required potential astronauts to hold college degrees, a qualification that John Glenn of the Mercury 7 group did not possess. Although Glenn had begun studying chemistry at Muskingum College in 1939, when the United States entered World War II he left college before completing his final year to enlist in the U.S. Navy; his selection demonstrated that NASA was sometimes willing to make exceptions to these requirements. The larger issue behind this pretense, illustrated by Glenn's case and by the overall fight of the Mercury 13, was the maintenance of the social order. Change was needed for women to be considered, but it was quietly and vehemently resisted by those already benefiting from positions their gender secured them. Little to no support ever surfaced for the merit, strength, or intellect women possessed for the role of astronaut, despite evidence to the contrary. Some obvious concerns for NASA during the space race included, but were not limited to, oxygen consumption and the effect of weight on drag at takeoff. After the undeniable success of their testing, the FLATs no longer had to prove their physical and psychological fitness; they were pushing the 'social order' to convince NASA that women had a right to the same roles men were granted as astronauts. It was not until 1972 that an amendment to Title VII of the Civil Rights Act of 1964 finally gave women legal grounds for entering the realm of space. By 1978, the jet-fighter-pilot requirement was no longer an obstacle for women candidates. 
NASA had its first class with women that year. They were admitted into a new category of astronaut, the mission specialist.
|
https://en.wikipedia.org/wiki?curid=4054350
| 943,791 |
376,993 |
Jennifer Doudna was born February 19, 1964, in Washington, D.C., the daughter of Dorothy Jane (Williams) and Martin Kirk Doudna. Her father received his Ph.D. in English literature from the University of Michigan, and her mother, a stay-at-home parent, held a master's degree in education. When Doudna was seven years old, the family moved to Hawaii so her father could accept a teaching position in American literature at the University of Hawaii at Hilo. Doudna's mother earned a second master's degree in Asian history from the university and taught history at a local community college. Growing up in Hilo, Hawaii, Doudna was fascinated by the environmental beauty of the island and its flora and fauna. Nature built her sense of curiosity and her desire to understand the underlying biological mechanisms of life. This was coupled with the atmosphere of intellectual pursuit that her parents encouraged at home. Her father enjoyed reading about science and filled the home with many books on popular science. When Doudna was in the sixth grade, he gave her a copy of James Watson's 1968 book on the discovery of the structure of DNA, "The Double Helix," which was a major inspiration. Doudna also developed her interest in science and mathematics in school. Even though Doudna was told that "Women don't go into science," she knew that she wanted to be a scientist no matter what. Nothing said to her made her doubt it. As Doudna put it, "When someone tells me I can't do something and I know that I can, it just makes me more resolved to do it." While she attended Hilo High School, Doudna's interest in science was nurtured by her 10th-grade chemistry teacher, Ms. Jeanette Wong, whom she has routinely cited as a significant influence in sparking her nascent scientific curiosity. A visiting lecturer on cancer cells further encouraged her pursuit of science as a career choice. 
She spent a summer working in the University of Hawaii at Hilo lab of noted mycologist Don Hemmes and graduated from Hilo High School in 1981.
|
https://en.wikipedia.org/wiki?curid=36836014
| 376,798 |
1,516,787 |
Ibrahim López García (Cabure, 1925 – Maracaibo, 1994). A rare visionary trained in civil engineering, he taught and conducted research for many years at the Central University of Venezuela and the University of Zulia. His practice was based on careful observation of nature and its surprising structures and designs. He sought to make engineering proposals lighter and fresher, optimizing resources and reducing environmental impact. Acting on these deep convictions, he founded the Social Ecological Movement for the XXI Century at the end of the sixties, with strong environmental principles. He also carried out several construction projects, such as the roof of the "José Pérez Colmenares Stadium" (Maracay, Aragua state), in which his new thinking became evident in a design inspired by the shape of a palm leaf. In 1970, he prepared a challenging promotional work, titled "On tops, domes and flights", that tried to break with some paradigms of modernity. First, it criticizes our reliance on fire-based technology, on combustion, by proposing the use of alternative energies. He also criticizes the fact that humans have been inspired by fish and birds to design aircraft, helicopters and submarines, since their principle of motion is linear, which in his opinion results in wasted energy and fuel. Drawing on a long study of spores, turtle shells and other natural domes, he proposed the construction of an airship based on the way these organisms move. His airship model consisted of a large central dome surrounded by a ring of aerodynamic rotating domes that would allow it to travel through the air and even through water, together with an engine that applies the laws of electromagnetism.
|
https://en.wikipedia.org/wiki?curid=29302481
| 1,515,935 |
749,992 |
During the year 2000, an unusual meeting took place with a next-door neighbour (Francis 'Frank' Smith) of the then HMAS "Stirling" Naval Base commander. He was an aircraft maintenance engineer (originally trained at Government Aircraft Factories, Fishermans Bend) who had been aware of the fluid dynamics issues of the Collins class for some time, purely through interest and observation on television. After a lengthy discussion, he was invited to discuss and, where possible, demonstrate his observations at the "Stirling" Naval Base with Navy and Defence Science and Technology Organisation (DSTO) staff who were there at that time as part of an investigative group. He showed on a whiteboard the aerofoil issue with the dorsal sail (conning tower) structure, arguing that the aspect ratio (span (height) to chord (width)) was too short and that severe turbulence/cavitation would be generated by such a design. This was demonstrated on the whiteboard using aircraft aerofoil wing shapes as the basis for the discussion: the turbulence/cavitation generated would, by natural rearward flow, move down the rear upper deck surface of the hull and be drawn into the propeller. He was also able to demonstrate that the design of the bow section would not pass a flow test for generated turbulence/cavitation, the change in shape from circular bow section to long hull being ill-conceived. He made several recommendations during the lecture that would be cost-effective and feasible: 1) to lengthen and taper the dorsal fin and create a more streamlined integration of the dorsal fin with the flat upper hull deck section; and 2) to 'fill in' the hollow section of hull aft of the bow curvature. Both could be achieved with carbon fibre or fibreglass covers, as no load-bearing strength would be required. 
Subsequent studies by the DSTO showed that the submarine's hull shape, particularly the redesigned sonar dome, the fin, and the rear of the submarine, focused the displaced water into two turbulent streams; when the seven propeller blades hit these streams, the propeller's vibration was increased, causing cavitation. These problems were fixed by modifying the casing of the submarine with fiberglass fairings.
|
https://en.wikipedia.org/wiki?curid=810758
| 749,594 |
120,639 |
To date, the longest continuous human occupation of space is aboard the International Space Station, which remains in continuous use. Valeri Polyakov's record single spaceflight of almost 438 days aboard the Mir space station has not been surpassed. The health effects of space have been well documented through years of research conducted in the field of aerospace medicine. Analog environments similar to those one may experience in space travel (like deep-sea submarines) have been used in this research to further explore the relationship between isolation and extreme environments. It is imperative that the health of the crew be maintained, as any deviation from baseline may compromise the integrity of the mission as well as the safety of the crew, which is why astronauts must endure rigorous medical screenings and tests before embarking on any mission. However, it does not take long for the environmental dynamics of spaceflight to begin taking their toll on the human body; for example, space motion sickness (SMS), a condition which affects the neurovestibular system and culminates in mild to severe signs and symptoms such as vertigo, dizziness, fatigue, nausea, and disorientation, plagues almost all space travelers within their first few days in orbit. Space travel can also have a profound impact on the psyche of the crew members, as delineated in anecdotal writings composed after their retirement. Space travel can adversely affect the body's natural biological clock (circadian rhythm) and sleep patterns, causing sleep deprivation and fatigue, and can strain social interaction; consequently, residing in a Low Earth Orbit (LEO) environment for a prolonged amount of time can result in both mental and physical exhaustion. Long-term stays in space reveal issues with bone and muscle loss in low gravity, immune system suppression, and radiation exposure. 
The lack of gravity causes fluid to shift upward, which can cause pressure to build up in the eye, resulting in vision problems; other effects include the loss of bone minerals and density, cardiovascular deconditioning, and decreased endurance and muscle mass.
|
https://en.wikipedia.org/wiki?curid=28431
| 120,590 |
1,443,078 |
Fay Kellogg (1871–1918) learned her architectural skills from a German tutor who taught her drafting, at the Pratt Institute in Brooklyn, and by working with Marcel de Monclos in his Paris atelier. She had hoped to study at the École des Beaux-Arts but, as a woman, was refused admission. As a result of her efforts, however, the institution later opened its doors to women wishing to study architecture. On her return to the United States, Kellogg helped design the Hall of Records in Lower Manhattan before opening a studio of her own. She went on to design hundreds of buildings in the New York area, prompting the New York Times to describe her as "one of the most successful woman architects in America". Mary Gannon and Alice Hands were early graduates of the New York School of Applied Design for Women in 1892, and in 1894 formed an architectural firm, Gannon and Hands, that focused on low-cost residential housing in New York City. Julia Morgan (1872–1957) was the first woman to receive a degree in architecture from the École des Beaux-Arts. She was initially refused admission as a woman in 1896 but reapplied and was successfully admitted in 1898. After graduating in 1901, she returned to California, where she had a prolific and innovative career, blazing new paths professionally, stylistically, structurally, and aesthetically, and setting high standards of excellence in the profession. Completing over 700 projects, she is especially known for her work for women's organizations and key clients, including Hearst Castle in San Simeon, considered one of her masterpieces. She was the first woman architect licensed in California. Mary Rockwell Hook (1877–1978) from Kansas also traveled to study architecture at the École des Beaux-Arts, where she faced discrimination against women after sitting for examinations. She did not gain admittance and returned to America in 1906, where she nonetheless practiced architecture. 
She designed the Pine Mountain Settlement School in Kentucky as well as a number of buildings in Kansas City where she was the first architect to incorporate the natural terrain into her designs and the first to use cast-in-place concrete walls.
|
https://en.wikipedia.org/wiki?curid=35552352
| 1,442,265 |
1,015,624 |
In 2013, Lockheed Martin began to lay off workers at the Fort Worth, Texas plant where the F-35s were being assembled. The company said that revised estimates indicated that the costs of refitting the 187 aircraft built by the time testing concluded in 2016 would be lower than feared. The GAO's Michael Sullivan said that the company had failed to get an early start on systems engineering and had not understood the requirements or the technologies involved at the program's start. The Pentagon vowed to continue funding the program during budget sequestration if possible. It was feared that the U.S. budget sequestration in 2013 could slow development of critical software, and Congress ordered another study of the software development delays. As of 2014, software development remained the "number one technical challenge" for the F-35. In June 2013, Frank Kendall, Pentagon acquisition, technology and logistics chief, declared that "major advances" had been made in the F-35 program over the previous three years and that he intended to approve production-rate increases in September. Air Force Lt. Gen. Christopher Bogdan, the program executive officer, reported far better communications between government and vendor managers, and that negotiations over Lots 6 and 7 were moving fast. He also stated that operating costs had been better understood since training started, predicting "we can make a substantial dent in projections" of operating costs. In July 2013, further doubt was cast on the latest production schedule, with further software delays and continuing sensor, display and wing-buffet problems. In August it was revealed that the Pentagon was weighing cancellation of the program as one possible response to budget sequestration, and the United States Senate Appropriations Subcommittee on Defense voted to cut advance procurement for the fighter.
|
https://en.wikipedia.org/wiki?curid=54719700
| 1,015,101 |
1,149,564 |
Cahokia, Pre-Columbian North America's largest civic center north of Mexico, produced some of the finest and most widely distributed ceramics. Pottery from the Cahokia site was especially fine, exhibiting smooth surfaces, very thin walls, and distinctive tempering, slips and coloring. Archaeologists have recorded how these qualities changed and evolved through time, and most examples can be pinpointed very accurately within the phases of the site's chronology. Ramey Incised and Powell Plain are two varieties that emerged during the Stirling Phase and are considered two of the most important local varieties. A distinctive trait of this period is the shell temper. The cores of the sherds typically range from greys to buffs and creams. Some have slips of liquid clay and pigment, with common colors being red, grey, and black, and the surfaces polished to a high sheen. Although their physical attributes are nearly identical, the two varieties differ markedly in decoration: while Powell Plain has an unadorned surface, Ramey Incised vessels are burnished and decorated with a series of incised motifs on the upper shoulders of the jar, most often interpreted as having underworld or water connections. The incised decoration is added while the clay is still wet by tracing a design with a blunt-ended tool. The specific shapes and incised motifs are used to place the artifacts securely within the local chronology. Most have been found in association with high-status items fashioned from exotic materials and with specialized structures such as mortuaries and temples, and were almost certainly vessels used exclusively by the elites and for ritual purposes. As the influence of the Cahokian religion, lifestyle and trade network expanded outward from its American Bottom origins, examples of its high-status pottery went with it. Numerous local imitations of the exotic wares have been found, albeit usually made with less technical skill. 
Examples of Cahokian-made or Cahokian-inspired wares have been found as far away as Aztalan in Wisconsin, the Winterville site in Mississippi, and Fort Ancient sites in Ohio.
|
https://en.wikipedia.org/wiki?curid=28080740
| 1,148,957 |
162,195 |
There was also a systemic malaise in British industry, which was famously inefficient and opposed to innovation. Tony Judt described the prevailing attitude of post-war industrialists: "British factory managers preferred to operate in a cycle of under-investment, limited research and development, low wages and a shrinking pool of clients, rather than risk a fresh start with new products in new markets." The overriding emphasis placed on exports by the British government, in its effort to repair the nation's dollar deficit, made things worse, because it encouraged manufacturers to place all investment in expanding output, at the expense of updating machinery, introducing new technologies, improving production methods, etc. This policy was sustainable in the short term, because in the late 1940s and early 1950s world trade boomed and Britain, with its large and relatively undamaged industrial base, was in a uniquely advantageous position to satisfy demand. In 1950, 25 percent of world exports were British-made, and the total volume of British manufactured goods was double that of France and Germany combined. However, by the late 1950s, the economies of West Germany, France, Japan, and Italy had recovered from wartime infrastructure damage, replacing destroyed stock with state-of-the-art machinery and applying modern production methods in a process called "rejuvenation by defeat". Continental governments actively encouraged recovery through direct investment and subsidies in targeted industries, in the case of Italy and France, or more widely through encouraging easy access to credit through national banks, a marked characteristic in France and West Germany. British industrialists saw no such intervention from their own government, which more or less left the private sector to itself. 
British goods were also more expensive abroad because of Sterling's overvaluation, but inferior in quality compared to the products flooding the world market from the United States, Germany and Italy.
|
https://en.wikipedia.org/wiki?curid=33643110
| 162,110 |
2,066,961 |
Independently of the Royal Brompton group of the 1980s, a Cambridge (UK) consortium of clinicians and engineers developed a system in 2009 that revisited structured light patterns as a noninvasive method for collecting accurate representations of chest and abdominal wall movement. The methodology has several advantages: no fluorescent markers are required to define the chest or abdominal surface, and the hardware can be reduced to two digital cameras and a digital projector when imaging the anterior surface of the body. The projector shines a grid of black and white squares from the superior iliac crest to the clavicle; the subject can wear a plain t-shirt of any colour. The two digital cameras image the grid on the chest and abdomen, the software extracts two sets of 2D image positions of the grid points, and stereo vision is used to reconstruct these grid points into a 3D representation of the chest and abdominal wall surface. The group has tried grids ranging from a single point to 2,000 points and is currently working with roughly 200 to 300. The thoracic volume is calculated from the volume beneath the reconstructed virtual surface and can be plotted in real time. Calibration is both internal (the relative position of the cameras and the shape of the subject are auto-calibrated at reconstruction time) and external (by placing an object of reference size, such as a sheet of paper of known dimensions, in front of the cameras; this has now potentially been transferred to projected points within the grid itself). The group has collected data on up to 70 adults and 5 children, and has also collected data from small mammals. They have compared simultaneous measurements by pneumotachograph and SLP of tidal breathing followed by full inspiration and forced expiration. The pneumotachograph data were obtained using a laptop-based spirometer and exported for analysis using J-scope software. Extremely good correlations have been obtained for tidal breathing and forced expiration manoeuvres.
Tidal breathing correlations have been shown to be 0.99 for individual data sets; for n=70 the mean correlation was 0.92 with an SD of 0.04. For forced expiration the correlation was shown to be 0.98; for n=70 the mean correlation was 0.98 with an SD of 0.12. For forced expiration/inspiration manoeuvres, the correlation coefficients (r²) were 0.84 for peak flow, 0.95 for FEV1, 0.76 for FEF75 and 0.69 for FEF50.
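The core computation described above (integrating the volume beneath a reconstructed surface grid and correlating the resulting volume trace with a spirometry signal) can be sketched as follows. This is an illustrative toy in Python, not the group's software: the grid size, spacing, and breathing signal are all synthetic, and the volume is approximated as a simple sum of cell height times cell area.

```python
import numpy as np

def volume_under_surface(z, dx, dy):
    """Approximate the volume between a reconstructed surface grid and a
    reference plane (z = 0) by summing cell height x cell area."""
    return float(np.sum(z) * dx * dy)

def pearson_r(a, b):
    """Pearson correlation between two equal-length signals, e.g. an SLP
    volume trace and a simultaneous pneumotachograph trace."""
    a = np.array(a, dtype=float)
    b = np.array(b, dtype=float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# Toy example: a 15x15 grid (~200 points, as in the study) whose height
# oscillates with a simulated tidal-breathing signal.
t = np.linspace(0, 10, 200)                          # seconds
breath = 0.5 + 0.1 * np.sin(2 * np.pi * 0.25 * t)    # simulated chest height
grid = np.ones((15, 15))                             # flat surface template
volumes = [volume_under_surface(grid * h, dx=0.02, dy=0.02) for h in breath]

print(round(pearson_r(volumes, breath), 3))  # noiseless toy data -> 1.0
```

Because the toy volume is a linear function of the simulated chest height, the correlation is exactly 1; real SLP-versus-pneumotachograph comparisons report the slightly lower values quoted above.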
|
https://en.wikipedia.org/wiki?curid=22391357
| 2,065,770 |
559,071 |
In 1848, John Martyn Harlow described that Phineas Gage had his frontal lobe pierced by an iron tamping rod in a blasting accident. He became a case study in the connection between the prefrontal cortex and executive functions. In 1861, Paul Broca heard of a patient at the Bicêtre Hospital who had a 21-year progressive loss of speech and paralysis but neither a loss of comprehension nor mental function. Broca performed an autopsy and determined that the patient had a lesion in the frontal lobe in the left cerebral hemisphere. Broca published his findings from the autopsies of twelve patients in 1865. His work inspired others to perform careful autopsies with the aim of linking more brain regions to sensory and motor functions. Another French neurologist, Marc Dax, made similar observations a generation earlier. Broca's hypothesis was supported by Gustav Fritsch and Eduard Hitzig who discovered in 1870 that electrical stimulation of motor cortex caused involuntary muscular contractions of specific parts of a dog's body and by observations of epileptic patients conducted by John Hughlings Jackson, who correctly deduced in the 1870s the organization of the motor cortex by watching the progression of seizures through the body. Carl Wernicke further developed the theory of the specialization of specific brain structures in language comprehension and production. Richard Caton presented his findings in 1875 about electrical phenomena of the cerebral hemispheres of rabbits and monkeys. In 1878, Hermann Munk found in dogs and monkeys that vision was localized in the occipital cortical area, David Ferrier found in 1881 that audition was localized in the superior temporal gyrus and Harvey Cushing found in 1909 that the sense of touch was localized in the postcentral gyrus. 
Modern research still uses Korbinian Brodmann's cytoarchitectonic (referring to the study of cell structure) anatomical definitions from this era, continuing to show that distinct areas of the cortex are activated in the execution of specific tasks.
|
https://en.wikipedia.org/wiki?curid=4794482
| 558,782 |
643,984 |
Price's framework, like Garfield's, takes for granted the structural inequality of science production: a minority of researchers creates a large share of publications, and an even smaller share has a real, measurable impact on subsequent research (with as few as 2% of papers having 4 citations or more at the time). Despite the unprecedented growth of post-war science, Price argued for the continued existence of an "invisible college" of elite scientists that, as in the time of Robert Boyle, undertook the most valuable work. While Price was aware of the power relationships that ensured the domination of such an elite, there was a fundamental ambiguity in bibliometric studies, which highlighted the concentration of academic publishing and prestige but also created tools, models and metrics that normalized pre-existing inequalities. The central position of the Science Citation Index amplified this performative effect. At the end of the 1960s Eugene Garfield formulated a "law of concentration" that was formally a reinterpretation of Samuel Bradford's "law of scattering", with a major difference: while Bradford spoke from the perspective of a specific research project, Garfield generalized the law to the entire set of scientific publishing: "the core literature for all scientific disciplines involves a group of no more than 1000 journals, and may involve as few as 500." This law also justified the practical limitation of the citation index to a limited subset of "core" journals, with the underlying assumption that any expansion into second-tier journals would yield diminishing returns.
Rather than simply observing structural trends and patterns, bibliometrics tend to amplify and stratify them even further: "Garfield's citation indexes would have brought to a logical completion, the story of a stratified scientific literature produced by (…) a few, high-quality, "must-buy" international journals owned by a decreasing number of multinational corporations ruling the roost in the global information market."
|
https://en.wikipedia.org/wiki?curid=1223245
| 643,644 |
1,193,781 |
Visual inspection of the cervix, using acetic acid (white vinegar; VIA) or Lugol's iodine (VILI) to highlight precancerous lesions so they can be viewed with the "naked eye", shifts the identification of precancer from the laboratory to the clinic. This method is also referred to as direct visual inspection or cervicoscopy. Such procedures eliminate the need for laboratories and transport of specimens, require very little equipment and provide women with immediate test results. A range of medical professionals—doctors, nurses, or professional midwives—can effectively perform the procedure, provided they receive adequate training and supervision. As a screening test, VIA may perform as well as or better than cervical cytology in accurately identifying pre-cancerous lesions. This has been demonstrated in various studies where trained physicians and mid-level providers correctly identified between 45% and 79% of women at high risk of developing cervical cancer. Though VIA has limited specificity and a low positive predictive value (~10%), it is economical, requires little equipment, and provides immediate results. By comparison, the sensitivity of cytology has been shown to be between 47% and 62%. Cytology provides higher specificity (fewer false positives) than VIA. Like cytology, one of the limitations of VIA is that results are highly dependent on the accuracy of an individual's interpretation. This means that initial training and ongoing quality control are of paramount importance. Increased false positives are particularly important in a screen-and-treat setting, since over-treatment and the resulting impairment of fertility are more likely.
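The figures quoted above (sensitivity, specificity, and the ~10% positive predictive value) all derive from the standard 2x2 confusion matrix. A minimal sketch with purely hypothetical counts shows how low disease prevalence depresses PPV even when sensitivity is adequate:

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening-test measures from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # diseased women correctly flagged
        "specificity": tn / (tn + fp),   # healthy women correctly cleared
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical cohort of 10,000 women with 1% prevalence: even with 70%
# sensitivity and ~94% specificity, most positives are false positives.
m = screening_metrics(tp=70, fp=630, fn=30, tn=9270)
print(m)  # sensitivity 0.70, specificity ~0.94, PPV 0.10
```

The same arithmetic explains why over-treatment is a concern in screen-and-treat settings: at low prevalence, nine out of ten positive results in this toy cohort are false.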
|
https://en.wikipedia.org/wiki?curid=31322039
| 1,193,141 |
1,050,663 |
Recent developments have extended thermal shift approaches to the analysis of ligand interactions in complex mixtures, including intact cells. Initial observations of individual proteins using fast parallel proteolysis (FastPP) showed that stabilization by ligand binding could impart resistance to proteolytic digestion with thermolysin. Protection relative to a reference was quantified through either protein staining on gels or western blotting with a labeling antibody directed to a tag fused to the target protein. CETSA, for cellular thermal shift assay, is a method that monitors the stabilization effect of drug binding through the prevention of irreversible protein precipitation, which is usually initiated when a protein becomes thermally denatured. In CETSA, aliquots of cell lysate are transiently heated to different temperatures, after which samples are centrifuged to separate soluble fractions from precipitated proteins. The presence of the target protein in each soluble fraction is determined by western blotting and used to construct a CETSA melting curve that can provide information about in vivo targeting, drug distribution, and bioavailability. Both FastPP and CETSA generally require antibodies to facilitate target detection, and consequently are generally used in contexts where the target identity is known a priori. Newer developments seek to merge aspects of the FastPP and CETSA approaches by assessing the ligand-dependent proteolytic protection of targets in cells, using mass spectrometry (MS) to detect shifts in proteolysis patterns associated with protein stabilization. Present implementations still require a priori knowledge of expected targets to facilitate data analysis, but improvements in MS data collection strategies, together with the use of improved computational tools and database structures, can potentially allow the approach to be used for de novo target deconvolution on the total cell proteome scale.
This would be a major advance for drug discovery since it would allow the identification of discrete molecular targets (as well as off-target interactions) for drugs identified through high-content cellular or phenotypic drug screens.
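A CETSA melting curve is commonly summarized by the midpoint temperature (Tm) of a two-state sigmoid fitted to the soluble fraction measured at each heating step; drug binding shifts Tm upward. The sketch below is illustrative only: the densitometry values are synthetic, and a simple grid search stands in for whatever curve-fitting routine a published pipeline would use.

```python
import numpy as np

def melt(T, Tm, slope=2.0):
    """Two-state sigmoid: fraction of protein remaining soluble at T (deg C)."""
    return 1.0 / (1.0 + np.exp((T - Tm) / slope))

def fit_tm(temps, soluble, candidates=np.arange(40.0, 65.0, 0.1)):
    """Estimate the melting temperature by least squares over a grid of
    candidate Tm values (a minimal stand-in for a full curve fit)."""
    errors = [np.sum((melt(temps, tm) - soluble) ** 2) for tm in candidates]
    return float(candidates[int(np.argmin(errors))])

temps = np.arange(37.0, 68.0, 3.0)   # heating steps, deg C
apo   = melt(temps, Tm=50.0)         # synthetic soluble fractions, no drug
bound = melt(temps, Tm=54.0)         # drug binding shifts the midpoint up

print(f"thermal shift: {fit_tm(temps, bound) - fit_tm(temps, apo):.1f} deg C")
```

For this noiseless synthetic data the recovered shift is ~4 degrees; real western-blot densitometry would add noise and require replicate curves.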
|
https://en.wikipedia.org/wiki?curid=38876059
| 1,050,117 |
1,874,041 |
Anticoagulation therapy has a long history. In 1884 John Berry Haycraft described a substance found in the saliva of the leech "Hirudo medicinalis" that had anticoagulant effects. He named the substance 'hirudine' after the leech's Latin name. The use of medicinal leeches can be dated all the way back to ancient Egypt. In the early 20th century Jay McLean, L. Emmet Holt Jr. and William Henry Howell discovered the anticoagulant heparin, which they isolated from the liver (hepar). Heparin remains one of the most effective anticoagulants and is still used today, although it has its disadvantages, such as requiring intravenous administration and having a variable dose-response curve due to substantial protein binding. In the 1980s low-molecular-weight heparins (LMWHs) were developed. They are derived from heparin by enzymatic or chemical depolymerization and have better pharmacokinetic properties than heparin. In 1955 the first clinical use of warfarin, a vitamin K antagonist, was reported. Warfarin was originally introduced as a rat poison in 1948 and thought to be unsafe for humans, but a suicide attempt suggested that it was relatively safe. Vitamin K antagonists are the most commonly used oral anticoagulants today: warfarin was the 11th most prescribed drug in the United States in 1999 and is the most widely prescribed oral anticoagulant worldwide. Warfarin has its disadvantages though, just like heparin, such as a narrow therapeutic index and multiple food and drug interactions, and it requires routine anticoagulation monitoring and dose adjustment. Since both heparin and warfarin have their downsides, the search for alternative anticoagulants has been ongoing, and DTIs are proving to be worthy competitors. The first DTI was in fact hirudin, which became more easily available with genetic engineering. It is now available in recombinant form as lepirudin (Refludan) and desirudin (Revasc, Iprivask).
Development of other DTIs followed, first with the hirudin analog bivalirudin and then with the small-molecule DTIs. However, these DTIs also had side effects, such as bleeding complications and liver toxicity, and their long-term effects remained in doubt.
|
https://en.wikipedia.org/wiki?curid=37120076
| 1,872,964 |
1,856,684 |
In 2001 Insoll began his longest running research mission, the "Early Islamic Bahrain" project, sponsored by HRH Shaikh Salman bin Hamad Al-Khalifa, Crown Prince and Prime Minister of the Kingdom of Bahrain. This has involved excavations and surveys nearly every year since, with co-directors Dr Salman Almahari and Dr Rachel MacLean, and latterly, Prof. Robert Carter. The aims of the project were to reconstruct settlement patterns in Bahrain from the Late Antique period onwards, and to evaluate archaeological evidence for trade, conversion to Islam, and the composition of the population over time. These aims are being achieved through identification of a sequence of major settlements occupied at different periods between the 6th and 19th centuries, and through exploring extensive archaeological evidence for trade contacts with Iraq, elsewhere in the Arabian peninsula, Iran, India, the Red Sea and Indian Ocean, as well as material manifestations of a complex and diverse community including African, Indian, and Christian elements. The latter is attested by a large building, possibly the Bishop's palace of the diocese of Mashmahig, recently identified and excavated in Samahij. The research has employed scientific techniques innovative in Arabian Gulf archaeology, such as micro-malacological analyses, which indicated the impact of disease through identification of the vectors for Oriental lung fluke and schistosomiasis/bilharzia. The research has resulted in a permanent site museum in Bilad al-Qadim, an international conference "Islamic Archaeology in Global Perspective" in Bahrain National Museum (2017), and publications, including a study of all the Islamic inscriptions on Bahrain from before 1900 and an "Archaeological Guide to Bahrain" to encourage tourism. The project has had an impact in Bahrain, where it has generated substantial interest in social media and via public archaeology days.
|
https://en.wikipedia.org/wiki?curid=38891541
| 1,855,616 |
1,276,223 |
By the spring of 1963, the Ovshinskys had exhausted the savings with which they had initially funded ECL. Before seeking public funding, Stan wanted validation of the importance of his work from a well-recognized scientist. He telephoned Nobel Laureate John Bardeen, a co-inventor of the transistor and co-discoverer of the BCS theory of superconductivity. Bardeen immediately recognized the importance of Ovshinsky's work, but his schedule did not permit him to visit ECL for five months. Stan replied, "We'll be broke by then." In his place, Bardeen sent Hellmut Fritzsche, a University of Chicago physicist. Fritzsche became very positive in his support of Ovshinsky's work and helped attract other scientists to the Ovshinsky laboratory. As Fritzsche and Brian Schwartz later wrote, "There is a mysterious quality in Ovshinsky's persona that attracts people into his sphere, builds life long friendships and awakens deep respect and devotion. Meeting him leaves each person with a deep impression of his superior intellect, his self confidence, his compassion to improve society combined with his certainty that his vision can be realized. His enthusiasm is contagious. In his presence, you feel how exciting it would be to join him in his endeavors." Among the many famous scientists who came regularly to ECL as friends or collaborators over the next years were David Adler, Bardeen, Arthur Bienenstock, Morrel Cohen, Kenichi Fukui, William Lipscomb, Sir Nevill Mott, Linus Pauling, Isidor I. Rabi, Edward Teller, David Turnbull, Victor Weisskopf, and Robert R. Wilson. Some joined as consultants or as members of the Board of Directors. Meanwhile, the ECL community developed a uniquely productive, non-hierarchical, multicultural, international environment, reflecting Stan and Iris' social values. In 1964, Stan and Iris changed the laboratory's name to Energy Conversion Devices and moved the company to larger quarters in Troy, Michigan.
|
https://en.wikipedia.org/wiki?curid=2785878
| 1,275,530 |
710,163 |
During the 1950s and 1960s there was a great burst of activity by propellant chemists to find high-energy liquid and solid propellants better suited to the military. Large strategic missiles need to sit in land-based or submarine-based silos for many years, able to launch at a moment's notice. Propellants requiring continuous refrigeration, which cause their rockets to grow ever-thicker blankets of ice, were not practical. As the military was willing to handle and use hazardous materials, a great number of dangerous chemicals were brewed up in large batches, most of which wound up being deemed unsuitable for operational systems. In the case of nitric acid, the acid itself (HNO₃) was unstable and corroded most metals, making it difficult to store. The addition of a modest amount of nitrogen tetroxide (N₂O₄) turned the mixture red and kept it from changing composition, but left the problem that nitric acid corrodes the containers it is placed in, releasing gases that can build up pressure in the process. The breakthrough was the addition of a little hydrogen fluoride (HF), which forms a self-sealing metal fluoride layer on the interior of the tank walls; the resulting mixture, "Inhibited" Red Fuming Nitric Acid (IRFNA), was storable. Propellant combinations based on IRFNA or pure N₂O₄ as oxidizer, with kerosene or the hypergolic (self-igniting) aniline, hydrazine or unsymmetrical dimethylhydrazine (UDMH) as fuel, were then adopted in the United States and the Soviet Union for use in strategic and tactical missiles. The self-igniting storable liquid bi-propellants have somewhat lower specific impulse than LOX/kerosene, but have higher density, so a greater mass of propellant can be placed in the same sized tanks. Gasoline was replaced by different hydrocarbon fuels, for example RP-1, a highly refined grade of kerosene. This combination is quite practical for rockets that need not be stored.
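The trade-off described above (lower specific impulse but higher density for the storable hypergols) is often compared via "density impulse": specific impulse times mean propellant density. The sketch below computes this for two combinations; the Isp values, densities, and mixture ratios are rough handbook-style approximations chosen for illustration, not authoritative figures.

```python
def mean_density(rho_ox, rho_fuel, mixture_ratio):
    """Mass-weighted average propellant density (kg/m^3) for an
    oxidizer/fuel mass mixture ratio r = m_ox / m_fuel."""
    r = mixture_ratio
    return (1 + r) / (1 / rho_fuel + r / rho_ox)

# Rough, illustrative values: sea-level Isp in seconds, densities in kg/m^3.
combos = {
    "LOX / kerosene": dict(isp=300, rho_ox=1141, rho_fuel=810, mr=2.3),
    "IRFNA / UDMH":   dict(isp=270, rho_ox=1550, rho_fuel=790, mr=3.0),
}

for name, c in combos.items():
    rho = mean_density(c["rho_ox"], c["rho_fuel"], c["mr"])
    print(f"{name}: Isp {c['isp']} s, mean density {rho:.0f} kg/m^3, "
          f"density impulse {c['isp'] * rho:.0f}")
```

With these illustrative numbers the storable combination comes out ahead on density impulse despite its lower Isp, which is exactly the packaging advantage the text describes for fixed-size missile tanks.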
|
https://en.wikipedia.org/wiki?curid=1078982
| 709,792 |
789,430 |
Future directions for AAC focus on improving device interfaces, reducing the cognitive and linguistic demands of AAC, and removing the barriers to effective social interaction. AAC researchers have challenged manufacturers to develop communication devices that are more appealing aesthetically, with greater options for leisure and play, and that are easier to use. The rapid advances in smartphone and tablet computer technologies have the potential to radically change the availability of economical, accessible, flexible communication devices; however, user interfaces are needed that meet the various physical and cognitive challenges of AAC users. Android and other open-source operating systems provide opportunities for small communities, such as AAC, to develop the accessibility features and software required. Other promising areas of development include access to communication devices using signals from movement recognition technologies that interpret body motions, or electrodes measuring brain activity, and the automatic transcription of dysarthric speech using speech recognition systems. Utterance-based systems, in which frequent utterances are organized in sets to improve the speed of communication exchange, are also in development. Similarly, research has focused on the provision of timely access to vocabulary and conversation appropriate for specific interactions. Natural language generation techniques have been investigated, including the use of logs of past conversations with conversational partners, data from a user's schedule and from real-time Internet vocabulary searches, as well as information about location from global positioning systems and other sensors.
However, despite the frequent focus on technological advances in AAC, practitioners are urged to retain the focus on the communication needs of the AAC users: "The future for AAC will not be driven by advances in technology, but rather by how well we can take advantage of those advancements for the enhancement of communicative opportunities for individuals who have complex communication needs".
|
https://en.wikipedia.org/wiki?curid=2106968
| 789,006 |
571,132 |
In 1914, the Nernsts were entertaining co-workers and students they had brought to their country estate in a private railway car when they learned that war had been declared. Their two older sons entered the army, while their father enlisted in the voluntary driver's corps. He supported the German army against their opponents' charges of barbarism by signing the Manifesto of the Ninety-Three. On 21 August 1914, he drove documents from Berlin to the commander of the German right wing in France, advancing with them for two weeks until he could see the glow of the Paris lights at night. The tide turned at the battle of the Marne. When the stalemate in the trenches began, he returned home. He contacted Colonel Max Bauer, the staff officer responsible for munitions, with the idea of driving the defenders out of their trenches with shells releasing tear gas. When his idea was tried, one of the observers was Fritz Haber, who argued that too many shells would be needed and that it would be better to release a cloud of heavier-than-air poisonous gas. The first chlorine cloud attack, on 22 April 1915, was not supported by a strong infantry thrust, so the chance that gas would break the stalemate was irrevocably gone. Nernst was awarded the Iron Cross second class. As a Staff Scientific Advisor in the Imperial German Army, he directed research on explosives, much of which was done in his laboratory, where guanidine perchlorate was developed. He then worked on the development of trench mortars. He was awarded the Iron Cross first class and later the "Pour le Mérite". When the high command was considering unleashing unrestricted submarine warfare, he asked the Kaiser for an opportunity to warn about the enormous potential of the United States as an adversary. They would not listen; Ludendorff shouted him down for "incompetent nonsense." During the war he also published his book "The Foundations of the New Heat Theorem".
|
https://en.wikipedia.org/wiki?curid=75876
| 570,841 |
973,109 |
Mortality, though it has lessened to a limited degree, at least in developed countries with timely access to initial and tertiary care, varies: the chance of survival diminishes as the number of organs involved increases. Mortality in MODS from septic shock (which itself has a high mortality of 25-50%), and from multiple traumas, especially if not rapidly treated, appears to be especially severe. If more than one organ system is affected, the mortality rate is higher still, and this is especially the case when five or more systems or organs are affected. Old age is a risk factor in and of itself, and immunocompromised patients, such as those with cancer, AIDS, or a transplant, are at risk. Prognosis must take into account any co-morbidities the patient may have, their past and current health status, any genetic or environmental vulnerabilities, the nature and type of the illness or injury (as an example, data from COVID-19 are still being analyzed, whereas cases from long-existing illnesses are much better understood), and any resistance to drugs used to treat microbial infections or any hospital-acquired co-infection. Earlier and aggressive treatment, the use of experimental treatments, or at least modern tools such as ventilators, ECMO, dialysis, bypass, and transplantation, especially at a trauma center, may improve outcomes in certain cases, but this depends in part on speedy and affordable access to high-quality care, which many areas lack. Measurements of lactate, cytokines, albumin and other proteins, urea, blood oxygen and carbon dioxide levels, insulin, and blood sugar, adequate hydration, constant monitoring of vital signs, good communication within and between facilities and staff, and adequate staffing, training, and charting are important in MODS, as in any serious illness.
|
https://en.wikipedia.org/wiki?curid=856149
| 972,599 |
1,779,297 |
The Sound Surveillance System (SOSUS) was a top-secret project involving a large-scale underwater sensor network operated by the US Navy from 1949 to monitor Soviet vessels in the GIUK Gap. Towards the later part of the Cold War, the facility was opened to academic research in an effort to support operational and maintenance costs. Doing so resulted in a significant boost to research in the field of underwater acoustics, leading to the stabilization of sonar performance in deep waters for multiple non-military applications. The Naval Facility Point Sur, an experimental facility started in 1958, was shut down in 1984 for lack of funding. The ship shock test facility and the SURTASS-LFA project were shifted and scaled down due to environmental concerns raised by the Natural Resources Defense Council (NRDC). The NRDC compelled the Navy to file an environmental impact statement (EIS) for the first time in the early 1990s. In 1996, 13 Cuvier's beaked whales were found stranded alive off the coast of Kyparissiakos Gulf, Greece. Alexandros Frantzis, the scientific director of the Pelagos Cetacean Research Institute, linked the stranding to the use of sonar in the immediate area. The North Atlantic Treaty Organization (NATO) was involved in a joint international experiment using a high-powered low-frequency sonar at the time of the stranding. The event became a rallying point for environmental activists who demanded a complete ban on such trials, and the US Navy agreed to fund research to determine the impacts on marine animals. These incidents and more reflected a major shift in geopolitical realities, where socio-economic issues had to be balanced with national security demands.
|
https://en.wikipedia.org/wiki?curid=60878831
| 1,778,295 |
542,482 |
The quantitative biological effects of cosmic rays are poorly known and are the subject of ongoing research. Several experiments, both in space and on Earth, are being carried out to evaluate the exact degree of danger. Additionally, the impact of the space microgravity environment on DNA repair has in part confounded the interpretation of some results. Experiments over the last 10 years have shown results both higher and lower than predicted by current quality factors used in radiation protection, indicating that large uncertainties exist. Experiments in 2007 at Brookhaven National Laboratory's NASA Space Radiation Laboratory (NSRL) suggest that biological damage due to a given exposure is actually about half what was previously estimated: specifically, they suggested that low-energy protons cause more damage than high-energy ones. This was explained by the fact that slower particles have more time to interact with molecules in the body. This may be interpreted as an acceptable result for space travel, as the cells affected end up with greater energy deposition and are more likely to die without proliferating into tumors. This is in contrast to the current dogma on radiation exposure to human cells, which considers lower-energy radiation to have a higher weighting factor for tumor formation. Relative biological effectiveness (RBE) depends on radiation type, described by particle charge number Z and kinetic energy per amu E, and varies with tumor type, with limited experimental data suggesting leukemias have the lowest RBE and liver tumors the highest, and limited or no experimental data on RBE available for the cancers that dominate human cancer risks, including lung, stomach, breast, and bladder cancers.
Studies of Harderian gland tumors in a single strain of female mice have been made with several heavy ions; however, it is not clear how well the RBE for this tumor type represents the RBE for human cancers such as lung, stomach, breast and bladder cancers, nor how RBE changes with sex and genetic background.
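The role of the quality/weighting factors discussed above can be made concrete with the standard equivalent-dose bookkeeping used in radiation protection: each component's absorbed dose is multiplied by a radiation weighting factor (a regulatory stand-in for RBE) and the products are summed. The numbers below are illustrative only, not mission data or official weighting factors.

```python
def equivalent_dose(components):
    """Sum of absorbed dose (Gy) x radiation weighting factor per component;
    the result is the equivalent dose in sieverts (Sv)."""
    return sum(dose * w for dose, w in components)

# Illustrative mixed cosmic-ray exposure: (absorbed dose in Gy, weighting factor).
mission = [
    (0.10, 1),    # protons and light particles, low weighting factor
    (0.02, 20),   # heavy ions (HZE), high weighting factor
]
print(equivalent_dose(mission))  # 0.10 + 0.40 = 0.50 Sv
```

The sketch shows why uncertainty in the weighting factors matters: here the small heavy-ion dose contributes four times as much equivalent dose as the much larger proton dose, so an error in its factor dominates the risk estimate.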
|
https://en.wikipedia.org/wiki?curid=14415787
| 542,202 |
262,377 |
Rochester is a member of the Association of American Universities and is classified among "R1: Doctoral Universities – Very High Research Activity". Rochester had a research expenditure of $397 million in 2020. In 2008, Rochester ranked 44th nationally in research spending, but this ranking gradually declined to 66th by 2020. Some of the major research centers include the Laboratory for Laser Energetics, a laser-based nuclear fusion facility, and the extensive research facilities at the University of Rochester Medical Center. Recently, the university has also engaged in a series of new initiatives to expand its programs in biomedical engineering and optics, including the construction of the new $37 million Robert B. Goergen Hall for Biomedical Engineering and Optics on the River Campus. Other new research initiatives include a cancer stem cell program and a Clinical and Translational Sciences Institute. UR also has the ninth-highest technology revenue among U.S. higher education institutions, with $46 million being paid for commercial rights to university technology and research in 2009. Notable patents include Zoloft and Gardasil. WeBWorK, a web-based system for checking homework and providing immediate feedback for students, was developed by University of Rochester professors Gage and Pizer. The system is now in use at over 800 universities and colleges, as well as several secondary and primary schools. Rochester scientists work in diverse areas. For example, physicists developed a technique for etching metal surfaces, such as platinum, titanium, and brass, with powerful lasers, enabling self-cleaning surfaces that repel water droplets (which roll off at a tilt of as little as 4 degrees) and will not rust; and medical researchers are exploring how brains rid themselves of toxic waste during sleep.
|
https://en.wikipedia.org/wiki?curid=31918
| 262,238 |
911,561 |
Digital consultant apps like Babylon Health's GP at Hand, Ada Health, AliHealth Doctor You, KareXpert and Your.MD use AI to give medical consultations based on personal medical history and common medical knowledge. Users report their symptoms into the app, which uses speech recognition to compare against a database of illnesses. Babylon then offers a recommended action, taking into account the user's medical history. Entrepreneurs in healthcare have been effectively using seven business model archetypes to take AI solutions to the marketplace. These archetypes depend on the value generated for the target user (e.g. patient focus vs. healthcare provider and payer focus) and value-capturing mechanisms (e.g. providing information or connecting stakeholders). iFlytek launched a service robot "Xiao Man", which integrates artificial intelligence technology to identify registered customers and provide personalized recommendations in medical areas. It also works in the field of medical imaging. Similar robots are also being made by companies such as UBTECH ("Cruzr") and SoftBank Robotics ("Pepper"). The Indian startup Haptik recently developed a WhatsApp chatbot which answers questions associated with the deadly coronavirus in India. With the market for AI expanding constantly, large tech companies such as Apple, Google, Amazon, and Baidu all have their own AI research divisions, as well as millions of dollars allocated for the acquisition of smaller AI-based companies. Many automobile manufacturers are beginning to use machine learning healthcare in their cars as well. Companies such as BMW, GE, Tesla, Toyota, and Volvo all have new research campaigns to find ways of learning a driver's vital statistics to ensure they are awake, paying attention to the road, and not under the influence of substances or in emotional distress. Examples of projects in computational health informatics include the COACH project.
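A symptom-checking app of the kind described can be reduced, in caricature, to ranking conditions by overlap between the reported symptoms and each condition's known symptom set. The toy Jaccard-similarity matcher below is purely illustrative: it is not Babylon's or any vendor's actual algorithm, and the two-entry "database" is invented for the example.

```python
def match_conditions(reported, database):
    """Rank conditions by Jaccard similarity between the reported symptoms
    and each condition's known symptom set (highest score first)."""
    reported = set(reported)
    scores = {
        name: len(reported & syms) / len(reported | syms)
        for name, syms in database.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Tiny invented symptom database for illustration.
db = {
    "common cold": {"cough", "sore throat", "runny nose"},
    "influenza":   {"cough", "fever", "aches", "fatigue"},
}
print(match_conditions(["fever", "cough", "aches"], db))  # influenza ranks first
```

Production systems layer probabilistic reasoning, medical history, and triage rules on top of anything like this, but the core retrieval step is still a similarity ranking.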
https://en.wikipedia.org/wiki?curid=351581
Metal Gear RAY differs from previous Metal Gear models in that it is not a nuclear launch platform, but instead, a weapon of conventional warfare, originally designed by the U.S. Marines to hunt down and destroy the many Metal Gear derivatives that became common after Metal Gear REX's plans leak following the events of Shadow Moses. RAY is designed to be even more maneuverable and flexible in deployment than other models and can operate both on land and in water. While RAY has a pair of Gatling guns and missile launchers on its back and knees to defend itself from more conventional battlefield threats, its primary weapon is a powerful water jet cutter, which can cut through heavily armored foes, such as Metal Gear derivatives. RAY is more organic in appearance and in function than previous models. Its streamlined shape helps to deflect enemy fire and allows for greater maneuverability both on land and in water. Its interior workings are also somewhat organic, as it has artificial fibers that contract when electricity is applied, much like natural muscle, instead of typical hydraulics; this pseudo-muscle tissue makes it very maneuverable. It also has a nervous-system-like network of conductive nanotubes, which connect the widely dispersed sensor systems and relay commands from the cockpit to the various parts of RAY's body, automatically bypassing damaged systems and rerouting to auxiliary systems when needed. Another feature is its blood-like armor-repair nanopaste, which is secreted from valves and coagulates wherever the exterior surface is damaged. Particularly unusual is its "face", with two "eyes" and a gaping "mouth", only seen when the head armor is removed. The original version is an operational prototype labeled "MARINES", which has a cockpit for a single pilot, and a long tail intended for balance while making leaps or operating underwater. 
The entire forward interior of the cockpit is a heads-up display, allowing the pilot to look around as if there were no obstruction between him and the battlefield. The mass-production model labeled "U.S. NAVY" lacks the tail of the prototype, but it has rounded knees and only one sensory input or "eye" instead of the prototype's two. They are also painted in an olive-drab camouflage pattern.
https://en.wikipedia.org/wiki?curid=3175926
On a return trip to Sarawak in March 1992, it was found that the tree had been cut down by local people, most likely for fuel and building material. No other trees of the same variety existed in the Lundu region. The collectors then returned home with samples from other varieties of the "Calophyllum lanigerum" species, most of which failed to show any anti-HIV properties. A few surviving specimens were eventually located in the Singapore Botanic Gardens. With sufficient raw material, the scientists were able to isolate the active ingredient as (+)-Calanolide A. Since the plant source is relatively rare, a method of total synthesis was developed in 1996; the synthetic compound showed the same effectiveness against HIV as the original. In addition, the samples collected from the trip in March 1992 yielded another positive result: a sister compound, (-)-Calanolide B (also known as Costatolide), was isolated from the latex of "Calophyllum teysmannii" var. "innophylloide" trees. The Sarawak Forestry Department collaborated with the University of Illinois at Chicago (UIC) on the sustainable harvest of (-)-Calanolide B. Although Costatolide is less potent than Calanolide A, the yield from raw materials is much higher (20 to 25%) than that of Calanolide A (0.05%). Therefore, Costatolide was accepted as a replacement for Calanolide A. In June 1993, the Calophyllum Species (Prohibition of Felling and Restriction of Export) Order was issued by the Sarawak state government to ensure an adequate supply of the trees for medicinal purposes. In the same year, the international Convention on Biological Diversity (CBD) treaty came into effect with 179 countries as its signatories. However, the United States did not sign the treaty. Therefore, the Sarawak state government set up the Sarawak Biodiversity Centre (SBC) in 1997 to regulate bioprospecting activities in the state.
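The yield figures above imply a large difference in raw-material efficiency. The quick arithmetic below (plain Python, numbers taken directly from the text) makes the comparison explicit.

```python
# Back-of-the-envelope comparison of the yields quoted in the text:
# Costatolide ((-)-Calanolide B) at 20-25% from Calophyllum teysmannii
# latex, versus Calanolide A at 0.05% from the original source material.
calanolide_a_yield = 0.05 / 100     # 0.05 %
costatolide_yield_low = 20 / 100    # 20 %
costatolide_yield_high = 25 / 100   # 25 %

# Relative advantage in raw-material efficiency:
low_factor = costatolide_yield_low / calanolide_a_yield    # ~400x
high_factor = costatolide_yield_high / calanolide_a_yield  # ~500x
print(low_factor, high_factor)
```

So, per unit of raw material, the Costatolide route is roughly 400 to 500 times more productive, which helps explain why it was accepted as a replacement despite its lower potency.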
https://en.wikipedia.org/wiki?curid=14562897
Europe around the 15th century was changing quickly. The New World had just been opened up, overseas trade and plunder were pouring wealth through the international economy, and attitudes among businessmen were shifting. In 1561, a system of Industrial Monopoly Licences, similar to modern patents, had been introduced into England. But by the reign of Queen Elizabeth I, the system was reputedly much abused and used merely to preserve privileges, encouraging nothing new in the way of innovation or manufacture. When a protest was made in the House of Commons and a Bill was introduced, the Queen convinced the protesters to challenge the case in the courts. This was the catalyst for the Case of Monopolies, or "Darcy v Allin". The plaintiff, an officer of the Queen's household, had been granted the sole right of making playing cards and claimed damages for the defendant's infringement of this right. The court found the grant void and held that three characteristics of monopoly were (1) price increases, (2) quality decreases and (3) the tendency to reduce artificers to idleness and beggary. This put a temporary end to complaints about monopoly, until King James I began to grant them again. In 1623 Parliament passed the Statute of Monopolies, which for the most part excluded patent rights from its prohibitions, as well as the guilds. From King Charles I, through the civil war and to King Charles II, monopolies continued, and were considered especially useful for raising revenue. Then in 1684, in "East India Company v Sandys" it was decided that exclusive rights to trade only outside the realm were legitimate, on the grounds that only large and powerful concerns could trade in the conditions prevailing overseas. In 1710, to deal with high coal prices caused by a Newcastle Coal Monopoly, the New Law was passed.
Its provisions stated that "all and every contract or contracts, Covenants and Agreements, whether the same be in writing or not in writing...[between] persons whatsoever concerned the said Coal trade, for Ingrossing Coals, or for restraining or hindering any Person or Persons whomsoever from freely... disposing of Coals... are hereby declared to be illegal." When Adam Smith wrote the "Wealth of Nations" in 1776 he was somewhat cynical of the possibility for change.
https://en.wikipedia.org/wiki?curid=12870157
Since its creation and until 1920, the school depended on the National Ministry of Public Education. In 1920, it became dependent on the National University of the Litoral, as an annex to the Faculty of Mathematical, Physical-Chemical and Natural Sciences Applied to Industry. From that moment on, the Industrial School became, in fact, the institution that prepared, through its middle cycle, future entrants to the university careers of that Faculty. The growth of the population, driven mainly by immigration, was the main factor in the economic development of commerce and industry. This gradual industrial development brought about a considerable increase in the number of graduates who did not pursue university studies but devoted themselves fully to the labor market, with high professional qualifications. In other cases, many undertook higher education in parallel with working as technicians. As a consequence, the initial objective was broadened in the face of a new reality: it became necessary to train technicians of the highest professional level, regardless of whether they continued on to university studies. In 1961 the objectives were established unambiguously, when it was determined that the School should train neither future engineering students nor qualified workers, but the middle layer of industry executives, the link between the upper and lower levels of the occupational structure, and therefore a group with a profile of its own. This led to a substantial revision of the study plans during 1962-63, which included the following aspects: updating of plans and programs, a departmentalized teaching structure, re-equipping of laboratories and workshops, renovation of teaching materials, modification of the promotion regime, and permanent evaluation of the teaching and learning process.
https://en.wikipedia.org/wiki?curid=5364412
In January 2005, Harvard University President Lawrence Summers sparked controversy at a National Bureau of Economic Research (NBER) Conference on Diversifying the Science & Engineering Workforce. Dr. Summers offered his explanation for the shortage of women in senior posts in science and engineering, suggesting that the lower numbers of women in high-level science positions may in part be due to innate differences in abilities or preferences between men and women. Referring to research in behavioral genetics, he noted the generally greater variability among men (compared to women) on tests of cognitive abilities, leading to proportionally more men than women at both the lower and upper tails of the test score distributions. In his discussion of this, Summers said that "even small differences in the standard deviation [between genders] will translate into very large differences in the available pool substantially out [from the mean]". Summers concluded his discussion by saying: So my best guess, to provoke you, of what's behind all of this is that the largest phenomenon, by far, is the general clash between people's legitimate family desires and employers' current desire for high power and high intensity, that in the special case of science and engineering, there are issues of intrinsic aptitude, and particularly of the variability of aptitude, and that those considerations are reinforced by what are in fact lesser factors involving socialization and continuing discrimination. Although his protégée, Sheryl Sandberg, defended his remarks, and Summers apologized repeatedly, the Harvard Graduate School of Arts and Sciences passed a motion of "lack of confidence" in the leadership of Summers, under whom tenure offers to women had plummeted since he took office in 2001.
The year before he became president, Harvard extended 13 of its 36 tenure offers to women; by 2004 that number had dropped to 4 of 32, with several departments lacking even a single tenured female professor. This controversy is speculated to have contributed significantly to Summers's resignation from his position at Harvard the following year.
https://en.wikipedia.org/wiki?curid=3135183
The roots of tropical ecology can be traced to the voyages of European naturalists in the late 19th and early 20th centuries. Men who might be considered early ecologists such as Alexander Von Humboldt, Thomas Belt, Henry Walter Bates, and even Charles Darwin sailed to tropical locations and wrote extensively about the exotic flora and fauna they encountered. While many naturalists were simply drawn to the exotic nature of the tropics, some historians argue that the naturalists conducted their studies on tropical islands in order to increase the likelihood that their work might bring about social and political change. In any case, these early explorations and the subsequent writings that came from them comprise much of the early work of tropical ecology and served to spark further interest in the tropics among other naturalists. Henry Walter Bates, for example, wrote extensively about a species of toucan he encountered while traveling along the Amazon River. Bates discovered that if one toucan called out, the other surrounding toucans would mimic his or her call, and the forest would quickly fill with the sounds of toucans; this was one of the first documented studies of animal mimicry. Alexander Von Humboldt voyaged throughout South America, from Venezuela through the Andes Mountains. There, Humboldt and his associate, Aimé Bonpland, stumbled upon an interesting ecological concept. As the pair traveled from the base of the mountains to the peak, they noticed that the species of plants and animals would change according to which climatic zone they were in relative to their elevation. This simple discovery aided the theorization of the "life zone concept," which would eventually give way to the popularization of the concept of ecosystems. Another voyager, William Beebe, researched many species of birds in tropical locations and published a large gamut of academic works on his findings that greatly shaped the field of ornithology. 
According to his biographer Carol Grant Gould, "The effects William Beebe had on science... are enormous and lasting. He made an effective transition between the Victorian natural historian, content to collect and classify the natural world, and the modern experimental biologist." The work of these early pioneers not only led to an increased interest in the burgeoning field of tropical ecology, but also had far-reaching implications for scientific study as a whole.
https://en.wikipedia.org/wiki?curid=19826061
An imaging cycler microscope (ICM) is a fully automated (epi)fluorescence microscope which overcomes the spectral resolution limit, resulting in parameter- and dimension-unlimited fluorescence imaging. The principle and robotic device were described by Walter Schubert in 1997 and have been further developed with his co-workers within the human toponome project. The ICM runs robotically controlled repetitive incubation-imaging-bleaching cycles with dye-conjugated probe libraries recognizing target structures in situ (biomolecules in fixed cells or tissue sections). This allows a randomly large number of distinct pieces of biological information to be transmitted by re-using the same fluorescence channel: after bleaching, another piece of biological information is transmitted using the same dye conjugated to another specific probe, and so on. Thereby noise-reduced quasi-multichannel fluorescence images with reproducible physical, geometrical, and biophysical stabilities are generated. The resulting power of combinatorial molecular discrimination (PCMD) per data point is given by 65,536^"k", where 65,536 is the number of grey value levels (output of a 16-bit CCD camera) and "k" is the number of co-mapped biomolecules and/or subdomains per biomolecule(s). High PCMD has been shown for "k" = 100, and in principle can be extended to much higher values of "k". In contrast to traditional multichannel, few-parameter fluorescence microscopy (panel a in the figure), high PCMDs in an ICM lead to high functional and spatial resolution (panel b in the figure). Systematic ICM analysis of biological systems reveals the supramolecular segregation law, which describes the principle of order of large, hierarchically organized biomolecular networks in situ (the toponome). The ICM is the core technology for the systematic mapping of the complete protein network code in tissues (the human toponome project). The original ICM method includes any modification of the bleaching step.
Corresponding modifications have been reported for antibody retrieval and for chemical dye-quenching, which has been debated recently. The Toponome Imaging Systems (TIS) and multi-epitope-ligand cartographs (MELC) represent different stages of the ICM technological development. Imaging cycler microscopy received the American ISAC best paper award in 2008 for the three-symbol code of organized proteomes.
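The PCMD expression is simple exponential arithmetic; the short Python sketch below evaluates it to show how quickly the combinatorial discrimination power grows with the number "k" of co-mapped biomolecules (65,536 grey levels being the 16-bit camera output quoted above).

```python
# Power of combinatorial molecular discrimination (PCMD) per data point:
# PCMD = 65536 ** k, where 65536 is the number of grey-value levels of a
# 16-bit CCD camera and k is the number of co-mapped biomolecules or
# subdomains. Python's arbitrary-precision integers handle the result
# exactly even for large k.
def pcmd(k, grey_levels=65536):
    return grey_levels ** k

# Traditional few-parameter fluorescence microscopy, e.g. k = 4 channels:
few_channel = pcmd(4)

# ICM with k = 100 co-mapped biomolecules, as demonstrated in the text:
icm = pcmd(100)

# The discrimination power grows exponentially in k; for k = 100 the
# value already has several hundred decimal digits.
print(len(str(icm)))
```

Even the jump from k = 4 to k = 100 multiplies the exponent twenty-five-fold, which is the quantitative sense in which the ICM is "parameter-unlimited" compared with conventional multichannel imaging.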
https://en.wikipedia.org/wiki?curid=42191418
In his 2005 paper "Come Back and See Me Next Tuesday," Nettl asks whether ethnomusicologists can, or even should, practice a unified field methodology, as opposed to each scholar developing an individual approach. Nettl considers several factors when sampling music from different cultures. The first is that in order to discover the best representation of any culture, it is important to be able to “discern between ordinary experience and ideal,” while considering the fact that “the ‘ideal’ musician may also know and do things completely outside the ken of the rest.” Another factor is the process of selecting teachers, which depends on what the fieldworker wishes to accomplish. Regardless of which method a fieldworker uses to conduct research, fieldworkers are expected to “show respect for their material and for the people with whom they work.” As Nettl explains, ethnomusicology is a field that relies heavily on both the collection of data and the development of strong personal relationships, which often cannot be quantified by statistical data. He summarizes Bronisław Malinowski's classification of anthropological data (or, as Nettl applies it, ethnomusicological data) by outlining three types of information: 1) texts, 2) structures, and 3) the imponderable aspects of everyday life. The third type of information, Nettl claims, is the most important because it captures the ambiguity of experience that cannot be captured well through writing. He cites another attempt, made by the anthropologist Morris Friedrich, to classify field data into fourteen different categories in order to demonstrate the complexity of information gathered through fieldwork. There are a myriad of factors, many of which exist beyond the researcher's comprehension, that prevent a precise and accurate representation of what one has experienced in the field.
As Nettl notices, there is a current trend in ethnomusicology to no longer even attempt to capture a whole system or culture, but to focus on a very specific niche and try to explain it thoroughly. Nettl's question, however, still remains: should there be a uniform method for going about this type of fieldwork?
https://en.wikipedia.org/wiki?curid=80077
Capt. Dudley Pound's Admiralty Plans Division report of 6 April 1918 on operations with D.C.B.'s controlled from aircraft began: "It is considered that the time is now ripe to formulate concrete plans. It is assumed that 60 - 80 miles should be considered the maximum range possible at present under normal conditions. Owing to limitations of wave-lengths, four D.C.B.'s would be used at a time, but further relays of four boats could be sent at intervals of not less than 5 miles. The D.C.B.'s will contain an explosive charge considerably heavier than any modern Torpedo." The targets he evaluated in detail were enemy vessels at Emden, Zeebrugge and Ostend, at enemy harbours in the Adriatic, at Constantinople and its vicinity, and at sea. The report states: "As regards lock-gates, wharfs, piers, etc. These can be found at Emden, Zeebrugge, Ostend, etc. Targets on the Elbe are, at present, at rather long range unless it is feasible to employ aircraft in relays." He states: "These boats, with their heavy load of explosive, will tide over the time until suitable aircraft are produced which can carry a torpedo with a head capable of creating a decisive effect on capital ships." and "If 3 or 4 Flotillas (of four each) of these boats were prepared, a continued attack might be made on Ostend". Following a request from the Commander-in-Chief Grand Fleet on 22 July 1918, the report of the Dover Trials assessed the employment of these boats in the Bight or for fleet operations; this report of 27 September 1918 began with the declaration that "Wireless controlling gear for steering a vessel from an aircraft, ship or shore station, is an accomplished fact, and can probably be fitted to any type of vessel. Successful experiments have already been carried out with submarines, motor launches, and 40 foot coastal motor boats." Their main sources of radio control developments were Captain Ryan at the Hawkcraig Experimental Station and Captain Low in the Feltham Experimental Works.
However, the DCB Section accessed the work of others such as the Birmingham inventor George Joseph Dallison and the Russian Air Force officer Sergey Alekseevich Oulianine who was based in Paris at this time.
https://en.wikipedia.org/wiki?curid=67154389
Claude received travel grants from the Belgian government for his doctoral thesis on the transplantation of mouse cancers into rats. With this support he carried out postdoctoral research in Berlin during the winter of 1928–1929, first at the Institut für Krebsforschung, and then at the Kaiser Wilhelm Institute for Biology in Dahlem, in the tissue culture laboratory of Prof. Albert Fischer. Back in Belgium, he received a fellowship in 1929 from the Belgian American Educational Foundation (Commission for Relief in Belgium, CRB) for research in the United States. He applied to the Rockefeller Institute (now the Rockefeller University) in New York. Simon Flexner, then Director, accepted his proposal to work on the isolation and identification of the Rous sarcoma virus. In September 1929 he joined the Rockefeller Institute. In 1930, he developed the process of cell fractionation, which was groundbreaking in his time. The process consists of grinding up cells to break the membrane and release the cell's contents. He then filtered out the cell membranes and placed the remaining cell contents in a centrifuge to separate them according to mass. He divided the centrifuged contents into fractions, each of a specific mass, and discovered that particular fractions were responsible for particular cell functions. In 1938 he identified and purified, for the first time, a component of the Rous sarcoma virus, the causal agent of the carcinoma, as a "ribose nucleoprotein" (eventually named RNA). He was the first to use the electron microscope to study biological cells; earlier electron microscopes had been used only in physics research. His first electron microscopic study was on the structure of mitochondria in 1945. He was granted American citizenship in 1941. He discovered that mitochondria are the "power houses" of all cells. He also discovered cytoplasmic granules full of RNA and named them "microsomes", which were later renamed ribosomes, the protein-synthesizing machinery of the cell.
With his associate Keith Porter, he found a "lace-work" structure that was eventually proven to be a major structural feature of the interior of all eukaryotic cells. This was the discovery of the endoplasmic reticulum (Latin for "fishnet").
https://en.wikipedia.org/wiki?curid=1505548
Albert Claude was born in 1899 (1898 according to the civil register) in Longlier, a hamlet in Neufchâteau, Belgium, to Florentin Joseph Claude and Marie-Glaudice Watriquant Claude. He was the youngest of three brothers and one sister. His father was a Paris-trained baker who ran a bakery and general store in the Longlier valley near the railroad station. His mother, who developed breast cancer in 1902, died when he was seven years old. He spent his pre-school years with his ailing mother. He started his education at Longlier Primary School, a one-room school with mixed grades, all under a single teacher. In spite of the inconveniences, he described the education as "excellent." He served as a bell boy, ringing the church bell every morning at 6. Due to economic depression, the family moved in 1907 to Athus, a prosperous region with steel mills, where he entered a German-speaking school. After two years he was asked to look after an uncle in Longlier who had been disabled by a cerebral haemorrhage. He dropped out of school and for several years effectively nursed his uncle. At the outbreak of the First World War he was apprenticed to the steel mills and worked as an industrial designer. Inspired by Winston Churchill, then British Minister of War, he joined the resistance and volunteered for the British Intelligence Service, in which he served throughout the war. At the end of the war he was decorated with the Interallied Medal and given veteran status. He then wanted to continue his education, but since he had no formal secondary education, particularly the Greek and Latin required for a medicine course, he tried to join the School of Mining in Liège. By that time Marcel Florkin had become head of the Direction of Higher Education in Belgium's Ministry of Public Instruction, and under his administration a law was passed that enabled war veterans to pursue higher education without a diploma or other examinations.
As an honour to his war service, he was given admission to the University of Liège in 1922 to study medicine. He obtained his degree of Doctor of Medicine in 1928.
https://en.wikipedia.org/wiki?curid=1505548
Although the experimental protocol had not been published, physicists in several countries attempted, and failed, to replicate the excess heat phenomenon. The first paper submitted to "Nature" reproducing excess heat, although it passed peer review, was rejected because most similar experiments were negative and there were no theories that could explain a positive result; this paper was later accepted for publication by the journal "Fusion Technology". Nathan Lewis, professor of chemistry at the California Institute of Technology, led one of the most ambitious validation efforts, trying many variations on the experiment without success, while CERN physicist Douglas R. O. Morrison said that "essentially all" attempts in Western Europe had failed. Even those reporting success had difficulty reproducing Fleischmann and Pons' results. On 10 April 1989, a group at Texas A&M University published results of excess heat and later that day a group at the Georgia Institute of Technology announced neutron production—the strongest replication announced up to that point due to the detection of neutrons and the reputation of the lab. On 12 April Pons was acclaimed at an ACS meeting. But Georgia Tech retracted their announcement on 13 April, explaining that their neutron detectors gave false positives when exposed to heat. Another attempt at independent replication, headed by Robert Huggins at Stanford University, which also reported early success with a light water control, became the only scientific support for cold fusion in 26 April US Congress hearings. But when he finally presented his results he reported an excess heat of only one degree Celsius, a result that could be explained by chemical differences between heavy and light water in the presence of lithium. He had not tried to measure any radiation and his research was derided by scientists who saw it later. 
For the next six weeks, competing claims, counterclaims, and suggested explanations kept what was referred to as "cold fusion" or "fusion confusion" in the news.
https://en.wikipedia.org/wiki?curid=7463
Psychedelics cause perceptual and cognitive distortions without delirium. The state of intoxication is often called a "trip". Onset is the first stage after an individual ingests (LSD, psilocybin, ayahuasca, and mescaline) or smokes (dimethyltryptamine) the substance. This stage may consist of visual effects, with an intensification of colors and the appearance of geometric patterns that can be seen with one's eyes closed. This is followed by a plateau phase, where the subjective sense of time begins to slow and the visual effects increase in intensity. The user may experience synesthesia, a crossing-over of sensations (for example, one may "see" sounds and "hear" colors). These outward sensory effects have been referred to as the "mystical experience", and current research suggests that this state could be beneficial to the treatment of some mental illnesses, such as depression and possibly addiction. In instances where patients have seen no improvement from the use of antidepressants, serotonergic hallucinogens have been observed to be rather effective in treatment. In addition to the sensory-perceptual effects, hallucinogenic substances may induce feelings of depersonalization, emotional shifts to a euphoric or anxious/fearful state, and a disruption of logical thought. Hallucinogens are classified chemically as either indolamines (specifically tryptamines), sharing a common structure with serotonin, or as phenethylamines, which share a common structure with norepinephrine. Both classes of these drugs are agonists at the 5-HT2 receptors; this is thought to be the central component of their hallucinogenic properties. Activation of the 5-HT2A receptor may be particularly important for hallucinogenic activity. However, repeated exposure to hallucinogens leads to rapid tolerance, likely through down-regulation of these receptors in specific target cells.
Research suggests that hallucinogens affect many of these receptor sites around the brain and that through these interactions, hallucinogenic substances may be capable of inducing positive introspective experiences. The current research implies that many of the effects that can be observed occur in the occipital lobe and the frontomedial cortex; however, they also present many secondary global effects in the brain that have not yet been connected to the substance's biochemical mechanism of action.
https://en.wikipedia.org/wiki?curid=45621
After World War II, Canadian-born John Kenneth Galbraith (1908–2006) became one of the standard bearers for pro-active government and liberal-democrat politics. In "The Affluent Society" (1958), Galbraith argued that voters reaching a certain material wealth begin to vote against the common good. He also argued that the "conventional wisdom" of the conservative consensus was not enough to solve the problems of social inequality. In an age of big business, he argued, it is unrealistic to think of markets of the classical kind. Corporations set prices and use advertising to create artificial demand for their own products, distorting people's real preferences. Consumer preferences actually come to reflect those of corporations (a "dependence effect") and the economy as a whole is geared to irrational goals. In "The New Industrial State" Galbraith argued that economic decisions are planned by a private bureaucracy, a technostructure of experts who manipulate marketing and public relations channels. This hierarchy is self-serving, profits are no longer the prime motivator, and even managers are not in control. Because they are the new planners, corporations detest risk and require steady economic growth and stable markets. They recruit governments to serve their interests with fiscal and monetary policy, for instance adhering to monetarist policies which enrich money-lenders in the City through increases in interest rates. While the goals of an affluent society and a complicit government serve the irrational technostructure, public space is simultaneously impoverished. Galbraith paints the picture of stepping from penthouse villas onto unpaved streets, from landscaped gardens to unkempt public parks. In "Economics and the Public Purpose" (1973) Galbraith advocates a "new socialism" as the solution: nationalising military production and public services such as health care, and introducing disciplined salary and price controls to reduce inequality.
https://en.wikipedia.org/wiki?curid=7881361
Another pioneer in the field of activity-dependent plasticity is Michael Merzenich, a professor of neuroscience at the University of California, San Francisco. One of his contributions includes mapping out and documenting the reorganization of cortical regions after alterations due to plasticity. While assessing the recorded changes in the primary somatosensory cortex of adult monkeys, he looked at several features of the data, including how altered schedules of activity from the skin remap to cortical modeling, and other factors that affect the representational remodeling of the brain. His findings from these studies have since been applied to youth development and children with language-based learning impairments. Through many studies involving adaptive computer-based training exercises, he has successfully designed methods to improve their temporal processing skills. These adaptive measures include word-processing games and comprehension tests that engage multiple regions of the brain. The results later translated into his development of the Fast ForWord program in 1996, which aims to enhance the cognitive skills of children between kindergarten and twelfth grade by focusing on developing "phonological awareness". It has proven very successful at helping children with a variety of cognitive complications. In addition, it has led to in-depth studies of specific conditions, such as autism and intellectual disability, and their causes. Alongside a team of scientists, Merzenich helped to provide evidence that autism involves monochannel perception, in which a stronger stimulus-driven representation dominates behavior and weaker stimuli are practically ignored in comparison.
|
https://en.wikipedia.org/wiki?curid=20510214
| 1,216,940 |
77,470 |
Under the leadership of Hale, Noyes, and Millikan (aided by the booming economy of Southern California), Caltech grew to national prominence in the 1920s and concentrated on the development of Roosevelt's "Hundredth Man". On November 29, 1921, the trustees declared it to be the express policy of the institute to pursue scientific research of the greatest importance and at the same time "to continue to conduct thorough courses in engineering and pure science, basing the work of these courses on exceptionally strong instruction in the fundamental sciences of mathematics, physics, and chemistry; broadening and enriching the curriculum by a liberal amount of instruction in such subjects as English, history, and economics; and vitalizing all the work of the Institute by the infusion in generous measure of the spirit of research". In 1923, Millikan was awarded the Nobel Prize in Physics. In 1925, the school established a department of geology and hired William Bennett Munro, then chairman of the division of History, Government, and Economics at Harvard University, to create a division of humanities and social sciences at Caltech. In 1928, a division of biology was established under the leadership of Thomas Hunt Morgan, the most distinguished biologist in the United States at the time, and discoverer of the role of genes and the chromosome in heredity. In 1930, Kerckhoff Marine Laboratory was established in Corona del Mar under the care of Professor George MacGinitie. In 1926, a graduate school of aeronautics was created, which eventually attracted Theodore von Kármán. Kármán later helped create the Jet Propulsion Laboratory, and played an integral part in establishing Caltech as one of the world's centers for rocket science. In 1928, construction of the Palomar Observatory began.
|
https://en.wikipedia.org/wiki?curid=5786
| 77,441 |
736,880 |
One of the first devices designed to look like and function as a virtual reality headset was the stereoscope. Invented in the 1830s during the early days of photography, it used a slightly different image in each eye to create a kind of 3D effect. As photography continued to develop in the late 1800s, however, stereoscopes became increasingly obsolete. Immersive technology became more widely available in 1957, when Morton Heilig invented the Sensorama, a cinematic experience that included speakers, fans, smell generators, and a vibrating chair to immerse the viewer in the movie. The VR headsets seen today owe much to The Sword of Damocles, invented in 1968, which allowed users to connect their headsets to a computer rather than a camera. In 1991, Sega launched the Sega VR headset, intended for arcade and home use, but only the arcade version was released due to technical difficulties. Augmented reality began to develop rapidly in the 1990s when Louis Rosenberg created Virtual Fixtures, the first fully immersive augmented reality system, built for the Air Force. The invention enhanced operator performance of manual tasks in remote locations by using two robot controls in an exoskeleton. The first introduction of augmented reality displayed to a live audience was in 1998, when the NFL first displayed a virtual yellow line to represent the line of scrimmage/first down. In 1999, Hirokazu Kato developed ARToolKit, an open-source library for the development of AR applications, which allowed people to experiment with AR and release new and improved applications. Later, in 2009, Esquire was the first magazine to use a QR code on its front cover to provide additional content. Once the Oculus came out in 2012, it revolutionized virtual reality, eventually raising 2.4 million dollars and releasing pre-production models to developers. 
Facebook purchased Oculus for 2 billion dollars in 2014, which showed the world the upward trajectory of VR. In 2013, Google announced plans to develop its first AR headset, Google Glass. Production stopped in 2015 due to privacy concerns, but the device relaunched in 2017 exclusively for enterprise use. In 2016, Pokémon Go took the world by storm and became one of the most downloaded apps of all time; it was the first augmented reality game accessible through one's phone.
|
https://en.wikipedia.org/wiki?curid=10499965
| 736,492 |
495,053 |
Around 1904–1905, the works of Hendrik Lorentz, Henri Poincaré and finally Albert Einstein's special theory of relativity excluded the possibility of propagation of any effects faster than the speed of light. It followed that Newton's law of gravitation would have to be replaced with another law, compatible with the principle of relativity, while still obtaining the Newtonian limit for circumstances where relativistic effects are negligible. Such attempts were made by Henri Poincaré (1905), Hermann Minkowski (1907) and Arnold Sommerfeld (1910). In 1907 Einstein came to the conclusion that achieving this required a successor to special relativity. From 1907 to 1915, Einstein worked towards a new theory, using his equivalence principle as a key concept to guide his way. According to this principle, a uniform gravitational field acts equally on everything within it and, therefore, cannot be detected by a free-falling observer. Conversely, all local gravitational effects should be reproducible in a linearly accelerating reference frame, and vice versa. Thus, gravity acts like a fictitious force such as the centrifugal force or the Coriolis force, which result from being in an accelerated reference frame; all fictitious forces are proportional to the inertial mass, just as gravity is. To reconcile gravity with special relativity and to incorporate the equivalence principle, something had to be sacrificed; that something was the long-held classical assumption that our space obeys the laws of Euclidean geometry, e.g., that the Pythagorean theorem is true experimentally. Einstein used a more general geometry, pseudo-Riemannian geometry, to allow for the curvature of space and time that was necessary for the reconciliation; after eight years of work (1907–1915), he succeeded in discovering the precise way in which space-time should be curved in order to reproduce the physical laws observed in Nature, particularly gravitation. 
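The geometric shift described here can be made concrete with the standard line-element notation (a sketch in conventional textbook form, not drawn from the source): in special relativity the spacetime interval has a fixed, Pythagorean-like form, while in general relativity it is governed by a position-dependent metric.

```latex
% Flat (Minkowski) spacetime: a "Pythagorean" interval, the same everywhere
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2

% Curved (pseudo-Riemannian) spacetime: the metric g_{\mu\nu}(x) varies
% from point to point, encoding gravity as geometry
ds^2 = g_{\mu\nu}(x)\,dx^{\mu}\,dx^{\nu}
```

The first form is what "Euclidean geometry plus time" gives; abandoning it for the second is exactly the sacrifice the passage describes.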
Gravity is distinct from the fictitious centrifugal and Coriolis forces in the sense that the curvature of spacetime is regarded as physically real, whereas the fictitious forces are not regarded as forces. The very first solutions of his field equations explained the anomalous precession of Mercury and predicted an unusual bending of light, which was confirmed "after" his theory was published. These solutions are explained below.
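The Mercury result mentioned above can be checked numerically. The leading-order general-relativistic perihelion advance per orbit is Δφ = 6πGM/(c²a(1−e²)); the sketch below evaluates it with standard textbook values for the Sun and Mercury (the constants are conventional reference figures, not taken from the source):

```python
import math

# Textbook constants (assumed reference values, not from the source)
GM_SUN = 1.32712e20   # Sun's standard gravitational parameter, m^3/s^2
C = 2.99792458e8      # speed of light, m/s
A = 5.7909e10         # Mercury's semi-major axis, m
E = 0.2056            # Mercury's orbital eccentricity
PERIOD_DAYS = 87.969  # Mercury's orbital period, days

# GR perihelion advance per orbit (radians), leading order
dphi = 6 * math.pi * GM_SUN / (C**2 * A * (1 - E**2))

# Accumulate over a Julian century and convert to arcseconds
orbits_per_century = 36525.0 / PERIOD_DAYS
arcsec = dphi * orbits_per_century * (180.0 / math.pi) * 3600.0

print(f"{arcsec:.1f} arcsec/century")  # ~43, the observed anomaly
```

The result, about 43 arcseconds per century, is precisely the unexplained residual in Mercury's observed precession that Newtonian theory could not account for.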
|
https://en.wikipedia.org/wiki?curid=11694610
| 494,797 |
634,356 |
Owing to its ease and economy of cultivation, the Neff strain of "A. castellanii", discovered in a pond in Golden Gate Park in the 1960s, has been effectively used as a classic model organism in the field of cell biology. From just 30 L of simple medium inoculated with "A. castellanii", about 1 kg of cells can be obtained after several days of aerated culture at room temperature. Pioneered in the laboratory of Edward D. Korn at the National Institutes of Health (NIH), many important biological molecules have been discovered and their pathways elucidated using the "Acanthamoeba" model. Thomas Dean Pollard applied this model at the NIH, Harvard Medical School, Johns Hopkins University School of Medicine, and the Salk Institute for Biological Studies to discover and characterize many proteins that are essential for cell motility, not only in amoebae, but also in many other eukaryotic cells, especially those of the human nervous and immune systems, the developing embryo, and cancer cells. "Acanthamoeba" also has served as a model to study the evolution of certain G-proteins. This unicellular eukaryote expresses a few GPCRs on its cell membrane that serve a vital role for the microorganism; structural-homology bioinformatics tools have been used to show the presence of a homolog of the human M1 muscarinic receptor in "A. castellanii". Blocking these muscarinic receptors in past studies has proven to be amoebicidal in "Acanthamoeba" spp. More recently, voltage-gated calcium channels in "Acanthamoeba" spp. (CavAc) have been reported to have similarities with human voltage-gated calcium channels such as TPC-1 and L-type calcium channels, and to respond to Ca-channel blockers such as loperamide. This model microbe has been studied to understand complex neurodegenerative states including Alzheimer's disease. Scientists have isolated the neurotransmitter acetylcholine in "Acanthamoeba", along with the enzymatic machinery needed for its synthesis.
|
https://en.wikipedia.org/wiki?curid=369857
| 634,018 |
925,952 |
In 2017, the U.S. Army approved a requirement for 1,111 M3E1 units to be fielded to soldiers as part of an Urgent Materiel Release. The M3E1 is part of the Product Manager Crew Served Weapon portfolio. A key benefit of the M3E1 is that it can fire multiple types of rounds, giving soldiers increased capability on the battlefield. By using titanium, the updated M3E1, based on the M3A1 introduced in 2014, is more than six pounds lighter. The M3E1 is also 2.5 inches shorter and has an improved carrying handle, shoulder padding and an improved sighting system that can be adjusted for better comfort without sacrificing performance. A wiring harness included in the M3E1 configuration provides a foregrip controller and a programmable fuze setter for an interchangeable fire control system. For added safety and cost savings, an automatic round counter enables soldiers and logisticians to accurately track the service life of each weapon. The M3E1 uses the same family of ammunition as the M3, which has been successfully tested. In November 2017, the U.S. Marine Corps announced plans to procure the M3E1 MAAWS: 1,200 M3E1s would be acquired, with one fielded to every infantry squad. In addition to infantry use, the Marines are considering it as a replacement for the Mk 153 SMAW in combat engineer squads. The weapons perform similar functions, and the improvements incorporated into the new M3E1 place it in the same size and approximate weight class as the SMAW. While the SMAW weighs less when loaded, the MAAWS has a greater variety of ammunition available and a maximum effective range of 1,000 meters, twice that of the SMAW. The Marines plan to test both weapons' effectiveness against bunkers to inform their decision.
|
https://en.wikipedia.org/wiki?curid=479277
| 925,466 |
826,013 |
The C9 was a development of Sauber's previous C8 design, retaining a monocoque that largely consisted of aluminium, although considerably stiffer and with numerous other improvements. The rear suspension changed from vertically positioned spring/damper units arranged over the top of the gearbox to a horizontal layout aligned with the longitudinal axis of the car. Aerodynamic changes included repositioning the combined oil/water radiator to the nose of the car, which allowed the use of a modified splitter plate. Commensurate with the repositioning of the radiators, the large NACA ducts were removed from the top of the door sills. The rear deck was considerably re-profiled and the rear wing was now mounted solely on a central support. Aerodynamically, the car had two configurations: one for sprint circuits and a low-drag version for the 5.8-kilometre Mulsanne Straight at Le Mans. In its sprint configuration, it produced of downforce at 320 km/h (200 mph) while generating of drag. The sprint configuration had an L/D ratio of 4:1, while the low-drag version's was around 3:1. The early engines were again prepared by the Swiss engine specialist Heini Mader, though this is now known to have been a cover for Mercedes' back-door involvement with the project later on. The engine had been progressively lightened with the use of a new crankshaft, higher-efficiency KKK turbochargers and a liner-less block. It was a semi-stressed part of the chassis and ran a dry sump. There were no special qualifying engines, and on 2.2 bar of boost it was said to be rated at "almost ". Maximum race boost was 1.9 bar. Maximum RPM was 7,000, but drivers generally kept to 6,500 during races. The torque curve was almost uniform between 3,000 and 6,000 rpm, giving the engine plenty of flexibility. The engine retained a cross-plane crankshaft and the firing order was 1-5-4-8-6-3-7-2. 
Later M119HL engines were sourced from the Mercedes engine facility at Untertürkheim, supervised by Hermann Hiereth. The addition of 16 valve heads in 1989 took power up by about to around at 1.6 bar and 7,000 RPM. The increase in fuel efficiency meant absolute power could also be taken from just under 800 hp with 2.2 bar of boost to about .
|
https://en.wikipedia.org/wiki?curid=2059863
| 825,569 |
1,966,402 |
The interfacing mechanism is contained inside a common EI source, like that found in any GC-MS system. The liquid phase from a nano-HPLC column is admitted through the capillary column port, where the connection tubing and the nebulizer are first introduced and sealed to prevent vacuum loss. The mechanism is based on the formation of an aerosol under high-vacuum conditions, followed by rapid droplet desolvation and final vaporization of the solute prior to ionization. The process is rapid and complete, which reduces the chance of thermal decomposition, as shown in the Figure, where a scheme of the interface is presented. The core of the interface is the micro-nebulizer. The nebulizer tip protrudes into the ion source so that the spray expansion is completely contained inside the ion volume. The eluate emerges as a liquid phase at a flow rate of 300–500 nL/min, and any premature in-tube solvent evaporation is prevented by conveniently insulating the nebulizer and the connecting tubing thermally from the surrounding source heat. The high temperature of the ion source, between 300 and 400 °C, has a double function: to compensate for the latent heat of vaporization during droplet desolvation, and to convert the solute into the gas phase. If all components of this simple interface are correctly placed and sized, then each substance separated by the nano-column is smoothly converted into the gas phase, the peak profile is faithfully reproduced, and high-quality mass spectra are generated. 
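A back-of-the-envelope estimate shows why the hot source block can easily supply the latent heat of vaporization. The sketch below assumes a purely aqueous eluent (an assumption for illustration; the flow rate is from the text, the thermal constants are standard reference values):

```python
# Estimate of the power needed to vaporize the nano-LC eluent,
# assuming the mobile phase is water (an illustrative assumption)
FLOW_NL_MIN = 500.0    # upper end of the stated 300-500 nL/min range
DENSITY = 1000.0       # kg/m^3, water
LATENT_HEAT = 2.26e6   # J/kg, latent heat of vaporization of water

flow_m3_s = FLOW_NL_MIN * 1e-12 / 60.0   # nL/min -> m^3/s (1 nL = 1e-12 m^3)
mass_flow = flow_m3_s * DENSITY          # kg/s
power = mass_flow * LATENT_HEAT          # watts

print(f"{power * 1e3:.1f} mW")  # roughly 20 mW
```

Even at the highest stated flow, only about 20 mW is needed, a negligible load for a source block held at 300–400 °C, which is consistent with the smooth droplet desolvation the text describes.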
Major advantages of this technical solution are the following: 1) it delivers high-quality, fully library-matchable mass spectra of most sub-1 kDa molecules amenable to HPLC; 2) it is a chemical-ionization-free interface (unless operated intentionally in that mode), with accurate reproduction of the expected isotope ion abundances; 3) response is never influenced by matrix components in the sample or in the mobile phase; 4) it can be considered a universal detector for small molecules, because response is not related to compound polarity.
|
https://en.wikipedia.org/wiki?curid=24897052
| 1,965,273 |
1,593,152 |
The microneurography technique allows the recording of impulse activity of individual nerve fibers with absolute resolution in attentive human subjects. The subject is thus able to cooperate in various kinds of tests while the exact and complete information carried by the individual nerve fiber is monitored and made available for analysis of correlations between neural activity and physical or mental events. On the other hand, the particular physical conditions, with a microelectrode floating freely in the tissue, preclude brisk or large movements, because the exact electrode position is easily jeopardized. The experiment can be time-consuming when targeting deeply located nerves, because the search procedure can be particularly demanding. However, accessing superficial subcutaneous nerves, such as the superficial peroneal nerve on the dorsum of the foot, is very rapid. This has allowed a technique initially felt to be unsuited as a diagnostic test to be applied for clinical purposes. Microneurography's strength lies in its unique power for exploring normal neural mechanisms as well as the pathophysiological conditions of various neurological disorders. Microneurography records intact axons in vivo and is minimally invasive; there have been no reports of persistent nerve damage. As a result, repeated recordings with the same subject are possible and longitudinal observations can be made. The technique has recently been used in clinical trials of new anti-neuropathic pain agents. During the recording, it is important to create an atmosphere of psychological confidence and to observe the subject's reactions carefully so that the procedure can be adjusted accordingly. The technique requires training and skill, and it is highly recommended to take up the method in an experienced laboratory where it is regularly used.
|
https://en.wikipedia.org/wiki?curid=7360700
| 1,592,255 |
1,703,445 |
The analysis of ancient genomes of anatomically modern humans has, in recent years, revolutionized the way we study population migrations, transformations and evolution. Nevertheless, much remains unknown. The first and most obvious problem with this approach, which will be partially overcome by continuing improvements in ancient DNA extraction techniques, is the difficulty of recovering well-preserved ancient genomes, a challenge particularly acute in Africa and Asia, where temperatures are higher than in colder regions of the world. Africa is, moreover, the continent that harbors the most genetic diversity. Besides DNA degradation, exogenous contamination also limits paleogenomic sequencing and assembly. Because we do not possess ancient DNA from the time and region inhabited by the original ancestors of present-day non-African populations, we still know little about their structure and location. The second and more important challenge is the recovery of DNA from early modern humans (100,000–200,000 BP). These data, together with a greater number of archaic genomes to analyze and knowledge of the timing and distribution of archaic genetic admixture, will allow scientists to reconstruct the history of our species more easily. Indeed, collecting more data about our genetic history will allow us to track human evolution not only in terms of migrations and natural selection, but also in terms of culture. In the next decade the paleogenomics research field is going to focus mainly on three topics: defining past human interactions in fine-scale detail through denser sampling; understanding how those interactions contributed to the agricultural transition through analysis of DNA from understudied regions; and quantifying the contribution of natural selection to present-day phenotypes. 
To interpret all these data, geneticists will need to cooperate with historians, as they have already done with anthropologists and archaeologists.
|
https://en.wikipedia.org/wiki?curid=58990374
| 1,702,489 |