doc_id (int32: 18 to 2.25M) | text (string: lengths 245 to 2.96k) | source (string: lengths 38 to 44) | __index_level_0__ (int64: 18 to 2.25M) |
---|---|---|---|
1,633,624 |
Thus not only did Darwin's theory give a new basis to the study of organic structure, but, whilst rendering the general theory of organic evolution equally acceptable and necessary, it explained the existence of low and simple forms of life as survivals of the earliest ancestry of the more highly complex forms, and revealed the classifications of the systematist as unconscious attempts to construct the genealogical tree or pedigree of plants and animals. Finally, it brought the simplest living matter or formless protoplasm before the mental vision as the starting point whence, by the operation of necessary mechanical causes, the highest forms have been evolved, and it rendered unavoidable the conclusion that this earliest living material was itself evolved by gradual processes, the result also of the known and recognized laws of physics and chemistry, from material which we should call not living. It abolished the conception of life as an entity above and beyond the common properties of matter, and led to the conviction that the marvellous and exceptional qualities of that which we call living matter are nothing more nor less than an exceptionally complicated development of those chemical and physical properties which we recognize in a gradually ascending scale of evolution in the carbon compounds, containing nitrogen as well as oxygen, sulphur and hydrogen as constituent atoms of their enormous molecules. Thus mysticism was finally banished from the domain of biology, and zoology became one of the physical sciences: the science which seeks to arrange and discuss the phenomena of animal life and form, as the outcome of the operation of the laws of physics and chemistry.
|
https://en.wikipedia.org/wiki?curid=30410498
| 1,632,701 |
1,294,341 |
In addition to surgery, Pott focused on public health, turning his attention to the challenges of chimney-sweeps in his community. He wrote, "in their early infancy, they are most frequently treated with great brutality, and almost starved with cold and hunger; they are thrust up narrow, and sometimes hot chimneys, where they are bruised, burned and almost suffocated; and when they get to puberty, become peculiarly liable to a most noisome, painful, and fatal disease." Pott's approach was unique amongst his contemporaries in that he did not simply note an association but approached chimney sweeps' carcinoma from a causal perspective. His work later helped identify soot as the disease-causing agent. Pott did not figure this out on his own, but rather pointed research in the right direction. During Pott's time, there were many discussions pertaining to the cause of scrotal cancer. James Earle believed that the cancer was caused by soot entering and residing in the rugae of the scrotum. He famously argued this point based on a trend in which gardeners who used soot to kill slugs had developed skin carcinoma on their hands. In 1878, George Lawson suggested that the cancer was caused by the friction generated by the chimney sweeper's overalls against the scrotum while sifting through soot. This idea was explored by Passey in 1922, who demonstrated that it was the ethereal extract of soot which was capable of inducing sarcoma. Furthermore, Pott identified that prepubescent boys were most vulnerable to the carcinoma. Pott rose to fame for drawing these connections between occupational hazards and cancer, even though the mechanism was not fully understood at the time. Pott's "Chirurgical Observations" provided a framework that shaped the modern understanding of "occupational cancers."
|
https://en.wikipedia.org/wiki?curid=189427
| 1,293,630 |
1,031,100 |
LiDAR has frequently been used by members of the NTS to search for tall tree sites and to locate areas within a site with the greatest potential for tall tree finds. They have found LiDAR to be a useful tool for scouting locations prior to visits, but the values need to be ground-truthed for accuracy. Michael Taylor writes: "In the flat areas like Humboldt Redwoods State Park the LiDAR was usually within accuracy and tended to be on the conservative side. For steep hill areas the LIDAR often over-estimated by more due to the fact that redwoods tend to lean down-hill in notch canyons as they seek the open areas for more light. If the tree grows near a ravine this over-estimation from LiDAR was more the norm than the exception. Perhaps only 50% of the LiDAR hit list trees from Redwood National Park were actually trees over . From Humboldt Redwoods State Park nearly 100% of the LiDAR returns that came back as being over were actually trees over when confirmed from the ground or climber deployed tape. It depends on the terrain and how well the ground/trunk interface was captured. For steep and dense canopies the ground determination is a great challenge." An overview of using LiDAR for tree measurement was written by Paul Jost on the NTS website. Data for much of the United States can be downloaded from the USGS or from various state agencies. Several different data viewers are available. Isenburg and Sewchuk have developed software for visualizing LiDAR in Google Earth. Another viewer is called Fusion, a LiDAR viewing and analysis software tool developed by the Silviculture and Forest Models Team, Research Branch of the US Forest Service. Steve Galehouse provides a step-by-step guide to using the Fusion software to supplement the instructions on the Fusion website itself.
|
https://en.wikipedia.org/wiki?curid=39003927
| 1,030,564 |
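Where the passage above describes turning LiDAR returns into candidate tall-tree locations, the general idea can be sketched as a canopy height model (first-return surface minus ground surface). The Python fragment below is a minimal illustration under stated assumptions, not the NTS workflow: the function name, the raster inputs, and the 85 m threshold are all hypothetical, and, as Taylor's comments stress, every candidate still requires ground-truthing.

```python
import numpy as np

def tall_tree_candidates(dsm, dtm, height_threshold_m=85.0):
    """Flag cells whose canopy height (DSM minus DTM) exceeds a threshold.

    'dsm' and 'dtm' are assumed to be co-registered first-return and
    ground-elevation rasters in meters; the threshold is illustrative.
    """
    chm = dsm - dtm                      # canopy height model
    rows, cols = np.where(chm >= height_threshold_m)
    return [(int(r), int(c), float(chm[r, c])) for r, c in zip(rows, cols)]

# Example with tiny synthetic grids
dsm = np.array([[90.0, 40.0, 88.0], [30.0, 95.0, 20.0], [10.0, 15.0, 12.0]])
dtm = np.array([[ 2.0,  1.0,  3.0], [ 1.0,  4.0,  2.0], [ 0.5,  1.0,  1.5]])
print(tall_tree_candidates(dsm, dtm))
```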
93,342 |
Some effective altruists start non-profit or for-profit organizations to implement cost-effective ways of doing good. On the non-profit side, for example, Michael Kremer and Rachel Glennerster conducted randomized controlled trials in Kenya to find out the best way to improve students' test scores. They tried new textbooks and flip charts, as well as smaller class sizes, but found that the only intervention that raised school attendance was treating intestinal worms in children. Based on their findings, they started the Deworm the World Initiative. From 2013 to August 2022, GiveWell designated Deworm the World as a top charity based on their assessment that mass deworming is "generally highly cost-effective"; however, there is substantial uncertainty about the benefits of mass deworming programs, with some studies finding long-term effects and others not. The Happier Lives Institute conducts research on the effectiveness of cognitive behavioral therapy (CBT) in developing countries; Canopie develops an app that provides cognitive behavioural therapy to women who are expecting or postpartum; Giving Green analyzes and ranks climate interventions for effectiveness; the Fish Welfare Initiative works on improving animal welfare in fishing and aquaculture; and the Lead Exposure Elimination Project works on reducing lead poisoning in developing countries.
|
https://en.wikipedia.org/wiki?curid=36903454
| 93,301 |
1,003,171 |
This is an iterative CT reconstruction algorithm with edge-preserving TV regularization to reconstruct CT images from highly undersampled data obtained in low-dose CT through low tube current levels (milliamperes). In order to reduce the imaging dose, one approach is to reduce the number of x-ray projections acquired by the scanner detectors. However, the insufficient projection data used to reconstruct the CT image can cause streaking artifacts. Furthermore, using these insufficient projections in standard TV algorithms makes the problem under-determined, leading to infinitely many possible solutions. In this method, an additional penalty weighting function is assigned to the original TV norm. This allows for easier detection of sharp discontinuities in intensity in the images and thereby adapts the weight to preserve the recovered edge information during the process of signal/image reconstruction. The parameter formula_25 controls the amount of smoothing applied to the pixels at the edges to differentiate them from the non-edge pixels. The value of formula_25 is changed adaptively based on the histogram of the gradient magnitude, so that a certain percentage of pixels have gradient values larger than formula_25. The edge-preserving total variation term thus becomes sparser, and this speeds up the implementation. A two-step iteration process known as the forward-backward splitting algorithm is used. The optimization problem is split into two sub-problems, which are then solved with the conjugate gradient least squares method and the simple gradient descent method, respectively. The method is stopped when the desired convergence has been achieved or when the maximum number of iterations is reached.
|
https://en.wikipedia.org/wiki?curid=11403316
| 1,002,653 |
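The adaptive edge weighting described above can be illustrated in a few lines of Python. This is a minimal sketch, not the published reconstruction algorithm: the Gaussian weight form, the edge_fraction parameter, and the function name are assumptions; only the idea of choosing the threshold (the role played by formula_25) as a percentile of the gradient-magnitude histogram follows the passage.

```python
import numpy as np

def edge_preserving_tv(u, edge_fraction=0.1):
    """Toy edge-preserving (weighted) TV penalty for a 2-D image u."""
    # Forward differences (repeat the last row/column so shapes match)
    gx = np.diff(u, axis=1, append=u[:, -1:])
    gy = np.diff(u, axis=0, append=u[-1:, :])
    grad_mag = np.hypot(gx, gy)

    # Adaptive threshold: 'edge_fraction' of pixels have a larger gradient
    sigma = np.quantile(grad_mag, 1.0 - edge_fraction)

    # Down-weight the penalty where the gradient is large (likely an edge),
    # so sharp discontinuities are smoothed less than flat regions.
    weights = np.exp(-(grad_mag / (sigma + 1e-12)) ** 2)
    return float(np.sum(weights * grad_mag))

# Example: the penalty across a sharp step is smaller than an unweighted TV would give
image = np.zeros((64, 64)); image[:, 32:] = 1.0
print(edge_preserving_tv(image))
```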
1,519,497 |
In recent years, Giordano has dedicated numerous efforts to studying the relationship between cancer and environmental pollution in the Italian region of Campania, linking his career as a researcher to that of a science communicator. He was among the first to report an increased incidence of various types of cancer in populations near illegal toxic waste sites. Not only has he published scientific articles on this subject, but he has also committed himself to making these data known through two books on the subject, "Campania, terra di veleni" and "Monnezza di Stato", published by Denaro Libri and Minerva respectively. He also launched a petition to protect the environment, signed by over 500 researchers and people from various professional sectors. He has also been the promoter of numerous non-profit initiatives aimed at safeguarding the environment and human health (for more on Antonio Giordano's contribution to the topic of health and the environment, see his interview with the Italian Scientists and Scholars in North America Foundation). Recently, as evidence of his commitment in this direction, Giordano was appointed Technical Consultant of the Public Prosecutor's Office of Avellino for the Isochimica case and Scientific Director of the Mediterranea - Food and Wine Academy of Naples. He has carried out scientific studies highlighting the anticancer properties of the tomato. Most recently he was the author and promoter of the Veritas study, a pilot study aimed at clarifying the link between the onset of diseases and exposure to environmental pollutants.
|
https://en.wikipedia.org/wiki?curid=44199904
| 1,518,639 |
203,876 |
Alternatively, emergency medicine in urban areas consists of diverse provider groups, including physicians, physician assistants, nurse practitioners and registered nurses, who coordinate with specialists in both inpatient and outpatient facilities to address patients' needs, more specifically in the ED. For all systems, regardless of funding source, EMTALA mandates that EDs conduct a medical examination for anyone who presents at the department, irrespective of ability to pay. Non-profit hospitals and health systems – as required by the ACA – must provide a certain threshold of charity care "by actively ensuring that those who qualify for financial assistance get it, by charging reasonable rates to uninsured patients and by avoiding extraordinary collection practices." While there are limitations, this mandate provides support to many in need. That said, despite policy efforts and increased funding and federal reimbursement in urban areas, the triple aim (of improving patient experience, enhancing population health, and reducing the per-capita cost of care) remains a challenge without collaboration between providers and payers to increase access to preventive care and decrease ED usage. As a result, many experts support the notion that emergency medical services should only serve immediate risks in urban and rural areas.
|
https://en.wikipedia.org/wiki?curid=52974
| 203,771 |
1,978,469 |
Sorger's research involves multiple systems pharmacology approaches to cancer. The first focuses on preclinical pharmacology, the stage at which the molecular mechanisms of disease are studied and new drugs sought. An investigation into the causes of irreproducibility in drug-response measurements led to a series of conceptual, computational, and experimental improvements in scoring drug action that are now widely used in academe and industry and have enabled the discovery of new mechanisms of action for existing drugs. Recent work has focused on deep learning as a means to further understand complex protein networks and drug mechanisms. The second project involves developing methods to study drug mechanism at scale in patients through highly multiplexed tissue imaging of the biopsies routinely acquired from patients (particularly cancer patients). This has led to a very rapidly growing tissue imaging and digital histology program that is part of the US National Cancer Institute Moonshot and promises to substantially advance precision cancer care. The third project involves studying the clinical trial record to understand how successful and failed trials differ. An early success was the discovery that the great majority of approved combination cancer therapies exhibit independent action – not synergy. As Merck & Co. investigators subsequently realized, this fundamentally changes how immunotherapy combinations should be developed. The group is now engaged in a large-scale effort to digitize and make freely available all survival data from Phase 3 clinical trials.
|
https://en.wikipedia.org/wiki?curid=69534725
| 1,977,331 |
434,968 |
Diagnosing infections with "C. canimorsus" can be difficult. Common practice for culturing isolates is to keep agar plates for one week; sometimes, cultures of "C. canimorsus" are not visible at that point due to slow growth or inappropriate media. "C. canimorsus" requires very specific culture media and conditions; enriched media are necessary. "C. canimorsus" displays enhanced growth in high concentrations of carbon dioxide, so culturing the bacteria in candle extinction jars or carbon dioxide incubators is necessary. To diagnose this bacillus, certain reactions may be tested. The bacterium should test positive for catalase and oxidase, arginine dihydrolase, maltose, and lactose. It should test negative for nitrate reduction, urease, and H2S production. "C. canimorsus" can be distinguished from other Gram-negative bacteria by testing negative for inulin and sucrose. Due to the relatively slow growth of this bacterium, diagnosis often relies upon the clinician having knowledge that the patient was previously in contact with a canine or feline. Once aware of this, clinicians can request that agar plates be kept longer than one week to ensure proper isolation of the bacterium. Sometimes, even these methods fail. Cases have been noted where cultures repeatedly came up negative for "C. canimorsus", only for its presence to be determined by 16S rRNA gene sequencing. PCR assays of species-specific genes may also be beneficial. For individuals presenting with meningitis, "C. canimorsus" can be diagnosed with a cerebrospinal fluid Gram stain. These methods are more costly, but are the best way to ensure species-level identification. Isolates are usually obtained from blood cultures (88% of the time) and less frequently from bite wounds. In cases where the patient is in full septic shock, whole blood smears may be effective.
|
https://en.wikipedia.org/wiki?curid=8770962
| 434,754 |
12,341 |
According to Sweeney, the hardest part of the engine to program was the renderer, as he had to rewrite its core algorithm several times during development, though he found the infrastructure connecting all the subsystems less "glamorous". Despite requiring a significant personal effort, he said the engine was his favorite project at Epic, adding: "Writing the first Unreal Engine was a 3.5-year, breadth-first tour of hundreds of unique topics in software and was incredibly enlightening." Among its features were collision detection, colored lighting, and a limited form of texture filtering. It also integrated a level editor, UnrealEd, that had support for real-time constructive solid geometry operations as early as 1996, allowing mappers to change the level layout on the fly. Even though "Unreal" was designed to compete with id Software (developer of "Doom" and "Quake"), its co-founder John Carmack complimented the game for its use of 16-bit color and remarked on its implementation of visual effects such as volumetric fog. "I doubt any important game will be designed with 8-bit color in mind from now on. Unreal has done an important thing in pushing toward direct color, and this gives the artists a lot more freedom," he said in an article written by Geoff Keighley for "GameSpot". "Light blooms [the spheres of light], fog volumes, and composite skies were steps I was planning on taking, but Epic got there first with Unreal," he said, adding: "The Unreal engine has raised the bar on what action gamers expect from future products. The visual effects first seen in the game will become expected from future games."
|
https://en.wikipedia.org/wiki?curid=417152
| 12,336 |
534,415 |
After the golden years of the 1920s, the downturn hit hard at Northwestern University, a private school in Illinois. Its annual income dropped 25 percent, from $4.8 million in 1930–31 to $3.6 million in 1933–34. The endowment investments shrank, fewer parents could pay full tuition, and annual giving from alumni and philanthropy fell from $870,000 in 1932 to a low of $331,000 in 1935. The university responded with two salary cuts of 10 percent each for all employees. It imposed a hiring freeze and a building freeze, and slashed appropriations for maintenance, books, and research. From a balanced budget in 1930–31, the university ran deficits in the range of $100,000 for the next four years, which were made up by drawing on the endowment. Enrollments fell in most schools, with law and music hardest hit. However, the movement toward state certification of school teachers enabled Northwestern to open a new graduate program in education, bringing in a new clientele. At this financial low point, in June 1933, President Robert Maynard Hutchins of the University of Chicago proposed to merge the two universities, estimating annual savings of $1.7 million. The two presidents were enthusiastic, and the faculty were supportive. However, the Northwestern alumni were vehemently opposed, fearing the loss of their traditions. The medical school was oriented toward training practitioners, and felt it would lose its mission if it were merged into the larger, research-oriented University of Chicago medical school. The merger plan was thus dropped. The Deering family gave an unrestricted gift of $6 million in 1935 that rescued the budget, bringing it up to $5.4 million in 1938–39. That allowed many of the spending cuts to be restored, including half the salary reductions.
|
https://en.wikipedia.org/wiki?curid=44440652
| 534,136 |
1,049,957 |
Gray wolves howl to assemble the pack (usually before and after hunts), to pass on an alarm (particularly at a den site), to locate each other during a storm or in unfamiliar territory, and to communicate across great distances. Wolf howls can under certain conditions be heard over areas of up to . Wolf howls are generally indistinguishable from those of large dogs. Male wolves give voice through an octave, passing to a deep bass with a stress on "O", while females produce a modulated nasal baritone with stress on "U". Pups almost never howl, while yearling wolves produce howls ending in a series of dog-like yelps. Howling consists of a fundamental frequency that may lie between 150 and 780 Hz, and consists of up to 12 harmonically related overtones. The pitch usually remains constant or varies smoothly, and may change direction as many as four or five times. Howls used for calling pack mates to a kill are long, smooth sounds similar to the beginning of the cry of a great horned owl. When pursuing prey, they emit a higher pitched howl, vibrating on two notes. When closing in on their prey, they emit a combination of a short bark and a howl. When howling together, wolves harmonize rather than chorus on the same note, thus creating the illusion of there being more wolves than there actually are. Lone wolves typically avoid howling in areas where other packs are present. Wolves from different geographic locations may howl in different fashions: the howls of European wolves are much more protracted and melodious than those of North American wolves, whose howls are louder and have a stronger emphasis on the first syllable. The two are however mutually intelligible, as North American wolves have been recorded to respond to European-style howls made by biologists.
|
https://en.wikipedia.org/wiki?curid=61366571
| 1,049,411 |
1,184,588 |
In the open sea, the crew found out that one sailor from "Vostok" and a few from "Mirny" had caught a sexually transmitted disease in Port Jackson. This type of disease was especially widespread in Australia, which was at that time a destination for convicts transported from Britain. The vessels did not escape many storms, and the crews got used to pitching and winds. However, the calm that suddenly set in at 8 pm on May 19 provoked a strong lateral pitching, because of which "Vostok" scooped up so much water from the scavut netting that the water level in the hold increased from 13 to 26 inches. Also, the stream of water in the mess crushed Lieutenant Zavodskoy. Scattered cannonballs that rolled from side to side made it difficult to repair the damage. Pitching continued for the next day as well. On May 24 at 7 am, the travellers reached New Zealand and anchored in Queen Charlotte Sound to make contact with the Māori people. Bellingshausen used Cook's maps and descriptions. Historian Barratt described the events that followed as a "comparative ethnography session". These ethnographic observations turned out to be extremely important, since the places that the Russian sailors visited were connection points between tribes of the North Island and the South Island. In 1828 colonizers destroyed the hapū, and Mikhailov's depictions became historical sources of paramount importance.
|
https://en.wikipedia.org/wiki?curid=40880361
| 1,183,960 |
460,306 |
On November 11, 1944, Republic received an order for three prototypes of the new XP-84. Since the design promised superior performance to the P-80 Shooting Star and Republic had extensive experience in building single-seat fighters, no competition was held for the contract. The name Thunderjet was chosen to continue the Republic Aviation tradition started with the P-47 while emphasizing the new method of propulsion. On January 4, 1945, even before the aircraft took to the air, the USAAF expanded its order to 25 service test YP-84A and 75 production P-84B (later modified to 15 YP-84A and 85 P-84B). Meanwhile, wind tunnel testing by the National Advisory Committee for Aeronautics revealed longitudinal instability and buckling of stabilizer skin at high speeds. The weight of the aircraft, a great concern given the low thrust of early turbojets, was growing so quickly that the USAAF had to set a gross weight limit of 13,400 pounds (6,078 kg). The results of preliminary testing were incorporated into the third prototype, designated XP-84A, which was also fitted with a more powerful J35-GE-15 engine with 4,000 pound-force (17.80 kN) of thrust.
|
https://en.wikipedia.org/wiki?curid=442516
| 460,080 |
1,410,878 |
A literature review by Willie Soon and Sallie Baliunas, published in the relatively obscure journal "Climate Research" on 31 January 2003, used data from previous papers to argue that the Medieval Warm Period had been warmer than the 20th century, and that recent warming was not unusual. In March they published an extended paper in "Energy & Environment", with additional authors. The Bush administration's Council on Environmental Quality chief of staff Philip Cooney inserted references to the papers in the draft first Environmental Protection Agency "Report on the Environment", and removed all references to reconstructions showing world temperatures rising over the last 1,000 years. In the Soon and Baliunas controversy, two scientists cited in the papers said that their work was misrepresented, and the "Climate Research" paper was criticised by many other scientists, including several of the journal's editors. On 8 July "Eos" featured a detailed rebuttal of both papers by 13 scientists including Mann and Jones, presenting strong evidence that Soon and Baliunas had used improper statistical methods. Responding to the controversy, the publisher of "Climate Research" upgraded Hans von Storch from editor to editor in chief, but von Storch decided that the Soon and Baliunas paper was seriously flawed and should not have been published as it was. He proposed a new editorial system, and though the publisher of "Climate Research" agreed that the paper should not have been published uncorrected, he rejected von Storch's proposals to improve the editorial process, and von Storch with three other board members resigned. Senator James M. Inhofe stated his belief that "manmade global warming is the greatest hoax ever perpetrated on the American people", and a hearing of the United States Senate Committee on Environment and Public Works which he convened on 29 July 2003 heard the news of the resignations.
|
https://en.wikipedia.org/wiki?curid=5354105
| 1,410,086 |
1,368,559 |
Creative director Chris Metzen noted that to avoid the same failure that "Titan" became, the group had to rethink how Blizzard's more successful games had come about, ignoring the scale and business opportunity of the end result and instead understanding what tools and skills they already had to build from. In brainstorming ideas, the team thought about the current state of first-person shooters (FPS), a genre that many on the team had played throughout their careers and which, according to Kaplan, had enjoyed many groundbreaking titles but still had potential for innovation. Kaplan stated that some of the ideas in current FPS games they wanted to emulate were in-game maneuvers like rocket jumping and grappling hooks that helped players move with fluidity across maps, and team-based shooters such as "Team Fortress Classic" and "Team Fortress 2". At the same time, multiplayer online battle arena (MOBA) games were starting to take off, which required players to cooperate with others to successfully win the match. Kaplan said that their team considered how to adapt the large-scale and fast-paced gameplay of "Team Fortress 2" to the smaller scale and cooperative nature of MOBAs, forming the basis of "Overwatch". Metzen also commented that the concept of teamwork in "Overwatch" was partially influenced by their own team's morale following the cancellation of "Titan". Metzen said that during "Titan"'s development, the team was highly fractured, which contributed to the project's cancellation. On starting "Overwatch" with a smaller group, they all wanted to come together and support each other to make their next game a success, "a redemption story for us as people and as craftsmen". Morhaime described "Overwatch"'s intention as to "create an awesome [first-person shooter] experience that's more accessible to a much wider audience while delivering the action and depth that shooter fans love." On the FPS nature of the game, Kaplan commented that "the real focus of the shooting in the game is not to chase realism. We don't have real world guns in the game. You're not playing a soldier in a present-day military conflict." Simplicity of design was a high-value goal, taking cues from the success of the simple approach used in Blizzard's "Hearthstone".
|
https://en.wikipedia.org/wiki?curid=55762533
| 1,367,803 |
1,190,791 |
Born July 25, 1871 in New York City, she was raised in Harlem by her father Francis, an Episcopal priest, and her mother, Elizabeth Floy, who came from a prosperous New York family. Her ancestors were of Dutch and English descent and were all in America before 1720. Washburn was an only child; she did not appear to have childhood companions her age and spent much of her time with adults or reading. She learned to read long before she started school; this caused her to advance quickly when she started school at age 7. In school, she learned French and German. When she was eleven years old, she started at public school for the first time. In 1886, she graduated from high school at the age of fifteen, and that fall, she entered Vassar College, Poughkeepsie, New York, as a preparatory student. This preparatory status was due to her lack of Latin and French. During her undergraduate years at Vassar, Washburn developed a strong interest in philosophy through poetry and other literary works. She also became a member of the Kappa Alpha Theta sorority and was first introduced to the field of psychology. After she graduated from Vassar in 1891, Washburn became determined to study under James McKeen Cattell in the newly established psychological laboratory at Columbia University. As Columbia had not yet admitted a woman graduate student, she was admitted only as an auditor. Despite the prevailing prejudice against women pursuing higher education at the time, Cattell treated her as a normal student and became her first mentor. She attended his seminars and lectures, and worked in the laboratory alongside men. At the end of her first year at Columbia, Cattell encouraged her to enter the newly organized Sage School of Philosophy at Cornell University to obtain her Ph.D., because this would not have been possible at Columbia as an auditor. She was accepted in 1892 with a scholarship.
|
https://en.wikipedia.org/wiki?curid=508123
| 1,190,157 |
991,288 |
Historically, LDLT began with terminal pediatric patients, whose parents were motivated to risk donating a portion of their compatible healthy livers to replace their children's failing ones. The first report of successful LDLT was by Silvano Raia at the University of Sao Paulo Faculty of Medicine in July 1989. It was followed by Christoph Broelsch at the University of Chicago Medical Center in November 1989, when two-year-old Alyssa Smith received a portion of her mother's liver. Surgeons eventually realized that adult-to-adult LDLT was also possible, and now the practice is common in a few reputable medical institutes. It is considered more technically demanding than even standard, cadaveric donor liver transplantation, and also poses the ethical problems underlying the indication of a major surgical operation (hemihepatectomy or related procedure) on a healthy human being. In various case series, the risk of complications in the donor is around 10%, and very occasionally a second operation is needed. Common problems are biliary fistula, gastric stasis and infections; they are more common after removal of the right lobe of the liver. Death after LDLT has been reported at 0% (Japan), 0.3% (USA) and <1% (Europe), with risks likely to decrease further as surgeons gain more experience in this procedure. Since the law was changed to permit altruistic non-directed living organ donations in the UK in 2006, the first altruistic living liver donation took place in Britain in December 2012.
|
https://en.wikipedia.org/wiki?curid=517879
| 990,771 |
1,787,263 |
In general, the defect propagation theory plays a bigger role at low sediment transport rates, since at high rates defects may be washed away and bedforms are generally initiated across the entire bed spontaneously. Venditti et al. (2005) report that instantaneous initiation begins with the formation of a cross-hatch pattern, which leads to chevron-shaped forms that migrate independently of the pattern structure. This chevron-like structure reorganizes to form the future crest lines of the bedforms. Venditti et al. (2006), based on the earlier model by Liu (1957), proposed that instantaneous initiation is a manifestation of an interfacial hydrodynamic instability of Kelvin-Helmholtz type between a highly active pseudofluid sediment layer and the fluid above it. In addition, Venditti et al. (2005) imply that there is no linkage between instantaneous initiation and coherent turbulent flow structures, since spatially and temporally random events would have to lock in place to generate the cross-hatch pattern. Moreover, there is no clear explanation of the effect of turbulence in the formation of bedforms, since bedforms may also occur under laminar flows. It is important to note that laminar-generated bedform studies used the temporally averaged flow conditions to determine the degree of turbulence, indicating a Reynolds number in the laminar regime. However, instantaneous processes, such as bursts and sweeps, which are infrequent at low Reynolds numbers but still present, can be the driving mechanisms that generate the bedforms. The generation of bedforms in laminar flows is still a topic of debate within the scientific community, since, if true, it suggests that there should be processes for defect development other than the one suggested by Best (1992). This alternative model for bedform development at low sediment transport rates should explain the generation of defects and bedforms for cases where the flow is not turbulent.
|
https://en.wikipedia.org/wiki?curid=22450812
| 1,786,258 |
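As a concrete illustration of the time-averaged criterion mentioned above, a bulk Reynolds number can be computed from mean velocity and flow depth. This is a minimal sketch with placeholder values; the laminar threshold of roughly 500 for open-channel flow is a common rule of thumb, not a value taken from the studies cited.

```python
# Bulk Reynolds number from mean velocity (m/s), flow depth (m), and
# kinematic viscosity (m^2/s, ~1e-6 for water at 20 C). Values are illustrative.
def reynolds_number(mean_velocity_m_s, flow_depth_m, kinematic_viscosity_m2_s=1.0e-6):
    return mean_velocity_m_s * flow_depth_m / kinematic_viscosity_m2_s

Re = reynolds_number(0.02, 0.01)   # 2 cm/s flow, 1 cm deep
print(Re, "laminar" if Re < 500 else "transitional/turbulent")
```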
1,386,521 |
On May 25, 1925, Struve married Mary Martha Lanning, who considered herself a musician but worked as a secretary at Yerkes. Lanning was slightly older than Struve and had been previously married. They had no children; thus the famous Struve astronomical dynasty came to an end. Other branches of the Struve family besides the line of Otto Wilhelm von Struve continued, but yielded no distinguished scientists. On October 26, 1927, Struve became a naturalized US citizen. At that time, he was fluent in spoken and written English, but had a slight German accent which remained with him for life. Even after marriage, Struve continued working days and nights, something that his non-scientist wife could not fully accept. Although they remained together, their relations were cold in later years. Struve's health deteriorated in the late 1950s. He was suffering from hepatitis, first contracted back in Russia and Turkey. In 1956, while using a telescope at Mount Wilson, Struve had a bad fall, breaking several ribs and cracking two vertebrae. He was hospitalized for about two months and had to wear a body cast for a month after recovery. He was permanently hospitalized around 1963 and died on April 6, 1963, in Berkeley. He was survived by his mother and wife. His mother died on October 1, 1964, at the age of ninety. Mary was discovered dead on August 5, 1966, but was estimated to have died in July 1966.
|
https://en.wikipedia.org/wiki?curid=584584
| 1,385,754 |
1,289,240 |
The McDonnell Douglas F-4C Phantom II was one of the aircraft types evaluated by the RAAF as a potential replacement for its aging English Electric Canberra bombers in the early 1960s. In mid-1963 a team of senior RAAF officers headed by the Chief of the Air Staff, Air Marshal Valston Hancock, travelled to the United States to evaluate the General Dynamics F-111 (then known as the "TFX"), North American A-5 Vigilante and F-4C Phantom II strike aircraft. While in the United States, the team also inspected the Boeing KC-135 Stratotanker, which was considered necessary to support these aircraft. In addition, the RAAF officers travelled to the United Kingdom and France to evaluate the BAC TSR-2 and Dassault Mirage IV, respectively. In its final report, the team rejected the F-4C on the grounds that the aircraft lacked the range, performance at low altitude and reconnaissance capability that the RAAF required. The F-111 was considered to be the most suitable aircraft of those considered, but the team proposed that the RAAF acquire 36 Vigilantes as they also met the force's requirements and could be delivered within a shorter time frame. The Australian Government rejected this advice, and decided to purchase 24 F-111s. At the time the order was placed in late 1963 these aircraft were scheduled to be delivered in 1967; the delivery date was pushed back to 1968 after Australia decided to order the unique F-111C variant. In late 1963 the United States Government offered to lend Australia 24 Boeing B-47 Stratojet bombers until the F-111s were delivered. The Australian Air Board opposed acquiring these aircraft on the grounds that they were obsolete and would be expensive to operate. Instead, it recommended to Cabinet that a package of F-4C strike aircraft, the RF-4C reconnaissance variant of this design, and KC-135 tankers be leased from the United States if an interim force was considered necessary. Cabinet considered the two options during 1964, and rejected both of them. Between 1965 and 1970 six Australian pilots serving on exchange postings to the United States Air Force (USAF) flew Phantoms in combat during the Vietnam War.
|
https://en.wikipedia.org/wiki?curid=36437816
| 1,288,531 |
897,374 |
The cumulative damage of the semiconductor lattice ("lattice displacement" damage) caused by ionizing radiation over the exposure time is measured in rads and causes slow, gradual degradation of the device's performance. A total dose greater than 5000 rads delivered to silicon-based devices in seconds to minutes will cause long-term degradation. In CMOS devices, the radiation creates electron–hole pairs in the gate insulation layers, which cause photocurrents during their recombination, and the holes trapped in the lattice defects in the insulator create a persistent gate bias and influence the transistors' threshold voltage, making the N-type MOSFET transistors easier, and the P-type ones more difficult, to switch on. The accumulated charge can be high enough to keep the transistors permanently open (or closed), leading to device failure. Some self-healing takes place over time, but this effect is not very significant. This effect is the same as hot carrier degradation in high-integration, high-speed electronics. Crystal oscillators are somewhat sensitive to radiation doses, which alter their frequency. The sensitivity can be greatly reduced by using swept quartz. Natural quartz crystals are especially sensitive. Radiation performance curves for TID testing may be generated for all resultant-effects testing procedures. These curves show performance trends throughout the TID test process and are included in the radiation test report.
|
https://en.wikipedia.org/wiki?curid=1041641
| 896,901 |
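As a rough illustration of how trapped oxide charge shifts the threshold voltage described above, the standard first-order MOS relation dVth ≈ -Q_trapped/C_ox can be applied. This is a minimal sketch, not taken from the article: the function name and all numerical values are placeholder assumptions.

```python
# Estimate the threshold-voltage shift from positive charge trapped in a SiO2
# gate oxide (charge referred to the oxide-silicon interface). First-order only.
EPS0 = 8.854e-12          # F/m, vacuum permittivity
EPS_SIO2 = 3.9 * EPS0     # F/m, SiO2 permittivity

def delta_vth(trapped_charge_per_cm2, oxide_thickness_nm):
    """Threshold shift (V) from areal trapped charge (C/cm^2) in the oxide."""
    c_ox = EPS_SIO2 / (oxide_thickness_nm * 1e-9)   # F/m^2
    q_per_m2 = trapped_charge_per_cm2 * 1e4         # C/m^2
    return -q_per_m2 / c_ox

# Example (placeholder numbers): 5e11 trapped holes/cm^2 in a 20 nm oxide
q_e = 1.602e-19
print(delta_vth(5e11 * q_e, 20))   # negative shift: NMOS turns on more easily, PMOS less so
```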
526,886 |
This function, referred to as a potential function, computes the molecular potential energy as a sum of energy terms that describe the deviation of bond lengths, bond angles and torsion angles away from equilibrium values, plus terms for non-bonded pairs of atoms describing van der Waals and electrostatic interactions. The set of parameters consisting of equilibrium bond lengths, bond angles, partial charge values, force constants and van der Waals parameters is collectively termed a force field. Different implementations of molecular mechanics use different mathematical expressions and different parameters for the potential function. The common force fields in use today have been developed by using chemical theory, experimental reference data, and high-level quantum calculations. The method, termed energy minimization, is used to find positions of zero gradient for all atoms, in other words, a local energy minimum. Lower energy states are more stable and are commonly investigated because of their role in chemical and biological processes. A molecular dynamics simulation, on the other hand, computes the behaviour of a system as a function of time. It involves solving Newton's laws of motion, principally the second law, F = ma. Integration of Newton's laws of motion, using different integration algorithms, leads to atomic trajectories in space and time. The force on an atom is defined as the negative gradient of the potential energy function. The energy minimization method is useful for obtaining a static picture for comparing between states of similar systems, while molecular dynamics provides information about the dynamic processes with the intrinsic inclusion of temperature effects.
|
https://en.wikipedia.org/wiki?curid=734256
| 526,613 |
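A minimal sketch of such a potential function is given below in Python. The functional forms (harmonic bonds and angles, periodic torsions, 12-6 Lennard-Jones plus Coulomb) are the generic textbook ones described above, but the function names and every parameter value are placeholders, not any particular published force field.

```python
import numpy as np

def bond_energy(r, r0, k):               # harmonic bond stretch
    return 0.5 * k * (r - r0) ** 2

def angle_energy(theta, theta0, k):      # harmonic angle bend (radians)
    return 0.5 * k * (theta - theta0) ** 2

def torsion_energy(phi, vn, n, gamma):   # periodic torsion term
    return 0.5 * vn * (1.0 + np.cos(n * phi - gamma))

def nonbonded_energy(r, epsilon, sigma, q1, q2, coulomb_const=332.06):
    """12-6 Lennard-Jones plus Coulomb (kcal/mol with charges in e, r in Angstrom)."""
    lj = 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    coulomb = coulomb_const * q1 * q2 / r
    return lj + coulomb

# Example: one O-H bond near equilibrium plus one non-bonded pair (placeholder parameters)
print(bond_energy(r=0.98, r0=0.96, k=1100.0),
      nonbonded_energy(r=3.2, epsilon=0.15, sigma=3.0, q1=-0.8, q2=0.4))
```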
672,750 |
Jellyfish and zooxanthellae have a history together in the scientific world, as "Symbiodinium" was first cultured from the jellyfish "Cassiopea," a model jellyfish species. Many different types of zooxanthellae have been observed forming relationships with jellyfish across many different phylogenetic branches, and the roles they play change throughout the jellyfish's life cycle. However, as the jellyfish ages, the diversity of zooxanthellae attaching to it decreases, suggesting that zooxanthellae compete with each other to form relationships with the jellyfish. Not all jellyfish form relationships with these microbes, and for the most part the ones that do are found in tropical and subtropical waters. The relationship between jellyfish and zooxanthellae is affected somewhat differently than that of coral in terms of climate change, despite both being members of the phylum Cnidaria. One study suggested that certain species of jellyfish and their symbiotic zooxanthellae may have some resistance to decreasing pH caused by climate change, up to a certain point. However, jellyfish bleaching events have been documented during extreme heat events. While the causal factors that normally seem to affect the relationship between zooxanthellae and their host may not apply to jellyfish, light intensity does. Light availability can affect the lipid production of zooxanthellae that the jellyfish then utilize. To maximize their light uptake, jellyfish will both swim near the surface and perform very specific migrations. The migration patterns also help the zooxanthellae access specific nutrients. Many of these jellyfish appear to be mixotrophic, consuming live prey as well as relying on phototrophy. This may be what helps jellyfish survive climate change and bleaching, as they could switch feeding methods rather than attempting to recover lost zooxanthellae quickly. There are still many unknowns in the relationship between zooxanthellae and jellyfish that scientists are looking to answer.
|
https://en.wikipedia.org/wiki?curid=479508
| 672,398 |
763,671 |
The first radio antenna used to identify an astronomical radio source was built by Karl Guthe Jansky, an engineer with Bell Telephone Laboratories, in 1932. Jansky was assigned the task of identifying sources of static that might interfere with radiotelephone service. Jansky's antenna was an array of dipoles and reflectors designed to receive short wave radio signals at a frequency of 20.5 MHz (wavelength about 14.6 meters). It was mounted on a turntable that allowed it to rotate in any direction, earning it the name "Jansky's merry-go-round." It had a diameter of approximately and stood tall. By rotating the antenna, the direction of the received interfering radio source (static) could be pinpointed. A small shed to the side of the antenna housed an analog pen-and-paper recording system. After recording signals from all directions for several months, Jansky eventually categorized them into three types of static: nearby thunderstorms, distant thunderstorms, and a faint steady hiss above shot noise, of unknown origin. Jansky finally determined that the "faint hiss" repeated on a cycle of 23 hours and 56 minutes. This period is the length of an astronomical sidereal day, the time it takes any "fixed" object located on the celestial sphere to come back to the same location in the sky. Thus Jansky suspected that the hiss originated outside of the Solar System, and by comparing his observations with optical astronomical maps, Jansky concluded that the radiation was coming from the Milky Way Galaxy and was strongest in the direction of the center of the galaxy, in the constellation of Sagittarius.
|
https://en.wikipedia.org/wiki?curid=46656
| 763,262 |
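A small back-of-the-envelope check (not from the article) shows why a 23 h 56 min repetition points to a celestial source: a signal fixed on the sky returns about four minutes earlier each solar day, and those minutes accumulate to one full extra cycle per year.

```python
# Sidereal-versus-solar drift of a sky-fixed signal; values are the standard
# day lengths quoted in the passage, rounded to the nearest minute.
solar_day_min = 24 * 60          # 1440 min
sidereal_day_min = 23 * 60 + 56  # 1436 min
daily_shift_min = solar_day_min - sidereal_day_min
print(daily_shift_min)                     # ~4 minutes earlier per day
print(daily_shift_min * 365.25 / 60)       # ~24 hours over a year -> one extra cycle
```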
1,683,701 |
The Sectoral Model, proposed by economist Homer Hoyt in 1939, explores the notion that the development of cities is centralised around transportation lines, with wedges of residential land concentrated by social class. This model, however, is a highly generalised theory and does not equally represent urban morphologies of the developing world. Leaders in the field state that the twenty-first-century global economy's tendency towards capital reduces the demand for labour, in turn causing growing unemployment. This leads to the notion that innovations in transport play a significant role in the accessibility of a city's surface. For example, the wide implementation of suburban railway networks and motor buses was expected to produce proportionately higher increases in periphery land prices. By contrast, in periods of low innovation within the transport sector, land values would be expected to be proportionately higher closer to the city centre. Opponents of this theory, however, point out that transport innovations are typically closely associated with increases in building and development activity, which in itself is known to raise land values, so it is difficult to assess the relative strengths of the underlying influences. Thus, the commercial, industrial and residential components within this model's land-use pattern are subject to locational analysis and attempt to maximise profits by substituting higher rent for increased transport costs from the city centre, and vice versa. This acts as an alternative to the 'original' urban morphology model by Burgess, which proposed the city as a concentric form that kept each land-use activity within its proposed ring. However, in reality it is often the addition of varying transport options in differing locations that allows for the formation of individual sectors for each land use.
|
https://en.wikipedia.org/wiki?curid=48914891
| 1,682,758 |
284,671 |
Because the new technologies are still not widely used, AFOs are often still made from polypropylene-based plastic, mostly in the shape of an "L" with the upright part behind the calf and the lower part under the foot. The continuous "L" shape of these constructions, however, offers only the rigidity that can be produced with these materials. Such constructions made of polypropylene are still called "DAFO (dynamic ankle-foot orthosis)", "SAFO (solid ankle-foot orthosis)" or "Hinged AFO". Either they are not stable enough to transfer the high forces required to balance the weak plantar flexors when standing and walking (DAFO), or they block the mobility of the ankle joint (SAFO). A "Hinged AFO" of that era only allowed for the compensation of the kinds of muscular weakness that could be addressed with the orthotic joints available at the time. In this technology, for example, it is still common to block plantar flexion, as the orthotic joints available could not simultaneously transmit the large forces required to compensate for muscular deficits and offer the necessary dynamics. While this multitude of AFOs, differing in design, was in clinical practice, studies had already established a clear lack of detail regarding the designs and the materials used for manufacture. Eddison and Chockalingam have called for a new standardization of the terminology. With a focus on the care of children with cerebral palsy (CP), there was already a recommendation from clinical practice, in the context of polypropylene orthoses and their multitude of possible designs, to investigate in further studies the potential for improving the gait pattern through specifically adapted orthoses. The integration of orthotic joints with modern functional elements into the old polypropylene production technology, on the other hand, is unusual, because orthotic shells made of polypropylene could not transfer the high forces under the boundary condition of an acceptable weight, or would be too soft.
|
https://en.wikipedia.org/wiki?curid=26734587
| 284,517 |
723,282 |
Milanković put the Sun at the center of his theory, as the only source of heat and light in the Solar System. He considered three cyclical movements of the Earth: eccentricity (100,000-year cycle – Johannes Kepler, 1609), axial tilt (41,000-year cycle – from 22.1° to 24.5°; presently, the Earth's tilt is 23.5° – Ludwig Pilgrim, 1904), and precession (23,000-year cycle – Hipparchus, 130 BC). Each cycle works on a different time-scale and each affects the amount of solar energy received by the planets. Such changes in the geometry of an orbit lead to changes in the insolation – the quantity of heat received by any spot at the surface of a planet. These orbital variations, which are influenced by the gravity of the Moon, Sun, Jupiter, and Saturn, form the basis of the Milankovitch cycles. His original contribution to celestial mechanics is called Milanković's system of vector elements of planetary orbits. He reduced the six Lagrangean-Laplacian elliptical elements to two vectors determining the mechanics of planetary movements. The first specifies the planet's orbital plane, the sense of revolution of the planet, and the orbital ellipse parameter; the second specifies the axis of the orbit in its plane and the orbital eccentricity. By applying those vectors he significantly simplified the calculation and directly obtained all the formulas of the classical theory of secular perturbations. Milanković, in a simple but original manner, first deduced Newton's law of gravitation from Kepler's laws. Then Milanković treated the two-body and the many-body problems of celestial mechanics. He accepted but corrected the Le Verrier and Stockwell computations by using newer and more accurate values for the masses of the planets in the Solar System.
|
https://en.wikipedia.org/wiki?curid=150014
| 722,902 |
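As a purely schematic illustration of how the three periodicities listed above combine, the toy Python sketch below sums equal-weight cosines with the stated periods. It is not Milanković's insolation calculation (which depends on latitude, season and the actual orbital solutions); the equal weights and the time span are assumptions made only for illustration.

```python
import numpy as np

# Toy combination of the eccentricity, obliquity and precession periodicities
# named in the passage; shows only how the cycles beat against one another.
t_kyr = np.arange(0, 800, 1.0)                      # 800,000 years, 1 kyr steps
periods_kyr = {"eccentricity": 100.0, "obliquity": 41.0, "precession": 23.0}
forcing = sum(np.cos(2 * np.pi * t_kyr / p) for p in periods_kyr.values())
print(t_kyr[np.argmax(forcing)], forcing.max())     # epoch of strongest combined signal
```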
174,312 |
Ghiorso and co-workers analyzed filter papers which had been flown through the explosion cloud on airplanes (the same sampling technique that had been used to discover ). Larger amounts of radioactive material were later isolated from coral debris of the atoll, which were delivered to the U.S. The separation of suspected new elements was carried out in the presence of a citric acid/ammonium buffer solution in a weakly acidic medium (pH ≈ 3.5), using ion exchange at elevated temperatures; fewer than 200 atoms of einsteinium were recovered in the end. Nevertheless, element 99 (einsteinium), namely its Es-253 isotope, could be detected via its characteristic high-energy alpha decay at 6.6 MeV. It was produced by the capture of 15 neutrons by uranium-238 nuclei followed by seven beta decays, and had a half-life of 20.5 days. Such multiple neutron absorption was made possible by the high neutron flux density during the detonation, so that newly generated heavy isotopes had plenty of available neutrons to absorb before they could disintegrate into lighter elements. Neutron capture initially raised the mass number without changing the atomic number of the nuclide, and the concomitant beta decays resulted in a gradual increase in the atomic number:
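The formula that originally closed this sentence was lost in extraction; a reconstruction consistent with the stated capture of 15 neutrons by uranium-238 and the seven subsequent beta decays (an assumption, not copied from the source) is:

```latex
^{238}_{\;92}\mathrm{U} \;\xrightarrow{+15\,n}\; ^{253}_{\;92}\mathrm{U} \;\xrightarrow{7\,\beta^{-}}\; ^{253}_{\;99}\mathrm{Es}
```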
|
https://en.wikipedia.org/wiki?curid=9479
| 174,221 |
1,208,137 |
Various machine elements that potentially lent themselves to screw making (such as the lathe, the leadscrew, the slide rest, gears, slide rests geared direct to spindles, and "change gear" gear trains) were developed over the centuries, with some of those elements being quite ancient. Various sparks of inventive power during the Middle Ages and Renaissance combined some of these elements into screw-making machines that presaged the industrial era to follow. For example, various medieval inventors whose names are lost to history clearly worked on the problem, as shown by Wolfegg Castle's "Medieval Housebook" (written circa 1475–1490), and Leonardo da Vinci and Jacques Besson left us with drawings of screw-cutting machines from the 1500s; not all of these designs are known to have been built, but clearly similar machines were a reality during Besson's lifetime. However, it was not until 1760–1800 that these various elements were brought together successfully to create (in contemporaneous parallel) two new types of machine tool: the screw-cutting lathe (for low-volume, toolroom-style production of "machine" screws, with easy selection of various pitches) and the first high-volume-production, specialized, single-purpose machine tools for the production of screws, which were created to produce "wood" screws [meaning screws made of metal for use in wood] at high volume and low unit price. Screw-cutting lathes fed into the just-dawning evolution of modern machine shop practice, whereas the wood-screw-making machines fed into the just-dawning evolution of the modern hardware industry, that is, the concept of one factory supplying the needs of thousands of customers, who consumed screws in growing quantities for carpentry, cabinet making, and other trades, but did not make the hardware themselves (purchasing it instead from capital-intensive specialist makers for lower unit cost than they could achieve on their own). These two classes of machine tools simultaneously took the various classes of screws and moved them, for the first time, from the category of expensive, hand-made, seldom-used objects into the category of affordable, often-interchangeable commodity. (The interchangeability developed gradually, from intra-company to inter-company to national to international).
|
https://en.wikipedia.org/wiki?curid=29477317
| 1,207,491 |
2,078,344 |
Through kinship theory, the occurrence of Prader-Willi syndrome is theorized to result from the absence of the paternally derived gene, with the only remaining copy being the maternally derived, 'silent' copy. In this case, the child is expected to express behaviors which reduced maternal costs in evolutionary history. In particular, a reduction of infant feeding responses and low appetite over the period of years when the child would rely almost completely on breastmilk from its mother would allow mothers to allocate their resources across themselves and multiple offspring. Breastfeeding an infant is estimated to cost around an additional 500 kcal per day; if additional energy is not consumed during lactation, body stores are used, making breastfeeding a costly maternal endeavor. Low appetite in early life is followed by a sudden onset of appetite and feeding behaviors around age 2. In traditional subsistence communities, this corresponds with the age at which children would be weaned from breastmilk and offered supplementary solid foods. It is likely that these weaning foods were less directly costly to the mother than breastfeeding, either because these food items were less energetically costly for her to gather, or because she could rely on social partners to assist her in gathering the food. Therefore, Prader-Willi syndrome allows us to see the roles of normally invisible imprinting effects which arose in relation to ancestral parental provisioning conflicts. In typically developing individuals, the imprinted genes related to Prader-Willi and Angelman syndromes are balanced in a "tug-of-war" shaped by natural selection. However, when one of these genes is missing, the balance is disrupted and the effects result in the phenotypes seen in these disorders.
|
https://en.wikipedia.org/wiki?curid=57329591
| 2,077,145 |
1,195,596 |
The evolutionary switch to bipedalism may have influenced the origins of music. The background is that noise of locomotion and ventilation may mask critical auditory information. Human locomotion is likely to produce more predictable sounds than those of non-human primates. Predictable locomotion sounds may have improved our capacity of entrainment to external rhythms and to feel the beat in music. A sense of rhythm could aid the brain in distinguishing among sounds arising from discrete sources and also help individuals to synchronize their movements with one another. Synchronization of group movement may improve perception by providing periods of relative silence and by facilitating auditory processing. The adaptive value of such skills to early human ancestors may have been keener detection of prey or stalkers and enhanced communication. Thus, bipedal walking may have influenced the development of entrainment in humans and thereby the evolution of rhythmic abilities. Primitive hominids lived and moved around in small groups. The noise generated by the locomotion of two or more individuals can result in a complicated mix of footsteps, breathing, movements against vegetation, echoes, etc. The ability to perceive differences in pitch, rhythm, and harmonies, i.e. "musicality", could help the brain to distinguish among sounds arising from discrete sources, and also help the individual to synchronize movements with the group. Endurance and an interest in listening might, for the same reasons, have been associated with survival advantages eventually resulting in adaptive selection for rhythmic and musical abilities and reinforcement of such abilities. Listening to music seems to stimulate release of dopamine. Rhythmic group locomotion combined with attentive listening in nature may have resulted in reinforcement through dopamine release. A primarily survival-based behavior may eventually have attained similarities to dance and music, due to such reinforcement mechanisms. Since music may facilitate social cohesion, improve group effort, reduce conflict, facilitate perceptual and motor skill development, and improve trans-generational communication, music-like behavior may at some stage have become incorporated into human culture.
|
https://en.wikipedia.org/wiki?curid=4220231
| 1,194,956 |
1,664,104 |
Team-Based Learning has been suggested to help students who seem uninterested in subject material, do not do their homework, and have difficulty understanding material. TBL can pair traditional content with application and problem-solving skills, while developing interpersonal skills. Vaughn et al. (2019) stated that team-based learning is an effective method for gaining better “content acquisition, vocabulary growth, and reading comprehension” (p. 121). Jakobsen and Knetemann (2017) further add that team-based learning allows students to take a much deeper look at course content and serves to hold their attention better than traditional methods. Its implementation in education can also be important for developing skills and abilities that are useful for businesses, organizations, careers, and industries where many projects and tasks are performed by teams. Learning how to learn, work, interact, and collaborate in a team is essential for success in this kind of environment. Many medical schools have adopted some version of TBL for several of the benefits listed above, and also for greater long-term knowledge retention. According to a study done by the Washington University School of Medicine, individuals who learned through an active team-based learning curriculum had greater long-term knowledge retention compared to a traditional passive lecture curriculum. Faculty of professional schools are thus directing their focus towards developing application and integration of knowledge beyond content-based curricula, rather than simple course objectives such as memorizing a concept. Michaelsen adds that "assignments that require groups to make decisions and enable them to report their decisions in a simple form, will usually generate high levels of group interaction" and are:
|
https://en.wikipedia.org/wiki?curid=5774409
| 1,663,167 |
1,968,276 |
Following his two-year (1961–63) postdoctoral research fellowship at Caltech, Clayton was awarded an Assistant Professorship, becoming one of the four founding faculty members in Rice University's newly created Department of Space Science (later renamed Space Physics and Astronomy). There he initiated a graduate-student course explaining nuclear reactions in stars as the mechanism for the creation of the atoms of our chemical elements. His pioneering textbook based on that course ("Principles of Stellar Evolution and Nucleosynthesis", McGraw-Hill 1968) earned ongoing praise; in 2018, 50 years after its first publication, it was still in common use in graduate education throughout the world. At Rice, Clayton was awarded the newly endowed Andrew Hays Buchanan Professorship of Astrophysics in 1968 and held that endowed professorship for twenty years until responding to the opportunity to guide a new astrophysics program at Clemson University in 1989. During the 1970s at Rice University, Clayton guided Ph.D. theses of many research students who achieved renown, especially Stanford E. Woosley, William Michael Howard, H. C. Goldwire, Richard A. Ward, Michael J. Newman, Eliahu Dwek, Mark Leising and Kurt Liffman. Senior thesis students at Rice University included Bradley S. Meyer and Lucy Ziurys, both of whom forged distinguished careers in the subjects of those senior theses. Historical photos of several students can be seen on Clayton's photo archive for the history of nuclear astrophysics. Clayton followed the historic Apollo 11 mission while on holiday with his family in Ireland, en route to Cambridge, UK, for his third research summer there.
|
https://en.wikipedia.org/wiki?curid=36061850
| 1,967,146 |
2,044,287 |
Höber was born in Stettin to businessman Anselm Emil and his wife Elise née Köhlau. Educated at the gymnasium at Stettin, he went to study medicine in Freiburg im Breisgau in 1892. He completed studies at the University of Erlangen in 1897 and worked with his uncle Isidor Rosenthal, through whom he became interested in physiology. He then studied the work of Walther Nernst on biological membranes. He received a degree in medicine in 1898 and joined the physiological institute at Zurich under Justus Gaule. He then received his habilitation in 1898 after working under Max von Frey. His 1902 work "Physikalische Chemie der Zelle und der Gewebe" on the chemistry of cells and tissues made him well known. He then worked at the Zurich Institute before moving to Kiel in 1909, becoming an assistant to Victor Hensen. He became an adjunct professor in 1910, later working closely with Albrecht Bethe, who replaced Hensen. In 1915 Bethe moved to Frankfurt, but Höber, though the obvious senior candidate for the vacated position, was initially passed over, possibly because his wife was of Jewish origin. This decision was, however, reversed, and he was made full professor and director on February 18, 1915, serving there until 1933. In 1930 he became rector of the Christian-Albrechts University, Kiel, which separated him from research work. His support of the liberal theologian Otto Baumgarten (1858-1934), who was opposed by the National Socialist student union, led to his being targeted. He was physically attacked in April 1933. He was forced to retire in September and he emigrated to England in October to work at the University of London. In December 1933 he received a position at the Pennsylvania Medical School and moved to the United States.
|
https://en.wikipedia.org/wiki?curid=24147166
| 2,043,107 |
1,979,298 |
This body of work was largely ignored until 1971, when Lawrence Fisher and Roman Weil re-introduced immunization to the academic community in a journal article that followed a 1969 report written for the Center for Research in Securities Prices. Shortly thereafter, in 1972, I.T. Vanderhoof presented the concept to the American actuarial community. Academic papers on immunization, duration, and dedication began to appear in increasing numbers, as interest rates began to rise. As rates rose further and further above their long-term averages, the financial investment industry began to pay attention, and their inquiries increasingly attracted the attention of academic researchers. Realizing that the high rates would allow them to lock in unprecedented rates of return, defined-benefit pension fund managers embraced the concepts. Goldman Sachs and other high-level firms began to produce software to help bond portfolio managers apply the theory to their institutional-sized portfolios. Most of the examples used in the literature involved portfolios of several hundred million dollars. In 1981, Leibowitz and Weinberger published a report on “contingent immunization” discussing the blending of active management of bond portfolios with immunization to provide a floor on returns. In 1986, Leibowitz also published a two-part paper defining dedicated portfolios. One of the side benefits of the theoretical work and practical interest was the development of new fixed-income instruments, such as zero-coupon bonds.
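For readers unfamiliar with the mechanics behind immunization, the sketch below is illustrative only (the bond, yield, and horizon are hypothetical and not drawn from this history): classical immunization matches the Macaulay duration of the holdings to the liability horizon so that price and reinvestment effects of a rate change roughly offset.

```python
# Hypothetical example, not a reconstruction of any portfolio discussed above.
def macaulay_duration(cash_flows, y):
    """cash_flows: (time_in_years, amount) pairs; y: annual yield, e.g. 0.08."""
    price = sum(cf / (1 + y) ** t for t, cf in cash_flows)
    return sum(t * cf / (1 + y) ** t for t, cf in cash_flows) / price

# A 5-year, 8% annual-coupon bond at an 8% yield has a duration of about 4.31 years,
# so it would (approximately) immunize a single liability due in ~4.3 years.
bond = [(t, 8.0) for t in range(1, 5)] + [(5, 108.0)]
print(macaulay_duration(bond, 0.08))
```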
|
https://en.wikipedia.org/wiki?curid=30457132
| 1,978,160 |
1,001,766 |
In 2003, "The New York Times" published an investigative article that noted "longstanding tensions" between the faculty and Principal McCaskill, "spilled into the open in October, with news reports that several teachers accused him of repeatedly sending sexually explicit e-mail messages from his school computer to staff members." While the article praised him for his addition of music and sports programs, it mostly described the principal as autocratic, controlling the school "largely through fear and intimidation," and documented acts of personal vindictiveness toward teachers; severe censorship of the student newspaper and of assigned English texts, including the refusal to let the Pulitzer Prize-finalist novel "Continental Drift" by Russell Banks be used for a class; and of bureaucratic mismanagement. A follow-up column in 2004 found that there was increased teacher exodus, specifically documenting Principal McCaskill's campaign against Alice Alcala, who described as one of the city's leading Shakespeare teachers. Alcala had won Brooklyn Tech a $10,000 grant and brought in the Royal National Theatre of Great Britain for student workshops, but after Alcala had done so, McCaskill repeatedly denied her access to the auditorium and gave her low performance rankings. Shortly after, Alcala left for Manhattan's Murry Bergtraum High School, where she brought in $1,800 in grants for Shakespeare education; meanwhile, at Brooklyn Tech, there was no longer any course solely devoted to Shakespeare, according to the column.
|
https://en.wikipedia.org/wiki?curid=378883
| 1,001,248 |
1,190,583 |
Stewart was a child actor and holder of an Equity card. His first appearance on television came in 1978, in a BBC Scotland adaptation of John Buchan's 1922 novel "Huntingtower". Amongst his contemporaries at the East Kilbride Rep Theatre was the actor John Hannah. Leaving acting behind, he studied geography and geology at Strathclyde University, graduating in 1986 with a first class honours Bachelor of Science degree. He obtained his doctorate, entitled "The evolution of neotectonic normal fault scarps in the Aegean Region" in 1990 at the University of Bristol on research into earthquakes in Greece and Turkey. In 1990 he began teaching geology at the West London Institute of Higher Education (WLIHE) in Osterley (occupying the Warden's flat with his wife for several years), and from 1995 at Brunel University due to its merger with WLIHE. After 12 years in London he moved back to Scotland to develop a new career as a science broadcaster. Nostalgic for Brunel, he said "And invariably, you move on to places that for all their benefits, seem surprisingly narrow, and more fallow, in comparison. In short, it was a remarkable place to be". He moved to the University of Plymouth in 2004, later becoming Professor of Geoscience Communication, a position he believed to be unique in the world. He left to become Jordan-UK El Hassan bin Talal Research Chair in Sustainability in 2021 on a four year secondment to the Royal Scientific Society, based in Amman.
|
https://en.wikipedia.org/wiki?curid=16807222
| 1,189,949 |
597,084 |
Heizer has since continued his exploration of the dynamics between positive forms and negative space. His "Adjacent, Against, Upon" (1976) juxtaposes three large granite slabs in different relationships to cast concrete forms; the 30-50 ton granite slabs were quarried in the Cascade Mountain Range and transported by barge and train to Myrtle Edwards Park. For "Displaced/Replaced Mass" (1969/1977), later installed outside the Marina del Rey, California, home of Roy and Carol Doumani, he planted four granite boulders of different sizes from the High Sierra into lid-less concrete boxes in the earth so that the tops of the rocks are roughly level with the ground. For a 1982 work at the former IBM Building in New York, Heizer sheared off the top of a large rock and cut grooves into the surface before setting it on supports hidden within a stainless steel structure. Designed as a fountain, the boulder appears to float over running water. He called it "Levitated Mass", a title he would use again in future. Commissioned by the president of the Ottawa Silica Company, the "Effigy Tumuli" earthwork in Illinois is composed of five abstract animal earthworks reclaiming the site of an abandoned surface coal mine along the Illinois River; the shapes (1983–85)—a frog, a water strider, a catfish, a turtle, and a snake—reflect the environment of the site, which overlooks the river.
|
https://en.wikipedia.org/wiki?curid=1038039
| 596,779 |
120,641 |
Fortunately, with new and rapidly evolving technological advancements, those in Mission Control are able to monitor the health of their astronauts more closely using telemedicine. One may not be able to completely evade the physiological effects of space flight, but they can be mitigated. For example, medical systems aboard space vessels such as the International Space Station (ISS) are well equipped and designed to counteract the effects of lack of gravity and weightlessness; on-board treadmills can help prevent muscle loss and reduce the risk of developing premature osteoporosis. Additionally, a crew medical officer is appointed for each ISS mission and a flight surgeon is available 24/7 via the ISS Mission Control Center located in Houston, Texas. Although the interactions are intended to take place in real time, communications between the space and terrestrial crew may become delayed – sometimes by as much as 20 minutes – as their distance from each other increases when the spacecraft moves further out of LEO; because of this, the crew are trained and need to be prepared to respond to any medical emergencies that may arise on the vessel, as the ground crew are hundreds of miles away. Travelling, and possibly living, in space thus poses many challenges. Many past and current concepts for the continued exploration and colonization of space focus on a return to the Moon as a "stepping stone" to the other planets, especially Mars. At the end of 2006, NASA announced it was planning to build a permanent Moon base with continual presence by 2024.
|
https://en.wikipedia.org/wiki?curid=28431
| 120,592 |
1,921,293 |
Skylab, a science and engineering laboratory, was launched into Earth orbit by a Saturn V rocket on May 14, 1973. Detailed X-ray studies of the Sun were performed. The S150 experiment performed a faint X-ray source survey. The S150 was mounted atop the SIV-B upper stage of the Saturn 1B rocket which orbited briefly behind and below Skylab on July 28, 1973. The entire SIV-B stage underwent a series of preprogrammed maneuvers, scanning about 1° every 15 seconds, to allow the instrument to sweep across selected regions of the sky. The pointing direction was determined during data processing, using the inertial guidance system of the SIV-B stage combined with information from two visible star sensors which formed part of the experiment. Galactic X-ray sources were observed with the S150 experiment. The experiment was designed to detect 4.0–10.0 nm photons. It consisted of a single large (~1500 cm²) proportional counter, electrically divided by fine wire ground planes into separate signal-collecting areas and looking through collimator vanes. The collimators defined 3 intersecting fields of view (~2° × 20°) on the sky, which allowed source positions to be determined to ~30'. The front window of the instrument consisted of a 2 µm thick plastic sheet. The counter gas was a mixture of argon and methane. Analysis of the data from the S150 experiment provided strong evidence that the soft X-ray background cannot be explained as the cumulative effect of many unresolved point sources.
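As a quick sanity check on the quoted band (illustrative only, with rounded constants; not part of the experiment documentation), the 4.0–10.0 nm wavelengths correspond to soft X-ray photon energies of roughly 310 eV down to 124 eV via E = hc/λ:

```python
# Unit conversion only; no instrument parameters are implied.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

for wavelength_nm in (4.0, 10.0):
    energy_eV = h * c / (wavelength_nm * 1e-9) / eV
    print(f"{wavelength_nm} nm -> {energy_eV:.0f} eV")  # ~310 eV and ~124 eV
```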
|
https://en.wikipedia.org/wiki?curid=25119101
| 1,920,190 |
961,617 |
Thorndike contributed a great deal to psychology. His influence on animal psychologists, especially those who focused on behavior plasticity, greatly contributed to the future of that field. In addition to helping pave the way towards behaviorism, his contribution to measurement influenced philosophy, the administration and practice of education, military administration, industrial personnel administration, civil service and many public and private social services. Thorndike influenced many schools of psychology, as Gestalt psychologists, psychologists studying the conditioned reflex, and behavioral psychologists all took his research as a starting point. Thorndike was a contemporary of John B. Watson and Ivan Pavlov. However, unlike Watson, Thorndike introduced the concept of reinforcement. Thorndike was the first to apply psychological principles to the area of learning. His research led to many theories and laws of learning. His theory of learning, especially the law of effect, is most often considered to be his greatest achievement. In 1929, Thorndike addressed his early theory of learning, and claimed that he had been wrong. After further research, he was forced to denounce his law of exercise completely, because he found that practice alone did not strengthen an association, and that time alone did not weaken an association. He also discarded half of the law of effect, after finding that while a satisfying state of affairs strengthens an association, punishment is not effective in modifying behavior. He placed a great emphasis on consequences of behavior as setting the foundation for what is and is not learned. His work represents the transition from the school of functionalism to behaviorism, and enabled psychology to focus on learning theory. Thorndike's work would eventually be a major influence on B.F. Skinner and Clark Hull. Skinner, like Thorndike, put animals in boxes and observed them to see what they were able to learn.
|
https://en.wikipedia.org/wiki?curid=84864
| 961,108 |
1,922,876 |
In the original product imaging paper, the "positions" of the ions are imaged onto a two-dimensional detector. A photolysis laser dissociates methyl iodide (CH₃I), while an ionization laser is used, by resonance-enhanced multiphoton ionization (REMPI), to ionize a particular vibrational level of the CH₃ product. Both lasers are pulsed, and the ionization laser is fired at a delay short enough that the products have not moved appreciably. Because ejection of an electron by the ionization laser does not change the recoil velocity of the CH₃ fragment, its position at any time following the photolysis is nearly the same as it would have been as a neutral. The advantage of converting it to an ion is that, by repelling it with a set of grids (represented by the vertical solid lines in the figure), one can project it onto a two-dimensional detector. The detector is a double microchannel plate consisting of two glass discs with closely packed open channels (several micrometres in diameter). A high voltage is placed across the plates. As an ion hits inside a channel, it ejects secondary electrons that are then accelerated into the walls of the channel. Since multiple electrons are ejected for each one that hits the wall, the channels act as individual particle multipliers. At the far end of the plates approximately 10⁷ electrons leave the channel for each ion that entered. Importantly, they exit from a spot right behind where the ion entered. The electrons are then accelerated to a phosphor screen, and the spots of light are recorded with a gated charge-coupled device (CCD) camera. The image collected from each pulse of the lasers is then sent to a computer, and the results of many thousands of laser pulses are accumulated to provide an image such as the one for ozone shown previously.
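A schematic sense of that final accumulation step can be given with a toy simulation (all numbers are assumed, not parameters of the original experiment): each laser shot contributes only a few ion hits, and the ring structure that encodes the fragment recoil speed becomes visible only after histogramming many thousands of shots.

```python
import numpy as np

rng = np.random.default_rng(0)
n_shots, hits_per_shot, n_pix = 20_000, 5, 256
image = np.zeros((n_pix, n_pix), dtype=np.int64)

for _ in range(n_shots):
    # Hypothetical hits drawn from a ring, mimicking fragments recoiling at one speed.
    angle = rng.uniform(0.0, 2.0 * np.pi, hits_per_shot)
    radius = rng.normal(80.0, 3.0, hits_per_shot)          # pixels; narrow speed spread
    x = (n_pix / 2 + radius * np.cos(angle)).astype(int)
    y = (n_pix / 2 + radius * np.sin(angle)).astype(int)
    np.add.at(image, (y, x), 1)                            # accumulate this shot's hits

# `image` now shows a ring that no single shot could reveal on its own.
```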
|
https://en.wikipedia.org/wiki?curid=17768388
| 1,921,773 |
1,040,299 |
With the turn of the millennium, paths towards the industrial production of BSC cells and modules started to be laid again. In 2000, Japanese manufacturer Hitachi released results of its research in BSCs with another transistor-like npn cell with 21.3% front and 19.8% rear efficiency. By 2003 Hitachi had developed BSC module technology that was licensed in 2006 to the US company Prism Solar. In 2004 a team led by Prof. Andrew Blakers at the Australian National University published its first results on the so-called Sliver BSC technology, which took the design route previously proposed by Mori and also realized at IES-UPM by Sangrador and Sala, i.e., a stack of laterally connected bifacial cells requiring no metal grids, but now with more advanced means by which thousands of cells were micromachined out of one p-type silicon wafer. The technology was later transferred to Origin Energy, which planned large-scale manufacturing for the Australian market by 2008, but this never occurred due to price pressure from Chinese competition. In 2012 Sanyo (later acquired by Panasonic) successfully launched industrial production of bifacial PV modules, based on its HIT (Heterojunction with Intrinsic Thin layer) technology. By 2010, ECN had released results of its research on BSCs, based on the by then classical pnn Back Surface Field BSC. This technology, dubbed n-PASHA, was transferred to the leading Chinese PV manufacturer Yingli by 2012, which began to commercialize the cells under the brand name Panda. Yingli was at that time the no. 1 PV producer, holding 10% of the world's shipments, and this technology transfer by ECN can be considered a milestone in the ultimate coming of age of BSCs, in which the technology was picked up by the by then mighty Chinese manufacturers mainly responsible for the steep decrease experienced by PV prices since the beginning of the 2010s.
|
https://en.wikipedia.org/wiki?curid=65342312
| 1,039,756 |
1,724,922 |
The Nestler laboratory's focus in neuropsychopharmacology and molecular neuroscience concentrates on forming a molecular approach to psychiatry and furthering the understanding of the molecular basis of both depression and drug addiction, using animal models to study the way drug use or stress affects the brain. His addiction research largely centers around several transcription factors, including ΔFosB and CREB (master control proteins that induce addiction or depression in vulnerable individuals or resistance to these syndromes in resilient individuals) and the associated epigenetic remodeling that occurs in specific neuronal or glial cell types in the brain. A major goal is to identify the ‘chromatin scars’—long lasting epigenetic changes at specific genomic loci—that mediate lifelong changes in disease vulnerability. Among the prominent targets of this work are medium spiny neurons of the nucleus accumbens and pyramidal neurons in prefrontal cortex and ventral hippocampus. The Nestler laboratory has driven innovative use of viral-mediated gene transfer, inducible, cell-type specific mutations in mice, and locus-specific epigenome editing to establish causal links between molecular and behavioral phenomena in animal models. The laboratory also makes creative use of advanced machine learning approaches to derive novel biological insight from large sequencing datasets.
|
https://en.wikipedia.org/wiki?curid=25283394
| 1,723,951 |
167,863 |
Santos-Dumont competed for the award on 12 November 1906, again in Bagatelle. He did five public flights that day: one at 10 am, of 40 metres; two others at 10:25 am, of 40 and 60 metres, when the axle of the right wheel broke. The damage was repaired during lunch and Santos-Dumont resumed at 4:09 pm. He covered 82.60 metres, surpassing the feat of 23 October and reaching 41.3 km/h. At 4:45 pm, with the day ending, he took off against the wind and flew 220 metres, for 21 seconds at an average speed of 37.4 km/h, winning the French Aeroclub Award. These were the first aeroplane flights recorded by a film company, Pathé. The Wright Brothers, after learning of the 12 November experiment, sent a letter to Captain Ferdinand Ferber asking for "exact news of the Bagatelle experiments," including "a faithful report of the trials and a description of the flying machine, accompanied by a schematic." Santos-Dumont even adopted the configuration proposed by the Wright brothers and placed the rudder at the front of the 14-bis, which he described as "the same as trying to shoot an arrow forward with the tail...". To test the idea that the rudder at the rear increased the angle of incidence of the wings, Santos-Dumont built a new aircraft, without abandoning the 14-bis, and tested it in March 1907, without taking off as the primitive landing gear did not allow it.
|
https://en.wikipedia.org/wiki?curid=152687
| 167,773 |
705,615 |
The discovery set the stage for the development of the Standard Model, as the model relied on the idea of symmetry of particles and forces and how particles can sometimes break that symmetry. The wide coverage of her discovery prompted the discoverer of fission, Otto Robert Frisch, to mention that people at Princeton would often say that her discovery was the most significant since the Michelson–Morley experiment that inspired Einstein's theory of relativity. The AAUW called it the “solution to the number-one riddle of atomic and nuclear physics.” Beyond showing that the weak interaction differs in character from the other three conventional forces, this eventually led to the recognition of general CP violation, the violation of charge conjugation parity symmetry. This violation meant researchers could distinguish matter from antimatter and pointed toward an explanation for why the universe is filled with matter: the lack of symmetry permits a matter-antimatter imbalance, which would allow matter left over from the Big Bang to exist today. In recognition of their theoretical work, Lee and Yang were awarded the Nobel Prize for Physics in 1957. Illustrating the impact of the discovery, Nobel laureate Abdus Salam quipped: "If any classical writer had ever considered giants (cyclops) with only the left eye. [One] would confess that one-eyed giants have been described and [would have] supplied me with a full list of them; but they always sport their solitary eye in the middle of the forehead. In my view what we have found is that space is a weak left-eyed giant." Wu's discovery would pave the way for the unified electroweak force that Salam helped establish, which is theorized to merge with the strong force into a new, broader model, a Grand Unified Theory.
|
https://en.wikipedia.org/wiki?curid=36284621
| 705,246 |
964,190 |
The concept of embodiment has been inspired by research in cognitive neuroscience, such as the proposals of Gerald Edelman concerning how mathematical and computational models such as neuronal group selection and neural degeneracy result in emergent categorization. From a neuroscientific perspective, the embodied cognition theory examines the interaction of sensorimotor, cognitive, and affective neurological systems. The embodied mind thesis is compatible with some views of cognition promoted in neuropsychology, such as the theories of consciousness of Vilayanur S. Ramachandran, Gerald Edelman, and Antonio Damasio. It is also supported by a broad and ever-increasing collection of empirical studies within neuroscience. By examining brain activity with neuroimaging techniques, researchers have found indications of embodiment. In an electroencephalography (EEG) study, researchers showed that, in line with the embodied cognition, sensorimotor contingency, and common coding theses, sensory and motor processes in the brain are not sequentially separated but strongly coupled. Considering the interaction of the sensorimotor and cognitive system, a study from 2005 stresses how crucial sensorimotor cortices are for semantic comprehension of body–action terms and sentences. A functional magnetic resonance imaging (fMRI) study from 2004 showed that passively read action words, such as lick, pick or kick, led to somatotopic neuronal activity in or adjacent to brain regions associated with actual movement of the respective body parts. Using transcranial magnetic stimulation (TMS), a study in 2005 showed that the activity of the motor system is coupled to auditory action-related sentences. When the participants listened to hand- or foot-related sentences, the motor evoked potentials (MEPs) recorded from the hand and foot muscles were reduced. These two exemplary studies indicate a relationship between cognitively understanding words referring to sensorimotor concepts and activation of sensorimotor cortices. Along the lines of embodiment, neuroimaging techniques serve to show interactions of the sensory and motor systems.
|
https://en.wikipedia.org/wiki?curid=33034640
| 963,681 |
193,610 |
Vestigial structures are often homologous to structures that are functioning normally in other species. Therefore, vestigial structures can be considered evidence for evolution, the process by which beneficial heritable traits arise in populations over an extended period of time. The existence of vestigial traits can be attributed to changes in the environment and behavior patterns of the organism in question. Through an examination of these various traits, it is clear that evolution has played a major role in the development of organisms. Every anatomical structure or behavioral response has origins in which it was, at one time, useful. As time progressed, the ancient common ancestor organisms changed as well, with natural selection playing a large role. More advantageous structures were selected, while others were not. In the process, some traits fell by the wayside. As the function of the trait is no longer beneficial for survival, the likelihood that future offspring will inherit the "normal" form of it decreases. In some cases, the structure becomes detrimental to the organism (for example the eyes of a mole can become infected). In many cases the structure is of no direct harm, yet all structures require extra energy in terms of development, maintenance, and weight, and are also a risk in terms of disease (e.g., infection, cancer), providing some selective pressure for the removal of parts that do not contribute to an organism's fitness. A structure that is not harmful will take longer to be 'phased out' than one that is. However, some vestigial structures may persist due to limitations in development, such that complete loss of the structure could not occur without major alterations of the organism's developmental pattern, and such alterations would likely produce numerous negative side-effects. The toes of many animals such as horses, which stand on a single toe, persist in a vestigial form and may, although rarely, become apparent in individuals.
|
https://en.wikipedia.org/wiki?curid=308981
| 193,510 |
956,993 |
Current US policy states: "Autonomous … weapons systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force." However, the policy requires that autonomous weapon systems that kill people or use kinetic force, selecting and engaging targets without further human intervention, be certified as compliant with "appropriate levels" and other standards, not that such weapon systems cannot meet these standards and are therefore forbidden. "Semi-autonomous" hunter-killers that autonomously identify and attack targets do not even require certification. Deputy Defense Secretary Robert Work said in 2016 that the Defense Department would "not delegate lethal authority to a machine to make a decision", but might need to reconsider this since "authoritarian regimes" may do so. In October 2016 President Barack Obama stated that early in his career he was wary of a future in which a US president making use of drone warfare could "carry on perpetual wars all over the world, and a lot of them covert, without any accountability or democratic debate". In the US, security-related AI has fallen under the purview of the National Security Commission on Artificial Intelligence since 2018. On October 31, 2019, the United States Department of Defense's Defense Innovation Board published the draft of a report outlining five principles for weaponized AI and making 12 recommendations for the ethical use of artificial intelligence by the Department of Defense that would ensure a human operator would always be able to look into the 'black box' and understand the kill-chain process. A major concern is how the report will be implemented.
|
https://en.wikipedia.org/wiki?curid=36973733
| 956,487 |
1,980,958 |
Foerster's student years occurred at a time when neurology was starting to develop independently of internal medicine and psychiatry under the influence of, among others, Jean-Martin Charcot (1825-1893), Wilhelm Heinrich Erb, William Richard Gowers (1845-1912) and particularly Karl Wernicke, who became well known for his functional localization approach. Working with Wernicke stimulated Foerster's great interest in the anatomy of the central nervous system, and in 1903 the two researchers published together an anatomical atlas of the brain ("Atlas des Gehirns"). At the time, the various schools of neurology focused essentially on diagnosis, because achieving an effective therapy was hardly possible. It was Foerster who took forward the idea of using physical therapy as a new way of treating patients with neurological disturbances. From this work arose his theoretical interest in disturbances of motor coordination in the execution of movements; this became the topic of his dissertation in 1902, a work that later gained renewed relevance with the systematic introduction of rehabilitation medicine into neurology. The involvement of spinal reflexes in the genesis of muscular spasticity suggested its possible treatment by surgical interruption of the sensory branch of the thoracic and lumbar nerves (rhizotomy), and in 1908 Foerster developed an operation to cut the posterior sensory root in order to alleviate spasticity.
|
https://en.wikipedia.org/wiki?curid=2649305
| 1,979,820 |
166,882 |
A second series of approaches was designed and patented by Leonard R. Kahn. The various Kahn systems removed the hard limit imposed by the use of the strict log function in the generation of the signal. Earlier Kahn systems utilized various methods to reduce the second-order term through the insertion of a predistortion component. One example of this method was also used to generate one of the Kahn independent-sideband (ISB) AM stereo signals. It was known as the STR-77 exciter method, having been introduced in 1977. Later, the system was further improved by use of an arcsine-based modulator that included a (1 - 0.52E) term in the denominator of the arcsine generator equation. E represents the envelope term; roughly half the modulation term applied to the envelope modulator is utilized to reduce the second-order term of the arcsine "phase"-modulated path, thus reducing the second-order term in the undesired sideband. A multi-loop modulator/demodulator feedback approach was used to generate an accurate arcsine signal. This approach was introduced in 1984 and became known as the STR-84 method. It was sold by Kahn Research Laboratories, later Kahn Communications, Inc. of NY. An additional audio processing device further improved the sideband structure by selectively applying pre-emphasis to the modulating signals. Since the envelope of all the signals described remains an exact copy of the information applied to the modulator, it can be demodulated without distortion by an envelope detector such as a simple diode. In a practical receiver, some distortion may be present, usually at a low level (in AM broadcast, always below 5%), due to sharp filtering and nonlinear group delay in the IF filters of the receiver, which act to truncate the compatibility sideband (those terms that are not the result of a linear process of simply envelope modulating the signal, as would be the case in full-carrier DSB-AM) and to rotate the phase of these compatibility terms such that they no longer cancel the quadrature distortion term caused by a first-order SSB term along with the carrier. The small amount of distortion caused by this effect is generally quite low and acceptable.
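The central point, that the envelope remains an exact copy of the modulating audio and can therefore be recovered by a simple envelope detector, can be illustrated with a small sketch. This is not Kahn's patented modulator: ordinary full-carrier AM and a software Hilbert-transform envelope stand in for the hardware, and all frequencies are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert

fs = 48_000
t = np.arange(0, 0.02, 1 / fs)
audio = 0.5 * np.sin(2 * np.pi * 300 * t)      # hypothetical modulating tone
carrier = np.cos(2 * np.pi * 12_000 * t)       # carrier well above the audio band
am = (1 + audio) * carrier                     # envelope is an exact copy of 1 + audio

# Software analogue of a diode envelope detector: magnitude of the analytic signal.
envelope = np.abs(hilbert(am))                 # ~ 1 + audio
recovered = envelope - envelope.mean()         # remove the DC (carrier) term
```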
|
https://en.wikipedia.org/wiki?curid=29048
| 166,795 |
1,950,961 |
"Ergo", the indications are that Matthew constructed the book iteratively, producing this third instalment, that includes the evolutionary exposition (p. 381), plus a further "colophon" ("In taking a retrospective glance at our pages ..." p. 388) that provides a better description of the technique to produce "knees" from split stems and roots. He also mentions his oversight of omitting instructions for bending planks, but then informs the reader that nature mostly takes care of that anyway (p. 390). The final sentence in that section is another one of those potential sources for confusion, as it states, "We regret that our allusion to the lamented Mr Huskisson was printed off before we knew of his death". This comment supports the retrospection in other sections that further implies later additions to the book. Seeking out Huskisson's earlier mention, he appears in passing within the section "Concerning Our Marine" (in Part III: "Miscellaneous Matter Connected With Naval Timber", p. 130), "Can it be believed that our very liberal late minister (Mr Huskisson), and our very non-liberal member for Newark (Mr Sadler), have both made a full exposè of the distresses of our shipping interest ..." (p. 136). To avoid misunderstanding, "late" is used here not in reference to no longer being alive, but rather to no longer having his role in government: he resigned in this case. It was in particularly frequent use during 1830 in the context of the aftermath of the July Revolution in France; for example, an editorial in the Perthshire Courier of 2 September 1830 remarked "To appease the popular fury, it is almost certain that the lives of some of the late Ministers are to be sacrificed". The very last section of "colophon" follows a bar and line break ("Since this volume went to press").
|
https://en.wikipedia.org/wiki?curid=54486030
| 1,949,840 |
1,668,869 |
Canada entered World War I on August 4, 1914, and soon the need for anti-toxins against tetanus became an urgent wartime matter. Robert Defries, who joined the Antitoxin Lab in 1915, proposed that its operations be expanded to treat Canadian soldiers fighting trench warfare, for "not a fraction of the necessary amount [of tetanus antitoxin] was available". The proposal was quickly approved by the University of Toronto Board of Governors despite the lack of immediate funds, with University President Robert Falconer making an appeal to Prime Minister Robert Borden and the Ottawa administration. Shortly after, whisky magnate and then-Chairman of the Red Cross Society Colonel Albert Gooderham funded an operation to equip the Lab with the capacity to produce a Canadian supply by August 1915. Gooderham had seen a need for anti-toxins against tetanus, which had become an urgent wartime matter for the Canadian Expeditionary Force. He was impressed by the fledgling laboratory's capacity to produce the tetanus antitoxin in controlled conditions and at a lower price than the American sources that the Red Cross had initially contacted. He turned over a 58-acre farm on Dufferin Street north of Toronto for a new production plant on condition that it be named after the Duke of Connaught, then Governor-General of Canada (1911-1916). In February 1916, the Ontario Board of Health began to distribute the Antitoxin Laboratory's products for free across the province. Other provincial governments soon followed suit, starting in Saskatchewan.
|
https://en.wikipedia.org/wiki?curid=61019378
| 1,667,929 |
366,165 |
Paracelsus's lectures at Basel University were, unusually, held in German, not Latin. He stated that he wanted his lectures to be available to everyone. He also published harsh criticism of the Basel physicians and apothecaries, creating political turmoil to the point of his life being threatened. In a display of his contempt for conventional medicine, Paracelsus publicly burned editions of the works of Galen and Avicenna. On 23 June 1527 he burnt a copy of Avicenna's "Canon of Medicine", an enormous tome that was a pillar of academic study, in the market square. He was prone to many outbursts of abusive language, abhorred untested theory, and ridiculed anybody who placed more importance on titles than practice ('if disease put us to the test, all our splendour, title, ring, and name will be as much help as a horse's tail'). During his time as a professor at the University of Basel, he invited barber-surgeons, alchemists, apothecaries, and others lacking academic background to serve as examples of his belief that only those who practised an art knew it: 'The patients are your textbook, the sickbed is your study.' Paracelsus was compared with Martin Luther because of his openly defiant acts against the existing authorities in medicine. Paracelsus rejected that comparison. Paracelsus famously said, "I leave it to Luther to defend what he says and I will be responsible for what I say. That which you wish to Luther, you wish also to me: You wish us both in the fire." A companion during the Basel years expressed a quite unflattering opinion of Paracelsus: "The two years I passed in his company he spent in drinking and gluttony, day and night. He could not be found sober an hour or two together, in particular after his departure from Basle". Being threatened with an unwinnable lawsuit, he left Basel for Alsace in February 1528.
|
https://en.wikipedia.org/wiki?curid=152487
| 365,973 |
1,177,108 |
A key paper describing OSeMOSYS is available. A 2011 study uses OSeMOSYS to investigate the role of household investment decisions. A 2012 study extends OSeMOSYS to capture the salient features of a smart grid. The paper explains how to model variability in generation, flexible demand, and grid storage and how these impact on the stability of the grid. OSeMOSYS has been applied to village systems. A 2015 paper compares the merits of stand-alone, mini-grid, and grid electrification for rural areas in Timor-Leste under differing levels of access. In a 2016 study, OSeMOSYS is modified to take into account realistic consumer behavior. Another 2016 study uses OSeMOSYS to build a local multi-regional energy system model of the Lombardy region in Italy. One of the aims of the exercise was to encourage citizens to participate in the energy planning process. Preliminary results indicate that this was successful and that open modeling is needed to properly include both the technological dynamics and the non-technological issues. A 2017 paper covering Alberta, Canada factors in the risk of overrunning specified emissions targets because of technological uncertainty. Among other results, the paper finds that solar and wind technologies are built out seven and five years earlier respectively when emissions risks are included. Another 2017 paper analyses the electricity system in Cyprus and finds that, after European Union environmental regulations are applied post-2020, a switch from oil-fired to natural gas generation is indicated.
|
https://en.wikipedia.org/wiki?curid=38803848
| 1,176,485 |
81,549 |
LTE was first proposed in 2004 by Japan's NTT Docomo, with studies on the standard officially commencing in 2005. In May 2007, the LTE/SAE Trial Initiative (LSTI) alliance was founded as a global collaboration between vendors and operators with the goal of verifying and promoting the new standard in order to ensure the global introduction of the technology as quickly as possible. The LTE standard was finalized in December 2008, and the first publicly available LTE service was launched by TeliaSonera in Oslo and Stockholm on December 14, 2009, as a data connection with a USB modem. LTE services were launched by major North American carriers as well, with the Samsung SCH-r900 being the world's first LTE mobile phone, available from September 21, 2010, and the Samsung Galaxy Indulge being the world's first LTE smartphone, available from February 10, 2011, both offered by MetroPCS; the HTC ThunderBolt, offered by Verizon from March 17, was the second LTE smartphone to be sold commercially. In Canada, Rogers Wireless was the first to launch an LTE network, on July 7, 2011, offering the Sierra Wireless AirCard 313U USB mobile broadband modem, known as the "LTE Rocket stick", followed closely by mobile devices from both HTC and Samsung. Initially, CDMA operators planned to upgrade to rival standards called UMB and WiMAX, but major CDMA operators (such as Verizon, Sprint and MetroPCS in the United States, Bell and Telus in Canada, au by KDDI in Japan, SK Telecom in South Korea and China Telecom/China Unicom in China) announced instead that they intended to migrate to LTE. The next version of LTE is LTE Advanced, which was standardized in March 2011, with services commencing in 2013. A further evolution, known as LTE Advanced Pro, was approved in 2015.
|
https://en.wikipedia.org/wiki?curid=19752979
| 81,516 |
1,142,878 |
Of all described eukaryote species, almost one third (about 500,000) are herbivorous insects. They feed on living plant matter or the products of a plant. These insects may eat essential parts of the plant, such as the leaves or sap, or they may survive on the pollen and nectar produced by the plant. Herbivorous insects often use olfactory or visual cues to identify a potential host plant. A visual cue could simply be the outline of a certain type of leaf, or the high contrast between the petals of a flower and the leaves surrounding it. These are typically associated with the olfactory signal an insect may receive from its intended meal. The olfactory cue could be the scent of the nectar produced by a flower, a certain chemical excreted to repel unwanted predators, or the exposed sap of a cherry tree. Either of these two senses could be the driving force behind an insect choosing to consume a certain plant, but it is only after it takes the first bite, and its sense of taste confirms the food, that it truly feeds. After a herbivorous insect is finished feeding on a plant, it will either wait there until hungry again, or move on to another task, be it finding more food, a mate, or shelter. Herbivorous insects pose significantly more danger to a plant than consumption alone; they are among the most prominent disease-carrying creatures in the insect world. There are numerous diseases, fungi, and parasites that can be carried by nearly any herbivorous insect, many of which are fatal to the infected plant. Some diseases even produce a sweet-smelling, sticky secretion from the infected plant to attract more insects and spread farther. In return, plants have their own defenses. Some of these defenses are toxic secondary metabolites that deter insects. These toxins limit the diet breadth of herbivores, and evolving mechanisms to nonetheless continue herbivory is an important part of maintaining diet breadth in insects, and so in their evolutionary history as a whole. Both pleiotropy and epistasis have complex effects in this regard, with the simulations of Griswold 2006 showing that more genes provide the benefit of more targets for adaptive mutations, while Fisher 1930 showed that a mutation can improve one trait while epistasis causes it to also trigger negative effects, slowing down adaptation.
|
https://en.wikipedia.org/wiki?curid=15367337
| 1,142,281 |
1,848,772 |
The Kyrgyz economy is oriented primarily towards agricultural production, mineral extraction, textiles and the service industry. There is little incentive to create knowledge- and technology-based industries. The insufficient rate of capital accumulation also hampers structural changes designed to boost innovation and technology-intensive industries. Every key economic sector is technologically dependent on other countries. In the energy sector, for instance, all technological equipment is imported from abroad and many of its assets are in foreign hands. In 2013 and 2014, three partly state-owned Russian companies invested in Kyrgyzstan's hydropower, oil and gas industries. In 2013, RusHydro began building the first of a series of hydroelectric dams that it will manage. In February 2014, Rosneft signed a framework agreement to buy 100% of Bishkek Oil and a 50% stake in the sole aviation fuel provider at the country's second-biggest airport, Osh International. The same year, Gazprom came closer to acquiring 100% of Kyrgyzgaz, which operates the country's natural gas network. In return for a symbolic investment of US$1, Gazprom undertook to assume US$40 million in debt and invest 20 billion rubles ("circa" US$551 million) in modernizing Kyrgyz gas pipelines between 2014 and 2019. Gazprom already provides most of the country's aviation fuel and has a 70% share in the retail gasoline market.
|
https://en.wikipedia.org/wiki?curid=54244683
| 1,847,714 |
1,903,460 |
TS-DFT is usually used in a two-step process. In the first step, the spectrum of an optical broadband pulse is encoded by the information (e.g., temporal, spatial, or chemical information) to be captured. In the next step, the encoded spectrum is mapped by large group-velocity dispersion into a slowed temporal waveform. At this point the waveform has been slowed enough that it can be digitized and processed in real time. Without the time stretch, single-shot waveforms would be too fast to be digitized by analog-to-digital converters. Implemented in the optical domain, this process performs a function similar to the slow motion used to see fast events in videos. While video slow motion is a simple process of playing back an already recorded event, the TS-DFT performs slow motion at the speed of light and before the signal is captured. When needed, the waveform is simultaneously amplified in the dispersive fiber by the process of stimulated Raman scattering. This optical amplification overcomes the thermal noise which would otherwise limit the sensitivity in real-time detection. Subsequent optical pulses perform repetitive measurements at the frame rate of the pulsed laser. Consequently, single-shot optical spectra, carrying information from fast dynamic processes, can be digitized and analyzed at high frame rates. The time-stretch dispersive Fourier transformer consists of a low-loss dispersive fiber that is also a Raman amplifier. To create Raman gain, pump lasers are coupled into the fiber by wavelength-division multiplexers, with wavelengths of pump lasers chosen to create a broadband and flat gain profile that covers the spectrum of the broadband optical pulse. Instead of Raman amplification, a discrete amplifier such as an erbium-doped optical amplifier or a semiconductor optical amplifier can be placed before the dispersive fiber. However, the distributed nature of Raman amplification provides a superior signal-to-noise ratio. Dispersive Fourier Transform has proven to be an enabling technology for wideband A/D conversion (ultra-wideband analog-to-digital converters) and has also been used for high-throughput real-time spectroscopy and imaging (serial time-encoded amplified microscopy (STEAM)).
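A minimal numerical sketch of the frequency-to-time mapping at the heart of TS-DFT is given below. All pulse, dispersion, and "information" parameters are assumptions chosen for illustration, not values from any particular instrument: information written onto the spectrum of a short pulse reappears, after large group-velocity dispersion, as features in a slowed temporal waveform that a digitizer could capture.

```python
import numpy as np

N = 2 ** 16
t = np.linspace(-2e-9, 2e-9, N)                 # time grid, s
dt = t[1] - t[0]
omega = 2 * np.pi * np.fft.fftfreq(N, dt)       # angular frequency grid, rad/s

E_in = np.exp(-0.5 * (t / 100e-15) ** 2)        # 100 fs broadband pulse envelope
E_w = np.fft.fft(E_in)
for w0 in (2 * np.pi * 0.5e12, -2 * np.pi * 1.0e12):   # encode two spectral dips
    E_w *= 1 - 0.8 * np.exp(-0.5 * ((omega - w0) / (2 * np.pi * 0.05e12)) ** 2)
spectrum = np.abs(E_w) ** 2

beta2_L = 2e-22                                  # total dispersion, s^2 (assumed)
E_out = np.fft.ifft(E_w * np.exp(0.5j * beta2_L * omega ** 2))
stretched = np.abs(E_out) ** 2                   # slowed waveform a digitizer could record
# Each spectral dip at offset w reappears in `stretched` near t = -beta2_L * w
# (the sign depends on the FFT convention), i.e. the spectrum is mapped into time.
```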
|
https://en.wikipedia.org/wiki?curid=34270042
| 1,902,366 |
778,686 |
The Air Ministry still saw little immediate value in the effort (they regarded it as long-range research), and having no production facilities of its own, Power Jets entered into an agreement with steam turbine specialists British Thomson-Houston (BTH) to build an experimental engine facility at a BTH factory in Rugby, Warwickshire. Work progressed quickly, and by the end of the year 1936 the prototype detail design was finalised and parts for it were well on their way to being completed, all within the original £2,000 budget. However, by 1936, Germany had also started working on jet engines (Herbert A. Wagner at Junkers and Hans von Ohain at Heinkel) and, although they too had difficulty overcoming conservatism, the German Ministry of Aviation (Reichsluftfahrtministerium) was more supportive than their British counterpart. Von Ohain applied for a patent for a turbojet engine in 1935 but having earlier reviewed and critiqued Whittle's patents, had to narrow the scope of his own filing. In Spain, air-force pilot and engineer Virgilio Leret Ruiz had been granted a patent for a jet engine in March 1935, and Republican president Manuel Azaña arranged for initial construction at the Hispano-Suiza aircraft factory in Madrid in 1936, but Leret was executed months later by Francoist Moroccan troops after commanding the defence of his seaplane base near Melilla at the onset of the Spanish Civil War. His plans were hidden from the Francoists and secretly handed to the British embassy in Madrid a few years later when his wife, Carlota O'Neill, was released from prison.
|
https://en.wikipedia.org/wiki?curid=42707
| 778,269 |
953,008 |
The earliest documented design of pressure vessels was described in 1495 in the book by Leonardo da Vinci, the Codex Madrid I, in which containers of pressurized air were theorized to lift heavy weights underwater. However, vessels resembling those used today did not come about until the 1800s, when steam was generated in boilers, helping to spur the industrial revolution. But with poor material quality and manufacturing techniques, along with improper knowledge of design, operation and maintenance, there were a large number of damaging and often deadly explosions associated with these boilers and pressure vessels, with a death occurring on a nearly daily basis in the United States. Local provinces and states in the US began enacting rules for constructing these vessels after some particularly devastating vessel failures occurred, killing dozens of people at a time, which made it difficult for manufacturers to keep up with the varied rules from one location to another. The first pressure vessel code was developed starting in 1911 and released in 1914, starting the ASME Boiler and Pressure Vessel Code (BPVC). In an early effort to design a tank capable of withstanding pressures up to , a diameter tank was developed in 1919 that was spirally-wound with two layers of high tensile strength steel wire to prevent sidewall rupture, and the end caps longitudinally reinforced with lengthwise high-tensile rods. The need for high pressure and temperature vessels for petroleum refineries and chemical plants gave rise to vessels joined with welding instead of rivets (which were unsuitable for the pressures and temperatures required) and in the 1920s and 1930s the BPVC included welding as an acceptable means of construction; welding is the main means of joining metal vessels today.
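To give a sense of why such sidewall reinforcement matters, the thin-wall hoop stress grows with pressure and radius and shrinks with wall thickness. The numbers below are a back-of-the-envelope illustration, not figures for the 1919 tank.

```python
# Hypothetical values only; sigma = P * r / t is the standard thin-wall hoop stress.
def hoop_stress(pressure_pa: float, inner_radius_m: float, wall_thickness_m: float) -> float:
    """Thin-wall approximation, reasonable when wall thickness << radius."""
    return pressure_pa * inner_radius_m / wall_thickness_m

# Example: 10 MPa internal pressure, 0.6 m radius, 25 mm wall -> ~240 MPa,
# already close to the yield strength of a mild steel, hence wound-wire reinforcement.
print(hoop_stress(10e6, 0.6, 0.025))
```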
|
https://en.wikipedia.org/wiki?curid=636219
| 952,503 |
384,156 |
In September 2014, Richard Branson described the intended date for the first commercial flight as February or March 2015; by the time of this announcement, a new plastic-based fuel had yet to be ignited in-flight. By September 2014, the three test flights of the SS2 had only reached an altitude of around 71,000 ft, approximately 13 miles; in order to receive a Federal Aviation Administration licence to carry passengers, the craft needs to complete test missions at full speed and 62-mile height. Following the announcement of further delays, UK newspaper "The Sunday Times" reported that Branson faced a backlash from those who had booked flights with Virgin Galactic, with the company having received $80 million in fares and deposits. Tom Bower, author of "Branson: The Man behind the Mask", told the "Sunday Times": "They spent 10 years trying to perfect one engine and failed. They are now trying to use a different engine and get into space in six months. It's just not feasible." BBC science editor David Shukman commented in October 2014, that "[Branson's] enthusiasm and determination [are] undoubted. But his most recent promises of launching the first passenger trip by the end of this year had already started to look unrealistic some months ago."
|
https://en.wikipedia.org/wiki?curid=1021879
| 383,961 |
8,525 |
Also in 2015, Airbus flight tested a package of aerodynamic upgrades for the Eurofighter known as the Aerodynamic Modification Kit (AMK) consisting of reshaped (delta) fuselage strakes, extended trailing-edge flaperons and leading-edge root extensions. This increases wing lift by 25% resulting in an increased turn rate, tighter turning radius, and improved nose-pointing ability at low speed with angle of attack values around 45% greater and roll rates up to 100% higher. Eurofighter's Laurie Hilditch said these improvements should increase subsonic turn rate by 15% and give the Eurofighter the sort of "knife-fight in a phone box" turning capability enjoyed by rivals such as Boeing's F/A-18E/F or the Lockheed Martin F-16, without sacrificing the transonic and supersonic high-energy agility inherent to its delta wing-canard configuration. Eurofighter Project Pilot Germany Raffaele Beltrame said: "The handling qualities appeared to be markedly improved, providing more manoeuvrability, agility and precision while performing tasks representative of in-service operations. And it is extremely interesting to consider the potential benefits in the air-to-surface configuration thanks to the increased variety and flexibility of stores that can be carried."
|
https://en.wikipedia.org/wiki?curid=167667
| 8,522 |
884,036 |
Protein folding does not occur in one step. Instead, proteins spend most of their folding time, nearly 96% in some cases, "waiting" in various intermediate conformational states, each a local thermodynamic free energy minimum in the protein's energy landscape. Through a process known as adaptive sampling, these conformations are used by Folding@home as starting points for a set of simulation trajectories. As the simulations discover more conformations, the trajectories are restarted from them, and a Markov state model (MSM) is gradually created from this cyclic process. MSMs are discrete-time master equation models which describe a biomolecule's conformational and energy landscape as a set of distinct structures and the short transitions between them. The adaptive sampling Markov state model method significantly increases the efficiency of simulation as it avoids computation inside the local energy minimum itself, and is amenable to volunteer computing (including on GPUGRID) as it allows for the statistical aggregation of short, independent simulation trajectories. The amount of time it takes to construct a Markov state model is inversely proportional to the number of parallel simulations run, i.e., the number of processors available. In other words, it achieves linear parallelization, leading to an approximately four orders of magnitude reduction in overall serial calculation time. A completed MSM may contain tens of thousands of sample states from the protein's phase space (all the conformations a protein can take on) and the transitions between them. The model illustrates folding events and pathways (i.e., routes) and researchers can later use kinetic clustering to view a coarse-grained representation of the otherwise highly detailed model. They can use these MSMs to reveal how proteins misfold and to quantitatively compare simulations with experiments.
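A simplified sketch of how such a discrete-time Markov state model can be assembled from many short trajectories is shown below; the state labels, lag time, and toy data are assumptions for illustration and do not reproduce Folding@home's actual pipeline.

```python
import numpy as np

# Each trajectory is a sequence of conformational-state labels; transitions at one
# lag time are counted and row-normalized into a transition probability matrix.
def build_msm(trajectories, n_states, lag=1):
    counts = np.zeros((n_states, n_states))
    for traj in trajectories:
        for i in range(len(traj) - lag):
            counts[traj[i], traj[i + lag]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # States never left keep an identity row (self-transition of 1).
    return np.divide(counts, row_sums, out=np.eye(n_states), where=row_sums > 0)

# Toy usage: three short trajectories over four states; adaptive sampling would then
# seed new simulations preferentially from poorly sampled states.
T = build_msm([[0, 0, 1, 2], [1, 2, 2, 3], [0, 1, 1, 2]], n_states=4)
```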
|
https://en.wikipedia.org/wiki?curid=413102
| 883,572 |
449,096 |
Post-processual archaeology began in Britain during the late 1970s, spearheaded by a number of British archaeologists who had become interested in aspects of French Marxist anthropology. Most prominent among these was Ian Hodder (born 1948), a former processualist who had made a name for himself for his economic analysis of spatial patterns and early development of simulation studies, particularly relating to trade, markets and urbanization in Iron Age and Roman Britain. Having been influenced by the "New Geography" and the work of the processualist David Clarke, as his research progressed, he became increasingly sceptical that such models and simulations actually tested or proved anything, coming to the conclusion that a particular pattern in the archaeological record could be produced by a number of different simulated processes, and that there was no way to accurately test which of these alternatives was correct. In effect, he came to believe that even using the processual approach to understanding archaeological data, there were still many different ways that that data could be interpreted, and that therefore radically different conclusions could be put forward by different archaeologists, despite processualism's claim that using the scientific method it could gain objective fact from the archaeological record. As a result of this, Hodder grew increasingly critical of the processualist approach, developing an interest in how culture shaped human behaviour. He was supported in this new endeavour by many of his students, including Matthew Spriggs.
|
https://en.wikipedia.org/wiki?curid=160139
| 448,878 |
550,271 |
Mark Humayun, who joined the faculty of the Keck School of Medicine of USC Department of Ophthalmology in 2001; Eugene Dejuan, now at the University of California San Francisco; engineer Howard D. Phillips; bio-electronics engineer Wentai Liu, now at University of California Los Angeles; and Robert Greenberg, now of Second Sight, were the original inventors of the active epi-retinal prosthesis and demonstrated proof of principle in acute patient investigations at Johns Hopkins University in the early 1990s. In the late 1990s the company Second Sight was formed by Greenberg along with medical device entrepreneur Alfred E. Mann. Their first-generation implant had 16 electrodes and was implanted in six subjects by Humayun at the University of Southern California between 2002 and 2004. In 2007, the company began a trial of its second-generation, 60-electrode implant, dubbed the Argus II, in the US and in Europe. In total 30 subjects participated in the studies, spanning 10 sites in four countries. In the spring of 2011, based on the results of the clinical study (which were published in 2012), Argus II was approved for commercial use in Europe, and Second Sight launched the product later that same year. The Argus II was approved by the United States FDA on 14 February 2013. Three US government funding agencies (National Eye Institute, Department of Energy, and National Science Foundation) have supported the work at Second Sight, USC, UCSC, Caltech, and other research labs.
|
https://en.wikipedia.org/wiki?curid=17198736
| 549,983 |
1,457,067 |
Individuals are far more likely to part with personal information when it seems that they will have some sort of control over the use of the information or if the information is given to an entity with which they already have an established relationship. In these specific circumstances, subjects are more inclined to believe that their information has been collected for pure collection's sake. An entity may also offer goods or services in exchange for the client's personal information. This type of collection method may seem valuable to a user because the transaction appears to be free in the monetary sense. This forms a type of social contract between the entity offering the goods or services and the client. The client may continue to uphold their side of the contract as long as the company continues to provide them with a good or service that they deem worthy. The concept of procedural fairness describes an individual's perception of fairness in a given scenario. Circumstances that contribute to procedural fairness include providing the customer with the ability to voice their concerns or input, and control over the outcome of the contract. Best practice for any company collecting information from customers is to consider procedural fairness. This concept is a key component of ethical consumer marketing and is the basis of United States privacy laws, the European Union's privacy directive from 1995, and the Clinton Administration's June 1995 guidelines for personal information use by all National Information Infrastructure participants. Allowing an individual to remove their name from a mailing list is considered a best practice in information collecting. In a few Equifax surveys conducted in the years 1994–1996, it was found that a substantial portion of the American public was concerned about business practices that use private consumer information, believing that they cause more harm than good. Throughout the course of a customer-company relationship, the company can accumulate a plethora of information about its customer. With data processing technology flourishing, a company can craft specific marketing campaigns for each of its individual customers. Data collection and surveillance infrastructure has allowed companies to micro-target specific groups and tailor advertisements for certain populations.
|
https://en.wikipedia.org/wiki?curid=4168072
| 1,456,247 |
1,338,735 |
Ross Anderson of Cambridge University was among the most vocal critics of NGSCB and of Trusted Computing. Anderson alleged that the technologies were designed to satisfy federal agency requirements; enable content providers and other third parties to remotely monitor or delete data in users' machines; use certificate revocation lists to ensure that only content deemed "legitimate" could be copied; and use unique identifiers to revoke or validate files; he compared this to the attempts by the Soviet Union to "register and control all typewriters and fax machines." Anderson also claimed that the TPM could control the execution of applications on a user's machine and, because of this, bestowed on it the derisive "Fritz Chip" name in reference to United States Senator Ernest "Fritz" Hollings, who had recently proposed DRM legislation such as the Consumer Broadband and Digital Television Promotion Act for consumer electronic devices. Anderson's report was referenced extensively in the news media and appeared in publications such as BBC News, "The New York Times", and "The Register". David Safford of IBM Research stated that Anderson's report contained several technical errors, namely that the proposed capabilities did not exist within any specification and that many were beyond the scope of trusted platform design. Anderson later alleged that BitLocker was designed to facilitate DRM and to lock out competing software on an encrypted system, and, in spite of his allegation that NGSCB was designed for federal agencies, advocated for Microsoft to add a backdoor to BitLocker. Similar sentiments were expressed by Richard Stallman, founder of the GNU Project and Free Software Foundation, who alleged that Trusted Computing technologies were designed to enforce DRM and to prevent users from running unlicensed software. In 2015, Stallman stated that "the TPM has proved a total failure" for DRM and that "there are reasons to think that it will not be feasible to use them for DRM."
|
https://en.wikipedia.org/wiki?curid=59524
| 1,338,002 |
1,073,585 |
Molecular dynamics (MD) has been a very powerful technique for investigating nanoindentation at the atomic scale. For instance, Alexey et al. employed MD to simulate the nanoindentation of a titanium crystal, observing a dependence of the deformation of the crystalline structure on the type of indenter, which is very hard to capture in experiment. Tao et al. performed MD simulations of nanoindentation on Cu/Ni nanotwinned multilayer films using a spherical indenter and investigated the effects of the hetero-twin interface and twin thickness on hardness. Recently, a review paper by Carlos et al. was published on atomistic studies of nanoindentation. This review covers different nanoindentation mechanisms and the effects of surface orientation, crystallography (fcc, bcc, hcp, etc.), and surface and bulk damage on plasticity. All of these MD-obtained results are very difficult to achieve in experiment due to the resolution limits of structural characterization techniques. Among the various MD simulation packages, such as GROMACS, Xenoview and Amber, LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator), developed by Sandia National Laboratories, is the most widely used for such simulations. An interaction potential and an input file containing information such as atom IDs, coordinates, charges, the ensemble and the time step are fed to the simulator, and the run can then be executed. After the specified number of timesteps, information such as energies, atomic trajectories and structural quantities (such as coordination number) can be output for further analysis, which makes it possible to investigate the nanoindentation mechanism at the atomic scale. A Matlab toolbox called STABiX has also been developed to quantify slip transmission at grain boundaries by analyzing indentation experiments in bicrystals.
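As a sketch of the workflow just described, the Python snippet below writes a bare-bones, LAMMPS-style input for a spherical-indenter run on an fcc copper block. The lattice constant, box size, EAM potential file (`Cu_u3.eam`), thermostat settings, indenter stiffness and descent rate are assumptions chosen for illustration, not values from any of the studies cited above; a real simulation would need a validated potential, proper equilibration and convergence checks.

```python
# Illustrative only: generate a minimal LAMMPS input for a nanoindentation run.
lammps_input = """\
units           metal
boundary        p p s
atom_style      atomic

lattice         fcc 3.615
region          box block 0 20 0 20 0 20
create_box      1 box
create_atoms    1 box

pair_style      eam
pair_coeff      1 1 Cu_u3.eam

velocity        all create 300.0 12345
fix             1 all nvt temp 300.0 300.0 0.1

# Rigid spherical indenter descending along z; its centre height is a variable.
variable        zpos equal 95.0-0.0005*step
fix             2 all indent 1000.0 sphere 36.0 36.0 v_zpos 20.0 units box

timestep        0.001
thermo          100
thermo_style    custom step temp pe f_2[3]   # f_2[3]: z-force on the indenter
dump            1 all custom 500 indent.dump id type x y z
run             20000
"""

with open("in.nanoindent", "w") as f:
    f.write(lammps_input)
print("Wrote in.nanoindent; run it with a LAMMPS binary, e.g. lmp -in in.nanoindent")
```

The indenter force reported through `f_2[3]` can then be plotted against the indenter depth to obtain the load-displacement curve from which hardness is extracted.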
|
https://en.wikipedia.org/wiki?curid=4023692
| 1,073,031 |
1,268,645 |
Tanks patrol desolate city streets. Turrets and missile sites threaten the skies. Robot warriors carrying pulse rifles surround military installations. What's become of Earth? Machines have taken over. Corporate greed and rapid technological advancements have made humans pawns of their own creations. During the first fifteen years of the 21st century, Mega Corp began to dominate computer technology in both peacekeeping and war-fighting applications. As this giant churned out better and better technology for manufacturing and warfare, humans were relegated to service industries or to working as drones on PC terminals. Mega Corp became the largest employer in the United States. By 2010, every computer in America used Mega Corp software and was Internet-connected and monitored through the Mega Corp Network—antitrust suits be damned. Each day, Mega Corp would issue government-endorsed messages through the Network that broadcast pro-machine propaganda. The country was becoming brainwashed. In 2018, the wonders of artificial intelligence turned ugly in the hands of a few disillusioned Mega Corp programmers. Frustrated at being a part of such an ethically challenged corporation, these hacks altered coding in various Mega Corp products—turning certain robot and tank machinery into self-directed, man-killing machines. Today, May 2019, in a war-torn, machine-ravaged world, only a few freethinkers remain. Only a few outsiders have escaped the spell of the Network. Dr. Raines is the leader of a group of rebels called the Alliance. He and a few others have developed a computer program that gives the operator control over an experimental tank. You control this tank and must defeat these robot warriors.
|
https://en.wikipedia.org/wiki?curid=6167671
| 1,267,955 |
710,974 |
Candidates for a new drug to treat a disease might, theoretically, include from 5,000 to 10,000 chemical compounds. On average about 250 of these show sufficient promise for further evaluation using laboratory tests, mice and other test animals. Typically, about ten of these qualify for tests on humans. A study conducted by the Tufts Center for the Study of Drug Development covering the 1980s and 1990s found that only 21.5 percent of drugs that started Phase I trials were eventually approved for marketing. In the time period of 2006 to 2015, the success rate was 9.6%. The high failure rates associated with pharmaceutical development are referred to as the "attrition rate" problem. Careful decision making during drug development is essential to avoid costly failures. In many cases, intelligent programme and clinical trial design can prevent false negative results. Well-designed, dose-finding studies and comparisons against both a placebo and a gold-standard treatment arm play a major role in achieving reliable data.
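As a rough illustration of the attrition funnel described above, the quoted figures can simply be multiplied through (with the ranges collapsed to single values); this is only arithmetic on the numbers already given in the text, not an independent estimate.

```python
# Illustrative arithmetic on the attrition figures quoted above.
candidates         = 10_000   # initial chemical compounds (upper end of 5,000-10,000)
lab_promising      = 250      # advance to laboratory and animal testing
human_trials       = 10       # qualify for tests on humans
phase1_to_approval = 0.215    # Tufts estimate for drugs entering Phase I (1980s-90s)

expected_approvals = human_trials * phase1_to_approval
print(f"Expected approvals per {candidates:,} starting compounds: ~{expected_approvals:.1f}")
print(f"Implied overall success rate: {expected_approvals / candidates:.4%}")
```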
|
https://en.wikipedia.org/wiki?curid=3218783
| 710,603 |
620,312 |
By the late 1970s, in addition to the work on "Coxiella burnetii" and other rickettsiae, research priorities had expanded to include the development of vaccines and therapeutics against Argentine, Korean and Bolivian hemorrhagic fevers, Lassa fever and other exotic diseases that could pose potential BW threats. In 1978, the Institute assisted with humanitarian efforts in Egypt when a severe outbreak of Rift Valley fever (RVF) occurred there for the first time. The epidemic caused thousands of human cases and the deaths of large numbers of livestock. Diagnostics, along with much of the Institute's stock of RVF vaccine, were sent to help control the outbreak. At this time the Institute acquired both fixed and transportable BSL-4 containment plastic human isolators for the hospital care and safe transport of patients suffering from highly contagious and potentially lethal exotic infections. In 1978, it established an Aeromedical Isolation Team (AIT) — a military rapid response team of doctors, nurses and medics, with worldwide airlift capability, designed to safely evacuate and manage contagious patients under BSL-4 conditions. A formal agreement was signed with the Centers for Disease Control (CDC) at this time stipulating that USAMRIID would house and treat highly contagious infections in laboratory personnel should any occur. (After deploying on only four "real world" missions in 32 years, the AIT was ultimately decommissioned in 2010.)
|
https://en.wikipedia.org/wiki?curid=2727969
| 619,997 |
1,927,771 |
Tadashi Sasaki was born on May 12, 1915 in Hamada City, Shimane Prefecture. Not much is known of Sasaki’s mother. His father was a former samurai from the garrison at Hamada Castle and a teacher. Initially, Sasaki desired to study modern Japanese literature, but he was encouraged by one of his school teachers to study science. He studied electrical engineering at Kyoto University, where he graduated in 1938. After graduation, he worked for a short time on circuit design at the Electrotechnical Laboratory—a preeminent research laboratory sponsored by the Ministry of Telecommunications. This position was short-lived, as Japan's entry into the war meant Sasaki would be recruited for wartime work. Sasaki was assigned to an aircraft maker called Kawanishi, which was based in the western Japanese port of Kobe. Sasaki did research on vacuum tubes for use in telephones, wireless, and radar. In this position, he learned about the vertical integration of technology, an experience he found to be very valuable. Additionally, Sasaki worked to develop radar as well as anti-radar technologies to aid in the war effort. This research took him to Wurzburg, Germany, where he studied anti-radar technology. He later worked at Kobe Kogyo, the first Japanese company to manufacture transistors, and then at Hayakawa Electrical Industries, where he helped to develop electronic calculators. This eventually led him to obtain American patent licences to fabricate integrated chips and thus to produce the first commercially successful pocket calculator. His subordinates at Hayakawa knew him as "Doctor Rocket" due to his hyperactive nature.
|
https://en.wikipedia.org/wiki?curid=38096595
| 1,926,667 |
1,585,186 |
Remote sensing as we know it today started with the first dedicated Earth-observation satellite, Landsat 1, launched in 1972. Landsat 1 delivered the first multi-spectral images of features on land and coastal zones all over the world and already showed effectiveness in oceanography, although not specifically designed for it. In 1978, NASA took the next step in remote sensing for oceanography with the launch of the first orbiting satellite dedicated to ocean research, Seasat. The satellite carried five different instruments: a radar altimeter for retrieving sea surface height, a microwave scatterometer to retrieve wind speed and direction, a microwave radiometer to retrieve sea surface temperature (SST), an optical and infrared radiometer to check for clouds and surface characteristics, and lastly the first Synthetic Aperture Radar (SAR) instrument. Seasat was only operational for a few months but, together with the Coastal Zone Color Scanner (CZCS) on Nimbus-7, proved the feasibility of many techniques and instruments in ocean remote sensing. TOPEX/POSEIDON, an altimetry mission launched in 1992, provided the first continuous global map of sea surface topography and built on the possibilities explored by Seasat. The Jason-1, Jason-2 and Jason-3 missions have continued the measurements from 1992 to today, forming a complete time series of global sea surface height. Other techniques hosted on Seasat were also continued. The Advanced Very High Resolution Radiometer (AVHRR) is the sensor carried on all NOAA missions and has made SST retrieval accessible, with a continuous time series since 1979. The European Space Agency (ESA) further developed SAR with the ERS-2, ENVISAT and now Sentinel-1 missions by providing larger spatial footprints, lowering the resolution and flying twin missions to reduce the effective revisit time. Optical remote sensing of the ocean continued after the CZCS with the polar-orbiting missions OrbView-2, ENVISAT and MODIS, and very recently with Sentinel-3, forming a continuous record since 1997. Sentinel-3 is now one of the best-equipped missions for mapping the ocean, hosting a SAR altimeter, a multispectral spectrometer, a radiometer and several other instruments on multiple satellites with alternating orbits, providing exceptional temporal and spatial resolution.
|
https://en.wikipedia.org/wiki?curid=67696353
| 1,584,295 |
913,461 |
In 2012, Pasachoff and Sheehan consulted original sources and questioned the basis for the claim that Lomonosov observed the thin arc produced by the atmosphere of Venus. A reference to the paper was even picked up by the Russian state-controlled media group RIA Novosti on 31 January 2013, under the headline "Astronomical Battle in US Over Lomonosov’s discovery." An interesting attempt was made by a group of researchers to experimentally reconstruct Lomonosov's observation using antique telescopes during the transit of Venus on 5–6 June 2012. One of them, Y. Petrunin, suggested that the telescope Lomonosov actually used was probably a 50 mm Dollond with a magnifying power of 40x. It was preserved at Pulkovo Observatory but destroyed when the Germans bombed the observatory during World War II. Thus, Lomonosov's actual telescope was not available, but other presumably similar instruments were employed on this occasion, and led the researchers to affirm their belief that Lomonosov's telescope would have been adequate to the task of detecting the arc. Thus A. Koukarine, using a 67 mm Dollond on Mt. Hamilton, where seeing was likely much better than Lomonosov enjoyed at St. Petersburg, clearly observed the spiderweb-thin arc known to be due to refraction in the atmosphere of Venus. However, Koukarine's sketches do not really resemble the diagram published by Lomonosov. On the other hand, Koukarine's colleague V. Shiltsev, who observed under conditions closer to Lomonosov's (using a 40 mm Dollond at Batavia, Illinois), did produce a close duplicate of Lomonosov's diagram; however, the rather large wing of light shown next to the black disk of Venus in his drawing (and Lomonosov's) is too coarse to have been the arc. Instead it appears to be a complicated manifestation of the celebrated optical effect known as the "black drop". (It should be kept in mind that, as stated in Sheehan and Westfall, "optical distortions at the interface between Venus and the Sun during transits are impressively large, and any inferences from them are fraught with peril".)
|
https://en.wikipedia.org/wiki?curid=632331
| 912,982 |
612,017 |
The "Aṣṭāṅgahṛdayasaṃhitā" (Ah, "Heart of Medicine") is written in poetic language. The "Aṣṭāṅgasaṅgraha" (As, "Compendium of Medicine") is a longer and less concise work, containing many parallel passages and extensive passages in prose. The Ah is written in 7120 easily understood Sanskrit verses that present a coherent account of Ayurvedic knowledge. Ashtanga in Sanskrit means ‘eight components’ and refers to the eight sections of Ayurveda: internal medicine, surgery, gynaecology and paediatrics, rejuvenation therapy, aphrodisiac therapy, toxicology, and psychiatry or spiritual healing, and ENT (ear, nose and throat). There are sections on longevity, personal hygiene, the causes of illness, the influence of season and time on the human organism, types and classifications of medicine, the significance of the sense of taste, pregnancy and possible complications during birth, Prakriti, individual constitutions and various aids for establishing a prognosis. There is also detailed information on Five-actions therapies (Skt. "pañcakarma") including therapeutically induced vomiting, the use of laxatives, enemas, complications that might occur during such therapies and the necessary medications. The "Aṣṭāṅgahṛdayasaṃhitā" is perhaps Ayurveda’s greatest classic, and copies of the work in manuscript libraries across India and the world outnumber any other medical work. The Ah is the central work of authority for ayurvedic practitioners in Kerala. The "Aṣṭāṅgasaṅgraha", by contrast, is poorly represented in the manuscript record, with only a few, fragmentary manuscripts having survived to the twenty-first century. Evidently it was not widely read in pre-modern times. However, the As has come to new prominence since the twentieth century through being made part of the curriculum for ayurvedic college education in India.
|
https://en.wikipedia.org/wiki?curid=300767
| 611,706 |
369,211 |
The Soviet government issued the decree for Scientific Research Test Range No. 5 (NIIP-5; ) on 12 February 1955. It was actually founded on 2 June 1955, originally a test center for the world's first intercontinental ballistic missile (ICBM), the R-7 Semyorka. NIIP-5 was soon expanded to include launch facilities for space flights. The site was selected by a commission led by General Vasily Voznyuk, influenced by Sergey Korolyov, the Chief Designer of the R-7 ICBM, and soon the man behind the Soviet space program. It had to be surrounded by plains, as the radio control system of the rocket required (at the time) receiving uninterrupted signals from ground stations hundreds of kilometres away. Additionally, the missile trajectory had to be away from populated areas. Also, it is advantageous to place space launch sites closer to the equator, as the surface of the Earth has higher rotational speed in such areas. Taking these constraints into consideration, the commission chose Tyuratam, a village in the heart of the Kazakh Steppe. The expense of constructing the launch facilities and the several hundred kilometres of new road and train lines made the Cosmodrome one of the most costly infrastructure projects undertaken by the Soviet Union. A supporting town was built around the facility to provide housing, schools, and infrastructure for workers. It was raised to city status in 1966 and named Leninsk ().
|
https://en.wikipedia.org/wiki?curid=180638
| 369,018 |
1,592,593 |
In 1990, the Federal Supreme Court of Germany and the Federal Constitutional Court of Germany decided that sections of the German Code of Criminal Procedure provided a justifiable legal basis for the use of genetic fingerprinting in identifying criminals and absolving innocents. The decisions, however, lacked specific details on how biological materials could be obtained and how genetic fingerprinting could be utilized; only regulations on blood tests and physical examinations were explicitly outlined. In 1998, the German Parliament authorized the establishment of a national DNA database, due to mounting pressure to prevent cases of sexual abuse and homicides involving children. This decision was ruled constitutional and supported by a compelling public interest by the Federal Constitutional Court in 2001, despite some criticism that the right of informational self-determination was violated. The court did mandate that DNA information and samples must be supported by evidence that the individual might commit a similar crime in the future. To address the legal uncertainty, the Act on Forensic DNA Analysis of 2005 introduced provisions that included exact and limited legal grounds for the use of DNA-based information in criminal proceedings. Some sections stipulate that DNA samples may only be used if they are necessary to accelerate the investigation or eliminate suspects, and that a court must order the genetic fingerprinting. Since its implementation, there has been a monthly addition of 8000 new sets to the database, bringing into question the necessity of such wide-scale data collection and whether or not the wording of the provisions provides effective privacy protection. A recent controversial decision by the German government expanded the range of familial searching by DNA dragnet to identify genetic relatives of sexual and violent perpetrators – an action that was previously deemed to have no legal basis by the Federal Supreme Court of Germany in 2012.
|
https://en.wikipedia.org/wiki?curid=55251966
| 1,591,696 |
1,648,979 |
Groves's biggest concern was about people. Soldiers and scientists wanted to return to their peacetime pursuits, and there was a danger that wartime knowledge would be lost, leaving no one who knew how to handle and maintain nuclear weapons, much less how to improve the weapons and processes. The military side of the Manhattan Project had relied heavily on reservists, as the policy of the Corps of Engineers was to assign regular officers to field commands. The reservists were now eligible for separation. To replace them, Groves asked for fifty West Point graduates from the top ten percent of their classes to man bomb-assembly teams at Sandia Base, where the assembly staff and facilities had been moved from Los Alamos and Wendover Field in September and October 1945. He felt that only such high-quality personnel would be able to work with the scientists who were currently doing the job. They were also urgently required for many other jobs in the postwar Army. When General Thomas T. Handy turned down his request, Groves raised the matter with the Chief of Staff of the Army, General of the Army Dwight D. Eisenhower, who similarly did not approve it. Groves then went over his head too, and took the issue to the Secretary of War, Robert P. Patterson, who agreed with Groves. The personnel manned the 2761st Engineer Battalion (Special), which became a field unit under the Armed Forces Special Weapons Project (AFSWP).
|
https://en.wikipedia.org/wiki?curid=29079065
| 1,648,047 |
173,469 |
The progress of these projects was initially slow, but after Germany had declared war on the United States four days after the Pearl Harbor attack by Imperial Japan, the "Reichsluftfahrtministerium" (RLM) started the more serious "Amerikabomber" programme in the spring of 1942 for a very long range bomber, with the result that a larger, six-engine aircraft with a greater bomb load was called for. Proposals were put forward for the Junkers Ju 390, the Focke-Wulf Ta 400, a redesign of the unfinalized and unbuilt Heinkel He 277 design (itself only receiving its RLM airframe number by the February 1943 timeframe in order to give Heinkel an entry in the "Amerikabomber" program later in 1943), and a design study for an extended-wingspan six-engine Messerschmitt Me 264B. The need for six engines was prompted by the ongoing inability of Germany's aviation powerplant designers to create combat-reliable powerplants of 1,500 kW (2,000 PS) and above power output levels, thwarting efforts to do the same with just four engines instead. As the similarly six-engined Junkers Ju 390 could use components already in use for the Ju 290 this design was chosen. The Me 264 was not abandoned, however, as the "Kriegsmarine" (German navy) separately demanded a long-range maritime patrol and attack aircraft to replace the converted Fw 200 "Condor" in this role. This was reinforced by an opinion given by then-"Generalmajor" Eccard Freiherr von Gablenz of the "Wehrmacht Heer" (German Army) in May 1942, who had been recruited by "Generalfeldmarschall" Erhard Milch to give his opinion on the suitability of the Me 264 for the "Amerikabomber" mission; von Gablenz echoed the "Kriegsmarine's" later opinion. As a result, the two pending prototypes were ordered to be completed as development prototypes for the Me 264A ultra long-range reconnaissance aircraft.
|
https://en.wikipedia.org/wiki?curid=455636
| 173,378 |
946,795 |
The first left ventricular assist device (LVAD) system was created by Domingo Liotta at Baylor College of Medicine in Houston in 1962. The first LVAD was implanted in 1963 by Liotta and E. Stanley Crawford. The first successful implantation of an LVAD was completed in 1966 by Liotta along with Dr. Michael E. DeBakey. The patient was a 37-year-old woman, and a paracorporeal (external) circuit was able to provide mechanical support for 10 days after the surgery. The first successful long-term implantation of an LVAD was conducted in 1988 by Dr. William F. Bernhard of Boston Children's Hospital Medical Center and Thermedics, Inc. of Woburn, MA, under a National Institutes of Health (NIH) research contract which developed HeartMate, an electronically controlled assist device. This was funded by a three-year $6.2 million contract to Thermedics and Children's Hospital, Boston, MA, from the National Heart, Lung, and Blood Institute, a program of the NIH. The early VADs emulated the heart by using a "pulsatile" action where blood is alternately sucked into the pump from the left ventricle then forced out into the aorta. Devices of this kind include the HeartMate IP LVAS, which was approved for use in the US by the Food and Drug Administration (FDA) in October 1994. These devices began to gain acceptance in the late 1990s as heart surgeons including Eric Rose, O. H. Frazier and Mehmet Oz began popularizing the concept that patients could live outside the hospital. Media coverage of outpatients with VADs underscored these arguments.
|
https://en.wikipedia.org/wiki?curid=3301527
| 946,292 |
1,169,927 |
Under the new Communist government, Ginling's curriculum had to incorporate some political requirements. Even though these were not different from the earlier requirement of learning Party Principles under the Nationalist government, Thurston argues that the new Communist requirements were more serious because they contained “more definite challenges to Christian beliefs.” College life went on very much as before, except for the frequent interruptions in school work for special lectures, parades, and other political functions. Religious activities also continued without any direct opposition from the government. However, as anti-American propaganda began to rise, on November 14, 1950, some students accused their American sociology professor Helen Ferris of spreading anti-revolutionary messages and of attacking the Chinese-Korean alliance. This led to widespread criticism not only of Ferris, but of “crimes of cultural imperialism” happening in many missionary schools in China. In such a hostile atmosphere, all American missionary faculty members left Ginling by the spring semester of 1951, either by deportation or voluntarily. Some Chinese faculty members who did not completely identify with the aggressive campaign also faced persecution. On December 17, 1950, the U.S. State Department ordered the freezing of all Chinese properties in the U.S. and outlawed sending funds to China—making Smith College’s engagement with Ginling impossible. Having lost its main source of income, Ginling accepted government funding and merged with the University of Nanking to form a public National Jinling University.
|
https://en.wikipedia.org/wiki?curid=668984
| 1,169,308 |
219,010 |
The Soviet Union had a nominally supranational pharmacopoeia, the State Pharmacopoeia of the Union of Soviet Socialist Republics (USSRP), although the de facto nature of the nationality of republics within that state differed from the de jure nature. The European Union has a supranational pharmacopoeia, the European Pharmacopoeia; it has not replaced the national pharmacopoeias of EU member states but rather helps to harmonize them. Attempts have been made by international pharmaceutical and medical conferences to settle a basis on which a globally international pharmacopoeia could be prepared, but regulatory complexity and locoregional variation in conditions of pharmacy are hurdles to fully harmonizing across all countries (that is, defining thousands of details that can all be known to work successfully in all places). Nonetheless, some progress has been made under the banner of the International Council on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH), a tri-regional organisation that represents the drug regulatory authorities of the European Union, Japan, and the United States. Representatives from the Pharmacopoeias of these three regions have met twice yearly since 1990 in the Pharmacopoeial Discussion Group to try to work towards "compendial harmonisation". Specific monographs are proposed, and if accepted, proceed through stages of review and consultation leading to adoption of a common monograph that provides a common set of tests and specifications for a specific material. Not surprisingly, this is a slow process. The World Health Organization has produced the International Pharmacopoeia (Ph.Int.), which does not replace a national pharmacopoeia but rather provides a model or template for one and also can be invoked by legislation within a country to serve as that country's regulation.
|
https://en.wikipedia.org/wiki?curid=300705
| 218,902 |
1,386,509 |
During the year and a half that Otto spent in exile in Gallipoli and later in Constantinople, he became an impoverished refugee, eating at relief agencies and taking any job he could find. For some time, he worked as a woodcutter, residing with fellow Russian officers, often 6 people in a tent. One night, a neighboring tent was hit by lightning, killing everyone inside. Struve wrote to his uncle Hermann Struve in Germany for assistance, without knowing that his uncle had died a few months earlier, on August 12, 1920. However, the widow of Hermann, Eva Struve, contacted Paul Guthnick, her late husband's successor at the Berlin-Babelsberg Observatory. Germany itself was suffering after the wars, and there was little chance of obtaining a position for a Russian there. Therefore, Guthnick wrote, on December 25, 1920, to the director of Yerkes Observatory in Williams Bay, Wisconsin, Edwin B. Frost, asking for a position for Struve. He received a reply on January 27, 1921, in which Frost promised to do his best. On March 2, 1921, Frost wrote to Struve, offering him a position at Yerkes. Given his situation in Turkey, it was a lucky chance that Struve received that letter. On March 11, Struve sent a reply, thanking Frost for the offer and accepting it. The letter was formally written in English but with German grammar, revealing the poor English proficiency of Struve (when they later met in the US, they spoke in German). Struve also acknowledged that he had no experience in spectral astrophysics. Nevertheless, when applying for the position on Struve's behalf, Frost mentioned that "I am perfectly willing to take him on his lineage. We regard Otto Struve as a first-class spectroscopist and astrophysicist", and that his degree in Kharkiv was equivalent to a doctoral degree (which Struve never claimed and which was hardly so). It took several months to arrange for travel documents and funding. In late August 1921, Struve received his visa and travel tickets at the US Consulate in Turkey. In September, he boarded "S.S. Hog Island" and on October 7, 1921, arrived in New York. He was met there, put on the train, and two days later arrived in Chicago.
|
https://en.wikipedia.org/wiki?curid=584584
| 1,385,742 |
1,301,659 |
The Greek physician Hippocrates, the founder of scientific medicine, was the first to deal with the anatomy and the pathology of human spine. Galen developed an interest in anatomy from his studies of Herophilus and Erasistratus. The concept of studying disease through the methodical dissection and examination of diseased bodies, organs, and tissues may seem obvious today, but there are few if any recorded examples of true autopsies performed prior to the second millennium. Though the pathology of contagion was understood by Muslim physicians since the time of Avicenna (980–1037) who described it in "The Canon of Medicine" (), the first physician known to have made postmortem dissections was the Arabian physician Avenzoar (1091–1161) who proved that the skin disease scabies was caused by a parasite, followed by Ibn al-Nafis (b. 1213) who used dissection to discover pulmonary circulation in 1242. In the 15th century, anatomic dissection was repeatedly used by the Italian physician Antonio Benivieni (1443-1502) to determine cause of death. Antonio Benivieni is also credited with having introduced necropsy to the medical field. Perhaps the most famous early gross pathologist was Giovanni Morgagni (1682-1771). His magnum opus, "De Sedibus et Causis Morborum per Anatomem Indagatis", published in 1761, describes the findings of over 600 partial and complete autopsies, organised anatomically and methodically correlated with the symptoms exhibited by the patients prior to their demise. Although the study of normal anatomy was already well advanced at this date, "De Sedibus" was one of the first treatises specifically devoted to the correlation of diseased anatomy with clinical illness. By the late 1800s, an exhaustive body of literature had been produced on the gross anatomical findings characteristic of known diseases. The extent of gross pathology research in this period can be epitomized by the work of the Viennese pathologist (originally from Hradec Kralove in the Czech Rep.) Carl Rokitansky (1804-1878), who is said to have performed 20,000 autopsies, and supervised an additional 60,000, in his lifetime.
|
https://en.wikipedia.org/wiki?curid=19391193
| 1,300,945 |
389,630 |
In psychiatry, psychology, and mental health counseling, comorbidity refers to the presence of more than one diagnosis occurring in an individual at the same time. However, in psychiatric classification, comorbidity does not necessarily imply the presence of multiple diseases, but instead can reflect a current inability to supply a single diagnosis accounting for all symptoms. On the DSM Axis I, major depressive disorder is a very common comorbid disorder. The Axis II personality disorders are often criticized because their comorbidity rates are excessively high, approaching 60% in some cases. Critics assert this indicates these categories of mental illness are too imprecisely distinguished to be usefully valid for diagnostic purposes, impacting treatment and resource allocation. Symptom overlap is a key argument against DSM classification and points towards redefining the criteria of disorders whose root cause is not yet thoroughly understood. Regardless of these criticisms, it stands that, annually, up to 45% of mental health patients fit the criteria for a comorbid diagnosis. A comorbid diagnosis is associated with more severe symptomatic expression and a greater chance of a poor prognosis. Certain diagnoses, such as ADHD, autism, OCD, and mood disorders, have higher rates of co-occurring with or being prevalent alongside other diagnoses. "Comorbidity in OCD is the rule rather than the exception", with OCD diagnoses facing a lifetime comorbidity rate of 90%. With overlapping symptoms comes overlap in treatment as well; CBT, for example, is common for both ADHD and OCD with pediatric onset and can be effective for both in a comorbid diagnosis. OCD and eating disorders have a high rate of co-occurrence; it is estimated that 20–60% of patients with an eating disorder have OCD. More often, comorbidity complicates treatment and can reduce its efficacy to a varying degree depending on the circumstances.
|
https://en.wikipedia.org/wiki?curid=217631
| 389,435 |
1,652,529 |
A third mode of occurrence of tachylite is as the margins and thin offshoots of dikes or sills of basalt and diabase. They are sometimes only a fraction of an inch in thickness, resembling a thin layer of pitch or tar on the edge of a crystalline diabase dike, but veins several inches thick are sometimes found. In these situations tachylite is rarely vesicular, but it often shows very pronounced fluxion banding accentuated by the presence of rows of spherulites that are visible as dark brown rounded spots. The spherulites have a distinct radiate structure and sometimes exhibit zones of varying color. The non-spherulitic glassy portion is sometimes perlitic, and these rocks are always brittle. Common crystals are olivine, augite and feldspar, with swarms of minute dusty black grains of magnetite. At the extreme edges the glass is often perfectly free from crystalline products, but it merges rapidly into the ordinary crystalline diabase, which in a very short distance may contain no vitreous base whatever. The spherulites may form the greater part of the mass, they may be a quarter of an inch in diameter and are occasionally much larger than this. These coarsely spherulitic rocks pass over into the variolites by increasing coarseness in the fibers of their spherulites, which soon become recognizable as needles of feldspar or feathery growths of augite. The ultimate product of decomposition in this case also is a red palagonitic substance, but owing to the absence of steam cavities the tachylite selvages of dikes are more often found in a fresh state than the basic lapilli in ash-beds. Many occurrences of basaltic pitchstones have been reported from Skye, Mull, and the western part of Scotland; they are found also in connection with the intrusive diabase sills in the north of England and the center of Scotland. In the Saar district of Germany similar rocks occur, some of which have been described as weisselbergites (from Weisselberg).
|
https://en.wikipedia.org/wiki?curid=1198783
| 1,651,597 |
1,227,253 |
The electromagnetic pulse was produced by a pair of Marx generators built by Maxwell Laboratories of San Diego, California. The generators were mounted on pedestals constructed of wood in the same manner as the main test platform, one on each side of a large wedge shaped steel structure which acted as a ground plane for the horizontally polarized pulse. Each Marx generator consisted of a stack of 50 trays, each containing two large capacitors and a plasma switch. A large peaking capacitor, used to adjust the shape of the resulting pulse, was also part of the design. Each generator was enclosed in a large fiberglass structure which was filled with sulfur hexafluoride (SF6) acting as an insulating gas. The tray capacitors were slowly charged such that each tray had up to 100 kV of potential. When discharged through the plasma switches, the 50 trays in series could (ideally) produce up to 5 megavolts of electrical potential in a pulse with a rise-time in the 100 nanosecond range. The generators on either side of the wedge were charged to opposite polarities and fired into twin transmission lines (antennas) mounted on either side of the test platform. When triggered simultaneously the resulting EM waves from each generator combined at the sharp point of the wedge building, adding to a total electrical potential of 10 megavolts. The transmission lines were terminated into a 50 ohm low inductance resistive load mounted on a tall wooden tower at the far end of the platform. The result was a fast 200 gigawatt pulse of electromagnetic flux powerful enough to reliably reproduce (at short range) the deleterious effects of a thermonuclear detonation on electronic circuitry as created by such examples as the HARDTACK I, ARGUS and DOMINIC I (Operation Fishbowl) high altitude nuclear tests.
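A quick back-of-envelope check of the ideal Marx-bank figures quoted above (50 trays charged to 100 kV each, two opposite-polarity generators) reproduces the 5 MV and 10 MV values; losses, the peaking capacitor and the transmission-line network are ignored, so this is only the idealized arithmetic.

```python
# Idealized (lossless) Marx-generator arithmetic using the figures in the text.
n_trays         = 50       # stages per generator
charge_per_tray = 100e3    # volts each tray is charged to

erected_voltage = n_trays * charge_per_tray   # one generator, stages switched in series
combined        = 2 * erected_voltage         # two opposite-polarity generators add

print(f"Ideal erected voltage per generator: {erected_voltage / 1e6:.1f} MV")  # 5.0 MV
print(f"Combined potential across the wedge: {combined / 1e6:.1f} MV")         # 10.0 MV
```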
|
https://en.wikipedia.org/wiki?curid=32682509
| 1,226,592 |
1,276,258 |
Deficiency of GALT causes classic galactosemia. Galactosemia is an autosomal recessive inherited disorder detectable in newborns and during childhood. It occurs in approximately 1 in every 40,000-60,000 live-born infants. Classical galactosemia (G/G) is caused by a deficiency in GALT activity, whereas the more common clinical manifestations, Duarte (D/D) and the Duarte/Classical variant (D/G), are caused by the attenuation of GALT activity. Symptoms include ovarian failure, developmental coordination disorder (difficulty speaking correctly and consistently), and neurologic deficits. A single mutation in any of several base pairs can lead to deficiency in GALT activity. For example, a single mutation from A to G in exon 6 of the GALT gene changes Glu188 to an arginine, and a mutation from A to G in exon 10 converts Asn314 to an aspartic acid. These two mutations also add new restriction enzyme cut sites, which enable detection by PCR (polymerase chain reaction) and large-scale population screening. Screening has mostly eliminated neonatal death from G/G galactosemia, but the disease, due to GALT’s role in the biochemical metabolism of ingested galactose (which is toxic when accumulated) to the energetically useful glucose, can certainly be fatal. However, those afflicted with galactosemia can live relatively normal lives by avoiding milk products and anything else containing galactose (because it cannot be metabolized), but there is still the potential for problems in neurological development or other complications, even in those who avoid galactose.
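The detection principle mentioned above, in which a point mutation creates a new restriction enzyme cut site and therefore changes the fragment pattern of a PCR product after digestion, can be illustrated with a small toy example. The sequences and the recognition site below are hypothetical placeholders, not the real GALT exon 6 or exon 10 sequences or any particular enzyme.

```python
def digest(seq, site):
    """Return fragment lengths after cutting at every occurrence of `site`
    (toy rule: cut at the end of each site; real enzymes cut within the site)."""
    fragments, start = [], 0
    i = seq.find(site)
    while i != -1:
        cut = i + len(site)
        fragments.append(cut - start)
        start = cut
        i = seq.find(site, start)
    fragments.append(len(seq) - start)
    return fragments

site     = "CCGG"                                # hypothetical recognition site
wildtype = "ATTGCAAGTCCAGGTTACGTAATCGA"          # contains no CCGG
mutant   = wildtype[:11] + "G" + wildtype[12:]   # a single A->G creates ...CCGG...

print("wild-type fragments:", digest(wildtype, site))  # [26]     -> one uncut band
print("mutant fragments:   ", digest(mutant, site))    # [13, 13] -> two bands, detectable
```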
|
https://en.wikipedia.org/wiki?curid=1592232
| 1,275,565 |
1,812,484 |
With her PhD she joined the fabled AT&T Bell Labs in Murray Hill, New Jersey (NJ), and later Bellcore in Red Bank, also in NJ. The first ever evidence of high critical temperature (high-Tc) superconductivity was reported in 1986 by Bednorz and Müller, two researchers at IBM's research laboratory in Zurich, Switzerland. Within months, the IBM report was followed up with discoveries of far higher temperature superconductors from all over the world, publicized by professional events like the "Woodstock of physics" at the March meetings of the American Physical Society in 1987 and 1988; the first one was in New York City and the next in New Orleans, Louisiana. Greene and colleagues, then still at AT&T, were amongst the first groups to make a number of key contributions, especially discovering the remarkable sensitivity of the superconducting transition, in the '123' materials, to the exact amount of oxygen present, as well as the interdependence of atomic crystal structure and chemical composition with superconductivity. They reported these results at the first few conferences on the subject, such as the spring 1987 meeting of the Materials Research Society (MRS) in Anaheim, California, and the Berkeley meeting later that same summer. The Anaheim meeting was attended by Karl Alexander Müller, the original co-discoverer of high-Tc and Nobel Prize (1987) winner for the discovery of high temperature superconductivity. Greene's pioneering contribution was recognized in the first book on this subject, "Copper-oxide Superconductors", and in other highly cited research articles. She was also a panelist at Woodstock-II.
|
https://en.wikipedia.org/wiki?curid=42647710
| 1,811,451 |
2,091,255 |
Weis had a summer internship in 1960, while in Cornell, at the Marine Biological Laboratory in Woods Hole, Massachusetts. Immediately after receiving her Ph.D., Weis joined the faculty of Rutgers University in Newark, New Jersey, in 1967, where she was promoted to Professor in 1976. In addition to teaching, she has conducted research on the threats faced by organisms in shallow coastal estuary environments, such as contamination from pollution, especially heavy metals, and invasive species. Much of her research has been with marine life in the New York–New Jersey Harbor area and the New Jersey Meadowlands, but she has also done research in Indonesia and Madagascar. In the 1970s, Weis and her children starred in a Tang orange drink commercial because General Foods was promoting the product with ads featuring female scientists with cute kids. During a sabbatical in 1983–1984, she received a Congressional Science Fellowship from the American Association for the Advancement of Science (AAAS) where she worked for the Environment and Public Works Committee of the US Senate. By publishing studies about how products such as pressure-treated wood used in bulkheads and pilings leach toxic metals into waterways, Weis has been able to influence laws and regulations that have led to manufacturers removing toxic metals from the products. Among the legislation that she has worked on were amendments to the Safe Drinking Water Act and Resource Conservation and Recovery Act.
|
https://en.wikipedia.org/wiki?curid=15754889
| 2,090,051 |
4,803 |
In 1915, Albert Einstein developed his theory of general relativity, having earlier shown that gravity does influence light's motion. Only a few months later, Karl Schwarzschild found a solution to the Einstein field equations that describes the gravitational field of a point mass and a spherical mass. A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution for the point mass and wrote more extensively about its properties. This solution had a peculiar behaviour at what is now called the Schwarzschild radius, where it became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this surface was not quite understood at the time. In 1924, Arthur Eddington showed that the singularity disappeared after a change of coordinates (see Eddington–Finkelstein coordinates), although it took until 1933 for Georges Lemaître to realize that this meant the singularity at the Schwarzschild radius was a non-physical coordinate singularity. Arthur Eddington did however comment on the possibility of a star with mass compressed to the Schwarzschild radius in a 1926 book, noting that Einstein's theory allows us to rule out overly large densities for visible stars like Betelgeuse because "a star of 250 million km radius could not possibly have so high a density as the Sun. Firstly, the force of gravitation would be so great that light would be unable to escape from it, the rays falling back to the star like a stone to the earth. Secondly, the red shift of the spectral lines would be so great that the spectrum would be shifted out of existence. Thirdly, the mass would produce so much curvature of the spacetime metric that space would close up around the star, leaving us outside (i.e., nowhere)."
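For a sense of scale, the quantity at the heart of the passage above, the Schwarzschild radius r_s = 2GM/c^2, can be evaluated directly; the sketch below uses rounded standard constants and one solar mass, and is illustrative only.

```python
# Minimal worked example: Schwarzschild radius for a solar-mass object.
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8      # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

r_s = 2 * G * M_sun / c**2
print(f"Schwarzschild radius of one solar mass: {r_s / 1000:.2f} km")  # ~2.95 km
```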
|
https://en.wikipedia.org/wiki?curid=4650
| 4,801 |
780,221 |
Although interoception as a term has more recently gained increased popularity, different aspects of it have been studied since the 1950s. These include the features of attention, detection, magnitude, discrimination, accuracy, sensibility, and self-reporting. Despite not using the word "interoception" specifically, many publications in the physiology and medical fields have focused on understanding interoceptive information processing in different organ systems. Attention describes the ability to observe sensations within the body; it can be directed voluntarily in a "top down" manner or attracted involuntarily in a "bottom up" manner. Detection reflects the presence or absence of a conscious report of interoceptive stimuli, like a heartbeat or growling stomach. Magnitude is the intensity of the stimulus, or how strongly the stimulus is felt. Discrimination describes the ability to localize interoceptive stimuli in the body to specific organs and differentiate them from other bodily stimuli that also occur, like distinguishing a hard-beating heart from an upset stomach. Accuracy (or sensitivity) refers to how precisely and correctly an individual can monitor specific interoceptive processes. Self-reporting is itself multifaceted. It describes the ability to reflect on interoceptive experiences occurring over different periods of time, make judgments about them, and describe them. Brain-body interactions can also be studied using neuroimaging techniques to map functional interactions between brain and peripheral signals. Although all of these components of interoception have been studied since the mid-twentieth century, they were not brought together under the umbrella term "interoception" until more recently. The term "interoceptive awareness" is also frequently used to encompass any (or all) of the different interoception features that are accessible to conscious self-report. This multifaceted approach offers a unified way of looking at interoceptive functioning and its different features, clarifies the definition of interoception itself, and informs structured ways of assessing interoceptive experiences in an individual.
|
https://en.wikipedia.org/wiki?curid=54842715
| 779,804 |
1,275,779 |
The government attempted to stop the rot with a change in policy in 1873. It was no longer policy to open a telegraph facility at every office issuing money orders in outlying areas. It would now have to be shown that an office was likely to be profitable. There was no proposal to disconnect already connected unprofitable offices. However, the number of these declined with increasing traffic. The situation was not helped when in 1883, against the wishes of the government and the Chancellor of the Exchequer Hugh Childers, parliament, under pressure from business groups, called for the minimum charge on inland telegrams to be reduced to sixpence (2.5p). In 1885 Postmaster General George Shaw-Lefevre introduced a bill to implement the sixpence rate, which was passed into law. Shaw-Lefevre tried to mitigate the adverse effects by limiting sixpence telegrams to only 12 words, including the address. Addresses had been free but would now be charged for on all telegrams. £500,000 was spent on new wires and training additional staff in anticipation of the increased traffic. Traffic did increase from 33 million messages in 1884–85 to 50 million in 1886–87, reaching its peak by 1900 at over 90 million. At the same time, there was an increase in the deficit, mainly due to the cost of the increased staff. Despite the losses, the telegraph remained under national ownership as it was considered a public service.
|
https://en.wikipedia.org/wiki?curid=60245762
| 1,275,086 |
555,045 |
Iro "et al.", trying to interpret the nitrogen deficiency in comets, stated most of the conditions for hydrate formation in the protoplanetary nebulae, surrounding the pre-main and main sequence stars were fulfilled, despite the rapid grain growth to meter scale. The key was to provide enough microscopic ice particles exposed to a gaseous environment. Observations of the radiometric continuum of circumstellar discs around formula_4-Tauri and Herbig Ae/Be stars suggest massive dust disks consisting of millimeter-sized grains, which disappear after several million years (e.g.,). A lot of work on detecting water ices in the Universe was done on the Infrared Space Observatory (ISO). For instance, broad emission bands of water ice at 43 and 60 μm were found in the disk of the isolated Herbig Ae/Be star HD 100546 in Musca. The one at 43 μm is much weaker than the one at 60 μm, which means the water ice, is located in the outer parts of the disk at temperatures below 50 K. There is also another broad ice feature between 87 and 90 μm, which is very similar to the one in NGC 6302 (the Bug or Butterfly nebula in Scorpius). Crystalline ices were also detected in the proto-planetary disks of ε-Eridani and the isolated Fe star HD 142527 in Lupus. 90% of the ice in the latter was found crystalline at temperature around 50 K. HST demonstrated that relatively old circumstellar disks, as the one around the 5-million-year-old B9.5Ve Herbig Ae/Be star HD 141569A, are dusty. Li & Lunine found water ice there. Knowing the ices usually exist at the outer parts of the proto-planetary nebulae, Hersant "et al." proposed an interpretation of the volatile enrichment, observed in the four giant planets of the Solar System, with respect to the Solar abundances. They assumed the volatiles had been trapped in the form of hydrates and incorporated in the planetesimals flying in the protoplanets’ feeding zones.
|
https://en.wikipedia.org/wiki?curid=54140
| 554,756 |
1,039,258 |
One of the recommendations of this Global Patient Safety Challenge was the adoption of a checklist for use in surgical procedures. As understood in common language, a checklist is a physical list of tasks to be done, with some means of marking each item's completion (e.g. a square to be filled with a checkmark). Although it is a simple intervention, the checklist has historically been used in many occupations as an organizational and safety tool; Atul Gawande, the Safe Surgery Saves Lives program leader, describes the influence that checklists had on the development of the WHO Surgical Safety Checklist in his 2009 book, "The Checklist Manifesto". In particular, Gawande praised the impact of pilot checklists in mitigating aviation disasters, tracing their development back to disastrous test flights of the Boeing B-17 Flying Fortress. Because of the aircraft's increased complexity, an experienced pilot had missed a step in flight, causing a crash; however, test pilots continued to fly the plane, albeit with short checklists that "fit on an index card, with step-by-step checks for takeoff, flight, landing, and taxiing ... the kind of stuff that pilots know how to do." In further testing, the B-17 was flown for 1.8 million miles without a further accident, and pre-flight and emergency pilot checklists became a standard safety feature for the industry. Gawande points to how checklists can be applicable in medicine by ensuring that practitioners do not skip important steps in procedures, both in complex, high-stress situations and in seemingly routine ones.
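As a toy illustration of the checklist idea described above (an ordered list of required steps, each marked complete before sign-off), here is a minimal sketch in Python; the step names are invented for illustration and are not taken from the WHO Surgical Safety Checklist or any aviation checklist:

class Checklist:
    """A minimal checklist: named steps, each ticked off before completion."""

    def __init__(self, name, steps):
        self.name = name
        self.steps = {step: False for step in steps}

    def tick(self, step):
        if step not in self.steps:
            raise KeyError(f"unknown step: {step!r}")
        self.steps[step] = True

    def is_complete(self):
        return all(self.steps.values())

    def outstanding(self):
        return [s for s, done in self.steps.items() if not done]

# Hypothetical example items, for illustration only.
sign_in = Checklist("sign-in", ["confirm patient identity",
                                "confirm surgical site",
                                "check anaesthesia equipment"])
sign_in.tick("confirm patient identity")
print(sign_in.is_complete())   # False
print(sign_in.outstanding())   # remaining steps that block sign-off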
|
https://en.wikipedia.org/wiki?curid=30541130
| 1,038,717 |
133,246 |
There are many notable contributors to early Chinese disciplines, inventions, and practices throughout the ages. One of the best examples is the medieval Song Chinese Shen Kuo (1031–1095), a polymath and statesman who was the first to describe the magnetic-needle compass used for navigation, discovered the concept of true north, improved the design of the astronomical gnomon, armillary sphere, sight tube, and clepsydra, and described the use of drydocks to repair boats. After observing the natural process of the inundation of silt and the discovery of marine fossils in the Taihang Mountains (hundreds of miles from the Pacific Ocean), Shen Kuo devised a theory of land formation, or geomorphology. He also adopted a theory of gradual climate change in regions over time, after observing petrified bamboo found underground at Yan'an, Shaanxi province. If not for Shen Kuo's writing, the architectural works of Yu Hao would be little known, as would the inventor of movable type printing, Bi Sheng (990–1051). Shen's contemporary Su Song (1020–1101) was also a brilliant polymath: an astronomer who created a celestial atlas of star maps, wrote a treatise on botany, zoology, mineralogy, and metallurgy, and erected a large astronomical clock tower in Kaifeng in 1088. To operate the crowning armillary sphere, his clock tower featured an escapement mechanism and the world's oldest known use of an endless power-transmitting chain drive.
|
https://en.wikipedia.org/wiki?curid=14400
| 133,193 |
1,932,592 |
The right to recruit students for master's degrees in technical fields was awarded in 1980. Distance education was introduced in 1981, starting with 91 students. From 1985 the college was called "Qingdao Institute of Chemical Industry" (), a change of name that many students and teachers alike considered a downgrade and that was therefore not warmly welcomed. Contrary to the reservations expressed at the time, the renaming did not harm the institution's further development. On the contrary, the range of subjects was steadily expanded in the following decades, including the expansion from three to six faculties, among them Foreign Languages and Computer Science, in 1993, bringing the number of students to 3,963 and the number of bachelor's courses to 17. Another major step towards becoming a comprehensive university was the incorporation of the "Qingdao Arts and Crafts School" () in 2001; that same year, the number of students reached 10,747. The biggest change and expansion yet followed in 2002, when full university status was granted and the school adopted its current name, "Qingdao University of Science and Technology". In September 2002, the new campus in Laoshan was inaugurated. In 2006, the College of Communication and Animation was opened and several language majors were added to the School of Foreign Languages. A third campus was opened in 2009 in Gaomi, marking the university's expansion beyond the borders of Qingdao. The establishment of a "Middle School Affiliated to Qingdao University of Science and Technology" in Laoshan district () commenced in 2014.
|
https://en.wikipedia.org/wiki?curid=1479349
| 1,931,484 |
1,201,758 |
The F-35C was designed for network-centric warfare and gives the pilot enhanced situational awareness through its ability to communicate and process data obtained from onboard sensors and from other platforms. Whereas the F-117 had no radar, the F-35C uses an AN/APG-81 AESA radar that can act as a narrowband jammer and can be used against engagement radars. Under NIFC-CA, F-35Cs will routinely be supported by Growlers and Super Hornets to jam and destroy enemy targets beyond the range of surface-to-air missiles. The data-links used to share information are high-bandwidth and jam-resistant in order to maintain contact. The Navy would also work with the United States Air Force in an attack, with the Navy using the EA-18G as a dedicated EW platform in contested airspace and the Air Force contributing other stealth platforms, including the B-2 Spirit, the Long Range Strike Bomber (LRS-B), and future stealthy unmanned combat aerial vehicles (UCAVs); those platforms have, or are planned to have, wideband stealth using geometric features such as large size and a tailless configuration to remain undetected by VHF radars. Even with the possibility of cyber and electronic attack to hack or jam data-links, of passive detection systems that locate aircraft by their electronic emissions, and of long-range anti-radiation missiles, the flexibility of "network-centric" cooperative engagement allows additional systems and platforms to be "plugged or unplugged" as required, offering increased survivability and growth potential as new methods of countering countermeasures are integrated into new or existing concepts.
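The "plugged or unplugged" idea above resembles, in software terms, a publish–subscribe pattern: platforms join or leave a shared network and exchange track reports without the rest of the system being redesigned. The sketch below is a generic software analogy of that pattern, with invented names, and is not a description of any actual NIFC-CA or data-link implementation:

class TrackNetwork:
    """Toy publish-subscribe hub: participants can be added or removed at runtime."""

    def __init__(self):
        self.subscribers = {}

    def plug_in(self, name, handler):
        """Register a participant's callback for incoming track reports."""
        self.subscribers[name] = handler

    def unplug(self, name):
        """Remove a participant; remaining subscribers are unaffected."""
        self.subscribers.pop(name, None)

    def publish(self, track):
        """Push a track report to every currently connected participant."""
        for handler in self.subscribers.values():
            handler(track)

# Hypothetical participants, for illustration only.
net = TrackNetwork()
net.plug_in("shooter", lambda t: print("shooter received", t))
net.plug_in("jammer",  lambda t: print("jammer received", t))
net.publish({"id": 42, "bearing_deg": 310})
net.unplug("jammer")                          # a participant leaves the network
net.publish({"id": 42, "bearing_deg": 305})   # remaining subscribers still get data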
|
https://en.wikipedia.org/wiki?curid=41509737
| 1,201,117 |