https://en.wikipedia.org/wiki/Geophysical%20global%20cooling
Before the concept of plate tectonics, global cooling was a geophysical theory advanced by James Dwight Dana, also referred to as the contracting earth theory. It suggested that the Earth had been in a molten state, and that features such as mountains formed as it cooled and shrank. As the interior of the Earth cooled and shrank, the rigid crust would have to shrink and crumple. The crumpling could produce features such as mountain ranges. Application The Earth was compared to a cooling ball of iron, or a steam boiler with shifting boiler plates. By the early 1900s, it was known that temperature increased with increasing depth. With the thickness of the crust, the "boiler plates", being estimated at ten to fifty miles, the downward pressure would be hundreds of thousands of pounds per square inch. Although groundwater was expected to turn to steam at great depth, the downward pressure would usually contain any steam. Steam's effect upon molten rock was suspected of being a cause of volcanoes and earthquakes, as it had been noticed that most volcanoes are near water. It was not clear whether the molten rock from volcanoes had its origin in the molten rock under the crust, or whether increased heat due to pressure under mountains caused the rock to melt. Volcanoes were also explained as one way in which "the contracting earth disposes of the matter it can no longer contain." A relationship between earthquakes and volcanoes had been noted, although the causes were not known. Fault lines and earthquakes tended to occur along the boundaries of the shifting "boiler plates", but the folding of mountains indicated that the plates sometimes buckled. In the early 1900s, Professor Eduard Suess used the theory to explain the 1908 Messina earthquake, being of the opinion that the Earth's crust was gradually shrinking everywhere. He also predicted that eruptions would follow the earthquake and tsunami in Southern Italy. He attributed the earthquake to the sinking of the Eart
https://en.wikipedia.org/wiki/Li%C3%A9nard%20equation
In mathematics, more specifically in the study of dynamical systems and differential equations, a Liénard equation is a second order differential equation, named after the French physicist Alfred-Marie Liénard. During the development of radio and vacuum tube technology, Liénard equations were intensely studied as they can be used to model oscillating circuits. Under certain additional assumptions Liénard's theorem guarantees the existence and uniqueness of a limit cycle for such a system. A Liénard system with piecewise-linear functions can also contain homoclinic orbits. Definition Let $f$ and $g$ be two continuously differentiable functions on $\mathbb{R}$, with $f$ an even function and $g$ an odd function. Then the second order ordinary differential equation of the form $\ddot{x} + f(x)\dot{x} + g(x) = 0$ is called a Liénard equation. Liénard system The equation can be transformed into an equivalent two-dimensional system of ordinary differential equations. We define $F(x) := \int_0^x f(\xi)\,d\xi$, $x_1 := x$ and $x_2 := \dot{x} + F(x)$; then the system $\dot{x}_1 = x_2 - F(x_1)$, $\dot{x}_2 = -g(x_1)$ is called a Liénard system. Alternatively, since the Liénard equation itself is also an autonomous differential equation, the substitution $v = \dot{x}$ turns the Liénard equation into a first order differential equation: $v\frac{dv}{dx} + f(x)v + g(x) = 0$, which is an Abel equation of the second kind. Example The Van der Pol oscillator $\ddot{x} - \mu(1-x^2)\dot{x} + x = 0$ is a Liénard equation with $f(x) = -\mu(1-x^2)$ and $g(x) = x$. Its solution has a limit cycle; such a cycle is a solution of a Liénard equation with $f(x)$ negative at small $|x|$ and positive otherwise. The Van der Pol equation has no exact, analytic solution. Such a solution for a limit cycle does exist if $f(x)$ is a piecewise constant function. Liénard's theorem A Liénard system has a unique and stable limit cycle surrounding the origin if it satisfies the following additional properties: $g(x) > 0$ for all $x > 0$; $F(x) \to \infty$ as $x \to \infty$; $F(x)$ has exactly one positive root at some value $p$, where $F(x) < 0$ for $0 < x < p$ and $F(x) > 0$ and monotonic for $x > p$. See also Autonomous differential equation Abel equation of the second kind Biryukov equation Footnotes External links Dynamical systems D
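A minimal numerical sketch of the system form above (assuming NumPy and SciPy are available; the parameter value μ = 1 and the initial condition are arbitrary choices): the Van der Pol oscillator is integrated as a Liénard system, and any non-zero starting point spirals onto the unique limit cycle guaranteed by Liénard's theorem.

```python
# Van der Pol oscillator in Lienard-system form:
#   f(x) = -mu*(1 - x^2),  g(x) = x,  F(x) = mu*(x^3/3 - x)
import numpy as np
from scipy.integrate import solve_ivp

mu = 1.0                                  # damping parameter (arbitrary)

def F(x):
    return mu * (x**3 / 3.0 - x)          # F(x) = integral of f from 0 to x

def lienard(t, state):
    x1, x2 = state
    return [x2 - F(x1), -x1]              # x1' = x2 - F(x1), x2' = -g(x1)

# Integrate from a small perturbation of the origin; the trajectory
# approaches the Van der Pol limit cycle.
sol = solve_ivp(lienard, (0.0, 50.0), [0.1, 0.0], max_step=0.05)
print(sol.y[:, -1])                       # a point near the limit cycle
```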
https://en.wikipedia.org/wiki/Mitochondrial%20permeability%20transition%20pore
The mitochondrial permeability transition pore (mPTP or MPTP; also referred to as PTP, mTP or MTP) is a protein complex formed in the inner membrane of the mitochondria under certain pathological conditions such as traumatic brain injury and stroke. Its opening increases the permeability of the mitochondrial membranes to molecules of less than 1500 Daltons in molecular weight. Induction of the permeability transition pore, known as mitochondrial membrane permeability transition (mPT or MPT), can lead to mitochondrial swelling and cell death through apoptosis or necrosis depending on the particular biological setting. Roles in pathology The MPTP was originally discovered by Haworth and Hunter in 1979 and has been found to be involved in neurodegeneration, hepatotoxicity from Reye-related agents, cardiac necrosis and nervous and muscular dystrophies, among other deleterious events inducing cell damage and death. MPT is one of the major causes of cell death in a variety of conditions. For example, it is key in neuronal cell death in excitotoxicity, in which overactivation of glutamate receptors causes excessive calcium entry into the cell. MPT also appears to play a key role in damage caused by ischemia, as occurs in a heart attack and stroke. However, research has shown that the MPT pore remains closed during ischemia, but opens once the tissues are reperfused with blood after the ischemic period, playing a role in reperfusion injury. MPT is also thought to underlie the cell death induced by Reye's syndrome, since chemicals that can cause the syndrome, like salicylate and valproate, cause MPT. MPT may also play a role in mitochondrial autophagy. Cells exposed to toxic amounts of Ca2+ ionophores also undergo MPT and death by necrosis. Structure While modulation of the MPT has been widely studied, little is known about its structure. Initial experiments by Szabó and Zoratti proposed the MPT may comprise Voltage Dependent Anion Channel (VDAC) molecules. Nevertheless,
https://en.wikipedia.org/wiki/Medial%20eye%20fields
Medial eye fields are areas in the frontal lobe of the primate brain that play a role in visually guided eye movement. Most neuroscientists refer to this area as the supplementary eye fields. Eye fields are divided into two hemispheres regulated by sonic hedgehog (Shh) and Six3. See also Saccade Smooth pursuit Supplementary eye fields Notes Visual system
https://en.wikipedia.org/wiki/ADF/Cofilin%20family
ADF/cofilin is a family of actin-binding proteins associated with the rapid depolymerization of actin microfilaments that gives actin its characteristic dynamic instability. This dynamic instability is central to actin's role in muscle contraction, cell motility and transcription regulation. Three highly conserved and highly (70%-82%) identical genes belonging to this family have been described in humans and mice: CFL1, coding for cofilin 1 (non-muscle, or n-cofilin) CFL2, coding for cofilin 2 (found in muscle: m-cofilin) DSTN, coding for destrin, also known as ADF or actin depolymerizing factor Actin-binding proteins regulate assembly and disassembly of actin filaments. Cofilin, a member of the ADF/cofilin family, is a protein with 70% sequence identity to destrin. The protein binds to actin monomers and filaments, G-actin and F-actin, respectively. Cofilin causes depolymerization at the minus end of filaments, thereby preventing their reassembly. The protein is known to sever actin filaments by creating more positive ends on filament fragments. Cofilin/ADF (destrin) is likely to sever F-actin without capping and prefers ADP-actin. These monomers can be recycled by profilin, which activates monomers to go back into filament form again by an ADP-to-ATP exchange. ATP-actin is then available for assembly. Structure The structure of actin depolymerizing factors is highly conserved across many organisms due to actin's importance in many cellular processes. Proteins of the actin depolymerizing factor family characteristically consist of five beta sheets, four antiparallel and one parallel, and four alpha helices, with a central alpha helix providing the structure and stability of the proteins. The actin depolymerizing factor homology domain (ADF-H domain) allows for binding to actin subunits and includes the central alpha helix, the N-terminal extension, and the C-terminal helix. The N-term
https://en.wikipedia.org/wiki/Special%20Engineer%20Detachment
The Special Engineer Detachment (SED) was a US Army program that identified enlisted personnel with technical skills, such as machining, or who had some science education beyond high school. Those identified were organized into the Special Engineer Detachment, or SED. SED personnel began arriving at Los Alamos in October 1943. By August 1945, 1800 SED personnel worked at Los Alamos. These troops worked in all areas and activities of the Laboratory, including the Trinity Test, and were involved in overseas operations on Tinian.
https://en.wikipedia.org/wiki/BioMarin%20Pharmaceutical
BioMarin Pharmaceutical Inc. is an American biotechnology company headquartered in San Rafael, California. It has offices and facilities in the United States, South America, Asia, and Europe. BioMarin's core business and research is in enzyme replacement therapies (ERTs). BioMarin was the first company to provide therapeutics for mucopolysaccharidosis type I (MPS I), by manufacturing laronidase (Aldurazyme, commercialized by Genzyme Corporation). BioMarin was also the first company to provide therapeutics for phenylketonuria (PKU). Over the years, BioMarin has been criticised for drug pricing and for specific instances of denying access to drugs in clinical trials. History BioMarin was founded in 1997 by Christopher Starr, Ph.D., and Grant W. Denison Jr. with an investment of $1.5 million from Glyko Biomedical, and went public in 1999. Seed investors included, among others, MPM Bioventures, Grosvenor Fund and Florian Schönharting. Business development In 2002, BioMarin acquired Glyko Biomedical. In 2009, BioMarin acquired Huxley Pharmaceuticals, Inc. (Huxley), which had rights to a proprietary form of 3,4-diaminopyridine (3,4-DAP), amifampridine phosphate. In 2010, BioMarin was granted marketing approval by the European Commission for 3,4-diaminopyridine (3,4-DAP), amifampridine phosphate, for the treatment of the rare autoimmune disease Lambert–Eaton myasthenic syndrome (LEMS). BioMarin launched the product under the name Firdapse. In 2010, BioMarin acquired LEAD Therapeutics, Inc. (LEAD), a small private drug discovery and early stage development company with key compound LT-673, an orally available poly (ADP-ribose) polymerase (PARP) inhibitor studied for the treatment of patients with rare, genetically defined cancers. This acquisition was followed by the purchase of ZyStor Therapeutics, Inc. (ZyStor), a privately held biotechnology company developing ERTs for the treatment of lysosomal storage disorders, and its lead product candidate, ZC-701, a fusion of
https://en.wikipedia.org/wiki/Acidophobe
An acidophobe is an organism that is intolerant of acidic environments. The terms acidophobia, acidophoby and acidophobic are also used. The term acidophobe is variously applied to plants, bacteria, protozoa, animals, chemical compounds, etc. The antonymous term is acidophile. Plants tend to be well defined with respect to their pH tolerance, and only a small number of species thrive under a broad range of acidity; the acidophile/acidophobe categorization is therefore well defined. Sometimes a complementary classification is used (calcicole/calcifuge, with calcicoles being "lime-loving" plants). In gardening, soil pH is a measure of the acidity or alkalinity of soil, with pH = 7 indicating neutral soil. Acidophobes therefore prefer a pH above 7. Acid intolerance of plants may be mitigated by lime addition and by calcium and nitrogen fertilizers. Acidophobic species are used as a natural instrument for monitoring the degree of acidifying contamination of soil and watercourses. For example, when monitoring vegetation, a decrease in acidophobic species would be indicative of an increase in acid rain in the area. A similar approach is used with aquatic species. Acidophobes Whiteworms (Enchytraeus albidus), a popular live food for aquarists, are acidophobes. Acidophobic compounds are those which are unstable in acidic media. Acidophobic crops: alfalfa, clover
https://en.wikipedia.org/wiki/Quadruple%20bond
A quadruple bond is a type of chemical bond between two atoms involving eight electrons. This bond is an extension of the more familiar types, double bonds and triple bonds. Stable quadruple bonds are most common among the transition metals in the middle of the d-block, such as rhenium, tungsten, technetium, molybdenum and chromium. Typically the ligands that support quadruple bonds are π-donors, not π-acceptors. History Chromium(II) acetate, Cr₂(μ-O₂CCH₃)₄(H₂O)₂, was the first chemical compound containing a quadruple bond to be synthesized. It was described in 1844 by E. Peligot, although its distinctive bonding was not recognized for more than a century. The first crystallographic study of a compound with a quadruple bond was provided by Soviet chemists for salts of [Re₂Cl₈]²⁻. The very short Re–Re distance was noted. This short distance (and the salt's diamagnetism) indicated Re–Re bonding. These researchers however misformulated the anion as a derivative of Re(II), i.e., [Re₂Cl₈]⁴⁻. Soon thereafter, F. Albert Cotton and C.B. Harris reported the crystal structure of potassium octachlorodirhenate, or K₂[Re₂Cl₈]·2H₂O. This structural analysis indicated that the previous characterization was mistaken. Cotton and Harris formulated a molecular orbital rationale for the bonding that explicitly indicated a quadruple bond. The rhenium–rhenium bond length in this compound is only 224 pm. In molecular orbital theory, the bonding is described as σ²π⁴δ², with one sigma bond, two pi bonds and one delta bond. Structure and bonding The [Re₂Cl₈]²⁻ ion adopts an eclipsed conformation. The delta bonding orbital is then formed by overlap of the d orbitals on each rhenium atom, which are perpendicular to the Re–Re axis and lie in between the Re–Cl bonds. The d orbitals directed along the Re–Cl bonds are stabilized by interaction with chlorine ligand orbitals and do not contribute to Re–Re bonding. In contrast, the [Os₂Cl₈]²⁻ ion with two more electrons (σ²π⁴δ²δ*²) has an Os–Os triple bond a
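The quadruple versus triple bond count follows from standard molecular-orbital bond-order arithmetic (a worked check, not taken from the article):

```latex
\text{bond order} = \tfrac{1}{2}\left(n_{\text{bonding}} - n_{\text{antibonding}}\right)
[\mathrm{Re_2Cl_8}]^{2-}:\ \sigma^2\pi^4\delta^2 \Rightarrow \tfrac{1}{2}(8-0) = 4 \quad \text{(quadruple bond)}
[\mathrm{Os_2Cl_8}]^{2-}:\ \sigma^2\pi^4\delta^2\delta^{*2} \Rightarrow \tfrac{1}{2}(8-2) = 3 \quad \text{(triple bond)}
```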
https://en.wikipedia.org/wiki/Brian%20Wowk
Brian G. Wowk is a Canadian medical physicist and cryobiologist known for the discovery and development of synthetic molecules that mimic the activity of natural antifreeze proteins in cryopreservation applications, sometimes called "ice blockers". As a senior scientist at 21st Century Medicine, Inc., he was a co-developer with Greg Fahy of key technologies enabling cryopreservation of large and complex tissues, including the first successful vitrification and transplantation of a mammalian organ (kidney). Wowk is also known for early theoretical work on future applications of molecular nanotechnology, especially cryonics, nanomedicine, and optics. In the early 1990s he wrote that nanotechnology would revolutionize optics, making possible virtual reality display systems optically indistinguishable from real scenery, as in the fictitious Holodeck of Star Trek. These systems were described by Wowk in the chapter "Phased Array Optics" in the 1996 anthology Nanotechnology: Molecular Speculations on Global Abundance, and highlighted in the September 1998 Technology Watch section of Popular Mechanics magazine. Early life and education He obtained his undergraduate and graduate degrees from the University of Manitoba in Winnipeg, Canada, earning his PhD in physics in 1997. His graduate studies included work in online portal imaging for radiotherapy at the Manitoba Cancer Treatment and Research Foundation (now Cancer Care Manitoba), and work on artifact reduction for functional magnetic resonance imaging at the National Research Council of Canada. His work in the latter field is cited by several textbooks, including Functional MRI, which includes an image he obtained of magnetic field changes inside the human body caused by respiration.
https://en.wikipedia.org/wiki/T/TCP
T/TCP (Transactional Transmission Control Protocol) was a variant of the Transmission Control Protocol (TCP). It was an experimental TCP extension for efficient transaction-oriented (request/response) service, developed by Bob Braden in 1994 to fill the gap between TCP and UDP. Its definition can be found in RFC 1644 (which obsoletes RFC 1379). It is faster than TCP, with delivery reliability comparable to that of TCP. T/TCP suffers from several major security problems, as described by Charles Hannum in September 1996. It never gained widespread popularity. RFC 1379 and RFC 1644, which define T/TCP, were moved to Historic status in May 2011 by RFC 6247 for security reasons. Alternatives TCP Fast Open is a more recent alternative. See also TCP Cookie Transactions Further reading Richard Stevens, Gary Wright, "TCP/IP Illustrated: TCP for transactions, HTTP, NNTP, and the UNIX domain protocols" (Volume 3 of TCP/IP Illustrated) // Addison-Wesley, 1996, 2000. Part 1 "TCP for Transactions". Chapters 1-12, pages 1–159
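For comparison, TCP Fast Open pursues the same goal of carrying request data in the connection handshake. A minimal Linux-only client sketch in Python (assumes a TFO-enabled kernel and server; the address is a placeholder from the TEST-NET range):

```python
# TCP Fast Open client: MSG_FASTOPEN makes sendto() perform the connect
# and place the payload in the SYN, saving one round trip.
import socket

payload = b"GET / HTTP/1.0\r\n\r\n"
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.sendto(payload, socket.MSG_FASTOPEN, ("192.0.2.1", 80))  # placeholder host
print(s.recv(4096))
s.close()
```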
https://en.wikipedia.org/wiki/Depensation
In population dynamics, depensation is the effect on a population (such as a fish stock) whereby, due to certain causes, a decrease in the breeding population (mature individuals) leads to reduced production and survival of eggs or offspring. The causes may include predation levels rising per offspring (given the same level of overall predator pressure) and the Allee effect, particularly the reduced likelihood of finding a mate. Critical depensation When the level of depensation is high enough that the population is no longer able to sustain itself, it is said to be a critical depensation. This occurs when population size tends to decline once the population drops below a certain level (known as the "critical depensation level"). Ultimately this may lead to the population or fishery's collapse (resource depletion), or even local extinction. The phenomenon of critical depensation may be modelled or defined by a negative second-order derivative of the population growth rate with respect to population biomass, which describes a situation where a decline in population biomass is not compensated by a corresponding increase in marginal growth per unit of biomass. See also Abundance (ecology) Conservation biology Local extinction Overexploitation Overfishing Small population size Threatened species
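One common textbook way to capture critical depensation (an illustrative model, not taken from the article) augments logistic growth with a critical threshold A:

```latex
\frac{dN}{dt} = rN\left(\frac{N}{A} - 1\right)\left(1 - \frac{N}{K}\right), \qquad 0 < A < K
```

For N below A the growth rate is negative and the population declines toward extinction, while for N between A and K it recovers toward the carrying capacity K, so A plays the role of the critical depensation level.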
https://en.wikipedia.org/wiki/Operation%20%28mathematics%29
In mathematics, an operation is a function which takes zero or more input values (also called "operands" or "arguments") to a well-defined output value. The number of operands is the arity of the operation. The most commonly studied operations are binary operations (i.e., operations of arity 2), such as addition and multiplication, and unary operations (i.e., operations of arity 1), such as additive inverse and multiplicative inverse. An operation of arity zero, or nullary operation, is a constant. The mixed product is an example of an operation of arity 3, also called a ternary operation. Generally, the arity is taken to be finite. However, infinitary operations are sometimes considered, in which case the "usual" operations of finite arity are called finitary operations. A partial operation is defined similarly to an operation, but with a partial function in place of a function. Types of operation There are two common types of operations: unary and binary. Unary operations involve only one value, such as negation and trigonometric functions. Binary operations, on the other hand, take two values, and include addition, subtraction, multiplication, division, and exponentiation. Operations can involve mathematical objects other than numbers. The logical values true and false can be combined using logic operations, such as and, or, and not. Vectors can be added and subtracted. Rotations can be combined using the function composition operation, performing the first rotation and then the second. Operations on sets include the binary operations union and intersection and the unary operation of complementation. Operations on functions include composition and convolution. An operation may not be defined for every possible value of its domain; for example, in the real numbers one cannot divide by zero or take square roots of negative numbers. The values for which an operation is defined form a set called its domain of definition or active domain. The set which contains the
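A small sketch of these notions in code (illustrative only; the function names are ours): operations of arity 0, 1 and 2, plus a partial operation whose domain of definition excludes division by zero.

```python
# Operations as functions taking zero or more operands to one output.
nullary = lambda: 1                  # arity 0: a constant
negate = lambda a: -a                # arity 1: unary (additive inverse)
add = lambda a, b: a + b             # arity 2: binary

def divide(a: float, b: float) -> float:
    """Partial operation on the reals: b = 0 lies outside the
    domain of definition."""
    if b == 0:
        raise ValueError("division by zero is undefined")
    return a / b

print(nullary(), negate(7), add(2, 3), divide(1.0, 4.0))
```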
https://en.wikipedia.org/wiki/Bonox
Bonox is a beef extract made in Australia, currently owned by Bega Cheese after it acquired the brand from Kraft Heinz in 2017. It is primarily a drink but can also be used as stock in cooking. History Bonox was invented by Camron Thomas for Fred Walker of Fred Walker & Co. in 1918. Bonox was launched the following year. The Walker company was purchased by Kraft Foods Inc. sometime after Walker's death in 1935. The product was produced by Kraft (from 2012 Kraft Foods, from 2015 Kraft Heinz) until 2017, when Bonox, along with other brands, was sold to Bega Cheese. It kept the same recipe and jar designs. Bonox continues to be produced by Bega. Nutritional information This concentrated beef extract contains iron and niacin. It is a thick dark brown liquid paste which can be added to soups or stews for flavoring and can also be added to hot water and served as a beverage. Approximate values per 100 g:
Energy, including dietary fiber: 401 kJ
Moisture: 56.6 g
Protein: 16.6 g
Nitrogen: 2.66 g
Fat: 0.2 g
Ash: 19.8 g
Starch: 6.5 g
Available carbohydrate, without sugar alcohols: 6.5 g
Available carbohydrate, with sugar alcohols: 6.5 g
Minerals:
Calcium (Ca): 110 mg
Copper (Cu): 0.11 mg
Fluoride (F): 190 µg
Iron (Fe): 2 mg
Magnesium (Mg): 60 mg
Manganese (Mn): 0.13 mg
Phosphorus (P): 360 mg
Potassium (K): 690 mg
Selenium (Se): 4 µg
Sodium (Na): 6660 mg
Sulphur (S): 160 mg
Zinc (Zn): 1.5 mg
Vitamins:
Thiamin (B1): 0.36 mg
Riboflavin (B2): 0.27 mg
Niacin (B3): 5.4 mg
Niacin equivalents: 8.17 mg
Pantothenic acid (B5): 0.38 mg
Pyridoxine (B6): 0.23 mg
Biotin (B7): 12 µg
See also Bovril
https://en.wikipedia.org/wiki/Biodemography%20of%20human%20longevity
Biodemography is a multidisciplinary approach, integrating biological knowledge (studies on human biology and animal models) with demographic research on human longevity and survival. Biodemographic studies are important for understanding the driving forces of the current longevity revolution (dramatic increase in human life expectancy), forecasting the future of human longevity, and identification of new strategies for further increase in healthy and productive life span. Theory Biodemographic studies have found a remarkable similarity in survival dynamics between humans and laboratory animals. Specifically, three general biodemographic laws of survival are found: Gompertz–Makeham law of mortality Compensation law of mortality Late-life mortality deceleration (now disputed) The Gompertz–Makeham law states that the death rate is a sum of an age-independent component (Makeham term) and an age-dependent component (Gompertz function), which increases exponentially with age. The compensation law of mortality (late-life mortality convergence) states that the relative differences in death rates between different populations of the same biological species decrease with age, because higher initial death rates are compensated by a lower pace of their increase with age. The disputed late-life mortality deceleration law states that death rates stop increasing exponentially at advanced ages and level off to the late-life mortality plateau. A consequence of this deceleration is that there would be no fixed upper limit to human longevity — no fixed number which separates possible and impossible values of lifespan. If true, this would challenge the common belief in the existence of a fixed maximal human life span. Biodemographic studies have found that even genetically identical laboratory animals kept in a constant environment have very different lengths of life, suggesting a crucial role of chance and early-life developmental noise in longevity determination. This leads
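In the notation usually used for the first of these laws (our rendering, consistent with the verbal description above), the Gompertz–Makeham law writes the death rate μ(x) at age x as:

```latex
\mu(x) = A + B e^{\gamma x}
```

Here A is the age-independent Makeham term and $B e^{\gamma x}$ is the Gompertz term growing exponentially with age; the disputed deceleration law asserts that observed death rates fall below this exponential at extreme ages and level off to a plateau.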
https://en.wikipedia.org/wiki/Differential%20staining
Differential staining is a staining process which uses more than one chemical stain. Using multiple stains can better differentiate between different microorganisms or structures/cellular components of a single organism. Differential staining is used to detect abnormalities in the proportion of different white blood cells in the blood. The process or results are called a WBC differential. This test is useful because many diseases alter the proportion of certain white blood cells. By analyzing these differences in combination with a clinical exam and other lab tests, medical professionals can diagnose disease. One commonly recognizable use of differential staining is the Gram stain. Gram staining uses two dyes, crystal violet and fuchsin or safranin (the counterstain), to differentiate between Gram-positive bacteria (which have a large peptidoglycan layer on the outer surface of the cell) and Gram-negative bacteria. Acid-fast stains are also differential stains. Further reading http://www.uphs.upenn.edu/bugdrug/antibiotic_manual/Gram2.htm The Gram Stain Technique
https://en.wikipedia.org/wiki/High%20Assurance%20Internet%20Protocol%20Encryptor
A High Assurance Internet Protocol Encryptor (HAIPE) is a Type 1 encryption device that complies with the National Security Agency's HAIPE IS (formerly the HAIPIS, the High Assurance Internet Protocol Interoperability Specification). The cryptography used is Suite A and Suite B, also specified by the NSA as part of the Cryptographic Modernization Program. HAIPE IS is based on IPsec with additional restrictions and enhancements. One of these enhancements is the ability to encrypt multicast data using a "preplaced key" (see definition in List of cryptographic key types). This requires loading the same key on all HAIPE devices that will participate in the multicast session in advance of data transmission. A HAIPE is typically a secure gateway that allows two enclaves to exchange data over an untrusted or lower-classification network. Examples of HAIPE devices include:
L3Harris Technologies' Encryption Products: KG-245X 10 Gbit/s (HAIPE IS v3.1.2 and Foreign Interoperable), KG-245A fully tactical 1 Gbit/s (HAIPE IS v3.1.2 and Foreign Interoperable), RedEagle
ViaSat's AltaSec Products: KG-250 and KG-255 [1 Gbit/s]
General Dynamics Mission Systems TACLANE Products: FLEX (KG-175F), 10G (KG-175X), Nano (KG-175N)
Airbus Defence & Space: ECTOCRYP Transparent Cryptography
Three of these devices are compliant with the HAIPE IS v3.0.2 specification, while the remaining devices use HAIPE IS version 1.3.5, which has a couple of notable limitations: limited support for routing protocols and for open network management. A HAIPE is an IP encryption device: it looks up the destination IP address of a packet in its internal Security Association Database (SAD) and picks the encrypted tunnel based on the appropriate entry. For new communications, HAIPEs use the internal Security Policy Database (SPD) to set up new tunnels with the appropriate algorithms and settings. Due to the lack of support for modern commercial routing protocols, HAIPEs often must be preprogrammed with sta
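A toy illustration of the lookup logic just described (the data structures and names here are ours; real HAIPE internals are not public): the destination address selects an existing security association, and the policy database governs creation of new tunnels.

```python
# Toy model of destination-based tunnel selection, loosely following the
# SAD/SPD split described above. Not a real HAIPE implementation.
from ipaddress import ip_address, ip_network

sad = {  # Security Association Database: destination prefix -> tunnel id
    ip_network("10.1.0.0/16"): "tunnel-A",
    ip_network("10.2.0.0/16"): "tunnel-B",
}
spd = [  # Security Policy Database: prefix -> settings for new tunnels
    (ip_network("10.0.0.0/8"), {"suite": "Suite B", "mode": "tunnel"}),
]

def pick_tunnel(dst: str) -> str:
    addr = ip_address(dst)
    for prefix, tunnel in sad.items():          # existing association?
        if addr in prefix:
            return tunnel
    for prefix, policy in spd:                  # else consult the policy
        if addr in prefix:
            return f"new tunnel with {policy['suite']}"
    raise LookupError("no policy for destination; packet dropped")

print(pick_tunnel("10.2.3.4"))   # -> tunnel-B
print(pick_tunnel("10.9.9.9"))   # -> new tunnel with Suite B
```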
https://en.wikipedia.org/wiki/Greenberger%E2%80%93Horne%E2%80%93Zeilinger%20state
In physics, in the area of quantum information theory, a Greenberger–Horne–Zeilinger state (GHZ state) is a certain type of entangled quantum state that involves at least three subsystems (particle states, qubits, or qudits). The four-particle version was first studied by Daniel Greenberger, Michael Horne and Anton Zeilinger in 1989, and the three-particle version was introduced by N. David Mermin in 1990. Extremely non-classical properties of the state have been observed. GHZ states for large numbers of qubits are theorized to give enhanced performance for metrology compared to other qubit superposition states. Definition The GHZ state is an entangled quantum state for 3 qubits, given by $|\mathrm{GHZ}\rangle = \frac{|000\rangle + |111\rangle}{\sqrt{2}}$. Generalization The generalized GHZ state is an entangled quantum state of $M > 2$ subsystems. If each system has dimension $d$, i.e., the local Hilbert space is isomorphic to $\mathbb{C}^d$, then the total Hilbert space of an $M$-partite system is $\mathcal{H} = (\mathbb{C}^d)^{\otimes M}$. This GHZ state is also called an $M$-partite qudit GHZ state. Its formula as a tensor product is $|\mathrm{GHZ}\rangle = \frac{1}{\sqrt{d}} \sum_{i=0}^{d-1} |i\rangle^{\otimes M}$. In the case of each of the subsystems being two-dimensional, that is for a collection of $M$ qubits, it reads $|\mathrm{GHZ}\rangle = \frac{|0\rangle^{\otimes M} + |1\rangle^{\otimes M}}{\sqrt{2}}$. Properties There is no standard measure of multi-partite entanglement because different, not mutually convertible, types of multi-partite entanglement exist. Nonetheless, many measures define the GHZ state to be a maximally entangled state. Another important property of the GHZ state is that taking the partial trace over one of the three systems yields $\mathrm{Tr}_3\left(|\mathrm{GHZ}\rangle\langle\mathrm{GHZ}|\right) = \frac{1}{2}\left(|00\rangle\langle 00| + |11\rangle\langle 11|\right)$, which is an unentangled mixed state. It has certain two-particle (qubit) correlations, but these are of a classical nature. On the other hand, if we were to measure one of the subsystems in such a way that the measurement distinguishes between the states 0 and 1, we will leave behind either $|00\rangle$ or $|11\rangle$, which are unentangled pure states. This is unlike the W state, which leaves bipartite entanglements even when we measure one of its subsystems. The GHZ state is non-biseparable and is the representative of o
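A small self-contained check of the partial-trace property above (NumPy only; the index bookkeeping is ours):

```python
# Build the 3-qubit GHZ state and verify that tracing out one qubit
# leaves the unentangled mixed state (|00><00| + |11><11|) / 2.
import numpy as np

zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def kron(*vecs):
    out = np.array([1.0])
    for v in vecs:
        out = np.kron(out, v)
    return out

ghz = (kron(zero, zero, zero) + kron(one, one, one)) / np.sqrt(2)

rho = np.outer(ghz, ghz)                 # 8x8 density matrix
rho = rho.reshape(2, 4, 2, 4)            # split qubit 1 from qubits 2 and 3
reduced = np.einsum('iaib->ab', rho)     # partial trace over qubit 1
print(np.round(reduced, 3))              # diag(0.5, 0, 0, 0.5)
```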
https://en.wikipedia.org/wiki/Predictive%20analytics
Predictive analytics is a form of business analytics applying machine learning to generate a predictive model for certain business applications. As such, it encompasses a variety of statistical techniques from predictive modeling and machine learning that analyze current and historical facts to make predictions about future or otherwise unknown events. It represents a major subset of machine learning applications; in some contexts, it is synonymous with machine learning. In business, predictive models exploit patterns found in historical and transactional data to identify risks and opportunities. Models capture relationships among many factors to allow assessment of risk or potential associated with a particular set of conditions, guiding decision-making for candidate transactions. The defining functional effect of these technical approaches is that predictive analytics provides a predictive score (probability) for each individual (customer, employee, healthcare patient, product SKU, vehicle, component, machine, or other organizational unit) in order to determine, inform, or influence organizational processes that pertain across large numbers of individuals, such as in marketing, credit risk assessment, fraud detection, manufacturing, healthcare, and government operations including law enforcement. Definition Predictive analytics is a set of business intelligence (BI) technologies that uncovers relationships and patterns within large volumes of data that can be used to predict behavior and events. Unlike other BI technologies, predictive analytics is forward-looking, using past events to anticipate the future. Predictive analytics statistical techniques include data modeling, machine learning, AI, deep learning algorithms and data mining. Often the unknown event of interest is in the future, but predictive analytics can be applied to any type of unknown whether it be in the past, present or future. For example, identifying suspects after a crime has been committ
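As a concrete illustration of the "predictive score (probability) for each individual" idea (a generic sketch using scikit-learn; the features, data and churn framing are invented for the example):

```python
# Fit a model on historical outcomes, then score new individuals with a
# probability between 0 and 1. Features and data are made up for the sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical data: [age, monthly_spend], label 1 = churned, 0 = stayed.
X = np.array([[23, 40], [35, 120], [52, 60], [46, 200], [29, 80], [61, 30]])
y = np.array([1, 0, 1, 0, 0, 1])

model = LogisticRegression().fit(X, y)

new_customers = np.array([[30, 50], [50, 180]])
scores = model.predict_proba(new_customers)[:, 1]  # P(churn) per individual
print(scores)
```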
https://en.wikipedia.org/wiki/W%20state
The W state is an entangled quantum state of three qubits which in the bra-ket notation has the following shape: $|\mathrm{W}\rangle = \frac{1}{\sqrt{3}}\left(|001\rangle + |010\rangle + |100\rangle\right)$, and which is remarkable for representing a specific type of multipartite entanglement and for occurring in several applications in quantum information theory. Particles prepared in this state reproduce the properties of Bell's theorem, which states that no classical theory of local hidden variables can produce the predictions of quantum mechanics. The state is named after Wolfgang Dür, who first reported the state together with Guifré Vidal and Ignacio Cirac in 2000. Properties The W state is the representative of one of the two non-biseparable classes of three-qubit states, the other being the Greenberger–Horne–Zeilinger state, $|\mathrm{GHZ}\rangle = \frac{1}{\sqrt{2}}\left(|000\rangle + |111\rangle\right)$; states of the two classes cannot be transformed (not even probabilistically) into each other by local quantum operations. Thus $|\mathrm{W}\rangle$ and $|\mathrm{GHZ}\rangle$ represent two very different kinds of tripartite entanglement. This difference is, for example, illustrated by the following interesting property of the W state: if one of the three qubits is lost, the state of the remaining 2-qubit system is still entangled. This robustness of W-type entanglement contrasts strongly with the GHZ state, which is fully separable after loss of one qubit. The states in the W class can be distinguished from all other 3-qubit states by means of multipartite entanglement measures. In particular, W states have non-zero entanglement across any bipartition while the 3-tangle vanishes; the 3-tangle is non-zero for GHZ-type states. Generalization The notion of the W state has been generalized for $n$ qubits and then refers to the quantum superposition with equal expansion coefficients of all possible pure states in which exactly one of the qubits is in an "excited state" $|1\rangle$, while all other ones are in the "ground state" $|0\rangle$: $|\mathrm{W}\rangle = \frac{1}{\sqrt{n}}\left(|100\ldots0\rangle + |010\ldots0\rangle + \cdots + |000\ldots1\rangle\right)$. Both the robustness against particle loss and the LOCC-inequivalence with the (generalized) GHZ state also hold for the $n$-qubit W state. Applications In systems in which a s
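The robustness claim can be checked numerically the same way as for the GHZ state (NumPy only; bookkeeping ours): tracing out one qubit of the W state leaves a two-qubit state that retains off-diagonal coherences, the signature of its surviving entanglement.

```python
# Build the 3-qubit W state and trace out the first qubit. Unlike the
# GHZ case, the reduced state keeps |01><10| coherences of weight 1/3.
import numpy as np

zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

w = (kron3(zero, zero, one) + kron3(zero, one, zero)
     + kron3(one, zero, zero)) / np.sqrt(3)

rho = np.outer(w, w).reshape(2, 4, 2, 4)   # axes: q1, (q2 q3), q1', (q2' q3')
reduced = np.einsum('iaib->ab', rho)       # partial trace over qubit 1
print(np.round(reduced, 3))                # note the 1/3 off-diagonal terms
```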
https://en.wikipedia.org/wiki/Bongkrek%20acid
Bongkrek acid (also known as bongkrekic acid) is a respiratory toxin produced in fermented coconut or corn contaminated by the bacterium Burkholderia gladioli pathovar cocovenenans. It is a highly toxic, heat-stable, colorless, odorless, and highly unsaturated tricarboxylic acid that inhibits the ADP/ATP translocase, also called the mitochondrial ADP/ATP carrier, preventing ATP from leaving the mitochondria to provide metabolic energy to the rest of the cell. Bongkrek acid, when consumed through contaminated foods, mainly targets the liver, brain, and kidneys, with symptoms that include vomiting, diarrhea, urinary retention, abdominal pain, and excessive sweating. Most of the outbreaks are found in Indonesia and China, where fermented coconut and corn-based foods are consumed. In October 2020, nine members of a family in China died after eating corn noodles contaminated with Bongkrek acid. Discovery and history In 1895, there was a food-poisoning outbreak in Java, Indonesia. The outbreak was caused by the consumption of an Indonesian traditional food called tempe Bongkrek. During this time, tempe Bongkrek served as a main source of protein in Java due to its inexpensiveness. Tempe Bongkrek is made by extracting the coconut meat by-product from coconut milk into a form of cake, which is then fermented with Rhizopus oligosporus mold. The first outbreak of Bongkrek poisoning by tempe Bongkrek was recorded by Dutch researchers; however, no further research into the cause of the poisoning was conducted at the time. During the 1930s, Indonesia went through an economic depression, and this condition caused some people to make tempe Bongkrek themselves, instead of buying it directly from well-trained producers. As a result, poisonings occurred frequently, reaching 10 to 12 a year. Dutch scientists W.K. Mertens and A.G. van Veen from the Eijkman Institute in Jakarta started searching for the cause of the poisoning in the early 1930s. They successfully i
https://en.wikipedia.org/wiki/HTML%20email
HTML email is the use of a subset of HTML to provide formatting and semantic markup capabilities in email that are not available with plain text: Text can be linked without displaying a URL, or breaking long URLs into multiple pieces. Text is wrapped to fit the width of the viewing window, rather than uniformly breaking each line at 78 characters (defined in RFC 5322, which was necessary on older text terminals). It allows in-line inclusion of images, tables, as well as diagrams or mathematical formulae as images, which are otherwise difficult to convey (typically using ASCII art). Adoption Most graphical email clients support HTML email, and many default to it. Many of these clients include both a GUI editor for composing HTML emails and a rendering engine for displaying received HTML emails. Since its conception, a number of people have vocally opposed all HTML email (and even MIME itself), for a variety of reasons. For instance, the ASCII Ribbon Campaign advocated that all email should be sent in ASCII text format. The campaign was unsuccessful and was abandoned in 2013. While still considered inappropriate in many newsgroup postings and mailing lists, its adoption for personal and business mail has only increased over time. Some of those who strongly opposed it when it first came out now see it as mostly harmless. According to surveys by online marketing companies, adoption of HTML-capable email clients is now nearly universal, with less than 3% reporting that they use text-only clients. The majority of users prefer to receive HTML emails over plain text. Compatibility Email software that complies with RFC 2822 is only required to support plain text, not HTML formatting. Sending HTML formatted emails can therefore lead to problems if the recipient's email client does not support it. In the worst case, the recipient will see the HTML code instead of the intended message. Among those email clients that do support HTML, some do not render it consistently
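A common way to sidestep the compatibility problem described above is to send both representations in one message as multipart/alternative, with the plain-text part first (a minimal sketch using Python's standard library; the addresses are placeholders):

```python
# Build a message carrying both plain-text and HTML bodies; clients that
# cannot render HTML fall back to the text/plain part.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"      # placeholder address
msg["To"] = "recipient@example.com"     # placeholder address
msg["Subject"] = "Multipart demo"

msg.set_content("Hello! This is the plain-text fallback.")
msg.add_alternative(
    "<html><body><p>Hello! This is the <b>HTML</b> version.</p></body></html>",
    subtype="html",
)
print(msg)   # hand off to smtplib.SMTP(...).send_message(msg) to deliver
```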
https://en.wikipedia.org/wiki/NinJo
NinJo is a meteorological software system. It is a community project of the German Weather Service, the Meteorological Service of Canada, the Danish Meteorological Institute, MeteoSwiss, and the German Bundeswehr. It consists of modules for monitoring weather events, editing point forecasts and viewing meteorological data. An additional batch component is able to render graphical products off-line; these may, for example, be visualized by a web service. Essentially it is a client-server system implemented fully in the programming language Java. NinJo was initiated by the German Weather Service (Deutscher Wetterdienst, DWD) and the German army (Bundeswehr Geo Information Service, BGIS) in 2000. Since 2006, NinJo has been used operationally. NinJo is licensed for weather services, organisations and universities not taking part in the development consortium. Description NinJo is a client-server system with interactive displays on the client side fed by batch applications implemented on the server. The system is programmed entirely in Java and can easily be extended by further layers and applications according to user-specific requirements. The workstation fed by the servers can be installed on different operating systems (e.g. Unix, Linux and Microsoft Windows), avoiding porting the source code onto the specific operating system. The NinJo server imports a variety of meteorological data, such as METAR reports, weather radar and weather satellite images and numerical weather prediction (NWP) outputs, through dedicated file handling programs, and makes them accessible to the client displays. The client is a NinJo workstation which presents data in separate layers. Users can add as many layers to a NinJo scene as they want, with all layers showing time-synchronised data for the same map area. The layers show geo-referenced data, not fixed images, so the screen display is always rendered directly from the data, and interactive probing using the mouse gives the values
https://en.wikipedia.org/wiki/Yerevan%20Physics%20Institute
The A.I. Alikhanyan National Science Laboratory is a research institute located in Yerevan, Armenia. It was founded in 1943 as a branch of Yerevan State University by the brothers Abram Alikhanov and Artem Alikhanian. It was often referred to by the acronym YerPhI (Yerevan Physics Institute). In 2011 it was renamed to its current name, the A.I. Alikhanyan National Science Laboratory. History and strategy The Yerevan Physics Institute was founded in 1943 as a branch of Yerevan State University by the brothers Abram Alikhanov and Artem Alikhanian. Later, two high-altitude cosmic ray stations were founded on Mount Aragats (3,200 m) and Nor Amberd (2,000 m). In 1963 the institute was transferred to the Soviet Union Atomic Energy State Committee. The construction of a 6 GeV electron synchrotron, completed in 1967, became an important landmark in the history of the institute; it was the first particle accelerator in Armenia (Arus, "ԱՐՈՒՍ"). After the collapse of the Soviet Union, YerPhI continued research in the fields of high-energy physics and astrophysics in Armenia and worldwide, using the world's biggest accelerators and cosmic ray detectors. YerPhI has since received the status of the A.I. Alikhanyan National Laboratory. Brief summary of scientific activities Among the key results of YerPhI in the early years were the discovery of protons and neutrons in cosmic rays, and the first evidence of the existence of particles with masses between those of muons and protons. The high-altitude research stations have remained the main research base of the Cosmic Ray Division (CRD) of YerPhI until now. Among the CRD achievements were: the discovery of a sharp knee in the light components of primary cosmic rays, the detection of the highest-energy protons accelerated on the Sun, and the creation of the Aragats Space Environmental Center in 2000 for studies of the solar-terrestrial connection, where the CRD became one of the world's leaders. The 6 GeV electron synchrotron was completed in 1967. During 197
https://en.wikipedia.org/wiki/Garlic%20powder
Garlic powder is a spice that is derived from dehydrated garlic and used in cooking for flavour enhancement. The process of making garlic powder includes drying and dehydrating the vegetable, then powdering it through machinery or home-based appliances depending on the scale of production. Garlic powder is a common component of spice mixes. It is also a common component of seasoned salt. Production Cultivation There are two types of garlic species: Softneck (Allium sativum sativum) and Hardneck (Allium sativum ophioscorodon). Hardneck garlic varieties are believed to have more flavour than Softneck garlics, characterized by a spicy and more complex taste than other garlic strains. While Hardneck garlics flourish in cold weather, due to their extensive time of vernalization, Softnecks seemingly grow better in warmer climates. Distinguishing between a Hardneck and Softneck garlic is done through the presence of a scape (flower stalk). The garlic species most commonly used for powder is the Softneck variety. Due to their less complex scent and taste, the Softneck species are more suited as a garnish or spice in dishes and also have a longer storage life than Hardneck varieties. Garlic cloves thrive when planted in mid-autumn, in a location with plentiful sunlight. In tropical areas, garlic grows most successfully when planted in autumn, maturing in early summer; in cooler areas it is planted in later autumn, to be harvested in late summer. The larger bulbs are split and the cloves inserted into soil, around 4-6 inches apart and 3 inches deep, with the pointy end facing upwards. Softneck and Hardneck garlic are planted identically; however, Softneck garlics are more suited to warmer climates. Garlic must be harvested at a particular time in order to prevent the vegetable from rotting, while also maximising the growth of each bulb within the skin. Green garlic is indicative of harvesting that has taken place before the cloves have ripened, 'soft' garlic is the term given to a h
https://en.wikipedia.org/wiki/IEEE%20P802.1p
IEEE P802.1p was a task group active from 1995 to 1998, responsible for adding traffic class expediting and dynamic multicast filtering to the IEEE 802.1D standard. The task group developed a mechanism for implementing quality of service (QoS) at the media access control (MAC) level. Although this technique is commonly referred to as IEEE 802.1p, the group's work with the new priority classes and Generic Attribute Registration Protocol (GARP) was not published separately but was incorporated into a major revision of the standard, IEEE 802.1D-1998, which subsequently was incorporated into the IEEE 802.1Q-2014 standard. The work also required a short amendment extending the frame size of the Ethernet standard by four bytes, which was published as IEEE 802.3ac in 1998. The QoS technique developed by the working group, also known as class of service (CoS), is a 3-bit field called the Priority Code Point (PCP) within an Ethernet frame header when using VLAN tagged frames as defined by IEEE 802.1Q. It specifies a priority value between 0 and 7 inclusive that can be used by QoS disciplines to differentiate traffic. Priority levels Eight different classes of service are available as expressed through the 3-bit PCP field in an IEEE 802.1Q header added to the frame. The way traffic is treated when assigned to any particular class is undefined and left to the implementation. The IEEE, however, has made some broad recommendations (PCP value: traffic type):
1: Background (BK, lowest priority)
0: Best effort (BE, default)
2: Excellent effort (EE)
3: Critical applications (CA)
4: Video, with less than 100 ms latency and jitter (VI)
5: Voice, with less than 10 ms latency and jitter (VO)
6: Internetwork control (IC)
7: Network control (NC, highest priority)
Note that the above recommendations have been in force since IEEE 802.1Q-2005 and were revised from the original recommendations in IEEE 802.1D-2004 to better accommodate differentiated services for IP networking. See also IEEE 802.1 IEEE 802.11e IEEE 802.3 Type of service (ToS) Ethernet priority flow control
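The PCP field occupies the top three bits of the 16-bit Tag Control Information (TCI) word in an 802.1Q tag, next to the drop eligible indicator (DEI) bit and the 12-bit VLAN ID. A small sketch of packing and unpacking it (field layout per 802.1Q; the sample values are arbitrary):

```python
# Pack PCP (3 bits), DEI (1 bit) and VLAN ID (12 bits) into the 16-bit
# TCI field of an 802.1Q tag, then unpack it again.
def pack_tci(pcp: int, dei: int, vid: int) -> int:
    assert 0 <= pcp <= 7 and dei in (0, 1) and 0 <= vid <= 0xFFF
    return (pcp << 13) | (dei << 12) | vid

def unpack_tci(tci: int):
    return (tci >> 13) & 0x7, (tci >> 12) & 0x1, tci & 0xFFF

tci = pack_tci(pcp=5, dei=0, vid=100)   # voice-class traffic on VLAN 100
print(hex(tci), unpack_tci(tci))        # 0xa064 (5, 0, 100)
```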
https://en.wikipedia.org/wiki/Fibroblast%20growth%20factor
Fibroblast growth factors (FGF) are a family of cell signalling proteins produced by macrophages; they are involved in a wide variety of processes, most notably as crucial elements for normal development in animal cells. Any irregularities in their function lead to a range of developmental defects. These growth factors typically act as systemic or locally circulating molecules of extracellular origin that activate cell surface receptors. A defining property of FGFs is that they bind to heparin and to heparan sulfate. Thus, some are sequestered in the extracellular matrix of tissues that contains heparan sulfate proteoglycans and are released locally upon injury or tissue remodeling. Families In humans, 23 members of the FGF family have been identified, all of which are structurally related signaling molecules: Members FGF1 through FGF10 all bind fibroblast growth factor receptors (FGFRs). FGF1 is also known as acidic fibroblast growth factor, and FGF2 is also known as basic fibroblast growth factor. Members FGF11, FGF12, FGF13, and FGF14, also known as FGF homologous factors 1-4 (FHF1-FHF4), have been shown to have functions distinct from those of the other FGFs. Although these factors have remarkable sequence homology to the FGFs, they do not bind FGFRs and are involved in intracellular processes unrelated to the FGFs. This group is also known as "iFGF". Human FGF18 is involved in cell development and morphogenesis in various tissues, including cartilage. Human FGF20 was identified based on its homology to Xenopus FGF-20 (XFGF-20). FGF15 through FGF23 were described later, and their functions are still being characterized. FGF15 is the mouse ortholog of human FGF19 (there is no human FGF15) and, where their functions are shared, they are often described as FGF15/19. In contrast to the local activity of the other FGFs, FGF15/19, FGF21 and FGF23 have hormonal systemic effects. Receptors The mammalian fibroblast growth factor receptor family has 4 members, FGFR1, FGFR2, FGFR3,
https://en.wikipedia.org/wiki/MIK%20%28character%20set%29
MIK (МИК) is an 8-bit Cyrillic code page used with DOS. It is based on the character set used in the Bulgarian Pravetz 16 IBM PC compatible system. Kermit calls this character set "BULGARIA-PC" / "bulgaria-pc". In Bulgaria, it was sometimes incorrectly referred to as code page 856 (which clashes with IBM's definition for a Hebrew code page). This code page is known by FreeDOS as Code page 3021. This is the most widespread DOS/OEM code page used in Bulgaria, rather than CP 808, CP 855, CP 866 or CP 872. Almost every DOS program created in Bulgaria which has Bulgarian strings in it uses MIK as its encoding, and many such programs are still in use. Character set Each character is shown with its equivalent Unicode code point and its decimal code point. Only the second half of the table (code points 128–255) is shown, the first half (code points 0–127) being the same as ASCII. Notes for implementors of mapping tables to Unicode Implementors of mapping tables to Unicode should note that the MIK code page unifies some characters. Binary character manipulations The MIK code page keeps all Cyrillic letters in alphabetical order, which enables very easy character manipulation in binary form:
10xx xxxx - a Cyrillic letter
100x xxxx - an upper-case Cyrillic letter
101x xxxx - a lower-case Cyrillic letter
In this case, testing and character-manipulation functions such as IsAlpha(), IsUpper(), IsLower(), ToUpper() and ToLower() become simple bit operations, and sorting reduces to a plain comparison of character values. See also Hardware code page
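A sketch of those bit tricks (the function names mirror the C-library style ones mentioned above; this assumes raw MIK byte values, where letters occupy 0x80-0xBF):

```python
# Bit-level case handling for MIK-encoded Cyrillic letters: letters live
# in 0x80-0xBF, upper case in 0x80-0x9F, lower case in 0xA0-0xBF.
def is_alpha(b: int) -> bool:
    return b & 0xC0 == 0x80          # matches 10xx xxxx

def is_upper(b: int) -> bool:
    return b & 0xE0 == 0x80          # matches 100x xxxx

def is_lower(b: int) -> bool:
    return b & 0xE0 == 0xA0          # matches 101x xxxx

def to_upper(b: int) -> int:
    return b & ~0x20 if is_alpha(b) else b   # clear the case bit

def to_lower(b: int) -> int:
    return b | 0x20 if is_alpha(b) else b    # set the case bit

# 0x80 is uppercase А in MIK, 0xA0 its lowercase counterpart.
assert is_upper(0x80) and is_lower(0xA0) and to_lower(0x80) == 0xA0
```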
https://en.wikipedia.org/wiki/P-bodies
In cellular biology, P-bodies, or processing bodies, are distinct foci formed by phase separation within the cytoplasm of a eukaryotic cell, consisting of many enzymes involved in mRNA turnover. P-bodies are highly conserved structures and have been observed in somatic cells originating from vertebrates and invertebrates, plants and yeast. To date, P-bodies have been demonstrated to play fundamental roles in general mRNA decay, nonsense-mediated mRNA decay, adenylate-uridylate-rich element mediated mRNA decay, and microRNA (miRNA) induced mRNA silencing. Not all mRNAs which enter P-bodies are degraded, as it has been demonstrated that some mRNAs can exit P-bodies and re-initiate translation. Purification and sequencing of the mRNA from purified processing bodies showed that these mRNAs are largely translationally repressed upstream of translation initiation and are protected from 5' mRNA decay. P-bodies were originally proposed to be the sites of mRNA degradation in the cell, involved in decapping and digestion of mRNAs earmarked for destruction. Later work called this into question, suggesting that P-bodies store mRNA until it is needed for translation. In neurons, P-bodies are moved by motor proteins in response to stimulation. This is likely tied to local translation in dendrites. History P-bodies were first described in the scientific literature by Bashkirov et al. in 1997, in which they describe "small granules… discrete, prominent foci" as the cytoplasmic location of the mouse exoribonuclease mXrn1p. It was not until 2002 that a glimpse into the nature and importance of these cytoplasmic foci was published, when researchers demonstrated that multiple proteins involved with mRNA degradation localize to the foci. Their importance was recognized after experimental evidence was obtained pointing to P-bodies as the sites of mRNA degradation in the cell. The researchers named these structures processing bodies or "P bodies". During this time, many descriptive names w
https://en.wikipedia.org/wiki/Very-high-density%20cable%20interconnect
A very-high-density cable interconnect (VHDCI) is a 68-pin connector that was introduced in the SPI-2 document of SCSI-3. The VHDCI connector is a very small connector that allows placement of four wide SCSI connectors on the back of a single PCI card slot. Physically, it looks like a miniature Centronics type connector. It uses the regular 68-contact pin assignment. The male connector (plug) is used on the cable and the female connector ("receptacle") on the device. Other uses Apart from the standardized use with the SCSI interface, several vendors have also used VHDCI connectors for other types of interfaces: Nvidia: for an external PCI Express 8-lane interconnect, used in Quadro Plex VCS and in the Quadro NVS 420 as a display port connector. ATI Technologies: on the FireMV 2400 to convey two DVI and two VGA signals on a single connector, ganging two of these connectors side by side in order to allow the FireMV 2400 to be a low-profile quad-display card. The Radeon X1950 XTX Crossfire Edition also used a pair of the connectors to grant more inter-card bandwidth than the PCI Express bus allowed at the time for Crossfire. AMD: some VisionTek variants of the Radeon HD 7750 use a VHDCI connector alongside a Mini DisplayPort to allow a five-display Eyefinity array (broken out to four HDMI plus one Mini DisplayPort) on a low-profile card. VisionTek also released a similar Radeon HD 5570, though it lacked a Mini DisplayPort. Juniper Networks: for their 12- and 48-port 100BASE-TX PICs (physical interface cards). The cable connects the VHDCI connector on the PIC at one end, via an RJ-21 connector at the other end, to an RJ-45 patch panel. Cisco: 3750 StackWise stacking cables. National Instruments: on their high-speed digital I/O cards. AudioScience: uses VHDCI to carry multiple analog balanced audio and digital AES/EBU audio streams, plus clock and GPIO signals. See also SCSI connector
https://en.wikipedia.org/wiki/Knowledge%20integration
Knowledge integration is the process of synthesizing multiple knowledge models (or representations) into a common model (representation). Compared to information integration, which involves merging information having different schemas and representation models, knowledge integration focuses more on synthesizing the understanding of a given subject from different perspectives. For example, multiple interpretations are possible of a set of student grades, typically each from a certain perspective. An overall, integrated view and understanding of this information can be achieved if these interpretations can be put under a common model, say, a student performance index. The Web-based Inquiry Science Environment (WISE), from the University of California at Berkeley has been developed along the lines of knowledge integration theory. Knowledge integration has also been studied as the process of incorporating new information into a body of existing knowledge with an interdisciplinary approach. This process involves determining how the new information and the existing knowledge interact, how existing knowledge should be modified to accommodate the new information, and how the new information should be modified in light of the existing knowledge. A learning agent that actively investigates the consequences of new information can detect and exploit a variety of learning opportunities; e.g., to resolve knowledge conflicts and to fill knowledge gaps. By exploiting these learning opportunities the learning agent is able to learn beyond the explicit content of the new information. The machine learning program KI, developed by Murray and Porter at the University of Texas at Austin, was created to study the use of automated and semi-automated knowledge integration to assist knowledge engineers constructing a large knowledge base. A possible technique which can be used is semantic matching. More recently, a technique useful to minimize the effort in mapping validation and vi
https://en.wikipedia.org/wiki/List%20of%20RFCs
This is a partial list of RFCs (request for comments memoranda). A Request for Comments (RFC) is a publication in a series from the principal technical development and standards-setting bodies for the Internet, most prominently the Internet Engineering Task Force (IETF). While there are over 9,150 RFCs as of February 2022, this list consists of RFCs that have related articles. A complete list is available from the IETF website. Numerical list Topical list Obsolete RFCs are indicated with struck-through text.
https://en.wikipedia.org/wiki/Mae-Wan%20Ho
Mae-Wan Ho (12 November 1941 – 24 March 2016) was a geneticist known for her critical views on genetic engineering and evolution. She authored or co-authored a number of publications, including 10 books, such as The Rainbow and the Worm, the Physics of Organisms (1993, 1998), Genetic Engineering: Dream or Nightmare? (1998, 1999), Living with the Fluid Genome (2003) and Living Rainbow H2O (2012). Biography Ho received a PhD in biochemistry in 1967 from Hong Kong University, was a postdoctoral fellow in biochemical genetics at the University of California, San Diego, from 1968 to 1972, a senior research fellow at Queen Elizabeth College, a lecturer in genetics (from 1976) and a reader in biology (from 1985) at the Open University, and, after retiring in June 2000, a visiting professor of biophysics at Catania University, Sicily. Ho died of cancer in March 2016. Institute of Science in Society Ho was a co-founder and director of the Institute of Science in Society (ISIS), an interest group which published fringe articles about climate change, GMOs, homeopathy, traditional Chinese medicine, and water memory. In reviewing the organisation, David Colquhoun accused the ISIS of promoting pseudoscience and specifically criticised Ho's understanding of homeopathy. The institute is on the Quackwatch list of questionable organizations. Genetic engineering Ho, together with Joe Cummins of the University of Western Ontario, argued that a sterility gene engineered into a crop could be transferred to other crops or wild relatives and that "This could severely compromise the agronomic performance of conventional crops and cause wild relatives to go extinct". They argued that this process could also produce genetic instabilities, which might be "leading to catastrophic breakdown", and stated that there are no data to assure that this has not happened or cannot happen. This concern contrasts with the reason why these sterile plants were developed, which was to prevent the transfer of gene
https://en.wikipedia.org/wiki/Iron%20fertilization
Iron fertilization is the intentional introduction of iron-containing compounds (like iron sulfate) to iron-poor areas of the ocean surface to stimulate phytoplankton production. This is intended to enhance biological productivity and/or accelerate carbon dioxide (CO2) sequestration from the atmosphere. Iron is a trace element necessary for photosynthesis in plants. It is highly insoluble in sea water and in a variety of locations is the limiting nutrient for phytoplankton growth. Large algal blooms can be created by supplying iron to iron-deficient ocean waters. These blooms can nourish other organisms. Ocean iron fertilization is an example of a geoengineering technique. Iron fertilization attempts to encourage phytoplankton growth, which removes carbon from the atmosphere for at least a period of time. This technique is controversial because there is limited understanding of its complete effects on the marine ecosystem, including side effects and possibly large deviations from expected behavior. Such effects potentially include release of nitrogen oxides, and disruption of the ocean's nutrient balance. Controversy remains over the effectiveness of atmospheric sequestration and ecological effects. Since 1990, 13 major large-scale experiments have been carried out to evaluate efficiency and possible consequences of iron fertilization in ocean waters. A study in 2017 concluded that the method is unproven: sequestration efficiency is low, sometimes no effect was seen, and the amount of iron needed to make even a small cut in carbon emissions is on the order of a million tons per year. Approximately 25 per cent of the ocean surface has ample macronutrients, with little plant biomass (as defined by chlorophyll). The production in these high-nutrient low-chlorophyll (HNLC) waters is primarily limited by micronutrients, especially iron. The cost of distributing iron over large ocean areas is large compared with the expected value of carbon credits. Research in the
https://en.wikipedia.org/wiki/Haplogroup%20L3
Haplogroup L3 is a human mitochondrial DNA (mtDNA) haplogroup. The clade has played a pivotal role in the early dispersal of anatomically modern humans. It is strongly associated with the out-of-Africa migration of modern humans of about 70–50,000 years ago. It is inherited by all modern non-African populations, as well as by some populations in Africa. Origin Haplogroup L3 arose close to 70,000 years ago, near the time of the recent out-of-Africa event. This dispersal originated in East Africa and expanded to West Asia, and further to South and Southeast Asia in the course of a few millennia, and some research suggests that L3 participated in this migration out of Africa. L3 is also common amongst African Americans and Afro-Brazilians. A 2007 estimate for the age of L3 suggested a range of 104–84,000 years ago. More recent analyses, including Soares et al. (2012) arrive at a more recent date, of roughly 70–60,000 years ago. Soares et al. also suggest that L3 most likely expanded from East Africa into Eurasia sometime around 65–55,000 years ago as part of the recent out-of-Africa event, as well as from East Africa into Central Africa from 60 to 35,000 years ago. In 2016, Soares et al. again suggested that haplogroup L3 emerged in East Africa, leading to the Out-of-Africa migration, around 70–60,000 years ago. Haplogroups L6 and L4 form sister clades of L3 which arose in East Africa at roughly the same time but which did not participate in the out-of-Africa migration. The ancestral clade L3'4'6 has been estimated at 110 kya, and the L3'4 clade at 95 kya. The possibility of an origin of L3 in Asia was proposed by Cabrera et al. (2018) based on the similar coalescence dates of L3 and its Eurasian-distributed M and N derivative clades (ca. 70 kya), the distant location in Southeast Asia of the oldest known subclades of M and N, and the comparable age of the paternal haplogroup DE. According to this hypothesis, after an initial out-of-Africa migration of bearers of
https://en.wikipedia.org/wiki/Diversified%20Pharmaceutical%20Services
Diversified Pharmaceutical Services entered the market in 1976 as the pharmacy benefit manager for United HealthCare, a leading managed care organization. It pioneered many cost containment strategies that are now core pharmacy benefit manager services and became a recognized leader in clinical programs. History Diversified Pharmaceutical Services (DPS) grew out of the pharmacy department within United Healthcare. The company was sold to SmithKline Beecham for $2.3 billion in May 1994. In 1999, it was acquired by Express Scripts for $700 million in cash to create what was then the third largest pharmacy benefit manager in the United States.
https://en.wikipedia.org/wiki/Haplogroup%20Z
In human mitochondrial genetics, Haplogroup Z is a human mitochondrial DNA (mtDNA) haplogroup. Origin Haplogroup Z is believed to have arisen in Central Asia, and is a descendant of haplogroup CZ. Distribution The greatest clade diversity of haplogroup Z is found in East Asia and Central Asia. However, its greatest frequency appears in some peoples of Russia, such as Evens from Kamchatka (8/39 Z1a2a, 3/39 Z1a3, 11/39 = 28.2% Z total) and from Berezovka, Srednekolymsky District, Sakha Republic (3/15 Z1a3, 1/15 Z1a2a, 4/15 = 26.7% Z total), and among the Saami people of northern Scandinavia. With the exception of three Khakasses who belong to Z4, two Yakut who belong to Z3a1, two Yakut, a Yakutian Evenk, a Buryat, and an Altai Kizhi who belong to Z3(xZ3a, Z3c), and the presence of the Z3c clade among populations of Altai Republic, nearly all members of haplogroup Z in North Asia and Europe belong to subclades of Z1. The TMRCA of Z1 is 20,400 [95% CI 7,400–34,000] ybp according to Sukernik et al. 2012, 20,400 [95% CI 7,800–33,800] ybp according to Fedorova et al. 2013, or 19,600 [95% CI 12,500–29,300] ybp according to YFull. Among the members (Z1, Z2, Z3, Z4, and Z7) of haplogroup Z, Nepalese populations were characterized by the rare clades Z3a1a and Z7, of which Z3a1a was the most frequent sub-clade in Newar, with a frequency of 16.5%. Z3, found in East Asia, North Asia, and Mainland Southeast Asia (MSEA), is the oldest member of haplogroup Z with an estimated age of ~25.4 Kya. Haplogroup Z3a1a is also detected in other Nepalese populations, such as Magar (5.4%), Tharu, Kathmandu (mixed population) and Nepali-other (mixed population from Kathmandu and Eastern Nepal). Z3a1a1, detected in Tibet, Myanmar, Nepal, India, Thailand-Laos and Vietnam, traces its ancestral roots to China with a coalescent age of ~8.4 Kya. Fedorova et al. 2013 have reported finding Z*(xZ1a, Z3, Z4) in 1/388 Turks and 1/491 Kazakhs. These individuals should belong to Z1* (elsewhere observed in a Tofalar),
https://en.wikipedia.org/wiki/Soil-transmitted%20helminth
The soil-transmitted helminths (also called geohelminths) are a group of intestinal parasites belonging to the phylum Nematoda that are transmitted primarily through contaminated soil. They are so called because they have a direct life cycle which requires no intermediate hosts or vectors, and the parasitic infection occurs through faecal contamination of soil, foodstuffs and water supplies. The adult forms are essentially parasites of humans, causing soil-transmitted helminthiasis (STH), but also infect domesticated mammals. The juveniles are the infective forms and they undergo tissue-migratory stages during which they invade vital organs such as lungs and liver. Thus the disease manifestations can be both local and systemic. The geohelminths together present an enormous infection burden on humanity, amounting to 135,000 deaths every year, and persistent infection of more than two billion people. Types Soil-transmitted helminths are typically from the following families of nematodes, namely: Roundworms (family Ascarididae), e.g. Ascaris lumbricoides; Whipworms (family Trichuridae), e.g. Trichuris trichiura; Hookworms (family Ancylostomatidae), e.g. Ancylostoma duodenale and Necator americanus; and Threadworms (family Strongyloididae), e.g. Strongyloides stercoralis. Diseases Soil-transmitted helminthiasis Soil-transmitted helminthiasis is a collective name for the diseases caused by ascaris, whipworm and hookworms in humans. It includes species-specific diseases such as Ascariasis, which is caused by Ascaris lumbricoides; Hookworm diseases (ancylostomiasis and necatoriasis), which are caused by Necator americanus and Ancylostoma duodenale; and Trichuriasis, which is caused by Trichuris trichiura. Soil-transmitted helminthiasis is classified as one of the neglected tropical diseases projected to be controlled/eradicated by 2020 through the London Declaration on Neglected Tropical Diseases. Strongyloidiasis This is caused by Strongyloides stercoralis. Even though t
https://en.wikipedia.org/wiki/Computers%2C%20Freedom%20and%20Privacy%20Conference
The Computers, Freedom and Privacy Conference (or CFP, or the Conference on Computers, Freedom and Privacy) is an annual academic conference held in the United States or Canada about the intersection of computer technology, freedom, and privacy issues. The conference was founded in 1991, and since at least 1999, it has been organized under the aegis of the Association for Computing Machinery. It was originally sponsored by CPSR. CFP91 The first CFP was held in 1991 in Burlingame, California. CFP92 The second CFP was held on March 18–20, 1992 in Washington, DC. It was the first under the auspices of the Association for Computing Machinery. The conference chair was Lance Hoffman. The entire proceedings are available from the Association for Computing Machinery at https://dl.acm.org/doi/proceedings/10.1145/142652. CFP99 The Computers, Freedom and Privacy 99 Conference, sponsored by the Association for Computing Machinery, the 9th annual CFP, was held in Washington, DC from 6 April 1999 to 8 April 1999. CFP99 focused on international Internet regulation and privacy protection. There were close to 500 registered participants and attendees included high-level government officials, grassroots advocates and programmers. The conference chair for CFP99 was Marc Rotenberg and the program coordinator was Ross Stapleton-Gray. Keynote speakers at CFP99 were Tim Berners-Lee, director of the World Wide Web Consortium, Vint Cerf, president of the Internet Society, and FTC Commissioner Mozelle Thompson. Others who spoke at CFP99 included: David Banisar, policy director at the Electronic Privacy Information Center; US Representative Bob Barr, former federal prosecutor and Georgia Republican; Colin Bennett, a privacy expert at Canada's University of Victoria; Paula Breuning, a lawyer for the National Telecommunications and Information Administration in the United States Department of Commerce; Becky Burr, head of the Commerce Department unit
https://en.wikipedia.org/wiki/Urban%20mining
An urban mine is the stockpile of rare metals in the discarded waste electrical and electronic equipment (WEEE) of a society. Urban mining is the process of recovering these rare metals through mechanical and chemical treatments. In 1997, recycled gold accounted for approximately 20% of the 2700 tons of gold supplied to the market. The name was coined in the 1980s by Professor Hideo Nanjyo of the Research Institute of Mineral Dressing and Metallurgy at Tohoku University and the idea has gained significant traction in Japan (and in other parts of Asia) in the 21st century. Research published by the Japanese government's National Institute of Materials Science in 2010 estimated that there were 6,800 tonnes of gold recoverable from used electronic equipment in Japan.
https://en.wikipedia.org/wiki/Doppler%20cooling
Doppler cooling is a mechanism that can be used to trap and slow the motion of atoms to cool a substance. The term is sometimes used synonymously with laser cooling, though laser cooling includes other techniques. History Doppler cooling was simultaneously proposed by two groups in 1975, the first being David J. Wineland and Hans Georg Dehmelt and the second being Theodor W. Hänsch and Arthur Leonard Schawlow. It was first demonstrated by Wineland, Drullinger, and Walls in 1978 and shortly afterwards by Neuhauser, Hohenstatt, Toschek and Dehmelt. One conceptually simple form of Doppler cooling is referred to as optical molasses, since the dissipative optical force resembles the viscous drag on a body moving through molasses. Steven Chu, Claude Cohen-Tannoudji and William D. Phillips were awarded the 1997 Nobel Prize in Physics for their work in laser cooling and atom trapping. Brief explanation Doppler cooling involves light with frequency tuned slightly below an electronic transition in an atom. Because the light is detuned to the "red" (i.e. at lower frequency) of the transition, the atoms will absorb more photons if they move towards the light source, due to the Doppler effect. Consider the simplest case of 1D motion on the x axis. Let the photon be traveling in the +x direction and the atom in the −x direction. In each absorption event, the atom's momentum changes by the momentum of the photon, reducing its speed. The atom, which is now in the excited state, emits a photon spontaneously but randomly along +x or −x, and the recoil momentum is returned to the atom. If the photon was emitted along +x then there is no net change over the cycle; however, if the photon was emitted along −x, then the atom ends up moving more slowly, along either −x or +x. The net result of the absorption and emission process is a reduced speed of the atom, on the condition that its initial speed is larger than the recoil velocity from scattering a single photon. If the absorption and emission are repeated many times, the mean veloc
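The absorption and emission cycle just described is easy to caricature numerically. The sketch below works in units of the single-photon recoil velocity and uses made-up numbers rather than real atomic data; it only illustrates that repeated scattering drags the velocity toward zero, up to recoil noise.

```python
import random

# Toy 1D model of the cycle above: the photon travels along +x, the atom
# starts along -x. Each absorption changes the velocity by +1 recoil unit;
# each spontaneous emission recoils the atom by +1 or -1 at random.
random.seed(0)
v = -1000.0                              # initial velocity, recoil units
for _ in range(2000):
    if v < 0:                            # red detuning: only atoms moving
        v += 1.0                         # toward the beam absorb strongly
        v += random.choice((-1.0, 1.0))  # random emission recoil
print(v)  # prints 0.0: scattering stops once the atom no longer moves toward the beam
```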
https://en.wikipedia.org/wiki/Petrick%27s%20method
In Boolean algebra, Petrick's method (also known as Petrick function or branch-and-bound method) is a technique described by Stanley R. Petrick (1931–2006) in 1956 for determining all minimum sum-of-products solutions from a prime implicant chart. Petrick's method is very tedious for large charts, but it is easy to implement on a computer. The method was improved by Insley B. Pyne and Edward Joseph McCluskey in 1962. Algorithm Reduce the prime implicant chart by eliminating the essential prime implicant rows and the corresponding columns. Label the rows of the reduced prime implicant chart $P_1$, $P_2$, $P_3$, etc. Form a logical function $P$ which is true when all the columns are covered. $P$ consists of a product of sums, one sum per column, where each sum contains the rows $P_i$ that cover that column. Apply De Morgan's Laws to expand $P$ into a sum of products and minimize by applying the absorption law $X + XY = X$. Each term in the result represents a solution, that is, a set of rows which covers all of the minterms in the table. To determine the minimum solutions, first find those terms which contain a minimum number of prime implicants. Next, for each of the terms found in step five, count the number of literals in each prime implicant and find the total number of literals. Choose the term or terms composed of the minimum total number of literals, and write out the corresponding sums of prime implicants. Example of Petrick's method Following is the function we want to reduce: $f(A, B, C) = \sum m(0, 1, 2, 5, 6, 7)$. The prime implicant chart from the Quine-McCluskey algorithm has one column per minterm (0, 1, 2, 5, 6, 7) and one row per prime implicant, each row also listing its A, B, C values (a blank meaning the variable has been eliminated): K = m(0,1) covers minterms 0 and 1 (A=0, B=0); L = m(0,2) covers minterms 0 and 2 (A=0, C=0); M
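A compact implementation of the algorithm is sketched below. Since the chart above is truncated after row M, the row data assume the standard six-row example (K through Q); treat those covers as an assumption rather than as given by the text.

```python
# Sketch of Petrick's method. The chart below assumes the standard six-row
# example; only rows K and L survive in the truncated chart above.
chart = {
    "K": {0, 1}, "L": {0, 2}, "M": {1, 5},
    "N": {2, 6}, "P": {5, 7}, "Q": {6, 7},
}
columns = [0, 1, 2, 5, 6, 7]

# Each column contributes a sum (OR) of the rows covering it; P is the
# product (AND) of these sums. Expand into sum-of-products, representing
# each product term as a frozenset so absorption (X + XY = X) is a subset test.
sums = [[row for row, covered in chart.items() if col in covered] for col in columns]
terms = {frozenset()}
for s in sums:
    terms = {t | {row} for t in terms for row in s}
    terms = {t for t in terms if not any(u < t for u in terms)}  # absorption

fewest = min(len(t) for t in terms)
print(sorted("".join(sorted(t)) for t in terms if len(t) == fewest))
# -> ['KNP', 'LMQ']: the two minimum sum-of-products solutions
```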
https://en.wikipedia.org/wiki/Radiation-absorbent%20material
In materials science, radiation-absorbent material (RAM) is a material which has been specially designed and shaped to absorb incident RF radiation (also known as non-ionising radiation), as effectively as possible, from as many incident directions as possible. The more effective the RAM, the lower the resulting level of reflected RF radiation. Many measurements in electromagnetic compatibility (EMC) and antenna radiation patterns require that spurious signals arising from the test setup, including reflections, are negligible to avoid the risk of causing measurement errors and ambiguities. Introduction One of the most effective types of RAM comprises arrays of pyramid shaped pieces, each of which is constructed from a suitably lossy material. To work effectively, all internal surfaces of the anechoic chamber must be entirely covered with RAM. Sections of RAM may be temporarily removed to install equipment but they must be replaced before performing any tests. To be sufficiently lossy, RAM can be neither a good electrical conductor nor a good electrical insulator as neither type actually absorbs any power. Typically pyramidal RAM will comprise a rubberized foam material impregnated with controlled mixtures of carbon and iron. The length from base to tip of the pyramid structure is chosen based on the lowest expected frequency and the amount of absorption required. For low frequency damping, this distance is often 60 cm (24"), while high-frequency panels are as short as 7.5–10 cm (3–4"). Panels of RAM are typically installed on the walls of an EMC test chamber with the tips pointing inward to the chamber. Pyramidal RAM attenuates signal by two effects: scattering and absorption. Scattering can occur both coherently, when reflected waves are in-phase but directed away from the receiver, or incoherently where waves are picked up by the receiver but are out of phase and thus have lower signal strength. This incoherent scattering also occurs within the foam structure, w
https://en.wikipedia.org/wiki/Laminar%20flow%20cabinet
A laminar flow cabinet or tissue culture hood is a carefully enclosed bench designed to prevent contamination of semiconductor wafers, biological samples, or any particle sensitive materials. Air is drawn through a HEPA filter and blown in a very smooth, laminar flow towards the user. Due to the direction of air flow, the sample is protected from the user but the user is not protected from the sample. The cabinet is usually made of stainless steel with no gaps or joints where spores might collect. Such hoods exist in both horizontal and vertical configurations, and there are many different types of cabinets with a variety of airflow patterns and acceptable uses. Laminar flow cabinets may have a UV-C germicidal lamp to sterilize the interior and contents before usage to prevent contamination of the experiment. Germicidal lamps are usually kept on for fifteen minutes to sterilize the interior before the cabinet is used. The light must be switched off when the cabinet is being used, to limit exposure to skin and eyes as stray ultraviolet light emissions can cause cancer and cataracts. See also Asepsis Biosafety cabinet Fume hood
https://en.wikipedia.org/wiki/Storage%20virtualization
In computer science, storage virtualization is "the process of presenting a logical view of the physical storage resources to" a host computer system, "treating all storage media (hard disk, optical disk, tape, etc.) in the enterprise as a single pool of storage." A "storage system" is also known as a storage array, disk array, or filer. Storage systems typically use special hardware and software along with disk drives in order to provide very fast and reliable storage for computing and data processing. Storage systems are complex, and may be thought of as a special purpose computer designed to provide storage capacity along with advanced data protection features. Disk drives are only one element within a storage system, along with hardware and special purpose embedded software within the system. Storage systems can provide either block accessed storage, or file accessed storage. Block access is typically delivered over Fibre Channel, iSCSI, SAS, FICON or other protocols. File access is often provided using NFS or SMB protocols. Within the context of a storage system, there are two primary types of virtualization that can occur: Block virtualization used in this context refers to the abstraction (separation) of logical storage (partition) from physical storage so that it may be accessed without regard to physical storage or heterogeneous structure. This separation allows the administrators of the storage system greater flexibility in how they manage storage for end users. File virtualization addresses the NAS challenges by eliminating the dependencies between the data accessed at the file level and the location where the files are physically stored. This provides opportunities to optimize storage use and server consolidation and to perform non-disruptive file migrations. Block virtualization Address space remapping Virtualization of storage helps achieve location independence by abstracting the physical location of the data. The virtualization system p
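The address-space remapping idea behind block virtualization can be sketched in a few lines. The dictionary-based meta-data and the two-array layout below are illustrative assumptions, not any vendor's format.

```python
# Illustrative sketch of block virtualization: the virtualization layer keeps
# meta-data mapping each logical block to a (device, physical block) pair, so
# hosts address the logical space while the data can live anywhere.
mapping = {}  # logical block -> (device, physical block)

def provision(logical_blocks, devices):
    """Spread a logical address space across physical devices round-robin."""
    for lb in range(logical_blocks):
        mapping[lb] = (devices[lb % len(devices)], lb // len(devices))

def read(logical_block):
    dev, pb = mapping[logical_block]          # remap on every I/O
    return f"read {dev} block {pb}"

def migrate(logical_block, new_dev, new_pb):
    """Non-disruptive file/block migration: only the meta-data changes."""
    mapping[logical_block] = (new_dev, new_pb)

provision(8, ["arrayA", "arrayB"])
print(read(5))             # -> read arrayB block 2
migrate(5, "arrayA", 99)   # data moves; the host's logical address does not
print(read(5))             # -> read arrayA block 99
```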
https://en.wikipedia.org/wiki/A23187
A23187 is a mobile ion-carrier that forms stable complexes with divalent cations (ions with a charge of +2). A23187 is also known as Calcimycin, Calcium Ionophore, Antibiotic A23187 and Calcium Ionophore A23187. It is produced during fermentation by Streptomyces chartreusensis. Actions and uses A23187 has antibiotic properties against gram-positive bacteria and fungi. It also acts as a divalent cation ionophore, allowing these ions to cross cell membranes, which are usually impermeable to them. A23187 is most selective for Mn2+, somewhat less selective for Ca2+ and Mg2+, much less selective for Sr2+, and even less selective for Ba2+. The ionophore is used in laboratories to increase intracellular Ca2+ levels in intact cells. It also uncouples oxidative phosphorylation, the process cells use to synthesize adenosine triphosphate (ATP), which they use for energy. In addition, A23187 inhibits mitochondrial ATPase activity. A23187 also induces apoptosis in some cells (e.g. mouse lymphoma cell line, or S49, and Jurkat cells) and prevents it in others (e.g. cells dependent on interleukin 3 that have had the factor withdrawn). Inex Pharmaceuticals Corporation (Canada) reported an innovative application of A23187. Inex used A23187 as a molecular tool in order to make artificial liposomes loaded with anti-cancer drugs such as Topotecan. In the IVF field, calcium ionophore can be used in cases of low fertilization rate after an ICSI procedure, particularly with globozoospermia (round-headed sperm syndrome), where it substitutes for the absent sperm acrosome and plays a role in oocyte activation after ICSI. A recommended protocol is 0.5 microgram/ml applied twice for 10 min each, interrupted with fresh media and a 30 min incubation, followed by regular culture of the injected eggs for IVF. Biosynthesis The core biosynthetic enzymes are thought to include 3 proteins for the biosynthesis of the α-ketopyrrole moiety, 5 for modular type I polyketide synthases for the spiroketal ring, 4 for the biosynthesis of 3-hydroxyanthranili
https://en.wikipedia.org/wiki/Aphidicolin
Aphidicolin is a tetracyclic diterpene antibiotic isolated from the fungus Cephalosporium aphidicola with antiviral and antimitotic properties. Aphidicolin is a reversible inhibitor of eukaryotic nuclear DNA replication. It blocks the cell cycle at early S phase. It is a specific inhibitor of DNA polymerases α and δ in eukaryotic cells and in some viruses (vaccinia and herpesviruses) and an apoptosis inducer in HeLa cells. Natural aphidicolin is a secondary metabolite of the fungus Nigrospora oryzae.
https://en.wikipedia.org/wiki/Adapter%20%28genetics%29
An adapter or adaptor, or a linker in genetic engineering, is a short, chemically synthesized, single-stranded or double-stranded oligonucleotide that can be ligated to the ends of other DNA or RNA molecules. Double-stranded adapters can be synthesized to have blunt ends at both terminals or to have a sticky end at one end and a blunt end at the other. For instance, a double-stranded DNA adapter can be used to link the ends of two other DNA molecules that lack "sticky ends", that is, the complementary protruding single strands that would let them join by themselves. It may be used to add sticky ends to cDNA, allowing it to be ligated into the plasmid much more efficiently. Two adapters could base pair to each other to form dimers. A conversion adapter is used to join a DNA insert cut with one restriction enzyme, say EcoRI, with a vector opened with another enzyme, BamHI. This adapter can be used to convert the cohesive end produced by BamHI to one produced by EcoRI or vice versa. One of its applications is ligating cDNA into a plasmid or other vectors instead of using terminal deoxynucleotidyl transferase to add poly(A) to the cDNA fragment.
https://en.wikipedia.org/wiki/Fully%20Buffered%20DIMM
Fully Buffered DIMM (or FB-DIMM) is a memory technology that can be used to increase reliability and density of memory systems. Unlike the parallel bus architecture of traditional DRAMs, an FB-DIMM has a serial interface between the memory controller and the advanced memory buffer (AMB). Conventionally, data lines from the memory controller have to be connected to data lines in every DRAM module, i.e. via multidrop buses. As the memory width increases together with the access speed, the signal degrades at the interface between the bus and the device. This limits the speed and memory density, so FB-DIMMs take a different approach to solve the problem. 240-pin DDR2 FB-DIMMs are neither mechanically nor electrically compatible with conventional 240-pin DDR2 DIMMs. As a result, those two DIMM types are notched differently to prevent using the wrong one. As with nearly all RAM specifications, the FB-DIMM specification was published by JEDEC. Technology Fully buffered DIMM architecture introduces an advanced memory buffer (AMB) between the memory controller and the memory module. Unlike the parallel bus architecture of traditional DRAMs, an FB-DIMM has a serial interface between the memory controller and the AMB. This enables an increase to the width of the memory without increasing the pin count of the memory controller beyond a feasible level. With this architecture, the memory controller does not write to the memory module directly; rather it is done via the AMB. AMB can thus compensate for signal deterioration by buffering and resending the signal. The AMB can also offer error correction, without imposing any additional overhead on the processor or the system's memory controller. It can also use the Bit Lane Failover Correction feature to identify bad data paths and remove them from operation, which dramatically reduces command/address errors. Also, since reads and writes are buffered, they can be done in parallel by the memory controller. This allows simpler i
https://en.wikipedia.org/wiki/Schofield%20equation
The Schofield Equation is a method of estimating the basal metabolic rate (BMR) of adult men and women published in 1985. This is the equation used by the WHO in their technical report series. The equation that is recommended to estimate BMR by the US Academy of Nutrition and Dietetics is the Mifflin-St. Jeor equation. The equations for estimating BMR in kJ/day (kilojoules per day) from body mass (kg) are: Men: Women: The equations for estimating BMR in kcal/day (kilocalories per day) from body mass (kg) are: Men: Women: Key: W = Body weight in kilograms SEE = Standard error of estimation The raw figure obtained by the equation should be adjusted up or downwards, within the confidence limit suggested by the quoted estimation errors, and according to the following principles: Subjects leaner and more muscular than usual require more energy than the average. Obese subjects require less. Patients at the young end of the age range for a given equation require more energy. Patients at the high end of the age range for a given equation require less energy. Effects of age and body mass may cancel out: an obese 30-year-old or an athletic 60-year-old may need no adjustment from the raw figure. To find the actual energy needed per day (Estimated Energy Requirement), the base metabolism must then be multiplied by an activity factor, as in the sketch below. These are as follows: Sedentary people of both genders should multiply by 1.3. Sedentary means very physically inactive, inactive in both work and leisure. Lightly active men should multiply by 1.6 and women by 1.5. Lightly active means the daily routine includes some walking, or intense exercise once or twice per week. Most students are in this category. Moderately active men should multiply by 1.7 and women by 1.6. Moderately active means intense exercise lasting 20–45 minutes at least three times per week, or a job with a lot of walking, or a moderate intensity job. Very active men should multiply by 2.1 and women by 1.9. Very active mea
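The final step, multiplying BMR by the activity factor, can be sketched as follows. The BMR input is a hypothetical value (the age- and sex-banded Schofield coefficients are not reproduced here); the multipliers are those quoted above.

```python
# Estimated Energy Requirement = BMR x activity factor, using the
# multipliers listed in the text. The BMR value below is hypothetical.
ACTIVITY_FACTORS = {
    ("sedentary", "male"): 1.3, ("sedentary", "female"): 1.3,
    ("light", "male"): 1.6,     ("light", "female"): 1.5,
    ("moderate", "male"): 1.7,  ("moderate", "female"): 1.6,
    ("very", "male"): 2.1,      ("very", "female"): 1.9,
}

def estimated_energy_requirement(bmr_kcal_per_day, activity, sex):
    return bmr_kcal_per_day * ACTIVITY_FACTORS[(activity, sex)]

# e.g. a hypothetical BMR of 1,600 kcal/day for a lightly active woman:
print(estimated_energy_requirement(1600, "light", "female"))  # 2400.0
```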
https://en.wikipedia.org/wiki/Health%20ecology
Health ecology (also known as eco-health) is an emerging field that studies the impact of ecosystems on human health. It examines alterations in the biological, physical, social, and economic environments to understand how these changes affect mental and physical human health. Health ecology focuses on a transdisciplinary approach to understanding all the factors which influence an individual's physiological, social, and emotional well-being. Eco-health studies often involve environmental pollution. Some examples include an increase in asthma rates due to air pollution, or PCB contamination of game fish in the Great Lakes of the United States. However, health ecology is not necessarily tied to environmental pollution. For example, research has shown that habitat fragmentation is the main factor that contributes to increased rates of Lyme disease in human populations. History Ecosystem approaches to public health emerged as a defined field of inquiry and application in the 1990s, primarily through global research supported by the International Development Research Centre (IDRC) in Ottawa, Canada (Lebel, 2003). However, this was a resurrection of an approach to health and ecology that can be traced back to Hippocrates in Western societies, and to earlier eras in Eastern societies. The approach was also popular among scientists in earlier centuries. However, it fell out of common practice in the 20th century, when technical professionalism and expertise were assumed sufficient to manage health and disease. In this relatively brief era, evaluating the adverse impacts of environmental change (both the natural and artificial environment) on human health was assigned to medicine and environmental health. Integrated approaches to health and ecology re-emerged in the 20th century. These revolutionary movements were built on a foundation laid by earlier scholars, including Hippocrates, Rudolf Virchow, and Louis Pasteur. In the 20th century, Calvin Schwabe coi
https://en.wikipedia.org/wiki/Polytropic%20process
A polytropic process is a thermodynamic process that obeys the relation $pV^n = C$, where p is the pressure, V is volume, n is the polytropic index, and C is a constant. The polytropic process equation describes expansion and compression processes which include heat transfer. Particular cases Some specific values of n correspond to particular cases: $n = 0$ for an isobaric process, $n = +\infty$ for an isochoric process. In addition, when the ideal gas law applies: $n = 1$ for an isothermal process, $n = \gamma$ for an isentropic process, where $\gamma$ is the ratio of the heat capacity at constant pressure ($c_p$) to heat capacity at constant volume ($c_v$). Equivalence between the polytropic coefficient and the ratio of energy transfers For an ideal gas in a closed system undergoing a slow process with negligible changes in kinetic and potential energy the process is polytropic, such that $pV^n = C$ where C is a constant, $K = \delta q / \delta w$ is the ratio of heat transferred to work performed, $\gamma = c_p / c_v$, and the polytropic coefficient is $n = \gamma - K(\gamma - 1)$. Relationship to ideal processes For certain values of the polytropic index, the process will be synonymous with other common processes. Some examples of the effects of varying index values are given in the following table. When the index n is between any two of the former values (0, 1, γ, or ∞), it means that the polytropic curve will cut through (be bounded by) the curves of the two bounding indices. For an ideal gas, 1 < γ < 5/3, since by Mayer's relation $c_p = c_v + R$, so that $\gamma = c_p/c_v = 1 + R/c_v$. Other A solution to the Lane–Emden equation using a polytropic fluid is known as a polytrope. See also Adiabatic process Compressor Internal combustion engine Isentropic process Isobaric process Isochoric process Isothermal process Polytrope Quasistatic equilibrium Thermodynamics Vapor-compression refrigeration
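A quick numerical sketch of the particular cases above, using arbitrary units and C = 1: n = 0 leaves p constant (isobaric), and n = 1 with the ideal gas law leaves pV, and hence T, constant (isothermal).

```python
# Evaluate p = C / V**n along a polytropic curve for two special indices.
def pressure(V, n, C=1.0):
    return C / V**n

volumes = (1.0, 2.0, 4.0)
for n, label in [(0.0, "isobaric"), (1.0, "isothermal, ideal gas")]:
    ps = [pressure(V, n) for V in volumes]
    pV = [p * V for p, V in zip(ps, volumes)]
    print(f"n={n} ({label}): p={ps}, pV={pV}")
# n=0: p stays at 1.0 for every V; n=1: pV stays at 1.0, i.e. constant T
```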
https://en.wikipedia.org/wiki/Production%20and%20Decay%20of%20Strange%20Particles
"Production and Decay of Strange Particles" is an episode of the original The Outer Limits television show. It first aired on 20 April 1964, during the first season. In a nuclear research plant, although the workers wear radiation suits, they are taken over by some odd glowing substance. It fills their suits and causes them to act like puppets. The episode mentions many modern physics concepts such as neutrinos, antimatter, quasi-stellar objects (at that time just discovered and perhaps mentioned here in TV fiction for the first time) and subatomic particles with the property of "strangeness" (a quantum property of matter which had been named only a few years before by physicists, despite objection at the time that it was no more "strange" or odd than any other property of subatomic particles). The episode name echoes a Physical Review paper of 1956, titled "Cloud-Chamber Study of the Production and Decay of Strange Particles." Opening narration In recent years, nuclear physicists have discovered a strange world of subatomic particles, fragments of atoms smaller than the imagination can picture, fragments of materials which do not obey the laws of gravity. Antimatter composed of inside-out material; shadow-matter that can penetrate ten miles of lead shielding. Hidden deep in the heart of strange new elements are secrets beyond human understanding – new powers, new dimensions, worlds within worlds, unknown. Plot A well-crafted riff on Frankenstein. While experimenting on subatomic particles, physics researchers start a chain reaction that seemingly controls the researchers themselves. Scientist after scientist is consumed, turned into nuclear 'zombies' by what seems to be a form of sentient particle from another dimension. The reaction grows towards a terrible climax. The survivors fear they may be powerless to stop it. Just as the ever-expanding particles are about to engulf the lab and explode into an atomic cataclysm that could destroy the world, the head
https://en.wikipedia.org/wiki/Elastic%20scoring
Elastic scoring is a style of orchestration or music arrangement that was first used by the Australian composer Percy Grainger. Purpose This technique of orchestration is used to provide composers with the option of allowing a diverse group of voices or instrumentalists the ability to perform their music. An example of this is when a composer or arranger provides extra sheet music parts so a flute quartet (four flutes) can play the same piece as a group comprising two flutes, alto flute and bass flute, resulting in a choir-like sound. In other words, a subtle re-engineering of the original work. This technique involves making extra and/or interchangeable musical parts which provide substitutions for more or fewer musicians depending on what is required for an individual performance. This also allows a musical work to be played in smaller communities where the required instruments may not always be available. One of the main tenets of elastic scoring is that the new arrangement preserve as much as possible the original interval relationship (to the closest octave) between notes while not being overly concerned with timbre (tone colour) or number of instruments. Timbre is the aspect of music varied most through changing instrument or number of instruments. Techniques Besides providing alternative instrumentation in the form of sheet music parts, the elastic scoring concept allows three subsets of scoring music. Lateral scoring Lateral scoring can be said to have occurred when a piece of music is re-set for the same number of instruments, with one or more instruments differing from the original instrumentation. An example of this is if a piece of music set for flute and piano is re-scored for clarinet and piano. In this instance, the intervallic relationships remain the same, but the tone colour has changed. Expansion scoring Expansion scoring is a style of arranging or orchestration that lets composers and arrangers enlarge the original work from a smaller score to a larger one. An example of this is when a
https://en.wikipedia.org/wiki/OpenAP
OpenAP was the first open source Linux distribution released to replace the factory firmware on a number of commercially available IEEE 802.11b wireless access points, all based on the Eumitcom WL11000SA-N board. The idea of releasing third party and open source firmware for commercially available wireless access points has been followed by a number of more recent projects, such as OpenWrt and HyperWRT. OpenAP was released in early 2002 by Instant802 Networks, now known as Devicescape Software, complete with instructions for reprogramming the flash on any of the supported devices, full source code under the GNU General Public License, and a mailing list for discussions. External links http://savannah.nongnu.org/projects/openap/ Wi-Fi Free routing software Custom firmware
https://en.wikipedia.org/wiki/GeForce%208%20series
The GeForce 8 series is the eighth generation of Nvidia's GeForce line of graphics processing units. The third major GPU architecture developed by Nvidia, Tesla represents the company's first unified shader architecture. Overview All GeForce 8 Series products are based on Tesla. As with many GPUs, the larger numbers these cards carry do not guarantee superior performance over previous generation cards with a lower number. For example, the GeForce 8300 and 8400 entry-level cards cannot be compared to the previous GeForce 7200 and 7300 cards due to their inferior performance. The same goes for the high-end GeForce 8800 GTX card, which cannot be compared to the previous GeForce 7800 GTX card due to differences in performance. Max resolution Dual Dual-link DVI Support: Able to drive two flat-panel displays up to 2560×1600 resolution. Available on select GeForce 8800 and 8600 GPUs. One Dual-link DVI Support: Able to drive one flat-panel display up to 2560×1600 resolution. Available on select GeForce 8500 GPUs and GeForce 8400 GS cards based on the G98. One Single-link DVI Support: Able to drive one flat-panel display up to 1920×1200 resolution. Available on select GeForce 8400 GPUs. GeForce 8400 GS cards based on the G86 only support single-link DVI. Display capabilities The GeForce 8 series supports 10-bit per channel display output, up from 8-bit on previous Nvidia cards. This potentially allows higher fidelity color representation and separation on capable displays. The GeForce 8 series, like its recent predecessors, also supports Scalable Link Interface (SLI) for multiple installed cards to act as one via an SLI Bridge, so long as they are of similar architecture. NVIDIA's PureVideo HD video rendering technology is an improved version of the original PureVideo introduced with GeForce 6. It now includes GPU-based hardware acceleration for decoding HD movie formats, post-processing of HD video for enhanced images, and optional High-bandwidth Digital Content Pro
https://en.wikipedia.org/wiki/Haplogroup%20JT
Haplogroup JT is a human mitochondrial DNA (mtDNA) haplogroup. Origin Haplogroup JT is descended from the macro-haplogroup R. It is the ancestral clade to the mitochondrial haplogroups J and T. JT (predominantly J) was found among the ancient Etruscans. The haplogroup has also been found among Iberomaurusian specimens dating from the Epipaleolithic at the Taforalt prehistoric site. One ancient individual carried a haplotype which correlates with either the JT clade or the haplogroup H subclade H14b1 (1/9; 11%). Subclades Tree This phylogenetic tree of haplogroup JT subclades is based on the paper by Mannis van Oven and Manfred Kayser, Updated comprehensive phylogenetic tree of global human mitochondrial DNA variation, and subsequent published research. R2'JT JT J T Health Maternally inherited ancient mtDNA variants have a clear impact on the presentation of disease in a modern society. Superhaplogroup JT is an example of reduced risk of Parkinson's disease, and mitochondrial and mtDNA alterations continue to be promising disease biomarkers. See also Genealogical DNA test Genetic genealogy Human mitochondrial genetics Population genetics
https://en.wikipedia.org/wiki/Credibility%20theory
Credibility theory is a branch of actuarial mathematics concerned with determining risk premiums. To achieve this, it uses mathematical models in an effort to forecast the (expected) number of insurance claims based on past observations. Technically speaking, the problem is to find the best linear approximation to the mean of the Bayesian predictive density, which is why credibility theory has many results in common with linear filtering as well as Bayesian statistics more broadly. For example, in group health insurance an insurer is interested in calculating the risk premium, $R$ (i.e. the theoretical expected claims amount), for a particular employer in the coming year. The insurer will likely have an estimate of historical overall claims experience, $\bar{X}$, as well as a more specific estimate for the employer in question, $\bar{Y}$. Assigning a credibility factor, $Z$, to the overall claims experience (and the complement $1 - Z$ to the employer experience) allows the insurer to get a more accurate estimate of the risk premium in the following manner: $R = Z\bar{X} + (1 - Z)\bar{Y}$. The credibility factor is derived by calculating the maximum likelihood estimate which would minimise the error of estimate. Assuming the variances of $\bar{X}$ and $\bar{Y}$ are known quantities taking on the values $\sigma^2$ and $\tau^2$ respectively, it can be shown that $Z$ should be equal to $Z = \tau^2 / (\sigma^2 + \tau^2)$. Therefore, the more uncertainty an estimate has, the lower is its credibility. Types of Credibility In Bayesian credibility, we separate each class (B) and assign them a probability (Probability of B). Then we find how likely our experience (A) is within each class (Probability of A given B). Next, we find how likely our experience was over all classes (Probability of A). Finally, we can find the probability of our class given our experience. So going back to each class, we weight each statistic with the probability of the particular class given the experience. Bühlmann credibility works by looking at the variance across the population. More specifically, it looks to see how much of t
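A minimal sketch of the credibility-weighted premium, using the reconstruction above (Z weights the overall experience, and a larger variance for an estimate lowers the weight it receives). All input values are illustrative.

```python
# Sketch of R = Z * X_bar + (1 - Z) * Y_bar with Z = tau2 / (sigma2 + tau2).
def credibility_premium(overall_mean, employer_mean, sigma2, tau2):
    """sigma2 = variance of the overall estimate; tau2 = of the employer's."""
    z = tau2 / (sigma2 + tau2)   # credibility given to the overall experience
    return z * overall_mean + (1 - z) * employer_mean, z

premium, z = credibility_premium(
    overall_mean=100_000,   # historical overall claims experience
    employer_mean=120_000,  # employer-specific claims experience
    sigma2=100.0,           # overall estimate is relatively certain
    tau2=400.0,             # employer estimate is noisier
)
print(z, premium)  # 0.8 104000.0: the noisy employer data gets little weight
```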
https://en.wikipedia.org/wiki/Domain%20engineering
Domain engineering is the entire process of reusing domain knowledge in the production of new software systems. It is a key concept in systematic software reuse and product line engineering. A key idea in systematic software reuse is the domain. Most organizations work in only a few domains. They repeatedly build similar systems within a given domain with variations to meet different customer needs. Rather than building each new system variant from scratch, significant savings may be achieved by reusing portions of previous systems in the domain to build new ones. The process of identifying domains, bounding them, and discovering commonalities and variabilities among the systems in the domain is called domain analysis. This information is captured in models that are used in the domain implementation phase to create artifacts such as reusable components, a domain-specific language, or application generators that can be used to build new systems in the domain. In product line engineering as defined by ISO 26550:2015, domain engineering is complemented by application engineering, which takes care of the life cycle of the individual products derived from the product line. Purpose Domain engineering is designed to improve the quality of developed software products through reuse of software artifacts. Domain engineering shows that most developed software systems are not new systems but rather variants of other systems within the same field. As a result, through the use of domain engineering, businesses can maximize profits and reduce time-to-market by using the concepts and implementations from prior software systems and applying them to the target system. The reduction in cost is evident even during the implementation phase. One study showed that the use of domain-specific languages allowed code size, in both number of methods and number of symbols, to be reduced by over 50%, and the total number of lines of code to be reduced by nearly 75%. Domain engineering foc
https://en.wikipedia.org/wiki/Carotenoid%20oxygenase
Carotenoid oxygenases are a family of enzymes involved in the cleavage of carotenoids to produce, for example, retinol, commonly known as vitamin A. This family includes an enzyme known as RPE65, which is abundantly expressed in the retinal pigment epithelium, where it catalyzes the formation of 11-cis-retinol from all-trans-retinyl esters. Carotenoids such as beta-carotene, lycopene, lutein and beta-cryptoxanthin are produced in plants and certain bacteria, algae and fungi, where they function as accessory photosynthetic pigments and as scavengers of oxygen radicals for photoprotection. They are also essential dietary nutrients in animals. Carotenoid oxygenases cleave a variety of carotenoids into a range of biologically important products, including apocarotenoids in plants that function as hormones, pigments, flavours, floral scents and defence compounds, and retinoids in animals that function as vitamins, chromophores for opsins and signalling molecules. Examples of carotenoid oxygenases include: Beta-carotene 15,15'-monooxygenase (BCO1) from animals, which cleaves beta-carotene symmetrically at the central double bond to yield two molecules of retinal. Beta-carotene-9',10'-dioxygenase (BCO2) from animals, which cleaves beta-carotene asymmetrically to apo-10'-beta-carotenal and beta-ionone, the latter being converted to retinoic acid. Lycopene is also oxidatively cleaved. 9-cis-epoxycarotenoid dioxygenase from plants, which cleaves 9-cis xanthophylls to xanthoxin, a precursor of the hormone abscisic acid. Yellow skin, a common phenotype in domestic chickens, is influenced by the accumulation of carotenoids in skin due to the absence of the beta-carotene dioxygenase 2 (BCDO2) enzyme; inhibition of expression of the BCO2 gene is caused by a regulatory mutation. Apocarotenoid-15,15'-oxygenase from bacteria and cyanobacteria, which converts beta-apocarotenals rather than beta-carotene into retinal. This protein has a seven-bladed beta-propeller structure. Ret
https://en.wikipedia.org/wiki/NlaIII
NlaIII is a type II restriction enzyme isolated from Neisseria lactamica. As part of the restriction modification system, NlaIII is able to prevent foreign DNA from integrating into the host genome by cutting double-stranded DNA into fragments at specific sequences. This results in further degradation of the fragmented foreign DNA and prevents it from infecting the host genome. NlaIII recognizes the palindromic and complementary DNA sequence of CATG/GTAC and cuts outside of the G-C base pairs. This cutting pattern results in sticky ends with GTAC overhangs at the 3' end. Characteristics NlaIII from N. lactamica contains two key components: a methylase and an endonuclease. The methylase is critical to recognition, while the endonuclease is used for cutting. The gene (NlaIIIR) is 693 bp long and creates the specific 5'-CATG-3' endonuclease. A homolog of NlaIIIR is iceA1 from Helicobacter pylori. In H. pylori, there exists a similar methylase gene called hpyIM which is downstream of iceA1. ICEA1 is an endonuclease that also recognizes the 5'-CATG-3' sequence. IceA1 in H. pylori is similar to that of NlaIII in N. lactamica. NlaIII contains an ICEA protein that encompasses the 4 to 225 amino acid region. H. pylori also contains the same protein. H. pylori infection often leads to gastrointestinal issues such as peptic ulcers, gastric adenocarcinoma and lymphoma. Researchers speculate that ICEA proteins serve as potential markers for gastric cancer. Isoschizomers NlaIII isoschizomers recognize and cut the same recognition sequence 5'-CATG-3'. Endonucleases that cut at this sequence include: FaeI, FatI, Hin1II, Hsp92II, CviAII and IceA1. Applications NlaIII can be used in many different experimental procedures such as serial analysis of gene expression, molecular cloning, restriction site mapping, genotyping, Southern blotting, and restriction fragment length polymorphism (RFLP) analysis.
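An in-silico digest makes the cutting pattern concrete. The sketch below scans the top strand for 5'-CATG-3' and cuts immediately after it, which is what produces the four-base 3' overhangs described above; the input sequence is made up for the example.

```python
# Sketch of an NlaIII digest on the top strand: cut just after each CATG.
def nlaiii_digest(seq):
    """Return top-strand fragments, cutting on the 3' side of each CATG."""
    fragments, start, i = [], 0, 0
    while True:
        i = seq.find("CATG", i)
        if i == -1:
            break
        cut = i + 4                    # cut site: immediately after ...CATG
        fragments.append(seq[start:cut])
        start, i = cut, cut
    fragments.append(seq[start:])      # trailing fragment, if any
    return [f for f in fragments if f]

print(nlaiii_digest("AACATGGTTCCATGAA"))  # ['AACATG', 'GTTCCATG', 'AA']
```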
https://en.wikipedia.org/wiki/Micromagnetics
Micromagnetics is a field of physics dealing with the prediction of magnetic behaviors at sub-micrometer length scales. The length scales considered are large enough for the atomic structure of the material to be ignored (the continuum approximation), yet small enough to resolve magnetic structures such as domain walls or vortices. Micromagnetics can deal with static equilibria, by minimizing the magnetic energy, and with dynamic behavior, by solving the time-dependent dynamical equation. History Micromagnetics as a field (i.e., that deals specifically with the behaviour of ferromagnetic materials at sub-micrometer length scales) was introduced in 1963 when William Fuller Brown Jr. published a paper on antiparallel domain wall structures. Until comparatively recently computational micromagnetics has been prohibitively expensive in terms of computational power, but smaller problems are now solvable on a modern desktop PC. Static micromagnetics The purpose of static micromagnetics is to solve for the spatial distribution of the magnetization M at equilibrium. In most cases, as the temperature is much lower than the Curie temperature of the material considered, the modulus |M| of the magnetization is assumed to be everywhere equal to the saturation magnetization Ms. The problem then consists in finding the spatial orientation of the magnetization, which is given by the magnetization direction vector m = M/Ms, also called reduced magnetization. The static equilibria are found by minimizing the magnetic energy, $E$, subject to the constraint |M|=Ms or |m|=1. The contributions to this energy are the following: Exchange energy The exchange energy is a phenomenological continuum description of the quantum-mechanical exchange interaction. It is written as $E_\text{exch} = A \int_V \left[ (\nabla m_x)^2 + (\nabla m_y)^2 + (\nabla m_z)^2 \right] \mathrm{d}V$, where A is the exchange constant; $m_x$, $m_y$ and $m_z$ are the components of m; and the integral is performed over the volume of the sample. The exchange energy tends to favor configurations where the magnetization varies
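The exchange term can be evaluated numerically on a discretized magnetization field. The sketch below uses a 1D chain of cells with illustrative, not material-specific, values of A and the cell size; a uniform m would give zero exchange energy, while a 180-degree twist costs energy.

```python
import numpy as np

# Finite-difference estimate of the exchange energy on a 1D chain of cells:
# E ~ A * sum over cells of |grad m|^2 * dV, with |m| = 1 in every cell.
A = 1.0e-11     # exchange constant, J/m (order of magnitude only)
dx = 1.0e-9     # cell size, m (illustrative)

theta = np.linspace(0.0, np.pi, 50)     # magnetization rotating by 180 degrees
m = np.zeros((50, 3))
m[:, 0], m[:, 2] = np.sin(theta), np.cos(theta)   # unit vector everywhere

grad = np.gradient(m, dx, axis=0)       # d(mx, my, mz)/dx in each cell
energy_density = A * np.sum(grad**2, axis=1)      # A * |grad m|^2
E = np.sum(energy_density) * dx**3      # integrate over the cell volumes
print(f"{E:.3e} J")  # nonzero: twisting the magnetization costs exchange energy
```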
https://en.wikipedia.org/wiki/Cryptoregiochemistry
Cryptoregiochemistry refers to the site of initial oxidative attack in double bond formation by enzymes such as fatty acid desaturases. This is a mechanistic parameter that is usually determined through the use of kinetic isotope effect experiments, based on the premise that the initial C-H bond cleavage step should be energetically more difficult and therefore more sensitive to isotopic substitution than the second C-H bond breaking step.
https://en.wikipedia.org/wiki/Interleukin%207
Interleukin 7 (IL-7) is a protein that in humans is encoded by the IL7 gene. IL-7 is a hematopoietic growth factor secreted by stromal cells in the bone marrow and thymus. It is also produced by keratinocytes, dendritic cells, hepatocytes, neurons, and epithelial cells, but is not produced by normal lymphocytes. A study also demonstrated how the autocrine production of the IL-7 cytokine mediated by T-cell acute lymphoblastic leukemia (T-ALL) can be involved in the oncogenic development of T-ALL and offer novel insights into T-ALL spreading. Structure The three-dimensional structure of IL-7 in complex with the ectodomain of IL-7 receptor has been determined using X-ray diffraction. Function Lymphocyte maturation IL-7 stimulates the differentiation of multipotent (pluripotent) hematopoietic stem cells into lymphoid progenitor cells (as opposed to myeloid progenitor cells where differentiation is stimulated by IL-3). It also stimulates proliferation of all cells in the lymphoid lineage (B cells, T cells and NK cells). It is important for proliferation during certain stages of B-cell maturation, T and NK cell survival, development and homeostasis. IL-7 is a cytokine important for B and T cell development. This cytokine and the hepatocyte growth factor (HGF) form a heterodimer that functions as a pre-pro-B cell growth-stimulating factor. This cytokine is found to be a cofactor for V(D)J rearrangement of the T cell receptor beta (TCRß) during early T cell development. This cytokine can be produced locally by intestinal epithelial and epithelial goblet cells, and may serve as a regulatory factor for intestinal mucosal lymphocytes. Knockout studies in mice suggested that this cytokine plays an essential role in lymphoid cell survival. IL-7 signaling IL-7 binds to the IL-7 receptor, a heterodimer consisting of Interleukin-7 receptor alpha and common gamma chain receptor. Binding results in a cascade of signals important for T-cell development within the thymus an
https://en.wikipedia.org/wiki/Homeoviscous%20adaptation
Homeoviscous adaptation is the adjustment of the cell membrane lipid composition to maintain adequate membrane fluidity. The maintenance of proper cell membrane fluidity is of critical importance for the function and integrity of the cell, essential for the mobility and function of embedded proteins and lipids, diffusion of proteins and other molecules laterally across the membrane for signaling reactions, and proper separation of membranes during cell division. A fundamental biophysical determinant of membrane fluidity is the balance between saturated and unsaturated fatty acids. Regulating membrane fluidity is especially important in poikilothermic organisms such as bacteria, fungi, protists, plants, fish and other ectothermic animals. The general trend is an increase in unsaturated fatty acids at lower growth temperatures and an increase in saturated fatty acids at higher temperatures. Recent work has explored the importance of the homeoviscous adaptation of the cell membrane for psychrotolerant bacteria living in the cold biosphere of Earth.
https://en.wikipedia.org/wiki/D%20arm
The D arm is a feature in the tertiary structure of transfer RNA (tRNA). It is composed of the two D stems and the D loop. The D loop contains the base dihydrouridine, for which the arm is named. The D loop's main function is that of recognition. It is widely believed that it acts as a recognition site for aminoacyl-tRNA synthetase, an enzyme involved in the aminoacylation of the tRNA molecule. The D stem is also believed to have a recognition role although this has yet to be verified. It is a highly variable region and is notable for its unusual conformation due to the over-crowding on one of the guanosine residues. It appears to play a large role in the stabilization of the tRNA's tertiary structure.
https://en.wikipedia.org/wiki/Inositol%20pentakisphosphate
Inositol pentakisphosphate (abbreviated IP5) is a molecule derived from inositol tetrakisphosphate by the addition of a phosphate group by inositol-polyphosphate multikinase (IPMK). It is believed to be one of the many second messengers in the inositol phosphate family. It "is implicated in a wide array of biological and pathophysiological responses, including tumorigenesis, invasion and metastasis, therefore specific inhibitors of the kinase may prove useful in cancer therapy." IP5 also plays a role in defense signaling in plants. It potentiates the interaction of the plant hormone JA-Ile with its receptor.
https://en.wikipedia.org/wiki/Cosmogenic%20nuclide
Cosmogenic nuclides (or cosmogenic isotopes) are rare nuclides (isotopes) created when a high-energy cosmic ray interacts with the nucleus of an in situ Solar System atom, causing nucleons (protons and neutrons) to be expelled from the atom (see cosmic ray spallation). These nuclides are produced within Earth materials such as rocks or soil, in Earth's atmosphere, and in extraterrestrial items such as meteoroids. By measuring cosmogenic nuclides, scientists are able to gain insight into a range of geological and astronomical processes. There are both radioactive and stable cosmogenic nuclides. Some of these radionuclides are tritium, carbon-14 and phosphorus-32. Certain light (low atomic number) primordial nuclides (isotopes of lithium, beryllium and boron) are thought to have been created not only during the Big Bang, but also (and perhaps primarily) to have been made after the Big Bang, but before the condensation of the Solar System, by the process of cosmic ray spallation on interstellar gas and dust. This explains their higher abundance in cosmic rays as compared with their abundances on Earth. This also explains the overabundance of the early transition metals just before iron in the periodic table – the cosmic-ray spallation of iron produces scandium through chromium on the one hand and helium through boron on the other. However, the arbitrary defining qualification for cosmogenic nuclides of being formed "in situ in the Solar System" (meaning inside an already-aggregated piece of the Solar System) prevents primordial nuclides formed by cosmic ray spallation before the formation of the Solar System from being termed "cosmogenic nuclides"—even though the mechanism for their formation is exactly the same. These same nuclides still arrive on Earth in small amounts in cosmic rays, and are formed in meteoroids, in the atmosphere, on Earth, "cosmogenically". However, beryllium (all of it stable beryllium-9) is present primordially in the Solar System in much la
https://en.wikipedia.org/wiki/John%27s%20equation
John's equation is an ultrahyperbolic partial differential equation satisfied by the X-ray transform of a function. It is named after Fritz John. Given a function $f\colon \mathbb{R}^n \to \mathbb{R}$ with compact support, the X-ray transform is the integral over all lines in $\mathbb{R}^n$. We parameterise the lines by pairs of points $x, y$ ($x \neq y$) on each line and define the ray transform
$$u(x, y) = \int_{-\infty}^{\infty} f\bigl(x + t(y - x)\bigr)\, dt.$$
Such functions $u$ are characterized by John's equations
$$\frac{\partial^2 u}{\partial x_i \, \partial y_j} - \frac{\partial^2 u}{\partial y_i \, \partial x_j} = 0,$$
which were proved by Fritz John for dimension three and by Kurusa for higher dimensions. In three-dimensional x-ray computerized tomography John's equation can be solved to fill in missing data, for example where the data is obtained from a point source traversing a curve, typically a helix. More generally an ultrahyperbolic partial differential equation (a term coined by Richard Courant) is a second order partial differential equation of the form
$$\sum_{i=1}^{2n} \sum_{j=1}^{2n} a_{ij} \, \frac{\partial^2 u}{\partial x_i \, \partial x_j} + \text{(lower order terms)} = 0,$$
where $n \geq 2$, such that the quadratic form $\sum a_{ij} \xi_i \xi_j$ can be reduced by a linear change of variables to the form
$$\xi_1^2 + \cdots + \xi_n^2 - \xi_{n+1}^2 - \cdots - \xi_{2n}^2.$$
It is not possible to arbitrarily specify the value of the solution on a non-characteristic hypersurface. John's paper however does give examples of manifolds on which an arbitrary specification of $u$ can be extended to a solution.
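The characterization can be checked numerically. The sketch below is illustrative only and is not from the source article: for the Gaussian $f(p) = e^{-|p|^2}$ in three dimensions, the line integral through $x$ and $y$ has the closed form $u(x,y) = \frac{\sqrt{\pi}}{|d|} \exp\!\bigl(-(|x|^2 - (x \cdot d)^2/|d|^2)\bigr)$ with $d = y - x$, and John's equations can be verified with central finite differences (step size and tolerance are arbitrary choices).

```python
# Illustrative check of John's equations (not from the source article).
# Uses the closed-form X-ray transform of a 3D Gaussian.
import numpy as np

def u(x, y):
    d = y - x
    dd = d @ d
    return np.sqrt(np.pi / dd) * np.exp(-(x @ x - (x @ d) ** 2 / dd))

def mixed_partial(x, y, i, j, h=1e-4):
    # central-difference approximation of d^2 u / (dx_i dy_j)
    ei, ej = np.eye(3)[i] * h, np.eye(3)[j] * h
    return (u(x + ei, y + ej) - u(x + ei, y - ej)
            - u(x - ei, y + ej) + u(x - ei, y - ej)) / (4 * h * h)

rng = np.random.default_rng(0)
x, y = rng.normal(size=3), rng.normal(size=3)
for i in range(3):
    for j in range(3):
        residual = mixed_partial(x, y, i, j) - mixed_partial(x, y, j, i)
        assert abs(residual) < 1e-5, (i, j, residual)
print("John's equations hold to finite-difference accuracy")
```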
https://en.wikipedia.org/wiki/Polymer%20brush
A polymer brush is the name given to a surface coating consisting of polymers tethered to a surface. The brush may be either in a solvated state, where the tethered polymer layer consists of polymer and solvent, or in a melt state, where the tethered chains completely fill up the space available. These polymer layers can be tethered to flat substrates such as silicon wafers, or to highly curved substrates such as nanoparticles. Polymers can also be tethered in high density to another single polymer chain, although this arrangement is normally called a bottle brush. Additionally, there is a separate class of polyelectrolyte brushes, in which the polymer chains themselves carry an electrostatic charge. Brushes are often characterized by a high density of grafted chains. The limited space then leads to a strong extension of the chains. Brushes can be used to stabilize colloids, reduce friction between surfaces, and provide lubrication in artificial joints. Polymer brushes have been modeled with molecular dynamics, Monte Carlo methods, Brownian dynamics simulations, and molecular theories. Structure Polymer molecules within a brush are stretched away from the attachment surface as a result of the fact that they repel each other (steric repulsion or osmotic pressure). More precisely, they are more elongated near the attachment point and unstretched at the free end. Within the approximation derived by Milner, Witten and Cates, the average density of all monomers in a given chain has the same functional form up to a prefactor:
$$\rho(z) \propto \frac{1}{\sqrt{z_e^2 - z^2}}, \qquad 0 \le z < z_e,$$
where $z_e$ is the altitude of the end monomer, with the profile normalized so that it integrates to $N$, the number of monomers per chain. The averaged density profile of the end monomers of all attached chains, convoluted with the above density profile for one chain, determines the density profile of the brush as a whole. A dry brush has a uniform monomer density up to some altitude $h$. One can show that the corresponding end monomer density profile is given by:
$$\rho_e(z_e) \propto \frac{z_e}{\sqrt{h^2 - z_e^2}}, \qquad 0 \le z_e < h,$$
where $h$ is the height of the brush.
https://en.wikipedia.org/wiki/Mental%20rotation
Mental rotation is the ability to rotate mental representations of two-dimensional and three-dimensional objects; it is related to the visual representation of such rotation within the human mind. There is a relationship between areas of the brain associated with perception and mental rotation. There could also be a relationship between the cognitive rate of spatial processing, general intelligence and mental rotation. Mental rotation can be described as the brain moving objects in order to help understand what they are and where they belong. Mental rotation has been studied to try to figure out how the mind recognizes objects in its environment. Researchers generally call such objects stimuli. Mental rotation is one cognitive function the person uses to figure out what the altered object is. Mental rotation can be separated into the following cognitive stages: Create a mental image of an object from all directions (imagining where it continues straight vs. turns). Rotate the object mentally until a comparison can be made (orienting the stimulus to the other figure). Make the comparison. Decide if the objects are the same or not. Report the decision (reaction time is recorded when a lever is pulled or a button is pressed). Assessment Originally developed in 1978 by Vandenberg and Kuse based on the research by Shepard and Metzler (1971), a Mental Rotation Test (MRT) consists of a participant comparing two 3D objects (or letters), often rotated in some axis, and stating whether they are the same image or mirror images (enantiomorphs). Commonly, the test will have pairs of images each rotated a specific number of degrees (e.g. 0°, 60°, 120° or 180°). A set number of pairs will be split between being the same image rotated, while others are mirrored. The researcher judges the participant on how accurately and rapidly they can distinguish between the mirrored and non-mirrored pairs. Notable research Shepard and Metzler (1971) Roger Shepard and Jacqueline Metzler
https://en.wikipedia.org/wiki/List%20of%20defunct%20network%20processor%20companies
During the dot-com/internet bubble of the late 1990s and early 2000s, the proliferation of many dot-com start-up companies created a secondary bubble in the telecommunications/computer networking infrastructure and telecommunications service provider markets. Venture capital and high tech companies rushed to build next generation infrastructure equipment for the expected explosion of internet traffic. As part of that investment fever, network processors were seen as a method of dealing with the desire for more network services and the ever-increasing data-rates of communication networks. It has been estimated that dozens of start-up companies were created in the race to build the processors that would be a component of the next generation telecommunications equipment. Once the internet investment bubble burst, the telecom network upgrade cycle was deferred for years (perhaps for a decade). As a result, the majority of these new companies went bankrupt. As of 2007, the only companies that are shipping network processors in sizeable volumes are Cisco Systems, Marvell, Freescale, Cavium Networks and AMCC. OC-768/40Gb routing ClearSpeed left network processor market, reverted to supercomputing applications Propulsion Networks defunct BOPS left network processor market, reverted to DSP applications OC-192/10Gb routing Terago defunct Clearwater Networks originally named Xstream Logic, defunct Silicon Access defunct Solidum Systems acquired by Integrated Device Technology Lexra defunct Fast-Chip defunct Cognigine Corp. defunct Internet Machines morphed into IMC Semiconductors, a PCI-Express chip vendor Acorn Networks defunct XaQti acquired by Vitesse Semiconductor, product line discontinued OC-48/2.5Gb routing IP Semiconductors defunct Entridia defunct Stargate Solutions defunct Gigabit Ethernet routing SiByte acquired by Broadcom, product line discontinued PMC-Sierra product line discontinued OC-12 routing C-Port acquired by Motorola
https://en.wikipedia.org/wiki/Smoothing
In statistics and image processing, to smooth a data set is to create an approximating function that attempts to capture important patterns in the data, while leaving out noise or other fine-scale structures/rapid phenomena. In smoothing, the data points of a signal are modified so that individual points higher than the adjacent points (presumably because of noise) are reduced, and points that are lower than the adjacent points are increased, leading to a smoother signal. Smoothing may be used in two important ways that can aid in data analysis: (1) by being able to extract more information from the data, as long as the assumption of smoothing is reasonable, and (2) by being able to provide analyses that are both flexible and robust. Many different algorithms are used in smoothing. Smoothing may be distinguished from the related and partially overlapping concept of curve fitting in the following ways: curve fitting often involves the use of an explicit function form for the result, whereas the immediate results from smoothing are the "smoothed" values with no later use made of a functional form if there is one; the aim of smoothing is to give a general idea of relatively slow changes of value with little attention paid to the close matching of data values, while curve fitting concentrates on achieving as close a match as possible; and smoothing methods often have an associated tuning parameter which is used to control the extent of smoothing, while curve fitting will adjust any number of parameters of the function to obtain the 'best' fit. Linear smoothers In the case that the smoothed values can be written as a linear transformation of the observed values, the smoothing operation is known as a linear smoother; the matrix representing the transformation is known as a smoother matrix or hat matrix. The operation of applying such a matrix transformation is called convolution. Thus the matrix is also called convolution matrix or a convolution kernel. In the case of simple series of
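As an illustration (mine, not from the source article), a centered moving average is a linear smoother: the smoothed series is the product of a banded smoother matrix with the observations, and the window width plays the role of the tuning parameter mentioned above.

```python
# Illustrative linear smoother: a 5-point moving average expressed as a
# smoother (hat) matrix S, so that smoothed = S @ observed.
import numpy as np

n, k = 50, 2                       # series length, half-window (window = 2k+1)
S = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - k), min(n, i + k + 1)
    S[i, lo:hi] = 1.0 / (hi - lo)  # each row sums to 1: a local average

t = np.linspace(0, 2 * np.pi, n)
rng = np.random.default_rng(1)
y = np.sin(t) + rng.normal(scale=0.3, size=n)   # signal + noise

smoothed = S @ y                   # the linear smoothing operation
print("residual std before:", round(np.std(y - np.sin(t)), 3),
      "after:", round(np.std(smoothed - np.sin(t)), 3))
```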
https://en.wikipedia.org/wiki/TalkOrigins%20Archive
The TalkOrigins Archive is a website that presents mainstream science perspectives on the antievolution claims of young-earth, old-earth, and "intelligent design" creationists. With sections on evolution, creationism, geology, astronomy and hominid evolution, the web site provides broad coverage of evolutionary biology and the socio-political antievolution movement. Origins and history The TalkOrigins Archive began in 1994 when Brett J. Vickers collected several separately posted FAQs from the talk.origins newsgroup and made them available from a single anonymous FTP site. In 1995, Vickers, then a computer science graduate student at the University of California at Irvine, created the TalkOrigins Archive web site. In 2001, Vickers transferred the TalkOrigins Archive to Wesley R. Elsberry, who organized a group of volunteers to handle the maintenance of the Archive. In 2004, Kenneth Fair incorporated the TalkOrigins Foundation as a Texas 501(c)(3) non-profit organization. The Foundation's purposes include funding and maintaining the TalkOrigins Archive and holding copyrights to Archive articles, thereby simplifying the process of reprinting and updating those articles. The copyright issue has posed a particular problem since the FAQs started off as a small collection with little thought given to copyright but have since mushroomed. In 2005, the Foundation was granted tax-exempt status by the IRS. Features The FAQs and FRAs (Frequently Rebutted Assertions) on the TalkOrigins Archive cover a wide range of topics associated with evolutionary biology and creationism. These include Mark Isaak's Index to Creationist Claims, a list of creationist positions on various issues, rebuttals, and links to primary source material. The TalkDesign site fulfills a similar role with the Intelligent Design movement. Also hosted is Jim Foley's Fossil Hominids sub-site, which studies the evidence for human evolution and has an extensive list of links to websites on both evolutionary biology and creationism.
https://en.wikipedia.org/wiki/Cardiovascular%20physiology
Cardiovascular physiology is the study of the cardiovascular system, specifically addressing the physiology of the heart ("cardio") and blood vessels ("vascular"). These subjects are sometimes addressed separately, under the names cardiac physiology and circulatory physiology. Although the different aspects of cardiovascular physiology are closely interrelated, the subject is still usually divided into several subtopics. Heart Cardiac output (= heart rate * stroke volume. Can also be calculated with the Fick principle or by the palpating method.) Stroke volume (= end-diastolic volume − end-systolic volume) Ejection fraction (= stroke volume / end-diastolic volume) Cardiac output is mathematically proportional to systole Inotropic, chronotropic, and dromotropic states Cardiac input (= heart rate * suction volume. Can be calculated by inverting terms in the Fick principle) Suction volume (= end-systolic volume + end-diastolic volume) Injection fraction (= suction volume / end-systolic volume) Cardiac input is mathematically proportional to diastole Electrical conduction system of the heart Electrocardiogram Cardiac marker Cardiac action potential Frank–Starling law of the heart Wiggers diagram Pressure volume diagram Regulation of blood pressure Baroreceptor Baroreflex Renin–angiotensin system Renin Angiotensin Juxtaglomerular apparatus Aortic body and carotid body Autoregulation Cerebral Autoregulation Hemodynamics Under most circumstances, the body attempts to maintain a steady mean arterial pressure. When there is a major and immediate decrease (such as that due to hemorrhage or standing up), the body can increase the following: Heart rate Total peripheral resistance (primarily due to vasoconstriction of arteries) Inotropic state In turn, this can have a significant impact upon several other variables: Stroke volume Cardiac output Pressure Pulse pressure (systolic pressure − diastolic pressure) Mean arterial pressure (usually approximated with diastolic pressure + one-third of pulse pressure)
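The bracketed formulas can be made concrete with a small worked example (illustrative resting values of my choosing, not from the source):

```python
# Worked example of the hemodynamic formulas above (typical resting values).
heart_rate = 70                      # beats per minute
end_diastolic_volume = 120           # mL
end_systolic_volume = 50             # mL

stroke_volume = end_diastolic_volume - end_systolic_volume   # 70 mL
ejection_fraction = stroke_volume / end_diastolic_volume     # ~0.58
cardiac_output = heart_rate * stroke_volume / 1000           # L/min

systolic_bp, diastolic_bp = 120, 80  # mmHg
pulse_pressure = systolic_bp - diastolic_bp
mean_arterial_pressure = diastolic_bp + pulse_pressure / 3   # common approximation

print(f"SV={stroke_volume} mL, EF={ejection_fraction:.2f}, "
      f"CO={cardiac_output:.1f} L/min, MAP={mean_arterial_pressure:.0f} mmHg")
```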
https://en.wikipedia.org/wiki/Succinyl%20coenzyme%20A%20synthetase
Succinyl coenzyme A synthetase (SCS, also known as succinyl-CoA synthetase or succinate thiokinase or succinate-CoA ligase) is an enzyme that catalyzes the reversible reaction of succinyl-CoA to succinate. The enzyme facilitates the coupling of this reaction to the formation of a nucleoside triphosphate molecule (either GTP or ATP) from an inorganic phosphate molecule and a nucleoside diphosphate molecule (either GDP or ADP). It plays a key role as one of the catalysts involved in the citric acid cycle, a central pathway in cellular metabolism, and it is located within the mitochondrial matrix of a cell. Chemical reaction and enzyme mechanism Succinyl CoA synthetase catalyzes the following reversible reaction: Succinyl CoA + Pi + NDP ↔ Succinate + CoA + NTP where Pi denotes inorganic phosphate, NDP denotes nucleoside diphosphate (either GDP or ADP), and NTP denotes nucleoside triphosphate (either GTP or ATP). As mentioned, the enzyme facilitates coupling of the conversion of succinyl CoA to succinate with the formation of NTP from NDP and Pi. The reaction has a biochemical standard state free energy change of −3.4 kJ/mol. The reaction takes place by a three-step mechanism which is depicted in the image below. The first step involves displacement of CoA from succinyl CoA by a nucleophilic inorganic phosphate molecule to form succinyl phosphate. The enzyme then utilizes a histidine residue to remove the phosphate group from succinyl phosphate and generate succinate. Finally, the phosphorylated histidine transfers the phosphate group to a nucleoside diphosphate, which generates the high-energy carrying nucleoside triphosphate. Structure Subunits Bacterial and mammalian SCSs are made up of α and β subunits. In E. coli two αβ heterodimers link together to form an α2β2 heterotetrameric structure. However, mammalian mitochondrial SCSs are active as αβ dimers and do not form a heterotetramer. The E. coli SCS heterotetramer has been crystallized and characterized.
https://en.wikipedia.org/wiki/Devicescape
Devicescape is an American developer of client/server software services for wireless networking connectivity, analytics, and context-awareness. Founded in 2001 as Instant802 Networks, the company was renamed to Devicescape in January 2005. Devicescape is a venture backed private company. Corporate History Instant802 Networks was founded in 2001 by Eduardo de-Castro and Roy Petruschka in San Francisco, and Simon Barber joined as a third founder a few months after the company's incorporation. In 2004 the company began development of packaged software products, including security for emerging devices and complete access point packages. The software was used in devices ranging from LCD projectors, televisions and digital video recorders to PDAs and SOHO access points. The company also provided software for the Wi-Fi Alliance test bed. Dave Fraser joined as CEO in 2004, and in 2005 the company was renamed Devicescape Software. The company continued to develop additional client security products. In 2006, Devicescape exited the access point business by licensing its Wireless Infrastructure Platform technology to LVL7, which was subsequently sold to Broadcom. In 2007, Devicescape contributed a new wireless stack to the Linux kernel. In 2007, Devicescape introduced Connect, a client-server system which allowed embedded devices to automatically authenticate against a large number of public Wi-Fi networks. The company released a variety of consumer applications for PCs and smartphones under the Devicescape Easy Wi-Fi brand. In 2009, Devicescape launched the Easy WiFi Network. In 2010, Devicescape applied server-based analysis to curate Wi-Fi networks discovered by client applications, so that Wi-Fi networks could be assessed for quality, location, sharing status and other factors. The company referred to this as a "Curated Virtual Network" (CVN), which became a mechanism for offloading traffic from cellular networks. Late in 2010, MetroPCS (now T-Mobile) became the first carrier to adopt the service.
https://en.wikipedia.org/wiki/Spacewarp%20%28toy%29
Spacewarp is a line of build-it-yourself, marble-run toy "roller coasters" first made in the 1980s by Bandai. Users cut lengths of track to the correct size from a single roll of thick plastic tubing, forming curves and loops held in place by plastic track rail holders which attach to metal rods held vertical in a black plastic base. Steel balls roll around the track and on to a battery-powered screw conveyor that takes them to the top to start all over again. Production of Spacewarp toys ended around 1988. Replacement parts were sold until 1995. A redesigned Spacewarp toy was re-introduced to the Japanese market in 2005 by Tanomi. Improvements included redesigned parts which were less prone to breakage. History The first Spacewarp sets became available in Japan in 1983 and were sold by Bandai. That year, filmmakers were working on a movie called The Family Game, which features a plot line about a boy who is fascinated with roller coasters. The filmmakers noticed the Spacewarp toy and decided to incorporate it into a few scenes. The Family Game was released in Japan in November 1983 to favorable reviews. The popularity of the movie boosted Spacewarp sales so much that approximately one million sets were sold between 1983 and 1984. On the strength of this popularity, Bandai applied for and received the Spacewarp trademark in the United States of America in 1986, and started selling a subset of its marble roller coasters there. Sets 1986-1990s North American Imports: 1983-1995 Japanese Market: 2004–2008 Japanese Market: Accessories Additional accessories include lighting kits, a staircase, bell ringer, escalator and more. Knock-Offs As with many popular toys, nearly identical counterfeit editions have emerged under the Chinese "Spacerail" brand. However, Spacerail acquired the Spacewarp trademark and is continuing the tradition of running marble roller coasters. Spacerail sets: See also Rolling Ball Sculpture
https://en.wikipedia.org/wiki/Hostapd
hostapd (host access point daemon) is a user space daemon software enabling a network interface card to act as an access point and authentication server. There are three implementations: Jouni Malinen's hostapd, OpenBSD's hostapd and Devicescape's hostapd. Jouni Malinen's hostapd Jouni Malinen's hostapd is a user space daemon for access point and authentication servers. It can be used to create a wireless hotspot using a Linux computer. It implements IEEE 802.11 access point management, IEEE 802.1X/WPA/WPA2/EAP Authenticators, RADIUS client, EAP server, and RADIUS authentication server. The current version supports Linux (Host AP, MadWifi, Prism54 and some of the drivers which use the kernel's mac80211 subsystem), QNX, FreeBSD (net80211), and DragonFlyBSD. OpenBSD's hostapd OpenBSD's hostapd is a user space daemon that helps to improve roaming and monitoring of OpenBSD-based wireless networks. It implements the Inter Access Point Protocol (IAPP) for exchanging station association information between access points. It can trigger a set of actions, such as frame injection or logging, when receiving specified IEEE 802.11 frames. Devicescape's hostapd The Open Wireless Linux version of hostapd. It is kept as close as possible to the original open source release, but with OWL-specific packaging and defaults. The website appears to be dead (as of April 2013), probably along with the project itself. See also HostAP
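As a concrete illustration (a minimal sketch, not taken verbatim from the project's documentation), a WPA2-PSK access point configuration for Jouni Malinen's hostapd might look like the following; the interface name, SSID and passphrase are placeholder assumptions, and the nl80211 driver is assumed for a modern Linux mac80211 device.

```
# /etc/hostapd/hostapd.conf -- minimal WPA2-PSK access point (sketch)
interface=wlan0                # assumed wireless interface name
driver=nl80211                 # driver for the kernel's mac80211 subsystem
ssid=ExampleNet                # placeholder network name
hw_mode=g                      # 2.4 GHz band
channel=6
auth_algs=1                    # open system authentication (required for WPA)
wpa=2                          # WPA2 only
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=ChangeMe_secret # placeholder passphrase
```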
https://en.wikipedia.org/wiki/Data%20access%20layer
A data access layer (DAL) in computer software is a layer of a computer program which provides simplified access to data stored in persistent storage of some kind, such as an entity-relational database. This acronym is prevalently used in Microsoft environments. For example, the DAL might return a reference to an object (in terms of object-oriented programming) complete with its attributes, instead of a row of fields from a database table. This allows the client (or user) modules to be created with a higher level of abstraction. This kind of model could be implemented by creating a class of data access methods that directly reference a corresponding set of database stored procedures. Another implementation could potentially retrieve or write records to or from a file system. The DAL hides this complexity of the underlying data store from the external world. For example, instead of using commands such as insert, delete, and update to access a specific table in a database, a class and a few stored procedures could be created in the database. The procedures would be called from a method inside the class, which would return an object containing the requested values. Or, the insert, delete and update commands could be executed within simple functions like registeruser or loginuser stored within the data access layer. Also, business logic methods from an application can be mapped to the data access layer. So, for example, instead of making a query into a database to fetch all users from several tables, the application can call a single method from a DAL which abstracts those database calls. Applications using a data access layer can be either database server dependent or independent. If the data access layer supports multiple database types, the application becomes able to use whatever databases the DAL can talk to. In either circumstance, having a data access layer provides a centralized location for all calls into the database, and thus makes it easier to port the application to a different database system.
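A minimal sketch of the idea in Python (illustrative only; it uses the standard library's sqlite3 rather than stored procedures, and the table and method names are invented for the example):

```python
# Sketch of a data access layer: callers work with User objects and named
# operations such as register_user, never with SQL or table layouts directly.
import sqlite3
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    user_id: int
    name: str

class UserDAL:
    def __init__(self, path: str = ":memory:"):
        self._db = sqlite3.connect(path)
        self._db.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")

    def register_user(self, name: str) -> User:
        cur = self._db.execute("INSERT INTO users (name) VALUES (?)", (name,))
        self._db.commit()
        return User(cur.lastrowid, name)

    def get_user(self, user_id: int) -> Optional[User]:
        row = self._db.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)).fetchone()
        return User(*row) if row else None

dal = UserDAL()                       # client code sees objects, not SQL
alice = dal.register_user("Alice")
print(dal.get_user(alice.user_id))    # User(user_id=1, name='Alice')
```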
https://en.wikipedia.org/wiki/Fiddle%20yard
A fiddle yard or staging yard is a collection of model railway tracks that are hidden from view and allow trains to be stored and manipulated by the operators. These tracks are used to allow most model railways to be operated in a realistic manner. Whilst it is possible to have a realistic shunting yard in view, its operation is generally unreliable with models. Trains can be rearranged by lifting them off the track and replacing them. Development Fiddle yards were first built by British modellers so that they could build small layouts and operate them in a realistic manner. The first well-known model railway to use them was 'Maybank', which was exhibited at the 1939 Model Railway Club exhibition in London. This was an urban passenger terminus that led directly into a fiddle yard, hidden beneath a locomotive depot above it. It had an influence on C. J. Freezer, who as editor of Railway Modeller would later go on to popularise them. In the 1950s he described the "Fiddle Yard to Terminus" layout, and used it for his influential 'Minories' design. The 'Peter Denny' design of fiddle yard, using a removable 'cassette' of tracks, was developed by the Reverend Peter Denny for his Buckingham Great Central layout around 1952. This used a number of parallel tracks and could also be used for rolling stock storage or transport, off the layout. Some of these cassettes use conventional pointwork, others slide sideways as a traverser; Denny's original rotated around a central pivot. Denny also used it to rotate by half a turn and to reverse the trains wholesale, without needing to uncouple and move locomotives from one end to the other. Denny was noted for his use of non-railway mechanisms and the original was cranked around by a Meccano geared drive, with remote switching and monitoring by a row of sprung metal contacts. Designs The fiddle yard is part of a layout, and as such varies with the type of layout design, particularly whether it is of the "end-to-end" or "continuous run" variety.
https://en.wikipedia.org/wiki/Budu%20%28sauce%29
Budu (Jawi: بودو) is an anchovy sauce and one of the best known fermented seafood products in Kelantan and Terengganu in Malaysia; in the Natuna Islands, South Sumatra, Bangka Island and Western Kalimantan in Indonesia (where it is called rusip); and in Southern Thailand. It is mentioned in A Grammar and Dictionary of the Malay Language, With a Preliminary Dissertation, Volume 2, by John Crawfurd, published in 1852. History It is traditionally made by mixing anchovies and salt in a ratio ranging from 2:1 to 6:1 and allowing the mix to ferment for 140 to 200 days. It is used as a flavouring and is normally eaten with fish, rice, and raw vegetables. It is similar to the fermented fish sauces of the Philippines, Indonesia, Burma, Vietnam and Japan, to the Colombo cure of the Indian subcontinent, and to those of China and Korea. The fish product is the result of hydrolysis of fish proteins by fish and microbial proteases. The flavor and aroma of budu are produced by the action of proteolytic microorganisms surviving during the fermentation process. Palm sugar and tamarind are usually added to promote a browning reaction, resulting in a dark brown hue. The ratio of fish to salt is key to the final desired product. Different concentrations of salt influence the microbial and enzymatic activity, resulting in different flavours. The microorganisms found during budu production are generally classified as halophilic. The microorganisms play important roles in protein degradation and flavour and aroma development. Budu is a traditional condiment among the ethnic Malays of the east coast of Peninsular Malaysia, particularly in the states of Kelantan and Terengganu. Budu has been declared a Malaysian heritage food by the Malaysian Department of National Heritage. Even ethnic Chinese in Kelantan are involved in budu production. Anchovy and its products like budu are high in protein and uric acid, and thus not recommended for people with gout. The uric acid content in anchovies, however, is lower than that of some other high-purine foods.
https://en.wikipedia.org/wiki/List%20of%20random%20number%20generators
Random number generators are important in many kinds of technical applications, including physics, engineering or mathematical computer studies (e.g., Monte Carlo simulations), cryptography and gambling (on game servers). This list includes many common types, regardless of quality or applicability to a given use case. Pseudorandom number generators (PRNGs) The following algorithms are pseudorandom number generators. Cryptographic algorithms Cipher algorithms and cryptographic hashes can be used as very high-quality pseudorandom number generators. However, generally they are considerably slower (typically by a factor of 2–10) than fast, non-cryptographic random number generators. These include: Stream ciphers. Popular choices are Salsa20 or ChaCha (often with the number of rounds reduced to 8 for speed), ISAAC, HC-128 and RC4. Block ciphers in counter mode. Common choices are AES (which is very fast on systems supporting it in hardware), Twofish, Serpent and Camellia. Cryptographic hash functions A few cryptographically secure pseudorandom number generators do not rely on cipher algorithms but try to link mathematically the difficulty of distinguishing their output from a "true" random stream to a computationally difficult problem. These approaches are theoretically important but are too slow to be practical in most applications. They include: Blum–Micali algorithm (1984) Blum Blum Shub (1986) Naor–Reingold pseudorandom function (1997) Random number generators that use external entropy These approaches combine a pseudo-random number generator (often in the form of a block or stream cipher) with an external source of randomness (e.g., mouse movements, delay between keyboard presses, etc.). /dev/random – Unix-like systems CryptGenRandom – Microsoft Windows Fortuna RDRAND instructions (called Intel Secure Key by Intel), available in Intel x86 CPUs since 2012. They use the AES generator built into the CPU, reseeding it periodically. True Random Number Generators (TRNGs)
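To make the "cryptographic hash in counter mode" idea concrete, here is an illustrative sketch of my own (not one of the listed generators): each output block is the hash of a key concatenated with an incrementing counter. A real deployment should use a vetted construction rather than this toy.

```python
# Sketch of a hash-based PRNG in counter mode: output block i is
# SHA-256(key || i). Illustrative only -- use a vetted CSPRNG in practice.
import hashlib

class HashCounterPRNG:
    def __init__(self, key: bytes):
        self._key = key
        self._counter = 0

    def random_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            block = hashlib.sha256(
                self._key + self._counter.to_bytes(8, "big")).digest()
            self._counter += 1
            out += block
        return out[:n]

prng = HashCounterPRNG(b"example seed with sufficient entropy")
print(prng.random_bytes(16).hex())
```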
https://en.wikipedia.org/wiki/Float%20switch
A float switch is a type of level sensor, a device used to detect the level of liquid within a tank. The switch may be used to control a pump, as an indicator, an alarm, or to control other devices. One type of float switch uses a mercury switch inside a hinged float. Another common type is a float that raises a rod to actuate a microswitch. One pattern uses a reed switch mounted in a tube; a float, containing a magnet, surrounds the tube and is guided by it. When the float raises the magnet to the reed switch, it closes. Several reeds can be mounted in the tube for different level indications by one assembly. A very common application is in sump pumps and condensate pumps where the switch detects the rising level of liquid in the sump or tank and energizes an electrical pump which then pumps liquid out until the level of the liquid has been substantially reduced, at which point the pump is switched off again. Float switches are often adjustable and can include substantial hysteresis. That is, the switch's "turn on" point may be much higher than the "shut off" point. This minimizes the on-off cycling of the associated pump. Some float switches contain a two-stage switch. As liquid rises to the trigger point of the first stage, the associated pump is activated. If the liquid continues to rise (perhaps because the pump has failed or its discharge is blocked), the second stage will be triggered. This stage may switch off the source of the liquid being pumped, trigger an alarm, or both. Where level must be sensed inside a pressurized vessel, often a magnet is used to couple the motion of the float to a switch located outside the pressurized volume. In some cases, a rod through a stuffing box can be used to operate a switch, but this creates high drag and has a potential for leakage. Successful float switch installations minimize the opportunity for accumulation of dirt on the float that would impede its motion. Float switch materials are selected to resist the degrading effects of the liquid being sensed.
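The hysteresis behaviour described above can be sketched in a few lines (an illustrative simulation of my own; the thresholds and flow rates are arbitrary):

```python
# Simulation of a float switch with hysteresis: the pump switches on at a
# high level and off at a distinctly lower level, minimizing on-off cycling.
class FloatSwitchPump:
    def __init__(self, on_level=80.0, off_level=20.0):
        self.on_level, self.off_level = on_level, off_level
        self.running = False

    def update(self, level: float) -> bool:
        if not self.running and level >= self.on_level:
            self.running = True       # liquid reached the "turn on" point
        elif self.running and level <= self.off_level:
            self.running = False      # pumped down to the "shut off" point
        return self.running

pump, level, toggles, prev = FloatSwitchPump(), 0.0, 0, False
for _ in range(200):
    level += 2.0                      # inflow raises the level each step
    if pump.update(level):
        level -= 5.0                  # net fall of 3.0 per step while pumping
    if pump.running != prev:
        toggles, prev = toggles + 1, pump.running
print(f"final level {level:.0f}, pump toggled {toggles} times in 200 steps")
```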
https://en.wikipedia.org/wiki/Sight%20glass
A sight glass or water gauge is a type of level sensor, a transparent tube through which the operator of a tank or boiler can observe the level of liquid contained within. Liquid in tanks Simple sight glasses may be just a plastic or glass tube connected to the bottom of the tank at one end and the top of the tank at the other. The level of liquid in the sight glass will be the same as the level of liquid in the tank. Today, however, sophisticated float switches have replaced sight glasses in many such applications. Steam boilers If the liquid is hazardous or under pressure, more sophisticated arrangements must be made. In the case of a boiler, the pressures of the water below and the steam above are equal, so any change in the water level will be seen in the gauge. The transparent tube (the "glass" itself) may be mostly enclosed within a metal or toughened glass shroud to prevent it from being damaged through scratching or impact, and offering protection to the operators in the case of breakage. This usually has a patterned backplate to make the magnifying effect of the water in the tube more obvious and so allow for easier reading. In some locomotives where the boiler is operated at very high pressures, the tube itself would be made of metal-reinforced toughened glass. It is important to keep the water at the specified level, otherwise the top of the firebox will be exposed, creating an overheat hazard and causing damage and possibly catastrophic failure. To check that the device is offering a correct reading and the connecting pipes to the boiler are not blocked by scale, the water level needs to be "bobbed" by quickly opening the taps in turn and allowing a brief spurt of water through the drain cock. The National Board of Boiler and Pressure Vessel Inspectors recommends a daily testing procedure described by the American National Standards Institute, chapter 2 part I-204.3 water level gauge. While not strictly required, this procedure is designed to allow any blockage of the gauge connections to be detected.
https://en.wikipedia.org/wiki/Transmission%20of%20plant%20viruses
Transmission of plant viruses is the movement of plant viruses between organisms. Background Viruses are known to infect both plant cells and animal cells. Since viruses are obligate intracellular parasites, they must develop direct methods of transmission between hosts in order to survive. The mobility of animals increases the mechanisms of viral transmission that have evolved, whereas plants remain immobile, and thus plant viruses must rely on environmental factors to be transmitted between hosts. Natural transmission between plant hosts The structural differences between plant and animal cells have resulted in a variety of transmission routes being exploited, enabling the virus to be passed between different host plants. The main difference, from the point of view of a virus, is the cell wall. This forms a tough barrier between the intracellular components and the extracellular environment, which has to be penetrated. These differences, combined with the fact that plants are immobile, have resulted in plant viruses relying on the wind and soil, as well as on vectors, for transmission between hosts. Vectors transmit the virus either by propagative transmission, which results in an amplification of the virus by replication within the cells of the vector, or by non-propagative transmission, which simply carries the virus between the plants without viral replication. Common vectors include bacteria, fungi, nematodes, arthropods and arachnids. Furthermore, human intervention, including grafting and experimental mechanical damage that physically breaches the cell wall, contributes to the array of transmission routes. The virus commonly uses these methods to be passed from one host to another. However, the virus is dependent upon physical damage, generated naturally by the wind and the feeding of vectors, or by human intervention. Transmission between plant cells Viral infections often develop into systemic infections as a means of transmission. The virus often infects many tissues, if not the whole
https://en.wikipedia.org/wiki/Invariant-based%20programming
Invariant-based programming is a programming methodology where specifications and invariants are written before the actual program statements. Writing down the invariants during the programming process has a number of advantages: it requires the programmer to make their intentions about the program behavior explicit before actually implementing it, and invariants can be evaluated dynamically during execution to catch common programming errors. Furthermore, if strong enough, invariants can be used to prove the correctness of the program based on the formal semantics of program statements. A combined programming and specification language, connected to a powerful formal proof system, will generally be required for full verification of non-trivial programs. In this case a high degree of automation of proofs is also possible. In most existing programming languages the main organizing structures are control flow blocks such as for loops, while loops and if statements. Such languages may not be ideal for invariants-first programming, since they force the programmer to make decisions about control flow before writing the invariants. Furthermore, most programming languages do not have good support for writing specifications and invariants, since they lack quantifier operators and one typically cannot express higher-order properties. The idea of developing the program together with its proof originated from E.W. Dijkstra. Actually writing invariants before program statements has been considered in a number of different forms by M.H. van Emden, J.C. Reynolds and R-J Back. See also Eiffel (programming language) Notes Formal methods Programming paradigms
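As a small illustration of evaluating invariants dynamically (my example, not drawn from the literature cited above): an integer square-root routine whose loop invariant is stated before the loop body and asserted on every iteration.

```python
# Invariants-first sketch: the invariant lo*lo <= n < hi*hi is written down
# before the loop body and checked at run time on every iteration.
def isqrt(n: int) -> int:
    assert n >= 0                        # precondition
    lo, hi = 0, n + 1                    # invariant: lo*lo <= n < hi*hi
    while hi - lo > 1:
        assert lo * lo <= n < hi * hi    # the loop invariant, checked dynamically
        mid = (lo + hi) // 2
        if mid * mid <= n:
            lo = mid
        else:
            hi = mid
    assert lo * lo <= n < (lo + 1) ** 2  # postcondition follows from the invariant
    return lo

print([isqrt(k) for k in (0, 1, 8, 9, 10, 99)])  # [0, 1, 2, 3, 3, 9]
```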
https://en.wikipedia.org/wiki/Electronic%20common%20technical%20document
The electronic common technical document (eCTD) is an interface and international specification for the pharmaceutical industry to agency transfer of regulatory information. The specification is based on the Common Technical Document (CTD) format and was developed by the International Council for Harmonisation (ICH) Multidisciplinary Group 2 Expert Working Group (ICH M2 EWG). History Version 2.0 of eCTD – an upgrade over the original CTD – was finalized on February 12, 2002, and version 3.0 was finalized on October 8 of the same year. The most current version is 3.2.2, released on July 16, 2008. A Draft Implementation Guide for version 4.0 of eCTD was released in August 2012. However, work stalled on the project. An additional Draft Implementation Guide was released in February 2015. Draft specifications and guides were issued in April 2016 by the ICH and the FDA, followed by a May 13 ICH "teleconference to discuss the guidance and any questions and clarifications needed." U.S. On May 5, 2015, the U.S. Food & Drug Administration published a final, binding guidance document requiring certain submissions in electronic (eCTD) format within 24 months. The projected date for mandatory electronic submissions is May 5, 2017 for New Drug Applications (NDAs), Biologic License Applications (BLAs), Abbreviated New Drug Applications (ANDAs) and Drug Master Files (DMFs). Canada Health Canada was a sponsor and an early adopter of the eCTD workflow, especially for its Health Products and Food Branch regulator, but as of April 2015 had not yet fully automated it. E.U. The E.U. and its European Medicines Agency began accepting eCTD submissions in 2003. In February 2015, the "EMA announced it would no longer accept paper application forms for products applying to the centralized procedure beginning 1 July 2015." The EMA verified on that date that it would no longer accept "human and veterinary centralised procedure applications" on paper and that all electronic application forms would have to be used thereafter.
https://en.wikipedia.org/wiki/Full%20custom
In integrated circuit design, full-custom is a design methodology in which the layout of each individual transistor on the integrated circuit (IC), and the interconnections between them, are specified. Alternatives to full-custom design include various forms of semi-custom design, such as the repetition of small transistor subcircuits; one such methodology is the use of standard cell libraries (which are themselves designed full-custom). Full-custom design potentially maximizes the performance of the chip, and minimizes its area, but is extremely labor-intensive to implement. Full-custom design is limited to ICs that are to be fabricated in extremely high volumes, notably certain microprocessors and a small number of application-specific integrated circuits (ASICs). As of 2008 the main factors affecting the design and production of ASICs were the high cost of mask sets (the number of which depends on the number of IC layers) and the requisite EDA design tools. The mask sets are required in order to transfer the ASIC designs onto the wafer. See also Electronics design flow
https://en.wikipedia.org/wiki/Hybrid%20bond%20graph
A hybrid bond graph is a graphical description of a physical dynamic system with discontinuities (i.e., a hybrid dynamical system). Similar to a regular bond graph, it is an energy-based technique. However, it allows instantaneous switching of the junction structure, which may violate the principle of continuity of power (Mosterman and Biswas, 1998).
https://en.wikipedia.org/wiki/Standard%20of%20Good%20Practice%20for%20Information%20Security
The Standard of Good Practice for Information Security (SOGP), published by the Information Security Forum (ISF), is a business-focused, practical and comprehensive guide to identifying and managing information security risks in organizations and their supply chains. The most recent edition is 2020, an update of the 2018 edition. A 2022 edition is planned. Upon release, the 2011 Standard was the most significant update of the standard for four years. It covers information security 'hot topics' such as consumer devices, critical infrastructure, cybercrime attacks, office equipment, spreadsheets and databases, and cloud computing. The 2011 Standard is aligned with the requirements for an Information Security Management System (ISMS) set out in the ISO/IEC 27000-series standards, and provides wider and deeper coverage of ISO/IEC 27002 control topics, as well as cloud computing, information leakage, consumer devices and security governance. In addition to providing a tool to enable ISO 27001 certification, the 2011 Standard provides full coverage of COBIT v4 topics, and offers substantial alignment with other relevant standards and legislation such as PCI DSS and the Sarbanes-Oxley Act, to enable compliance with these standards too. The Standard is used by Chief Information Security Officers (CISOs), information security managers, business managers, IT managers, internal and external auditors, and IT service providers in organizations of all sizes. The 2018 Standard is available free of charge to members of the ISF. Non-members are able to purchase a copy of the standard directly from the ISF. Organization The Standard has historically been organized into six categories, or aspects. Computer Installations and Networks address the underlying IT infrastructure on which Critical Business Applications run. The End-User Environment covers the arrangements associated with protecting corporate and workstation applications at the endpoint in use by individuals. Systems Development
https://en.wikipedia.org/wiki/Glossary%20of%20machine%20vision
The following are common definitions related to the machine vision field. General related fields Machine vision Computer vision Image processing Signal processing 0-9 1394. FireWire is Apple Inc.'s brand name for the IEEE 1394 interface. It is also known as i.Link (Sony's name) or IEEE 1394 (although the 1394 standard also defines a backplane interface). It is a personal computer (and digital audio/digital video) serial bus interface standard, offering high-speed communications and isochronous real-time data services. 1D. One-dimensional. 2D computer graphics. The computer-based generation of digital images—mostly from two-dimensional models (such as 2D geometric models, text, and digital images) and by techniques specific to them. 3D computer graphics. 3D computer graphics are different from 2D computer graphics in that a three-dimensional representation of geometric data is stored in the computer for the purposes of performing calculations and rendering 2D images. Such images may be for later display or for real-time viewing. Despite these differences, 3D computer graphics rely on many of the same algorithms as 2D computer vector graphics in the wire frame model and 2D computer raster graphics in the final rendered display. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and primarily 3D applications may use 2D rendering techniques. 3D scanner. This is a device that analyzes a real-world object or environment to collect data on its shape and possibly color. The collected data can then be used to construct digital, three-dimensional models useful for a wide variety of applications. A Aberration. Optically, defocus refers to a translation along the optical axis away from the plane or surface of best focus. In general, defocus reduces the sharpness and contrast of the image. What should be sharp, high-contrast edges in a scene become gradual transitions.
https://en.wikipedia.org/wiki/Global%20Buddhist%20Network
The Global Buddhist Network (GBN), previously known as the Dhammakaya Media Channel (DMC), is a Thai online television channel concerned with Buddhism. The channel's taglines were "The secrets of life revealed" and "The only one", but these were later replaced by "Channel for the path to the cessation of suffering and attainment of Dhamma". The channel features many types of programs with Buddhist content, and has programs in several languages. The channel started in 2002, as a means to reach remote provinces in Thailand. Controversially, the channel made international headlines in 2012 when it featured a teaching on the afterlife of Steve Jobs. On 26 December 2016, Thai authorities withdrew the permit for the satellite channel permanently, during the legal investigations into the temple by the Thai junta. In April 2017, it was reported, however, that the channel's programming had continued, but broadcast through the Internet only. In its online format, the channel has been renamed Global Buddhist Network. Background DMC started in 2002. The channel was owned by the Dhamma Research for Environment Foundation, part of the temple Wat Phra Dhammakaya. The channel was founded to provide an alternative to the many distractions that surround people in modern life, which lure "people into doing immoral things", as stated by Phra Somsak Piyasilo, spokesperson of the organization. The channel originated from an initiative in 2001 when people living in the far provinces of Thailand wanted to listen to the teachings of the temple. The temple therefore provided live teachings through a thousand public telephone lines, through which people could follow the activities. The telephone lines had many restrictions in use, and the temple started to broadcast through a satellite television channel instead. Later, in 2005, the temple developed an online counterpart to the channel. The channel is managed by Phra Maha Nopon Puññajayo, who supervises a team of thirty volunteers.
https://en.wikipedia.org/wiki/Oliver%20E.%20Buckley%20Prize
The Oliver E. Buckley Condensed Matter Prize is an annual award given by the American Physical Society "to recognize and encourage outstanding theoretical or experimental contributions to condensed matter physics." It was endowed by AT&T Bell Laboratories as a means of recognizing outstanding scientific work. The prize is named in honor of Oliver Ellsworth Buckley, a former president of Bell Labs. Before 1982, it was known as the Oliver E. Buckley Solid State Prize. It is one of the most prestigious awards in the field of condensed matter physics. The prize is normally awarded to one person but may be shared if multiple recipients contributed to the same accomplishments. Nominations are active for three years. The prize was endowed in 1952 and first awarded in 1953. Since 2012, the prize has been co-sponsored by HTC-VIA Group. Recipients See also List of physics awards
https://en.wikipedia.org/wiki/Brown-brown
Brown-brown is a purported mixture of cocaine or amphetamine with smokeless gunpowder, taken by insufflation. This powder often contains nitroglycerin, a drug prescribed for heart conditions, which might cause vasodilation, permitting the cocaine or amphetamine to move more freely through the body. This, in turn, is believed to allow for a more intense high. The term may also refer to heroin. Brown-brown is reportedly given to child soldiers before West African armed conflicts. One former child soldier, Michel Chikwanine, has written a graphic novel with Jessica Dee Humphreys called Child Soldier, about the experience of being captured at the age of 5 by rebel fighters in the Democratic Republic of Congo, including being given brown-brown. "The rebel soldier who had hit me used a long, jagged knife to cut my wrist and rubbed powder into the wound. They called it Brown Brown – a mixture of gunpowder and a drug called cocaine. Right away, I began to feel like my brain was trying to jump out of my head." In media and culture Films The fictional character Yuri Orlov (portrayed by Nicolas Cage) uses the drug in Liberia in the film Lord of War (2005). It is also portrayed being used by Liberian child soldiers during their preparations for a combat/assault mission in the French/Liberian film Johnny Mad Dog (2008). Several characters in the film Beasts of No Nation (2015) are seen snorting a substance, possibly cocaine, possibly heroin, that is mixed with gunpowder and burned. It is referenced in The White Chamber as a drug used to enhance war efforts. Literature In the novel Beasts of No Nation (2005) and its 2015 film adaptation, brown-brown is used by many of the child soldiers and the Commandant. Ishmael Beah describes using brown-brown, cocaine, and other drugs while he was a child soldier in Sierra Leone, in his memoir A Long Way Gone: Memoirs of a Boy Soldier (2007). In the mystery novel The Madness of Crowds (2021), 17th book of the Chief Inspector Gamache series by Louise Penny.
https://en.wikipedia.org/wiki/Power%20symbol
A power symbol is a symbol indicating that a control activates or deactivates a particular device. Such a control may be a rocker switch, a toggle switch, a push-button, a virtual switch on a display screen, or some other user interface. The internationally standardized symbols are intended to communicate their function in a language-independent manner. Description The well-known on/off power symbol was the result of a logical evolution in user interface design. Originally, most early power controls consisted of switches that were toggled between two states demarcated by the words On and Off. As technology became more ubiquitous, these English words were replaced with the universal symbols line "|" for "on" and circle "◯" for "off" (typically without serifs) to bypass language barriers. This standard is still used on toggle power switches. The symbol for the standby button was created by superimposing the symbols "|" and "◯"; it is commonly interpreted as the numerals "0" and "1" (binary code), but the International Electrotechnical Commission (IEC) regards these symbols as a graphical representation of a line and a circle. Standby symbol ambiguity Because the exact meaning of the standby symbol on a given device may be unclear until the control is tried, it has been proposed that a separate sleep symbol, a crescent moon, instead be used to indicate a low power state. Proponents include the California Energy Commission and the Institute of Electrical and Electronics Engineers. Under this proposal, the older standby symbol would be redefined as a generic "power" indication, in cases where the difference between it and the other power symbols would not present a safety concern. This alternative symbolism was published as IEEE standard 1621 on December 8, 2004. Standards Universal power symbols are described in the International Electrotechnical Commission (IEC) 60417 standard, Graphical symbols for use on equipment, appearing in the 1973 edition of the document.
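For reference, these symbols were later given Unicode code points (added in Unicode 9.0, 2016); the short sketch below prints them with their official character names, assuming a Python build with a sufficiently recent Unicode database and a font that includes the glyphs.

```python
# The power symbols as Unicode characters (added in Unicode 9.0, 2016).
import unicodedata

for cp in (0x23FB, 0x23FC, 0x23FD, 0x23FE, 0x2B58):
    print(f"U+{cp:04X}", chr(cp), unicodedata.name(chr(cp)))
# Expected names: POWER SYMBOL, POWER ON-OFF SYMBOL, POWER ON SYMBOL,
# POWER SLEEP SYMBOL, and HEAVY CIRCLE (used for "power off").
```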
https://en.wikipedia.org/wiki/Isoamyl%20acetate
Isoamyl acetate, also known as isopentyl acetate, is an organic compound that is the ester formed from isoamyl alcohol and acetic acid, with the molecular formula C7H14O2. It is a colorless liquid that is only slightly soluble in water, but very soluble in most organic solvents. Isoamyl acetate has a strong odor which is described as similar to both banana and pear. Pure isoamyl acetate, or mixtures of isoamyl acetate, amyl acetate, and other flavors in ethanol may be referred to as banana oil or pear oil. Production Isoamyl acetate is prepared by the acid-catalyzed reaction (Fischer esterification) between isoamyl alcohol and glacial acetic acid: CH3COOH + (CH3)2CHCH2CH2OH ⇌ CH3COOCH2CH2CH(CH3)2 + H2O. Typically, sulfuric acid is used as the catalyst. Alternatively, p-toluenesulfonic acid or an acidic ion exchange resin can be used as the catalyst. It is also produced synthetically by the rectification of amyl acetate. Applications Isoamyl acetate is used to confer banana or pear flavor in foods such as circus peanuts, Juicy Fruit and pear drops. Banana oil and pear oil commonly refer to a solution of isoamyl acetate in ethanol that is used as an artificial flavor. It is also used as a solvent for some varnishes, oil paints, and nitrocellulose lacquers. As a solvent and carrier for materials such as nitrocellulose, it was extensively used in the aircraft industry for stiffening and wind-proofing fabric flying surfaces, where it and its derivatives were generally known as 'aircraft dope'. Now that most aircraft wings are made of metal, such use is mostly limited to historically accurate reproductions and scale models. Because of its intense, pleasant odor and its low toxicity, isoamyl acetate is used to test the effectiveness of respirators or gas masks. Occurrence in nature Isoamyl acetate occurs naturally in many plants, including apple, banana, coffee, grape, guava, lychee, papaya, peach, pomegranate, and tomato. It is also released by fermentation processes, including those of brewing yeasts.
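As a worked arithmetic example around the esterification (illustrative only: 100% conversion is assumed, which Fischer esterification does not achieve in practice, and the input mass is arbitrary):

```python
# Theoretical yield of isoamyl acetate (C7H14O2) from isoamyl alcohol
# (C5H12O), assuming acetic acid in excess and complete conversion.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}  # g/mol

def molar_mass(formula: dict) -> float:
    return sum(ATOMIC_MASS[el] * n for el, n in formula.items())

alcohol = molar_mass({"C": 5, "H": 12, "O": 1})   # ~88.15 g/mol
ester   = molar_mass({"C": 7, "H": 14, "O": 2})   # ~130.19 g/mol

grams_alcohol = 10.0
moles = grams_alcohol / alcohol                   # 1:1 stoichiometry
print(f"{grams_alcohol} g alcohol -> at most {moles * ester:.1f} g ester")
```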
https://en.wikipedia.org/wiki/OutSystems
OutSystems is a low-code development platform which provides tools for companies to develop, deploy and manage omnichannel enterprise applications. OutSystems was founded in 2001 in Lisbon, Portugal. In June 2018 OutSystems secured a $360M round of funding from KKR and Goldman Sachs and reached unicorn status. In February 2021 OutSystems raised another $150M investment in a round co-led by Abdiel Capital and Tiger Global Management, for a total valuation of $9.5 billion. OutSystems is a member of the Consortium of IT Software Quality (CISQ). Products OutSystems is a low-code development platform for the development of mobile and web enterprise applications, which run in the cloud, on-premises or in hybrid environments. In 2014 OutSystems launched a free version of the platform that provides developers with personal cloud environments to create and deploy web and mobile applications without charge. The current version is 11.53, for both the paid and unpaid versions.
https://en.wikipedia.org/wiki/Hydnum%20repandum
Hydnum repandum, commonly known as the sweet tooth, wood hedgehog or hedgehog mushroom, is a basidiomycete fungus of the family Hydnaceae. First described by Carl Linnaeus in 1753, it is the type species of the genus Hydnum. The fungus produces fruit bodies (mushrooms) that are characterized by their spore-bearing structures—in the form of spines rather than gills—which hang down from the underside of the cap. The cap is dry, colored yellow to light orange to brown, and often develops an irregular shape, especially when it has grown closely crowded with adjacent fruit bodies. The mushroom tissue is white with a pleasant odor and a spicy or bitter taste. All parts of the mushroom stain orange with age or when bruised. A mycorrhizal fungus, Hydnum repandum is broadly distributed in Europe, where it fruits singly or in close groups in coniferous or deciduous woodland. This is a choice edible species, although mature specimens can develop a bitter taste. It has no poisonous lookalikes. Mushrooms are collected and sold in local markets of Europe and Canada. Taxonomy First officially described by Carl Linnaeus in his 1753 Species Plantarum, Hydnum repandum was sanctioned by Swedish mycologist Elias Fries in 1821. The species has been shuffled among several genera: Hypothele by French naturalist Jean-Jacques Paulet in 1812; Dentinum by British botanist Samuel Frederick Gray in 1821; Tyrodon by Finnish mycologist Petter Karsten in 1881; and Sarcodon by French naturalist Lucien Quélet in 1886. After a 1977 nomenclatural proposal by American mycologist Ronald H. Petersen was accepted, Hydnum repandum became the official type species of the genus Hydnum. Previously, supporting arguments for making H. repandum the type were made by Dutch taxonomist Marinus Anton Donk (1958) and Petersen (1973), while Czech mycologist Zdeněk Pouzar (1958) and Canadian mycologist Kenneth Harrison (1971) thought that H. imbricatum should be the type. Several forms and varieties of H. repandum have been described.
https://en.wikipedia.org/wiki/Supporting%20hyperplane
In geometry, a supporting hyperplane of a set $S$ in Euclidean space $\mathbb{R}^n$ is a hyperplane that has both of the following two properties: $S$ is entirely contained in one of the two closed half-spaces bounded by the hyperplane, and $S$ has at least one boundary point on the hyperplane. Here, a closed half-space is the half-space that includes the points within the hyperplane. Supporting hyperplane theorem This theorem states that if $S$ is a convex set in the topological vector space $X$ and $x_0$ is a point on the boundary of $S$, then there exists a supporting hyperplane containing $x_0$. If $x^* \in X^* \setminus \{0\}$ ($X^*$ is the dual space of $X$, $x^*$ is a nonzero linear functional) is such that $x^*(x) \leq x^*(x_0)$ for all $x \in S$, then
$$H = \{\, x \in X : x^*(x) = x^*(x_0) \,\}$$
defines a supporting hyperplane. Conversely, if $S$ is a closed set with nonempty interior such that every point on its boundary has a supporting hyperplane, then $S$ is a convex set, and is the intersection of all its supporting closed half-spaces. The hyperplane in the theorem may not be unique, as noticed in the second picture on the right. If the closed set $S$ is not convex, the statement of the theorem is not true at all points on the boundary of $S$, as illustrated in the third picture on the right. The supporting hyperplanes of convex sets are also called tac-planes or tac-hyperplanes. The forward direction can be proved as a special case of the separating hyperplane theorem (see that page for the proof). For the converse direction, one shows that $S$ equals the intersection of the closed half-spaces determined by its supporting hyperplanes; an intersection of half-spaces is convex. See also Support function Supporting line (supporting hyperplanes in $\mathbb{R}^2$) Notes
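A worked example may help (mine, not from the source): the closed unit disk in the plane with the boundary point $x_0 = (1, 0)$.

```latex
% Supporting hyperplane of the unit disk at x_0 = (1, 0).
% S = { x in R^2 : x_1^2 + x_2^2 <= 1 }.
\[
  f(x) = x_1 \quad\text{satisfies}\quad f(x) \le 1 = f(x_0)
  \quad\text{for all } x \in S,
\]
\[
  H = \{\, x \in \mathbb{R}^2 : x_1 = 1 \,\}.
\]
% H contains x_0 and S lies in the closed half-space x_1 <= 1,
% so H is a supporting hyperplane (here, a supporting line).
% By contrast, at a corner of a square infinitely many supporting
% lines exist, illustrating the non-uniqueness noted above.
```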