Columns: source — string (31 to 203 characters); text — string (28 to 2k characters)
https://en.wikipedia.org/wiki/Abstract%20state%20machine
In computer science, an abstract state machine (ASM) is a state machine operating on states that are arbitrary data structures (structure in the sense of mathematical logic, that is a nonempty set together with a number of functions (operations) and relations over the set). Overview The ASM Method is a practical and scientifically well-founded systems engineering method that bridges the gap between the two ends of system development: the human understanding and formulation of real-world problems (requirements capture by accurate high-level modeling at the level of abstraction determined by the given application domain) the deployment of their algorithmic solutions by code-executing machines on changing platforms (definition of design decisions, system and implementation details). The method builds upon three basic concepts: ASM: a precise form of pseudo-code, generalizing Finite State Machines to operate over arbitrary data structures ground model: a rigorous form of blueprints, serving as an authoritative reference model for the design refinement: a most general scheme for stepwise instantiations of model abstractions to concrete system elements, providing controllable links between the more and more detailed descriptions at the successive stages of system development. In the original conception of ASMs, a single agent executes a program in a sequence of steps, possibly interacting with its environment. This notion was extended to capture distributed computations, in which multiple agents execute their programs concurrently. Since ASMs model algorithms at arbitrary levels of abstraction, they can provide high-level, low-level and mid-level views of a hardware or software design. ASM specifications often consist of a series of ASM models, starting with an abstract ground model and proceeding to greater levels of detail in successive refinements or coarsenings. Due to the algorithmic and mathematical nature of these three concepts, ASM models and their pr
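The core ASM idea of a step — evaluate all guarded update rules against the current state, then apply the collected updates simultaneously — can be sketched in a few lines of Python. This is a minimal illustrative sketch, not code from the ASM literature; the state locations, rules, and values are invented for the example.

```python
# Minimal sketch (not from the article) of an ASM step: guarded update rules are
# evaluated against the current state, their updates are collected, and the whole
# update set is applied atomically to produce the next state.

def asm_step(state, rules):
    """Run one ASM step: fire every enabled rule, apply all updates at once."""
    updates = {}
    for guard, effect in rules:
        if guard(state):                   # rule is enabled in the current state
            updates.update(effect(state))  # collect (location -> value) updates
    next_state = dict(state)
    next_state.update(updates)             # simultaneous, atomic application
    return next_state

# Toy machine: a counter that stops at a limit.
rules = [
    (lambda s: s["mode"] == "run" and s["ctr"] < s["limit"],
     lambda s: {"ctr": s["ctr"] + 1}),
    (lambda s: s["mode"] == "run" and s["ctr"] >= s["limit"],
     lambda s: {"mode": "halt"}),
]

state = {"mode": "run", "ctr": 0, "limit": 3}
while state["mode"] != "halt":
    state = asm_step(state, rules)
print(state)  # {'mode': 'halt', 'ctr': 3, 'limit': 3}
```

A full ASM interpreter would also reject inconsistent update sets (two rules writing different values to the same location); the sketch simply illustrates the evaluate-then-apply-atomically step.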
https://en.wikipedia.org/wiki/Duration%20calculus
Duration calculus (DC) is an interval logic for real-time systems. It was originally developed by Zhou Chaochen with the help of Anders P. Ravn and C. A. R. Hoare on the European ESPRIT Basic Research Action (BRA) ProCoS project on Provably Correct Systems. Duration calculus is mainly useful at the requirements level of the software development process for real-time systems. Some tools are available (e.g., DCVALID, IDLVALID, etc.). Subsets of duration calculus have been studied (e.g., using discrete time rather than continuous time). Duration calculus is especially espoused by UNU-IIST in Macau and the Tata Institute of Fundamental Research in Mumbai, which are major centres of excellence for the approach. See also Interval temporal logic (ITL) Temporal logic Temporal logic of actions (TLA) Modal logic References External links Duration Calculus — Virtual Library entry 1991 introductions Formal specification languages Temporal logic
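The flavour of duration calculus requirements can be seen in the classic gas-burner example from the ProCoS literature (reproduced here from memory of the standard textbook example, not from this article): over any observation interval of at least 60 time units, gas may leak for at most one twentieth of the interval.

```latex
% Classic gas-burner requirement in duration calculus (standard literature example).
% \ell is the length of the observation interval; \int \mathrm{Leak} is the accumulated
% time within that interval during which the boolean state Leak holds.
\mathit{Req} \;\triangleq\; \Box \left( \ell \geq 60 \;\Rightarrow\; 20 \cdot \int \mathrm{Leak} \;\leq\; \ell \right)
```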
https://en.wikipedia.org/wiki/Blood%20shift
Blood shift has at least two separate meanings: In medicine, it is synonymous with left shift In diving physiology, it is part of the diving reflex
https://en.wikipedia.org/wiki/Happy%20ending%20problem
In mathematics, the "happy ending problem" (so named by Paul Erdős because it led to the marriage of George Szekeres and Esther Klein) is the following statement: This was one of the original results that led to the development of Ramsey theory. The happy ending theorem can be proven by a simple case analysis: if four or more points are vertices of the convex hull, any four such points can be chosen. If on the other hand, the convex hull has the form of a triangle with two points inside it, the two inner points and one of the triangle sides can be chosen. See for an illustrated explanation of this proof, and for a more detailed survey of the problem. The Erdős–Szekeres conjecture states precisely a more general relationship between the number of points in a general-position point set and its largest subset forming a convex polygon, namely that the smallest number of points for which any general position arrangement contains a convex subset of points is . It remains unproven, but less precise bounds are known. Larger polygons proved the following generalisation: The proof appeared in the same paper that proves the Erdős–Szekeres theorem on monotonic subsequences in sequences of numbers. Let denote the minimum for which any set of points in general position must contain a convex N-gon. It is known that , trivially. . . A set of eight points with no convex pentagon is shown in the illustration, demonstrating that ; the more difficult part of the proof is to show that every set of nine points in general position contains the vertices of a convex pentagon. . The value of is unknown for all . By the result of , is known to be finite for all finite . On the basis of the known values of for N = 3, 4 and 5, Erdős and Szekeres conjectured in their original paper that They proved later, by constructing explicit examples, that . In 2016 Andrew Suk showed that for Suk actually proves, for N sufficiently large, A 2020 preprint by Andreas F. Holmsen,
https://en.wikipedia.org/wiki/K-202
K-202 was a 16-bit minicomputer, created by a team led by Polish scientist Jacek Karpiński between 1970–1973 in cooperation with British companies Data-Loop and M.B. Metals. The machine could perform about 1 million instructions per second, making it highly competitive with the US Data General SuperNOVA and UK CTL Modular One. Most other minicomputers of the era were significantly slower. Approximately 30 units were claimed to be produced. All units shipped to M.B. Metals were returned for service. Due to friction resulting from competition with Elwro, a government-backed competitor, the production of K-202 was blocked and Karpiński thrown out of his company under the allegations of sabotage and embezzlement. Sometime later the K-202 had a successor, , hundreds of which were built. Description The K-202 was packaged in a metal box similar to other minicomputers in overall size, and capable of being fit into a 19-inch rack, which was common for other systems. Like most computers of the era, the front panel included a number of switches and lamps that could be used to directly set or read the values stored in main memory. A unique feature was a large dial on the right that selected what to display or set, allowing rapid access to the processor registers simply by rotating the dial. A key that turned on the power and unlocked the case was positioned on the right side of the case. The system was designed to be highly expandable. A minimal setup consisted of the central processing unit (CPU), a minimum of 4 k 16-bit words of core memory (4 kW, or 8 kB), and a single input/output channel for use with a computer terminal. The basic system supported vectored interrupts for 32 input/output devices. At the other end of the scale, a maximally expanded system could include up to 4 MW of memory, a floating point unit (FPU), multiple multi-line programmable input/output systems, and even more than one CPU. At the maximum, it could support 272 I/O interrupt levels. The expans
https://en.wikipedia.org/wiki/United%20Nations%20University%20Institute%20in%20Macau
The United Nations University Institute in Macau, formerly the United Nations University International Institute for Software Technology (UNU-IIST; ; Portuguese: Instituto Internacional para Tecnologia de Programação da Universidade das Nações Unidas) and then United Nations University Institute on Computing and Society (UNU-CS; ), is a United Nations University global think tank conducting research and training on digital technologies for sustainable development, encouraging data-driven and evidence-based actions and policies to achieve the Sustainable Development Goals. UNU-Macau is located in Macau, China. History During 1987–1989, United Nations University conducted two studies regarding the need for a research and training centre for computing in the developing nations. The studies led to the decision of the Council of the UNU to establish in Macau the United Nations University International Institute for Software Technology (UNU-IIST), which was founded on 12 March 1991 and opened its door in July 1992. UNU-IIST was created following the agreement between the UN, the governments of Macau, Portugal, and China. The Macao authorities also supplied the institute with its office premises, located in a heritage building Casa Silva Mendes, and subsidize fellow accommodation. As part of the United Nations, the institute was to address the pressing global problems of human survival, development and welfare by international co-operation, research and advanced training in software technology. After the Transfer of sovereignty over Macau, UNU-IIST lost some level of visibility. United Nations University Institute on Computing and Society Eventually, in 2015, UNU decided to evolve the former IIST into a new Institute on Computing and Society (UNU-CS). UNU-CS intended to assist the United Nations to achieve its agenda by working as a bridge between the UN and the international academic and public policy communities, training the next generation of interdisciplinary sc
https://en.wikipedia.org/wiki/Rigorous%20Approach%20to%20Industrial%20Software%20Engineering
Rigorous Approach to Industrial Software Engineering (RAISE) was developed as part of the European ESPRIT II LaCoS project in the 1990s, led by Dines Bjørner. It consists of a set of tools designed for a specification language (RSL) for software development. It is especially espoused by UNU-IIST in Macau, who run training courses on site and around the world, especially in developing countries. See also Formal methods Formal specification External links RAISE Virtual Library entry RAISE – Rigorous Approach to Industrial Software Engineering RAISE information from Dines Bjørner Formal specification languages Formal methods tools Software testing tools
https://en.wikipedia.org/wiki/Field-replaceable%20unit
A field-replaceable unit (FRU) is a printed circuit board, part, or assembly that can be quickly and easily removed from a computer or other piece of electronic equipment, and replaced by the user or a technician without having to send the entire product or system to a repair facility. FRUs allow a technician lacking in-depth product knowledge to isolate faults and replace faulty components. The granularity of FRUs in a system impacts total cost of ownership and support, including the costs of stocking spare parts, where spares are deployed to meet repair time goals, how diagnostic tools are designed and implemented, levels of training for field personnel, whether end-users can do their own FRU replacement, etc. Other equipment FRUs are not strictly confined to computers but are also part of many high-end, lower-volume consumer and commercial products. For example, in military aviation, electronic components of line-replaceable units, typically known as shop-replaceable units (SRUs), are repaired at field-service backshops, usually by a "remove and replace" repair procedure, with specialized repair performed at centralized depot or by the OEM. History Many vacuum tube computers had FRUs: Pluggable units containing one or more vacuum tubes and various passive components Most transistorized and integrated circuit-based computers had FRUs: Computer modules, circuit boards containing discrete transistors and various passive components. Examples: IBM SMS cards DEC System Building Blocks cards DEC Flip-Chip cards Circuit boards containing monolithic ICs and/or hybrid ICs, such as IBM SLT cards. Vacuum tubes themselves are usually FRUs. For a short period starting in the late 1960s, some television set manufacturers made solid-state televisions with FRUs instead of a single board attached to the chassis. However modern televisions put all the electronics on one large board to reduce manufacturing costs. Trends As the sophistication and complexity of multi-replaceable
https://en.wikipedia.org/wiki/Modular%20design
Modular design, or modularity in design, is a design principle that subdivides a system into smaller parts called modules (such as modular process skids), which can be independently created, modified, replaced, or exchanged with other modules or between different systems. Overview A modular design can be characterized by functional partitioning into discrete scalable and reusable modules, rigorous use of well-defined modular interfaces, and making use of industry standards for interfaces. In this context modularity is at the component level, and has a single dimension, component slottability. A modular system with this limited modularity is generally known as a platform system that uses modular components. Examples are car platforms or the USB port in computer engineering platforms. In design theory this is distinct from a modular system which has higher dimensional modularity and degrees of freedom. A modular system design has no distinct lifetime and exhibits flexibility in at least three dimensions. In this respect modular systems are very rare in markets. Mero architectural systems are the closest example to a modular system in terms of hard products in markets. Weapons platforms, especially in aerospace, tend to be modular systems, wherein the airframe is designed to be upgraded multiple times during its lifetime, without the purchase of a completely new system. Modularity is best defined by the dimensions effected or the degrees of freedom in form, cost, or operation. Modularity offers benefits such as reduction in cost (customization can be limited to a portion of the system, rather than needing an overhaul of the entire system), interoperability, shorter learning time, flexibility in design, non-generationally constrained augmentation or updating (adding new solution by merely plugging in a new module), and exclusion. Modularity in platform systems, offer benefits in returning margins to scale, reduced product development cost, reduced O&M costs, and t
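Component-level modularity with "slottable" parts behind a well-defined interface can be shown with a short sketch (illustrative only; the interface and modules below are invented, not taken from the article):

```python
# Illustrative sketch of component slottability: modules that implement the same
# well-defined interface can be created, replaced, or exchanged independently.
from typing import Protocol


class PowerModule(Protocol):
    """The agreed interface ("slot") every power module must satisfy."""
    def watts(self) -> int: ...


class BatteryPack:
    def watts(self) -> int:
        return 150


class SolarPanel:
    def watts(self) -> int:
        return 90


def report(platform_name: str, module: PowerModule) -> str:
    # The platform depends only on the interface, never on a concrete module.
    return f"{platform_name}: {module.watts()} W available"


print(report("rover", BatteryPack()))  # rover: 150 W available
print(report("rover", SolarPanel()))   # rover: 90 W available (module swapped)
```

The platform code never changes when a module is exchanged, which is the "single dimension, component slottability" kind of modularity described above.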
https://en.wikipedia.org/wiki/B-staging
B-staging is a process that utilizes heat or UV light to remove the majority of solvent from an adhesive, thereby allowing a construction to be “staged”. In between adhesive application, assembly and curing, the product can be held for a period of time, without sacrificing performance. Attempts to use traditional epoxies in IC packaging often created expensive production bottlenecks, because, as soon as the epoxy adhesive was applied, the components had to be assembled and cured immediately. B-staging eliminates these bottlenecks by allowing the IC manufacturing to proceed efficiently, with each step performed on larger batches of product. B stage laminates are also used in the electronic circuit board industry, where the laminates are reinforced with woven glass fibers called prepregs. This allows manufacturers to have clean and accurate setup for multilayer pressing of cores and prepregs for production of PCBs, without the need to hassle with liquid uncured epoxies. References Semiconductor device fabrication
https://en.wikipedia.org/wiki/Way%20of%20the%20Tiger
The Way of the Tiger is a series of adventure gamebooks by Mark Smith and Jamie Thomson, originally published by Knight Books (an imprint of Hodder & Stoughton) from 1985. They are set on the fantasy world of Orb. The reader takes the part of a young monk/ninja, named Avenger, initially on a quest to avenge his foster father's murder and recover stolen scrolls. Later books presented other challenges for Avenger to overcome, most notably taking over and ruling a city. The world of Orb was originally created by Mark Smith for a Dungeons & Dragons game he ran while a pupil at Brighton College in the mid-1970s. Orb was also used as the setting for the 1984 Fighting Fantasy gamebook Talisman of Death, and one of the settings in the 1985 Falcon gamebook Lost in Time, both by Smith and Thomson. Each book has a disclaimer at the front against performing any of the ninja related feats in the book as "They could lead to serious injury or death to an untrained user". The sixth book, Inferno!, ends on a cliffhanger with Avenger trapped in the web of the Black Widow, Orb's darkest blight. As no new books were released, the fate of Avenger and Orb was unknown. Mark Smith has confirmed that the cliffhanger ending was deliberate. In August 2013, the original creators of the series were working with Megara Entertainment to develop re-edited hardcover collector editions of the gamebooks (including a new prequel (Book 0) and sequel (book 7)), and potentially a role-playing game based on the series. The two new books plus the six re-edited original books were reprinted in paperback format by Megara Entertainment in 2014, and made available as PDFs in 2019. Books and publication history The original series comprises six books: Avenger! (1985) Assassin! (1985) Usurper! (1985) Overlord! (1986) Warbringer! (1986) Inferno! (1987) The sixth book ended on a cliffhanger, which was not resolved until 27 years later. Interviewed in 2012, Mark Smith explained: "Our publishers Hodder
https://en.wikipedia.org/wiki/Use%20case%20survey
A use case survey is a list of names and perhaps brief descriptions of use cases associated with a system, component, or other logical or physical entity. This artifact is short and inexpensive to produce, and possibly advantageous over similar software development tools, depending on the needs of the project and analyst. Software requirements
https://en.wikipedia.org/wiki/Zhou%20Chaochen
Zhou Chaochen (; born 1 November 1937) is a Chinese computer scientist. Zhou was born in Nanhui, Shanghai, China. He studied as an undergraduate at the Department of Mathematics and Mechanics, Peking University (1954–1958) and as a postgraduate at the Institute of Computing Technology, Chinese Academy of Sciences (CAS) (1963–1967). He worked at Peking University and CAS until his visit to the Oxford University Computing Laboratory (now the Oxford University Department of Computer Science) (1989–1992). During this time, he was the prime investigator of the duration calculus, an interval logic for real-time systems as part of the European ESPRIT ProCoS project on Provably Correct Systems. During the periods 1990–1992 and 1995–1996, Zhou Chaochen was visiting professor at the Department of Computer Science, Technical University of Denmark, Lyngby, on the invitation of Professor Dines Bjørner. He was Principal Research Fellow (1992–1997) and later Director of UNU-IIST in Macau (1997–2002), until his retirement, when he returned to Beijing. In 2007, Zhou and Dines Bjørner, the first Director of UNU-IIST, were honoured on the occasion of their 70th birthdays. Zhou is a member of the Chinese Academy of Sciences. Books Zhou, Chaochen and Hansen, Michael R., Duration Calculus: A Formal Approach to Real-Time Systems. Springer-Verlag, Monographs in Theoretical Computer Science, An EATCS Series, 2003. . References External links Institute of Software, Chinese Academy of Sciences (ISCAS) information 1937 births Living people Chinese computer scientists Formal methods people Members of the Chinese Academy of Sciences Members of the Department of Computer Science, University of Oxford Peking University alumni Academic staff of Peking University Scientists from Shanghai Academic staff of the Technical University of Denmark Academic staff of United Nations University
https://en.wikipedia.org/wiki/Friction%20sensitivity
Friction sensitivity is an approximation of the amount of friction or rubbing a compound can withstand before prematurely exploding. For instance, nitroglycerin has an extremely high sensitivity to friction, meaning that very little rubbing against it could set off a violent explosion. There is no exact way of determining the amount of friction required to set off a compound; rather, it is approximated from the amount of force applied and the amount of time before the compound explodes. Explosives engineering
https://en.wikipedia.org/wiki/Z%20User%20Group
The Z User Group (ZUG) was established in 1992 to promote use and development of the Z notation, a formal specification language for the description of and reasoning about computer-based systems. It was formally constituted on 14 December 1992 during the ZUM'92 Z User Meeting in London, England. Meetings and conferences ZUG has organised a series of Z User Meetings approximately every 18 months initially. From 2000, these became the ZB Conference (jointly with the B-Method, co-organized with APCB), and from 2008 the ABZ Conference (with Abstract State Machines as well). In 2010, the ABZ Conference also includes Alloy, a Z-like specification language with associated tool support. The Z User Group participated at the FM'99 World Congress on Formal Methods in Toulouse, France, in 1999. The group and the associated Z notation have been studied as a Community of Practice. List of proceedings The following proceedings were produced by the Z User Group: Bowen, J.P.; Nicholls, J.E., eds. (1993). Z User Workshop, London 1992, Proceedings of the Seventh Annual Z User Meeting, 14–15 December 1992. Springer, Workshops in Computing. Bowen, J.P.; Hall, J.A., eds. (1994). Z User Workshop, Cambridge 1994, Proceedings of the Eighth Annual Z User Meeting, 29–30 June 1994. Springer, Workshops in Computing. Bowen, J.P.; Hinchey, M.G, eds. (1995). ZUM '95: The Z Formal Specification Notation, 9th International Conference of Z Users, Limerick, Ireland, September 7–9, 1995. Springer, Lecture Notes in Computer Science, Volume 967. Bowen, J.P.; Hinchey, M.G.; Till, D., eds. (1997). ZUM '97: The Z Formal Specification Notation, 10th International Conference of Z Users, Reading, UK, April 3–4, 1997. Springer, Lecture Notes in Computer Science, Volume 1212. Bowen, J.P.; Fett, A.; Hinchey, M.G., eds. (1998). ZUM '98: The Z Formal Specification Notation, 11th International Conference of Z Users, Berlin, Germany, September 24–26, 1998. Springer, Lecture Notes in Computer Science, Vol
https://en.wikipedia.org/wiki/BCS-FACS
BCS-FACS is the BCS Formal Aspects of Computing Science Specialist Group. Overview The FACS group, inaugurated on 16 March 1978, organizes meetings for its members and others on formal methods and related computer science topics. There is an associated journal, Formal Aspects of Computing, published by Springer, and a more informal FACS FACTS newsletter. The group celebrated its 20th anniversary with a meeting at the Royal Society in London in 1998, with presentations by four eminent computer scientists, Mike Gordon, Tony Hoare, Robin Milner and Gordon Plotkin, all Fellows of the Royal Society. From 2002–2008 and since 2013 again, the Chair of BCS-FACS has been Jonathan Bowen. Jawed Siddiqi was Chair during 2008–2013. In December 2002, BCS-FACS organized a conference on the Formal Aspects of Security (FASec'02) at Royal Holloway, University of London. In 2004, FACS organized a major event at London South Bank University to celebrate its own 25th anniversary and also 25 Years of CSP (CSP25), attended by the originator of CSP, Sir Tony Hoare, and others in the field. The group liaises with other related groups such as the Centre for Software Reliability, Formal Methods Europe, the London Mathematical Society Computer Committee, the Safety-Critical Systems Club, and the Z User Group. It has held joint meetings with other BCS specialist groups such as the Advanced Programming Group and BCSWomen. FACS sponsors and supports meetings, such as the Refinement Workshop. It has often held a Christmas event each year, with a theme related to formal aspects of computing — for example, teaching formal methods and formal methods in industry. BCS-FACS supported the ABZ 2008 conference at the BCS London premises. In 2015, FACS hosted a two-day ProCoS Workshop on "Provably Correct Systems", with many former members of the ESPRIT ProCoS I and II projects and Working Group of the 1990s. Evening seminars In recent years, a series of evening seminars have been held, mainly at the
https://en.wikipedia.org/wiki/Rabbit-proof%20fence
The State Barrier Fence of Western Australia, formerly known as the Rabbit-Proof Fence, the State Vermin Fence, and the Emu Fence, is a pest-exclusion fence constructed between 1901 and 1907 to keep rabbits, and other agricultural pests from the east, out of Western Australian pastoral areas. There are three fences in Western Australia: the original No. 1 Fence crosses the state from north to south, No. 2 Fence is smaller and further west, and No. 3 Fence is smaller still and runs east–west. The fences took six years to build. When completed, the rabbit-proof fence (including all three fences) stretched . The cost to build each kilometre of fence at the time was about $250 (). When it was completed in 1950, the No. 1 Fence was the longest unbroken fence in the world. History Rabbits were introduced to Australia by the First Fleet in 1788, but they became a problem after October 1859, when Thomas Austin released 24 wild rabbits from England for hunting purposes, believing "The introduction of a few rabbits could do little harm and might provide a touch of home, in addition to a spot of hunting." The rabbits proved to be extremely prolific and spread rapidly across the southern parts of the country. Australia had ideal conditions for an explosion in the rabbit population, including the fact that they had virtually no local predators. By 1887, losses from rabbit damage compelled the New South Wales Government to offer a £25,000 reward () for "any method of success not previously known in the Colony for the effectual extermination of rabbits". A Royal Commission was held in 1901 to investigate the situation. Construction The fence posts are placed apart and have a minimum diameter of . There were initially three wires of gauge, strung , , and above ground, with a barbed wire added later at and a plain wire at , to make the fence a barrier for dingoes and foxes as well. Wire netting, extending below ground, was attached to the wire. The fence was constru
https://en.wikipedia.org/wiki/KAUT-TV
KAUT-TV (channel 43) is a television station in Oklahoma City, Oklahoma, United States, serving as the local outlet for The CW Television Network. It is owned and operated by the network's majority owner, Nexstar Media Group, alongside NBC affiliate KFOR-TV (channel 4). Both stations share studios in Oklahoma City's McCourry Heights section, while KAUT-TV's transmitter is located on the city's northeast side. History Early history The UHF channel 43 allocation in Oklahoma City was originally assigned to Christian Broadcasting of Oklahoma Inc. – a religious nonprofit corporation headed by George G. Teague, a local evangelist and co-founder of the Capitol Hill Assembly of God – which filed an application with the Federal Communications Commission (FCC) for a license and construction permit on April 4, 1977, proposing to sign on a non-commercial religious television station on the frequency. The FCC Broadcast Bureau granted the license to Christian Broadcasting of Oklahoma on November 17, 1978; two months later in January 1979, the group applied to use KFHC-TV as the planned station's callsign. On July 13, 1979, the Teague group announced it would sell the license to Golden West Broadcasters (a joint venture between actor/singer and Ravia, Oklahoma, native Gene Autry and The Signal Companies that, at the time, also owned independent station KTLA [now a fellow CW owned-and-operated station] in Los Angeles) for $60,000; the FCC granted approval of the transaction on January 24, 1980. VEU The station first signed on the air on October 15, 1980, as KAUT, initially operating as a pilot station for Golden West's subscription service Video Entertainment Unlimited (VEU). (The callsign, which references controlling group stakeholder Autry, was chosen by Golden West two months prior to sign-on; a "-TV" suffix would be added to the callsign on January 27, 1983.) It was the first broadcast outlet for the service, which Golden West's pay television unit, Golden West Subscriptio
https://en.wikipedia.org/wiki/Formal%20Methods%20Europe
Formal Methods Europe (FME) is an organization whose aim is to encourage the research and application of formal methods for the improvement of software and hardware in computer-based systems. The association's members are drawn from academia and industry. It is based in Europe, but is international in scope. FME operates under Dutch law. Activities include or have included: Dissemination of research findings and industrial experience through conferences (every 18 months) and sponsored events; Development of information resources for educators; Networking for commercial practitioners through ForTIA (Formal Techniques Industry Association). The Chair of FME is John Fitzgerald of the University of Newcastle upon Tyne, UK. ForTIA The Formal Techniques Industry Association (ForTIA) aimed to support the industrial use of formal methods under the umbrella organization of Formal Methods Europe. It was founded in 2003 through the initial efforts of Dines Bjørner and was chaired by Anthony Hall and Volkmar Lotz among others. Its scope was international and membership was by company. It organized meetings, especially in conjunction with conferences, for instance, industry days at the FM conferences organized by FME. See also BCS-FACS Formal Aspects of Computing Science Specialist Group Formal methods Anthony Hall, founding chair of ForTIA References External links FME website Formal Method Europe group on LinkedIn ForTIA website (2009) Organizations with year of establishment missing Formal methods organizations International organizations based in Sweden Information technology organizations based in Europe
https://en.wikipedia.org/wiki/MK484
The MK484 AM radio IC is a fully functional AM radio detector on a chip. It is constructed in a TO-92 case, resembling a small transistor. It replaces the similar ZN414 AM radio IC from the 1970s. The MK484 is favored by many hobbyists. It is advantageous in that it performs well with minimal discrete components, and can run from a single 1.5-volt cell. The MK484 has now in turn been replaced by the TA7642. Standard operation The simplest circuit employing the MK484 can be constructed using only a battery, an earphone (or high-impedance speaker), a coil and a variable capacitor. Extended operation The output of the MK484 can be fed into the base of a transistor to provide greater amplification as a class-A amplifier, however this is often an inefficient design. Conversely, the LM386 audio amplifier may be used to drive a small speaker. Note that higher voltage is required if the LM386 is to be used. Therefore, small signal diodes (such as 1N4148) are recommended to create a voltage drop, or use a Zener DC–DC converter with a red LED (in forward, can double as a power indicator) and a resistor (several hundred ohms for 9V operation), to avoid overvolting the MK484. Advantages Compact size Low power consumption Low cost: Rs 40 to 60 in Indian electronic markets Retails from 65 cents each on various websites References External links Ferranti ZN414Z/ZN415E/ZN416E datasheet GEC Plessey ZN414Z/415E/416E datasheet Rapid MK484 datasheet Rapid TA7642 datasheet Mitsumi LMF501T datasheet Web site with examples of some matchbox radios based on the ZN414Z Another ZN414 radio A MK484 radio Linear integrated circuits
https://en.wikipedia.org/wiki/Deaerator
A deaerator is a device that is used for the removal of dissolved gases like oxygen from a liquid. Thermal deaerators are commonly used to remove dissolved gases in feedwater for steam-generating boilers. The deaerator is part of the feedwater heating system. Dissolved oxygen in feedwater will cause serious corrosion damage in a boiler by attaching to the walls of metal piping and other equipment forming oxides (like rust). Dissolved carbon dioxide combines with water to form carbonic acid that may cause further corrosion. Most deaerators are designed to remove oxygen down to levels of 7 parts per billion by weight or less, as well as essentially eliminating carbon dioxide. Vacuum deaerators are used to remove dissolved gases from products such as food, personal care products, cosmetic products, chemicals, and pharmaceuticals to increase the dosing accuracy in the filling process, to increase product shelf stability, to prevent oxidative effects (e.g. discolouration, changes of smell or taste, rancidity), to alter pH, and to reduce packaging volume. Manufacturing of deaerators started in the 1800s and continues to the present day. History Manufacturing of deaerators started in the 1800s. They were used to purify water used in the ice manufacturing process. Feed water heaters were used for marine applications. In 1899, George M Kleucker received a patent for an improved method of de-aerating water. Two sister ships, Olympic and Titanic (1912), had contact feed heaters on board. In 1934 the US Navy purchased an atomizing deaerator. During the 1920s feedwater heater and deaerator designs improved. Between 1921 and 1933, George Gibson, Percy Lyon, and Victor Rohlin of Cochrane received deaerator / degasification patents for bubbling steam through liquid. In 1926, Brown Stanley received a patent for reducing oxygen and nitrogen gases (deaeration). In 1937 Samuel B Applebaum of Permutit received a water deaerator and purifier patent. Deaerators continue to b
https://en.wikipedia.org/wiki/Tucson%20Bird%20Count
The Tucson Bird Count (TBC) is a community-based program that monitors bird populations in and around the Tucson, Arizona, United States metropolitan area. With nearly 1000 sites monitored annually, the Tucson Bird Count is among the largest urban biological monitoring programs in the world. Methods Each spring, TBC participants collect data on bird abundance and distribution at hundreds of point count locations arrayed across the Tucson basin. The TBC is an example of citizen science, drawing on the combined efforts of hundreds of volunteers. So that data are of suitable quality for scientific analysis and decisionmaking, all TBC volunteers are skilled birdwatchers; many are also professional field guides or biologists. TBC methods are similar to those employed by the North American Breeding Bird Survey, although the TBC uses more closely spaced sites (one site per 1-km2 square) over a smaller total area (approximately 1000 km2). The TBC's spatially systematic monitoring is complemented by a TBC park monitoring program that surveys parks, watercourses, or other areas of particular interest multiple times throughout the year. Uses of Data Uses of Tucson Bird Count data include monitoring the status of the Tucson-area bird community over time, finding the areas and land-use practices that are succeeding at sustaining native birds, and investigating the ecology of birds in human-dominated landscapes. Tucson Bird Count results have led to scientific publications, informed Tucson-area planning, and contributed a variety of projects, from locating populations of imperiled species to estimating risk to humans from West Nile Virus. The TBC and several associated research projects are examples of reconciliation ecology, in that they investigate how native species can be sustained in and around the places people live, work, and play. Researchers have also used TBC data to explore the extent to which urban humans are separated from nature (Turner et al. 2004) Recently, the
https://en.wikipedia.org/wiki/Bernoulli%20scheme
In mathematics, the Bernoulli scheme or Bernoulli shift is a generalization of the Bernoulli process to more than two possible outcomes. Bernoulli schemes appear naturally in symbolic dynamics, and are thus important in the study of dynamical systems. Many important dynamical systems (such as Axiom A systems) exhibit a repellor that is the product of the Cantor set and a smooth manifold, and the dynamics on the Cantor set are isomorphic to that of the Bernoulli shift. This is essentially the Markov partition. The term shift is in reference to the shift operator, which may be used to study Bernoulli schemes. The Ornstein isomorphism theorem shows that Bernoulli shifts are isomorphic when their entropy is equal. Definition A Bernoulli scheme is a discrete-time stochastic process where each independent random variable may take on one of N distinct possible values, with the outcome i occurring with probability p_i, with i = 1, ..., N, and p_1 + p_2 + ... + p_N = 1. The sample space is usually denoted as X = {1, ..., N}^Z as a shorthand for the set of all bi-infinite sequences x = (..., x_{-1}, x_0, x_1, ...) with every x_k in {1, ..., N}. The associated measure is called the Bernoulli measure μ = (p_1, ..., p_N)^Z, the product measure assigning probability p_i to the outcome i in each coordinate. The σ-algebra 𝒜 on X is the product sigma algebra; that is, it is the (countable) direct product of the σ-algebras of the finite set {1, ..., N}. Thus, the triplet (X, 𝒜, μ) is a measure space. A basis of 𝒜 is the cylinder sets. Given a cylinder set [x_0, x_1, ..., x_n], its measure is μ([x_0, x_1, ..., x_n]) = p_{x_0} p_{x_1} ... p_{x_n}. The equivalent expression, using the notation of probability theory, is Pr(X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = p_{x_0} p_{x_1} ... p_{x_n} for the random variables {X_k}. The Bernoulli scheme, as any stochastic process, may be viewed as a dynamical system by endowing it with the shift operator T where (Tx)_k = x_{k+1}. Since the outcomes are independent, the shift preserves the measure, and thus T is a measure-preserving transformation. The quadruplet (X, 𝒜, μ, T) is a measure-preserving dynamical system, and is called a Bernoulli scheme or a Bernoulli shift. It is often denoted by BS(p) = BS(p_1, ..., p_N). The N = 2 Bernoulli scheme is called a Bernoulli process. The Bernoulli shift can be understood as a special case of the Markov shift, where all entries in the adjacency matrix are one,
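As a concrete illustration (a small sketch, not part of the article), a finite window of a Bernoulli scheme can be sampled, the shift operator applied to it, and a cylinder-set measure computed:

```python
# Illustrative sketch (not from the article): sample a finite window of a Bernoulli
# scheme with outcome probabilities p = (p_1, ..., p_N), apply the shift operator T,
# and compute the Bernoulli measure of a cylinder set.
import random

def sample_window(p, length, seed=0):
    """Draw `length` i.i.d. outcomes 1..N with probabilities p (a Bernoulli-scheme window)."""
    rng = random.Random(seed)
    outcomes = list(range(1, len(p) + 1))
    return [rng.choices(outcomes, weights=p)[0] for _ in range(length)]

def shift(x):
    """The shift operator T on a finite window: (Tx)_k = x_{k+1}."""
    return x[1:]

def cylinder_measure(p, word):
    """Bernoulli measure of the cylinder set fixing the symbols in `word`."""
    m = 1.0
    for symbol in word:
        m *= p[symbol - 1]
    return m

p = (0.5, 0.25, 0.25)                  # N = 3 outcomes
x = sample_window(p, 10)
print(x)                               # a list of 10 symbols from {1, 2, 3}
print(shift(x))                        # the same window shifted one step to the left
print(cylinder_measure(p, [1, 2, 1]))  # 0.5 * 0.25 * 0.5 = 0.0625
```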
https://en.wikipedia.org/wiki/WLAE-TV
WLAE-TV (channel 32) is an educational independent television station in New Orleans, Louisiana, United States. The station is owned by the Educational Broadcasting Foundation, a partnership between the Willwoods Community (a Catholic-related organization) and the Louisiana Educational Television Authority (operator of Louisiana Public Broadcasting, which owns the PBS member stations in Louisiana that are located outside of New Orleans). WLAE's studios are located on Howard Avenue in New Orleans, and its transmitter is located on Paris Road/Highway 47 (northeast of Chalmette). History As a PBS member station In 1978, a group of married couples, supported by the Catholic Church, formed the Willwoods Community. The organization joined forces with the Louisiana Educational Television Authority, which had been looking for a way to get its locally-based programming into the state's largest market. At the time, WYES-TV (channel 12) was the city's sole public TV station, and Willwoods sought to obtain the other non-commercial license allocated to the New Orleans market. On December 14, 1981, under the banner of the "Educational Broadcasting Foundation," the partnership was granted an educational station license from the Federal Communications Commission (FCC). WLAE-TV first signed on the air on July 8, 1984; it originally served as a member station of the Public Broadcasting Service (PBS). WLAE-TV operated as a secondary member of the network through PBS' Program Differentiation Plan, and thus only carried 25% of the programming broadcast by PBS, while the remainder aired on WYES. As a side note, Sesame Street was one of the few programs that was shown on both stations. In addition to offering PBS programming, WLAE also aired, and still airs, locally produced educational programs, as well as select programming from Louisiana Public Broadcasting (mostly consisting of news and public affairs programming). WLAE is also one of very few public television stations to televi
https://en.wikipedia.org/wiki/Motor%20variable
In mathematics, a function of a motor variable is a function with arguments and values in the split-complex number plane, much as functions of a complex variable involve ordinary complex numbers. William Kingdon Clifford coined the term motor for a kinematic operator in his "Preliminary Sketch of Biquaternions" (1873). He used split-complex numbers for scalars in his split-biquaternions. Motor variable is used here in place of split-complex variable for euphony and tradition. Functions of a motor variable provide a context to extend real analysis and provide compact representation of mappings of the plane. However, the theory falls well short of function theory on the ordinary complex plane. Nevertheless, some of the aspects of conventional complex analysis have an interpretation given with motor variables, and more generally in hypercomplex analysis. Elementary functions Let D = {z = x + j y : x, y real, j^2 = +1}, the split-complex plane. The following exemplar functions f have domain and range in D: The action of a hyperbolic versor exp(a j) = cosh a + j sinh a is combined with translation to produce the affine transformation f(z) = exp(a j) z + c. When c = 0, the function is equivalent to a squeeze mapping. The squaring function has no analogy in ordinary complex arithmetic. Let z = x + j y and note that z^2 = (x^2 + y^2) + 2 x y j. The result is that the four quadrants are mapped into one, the quadrant containing the identity component: {z = x + j y : x ≥ |y|}. Note that z z* = x^2 − y^2, so the points with z z* = 1 form the unit hyperbola. Thus, the reciprocation z → 1/z = z*/(z z*) involves the hyperbola as curve of reference as opposed to the circle in C. Linear fractional transformations Using the concept of a projective line over a ring, the projective line P(D) is formed. The construction uses homogeneous coordinates with split-complex number components. The projective line P(D) is transformed by linear fractional transformations: [z : 1] → [a z + b : c z + d], sometimes written f(z) = (a z + b)/(c z + d), provided cz + d is a unit in D. Elementary linear fractional transformations include hyperbolic rotations z → exp(a j) z, translations z → z + c, and the inversion z → 1/z. Each of these has an inverse, and compositions fill out a group of line
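A short sketch (illustrative, not from the article) of split-complex arithmetic makes the squaring behaviour concrete; with j^2 = +1, multiplication mixes the components differently from ordinary complex numbers:

```python
# Illustrative sketch of split-complex (motor-variable) arithmetic, where j*j = +1.
# A number z = x + j*y is stored as the pair (x, y); the class name is invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Split:
    x: float  # real part
    y: float  # coefficient of j, with j*j = +1

    def __mul__(self, other):
        # (x1 + j y1)(x2 + j y2) = (x1 x2 + y1 y2) + j (x1 y2 + x2 y1)
        return Split(self.x * other.x + self.y * other.y,
                     self.x * other.y + self.y * other.x)

    def conj(self):
        return Split(self.x, -self.y)

    def modulus2(self):
        # z z* = x^2 - y^2: the analogue of the unit circle is the hyperbola x^2 - y^2 = 1
        return self.x * self.x - self.y * self.y

z = Split(2.0, 1.0)
print(z * z)         # Split(x=5.0, y=4.0): (x^2 + y^2) + j(2xy), which lies in x >= |y|
print(z.modulus2())  # 3.0, so z lies on the hyperbola x^2 - y^2 = 3
```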
https://en.wikipedia.org/wiki/Ancient%20UNIX
Ancient UNIX is any early release of the Unix code base prior to Unix System III, particularly the Research Unix releases prior to and including Version 7 (the base for UNIX/32V as well as later developments of AT&T Unix). After the publication of the Lions' book, work was undertaken to release earlier versions of the codebase. SCO first released the code under a limited educational license. Later, in January 2002, Caldera International (now SCO Group) relicensed (but has not made available) several versions under the four-clause BSD license, namely: Research Unix: (early versions only) Version 1 Unix Version 2 Unix Version 3 Unix Version 4 Unix Version 5 Unix Version 6 Unix Version 7 Unix UNIX/32V. There has been no widespread use of the code, but it can be used on emulator systems, and Version 5 Unix runs on the Nintendo Game Boy Advance using the SIMH PDP-11 emulator. Version 6 Unix provides the basis for the MIT xv6 teaching system, which is an update of that version to ANSI C and the x86 or RISC-V platform. The BSD vi text editor is based on code from the ed line editor in those early Unixes. Therefore, "traditional" vi could not be distributed freely, and various work-alikes (such as nvi) were created. Now that the original code is no longer encumbered, the "traditional" vi has been adapted for modern Unix-like operating systems. SCO Group, Inc. was previously called Caldera International. As a result of the SCO Group, Inc. v. Novell, Inc. case, Novell, Inc. was found to not have transferred the copyrights of UNIX to SCO Group, Inc. Concerns have been raised regarding the validity of the Caldera license. The Unix Heritage Society The Unix Heritage Society was founded by Warren Toomey. First edition Unix was restored to a usable state by a restoration team from the Unix Heritage Society in 2008. The restoration process started with paper listings of the source code which were in Unix PDP-11 assembly language. References External links Pamela J
https://en.wikipedia.org/wiki/NetIQ%20eDirectory
eDirectory is an X.500-compatible directory service software product from NetIQ. Previously owned by Novell, the product has also been known as Novell Directory Services (NDS) and sometimes referred to as NetWare Directory Services. NDS was initially released by Novell in 1993 for Netware 4, replacing the Netware bindery mechanism used in previous versions, for centrally managing access to resources on multiple servers and computers within a given network. eDirectory is a hierarchical, object oriented database used to represent certain assets in an organization in a logical tree, including organizations, organizational units, people, positions, servers, volumes, workstations, applications, printers, services, and groups to name just a few. Features eDirectory uses dynamic rights inheritance, which allows both global and specific access controls. Access rights to objects in the tree are determined at the time of the request and are determined by the rights assigned to the objects by virtue of their location in the tree, any security equivalences, and individual assignments. The software supports partitioning at any point in the tree, as well as replication of any partition to any number of servers. Replication between servers occurs periodically using deltas of the objects. Each server can act as a master of the information it holds (provided the replica is not read only). Additionally, replicas may be filtered to only include defined attributes to increase speed (for example, a replica may be configured to only include a name and phone number for use in a corporate address book, as opposed to the entire directory user profile). The software supports referential integrity, multi-master replication, and has a modular authentication architecture. It can be accessed via LDAP, DSML, SOAP, ODBC, JDBC, JNDI, and ADSI. Supported platforms Windows 2000 Windows Server 2003 Windows Server 2008 Windows Server 2012 SUSE Linux Enterprise Server Red Hat Enterprise Li
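Since eDirectory is reachable over standard LDAP, a directory query can be sketched with the Python ldap3 library. This is an illustrative sketch only: the host name, bind credentials, and tree context below are invented, and attribute layouts vary between deployments.

```python
# Illustrative sketch: querying an eDirectory tree over LDAP with the ldap3 library.
# The server address, bind DN, password, and search base are hypothetical examples.
from ldap3 import Server, Connection, ALL

server = Server("ldaps://edir.example.com", get_info=ALL)
conn = Connection(server, user="cn=admin,o=acme", password="secret", auto_bind=True)

# Look up people in the organization and read a couple of common attributes,
# much like the filtered "corporate address book" replica described above.
conn.search(
    search_base="o=acme",
    search_filter="(objectClass=inetOrgPerson)",
    attributes=["cn", "telephoneNumber"],
)
for entry in conn.entries:
    print(entry.entry_dn, entry.cn, entry.telephoneNumber)

conn.unbind()
```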
https://en.wikipedia.org/wiki/Concurrent%20computing
Concurrent computing is a form of computing in which several computations are executed concurrently—during overlapping time periods—instead of sequentially—with one completing before the next starts. This is a property of a system—whether a program, computer, or a network—where there is a separate execution point or "thread of control" for each process. A concurrent system is one where a computation can advance without waiting for all other computations to complete. Concurrent computing is a form of modular programming. In its paradigm an overall computation is factored into subcomputations that may be executed concurrently. Pioneers in the field of concurrent computing include Edsger Dijkstra, Per Brinch Hansen, and C.A.R. Hoare. Introduction The concept of concurrent computing is frequently confused with the related but distinct concept of parallel computing, although both can be described as "multiple processes executing during the same period of time". In parallel computing, execution occurs at the same physical instant: for example, on separate processors of a multi-processor machine, with the goal of speeding up computations—parallel computing is impossible on a (one-core) single processor, as only one computation can occur at any instant (during any single clock cycle). By contrast, concurrent computing consists of process lifetimes overlapping, but execution need not happen at the same instant. The goal here is to model processes in the outside world that happen concurrently, such as multiple clients accessing a server at the same time. Structuring software systems as composed of multiple concurrent, communicating parts can be useful for tackling complexity, regardless of whether the parts can be executed in parallel. For example, concurrent processes can be executed on one core by interleaving the execution steps of each process via time-sharing slices: only one process runs at a time, and if it does not complete during its time slice, it is paused, an
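The interleaving described above — concurrent tasks sharing one core by taking turns rather than running at the same physical instant — can be sketched with Python's asyncio (an illustrative sketch; the task names are invented):

```python
# Illustrative sketch: two concurrent tasks interleave on a single thread.
# Each "await" is a point where a task yields so the other can make progress;
# nothing runs in parallel, yet both computations advance during the same period.
import asyncio

async def worker(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        await asyncio.sleep(0)   # yield control, letting the scheduler interleave tasks

async def main():
    await asyncio.gather(worker("A", 3), worker("B", 3))

asyncio.run(main())
# Output interleaves the two workers, e.g. A:0, B:0, A:1, B:1, A:2, B:2
```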
https://en.wikipedia.org/wiki/Cosmic%20pluralism
Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the belief in numerous "worlds" (planets, dwarf planets or natural satellites) in addition to Earth (possibly an infinite number), which may harbour extraterrestrial life. The debate over pluralism began as early as the time of Anaximander (c. 610 – c. 546 BC) as a metaphysical argument, long predating the scientific Copernican conception that the Earth is one of numerous planets. It has continued, in a variety of forms, until the modern era. Ancient Greek debates In Greek times, the debate was largely philosophical and did not conform to present notions of cosmology. Cosmic pluralism was a corollary to notions of infinity and the purported multitude of life-bearing worlds were more akin to parallel universes (either contemporaneously in space or infinitely recurring in time) than to different solar systems. After Anaximander opened the door to both an infinite universe and an infinite amount of universes, a strong pluralist stance was adopted by the atomists, notably Leucippus, Democritus, Epicurus—whose Epistle to Herodotus clearly lays out the Doctrine of Innumerable Worlds—and Lucretius who elaborates this Doctrine in his work De rerum natura. Anaxarchus told Alexander the Great that there were an infinite number of worlds that each harbored an infinite variety of extraterrestrial life, leading Alexander to weep, for he had not yet conquered even one. While these were prominent thinkers, their opponents—Plato and Aristotle—had greater effect. They argued that the Earth is unique and that there can be no other systems of worlds. This stance neatly dovetailed with later Christian ideas, and pluralism was effectively suppressed for approximately a millennium. Medieval Islamic thought Many medieval Muslim scholars endorsed the idea of cosmic pluralism. Imam Muhammad al-Baqir (676–733) wrote "Maybe you see that God created only this single world and that God did not create humans besides yo
https://en.wikipedia.org/wiki/Applix
Applix Inc. was a computer software company founded in 1983 based in Westborough, Massachusetts that published Applix TM1, a multi-dimensional online analytical processing (MOLAP) database server, and related presentation tools, including Applix Web and Applix Executive Viewer. Together, Applix TM1, Applix Web and Applix Executive Viewer were the three core components of the Applix Business Analytics Platform. (Executive Viewer was subsequently discontinued by IBM.) On October 25, 2007, Applix was acquired by Cognos. Cognos rebranded all Applix products under its name following the acquisition. On January 31, 2008, Cognos was itself acquired by IBM. Prior to OLAP industry consolidation in 2007, Applix was the purest OLAP vendor among publicly traded independent business intelligence vendors, and had the greatest growth rate. TM1 is now marketed as IBM Cognos TM1; version 10.2 became publicly available on September 13, 2013. Products and technology Applix TM1 is enterprise planning software used for collaborative planning, budgeting and forecasting, as well as analytical and reporting applications. Data in TM1 is stored and represented as multidimensional cubes, with data being stored at the "leaf" level. Computations on the leaf data are performed in real-time (for example, to aggregate numbers up a dimensional hierarchy). IBM Cognos TM1 also includes a data orchestration environment for accessing external data and systems, as well as capabilities designed for common business planning and budgeting requirements (e.g. workflow, top-down adjustments). See also Applixware References External links IBM Cognos TM1 Web Site Companies established in 1983 Companies disestablished in 2007 Defunct companies based in Massachusetts Defunct software companies of the United States Online analytical processing 1983 establishments in Massachusetts
https://en.wikipedia.org/wiki/Focal%20point%20%28game%20theory%29
In game theory, a focal point (or Schelling point) is a solution that people tend to choose by default in the absence of communication. The concept was introduced by the American economist Thomas Schelling in his book The Strategy of Conflict (1960). Schelling states that "(p)eople can often concert their intentions or expectations with others if each knows that the other is trying to do the same" in a cooperative situation (at page 57), so their action would converge on a focal point which has some kind of prominence compared with the environment. However, the conspicuousness of the focal point depends on time, place and people themselves. It may not be a definite solution. Existence The existence of the focal point is first demonstrated by Schelling with a series of questions. The most famous one is the New York City question: if you are to meet a stranger in New York City, but you cannot communicate with the person, then when and where will you choose to meet? This is a coordination game, where any place and time in the city could be an equilibrium solution. Schelling asked a group of students this question, and found the most common answer was "noon at (the information booth at) Grand Central Terminal". There is nothing that makes Grand Central Terminal a location with a higher payoff (you could just as easily meet someone at a bar or the public library reading room), but its tradition as a meeting place raises its salience and therefore makes it a natural "focal point". Later, Schelling's informal experiments have been replicated under controlled conditions with monetary incentives by Judith Mehta. Theories Although the concept of a focal point has been widely accepted in game theory, it is still unclear how a focal point forms. The researchers have proposed theories from two aspects. The level-n theory Stahl and Wilson argue that a focal point is formed because players would try to predict how other players act. They model the level of "rational expectat
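The underlying coordination problem can be made concrete with a tiny payoff model (an illustrative sketch, not taken from Schelling): both meeting places are equally good equilibria, so the payoff structure alone cannot tell the players which one to pick — that selection is exactly what a focal point supplies.

```python
# Illustrative sketch: a pure coordination game between two players who each pick
# a meeting place. Payoff is 1 if they coordinate, 0 otherwise; both coordinated
# outcomes are Nash equilibria, so nothing in the payoffs singles one out.
from itertools import product

places = ["Grand Central", "Public Library"]

def payoff(a, b):
    return (1, 1) if a == b else (0, 0)

def is_nash(a, b):
    pa, pb = payoff(a, b)
    best_a = all(payoff(alt, b)[0] <= pa for alt in places)
    best_b = all(payoff(a, alt)[1] <= pb for alt in places)
    return best_a and best_b

for a, b in product(places, places):
    if is_nash(a, b):
        print(f"Nash equilibrium: both choose {a}")
# Two symmetric equilibria are printed; the "focal" one is chosen by salience, not payoff.
```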
https://en.wikipedia.org/wiki/Virtual%20Library%20museums%20pages
The Virtual Library museums pages (VLmp) formed an early leading directory of online museums around the world. History The VLmp online directory resource was founded by Jonathan Bowen in 1994, originally at the Oxford University Computing Laboratory in the United Kingdom. It has been supported by the International Council of Museums (ICOM) and Museophile Limited. It formed part of the World Wide Web Virtual Library, initiated by Tim Berners-Lee and later managed by Arthur Secret. The main VLmp site moved to London South Bank University in the early 2000s and is now hosted on the MuseumsWiki wiki, established in 2006 and hosted by Fandom (previously Wikia) as a historical record. The directory was developed and organised in a distributed manner by country, with around twenty people in different countries maintaining various sections. Canada, through the Canadian Heritage Information Network (CHIN), was the first country to become involved. The MDA maintained the United Kingdom section of museums, later the Collections Trust. The Historisches Centrum Hagen has maintained and hosted pages for Germany. Other countries actively participating included Romania. In total, around 20 countries were involved. The directory was influential in the museum field during the 1990s and 2000s. It was used as a standard starting point to find museums online. It was useful for monitoring the growth of museums internationally online. It was also used for online museum surveys. It was recommended as an educational resource and included a search facility. Virtual Museum of Computing The Virtual Museum of Computing (VMoC), part of the Virtual Library museums pages, was created as a virtual museum providing information on the history of computers and computer science. It included virtual "galleries" (e.g., on Alan Turing, curated by Andrew Hodges) and links to other computer museums. VMoC was founded in 1995, initially at the University of Oxford. As part of VLmp, it was hosted by the Internati
https://en.wikipedia.org/wiki/DATAR
DATAR, short for Digital Automated Tracking and Resolving, was a pioneering computerized battlefield information system. DATAR combined the data from all of the sensors in a naval task force into a single "overall view" that was then transmitted back to all of the ships and displayed on plan-position indicators similar to radar displays. Commanders could then see information from everywhere, not just their own ship's sensors. Development of the DATAR system was spurred by the Royal Navy's work on the Comprehensive Display System (CDS), which Canadian engineers were familiar with. The project was started by the Royal Canadian Navy in partnership with Ferranti Canada (later known as Ferranti-Packard) in 1949. They were aware of CDS and a US Navy project along similar lines but believed their solution was so superior that they would eventually be able to develop the system on behalf of all three forces. They also believed sales were possible to the Royal Canadian Air Force and US Air Force for continental air control. A demonstration carried out in the fall of 1953 was by most measures an unqualified success, to the point where some observers thought it was being faked. By this time the US Air Force was well into development of their SAGE system and the RCAF decided that commonality with that force was more important than commonality with their own Navy. The Royal Navy computerized their CDS in the new Action Data Automation system, and the US Navy decided on a somewhat simpler system, the Naval Tactical Data System. No orders for DATAR were forthcoming. When one of the two computers was destroyed by fire, the company was unable to raise funds for a replacement, and the project ended. The circuitry design used in the system would be applied to several other Ferranti machines over the next few years. History Canadian Navy during the War At the Atlantic Convoy Conference of 1943, Canada was given shared control of all convoys running between the British Isles and No
https://en.wikipedia.org/wiki/Confit
Confit (, ) (from the French word confire, literally "to preserve") is any type of food that is cooked slowly over a long period as a method of preservation. Confit, as a cooking term, describes when food is cooked in grease, oil, at a lower temperature, as opposed to deep frying. While deep frying typically takes place at temperatures of , confit preparations are done at a much lower temperature, such as an oil temperature of around , or sometimes even cooler. The term is usually used in modern cuisine to mean long, slow cooking in oil or fat at low temperatures, many having no element of preservation, such as in dishes like confit potatoes. For meat, this method requires the meat to be salted as part of the preservation process. After salting and cooking in fat, confit can last for several months or years when sealed and stored in a cool, dark place. Confit is a specialty of southwestern France. Etymology The word comes from the French verb confire (to preserve), which in turn comes from the Latin word (conficere), meaning "to do, to produce, to make, to prepare." The French verb was first applied in medieval times to fruits cooked and preserved in sugar. Fruit confit Fruit confit are candied fruit (whole fruit, or pieces thereof) preserved in sugar. The fruit must be fully infused with sugar to its core; larger fruit takes considerably longer than smaller ones to candy. Thus, while small fruit such as cherries are confits whole, it is quite rare to see whole large fruit, such as melon confits since the time and energy involved in producing large fruit confits makes them quite expensive. Meat confit Confit of goose (confit d'oie) and duck (confit de canard) are usually prepared from the legs of the bird. The meat is salted and seasoned with herbs and slowly cooked submerged in its own rendered fat (never to exceed ), in which it is then preserved by allowing it to cool and storing it in the fat. Turkey and pork may be treated in the same manner. Meat confit
https://en.wikipedia.org/wiki/Formal%20Aspects%20of%20Computing
Formal Aspects of Computing (FAOC) is a peer-reviewed scientific journal covering the area of formal methods and associated topics in computer science. The editor-in-chief is Jim Woodcock. According to the Journal Citation Reports, the journal has a 2010 impact factor of 1.170. Until 2021 the journal was published by Springer Science+Business Media; it is now published by the Association for Computing Machinery (ACM). See also Acta Informatica Innovations in Systems and Software Engineering References External links Academic journals established in 1989 Computer science journals Formal methods publications British Computer Society Springer Science+Business Media academic journals Quarterly journals English-language journals
https://en.wikipedia.org/wiki/Two-element%20Boolean%20algebra
In mathematics and abstract algebra, the two-element Boolean algebra is the Boolean algebra whose underlying set (or universe or carrier) B is the Boolean domain. The elements of the Boolean domain are 1 and 0 by convention, so that B = {0, 1}. Paul Halmos's name for this algebra, "2", has some following in the literature, and will be employed here. Definition B is a partially ordered set and the elements of B are also its bounds. An operation of arity n is a mapping from B^n to B. Boolean algebra consists of two binary operations and unary complementation. The binary operations have been named and notated in various ways. Here they are called 'sum' and 'product', and notated by infix '+' and '∙', respectively. Sum and product commute and associate, as in the usual algebra of real numbers. As for the order of operations, brackets are decisive if present. Otherwise '∙' precedes '+'. Hence A∙B + C is parsed as (A∙B) + C and not as A∙(B + C). Complementation is denoted by writing an overbar over its argument. The numerical analog of the complement of X is 1 − X. In the language of universal algebra, a Boolean algebra is a ⟨B, +, ∙, ¯, 1, 0⟩ algebra of type ⟨2, 2, 1, 0, 0⟩. Either one-to-one correspondence between {0,1} and {True,False} yields classical bivalent logic in equational form, with complementation read as NOT. If 1 is read as True, '+' is read as OR, and '∙' as AND, and vice versa if 1 is read as False. These two operations define a commutative semiring, known as the Boolean semiring. Some basic identities 2 can be seen as grounded in the following trivial "Boolean" arithmetic: 1 + 1 = 1 + 0 = 0 + 1 = 1, 0 + 0 = 0, 1 ∙ 1 = 1, 1 ∙ 0 = 0 ∙ 1 = 0 ∙ 0 = 0, and the complements of 1 and 0 are 0 and 1, respectively. Note that: '+' and '∙' work exactly as in numerical arithmetic, except that 1+1=1. '+' and '∙' are derived by analogy from numerical arithmetic; simply set any nonzero number to 1. Swapping 0 and 1, and '+' and '∙' preserves truth; this is the essence of the duality pervading all Boolean algebras. This Boolean arithmetic suffices to verify any equation of 2, including the axioms, by examining every possible assignment of 0s and 1s to each var
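The brute-force verification mentioned in the last sentence can be made concrete with a short script. The sketch below is not from the article; the helper names are invented, and it simply checks a few identities of 2 by trying every assignment of 0s and 1s.

```python
from itertools import product

# Operations of the two-element Boolean algebra 2.
def bsum(x, y):  return 1 if (x or y) else 0   # '+' : note 1 + 1 = 1
def bprod(x, y): return x * y                  # '∙' : ordinary multiplication on {0, 1}
def bnot(x):     return 1 - x                  # complement: numerical analog 1 - x

# Each identity is a predicate that must hold for every assignment.
identities = {
    "idempotence: x + x = x": lambda x, y: bsum(x, x) == x,
    "absorption: x + (x ∙ y) = x": lambda x, y: bsum(x, bprod(x, y)) == x,
    "De Morgan: complement(x ∙ y) = complement(x) + complement(y)":
        lambda x, y: bnot(bprod(x, y)) == bsum(bnot(x), bnot(y)),
}

for name, holds in identities.items():
    assert all(holds(x, y) for x, y in product((0, 1), repeat=2)), name
print("all identities verified over {0, 1}")
```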
https://en.wikipedia.org/wiki/Single-responsibility%20principle
The single-responsibility principle (SRP) is a computer programming principle that states that "A module should be responsible to one, and only one, actor." The term actor refers to a group (consisting of one or more stakeholders or users) that requires a change in the module. Robert C. Martin, the originator of the term, expresses the principle as, "A class should have only one reason to change". Because of confusion around the word "reason" he has also clarified saying that the "principle is about people." In some of his talks, he also argues that the principle is, in particular, about roles or actors. For example, while they might be the same person, the role of an accountant is different from a database administrator. Hence, each module should be responsible for each role. History The term was introduced by Robert C. Martin in his article "The Principles of OOD" as part of his Principles of Object Oriented Design, made popular by his 2003 book Agile Software Development, Principles, Patterns, and Practices. Martin described it as being based on the principle of cohesion, as described by Tom DeMarco in his book Structured Analysis and System Specification, and Meilir Page-Jones in The Practical Guide to Structured Systems Design. In 2014 Martin published a blog post titled "The Single Responsibility Principle" with a goal to clarify what was meant by the phrase "reason for change." Example Martin defines a responsibility as a reason to change, and concludes that a class or module should have one, and only one, reason to be changed (e.g. rewritten). As an example, consider a module that compiles and prints a report. Imagine such a module can be changed for two reasons. First, the content of the report could change. Second, the format of the report could change. These two things change for different causes. The single-responsibility principle says that these two aspects of the problem are really two separate responsibilities, and should, therefore, be in separ
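The report example can be sketched in code. The classes and method names below are hypothetical illustrations, not taken from Martin's writing: one class owns the report's content (which changes when the data requirements change) and another owns its formatting (which changes when presentation requirements change), so each has only one actor that can force a change.

```python
# Hypothetical sketch of the single-responsibility principle: content
# generation and formatting are split so each class has one reason to change.

class ReportContent:
    """Owns what the report says; changes only when data requirements change."""
    def __init__(self, title, rows):
        self.title = title
        self.rows = rows              # e.g. a list of (label, value) pairs

class ReportFormatter:
    """Owns how the report looks; changes only when presentation changes."""
    def render_plain_text(self, report):
        lines = [report.title, "-" * len(report.title)]
        lines += [f"{label}: {value}" for label, value in report.rows]
        return "\n".join(lines)

report = ReportContent("Q1 Expenses", [("Travel", 1200), ("Hosting", 300)])
print(ReportFormatter().render_plain_text(report))
```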
https://en.wikipedia.org/wiki/Trusted%20path
A trusted path or trusted channel is a mechanism that provides confidence that the user is communicating with what the user intended to communicate with, ensuring that attackers can't intercept or modify whatever information is being communicated. The term was initially introduced by the Orange Book. As a security architecture concept, it can be implemented with any technical safeguards suitable for a particular environment and risk profile. Examples Electronic signature In Common Criteria and European Union electronic signature standards, trusted path and trusted channel describe techniques that prevent interception of or tampering with sensitive data as it passes through various system components: trusted path — protects data between the user and a security component (e.g. a PIN sent to a smart card to unblock it for digital signature), trusted channel — protects data between a security component and other information resources (e.g. data read from a file and sent to the smart card for signature). User login One popular technique for password stealing in Microsoft Windows was login spoofing, based on programs that simulated the operating system's login prompt. When users tried to log in, the fake login program captured their passwords for later use. As a safeguard, Windows NT introduced the Ctrl-Alt-Del sequence as a secure attention key to escape any third-party programs and invoke the system login prompt. A similar problem arises with websites requiring authentication, where the user is expected to enter their credentials without actually knowing whether the website is spoofed. HTTPS mitigates this attack by first authenticating the server to the user (using a trust anchor and the certification path validation algorithm), and only then displaying the login form.
https://en.wikipedia.org/wiki/AM%20expanded%20band
The extended mediumwave broadcast band, commonly known as the AM expanded band, refers to the broadcast station frequency assignments immediately above the earlier upper limits of 1600 kHz in International Telecommunication Union (ITU) Region 2 (the Americas), and 1602 kHz in ITU Regions 1 (Europe, northern Asia and Africa) and 3 (southern Asia and Oceania). In Region 2, this consists of ten additional frequencies, spaced 10 kHz apart, and running from 1610 kHz to 1700 kHz. In Regions 1 and 3, where frequency assignments are spaced nine kHz apart, the result is eleven additional frequencies, from 1611 kHz to 1701 kHz. ITU Region 1 Europe The extended band is not officially allocated in Europe, and the trend of national broadcasters in the region has been to reduce the number of their AM band stations in favor of FM and digital transmissions. However, new Low-Power AM (LPAM) stations have recently come on the air from countries like Finland, Sweden, Norway, Denmark, the Netherlands and Italy. These frequencies are also used by a number of "hobby" pirate radio stations, particularly in the Netherlands, Greece, and Serbia. Vatican Radio for many years transmitted on 1611 kHz, before ceasing broadcasts on this frequency in 2012. Since 2014 a licensed Norwegian project has been broadcasting both Radio Northern Star and The Sea on 1611 kHz. ITU Region 2 In 1979, a World Administrative Radio Conference (WARC-79) adopted "Radio Regulation No. 480", which stated that "In Region 2, the use of the band 1605-1705 kHz by stations of the broadcasting service shall be subject to a plan to be established by a regional administrative radio conference..." As a consequence, on June 8, 1988 an ITU-sponsored conference held at Rio de Janeiro, Brazil adopted provisions, effective July 1, 1990, to extend the upper end of the Region 2 AM broadcast band, by adding ten frequencies which spanned from 1610 kHz to 1700 kHz. The agreement provided for a standard transmitter power of 1 kilowa
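The channel counts quoted above follow directly from the spacing arithmetic; the short sketch below (not part of the article) simply enumerates the expanded-band centre frequencies under both conventions.

```python
# Expanded-band channels: 10 kHz spacing in ITU Region 2,
# 9 kHz spacing in ITU Regions 1 and 3.
region2 = list(range(1610, 1700 + 1, 10))     # 1610 ... 1700 kHz -> 10 channels
regions1_3 = list(range(1611, 1701 + 1, 9))   # 1611 ... 1701 kHz -> 11 channels

print(len(region2), "Region 2 channels (kHz):", region2)
print(len(regions1_3), "Region 1/3 channels (kHz):", regions1_3)
```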
https://en.wikipedia.org/wiki/Body%20in%20white
Body in white (BIW) is the stage in automobile manufacturing in which a car body's frame has been joined together, that is before painting and before the motor, chassis sub-assemblies, or trim (glass, door locks/handles, seats, upholstery, electronics, etc.) have been integrated into the structure. Assembly involves different techniques such as welding (spot, MIG/MAG, or friction stir), riveting, clinching, bonding and laser brazing. The term derives from manufacturing practices before steel unibody monocoques, when automobile bodies were made by outside firms on a separate chassis with an engine, suspension, and bumpers attached. The manufacturers built or purchased wooden bodies (with thin, non-structural metal sheets on the outside) to bolt onto the frame. The bodies were painted white prior to the final color. A folk etymology for "body in white" is the appearance of a car body after it is dipped into a white bath of primer (undercoat paint)— despite the primer's actual gray color. BIW could also refer to when car bodywork would be made of timber – all timber products, furniture, etc., are considered to be "in the white" when at the stage of raw timber before finishing or varnishing. In car design, the "body in white" phase is where the final contours of the car body are worked out, in preparation for the ordering of the expensive production stamping die. Extensive computer simulations of crash-worthiness, manufacturability, and automotive aerodynamics are required before a clay model from the design studio can be converted into a body in white ready for production. Factories may offer BIW cars to racers, who then may replace up to 90% of the car with aftermarket parts, and niche manufacturers like Ruf Automobile start their cars with BIWs from other makers. Related terms A related term in the automotive industry is "body in black". This can refer to a car body that is formed of alternate materials such as composites rather than conventional metal; these co
https://en.wikipedia.org/wiki/Japanese%20theorem%20for%20cyclic%20polygons
In geometry, the Japanese theorem states that no matter how one triangulates a cyclic polygon, the sum of inradii of triangles is constant. Conversely, if the sum of inradii is independent of the triangulation, then the polygon is cyclic. The Japanese theorem follows from Carnot's theorem; it is a Sangaku problem. Proof This theorem can be proven by first proving a special case: no matter how one triangulates a cyclic quadrilateral, the sum of inradii of triangles is constant. After proving the quadrilateral case, the general case of the cyclic polygon theorem is an immediate corollary. The quadrilateral rule can be applied to quadrilateral components of a general partition of a cyclic polygon, and repeated application of the rule, which "flips" one diagonal, will generate all the possible partitions from any given partition, with each "flip" preserving the sum of the inradii. The quadrilateral case follows from a simple extension of the Japanese theorem for cyclic quadrilaterals, which shows that a rectangle is formed by the two pairs of incenters corresponding to the two possible triangulations of the quadrilateral. The steps of this theorem require nothing beyond basic constructive Euclidean geometry. With the additional construction of a parallelogram having sides parallel to the diagonals, and tangent to the corners of the rectangle of incenters, the quadrilateral case of the cyclic polygon theorem can be proved in a few steps. The equality of the sums of the radii of the two pairs is equivalent to the condition that the constructed parallelogram be a rhombus, and this is easily shown in the construction. Another proof of the quadrilateral case is available due to Wilfred Reyes (2002). In the proof, both the Japanese theorem for cyclic quadrilaterals and the quadrilateral case of the cyclic polygon theorem are proven as a consequence of Thébault's problem III. See also Carnot's theorem, which is used in a proof of the theorem above Equal incircles th
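The statement can be written compactly; the following is a standard formulation consistent with (though not quoted from) the text above, for a cyclic polygon P triangulated into triangles T_1, ..., T_{n-2}.

```latex
% Japanese theorem: the sum of inradii is independent of the triangulation.
r(T_1) + r(T_2) + \cdots + r(T_{n-2}) = \text{constant over all triangulations of } P.
% Quadrilateral case ABCD: splitting by either diagonal gives the same sum,
r(\triangle ABC) + r(\triangle ACD) = r(\triangle ABD) + r(\triangle BCD).
```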
https://en.wikipedia.org/wiki/Source-specific%20multicast
Source-specific multicast (SSM) is a method of delivering multicast packets in which the only packets that are delivered to a receiver are those originating from a specific source address requested by the receiver. By so limiting the source, SSM reduces demands on the network and improves security. SSM requires that the receiver specify the source address and explicitly excludes the use of the (*,G) join for all multicast groups in RFC 3376, which is possible only in IPv4's IGMPv3 and IPv6's MLDv2. Any-source multicast (as counterexample) Source-specific multicast is best understood in contrast to any-source multicast (ASM). In the ASM service model a receiver expresses interest in traffic to a multicast address. The multicast network must discover all multicast sources sending to that address, and route data from all sources to all interested receivers. This behavior is particularly well suited to groupware applications where all participants in the group want to be aware of all other participants, and the list of participants is not known in advance. The source discovery burden on the network can become significant when the number of sources is large. Operation In the SSM service model, in addition to the receiver expressing interest in traffic to a multicast address, the receiver expresses interest in receiving traffic from only one specific source sending to that multicast address. This relieves the network of discovering many multicast sources and reduces the amount of multicast routing information that the network must maintain. SSM requires support in last-hop routers and in the receiver's operating system. SSM support is not required in other network components, including routers and even the sending host. Interest in multicast traffic from a specific source is conveyed from hosts to routers using IGMPv3 as specified in RFC 4607. SSM destination addresses must be in the ranges 232.0.0.0/8 for IPv4. For IPv6 current allowed SSM destination addre
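The contrast between SSM's (S,G) channels and ASM's (*,G) groups can be illustrated with a toy data structure. This is a conceptual sketch only, with invented names, not a real multicast API: traffic is accepted only when both the source and the group match a channel the receiver joined.

```python
# Toy model of SSM channel subscription state on a receiver.
class SsmSubscriptions:
    def __init__(self):
        self.channels = set()                 # set of (source, group) pairs

    def join(self, source, group):
        self.channels.add((source, group))

    def accepts(self, source, group):
        return (source, group) in self.channels

subs = SsmSubscriptions()
subs.join("198.51.100.7", "232.1.1.1")                # (S,G) channel in the 232/8 SSM range
print(subs.accepts("198.51.100.7", "232.1.1.1"))      # True: requested source and group
print(subs.accepts("203.0.113.9", "232.1.1.1"))       # False: same group, different source
```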
https://en.wikipedia.org/wiki/Multicast%20Listener%20Discovery
Multicast Listener Discovery (MLD) is a component of the Internet Protocol Version 6 (IPv6) suite. MLD is used by IPv6 routers for discovering multicast listeners on a directly attached link, much like the Internet Group Management Protocol (IGMP) is used in IPv4. The protocol is embedded in ICMPv6 instead of using a separate protocol. MLDv1 is similar to IGMPv2 and MLDv2 is similar to IGMPv3. Protocol MLDv2 Multicast Listener Report messages use ICMPv6 type 143. Support Several operating systems support MLDv2: Windows Vista and later FreeBSD since release 8.0 The Linux kernel since 2.5.68 macOS References Internet protocols Internet Standards Network layer protocols IPv6
https://en.wikipedia.org/wiki/ReserVec
ReserVec was a computerized reservation system developed by Ferranti Canada for Trans-Canada Airlines (TCA, today's Air Canada) in the late 1950s. It appears to be the first such system ever developed, predating the more famous SABRE system in the United States by about two years. Although Ferranti had high hopes that the system would be used by other airlines, no further sales were forthcoming and development of the system ended. Major portions of the transistor-based circuit design were put to good use in the Ferranti-Packard 6000 computer, which would later go on to see major sales in Europe as the ICT 1904. Background In the early 1950s the airline industry was undergoing explosive growth. A serious limiting factor was the time taken to make a single booking, which could take upwards of 90 minutes in total. TCA found their bookings typically involved between three and seven calls to the centralized booking centre in Toronto, where telephone operators would scan flight status displayed on a huge board showing all scheduled flights one month into the future. Bookings past that time could not be made, nor could an agent reliably know anything other than if the flight was full or not – to book two seats was much more complex, requiring the operator to find the "flight card" for that flight in a filing cabinet. In 1946 American Airlines decided to tackle this problem through automation, introducing the Reservisor, a simple electromechanical computer based on telephone switching systems. Newer versions of the Reservisor included magnetic drum systems for storing flight information further into the future. The ultimate version of the system, the Magnetronic Reservisor, was installed in 1956 and could store data for 2,000 flights a day up to one month into the future. Reservisors were later sold to a number of airlines, as well as Sheraton for hotel bookings, and Goodyear for inventory control. TCA experiments TCA was aware of the Reservisor, but was unimpressed by i
https://en.wikipedia.org/wiki/Ehrenfest%20theorem
The Ehrenfest theorem, named after Austrian theoretical physicist Paul Ehrenfest, relates the time derivative of the expectation values of the position and momentum operators x and p to the expectation value of the force on a massive particle moving in a scalar potential V(x): m d⟨x⟩/dt = ⟨p⟩ and d⟨p⟩/dt = −⟨V′(x)⟩ = ⟨F(x)⟩. The Ehrenfest theorem is a special case of a more general relation between the expectation of any quantum mechanical operator and the expectation of the commutator of that operator with the Hamiltonian of the system, d⟨A⟩/dt = (1/iħ)⟨[A, H]⟩ + ⟨∂A/∂t⟩, where A is some quantum mechanical operator and ⟨A⟩ is its expectation value. It is most apparent in the Heisenberg picture of quantum mechanics, where it amounts to just the expectation value of the Heisenberg equation of motion. It provides mathematical support to the correspondence principle. The reason is that Ehrenfest's theorem is closely related to Liouville's theorem of Hamiltonian mechanics, which involves the Poisson bracket instead of a commutator. Dirac's rule of thumb suggests that statements in quantum mechanics which contain a commutator correspond to statements in classical mechanics where the commutator is supplanted by a Poisson bracket multiplied by iħ. This makes the operator expectation values obey corresponding classical equations of motion, provided the Hamiltonian is at most quadratic in the coordinates and momenta. Otherwise, the evolution equations still may hold approximately, provided fluctuations are small. Relation to classical physics Although, at first glance, it might appear that the Ehrenfest theorem is saying that the quantum mechanical expectation values obey Newton's classical equations of motion, this is not actually the case. If the pair (⟨x⟩, ⟨p⟩) were to satisfy Newton's second law, the right-hand side of the second equation would have to be −V′(⟨x⟩), which is typically not the same as −⟨V′(x)⟩. If, for example, the potential V(x) is cubic (i.e. proportional to x³), then V′ is quadratic (proportional to x²). This means, in the case of Newton's second law, the right side would
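For reference, the standard textbook forms of these relations (consistent with the reconstruction above, not quoted from the article) are:

```latex
% Ehrenfest's equations for a particle of mass m in a potential V(x):
m\,\frac{d}{dt}\langle x\rangle = \langle p\rangle, \qquad
\frac{d}{dt}\langle p\rangle = -\,\langle V'(x)\rangle = \langle F(x)\rangle .
% General form for an operator A and Hamiltonian H:
\frac{d}{dt}\langle A\rangle
  = \frac{1}{i\hbar}\,\langle [A, H]\rangle
  + \left\langle \frac{\partial A}{\partial t}\right\rangle .
```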
https://en.wikipedia.org/wiki/ZEBRA%20%28computer%29
The ZEBRA (Zeer Eenvoudige Binaire Reken Automaat translated Very Simple Binary Automatic Calculator) was one of the first computers to be designed in the Netherlands, (the first one was the "ARRA") and one of the first Dutch computers to be commercially available. It was designed by Willem van der Poel of the Netherlands Post, Telegraph and Telephone, and first delivered in 1958. The production run consisted of fifty-five machines, manufactured and marketed by the British company Standard Telephones and Cables, Ltd. The ZEBRA was a binary, two-address machine with a 33-bit word length. Storage was provided by a magnetic drum memory holding 8K words organised as 256 tracks of 32 instructions; accumulators were also implemented as recirculating drum tracks in a manner similar to that used in the Bendix G-15. Peripherals included paper tape reader and punch, and a teleprinter. In 1967, six Zebra computers were in use in UK universities and technical colleges. Programming Large parts of the code and operating systems for ZEBRA were written by deafblind mathematician Gerrit van der Mey. In contrast to most processors, the ZEBRA didn't have different types of instructions. Instead, the operation of an instruction was controlled by fifteen bits in the operation field. Also, it didn't have a program counter in the traditional sense. The ZEBRA instruction word is 33 bits, consisting of a 13-bit drum address, referencing one of the 256 tracks of 32 entries on the memory drum, a five-bit register (or I/O) address, and the 15-bit operation field. Each bit of the operation field had a distinct meaning and could be used in nearly any combination, leading to many elegant tricks that today might be considered the domain of microprogramming. The operation bits determined things as the sign of the data to be used; if the accumulator was cleared (changing an addition into a load), if a rotation was to be applied, and so on. Also, there was an operation bits that determined if
https://en.wikipedia.org/wiki/International%20Conference%20on%20Software%20Engineering
The International Conference on Software Engineering (ICSE), is one of the largest annual software engineering conferences. It has an 'A*' rating in the Rankings of the Computing Research and Education Association of Australasia (CORE) and an 'A1' rating from the Brazilian ministry of education. Furthermore, it is the software engineering conference with the highest Microsoft Academic field rating. The first ICSE conference was in 1975 in Washington DC. List of Conferences Past and future ICSE conferences include: References External links Pointers to ICSE conference websites ICSE Steering Committee Information Software engineering conferences
https://en.wikipedia.org/wiki/Web%20fiction
Web fiction is written works of literature available primarily or solely on the Internet. A common type of web fiction is the web serial. The term comes from old serial stories that were once published regularly in newspapers and magazines. Unlike most modern books, a work of web fiction is often not published as a whole. Instead, it is released on the Internet in installments or chapters as they are finished, although published compilations and anthologies are not unknown. The web serial form dominates in the category of fan fiction, as writing a serial takes less specialized software and often less time than an ebook. Web-based fiction dates to the earliest days of the World Wide Web, including the extremely popular The Spot (1995–1997), a tale told through characters' journal entries and interactivity with its audience. The Spot spawned many similar sites, including Ferndale and East Village, though these were not as successful and did not last long. Most of these early ventures are no longer in existence. Since 2008, web fiction has proliferated in popularity. Possibly as a result of this, more fans of web serials have decided to create their own, propagating the form further, leading to the number of serious, original works growing quickly. Some serials utilize the formats of the media to include things not possible in ordinary books, such as clickable maps, pop-up character bios, sorting posts by tag, and video. Web fiction has become hugely popular in China, with revenues topping US$2.5 billion. Publication formats Some of the most popular platforms for publishing web serials and webcomics include Archive of our Own (AO3), Wattpad, Fanfiction.net, Webtoon, Tapas, LiveJournal, and Radish. With their large user bases, the popularity of these sites may arise from their interactive aspects allowing creators, readers, and other users to communicate with one another and create new communities. Another format in use is the internet forum. A free forum service
https://en.wikipedia.org/wiki/Interactive%20Connectivity%20Establishment
Interactive Connectivity Establishment (ICE) is a technique used in computer networking to find ways for two computers to talk to each other as directly as possible in peer-to-peer networking. This is most commonly used for interactive media such as Voice over Internet Protocol (VoIP), peer-to-peer communications, video, and instant messaging. In such applications, communicating through a central server would be slow and expensive, but direct communication between client applications on the Internet is very tricky due to network address translators (NATs), firewalls, and other network barriers. ICE is developed by the Internet Engineering Task Force MMUSIC working group and is published as RFC 8445, as of August 2018, and has obsolesced both RFC 5245 and RFC 4091. Overview Network address translation (NAT) became an effective technique in delaying the exhaustion of the available address pool of Internet Protocol version 4, which is inherently limited to around four billion unique addresses. NAT gateways track outbound requests from a private network and maintain the state of each established connection to later direct responses from the peer on the public network to the peer in the private network, which would otherwise not be directly addressable. VoIP, peer-to-peer, and many other applications require address information of communicating peers within the data streams of the connection, rather than only in the Internet Protocol packet headers. For example, the Session Initiation Protocol (SIP) communicates the IP address of network clients for registration with a location service, so that telephone calls may be routed to registered clients. ICE provides a framework with which a communicating peer may discover and communicate its public IP address so that it can be reached by other peers. Session Traversal Utilities for NAT (STUN) is a standardized protocol for such address discovery including NAT classification. Traversal Using Relays around NAT (TURN) places
https://en.wikipedia.org/wiki/Nintendo%20Wi-Fi%20Connection
Nintendo Wi-Fi Connection (WFC) is a discontinued online multiplayer gaming service run by Nintendo to provide free online play in compatible Nintendo DS and Wii games. The service included the company's Wii Shop Channel and DSi Shop game download services. It also ran features for the Wii and Nintendo DS systems. Games designed to take advantage of Nintendo Wi-Fi Connection offered Internet play integrated into the game. When promoting this service, Nintendo emphasized the simplicity and speed of starting an online game. For example, in Mario Kart DS, an online game was initiated by selecting the online multiplayer option from the main menu, then choosing whether to play with friends, or to play with other players (either in the local region or worldwide) at about the same skill level. After a selection was made, the game started searching for an available player. On January 26, 2012, Nintendo Wi-Fi Connection service was succeeded by and absorbed into Nintendo Network. This new online system unified the 3DS and Wii U platforms and replaced Friend Codes, while providing paid downloadable content, an online community style multiplayer system, and personal accounts. Nintendo Network is fully supported on the Nintendo 3DS and on the Wii U, whilst still continuing providing partial legacy support for both Wii and Nintendo DS under the Nintendo Wi-Fi Connection brand. Specifically, the Wii U can boot to Wii mode and then access the Wii Message Board messages which have been recorded by the gameplay progress of compatible local games, but it cannot send Wii Message Board messages remotely between different machines. On May 20, 2014, at 7:30 PM PT, Nintendo ended the Nintendo Wi-Fi Connection service (WFC) for all games, except for Nintendo Wi-Fi Connection pay and play branded games, Nintendo DSi Shop services (which were terminated on March 31, 2017), and Nintendo Wii Shop Channel services (ended on January 30, 2019). After the end of the service, there have been v
https://en.wikipedia.org/wiki/Tannaka%E2%80%93Krein%20duality
In mathematics, Tannaka–Krein duality theory concerns the interaction of a compact topological group and its category of linear representations. It is a natural extension of Pontryagin duality, between compact and discrete commutative topological groups, to groups that are compact but noncommutative. The theory is named after Tadao Tannaka and Mark Grigorievich Krein. In contrast to the case of commutative groups considered by Lev Pontryagin, the notion dual to a noncommutative compact group is not a group, but a category of representations Π(G) with some additional structure, formed by the finite-dimensional representations of G. Duality theorems of Tannaka and Krein describe the converse passage from the category Π(G) back to the group G, allowing one to recover the group from its category of representations. Moreover, they in effect completely characterize all categories that can arise from a group in this fashion. Alexander Grothendieck later showed that by a similar process, Tannaka duality can be extended to the case of algebraic groups via Tannakian formalism. Meanwhile, the original theory of Tannaka and Krein continued to be developed and refined by mathematical physicists. A generalization of Tannaka–Krein theory provides the natural framework for studying representations of quantum groups, and is currently being extended to quantum supergroups, quantum groupoids and their dual Hopf algebroids. The idea of Tannaka–Krein duality: category of representations of a group In Pontryagin duality theory for locally compact commutative groups, the dual object to a group G is its character group which consists of its one-dimensional unitary representations. If we allow the group G to be noncommutative, the most direct analogue of the character group is the set of equivalence classes of irreducible unitary representations of G. The analogue of the product of characters is the tensor product of representations. However, irreducible representations of G in general
https://en.wikipedia.org/wiki/NAND%20gate
In digital electronics, a NAND gate (NOT-AND) is a logic gate which produces an output which is false only if all its inputs are true; thus its output is the complement of that of an AND gate. A LOW (0) output results only if all the inputs to the gate are HIGH (1); if any input is LOW (0), a HIGH (1) output results. A NAND gate is made using transistors and junction diodes. By De Morgan's laws, a two-input NAND gate's logic may be expressed as NOT(A ∙ B) = NOT(A) + NOT(B), making a NAND gate equivalent to inverters followed by an OR gate. The NAND gate is significant because any boolean function can be implemented by using a combination of NAND gates. This property is called functional completeness. It shares this property with the NOR gate. Digital systems employing certain logic circuits take advantage of NAND's functional completeness. NAND gates with two or more inputs are available as integrated circuits in transistor-transistor logic, CMOS, and other logic families. Logic The NAND function of A and B is logically equivalent to NOT(A AND B). One way of expressing A NAND B is A ∧ B under an overbar, where the symbol ∧ signifies AND and the bar signifies the negation of the expression under it: in essence, simply ¬(A ∧ B). Symbols There are three symbols for NAND gates: the MIL/ANSI symbol, the IEC symbol and the deprecated DIN symbol sometimes found on old schematics. For more information see logic gate symbols. The ANSI symbol for the NAND gate is a standard AND gate with an inversion bubble connected. Hardware design and pinout NAND gates are basic logic gates, and as such they are recognised in TTL and CMOS ICs. The standard, 4000 series, CMOS IC is the 4011, which includes four independent, two-input, NAND gates. These devices are available from many semiconductor manufacturers. These are usually available in both through-hole DIL and SOIC format. Datasheets are readily available in most datasheet databases. The standard 2-, 3-, 4- and 8-input NAND gates are available: CMOS 4011: Quad 2-input NAND gate 4023: Triple 3-input NAN
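Functional completeness is easy to demonstrate in software. The sketch below is illustrative, not from the article: it builds NOT, AND and OR purely from a two-input NAND and checks them against Python's own bitwise operators on {0, 1}.

```python
from itertools import product

def nand(a, b):
    return 0 if (a and b) else 1

# Derived gates, built only from NAND (functional completeness).
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))   # De Morgan: A + B = NOT(NOT A AND NOT B)

for a, b in product((0, 1), repeat=2):
    assert not_(a)    == 1 - a
    assert and_(a, b) == (a & b)
    assert or_(a, b)  == (a | b)
print("NOT, AND and OR all realised using only NAND")
```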
https://en.wikipedia.org/wiki/Fluorescence%20correlation%20spectroscopy
Fluorescence correlation spectroscopy (FCS) is a statistical analysis, via time correlation, of stationary fluctuations of the fluorescence intensity. Its theoretical underpinning originated from L. Onsager's regression hypothesis. The analysis provides kinetic parameters of the physical processes underlying the fluctuations. One of the interesting applications of this is an analysis of the concentration fluctuations of fluorescent particles (molecules) in solution. In this application, the fluorescence emitted from a very tiny space in solution containing a small number of fluorescent particles (molecules) is observed. The fluorescence intensity is fluctuating due to Brownian motion of the particles. In other words, the number of the particles in the sub-space defined by the optical system is randomly changing around the average number. The analysis gives the average number of fluorescent particles and average diffusion time, when the particle is passing through the space. Eventually, both the concentration and size of the particle (molecule) are determined. Both parameters are important in biochemical research, biophysics, and chemistry. FCS is such a sensitive analytical tool because it observes a small number of molecules (nanomolar to picomolar concentrations) in a small volume (~1μm3). In contrast to other methods (such as HPLC analysis) FCS has no physical separation process; instead, it achieves its spatial resolution through its optics. Furthermore, FCS enables observation of fluorescence-tagged molecules in the biochemical pathway in intact living cells. This opens a new area, "in situ or in vivo biochemistry": tracing the biochemical pathway in intact cells and organs. Commonly, FCS is employed in the context of optical microscopy, in particular Confocal microscopy or two-photon excitation microscopy. In these techniques light is focused on a sample and the measured fluorescence intensity fluctuations (due to diffusion, physical or chemical reactions,
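The "statistical analysis via time correlation" can be sketched numerically: given a recorded intensity trace, FCS works with the normalised autocorrelation of its fluctuations, which is then fitted to a physical (e.g. diffusion) model. The trace below is a synthetic stand-in invented for illustration; real analysis uses measured photon counts and a proper model fit.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for an intensity trace F(t): a slowly varying
# component plus uncorrelated noise (arbitrary units).
n = 20000
slow = np.convolve(rng.normal(size=n), np.ones(50) / 50, mode="same")
trace = 100 + 5 * slow + rng.normal(scale=1.0, size=n)

def fcs_autocorrelation(f, max_lag):
    """G(tau) = <dF(t) dF(t+tau)> / <F>^2 for lags 1..max_lag."""
    df = f - f.mean()
    denom = f.mean() ** 2
    return np.array([np.mean(df[:-lag] * df[lag:]) / denom
                     for lag in range(1, max_lag + 1)])

g = fcs_autocorrelation(trace, max_lag=200)
print("G(1) =", float(g[0]), " G(200) =", float(g[-1]))   # decays towards zero
```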
https://en.wikipedia.org/wiki/Siegfried%20Popper
Siegfried Popper (5 January 1848, Prague – 19 April 1933, Prague) was an eminent naval architect in late-nineteenth- and early twentieth-century middle Europe. Biography Popper was born in Prague to Joachim Popper, a fine goods dealer (Galanteriewarenhändler), and Anna Schulhof. He attended the Nikolander Realschule (Technical School) before attending the Technical University in Prague for one year. He gained a degree in mechanical engineering after a three-year study at Karlsruhe Institute of Technology. Naval career Popper spent three years at various engineering works in Prague before joining the Austro-Hungarian Navy on 1 December 1869 as a draughtsman. He began working on ship design in 1887, when he prepared plans for the torpedo cruiser . He rose to the rank of Schiffbau-General-Ingenieur (engineering admiral), a rank which was created for him and which was conferred on 30 April 1904. He was responsible for the design of all the ships of the navy built until his retirement on 1 April 1907. He was granted an honorary doctorate by Vienna University in 1916 and returned it, about 1930, when the university introduced a numerus clausus. Kaiser Wilhelm II offered him a chair in naval architecture at the Technical University of Berlin at Charlottenburg, but Popper declined this offer. By the time he retired, Popper was in poor health. He was hard of hearing and his vision was poor. After retiring from the navy he worked as a consultant for Stabilimento Tecnico Triestino (S.T.T.), at that time the only fabricator of large warships in the monarchy. Retirement and death After retirement Popper devoted much of his time to translating Hebrew literature into German. Because of his deafness he was run down by a tram in Prague and died several days later, on 19 April 1933. References 1848 births 1933 deaths Naval architects from Austria-Hungary Architects from Prague Austro-Hungarian Navy officers
https://en.wikipedia.org/wiki/Eizo
EIZO Corporation is a Japanese visual technology company, founded in March 1968, which manufactures display products and other solutions for markets such as business, healthcare, graphics, air traffic control, and maritime. The company is headquartered in Hakusan, Ishikawa Prefecture. Name The name EIZO, pronounced AY-ZO, comes from the Japanese kanji meaning "image" (映像). History Nanao Electric Co., Ltd. was founded in Fukui, Ishikawa prefecture in 1967. The following year, Hakui Electronic Corporation was founded in Hakui, Ishikawa; it initially manufactured televisions. In March 1973, it became Nanao Corporation. In 1976, the company began to manufacture industrial monitors, and in 1978 it entered the gaming market by manufacturing CRT arcade game cabinets for Space Invaders and selling tabletop video arcade machines. In 1980, the company acquired video game developer and publisher Irem Corporation. In 1981, it began production of computer monitors, video cassette recorders and radio cassette TVs, with a new factory opening in Hakusan. In 1984 the company began expanding overseas, distributing in Europe under the brand EIZO, with Hitec Associates Ltd established as a sales subsidiary specifically for the European market, and Nanao USA in California, United States, to launch products in that region under the same "Nanao" brand as in Japan. "EIZO" was launched as a brand of Hitec Associates in Europe in 1985, and the European arm was renamed the Eizo Corporation in January 1990. In 1990, the company's headquarters moved to Mattō, Ishikawa, with production and sales of computer monitors under the brand NANAO beginning the following year. In 1996, the brand was unified under the name EIZO. In 1997, Irem Software Engineering Inc. was established as a subsidiary company replacing Irem Corporation. The following couple of years saw the introduction of the FlexScan L23 13.8 LCD monitor and the FlexScan L66, the world's first 1280 x 1024 resolution monitor. In 1999, Nanao Corporation a
https://en.wikipedia.org/wiki/Recursive%20Bayesian%20estimation
In probability theory, statistics, and machine learning, recursive Bayesian estimation, also known as a Bayes filter, is a general probabilistic approach for estimating an unknown probability density function (PDF) recursively over time using incoming measurements and a mathematical process model. The process relies heavily upon mathematical concepts and models that are theorized within a study of prior and posterior probabilities known as Bayesian statistics. In robotics A Bayes filter is an algorithm used in computer science for calculating the probabilities of multiple beliefs to allow a robot to infer its position and orientation. Essentially, Bayes filters allow robots to continuously update their most likely position within a coordinate system, based on the most recently acquired sensor data. This is a recursive algorithm. It consists of two parts: prediction and innovation. If the variables are normally distributed and the transitions are linear, the Bayes filter becomes equal to the Kalman filter. In a simple example, a robot moving throughout a grid may have several different sensors that provide it with information about its surroundings. The robot may start out with certainty that it is at position (0,0). However, as it moves farther and farther from its original position, the robot has continuously less certainty about its position; using a Bayes filter, a probability can be assigned to the robot's belief about its current position, and that probability can be continuously updated from additional sensor information. Model The measurements are the manifestations of a hidden Markov model (HMM), which means the true state is assumed to be an unobserved Markov process. The following picture presents a Bayesian network of a HMM. Because of the Markov assumption, the probability of the current true state given the immediately previous one is conditionally independent of the other earlier states. Similarly, the measurement at the k-th timestep is de
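The grid-robot example can be sketched as a one-dimensional discrete Bayes filter. The motion and sensor models below are invented for illustration (they are not from the article), but the predict/update structure is the general one described above.

```python
import numpy as np

n_cells = 10
belief = np.ones(n_cells) / n_cells          # start with a uniform belief

def predict(belief, p_move=0.8):
    """Motion model: the robot intends to move one cell right but may stay put
    (a wrap-around world is used purely for simplicity)."""
    moved = np.roll(belief, 1)
    return p_move * moved + (1 - p_move) * belief

def update(belief, measured_cell, p_hit=0.9):
    """Sensor model: the sensor reports the correct cell with probability p_hit."""
    likelihood = np.full(n_cells, (1 - p_hit) / (n_cells - 1))
    likelihood[measured_cell] = p_hit
    posterior = likelihood * belief
    return posterior / posterior.sum()        # normalise to a probability distribution

for measurement in [1, 2, 3]:                 # three noisy position readings
    belief = update(predict(belief), measurement)

print("most likely cell:", int(np.argmax(belief)))
```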
https://en.wikipedia.org/wiki/Left%20shift%20%28medicine%29
Left shift or blood shift is an increase in the number of immature cell types among the blood cells in a sample of blood. Many (perhaps most) clinical mentions of left shift refer to the white blood cell lineage, particularly neutrophil-precursor band cells, thus signifying bandemia. Less commonly, left shift may also refer to a similar phenomenon in the red blood cell lineage in severe anemia, when increased reticulocytes and immature erythrocyte-precursor cells appear in the peripheral circulation. Definition The standard definition of a left shift is an absolute band form count greater than 7700/microL. There are competing explanations for the origin of the phrase "left shift," including the left-most button arrangement of early cell sorting machines and a 1920s publication by Josef Arneth, containing a graph in which immature neutrophils, with fewer segments, shifted the median left. In the latter view, the name reflects a curve's preponderance shifting to the left on a graph of hematopoietic cellular differentiations. Morphology It is usually noted on microscopic examination of a blood smear. This systemic effect of inflammation is most often seen in the course of an active infection and during other severe illnesses such as hypoxia and shock. Döhle bodies may also be present in the neutrophil's cytoplasm in the setting of sepsis or severe inflammatory responses. Pathogenesis It is believed that cytokines (including IL-1 and TNF) accelerate the release of cells from the postmitotic reserve pool in the bone marrow, leading to an increased number of immature cells. See also Leukocytosis Reticulocyte References Pathology Hematopathology
https://en.wikipedia.org/wiki/Protein%20subcellular%20localization%20prediction
Protein subcellular localization prediction (or just protein localization prediction) involves the prediction of where a protein resides in a cell, its subcellular localization. In general, prediction tools take as input information about a protein, such as a protein sequence of amino acids, and produce a predicted location within the cell as output, such as the nucleus, Endoplasmic reticulum, Golgi apparatus, extracellular space, or other organelles. The aim is to build tools that can accurately predict the outcome of protein targeting in cells. Prediction of protein subcellular localization is an important component of bioinformatics based prediction of protein function and genome annotation, and it can aid the identification of drug targets. Background Experimentally determining the subcellular localization of a protein can be a laborious and time consuming task. Immunolabeling or tagging (such as with a green fluorescent protein) to view localization using fluorescence microscope are often used. A high throughput alternative is to use prediction. Through the development of new approaches in computer science, coupled with an increased dataset of proteins of known localization, computational tools can now provide fast and accurate localization predictions for many organisms. This has resulted in subcellular localization prediction becoming one of the challenges being successfully aided by bioinformatics, and machine learning. Many prediction methods now exceed the accuracy of some high-throughput laboratory methods for the identification of protein subcellular localization. Particularly, some predictors have been developed that can be used to deal with proteins that may simultaneously exist, or move between, two or more different subcellular locations. Experimental validation is typically required to confirm the predicted localizations. Tools In 1999 PSORT was the first published program to predict subcellular localization. Subsequent tools and websites
https://en.wikipedia.org/wiki/Joomla
Joomla (), also spelled Joomla! (with an exclamation mark) and sometimes abbreviated as J!, is a free and open-source content management system (CMS) for publishing web content on websites. Web content applications include discussion forums, photo galleries, e-Commerce and user communities and numerous other web-based applications. Joomla is developed by a community of volunteers supported with the legal, organisational and financial resources of Open Source Matters, Inc. Joomla is written in Hypertext Preprocessor (PHP), uses object-oriented programming techniques and software design patterns, and stores data in a Structured Query Language (MySQL) database. It has a software dependency on the Symfony PHP framework. Joomla includes features such as page caching, RSS feeds, blogs, search, and support for language internationalisation. It is built on a model–view–controller web application framework that can be used independently of the CMS. Around 6,000 extensions are available from the Joomla website, and more are available from other sources. As of 2022, it was estimated to be the fifth most used CMS on the Internet, after WordPress, Shopify, Wix and Squarespace. Overview Joomla has a web template system using a template processor. Its architecture is a front controller, routing all requests for non-static URIs via PHP which parses the URI and identifies the target page. This allows support for more human-readable permalinks. The controller manages both the frontend, public-facing view, and a backend (GUI-driven) administration interface. The administration interface (a) stores management and content information within a database, and (b) maintains a configuration file (, usually located in the file system root of the Joomla installation). The configuration file provides the connection between the server, database and file system and facilitates migrating the website from one server to another. The backend interface allows website operators to manage use
https://en.wikipedia.org/wiki/Cycles%20per%20instruction
In computer architecture, cycles per instruction (aka clock cycles per instruction, clocks per instruction, or CPI) is one aspect of a processor's performance: the average number of clock cycles per instruction for a program or program fragment. It is the multiplicative inverse of instructions per cycle. Definition The average of Cycles Per Instruction in a given process (CPI) is defined by the following weighted average: CPI = Σ_i (IC_i × CC_i) / IC, where IC_i is the number of instructions for a given instruction type i, CC_i is the clock-cycles for that instruction type, and IC = Σ_i IC_i is the total instruction count. The summation sums over all instruction types for a given benchmarking process. Explanation Let us assume a classic RISC pipeline, with the following five stages: Instruction fetch cycle (IF). Instruction decode/Register fetch cycle (ID). Execution/Effective address cycle (EX). Memory access (MEM). Write-back cycle (WB). Each stage requires one clock cycle and an instruction passes through the stages sequentially. Without pipelining, in a multi-cycle processor, a new instruction is fetched in stage 1 only after the previous instruction finishes at stage 5, therefore the number of clock cycles it takes to execute an instruction is five (CPI = 5 > 1). In this case, the processor is said to be subscalar. With pipelining, a new instruction is fetched every clock cycle by exploiting instruction-level parallelism, therefore, since one could theoretically have five instructions in the five pipeline stages at once (one instruction per stage), a different instruction would complete stage 5 in every clock cycle and on average the number of clock cycles it takes to execute an instruction is 1 (CPI = 1). In this case, the processor is said to be scalar. With a single-execution-unit processor, the best CPI attainable is 1. However, with a multiple-execution-unit processor, one may achieve even better CPI values (CPI < 1). In this case, the processor is said to be superscalar. To get better CPI value
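The weighted-average definition is easy to evaluate for a concrete instruction mix; the mix and cycle counts below are made up purely for illustration.

```python
# Hypothetical instruction mix for one benchmark run:
# (instruction type, count IC_i, cycles per instruction CC_i)
mix = [
    ("ALU",    5_000_000, 1),
    ("load",   2_000_000, 3),
    ("store",  1_000_000, 2),
    ("branch", 2_000_000, 2),
]

total_instructions = sum(count for _, count, _ in mix)
total_cycles = sum(count * cycles for _, count, cycles in mix)
cpi = total_cycles / total_instructions

print(f"CPI = {cpi:.2f}, IPC = {1 / cpi:.2f}")   # IPC is the multiplicative inverse
```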
https://en.wikipedia.org/wiki/Planetary%20habitability
Planetary habitability is the measure of a planet's or a natural satellite's potential to develop and maintain environments hospitable to life. Life may be generated directly on a planet or satellite endogenously or be transferred to it from another body, through a hypothetical process known as panspermia. Environments do not need to contain life to be considered habitable nor are accepted habitable zones (HZ) the only areas in which life might arise. As the existence of life beyond Earth is unknown, planetary habitability is largely an extrapolation of conditions on Earth and the characteristics of the Sun and Solar System which appear favorable to life's flourishing. Of particular interest are those factors that have sustained complex, multicellular organisms on Earth and not just simpler, unicellular creatures. Research and theory in this regard is a component of a number of natural sciences, such as astronomy, planetary science and the emerging discipline of astrobiology. An absolute requirement for life is an energy source, and the notion of planetary habitability implies that many other geophysical, geochemical, and astrophysical criteria must be met before an astronomical body can support life. In its astrobiology roadmap, NASA has defined the principal habitability criteria as "extended regions of liquid water, conditions favorable for the assembly of complex organic molecules, and energy sources to sustain metabolism". In August 2018, researchers reported that water worlds could support life. Habitability indicators and biosignatures must be interpreted within a planetary and environmental context. In determining the habitability potential of a body, studies focus on its bulk composition, orbital properties, atmosphere, and potential chemical interactions. Stellar characteristics of importance include mass and luminosity, stable variability, and high metallicity. Rocky, wet terrestrial-type planets and moons with the potential for Earth-like chemistry ar
https://en.wikipedia.org/wiki/History%20of%20Apple%20Inc.
Apple Inc., originally named Apple Computer, Inc., is a multinational corporation that creates and markets consumer electronics and attendant computer software, and is a digital distributor of media content. Apple's core product lines are the iPhone smartphone, iPad tablet computer, and the Macintosh personal computer. The company offers its products online and has a chain of retail stores known as Apple Stores. Founders Steve Jobs, Steve Wozniak, and Ronald Wayne created Apple Computer Co. on April 1, 1976, to market Wozniak's Apple I desktop computer, and Jobs and Wozniak incorporated the company on January 3, 1977, in Cupertino, California. For more than three decades, Apple Computer was predominantly a manufacturer of personal computers, including the Apple II, Macintosh, and Power Mac lines, but it faced rocky sales and low market share during the 1990s. Jobs, who had been ousted from the company in 1985, returned to Apple in 1997 after his company NeXT was bought by Apple. The following year he became the company's interim CEO, which later became permanent. Jobs subsequently instilled a new corporate philosophy of recognizable products and simple design, starting with the original iMac in 1998. With the introduction of the successful iPod music player in 2001 and iTunes Music Store in 2003, Apple established itself as a leader in the consumer electronics and media sales industries, leading it to drop "Computer" from the company's name in 2007. The company is also known for its iOS range of smart phone, media player, and tablet computer products that began with the iPhone, followed by the iPod Touch and then iPad. As of June 30, 2015, Apple was the largest publicly traded corporation in the world by market capitalization, with an estimated value of US$1 trillion as of August 2, 2018. Apple's worldwide annual revenue in 2010 totaled US$65 billion, growing to US$127.8 billion in 2011 and $156 billion in 2012. 1971–1985: Jobs and Wozniak Pre-foundation Steve
https://en.wikipedia.org/wiki/Langmuir%20%28unit%29
The langmuir (symbol: L) is a unit of exposure (or dosage) to a surface (e.g. of a crystal) and is used in ultra-high vacuum (UHV) surface physics to study the adsorption of gases. It is a practical unit, and is not dimensionally homogeneous, and so is used only in this field. It is named after American physicist Irving Langmuir. Definition The langmuir is defined by multiplying the pressure of the gas by the time of exposure. One langmuir corresponds to an exposure of 10^−6 Torr during one second. For example, exposing a surface to a gas pressure of 10^−8 Torr for 100 seconds corresponds to 1 L. Similarly, keeping the pressure of oxygen gas at 2.5·10^−6 Torr for 40 seconds will give a dose of 100 L. Conversion Since both different pressures and exposure times can give the same langmuir (see Definition) it can be difficult to convert Langmuir (L) to exposure pressure × time (Torr·s) and vice versa. The following equation can be used to easily convert between the two: a × 10^−z [Torr] × b × 10^(z−6) [s] = a·b [L]. Here, a and b are any two numbers whose product equals the desired Langmuir value, and z is an integer allowing different magnitudes of pressure or exposure time to be used in the conversion. The units are represented in the [square brackets]. Using the prior example, for a dose of 100 L a pressure of 2.5 × 10^−6 Torr can be applied for 40 seconds, thus a = 2.5, b = 40, and z = 6. However, this dosage could also be gained with 8 × 10^−8 Torr for 1250 seconds, here a = 8, b = 12.5, z = 8. In both scenarios a·b = 100. Derivation Exposure of a surface in surface physics is a type of fluence, that is the integral of number flux (J_N) with respect to exposed time (t) to give a number of particles per unit area (Φ): Φ = ∫ J_N dt. The number flux for an ideal gas, that is the number of gas molecules passing through (in a single direction) a surface of unit area in unit time, can be derived from kinetic theory: J_N = C·c̄/4, where C is the number density of the gas, and c̄ is the mean speed of the molecules (not the root-mean-square speed, although the two are related). The number density of a
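The conversion itself is a one-line calculation; a minimal sketch (not part of the article), using 1 L = 10^−6 Torr·s, reproduces the worked examples above.

```python
TORR_SECONDS_PER_LANGMUIR = 1e-6   # 1 L = 1e-6 Torr * s

def dose_in_langmuirs(pressure_torr, time_s):
    return pressure_torr * time_s / TORR_SECONDS_PER_LANGMUIR

print(dose_in_langmuirs(1e-8, 100))      # 1 L    (example from the text)
print(dose_in_langmuirs(2.5e-6, 40))     # 100 L
print(dose_in_langmuirs(8e-8, 1250))     # ~100 L (alternative route to the same dose)
```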
https://en.wikipedia.org/wiki/NOS%20stereo%20technique
The Nederlandse Omroep Stichting (NOS) stereo technique is a method of capturing stereo sound. The Nederlandse Omroep Stichting (English: Dutch Broadcast Foundation) arrived at this stereo main-microphone system through a number of practical experiments in the 1960s. The system gives a fairly even distribution of the phantom sources (hearing-event directions) across the stereo loudspeaker base, using two small cardioid microphones, with a recording angle of the microphone system of ±40.5° = 81°. Empirically, the system settled on an axis angle of α = ±45° = 90° and a microphone spacing (microphone basis) of a = 30 cm. With this system, frequency-independent level differences and time-of-arrival differences work together in the same direction as interchannel signals (interchannel loudspeaker signals). The technique produces a realistic stereo effect and has reasonable mono compatibility. These interchannel differences have nothing to do with interaural differences, which arise only in artificial-head (dummy-head) recordings. Likewise, the spacing of a = 30 cm has nothing to do with the human ear spacing, because the microphone system is designed for playback over a pair of stereo loudspeakers, not over headphones. This recording technology is called mixed stereo or equivalence stereo. Usually this special microphone system must be built up from two single small-diaphragm microphones. Double-diaphragm microphones should not be used, because of their unbalanced directional characteristics and larger phase responses. It is advisable to experiment with the two parameters, the axis angle α and the microphone basis a, for which practical microphone mounting devices exist. A similar technique is known as the ORTF stereo technique, devised by the French broadcaster ORTF. In that technique the angle between the microphone axes is α = ±55° = 110°, the distance between the microphones (microphone basis) is a = 17 cm, and the total recording angle is 96°. The c
https://en.wikipedia.org/wiki/List%20of%20assigned%20/8%20IPv4%20address%20blocks
Some large blocks of IPv4 addresses, the former Class A network blocks, are assigned in whole to single organizations or related groups of organizations, either by the Internet Corporation for Assigned Names and Numbers (ICANN), through the Internet Assigned Numbers Authority (IANA), or a regional Internet registry. Each block contains 256^3 = 2^24 = 16,777,216 addresses, which covers the whole range of the last three delimited segments of an IP address. As IPv4 address exhaustion has advanced to its final stages, some organizations, such as Stanford University, formerly using , have returned their allocated blocks (in this case to APNIC) to assist in the delay of the exhaustion date. List of reserved /8 blocks List of assigned /8 blocks to commercial organisations List of assigned /8 blocks to the United States Department of Defense List of assigned /8 blocks to the regional Internet registries The regional Internet registries (RIR) allocate IPs within a particular region of the world. Note that this list may not include current assignments of /8 blocks to all regional or national Internet registries. Original list of IPv4 assigned address blocks The original list of IPv4 address blocks was published in September 1981. In previous versions of the document, network numbers were 8-bit numbers rather than the 32-bit numbers used in IPv4. At that time, three networks were added that were not listed earlier: 42.rrr.rrr.rrr, 43.rrr.rrr.rrr, and 44.rrr.rrr.rrr. The relevant portion of RFC 790 is reproduced here with minor changes:
000.rrr.rrr.rrr Reserved                                 [JBP]
001.rrr.rrr.rrr BBN-PR    BBN Packet Radio Network       [DCA2]
002.rrr.rrr.rrr SF-PR-1   SF Packet Radio Network        [JEM]
003.rrr.rrr.rrr BBN-RCC   BBN RCC Network                [SGC]
004.rrr.rrr.rrr SATNET    Atlantic Satellite Network     [DM11]
005.rrr.rrr.rrr SILL-PR   Ft. Sill Packet Radio Network  [JEM]
006.rrr.rrr.rrr SF-PR-2   SF Packet Radio Network        [JEM]
007.rrr.rrr.rrr CHAOS     MIT CHAOS Network              [MOON]
008.rrr.rrr.rrr CLARKNET  SA
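As a quick check of the arithmetic above, a short Python sketch (illustrative only, using the standard-library ipaddress module) counts the addresses in a /8 block and maps an arbitrary address to its /8 prefix:

import ipaddress

block = ipaddress.ip_network("17.0.0.0/8")   # an arbitrary /8, used only for illustration
print(block.num_addresses)                   # 16777216, i.e. 2**24 == 256**3

addr = ipaddress.ip_address("44.12.34.56")
print(ipaddress.ip_network(f"{addr}/8", strict=False))   # 44.0.0.0/8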
https://en.wikipedia.org/wiki/Pierpont%20prime
In number theory, a Pierpont prime is a prime number of the form 2^u · 3^v + 1 for some nonnegative integers u and v. That is, they are the prime numbers p for which p − 1 is 3-smooth. They are named after the mathematician James Pierpont, who used them to characterize the regular polygons that can be constructed using conic sections. The same characterization applies to polygons that can be constructed using ruler, compass, and angle trisector, or using paper folding. Except for 2 and the Fermat primes, every Pierpont prime must be congruent to 1 modulo 6. The first few Pierpont primes are: 2, 3, 5, 7, 13, 17, 19, 37, 73, 97, 109, 163, 193, 257, 433, 487, 577, 769, ... It has been conjectured that there are infinitely many Pierpont primes, but this remains unproven. Distribution A Pierpont prime with v = 0 is of the form 2^u + 1, and is therefore a Fermat prime (unless u = 0). If v is positive then u must also be positive (because 3^v + 1 would be an even number greater than 2 and therefore not prime), and therefore the non-Fermat Pierpont primes all have the form 6k + 1, where k is a positive integer (except for 2, when u = v = 0). Empirically, the Pierpont primes do not seem to be particularly rare or sparsely distributed; there are 42 Pierpont primes less than 10^6, 65 less than 10^9, 157 less than 10^20, and 795 less than 10^100. There are few restrictions from algebraic factorisations on the Pierpont primes, so there are no requirements like the Mersenne prime condition that the exponent must be prime. Thus, it is expected that among n-digit numbers of the correct form 2^u · 3^v + 1, the fraction of these that are prime should be proportional to 1/n, a similar proportion as the proportion of prime numbers among all n-digit numbers. As there are Θ(n^2) numbers of the correct form in this range, there should be Θ(n) Pierpont primes. Andrew M. Gleason made this reasoning explicit, conjecturing there are infinitely many Pierpont primes, and more specifically that there should be approximately 9n Pierpont primes up to 10^n. According to Gleason's conjecture there are Θ(log N) Pierpont primes smaller than N, as opposed to the smaller conjectural number of
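A short illustrative Python sketch (not from the article) that enumerates the Pierpont primes below a bound by generating the 3-smooth values 2^u · 3^v and testing 2^u · 3^v + 1 with trial division:

def is_prime(n):
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def pierpont_primes(limit):
    """Primes of the form 2**u * 3**v + 1 below `limit` (illustrative sketch)."""
    found = set()
    u = 0
    while 2 ** u + 1 <= limit:
        v = 0
        while 2 ** u * 3 ** v + 1 <= limit:
            candidate = 2 ** u * 3 ** v + 1
            if is_prime(candidate):
                found.add(candidate)
            v += 1
        u += 1
    return sorted(found)

print(pierpont_primes(1000))
# [2, 3, 5, 7, 13, 17, 19, 37, 73, 97, 109, 163, 193, 257, 433, 487, 577, 769]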
https://en.wikipedia.org/wiki/Portevin%E2%80%93Le%20Chatelier%20effect
The Portevin–Le Chatelier (PLC) effect describes a serrated stress–strain curve or jerky flow, which some materials exhibit as they undergo plastic deformation, specifically inhomogeneous deformation. This effect has been long associated with dynamic strain aging or the competition between diffusing solutes pinning dislocations and dislocations breaking free of this stoppage. The onset of the PLC effect occurs when the strain rate sensitivity becomes negative and inhomogeneous deformation starts. This effect also can appear on the specimen's surface and in bands of plastic deformation. This process starts at a so-called critical strain, which is the minimum strain needed for the onset of the serrations in the stress–strain curve. The critical strain is both temperature and strain rate dependent. The existence of a critical strain is attributed to better solute diffusivity due to the deformation created vacancies and increased mobile dislocation density. Both of these contribute to the instability in substitutional alloys, while interstitial alloys are only affected by the increase in mobile dislocation densities. History While the effect is named after Albert Portevin and François Le Chatelier, they were not the first to discover it. Félix Savart made the discovery when he observed non-homogeneous deformation during a tensile test of copper strips. He documented the physical serrations in his samples that are currently known as Portevin–Le Chatelier bands. A student of Savart, Antoine Masson, repeated the experiment while controlling for loading rate. Masson observed that under a constant loading rate, the samples would experience sudden large changes in elongation (as large as a few millimeters). Underlying physics Much of the underlying physics of the Portevin-Le Chatelier effect lies in a specific case of solute drag creep. Adding solute atoms to a pure crystal introduces a size misfit into the system. This size misfit leads to restriction of dislocation mo
https://en.wikipedia.org/wiki/Determinacy
Determinacy is a subfield of set theory, a branch of mathematics, that examines the conditions under which one or the other player of a game has a winning strategy, and the consequences of the existence of such strategies. Alternatively and similarly, "determinacy" is the property of a game whereby such a strategy exists. Determinacy was introduced by Gale and Stewart in 1950, under the name "determinateness". The games studied in set theory are usually Gale–Stewart games—two-player games of perfect information in which the players make an infinite sequence of moves and there are no draws. The field of game theory studies more general kinds of games, including games with draws such as tic-tac-toe, chess, or infinite chess, or games with imperfect information such as poker. Basic notions Games The first sort of game we shall consider is the two-player game of perfect information of length ω, in which the players play natural numbers. These games are often called Gale–Stewart games. In this sort of game there are two players, often named I and II, who take turns playing natural numbers, with I going first. They play "forever"; that is, their plays are indexed by the natural numbers. When they're finished, a predetermined condition decides which player won. This condition need not be specified by any definable rule; it may simply be an arbitrary (infinitely long) lookup table saying who has won given a particular sequence of plays. More formally, consider a subset A of Baire space; recall that the latter consists of all ω-sequences of natural numbers. Then in the game GA, I plays a natural number a0, then II plays a1, then I plays a2, and so on. Then I wins the game if and only if the resulting sequence (a0, a1, a2, ...) is an element of A, and otherwise II wins. A is then called the payoff set of GA. It is assumed that each player can see all moves preceding each of his moves, and also knows the winning condition. Strategies Informally, a strategy for a player is a way of playing in which his plays are entirely
https://en.wikipedia.org/wiki/Hypoxemia
Hypoxemia is an abnormally low level of oxygen in the blood. More specifically, it is oxygen deficiency in arterial blood. Hypoxemia has many causes, and often causes hypoxia as the blood is not supplying enough oxygen to the tissues of the body. Definition Hypoxemia refers to the low level of oxygen in blood, and the more general term hypoxia is an abnormally low oxygen content in any tissue or organ, or the body as a whole. Hypoxemia can cause hypoxia (hypoxemic hypoxia), but hypoxia can also occur via other mechanisms, such as anemia. Hypoxemia is usually defined in terms of reduced partial pressure of oxygen (mm Hg) in arterial blood, but also in terms of reduced content of oxygen (ml oxygen per dl blood) or percentage saturation of hemoglobin (the oxygen-binding protein within red blood cells) with oxygen, which is either found singly or in combination. While there is general agreement that an arterial blood gas measurement which shows that the partial pressure of oxygen is lower than normal constitutes hypoxemia, there is less agreement concerning whether the oxygen content of blood is relevant in determining hypoxemia. This definition would include oxygen carried by hemoglobin. The oxygen content of blood is thus sometimes viewed as a measure of tissue delivery rather than hypoxemia. Just as extreme hypoxia can be called anoxia, extreme hypoxemia can be called anoxemia. Signs and symptoms In an acute context, hypoxemia can cause symptoms such as those in respiratory distress. These include breathlessness, an increased rate of breathing, use of the chest and abdominal muscles to breathe, and lip pursing. Chronic hypoxemia may be compensated or uncompensated. The compensation may cause symptoms to be overlooked initially, however, further disease or a stress such as any increase in oxygen demand may finally unmask the existing hypoxemia. In a compensated state, blood vessels supplying less-ventilated areas of the lung may selectively contract, to redirec
https://en.wikipedia.org/wiki/NDMP
NDMP, or Network Data Management Protocol, is a protocol meant to transport data between network attached storage (NAS) devices and backup devices. This removes the need for transporting the data through the backup server itself, thus enhancing speed and removing load from the backup server. It was originally invented by NetApp and Intelliguard, which was later acquired by Legato and subsequently by EMC Corporation. Currently, the Storage Networking Industry Association (SNIA) oversees the development of the protocol. Most contemporary multi-platform backup software supports this protocol. External links NDMP at the SNIA web site TechTarget -- NDMP definition Backup Network protocols Network-attached storage
https://en.wikipedia.org/wiki/Shmoo%20plot
In electrical engineering, a shmoo plot is a graphical display of the response of a component or system varying over a range of conditions or inputs. Origin The origin of the shmoo plot is unclear. It is referenced in a 1966 IEEE paper. Another early reference is in manuals for IBM 2365 Processor Storage. The invention of the shmoo plot is sometimes credited to VLSI Hall Of Fame inductee Robert Huston (1941–2006). But this is unlikely because Huston did not begin working as a test engineer until 1967. Etymology The plot takes its name from the Shmoo, a fictional species created by Al Capp in the cartoon Li'l Abner. These small, blob-like creatures have shapes similar to the "working" volumes that would be enclosed by shmoo plots drawn against three independent variables (such as voltage, temperature, and response speed). Semiconductor chips do not usually exhibit "shmoo" shape plots. Historically, testing of magnetic core memory arrays produced the "shmoo" shape and the term continued into the semiconductor era. Description Shmoo plots are often used to represent the results of the testing of complex electronic systems such as computers or integrated circuits such as DRAMs, ASICs or microprocessors. The plot usually shows the range of conditions in which the device under test operates (in adherence with some remaining set of specifications). For example, when testing semiconductor memory: voltages, temperature, and refresh rates can be varied over specified ranges and only certain combinations of these factors will allow the device to operate. Plotted on independent axes (voltage, temperature, refresh rates), the range of working values will enclose a three-dimensional, usually oddly-shaped volume. Other examples of conditions and inputs that can be varied include frequency, temperature, timing parameters, system- or component-specific variables, and even varying knobs tweakable during silicon chip fabrication producing parts of varying quality which are
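A minimal sketch of how such a plot is produced in practice (illustrative Python, not tied to any particular tester; the toy device model in the lambda is an invented placeholder): sweep two operating conditions, run a pass/fail test at each grid point, and print the resulting pass region.

def shmoo(voltages, frequencies, passes):
    """Print a text shmoo plot; `passes(v, f)` is a hypothetical pass/fail test."""
    for f in reversed(frequencies):                      # highest frequency on top
        row = "".join("*" if passes(v, f) else "." for v in voltages)
        print(f"{f:5.0f} MHz  {row}")
    print(f"           V = {voltages[0]:.2f} .. {voltages[-1]:.2f}")

# Toy model for illustration: higher supply voltage allows a higher operating frequency.
voltages = [round(0.80 + 0.05 * i, 2) for i in range(9)]    # 0.80 V .. 1.20 V
frequencies = [100 * i for i in range(1, 11)]               # 100 .. 1000 MHz
shmoo(voltages, frequencies, lambda v, f: f <= 2000 * (v - 0.75))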
https://en.wikipedia.org/wiki/Potassium%20bromate
Potassium bromate (KBrO3) is a bromate of potassium and takes the form of white crystals or powder. It is a strong oxidizing agent. Preparation Potassium bromate is produced when bromine is passed through a hot solution of potassium hydroxide. This first forms unstable potassium hypobromite, which quickly disproportionates into bromide and bromate: 3 KBrO → 2 KBr + KBrO3. Electrolysis of potassium bromide solutions will also give bromate. Both processes are analogous to those used in the production of chlorates. Potassium bromate is readily separated from the potassium bromide present in both methods owing to its much lower solubility; when a solution containing potassium bromate and bromide is cooled to 0 °C, nearly all of the bromate will precipitate, while nearly all of the bromide will stay in solution. Uses in baking Potassium bromate is typically used in the United States as a flour improver (E number E924). It acts to strengthen the dough and to allow higher rising. It is an oxidizing agent, and under the right conditions it will be completely reduced to bromide in the baking process. However, if too much is added, or if the bread is not baked long enough or not at a high enough temperature, then a residual amount will remain, which may be harmful if consumed. Potassium bromate might also be used in the production of malt barley, for which application the U.S. Food and Drug Administration (FDA) has prescribed certain safety conditions, including labeling standards for the finished malt barley product. It is a very powerful oxidizer (electrode potential similar to that of potassium permanganate). Regulation In October 2023, California Governor Gavin Newsom approved a law that banned the manufacture, sale and distribution of potassium bromate (along with three other additives). This is the first law in the U.S. to ban it, and it will possibly have nationwide effects. Potassium bromate has been banned from food products in the European Union, Argentina, Brazil, Canada, Nigeria, South Korea,
https://en.wikipedia.org/wiki/Gezira%20Scheme
The Gezira Scheme () is one of the largest irrigation projects in the world. It is centered on the Sudanese state of Gezira, just southeast of the confluence of the Blue and White Nile rivers at the city of Khartoum. The Gezira Scheme was begun by the British while the area was governed as part of Anglo-Egyptian Sudan. Water from the Blue Nile is distributed through canals and ditches to tenant farms lying between the Blue and White Nile. The Gezira (which means "island") is particularly suited to irrigation because the soil slopes away from the Blue Nile and water therefore naturally runs through the irrigation canals by gravity. The soil has a high clay content which keeps down losses from seepage. Reginald Wingate, the British governor-general of Sudan, originally envisaged the farmers growing wheat but this was abandoned as the colonial authorities thought that a better cash crop was needed. When it was discovered that Egyptian-type long staple cotton could be grown, this was welcomed as a better choice as it would also provide a raw material for the British textile industry. Cotton was first grown in the area in 1904. After many experiments with irrigation, was put under cultivation in 1914. After the lowest Nile flood for 200 years, the Sennar Dam was constructed on the Blue Nile to provide a reservoir of water. This dam was completed in 1925 and is about long. The Gezira Scheme was initially financed by the Sudan Plantations Syndicate in London and later the British government guaranteed capital to develop it. The Sudan Gezira Board took over from private enterprise in 1950 and was chaired by Arthur Gaitskell. Farmers cooperated with the Sudanese government and the Gezira Board. This network of canals and ditches was long, and with the completion in the early 1960s of the Manaqil Extension on the western side of the Gezira Scheme, by 2008 the irrigated area covered , about half the country's total land under irrigation. The main crop grown in this
https://en.wikipedia.org/wiki/Bulgarian%20solitaire
In mathematics and game theory, Bulgarian solitaire is a card game that was introduced by Martin Gardner. In the game, a pack of N cards is divided into several piles. Then for each pile, remove one card; collect the removed cards together to form a new pile (piles of zero size are ignored). If N is a triangular number (that is, N = 1 + 2 + ... + k for some k), then it is known that Bulgarian solitaire will reach a stable configuration in which the sizes of the piles are 1, 2, ..., k. This state is reached in k^2 − k moves or fewer. If N is not triangular, no stable configuration exists and a limit cycle is reached. Random Bulgarian solitaire In random Bulgarian solitaire or stochastic Bulgarian solitaire a pack of N cards is divided into several piles. Then for each pile, either leave it intact or, with a fixed probability p, remove one card; collect the removed cards together to form a new pile (piles of zero size are ignored). This is a finite irreducible Markov chain. In 2004, Brazilian probabilist of Russian origin Serguei Popov showed that stochastic Bulgarian solitaire spends "most" of its time in a "roughly" triangular distribution. References 20th-century card games Combinatorial game theory Year of introduction missing
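The move is easy to simulate; the following Python sketch (illustrative, not from the article) iterates it until the triangular fixed point is reached:

def step(piles):
    """One Bulgarian-solitaire move: take one card from each pile, form a new pile."""
    new_pile = len(piles)
    piles = [p - 1 for p in piles if p > 1]   # piles of zero size are ignored
    piles.append(new_pile)
    return sorted(piles)

piles = sorted([10, 5])        # 15 cards = 1 + 2 + 3 + 4 + 5, a triangular number
moves = 0
while piles != [1, 2, 3, 4, 5]:
    piles = step(piles)
    moves += 1
print(piles, "reached after", moves, "moves")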
https://en.wikipedia.org/wiki/RSSSF
The Rec.Sport.Soccer Statistics Foundation (RSSSF) is an international organization dedicated to collecting statistics about association football. The foundation aims to build an exhaustive archive of football-related information from around the world. History This enterprise, according to its founders, was created in January 1994 by three regulars of the Rec.Sport.Soccer (RSS) Usenet newsgroup: Lars Aarhus, Kent Hedlundh, and Karel Stokkermans. It was originally known as the "North European Rec.Sport.Soccer Statistics Foundation", but the geographical reference was dropped as its membership from other regions grew. The RSSSF has members and contributors from all around the world and has spawned seven spin-off projects to more closely follow the leagues of each project's home country. The spin-off projects are dedicated to Albania, Brazil, Denmark, Norway, Romania, Uruguay, Venezuela and Egypt. In November 2002, the Polish service 90minut.pl became the official branch of RSSSF Poland. Reception RSSSF's database has been described as the "very best" for football data. Rec.Sport.Soccer Player of the Year From 1992, a vote for the Best Footballer in the World was held among the readers of the rec.sport.soccer newsgroup. It was held yearly until 2005, when it was discontinued. The voting works as follows: each voter chooses five players, at most two of the same nationality, in order; these receive five points down to one point, respectively. The nationality restriction was dropped for the 2003 vote, in which voting was restricted to 173 pre-selected players. Wins by player Wins by country Wins by club References Bibliography External links Association football websites Usenet Organizations established in 1994 Internet properties established in 1994
https://en.wikipedia.org/wiki/Book%20%28graph%20theory%29
In graph theory, a book graph (often written  ) may be any of several kinds of graph formed by multiple cycles sharing an edge. Variations One kind, which may be called a quadrilateral book, consists of p quadrilaterals sharing a common edge (known as the "spine" or "base" of the book). That is, it is a Cartesian product of a star and a single edge. The 7-page book graph of this type provides an example of a graph with no harmonious labeling. A second type, which might be called a triangular book, is the complete tripartite graph K1,1,p. It is a graph consisting of triangles sharing a common edge. A book of this type is a split graph. This graph has also been called a or a thagomizer graph (after thagomizers, the spiked tails of stegosaurian dinosaurs, because of their pointy appearance in certain drawings) and their graphic matroids have been called thagomizer matroids. Triangular books form one of the key building blocks of line perfect graphs. The term "book-graph" has been employed for other uses. Barioli used it to mean a graph composed of a number of arbitrary subgraphs having two vertices in common. (Barioli did not write for his book-graph.) Within larger graphs Given a graph , one may write for the largest book (of the kind being considered) contained within . Theorems on books Denote the Ramsey number of two triangular books by This is the smallest number such that for every -vertex graph, either the graph itself contains as a subgraph, or its complement graph contains as a subgraph. If , then . There exists a constant such that whenever . If , and is large, the Ramsey number is given by . Let be a constant, and . Then every graph on vertices and edges contains a (triangular) . References Parametric families of graphs Planar graphs
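For experimentation, both variants described above can be built with the networkx library (an assumed dependency, used here purely for illustration):

import networkx as nx

p = 4

# Triangular book: the complete tripartite graph K_{1,1,p}, i.e. p triangles sharing one edge.
triangular_book = nx.complete_multipartite_graph(1, 1, p)
print(triangular_book.number_of_nodes(), triangular_book.number_of_edges())    # 6 nodes, 2p + 1 = 9 edges

# Quadrilateral book: the Cartesian product of a star K_{1,p} with a single edge.
quadrilateral_book = nx.cartesian_product(nx.star_graph(p), nx.path_graph(2))
print(quadrilateral_book.number_of_nodes(), quadrilateral_book.number_of_edges())  # 2p + 2 = 10 nodes, 3p + 1 = 13 edges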
https://en.wikipedia.org/wiki/Alphabet%20to%20E-mail
Alphabet to E-mail: How Written English Evolved and Where It's Heading is a book by linguist Dr. Naomi Baron, a professor of linguistics at American University, Washington, D.C. It was first published in 2000 by Routledge. In it, Baron explores the history of the English language in written form and considers how it has evolved through its history, ending with an evaluation of the state of the English language today and of how the Internet and the use of email and text messaging have affected it. Baron considered that email did not have an inherent writing style, and believed it was evolving to resemble speech. She also expressed her disappointment with the effect of electronic means of communication upon the written word. Baron noted that 25 years of research revealed that: References External links Google Books entry English language Linguistics books Email Text messaging 2000 non-fiction books Routledge books
https://en.wikipedia.org/wiki/Boyer%E2%80%93Lindquist%20coordinates
In the mathematical description of general relativity, the Boyer–Lindquist coordinates are a generalization of the coordinates used for the metric of a Schwarzschild black hole that can be used to express the metric of a Kerr black hole. The Hamiltonian test for particle motion in Kerr spacetime is separable in Boyer–Lindquist coordinates. Using Hamilton–Jacobi theory one can derive a fourth constant of the motion known as Carter's constant. The 1967 paper introducing Boyer–Lindquist coordinates was a posthumous publication for Robert H. Boyer, who was killed in the 1966 University of Texas tower shooting. Line element The line element for a black hole with a total mass equivalent , angular momentum , and charge in Boyer–Lindquist coordinates and geometrized units () is where called the discriminant, and called the Kerr parameter. Note that in geometrized units , , and all have units of length. This line element describes the Kerr–Newman metric. Here, is to be interpreted as the mass of the black hole, as seen by an observer at infinity, is interpreted as the angular momentum, and the electric charge. These are all meant to be constant parameters, held fixed. The name of the discriminant arises because it appears as the discriminant of the quadratic equation bounding the time-like motion of particles orbiting the black hole, i.e. defining the ergosphere. The coordinate transformation from Boyer–Lindquist coordinates , , to Cartesian coordinates , , is given (for ) by: Vierbein The vierbein one-forms can be read off directly from the line element: so that the line element is given by where is the flat-space Minkowski metric. Spin connection The torsion-free spin connection is defined by The contorsion tensor gives the difference between a connection with torsion, and a corresponding connection without torsion. By convention, Riemann manifolds are always specified with torsion-free geometries; torsion is often used to specify equivalent, flat ge
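For reference, the standard textbook form of this line element (a LaTeX sketch; the symbol Σ and the sign conventions below are assumptions of the sketch, since the article's own formulas are not reproduced here) for the Kerr–Newman geometry in Boyer–Lindquist coordinates and geometrized units is

\[
ds^2 = -\frac{\Delta}{\Sigma}\left(dt - a\sin^2\theta\, d\phi\right)^2
       + \frac{\sin^2\theta}{\Sigma}\left[(r^2 + a^2)\, d\phi - a\, dt\right]^2
       + \frac{\Sigma}{\Delta}\, dr^2 + \Sigma\, d\theta^2,
\]
\[
\Delta = r^2 - 2Mr + a^2 + Q^2, \qquad
\Sigma = r^2 + a^2\cos^2\theta, \qquad
a = \frac{J}{M},
\]

where Δ is the discriminant and a the Kerr parameter mentioned above.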
https://en.wikipedia.org/wiki/Engineering%20and%20Physical%20Sciences%20Research%20Council
The Engineering and Physical Sciences Research Council (EPSRC) is a British Research Council that provides government funding for grants to undertake research and postgraduate degrees in engineering and the physical sciences, mainly to universities in the United Kingdom. EPSRC research areas include mathematics, physics, chemistry, artificial intelligence and computer science, but exclude particle physics, nuclear physics, space science and astronomy (which fall under the remit of the Science and Technology Facilities Council). Since 2018 it has been part of UK Research and Innovation, which is funded through the Department for Business, Energy and Industrial Strategy. History EPSRC was created in 1994. At first part of the Science and Engineering Research Council (SERC), in 2018 it was one of nine organisations brought together to form UK Research and Innovation (UKRI). Its head office is in Swindon, Wiltshire in the same building (Polaris House) that houses the AHRC, BBSRC, ESRC, MRC, Natural Environment Research Council, Science and Technology Facilities Council, TSB, Research Councils UK and the UK Space Agency. Key people Paul Golby, Chair of EngineeringUK, was appointed as the Chairman of the EPSRC from 1 April 2012 for four years. He succeeded Sir John Armitt. From 2007 to March 2014, the chief executive and deputy chair of EPSRC was David Delpy, a medical physicist and formerly vice provost at University College London. He'd been succeeded in April 2014 by Philip Nelson, former University of Southampton pro vice-chancellor for research and enterprise. In April 2016 Professor Tom Rodden was appointed as the Deputy CEO of EPSRC, a new position created to work alongside Philip Nelson while he also acts as Chair of RCUK Strategic Executive. Rodden joins the EPSRC on secondment from the University of Nottingham where he is currently Professor of Computing and Co-Director of Horizon Digital Economy Research. In October 2018 Nelson was succeeded by Lynn Gladd
https://en.wikipedia.org/wiki/External%20storage
In computing, external storage refers to non-volatile (secondary) data storage outside a computer's own internal hardware, which can therefore be readily disconnected and accessed elsewhere. Such storage devices may be removable media (e.g. punched paper, magnetic tape, floppy disks and optical discs), compact flash drives (USB flash drives and memory cards), portable storage devices (external solid-state drives and enclosed hard disk drives), or network-attached storage. Web-based cloud storage is the latest technology for external storage. History Today the term external storage most commonly applies to storage devices external to a personal computer, although it can refer to any storage external to the computer. In the early days of computing, storage, as distinct from memory, was always external to the computer, as for example with punched-card devices and media. Today storage devices may be internal or external to a computer system. In the 1950s, the introduction of magnetic tapes and hard disk drives allowed for mass external storage of information, which played a key part in the computer revolution. Although initially all such storage was external, tape and hard disk drives are today available as both internal and external storage. In 1964 removable disk media were introduced by the IBM 2310 disk drive with its 2315 cartridge, used in IBM 1800 and IBM 1130 computers. Magnetic disk media today are generally not removable; however, disc devices and media such as optical disc drives and optical discs are available as both internal and external storage. Earlier adoption of external storage As a consequence of the rapid development of electronic computers, the ability to integrate existing input, output, and storage devices was a determining factor in their adoption. The IBM 650 was the first mass-produced electronic computer and encompassed a wide range of existing technologies for input-output and memory devices; it also included tape-to-card and card-to-tape conversion units. Earl
https://en.wikipedia.org/wiki/Compal%20Electronics
Compal Electronics () is a Taiwanese original design manufacturer (ODM), handling the production of notebook computers, monitors, tablets and televisions for a variety of clients around the world, including Apple Inc., Alphabet Inc., Acer, Lenovo, Dell, Toshiba, Hewlett-Packard and Fujitsu. It also licenses brands of its clients. It is the second-largest contract laptop manufacturer in the world behind Quanta Computer, and shipped over 48 million notebooks in 2010. Overview The company is known for producing selected models for Dell (Alienware included), Hewlett-Packard, Compaq, and Toshiba. Compal has designed and built laptops for all major brands and custom builders for over 22 years. The company is listed in Taiwan Stock Exchange. As of 2017, revenues were US$24 billion, with a total workforce of 64,000. The company's headquarters is in Taipei, Taiwan, with offices in mainland China, South Korea, the United Kingdom, and the United States. Compal's main production facility is in Kunshan, China. Compal is the second largest notebook manufacturer in the world after Quanta Computers, also based in Taiwan. History Compal was founded in June 1984 as a computer peripherals supplier. It went public in April 1990. In September 2011, Compal announced it would form a joint venture with Lenovo to make laptops in China. The venture was expected to start producing laptops by the end of 2012. In January 2015, Toshiba announced that due to intense price competition, it will stop selling televisions in the USA and will instead license the Toshiba TV brand to Compal. In September 2018, it was revealed that due to overwhelming demand for the Apple Watch, Compal was brought on as a second contract manufacturer to produce the Apple Watch Series 4. CCI Compal subsidiary Compal Communications (華寶通訊, CCI) is a major manufacturer of mobile phones. The phones are produced on an ODM basis, i.e., the handsets are sold through other brands. In 2006, CCI produced 68.8 million hands
https://en.wikipedia.org/wiki/Percy%20Alexander%20MacMahon
Percy Alexander MacMahon (26 September 1854 – 25 December 1929) was a mathematician, especially noted in connection with the partitions of numbers and enumerative combinatorics. Early life Percy MacMahon was born in Malta to a British military family. His father was a colonel at the time, retired in the rank of the brigadier. MacMahon attended the Proprietary School in Cheltenham. At the age of 14 he won a Junior Scholarship to Cheltenham College, which he attended as a day boy from 10 February 1868 until December 1870. At the age of 16 MacMahon was admitted to the Royal Military Academy, Woolwich and passed out after two years. Military career On 12 March 1873, MacMahon was posted to Madras, India, with the 1st Battery 5th Brigade, with the temporary rank of lieutenant. The Army List showed that in October 1873 he was posted to the 8th Brigade in Lucknow. MacMahon's final posting was to the No. 1 Mountain Battery with the Punjab Frontier Force at Kohat on the North West Frontier. He was appointed Second Subaltern on 26 January and joined the Battery on 25 February 1877. In the Historical Record of the No. 1 (Kohat) Mountain Battery, Punjab Frontier Force it is recorded that he was sent on sick leave to Muree (or Maree), a town north of Kohat on the banks of the Indus river, on 9 August 1877. On 22 December 1877 he started 18 months leave on a medical certificate granted under GGO number 1144. The nature of his illness is unknown. This period of sick leave was one of the most significant occurrences in MacMahon's life. Had he remained in India he would undoubtedly have been caught up in Roberts's War against the Afghans. In early 1878 MacMahon returned to England and the sequence of events began which led to him becoming a mathematician rather than a soldier. The Army List records a transfer to the 3rd Brigade in Newbridge at the beginning of 1878, and then shows MacMahon as 'supernumerary' from May 1878 until March 1879. In January 1879 MacMahon was posted
https://en.wikipedia.org/wiki/Stellated%20octahedron
The stellated octahedron is the only stellation of the octahedron. It is also called the stella octangula (Latin for "eight-pointed star"), a name given to it by Johannes Kepler in 1609, though it was known to earlier geometers. It was depicted in Pacioli's De Divina Proportione, 1509. It is the simplest of five regular polyhedral compounds, and the only regular compound of two tetrahedra. It is also the least dense of the regular polyhedral compounds, having a density of 2. It can be seen as a 3D extension of the hexagram: the hexagram is a two-dimensional shape formed from two overlapping equilateral triangles, centrally symmetric to each other, and in the same way the stellated octahedron can be formed from two centrally symmetric overlapping tetrahedra. This can be generalized to any desired amount of higher dimensions; the four-dimensional equivalent construction is the compound of two 5-cells. It can also be seen as one of the stages in the construction of a 3D Koch snowflake, a fractal shape formed by repeated attachment of smaller tetrahedra to each triangular face of a larger figure. The first stage of the construction of the Koch Snowflake is a single central tetrahedron, and the second stage, formed by adding four smaller tetrahedra to the faces of the central tetrahedron, is the stellated octahedron. Construction The Cartesian coordinates of the stellated octahedron are as follows: (±1/2, ±1/2, 0) (0, 0, ±1/√2) (±1, 0, ±1/√2) (0, ±1, ±1/√2) The stellated octahedron can be constructed in several ways: It is a stellation of the regular octahedron, sharing the same face planes. (See Wenninger model W19.) It is also a regular polyhedron compound, when constructed as the union of two regular tetrahedra (a regular tetrahedron and its dual tetrahedron). It can be obtained as an augmentation of the regular octahedron, by adding tetrahedral pyramids on each face. In this construction it has the same topology as the convex Catalan solid, the triakis octahed
https://en.wikipedia.org/wiki/Xain%27d%20Sleena
is a two genre platformer and side-scrolling arcade video game produced by Technos in 1986. It was licensed for release outside of Japan by Taito. In the USA, the game was published by Memetron, and the game was renamed to Solar Warrior. The European home computer ports renamed the game to Soldier of Light. Gameplay The main character, Xain, is a galactic bounty-hunter who must defeat evil forces who oppress five different planets. The player can select any order to play the various planets, so, there is no 'official' sequence of play (For the U.S. version, this game was released as 'Solar Warrior'. This version goes through a set sequence instead of having to choose planets). Each planet is played with right horizontal and vertical scrolling, shooting enemies and dodging natural hazards. Xain can crouch, double crouch (prone), jump and double jump. In some of the planets the player will need to kill a sub-boss to resume. Certain enemies carry a powerup which changes the default laser gun into a different weapon. The different weapons which are cycled through powerups include a laser-grenade gun, a 2-way gun, a spreadfire gun and a strong bullet gun with their own respective damage and directional firing capabilities. At the end of the planet, the player goes into battle with a boss. Once defeated, the player plants a bomb into the boss' base and has ten seconds to escape in a starship. The next half of the planet stage is an interlude stage during which the player must battle through waves of enemy ships while heading to the next planet. After three planets there is a battle through an asteroid field and against a giant mothership. When all five planets are liberated, the player will play the longer final stage on a gigantic metallic fortress, facing the bosses previously met on each of the five planets. Fighting bosses in this stage is optional. Halfway through the stage the player plants a bomb on the fortress core and has 60 seconds to reach the exit hanga
https://en.wikipedia.org/wiki/PCI%20eXtensions%20for%20Instrumentation
PCI eXtensions for Instrumentation (PXI) is one of several modular electronic instrumentation platforms in current use. These platforms are used as a basis for building electronic test equipment, automation systems, and modular laboratory instruments. PXI is based on industry-standard computer buses and permits flexibility in building equipment. Often, modules are fitted with custom software to manage the system. Overview PXI is designed for measurement and automation applications that require high performance and a rugged industrial form factor. With PXI, one can select modules from a number of vendors and integrate them into a single PXI system, with over 1,150 module types available in 2006. A typical 3U PXI module measures approximately 10 cm × 16 cm (4 in × 6 in), and a typical 8-slot PXI chassis is 4U high and half rack width; full-width chassis contain up to 18 PXI slots. PXI uses PCI-based technology and an industry standard governed by the PXI Systems Alliance (PXISA) to ensure standards compliance and system interoperability. There are PXI modules available for almost every conceivable test, measurement, and automation application, from the ubiquitous switching modules and DMMs to high-performance microwave vector signal generation and analysis. There are also companies specializing in writing software for PXI modules, as well as companies providing PXI hardware-software integration services. PXI is based on CompactPCI, and it offers all of the benefits of the PCI architecture, including performance, industry adoption, and COTS technology. PXI adds a rugged CompactPCI mechanical form factor and an industry consortium that defines hardware, electrical, software, power and cooling requirements. PXI then adds integrated timing and synchronization that is used to route synchronization clocks and triggers internally. PXI is a future-proof technology, and is designed to be simply and quickly reprogrammed as test, measurement, and automation requirements change. Most PXI instru
https://en.wikipedia.org/wiki/Crystal%20oven
A crystal oven is a temperature-controlled chamber used to maintain the quartz crystal in electronic crystal oscillators at a constant temperature, in order to prevent changes in the frequency due to variations in ambient temperature. An oscillator of this type is known as an oven-controlled crystal oscillator (OCXO, where "XO" is an old abbreviation for "crystal oscillator".) This type of oscillator achieves the highest frequency stability possible with a crystal. They are typically used to control the frequency of radio transmitters, cellular base stations, military communications equipment, and for precision frequency measurement. Description Quartz crystals are widely used in electronic oscillators to precisely control the frequency produced. The frequency at which a quartz crystal resonator vibrates depends on its physical dimensions. A change in temperature causes the quartz to expand or contract due to thermal expansion, changing the frequency of the signal produced by the oscillator. Although quartz has a very low coefficient of thermal expansion, temperature changes are still the major cause of frequency variation in crystal oscillators. The oven is a thermally-insulated enclosure containing the crystal and one or more electrical heating elements. Since other electronic components in the circuit are also vulnerable to temperature drift, usually the entire oscillator circuit is enclosed in the oven. A thermistor temperature sensor in a closed-loop control circuit is used to control the power to the heater and ensure that the oven is maintained at the precise temperature desired. Because the oven operates above ambient temperature, the oscillator usually requires a warm-up period after power has been applied to reach its operating temperature. During this warm-up period, the frequency will not have the full rated stability. The temperature selected for the oven is that at which the slope of the crystal's frequency vs. temperature curve is zero, further i
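The closed-loop control described above can be sketched in a few lines of Python (purely illustrative; the thermistor-reading and heater-drive functions are hypothetical placeholders, and a real OCXO controller would typically use an analog or PID loop rather than simple on/off control):

import time

SETPOINT_C = 75.0       # illustrative oven temperature near the crystal's turnover point
HYSTERESIS_C = 0.05     # small dead band to avoid rapid heater cycling

def control_loop(read_thermistor_c, set_heater_power):
    """Bang-bang thermostat sketch: hold the oven near SETPOINT_C."""
    heating = False
    while True:
        t = read_thermistor_c()                      # hypothetical sensor read
        if t < SETPOINT_C - HYSTERESIS_C:
            heating = True
        elif t > SETPOINT_C + HYSTERESIS_C:
            heating = False
        set_heater_power(1.0 if heating else 0.0)    # hypothetical heater drive
        time.sleep(0.1)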
https://en.wikipedia.org/wiki/TO-220
The TO-220 is a style of electronic package used for high-powered, through-hole components with pin spacing. The "TO" designation stands for "transistor outline". TO-220 packages have three leads. Similar packages with two, four, five or seven leads are also manufactured. A notable characteristic is a metal tab with a hole, used to mount the case to a heatsink, allowing the component to dissipate more heat than one constructed in a TO-92 case. Common TO-220-packaged components include discrete semiconductors such as transistors and silicon-controlled rectifiers, as well as integrated circuits. Typical applications The TO-220 package is a "power package" intended for power semiconductors and an example of a through-hole design rather than a surface-mount technology type of package. TO-220 packages can be mounted to a heat sink to dissipate several watts of waste heat. On a so-called "infinite heat sink", this can be 50 W or more. The top of the package has a metal tab with a hole used to mount the component to a heat sink. Thermal compound is often applied between package and heatsink to further improve heat transfer. The metal tab is often connected electrically to the internal circuitry. This does not normally pose a problem when using isolated heatsinks, but an electrically-insulating pad or sheet may be required to electrically isolate the component from the heatsink if the heatsink is electrically conductive, grounded or otherwise non-isolated. Many materials may be used to electrically isolate the TO-220 package, some of which have the added benefit of high thermal conductivity. In applications that require a heatsink, damage or destruction of the TO-220 device due to overheating may occur if the heatsink is dislodged during operation. A heatsinked TO-220 package dissipating 1 W of heat will have an internal (junction) temperature typically 2 to 5 °C higher than the package's temperature (due to the thermal resistance between the junction and the meta
https://en.wikipedia.org/wiki/Laramie-Poudre%20Tunnel
The Laramie-Poudre Tunnel is an early transmountain tunnel in the U.S. state of Colorado. The tunnel transfers water from the west side of the Laramie River basin, which drains to the North Platte River, to the east side Cache la Poudre River basin that drains to the South Platte River. The tunnel is about long with variable diameters with a minimum diameter of about . The diameter varied due to the different material mined through and the erosion of almost 90 years of water flow. It is located at about elevation with about a 1.7 degree down slope. The Laramie River lies about higher than the Cache La Poudre River at this location separated only by a mountain ridge. The Laramie-Poudre Tunnel is located about west-northwest of Fort Collins, Colorado, about south of the Wyoming border and about north of Rocky Mountain National Park. It was built between 1909 and 1911 for the Laramie-Poudre Reservoirs & Irrigation Co. to convey water from the Laramie River to the Poudre River for Front Range irrigation. The tunnel was driven for the purpose of conveying through the divide 800 cu.ft of water per second. "Work on the power-plant for operating the tunnel was begun Dec 1st 1909. The Hydro-electric power-plant was erected on the west bank of the Cache-la-Poudre, nearly opposite the eastern portal. "Repauno 60-per cent. gelatine" was used along with German Insolid and Z.L. fuse were used for blasting, with exception where the granite was really hard and tough. There were about 60 people employed with skills ranging from helpers, muckers, mechanics, stable-helpers, blacksmiths, book keeper and foremen. These were arranged in an 8-hr shift. Court battles between Colorado and Wyoming over water rights prevented operation until 1914 (Case 1995). As a result of the court battles the tunnel is restricted to a maximum of of water from the Laramie river instead of its designed . Most of the flow occurs during the peak snow melt season of mid May to mid July. The Larami
https://en.wikipedia.org/wiki/Puiseux%20series
In mathematics, Puiseux series are a generalization of power series that allow for negative and fractional exponents of the indeterminate. For example, the series is a Puiseux series in the indeterminate . Puiseux series were first introduced by Isaac Newton in 1676 and rediscovered by Victor Puiseux in 1850. The definition of a Puiseux series includes that the denominators of the exponents must be bounded. So, by reducing exponents to a common denominator , a Puiseux series becomes a Laurent series in an th root of the indeterminate. For example, the example above is a Laurent series in Because a complex number has th roots, a convergent Puiseux series typically defines functions in a neighborhood of . Puiseux's theorem, sometimes also called the Newton–Puiseux theorem, asserts that, given a polynomial equation with complex coefficients, its solutions in , viewed as functions of , may be expanded as Puiseux series in that are convergent in some neighbourhood of . In other words, every branch of an algebraic curve may be locally described by a Puiseux series in (or in when considering branches above a neighborhood of ). Using modern terminology, Puiseux's theorem asserts that the set of Puiseux series over an algebraically closed field of characteristic 0 is itself an algebraically closed field, called the field of Puiseux series. It is the algebraic closure of the field of formal Laurent series, which itself is the field of fractions of the ring of formal power series. Definition If is a field (such as the complex numbers), a Puiseux series with coefficients in is an expression of the form where is a positive integer and is an integer. In other words, Puiseux series differ from Laurent series in that they allow for fractional exponents of the indeterminate, as long as these fractional exponents have bounded denominator (here n). Just as with Laurent series, Puiseux series allow for negative exponents of the indeterminate as long as these negativ
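For readers who want the definition in symbols, here is a LaTeX sketch restating it with one common choice of notation (the letters used are assumptions of the sketch, since the article's own formula is not reproduced above):

\[
\sum_{k \ge k_0} c_k\, T^{k/n}, \qquad c_k \in K,\quad n \in \mathbb{Z}_{>0},\quad k_0 \in \mathbb{Z},
\]

so that every exponent is a rational number with the common bounded denominator n; substituting T = S^n turns such a series into an ordinary Laurent series in S, as described above.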
https://en.wikipedia.org/wiki/Belle%20%28chess%20machine%29
Belle was a chess computer developed by Joe Condon (hardware) and Ken Thompson (software) at Bell Labs. In 1983, it was the first machine to achieve master-level play, with a USCF rating of 2250. It won the ACM North American Computer Chess Championship five times and the 1980 World Computer Chess Championship. It was the first system to win using specialized chess hardware. In its final incarnation, Belle used an LSI-11 general-purpose computer to coordinate its chess hardware. There were three custom boards for move generation, four custom boards for position evaluation, and a microcode implementation of alpha-beta pruning. The computer also had one megabyte of memory for storing transposition tables. At the end of its career, Belle was donated to the Smithsonian Institution. The overall architecture of Belle was used for the initial designs of ChipTest, the progenitor of IBM Deep Blue. Origins Following his work on the Unix operating system, Ken Thompson turned his attention to computer chess. In summer 1972, he began work on a program for the PDP-11, which would eventually become Belle. In competition, this early version encouraged Thompson to pursue a brute-force approach when designing Belle's hardware. Design Belle's design underwent many changes throughout its lifetime. The initial chess program was rewritten to utilize move-vs-evaluation quiescence search and evaluate positions by prioritizing material advantage. Belle also used a transposition table to avoid redundant examinations of positions. Hardware move generator In 1976, Joe Condon implemented a hardware move generator to be used with software version of Belle on the PDP-11. His design had several steps: A 6-bit "from" register searches the board for friendly pieces. Once a friendly piece is found, a ∆xy move-offset counter provides a bit-code for the move offset, e.g. (2,2) for a bishop or (2,0) for a rook. This offset is combined with the contents of the "from" register and moved to a 6-
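To illustrate the software techniques named here, a heavily simplified Python sketch of negamax alpha-beta search with a transposition table follows (illustrative only; Belle's actual search was implemented in custom hardware and LSI-11 microcode, and the evaluate/legal_moves/make_move helpers are hypothetical):

# Transposition table storing bound types, as in standard alpha-beta implementations.
EXACT, LOWER, UPPER = 0, 1, 2
TT = {}  # (hashable position, depth) -> (flag, score)

def alphabeta(pos, depth, alpha, beta, evaluate, legal_moves, make_move):
    entry = TT.get((pos, depth))
    if entry:
        flag, score = entry
        if flag == EXACT or (flag == LOWER and score >= beta) or (flag == UPPER and score <= alpha):
            return score                     # reuse a previously searched position
    if depth == 0 or not legal_moves(pos):
        return evaluate(pos)
    original_alpha = alpha
    best = -float("inf")
    for move in legal_moves(pos):
        score = -alphabeta(make_move(pos, move), depth - 1,
                           -beta, -alpha, evaluate, legal_moves, make_move)
        best = max(best, score)
        alpha = max(alpha, best)
        if alpha >= beta:                    # beta cutoff: the opponent will avoid this line
            break
    if best <= original_alpha:
        flag = UPPER                         # fail-low: best is only an upper bound
    elif best >= beta:
        flag = LOWER                         # fail-high: best is only a lower bound
    else:
        flag = EXACT
    TT[(pos, depth)] = (flag, best)
    return best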
https://en.wikipedia.org/wiki/Aleste
is a vertically scrolling shooter developed by Compile, originally published by Sega in 1988 for the Master System and then by CP Communications for the MSX2. The Master System version was released outside Japan as Power Strike. The game spawned the Aleste and Power Strike franchises. Plot The story of Aleste concerns the manmade supercomputer DIA 51, which has been infected by a hybrid virus that is spreading like wildfire, eventually leading DIA 51 to eliminate the human race. When Yuri, Ray's girlfriend, gets injured in DIA's assault, Raymond Waizen has all the reason in the world to get rid of DIA 51 once and for all in his Aleste fighter. Releases The game was originally released for the Master System in February 1988. This version was released outside Japan, as Power Strike. The US release was initially a mail-only limited edition, however it did later see some retail distribution at Toys R' Us and other chains in North America. The European release was a regular retail package. An MSX2 version was released in July of that year, featuring two new stages, lowered difficulty, and a series of cutscenes. A version of the game has been released on phones by Square Enix, presumably based on the MSX2 version. The MSX2 version has been re-released on Nintendo's Wii Virtual Console service in Japan. It along with Aleste 2 had also been rereleased through the now-defunct WOOMB service. Reception The Master System version called Power Strike received positive reviews. Computer and Video Games scored it 86% in 1989. Console XS scored it 90% in 1992. Aleste series Aleste was followed by several sequels: Aleste Gaiden (MSX2, 1989) Aleste 2 (MSX2, 1989) Musha Aleste (Mega Drive, 1990) GG Aleste (Game Gear, 1991) Super Aleste (SNES, 1992) Dennin Aleste (Mega-CD, 1992) GG Aleste II (Game Gear, 1993) GG Aleste 3 (Game Gear Micro, 2020) Senjin Aleste (Arcade, 2021) Aleste Branch (TBA) Cancelled games Dennin Aleste 2 (Mega-CD, cancelled) Related games There
https://en.wikipedia.org/wiki/Chack%27n%20Pop
is a platform arcade game developed and released by Taito in 1984. In the game, the player controls a small yellow creature, Chack'n, with the objective being to retrieve hearts from a cave, all while avoiding the enemies contained within them. Chack'n also has the ability to deploy bombs, which can kill said enemies, which can bring bonuses depending on if all or none of the enemies have been killed. It is considered to be a spiritual predecessor of Bubble Bobble due to the shared characters and similar game structure. Home ports were released for the SG-1000, MSX, Family Computer, Sharp X1, NEC PC-6001 and NEC PC-8801. The arcade version would later be included via emulation in Taito Legends Power-Up, Taito Memories Pocket, Taito Memories Gekan, and Taito Legends 2. The Family Computer version would later be re-released on the Nintendo Wii and Nintendo 3DS via Virtual Console. Gameplay Chack'n Pop is a platform arcade game. The player controls Chack'n, a small yellow creature with extendable legs, through a series of single-screen mazes. He is capable of walking on floors or ceilings but not walls. He can climb steps and traverse high walls by extending his legs until he is tall enough to pass onto the next step. He is also capable of throwing hand grenades to his left or right which, after a short period, explode into a cloud of smoke. Separate fire buttons control rolling to the left or right. Chack'n is killed if caught by the explosion cloud. He is delayed in this process by a series of solid walls. In order to get past the walls, he must free hearts from cages using his hand grenades. A further obstruction comes in the form of Monstas hatching from eggs. All or none of the Monstas in a level can be destroyed for a bonus at the end of the level. Each screen is played against a time limit, marked by a Mighta pushing a boulder along the top of the screen. Development and release Chack'n Pop was released by Taito around April 1984 in Japan, despite the copy
https://en.wikipedia.org/wiki/Insurance%20score
An insurance score – also called an insurance credit score – is a numerical point system based on select credit report characteristics. There is no direct relationship to financial credit scores used in lending decisions, as insurance scores are not intended to measure creditworthiness, but rather to predict risk. Insurance companies use insurance scores for underwriting decisions, and to partially determine charges for premiums. Insurance scores are applied in personal product lines, namely homeowners and private passenger automobile insurance, and typically not elsewhere. Background Insurance scoring models are built from selections of credit report factors, combined with insurance claim and profitability data, to produce numerical formulae or algorithms. A scoring model may be unique to an insurance company and to each line of business (e.g. homeowners or automobile), in terms of the factors selected for consideration and the weighting of the point assignments. As insurance credit scores are not intended to measure creditworthiness, they commonly focus on financial habits and choices (i.e., age of oldest account, number of inquiries in 24 months, ratio of total balance to total limits, number of open retail credit cards, number of revolving accounts with balances greater than 75% of limits, etc.) Therefore it is possible for a consumer with a high financial credit score, and excellent payment history, to receive a poor insurance score. Insurers consider credit report information in their underwriting and pricing decisions as a predictor of profitability and risk of loss. Various studies have found a strong relationship between credit-based insurance scores and profitability or risk of loss. The scores are generally most predictive when little or no other information exists, such as in the case of clean driving records, or claims-free policies; in instances where past claims, points, or other similar information exist on record, the personal histories will ty
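As a purely hypothetical illustration of a points-based scoring model of the kind described (the factors, weights, and thresholds below are invented for the example and do not represent any insurer's actual model):

def insurance_score(report):
    """Toy additive scoring model over credit-report characteristics (hypothetical)."""
    score = 500                                            # arbitrary base
    score += min(report["oldest_account_years"], 25) * 4   # reward a long account history
    score -= report["inquiries_last_24_months"] * 10       # penalize recent inquiries
    if report["balance_to_limit_ratio"] > 0.75:
        score -= 60                                        # heavily utilized revolving credit
    score -= report["open_retail_cards"] * 5
    return score

print(insurance_score({
    "oldest_account_years": 12,
    "inquiries_last_24_months": 1,
    "balance_to_limit_ratio": 0.20,
    "open_retail_cards": 2,
}))   # 528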
https://en.wikipedia.org/wiki/Topological%20divisor%20of%20zero
In mathematics, an element z of a Banach algebra A is called a topological divisor of zero if there exists a sequence x1, x2, x3, ... of elements of A such that
the sequence z xn converges to the zero element, but
the sequence xn does not converge to the zero element.
If such a sequence exists, then one may assume that ||xn|| = 1 for all n.
If A is not commutative, then z is called a "left" topological divisor of zero, and one may define "right" topological divisors of zero similarly.

Examples
If A has a unit element, then the invertible elements of A form an open subset of A, while the non-invertible elements are the complementary closed subset. Any point on the boundary between these two sets is both a left and right topological divisor of zero. In particular, any quasinilpotent element is a topological divisor of zero (e.g. the Volterra operator).
An operator on a Banach space X, which is injective, not surjective, but whose image is dense in X, is a left topological divisor of zero.

Generalization
The notion of a topological divisor of zero may be generalized to any topological algebra. If the algebra in question is not first-countable, one must substitute nets for the sequences used in the definition.

Topological algebra
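A concrete illustration (a standard example, assuming C[0,1] is taken with the supremum norm and pointwise multiplication): let z(t) = t and xn(t) = max(0, 1 − nt). Then

    ||xn|| = xn(0) = 1 for every n, so the sequence xn does not converge to zero, while
    ||z xn|| = sup{ t(1 − nt) : 0 ≤ t ≤ 1/n } = 1/(4n), which converges to zero.

Hence z is a topological divisor of zero, yet it is not an ordinary zero divisor: z·x = 0 forces x(t) = 0 for all t > 0, and hence x = 0 by continuity. Note also that z lies on the boundary of the invertible elements, since z itself is not invertible (it vanishes at 0) while z + ε is invertible for every ε > 0, in line with the first example above.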
https://en.wikipedia.org/wiki/Herbrand%20Award
The Herbrand Award for Distinguished Contributions to Automated Reasoning is an award given by the Conference on Automated Deduction (CADE), Inc. (although it predates the formal incorporation of CADE) to honour persons or groups for important contributions to the field of automated deduction. The award is named after the French scientist Jacques Herbrand and is given at most once per CADE or International Joint Conference on Automated Reasoning (IJCAR). It comes with a prize of US$1,000. Anyone can be nominated; the award is decided by a vote among CADE trustees and former recipients, usually with input from the CADE/IJCAR programme committee.

Recipients
Past award recipients are:

1990s
Larry Wos (1992)
Woody Bledsoe (1994)
John Alan Robinson (1996)
Wu Wenjun (1997)
Gérard Huet (1998)
Robert S. Boyer and J Strother Moore (1999)

2000s
William W. McCune (2000)
Donald W. Loveland (2001)
Mark E. Stickel (2002)
Peter B. Andrews (2003)
Harald Ganzinger (2004)
Martin Davis (2005)
Wolfgang Bibel (2006)
Alan Bundy (2007)
Edmund M. Clarke (2008)
Deepak Kapur (2009)

2010s
David Plaisted (2010)
Nachum Dershowitz (2011)
Melvin Fitting (2012)
C. Greg Nelson (2013)
Robert L. Constable (2014)
Andrei Voronkov (2015)
Zohar Manna and Richard Waldinger (2016)
Lawrence C. Paulson (2017)
Bruno Buchberger (2018)
Nikolaj Bjørner and Leonardo de Moura (2019)

2020s
Franz Baader (2020)
Tobias Nipkow (2021)
Natarajan Shankar (2022)

See also
List of computer science awards
Jacques Herbrand Prize — by the French Academy of Sciences, for mathematics and physics

References

External links
The Herbrand Award for Distinguished Contributions to Automated Reasoning

Artificial intelligence competitions
Computer science awards
Logic in computer science
https://en.wikipedia.org/wiki/Electron%20capture%20detector
An electron capture detector (ECD) is a device for detecting atoms and molecules in a gas through the attachment of electrons via electron capture ionization. The device was invented in 1957 by James Lovelock and is used in gas chromatography to detect trace amounts of chemical compounds in a sample.

Gas chromatograph detector
The electron capture detector is used for detecting electron-absorbing components (components of high electronegativity) such as halogenated compounds in the output stream of a gas chromatograph. The ECD uses a radioactive beta particle (electron) emitter in conjunction with a so-called makeup gas flowing through the detector chamber. The electron emitter typically consists of a metal foil holding 10 millicuries (370 MBq) of the radionuclide nickel-63 (63Ni). Usually, nitrogen is used as makeup gas, because it exhibits a low excitation energy, so it is easy to remove an electron from a nitrogen molecule. The electrons emitted from the electron emitter collide with the molecules of the makeup gas, resulting in many more free electrons. The electrons are accelerated towards a positively charged anode, generating a current; there is therefore always a background signal present in the chromatogram. As the sample is carried into the detector by the carrier gas, electron-absorbing analyte molecules capture electrons and thereby reduce the current between the collector anode and a cathode. Over a wide range of concentrations the rate of electron capture is proportional to the analyte concentration. ECDs are particularly sensitive to halogens, organometallic compounds, nitriles, or nitro compounds.

Response mechanism
It is not immediately obvious why the capture of electrons by electronegative analytes reduces the current that flows between the anode and cathode: the molecular negative ions of the analyte carry the same charge as the electrons that were captured. The key to understanding why the current decreases is to ask where charged entities can go besides being c
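A minimal sketch of that proportionality, assuming an idealized linear response: real detectors are calibrated against standards, the relationship breaks down at high concentrations, and the names and numbers below are purely illustrative.

    def concentration_from_ecd(baseline_current, measured_current, sensitivity):
        """Idealized linear ECD response: the drop in collected current is taken as
        proportional to analyte concentration. `sensitivity` is an assumed
        calibration constant (current drop per unit of concentration)."""
        current_drop = baseline_current - measured_current   # reduction caused by electron capture
        return current_drop / sensitivity

    # Hypothetical calibration: 0.125 units of current drop per unit of concentration.
    print(concentration_from_ecd(1.0, 0.75, 0.125))          # -> 2.0, in the calibration's units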
https://en.wikipedia.org/wiki/Complete%20numbering
In computability theory complete numberings are generalizations of Gödel numbering first introduced by A.I. Mal'tsev in 1963. They are studied because several important results like Kleene's recursion theorem and Rice's theorem, which were originally proven for the Gödel-numbered set of computable functions, still hold for arbitrary sets with complete numberings.

Definition
A numbering ν of a set A is called complete (with respect to an element a of A) if for every partial computable function f there exists a total computable function h so that (Ershov 1999:482)
ν(h(i)) = ν(f(i)) if f(i) is defined, and
ν(h(i)) = a otherwise.
Ershov refers to the element a as a "special" element for the numbering. A numbering ν is called precomplete if the weaker property holds:
ν(f(i)) = ν(h(i)) whenever f(i) is defined.

Examples
Any numbering of a singleton set is complete
The identity function on the natural numbers is not complete
A Gödel numbering is precomplete

References
Y.L. Ershov (1999), "Theory of numberings", Handbook of Computability Theory, E.R. Griffor (ed.), Elsevier, pp. 473–506.
A.I. Mal'tsev, Sets with complete numberings. Algebra i Logika, 1963, vol. 2, no. 2, 4–29 (Russian)

Computability theory
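To spell out the first example above (a short worked argument in the notation of the definition): if A = {a} is a singleton, then ν(n) = a for every index n. Given any partial computable f, take h to be any total computable function, for instance h(i) = i. Whether or not f(i) is defined, both sides of the completeness condition equal a, so ν is complete with respect to a.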
https://en.wikipedia.org/wiki/Colobot
Colobot (Colonize with Bots) is an educational, post-apocalyptic real-time strategy video game featuring 3D graphics, created by Swiss developer Epsitec SA. The objective of the game is to find a planet for colonization by the human race by establishing a basic infrastructure on the surface and eliminating any alien life forms endangering the expedition. The game takes place on the Earth, the Moon, and seven fictional planets. The main feature of the game, which makes it educational, is the possibility for players to program their robots using a programming language similar to C++ or Java.

Plot overview
Life on Earth is threatened by a devastating cataclysm, forcing mankind to move out and search for a new home. A first expedition, composed solely of robots, was sent to find another habitable planet. However, for unknown reasons, the mission was a disaster and never returned. With only a few robots for companions, the player must travel to new planets. Houston, the Earth Mission Control, as well as a spy satellite, will transmit valuable information to the player. The player needs to build the infrastructure necessary to gather raw materials and energy supplies, and to produce the weapons necessary for self-defence. By programming robots, the player can delegate tasks to them, allowing the player to continue the mission while the robots maintain the base, fight off enemies, harvest materials, and perform any other tasks assigned to them.

Missions
In the game, the player explores the Earth, the Moon and seven fictional planets.

Language overview
The programming language used in Colobot is CBOT, syntactically similar to C++ and Java. Example code for a bot to find a piece of titanium ore and deliver it to a purification facility:

extern void object::FetchTitanium()
{
    object item;                // declare a variable of type object
    item = radar(TitaniumOre);  // find a piece of titanium ore
    goto(item.position);        // go to the ore
    grab();                     // pick up whatever is in front of the robot (presumably the ore)
    item = radar(Converter);    // find the purification facility (category name assumed here)
    goto(item.position);        // carry the ore to it
    drop();                     // put the ore down so it can be processed
}
https://en.wikipedia.org/wiki/Interval%20exchange%20transformation
In mathematics, an interval exchange transformation is a kind of dynamical system that generalises circle rotation. The phase space consists of the unit interval, and the transformation acts by cutting the interval into several subintervals, and then permuting these subintervals. They arise naturally in the study of polygonal billiards and in area-preserving flows.

Formal definition
Let n ≥ 1 and let π be a permutation on {1, ..., n}. Consider a vector λ = (λ1, ..., λn) of positive real numbers (the widths of the subintervals), satisfying
λ1 + λ2 + ... + λn = 1.
Define a map T, called the interval exchange transformation associated with the pair (π, λ), as follows. For 1 ≤ i ≤ n, let
ai = λ1 + ... + λ(i−1), the sum of the widths of the first i − 1 subintervals (so a1 = 0),
and let a'i be the sum of those widths λj for which π(j) < π(i); thus a'i is the left endpoint of the position to which the i-th subinterval is sent.
Then for x in [0, 1], define
T(x) = x − ai + a'i
if x lies in the subinterval [ai, ai + λi). Thus T acts on each subinterval of the form [ai, ai + λi) by a translation, and it rearranges these subintervals so that the subinterval at position i is moved to position π(i).

Properties
Any interval exchange transformation T is a bijection of [0,1] to itself that preserves the Lebesgue measure. It is continuous except at a finite number of points.
The inverse of the interval exchange transformation T is again an interval exchange transformation. In fact, it is the interval exchange transformation associated with the inverse permutation π⁻¹ and the length vector λ′ whose i-th entry is the width of the subinterval π⁻¹(i).
If n = 2 and π is the transposition (1 2) (in cycle notation), and if we join up the ends of the interval to make a circle, then T is just a circle rotation. The Weyl equidistribution theorem then asserts that if the length λ1 is irrational, then T is uniquely ergodic. Roughly speaking, this means that the orbits of points of [0, 1] are uniformly evenly distributed. On the other hand, if λ1 is rational then each point of the interval is periodic, and the period is the denominator of λ1 (written in lowest terms).
If n > 2, and provided π satisfies certain non-degeneracy conditions (namely, there is no integer k with 1 ≤ k < n such that π({1, ..., k}) = {1, ..., k}), a deep theorem which was a conjecture of M. Keane and due independently to William A. Veech and to Howard Masur asserts that for almost all choices of λ in the unit simplex the interval exchange transformation T is again uniquely ergodic. However, for n ≥ 4 there also ex
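The map just defined can be computed directly from π and λ. The following is a rough sketch (the function and variable names are ours, not standard; subintervals are taken half-open on the right, as in the definition above):

    from itertools import accumulate

    def interval_exchange(pi, lam):
        # pi: permutation as a 1-based list, pi[i-1] = position to which subinterval i is moved
        # lam: widths of the subintervals, assumed to sum to 1
        n = len(lam)
        a = [0.0] + list(accumulate(lam))[:-1]                            # left endpoints a_i of [a_i, a_i + lam_i)
        widths_by_position = [lam[pi.index(k + 1)] for k in range(n)]     # width of the subinterval landing at position k+1
        left_by_position = [0.0] + list(accumulate(widths_by_position))[:-1]
        a_prime = [left_by_position[pi[i] - 1] for i in range(n)]         # left endpoint of where subinterval i+1 lands

        def T(x):
            for i in range(n):
                if a[i] <= x < a[i] + lam[i]:
                    return x - a[i] + a_prime[i]                          # translate the i-th subinterval
            return x                                                      # x = 1 is left fixed for simplicity
        return T

    # n = 2 with pi = (1 2): joining the ends of [0,1] gives a circle rotation by lam[1].
    T = interval_exchange([2, 1], [0.25, 0.75])
    x, orbit = 0.1, []
    for _ in range(4):
        x = T(x)
        orbit.append(round(x, 2))
    print(orbit)   # [0.85, 0.6, 0.35, 0.1]

With λ1 = 1/4 rational, every point returns after 4 steps, the denominator of λ1, matching the periodicity statement above.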
https://en.wikipedia.org/wiki/Creative%20and%20productive%20sets
In computability theory, productive sets and creative sets are types of sets of natural numbers that have important applications in mathematical logic. They are a standard topic in mathematical logic textbooks.

Definition and example
For the remainder of this article, assume that φ0, φ1, φ2, ... is an admissible numbering of the computable functions and Wi the corresponding numbering of the recursively enumerable sets.
A set A of natural numbers is called productive if there exists a total recursive (computable) function f so that for all i, if Wi ⊆ A then f(i) ∈ A ∖ Wi. The function f is called the productive function for A.
A set A of natural numbers is called creative if A is recursively enumerable and its complement is productive. Not every productive set has a recursively enumerable complement, however, as illustrated below.

The archetypal creative set is K = {i : i ∈ Wi}, the set representing the halting problem. Its complement K̄ = {i : i ∉ Wi} is productive with productive function f(i) = i (the identity function). To see this, we apply the definition of a productive function and show separately, assuming Wi ⊆ K̄, that i ∈ K̄ and i ∉ Wi:
i ∈ K̄: suppose i ∈ K; then i ∈ Wi, and given that Wi ⊆ K̄ we have i ∈ K̄, which leads to a contradiction. So i ∈ K̄.
i ∉ Wi: in fact if i ∈ Wi, then it would be true that i ∈ K, but we have demonstrated the contrary in the previous point. So i ∉ Wi.

Properties
No productive set A can be recursively enumerable, because whenever A contains every number in an r.e. set Wi it contains other numbers, and moreover there is an effective procedure to produce an example of such a number from the index i. Similarly, no creative set can be decidable, because this would imply that its complement, a productive set, is recursively enumerable.
Any productive set has a productive function that is injective and total.
The following theorems, due to Myhill (1955), show that in a sense all creative sets are like K and all productive sets are like K̄.
Theorem. Let P be a set of natural numbers. The following are equivalent:
P is productive.
K̄ is 1-reducible to P.
K̄ is m-reducible