https://en.wikipedia.org/wiki/Film%20frame
In filmmaking, video production, animation, and related fields, a frame is one of the many still images which compose the complete moving picture. The term is derived from the historical development of film stock, in which the sequentially recorded single images look like a framed picture when examined individually. The term may also be used more generally, as a noun or verb, to refer to the edges of the image as seen in a camera viewfinder or projected on a screen. Thus, the camera operator can be said to keep a car in frame by panning with it as it speeds past. Overview When the moving picture is displayed, each frame is flashed on a screen for a short time (nowadays, usually 1/24, 1/25 or 1/30 of a second) and then immediately replaced by the next one. Persistence of vision blends the frames together, producing the illusion of a moving image. The frame is also sometimes used as a unit of time, so that a momentary event might be said to last six frames; the actual duration depends on the frame rate of the system, which varies according to the video or film standard in use. In North America and Japan, 30 frames per second (fps) is the broadcast standard, with 24 frames/s now common in production for high-definition video shot to look like film. In much of the rest of the world, 25 frames/s is standard. In systems historically based on NTSC standards, for reasons originally related to the color subcarrier frequency of NTSC color TV systems, the exact frame rate is actually (3579545 / 227.5) / 525 = 29.97002616 fps. This leads to many synchronization problems which are unknown outside the NTSC world, and also brings about hacks such as drop-frame timecode. In film projection, 24 fps is the norm, except in some special venue systems, such as IMAX, Showscan and Iwerks 70, where 30, 48 or even 60 frame/s have been used. Silent films and 8 mm amateur movies used 16 or 18 frame/s. Physical film frames In a strip of movie film, individual frames are separated by frame lines. Normal
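The exact NTSC rate quoted above follows from simple arithmetic on the color subcarrier: 227.5 subcarrier cycles per scan line and 525 lines per frame. A quick check (Python; illustrative only):

# NTSC color frame rate derived from the 3.579545 MHz color subcarrier:
# 227.5 subcarrier cycles per scan line, 525 scan lines per frame.
subcarrier_hz = 3_579_545
line_rate_hz = subcarrier_hz / 227.5   # about 15734.27 lines per second
frame_rate = line_rate_hz / 525        # about 29.97002616 frames per second
print(f"{frame_rate:.8f}")             # 29.97002616

# A momentary event lasting "six frames" at this rate:
print(6 / frame_rate)                  # about 0.2 seconds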
https://en.wikipedia.org/wiki/C%20Sharp%20%28programming%20language%29
C# (pronounced "see sharp") is a general-purpose high-level programming language supporting multiple paradigms. C# encompasses static typing, strong typing, lexical scoping, and imperative, declarative, functional, generic, object-oriented (class-based), and component-oriented programming disciplines. The C# programming language was designed by Anders Hejlsberg from Microsoft in 2000 and was later approved as an international standard by Ecma (ECMA-334) in 2002 and ISO/IEC (ISO/IEC 23270) in 2003. Microsoft introduced C# along with .NET Framework and Visual Studio, both of which were closed-source. At the time, Microsoft had no open-source products. Four years later, in 2004, a free and open-source project called Mono began, providing a cross-platform compiler and runtime environment for the C# programming language. A decade later, Microsoft released Visual Studio Code (code editor), Roslyn (compiler), and the unified .NET platform (software framework), all of which support C# and are free, open-source, and cross-platform. Mono also joined Microsoft but was not merged into .NET. As of 2022, the most recent stable version of the language is C# 11.0, released that year with .NET 7.0. Design goals The Ecma standard lists these design goals for C#: The language is intended to be a simple, modern, general-purpose, object-oriented programming language. The language, and implementations thereof, should provide support for software engineering principles such as strong type checking, array bounds checking, detection of attempts to use uninitialized variables, and automatic garbage collection. Software robustness, durability, and programmer productivity are important. The language is intended for use in developing software components suitable for deployment in distributed environments. Portability is very important for source code and programmers, especially those already familiar with C and C++. Support for internationalization is very important. C# is intended to be suitable for wr
https://en.wikipedia.org/wiki/MS-CHAP
MS-CHAP is the Microsoft version of the Challenge-Handshake Authentication Protocol (CHAP). Versions The protocol exists in two versions, MS-CHAPv1 (defined in RFC 2433) and MS-CHAPv2 (defined in RFC 2759). MS-CHAPv2 was introduced with pptp3-fix, which was included in Windows NT 4.0 SP4, and was added to Windows 98 in the "Windows 98 Dial-Up Networking Security Upgrade Release" and to Windows 95 in the "Dial Up Networking 1.3 Performance & Security Update for MS Windows 95" upgrade. Windows Vista dropped support for MS-CHAPv1. Applications MS-CHAP is used as one authentication option in Microsoft's implementation of the PPTP protocol for virtual private networks. It is also used as an authentication option with RADIUS servers which are used with IEEE 802.1X (e.g., WiFi security using the WPA-Enterprise protocol). It is further used as the main authentication option of the Protected Extensible Authentication Protocol (PEAP). Features Compared with CHAP, MS-CHAP works by negotiating CHAP Algorithm 0x80 (0x81 for MS-CHAPv2) in LCP option 3, Authentication Protocol. It provides an authenticator-controlled password change mechanism, provides an authenticator-controlled authentication retry mechanism, and defines failure codes returned in the Failure packet message field. MS-CHAPv2 provides mutual authentication between peers by piggybacking a peer challenge on the response packet and an authenticator response on the success packet. MS-CHAP requires each peer to either know the plaintext password or an MD4 hash of the password, and does not transmit the password over the link. As such, it is not compatible with most password storage formats. Flaws Weaknesses have been identified in MS-CHAP and MS-CHAPv2. The DES encryption used in NTLMv1 and MS-CHAPv2 to encrypt the NTLM password hash enables custom hardware attacks utilizing the method of brute force. As of 2012, MS-CHAP had been completely broken. After Windows 11 22H2, with the default activation of Windo
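The 2012 break exploits how MS-CHAPv2 stretches the 16-byte NT hash into DES keys. A minimal sketch of that key-splitting step, following RFC 2759 (Python; nt_hash is a placeholder value, and the actual DES encryptions are omitted):

# MS-CHAPv2 response structure (key derivation only; DES itself omitted).
nt_hash = bytes(16)             # placeholder for MD4(UTF-16LE(password))
padded = nt_hash + bytes(5)     # zero-pad the 16-byte hash to 21 bytes
k1, k2, k3 = padded[0:7], padded[7:14], padded[14:21]
# Each 7-byte value is used as a single-DES key to encrypt the same
# 8-byte challenge hash; the three 8-byte ciphertexts are concatenated
# into the 24-byte response. Note that k3 contains only 2 unknown bytes,
# so recovering the NT hash reduces to roughly one exhaustive DES search.
assert len(k1 + k2 + k3) == 21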
https://en.wikipedia.org/wiki/Borel%20summation
In mathematics, Borel summation is a summation method for divergent series, introduced by Émile Borel in 1899. It is particularly useful for summing divergent asymptotic series, and in some sense gives the best possible sum for such series. There are several variations of this method that are also called Borel summation, and a generalization of it called Mittag-Leffler summation. Definition There are (at least) three slightly different methods called Borel summation. They differ in which series they can sum, but are consistent, meaning that if two of the methods sum the same series they give the same answer. Throughout, let A(z) = Σ_{k≥0} a_k z^k denote a formal power series, and define the Borel transform of A to be its equivalent exponential series BA(t) = Σ_{k≥0} (a_k / k!) t^k. Borel's exponential summation method Let A_n(z) = Σ_{k=0}^{n} a_k z^k denote the partial sum. A weak form of Borel's summation method defines the Borel sum of A to be lim_{t→∞} e^{-t} Σ_{n≥0} (A_n(z) / n!) t^n. If this converges at z to some function a(z), we say that the weak Borel sum of A converges at z, and write Σ a_k z^k = a(z) (wB). Borel's integral summation method Suppose that the Borel transform converges for all positive real numbers to a function growing sufficiently slowly that the following integral is well defined (as an improper integral); the Borel sum of A is given by ∫_0^∞ e^{-t} BA(tz) dt. If the integral converges at z to some a(z), we say that the Borel sum of A converges at z, and write Σ a_k z^k = a(z) (B). Borel's integral summation method with analytic continuation This is similar to Borel's integral summation method, except that the Borel transform need not converge for all t, but converges to an analytic function of t near 0 that can be analytically continued along the positive real axis. Basic properties Regularity The methods (B) and (wB) are both regular summation methods, meaning that whenever A(z) converges (in the standard sense), then the Borel sum and weak Borel sum also converge, and do so to the same value. Regularity of (B) is easily seen by a change in order of integration, which is valid due to absolute convergence: if A is convergent at z, then a(z) = Σ_{k≥0} a_k z^k = Σ_{k≥0} (a_k z^k / k!) ∫_0^∞ e^{-t} t^k dt = ∫_0^∞ e^{-t} BA(tz) dt, where the rightmost exp
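A standard worked example of the integral method (classical material, added here for concreteness): Euler's factorial series, whose Borel transform is a geometric series. In LaTeX:

\[
A(z) = \sum_{k=0}^{\infty} (-1)^k k!\, z^k
\qquad\Longrightarrow\qquad
\mathcal{B}A(t) = \sum_{k=0}^{\infty} (-t)^k = \frac{1}{1+t},
\]
\[
\sum_{k=0}^{\infty} (-1)^k k!\, z^k \;=\; \int_0^{\infty} \frac{e^{-t}}{1+tz}\, dt \quad (\mathbf{B}), \qquad z \ge 0,
\]

so a series that diverges for every nonzero z is assigned a finite, well-behaved Borel sum.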
https://en.wikipedia.org/wiki/Simple%20Gateway%20Monitoring%20Protocol
Simple Gateway Monitoring Protocol (SGMP), defined in RFC 1028, allows commands to be issued to application protocol entities to set or retrieve values (integer or octet string types) for use in monitoring the gateways on which the application protocol entities reside. Messages are exchanged using UDP, an unreliable transport. Authentication takes place on UDP port 153. Some examples of things that can be monitored are listed below. Network Type for interfaces: IEEE 802.3 MAC, IEEE 802.4 MAC, IEEE 802.5 MAC, Ethernet, ProNET-80, ProNET-10, FDDI, X.25, Point-to-Point Serial, ARPA 1822 HDH, ARPA 1822, AppleTalk, StarLAN Interface Status (down, up, attempting, etc.) Route Type (local, remote, sub-network, etc.) Routing Protocol (RIP, EGP, GGP, IGRP, Hello) The protocol was replaced by SNMP (Simple Network Management Protocol). Sources RFC 1028 Network protocols
https://en.wikipedia.org/wiki/Moxi%20%28DVR%29
Moxi was a line of high-definition digital video recorders produced by Moxi Digital, Digeo, and Arris International. Moxi was originally released only to cable operators, but in December 2008 it was released as a retail product. Moxi was removed from the market in November 2011. The former retail product, the Moxi HD DVR, provided a high-definition user interface with support for either two or three CableCARD TV tuners. Arris also offered a companion appliance, the Moxi Mate, which could stream live or recorded TV from a Moxi HD DVR. History Digeo was founded in 1999 (originally under the name Broadband Partners, Inc.) by Microsoft co-founder Paul Allen, with headquarters in Kirkland, Washington. In the same year, Rearden Steel was started by Steve Perlman, founder of WebTV, under a veil of secrecy. In 2000, Rearden Steel was renamed to Moxi Digital while unveiling a line of media centers designed to bridge the gap between personal computers and televisions. Digeo, Inc. purchased Moxi Digital in 2002. Digeo kept its own name but adopted Moxi as its product family name. Its Palo Alto offices and most of Moxi Digital's staff were kept. Digeo also adopted most of the Moxi hardware (originally focused on satellite consumer electronics), as well as some of the Linux extensions, which were merged into Digeo's own Linux-based infrastructure and cable-specific hardware, topped by Digeo's Emmy award-winning user interface, known as the Moxi Menu. On September 22, 2009, the assets of Digeo, Inc. were purchased by Arris International. Arris announced it would continue to develop and market the Moxi product line to both retail customers and cable operators. Retail DVR products The Moxi HD DVR was a high-definition digital video recorder (DVR) with both three-tuner and two-tuner models available, though the two-tuner model was produced only briefly before being updated. It was designed for use with cable television and supported multi-stream CableCARDs, as well as channel scanning f
https://en.wikipedia.org/wiki/End%20system
In networking jargon, a computer, phone, or internet of things device connected to a computer network is sometimes referred to as an end system or end station, because it sits at the edge of the network. The end user directly interacts with an end system that provides information or services. End systems that are connected to the Internet are also referred to as internet hosts; this is because they host (run) internet applications such as a web browser or an email retrieval program. The Internet's end systems include some computers with which the end user does not directly interact. These include mail servers, web servers, or database servers. With the emergence of the internet of things, household items (such as toasters and refrigerators) as well as portable, handheld computers and digital cameras are all being connected to the internet as end systems. End systems are generally connected to each other using switching devices known as routers rather than using a single communication link. The path that transmitted information takes from the sending end system, through a series of communications links and routers, to the receiving end system is known as a route or path through the network. The sending and receiving route can be different, and can be reallocated during transmission due to changes in the network topology. Normally the cheapest or fastest route is chosen. For the end user the actual routing should be completely transparent. See also Communication endpoint Data terminal equipment Edge device End instrument Host (network) Node (networking) Terminal (telecommunication) References Computing terminology Internet architecture
https://en.wikipedia.org/wiki/Request%20Tracker
Request Tracker, commonly abbreviated to RT, is an open source tool for organizations of all sizes to track and manage workflows, customer requests, and internal project tasks of all sorts, with seamless email integration, custom ticket lifecycles, configurable automation, and detailed permissions and roles. Request Tracker began as ticket-tracking software written in Perl used to coordinate tasks and manage requests among an online community of users. RT's first release in 1996 was written by Jesse Vincent, who later formed Best Practical Solutions LLC to distribute, develop, and support the package. RT is open source (FOSS) and distributed under the GNU General Public License. Request Tracker for Incident Response (RTIR) is a special distribution of RT to fulfill the specific needs of CERT teams. At this point, RTIR is at once a tool specific to incident management, a general-purpose tool teams can use for other tasks, and a tool that can be, and very often is, fully customized through layers of user integrations and customizations. It was initially developed in cooperation with JANET-CERT, and in 2006 was upgraded and expanded with joint funding from nine Computer Security Incident Response Teams (CSIRTs) in Europe. Technology RT is written in Perl and runs on the Apache and lighttpd web servers using mod_perl or FastCGI, with data stored in either MySQL, PostgreSQL, Oracle or SQLite. It is possible to extend the RT interface using plug-ins written in Perl. History Jesse Vincent, while enrolled at Wesleyan University in 1994, worked for Wesleyan's computing help desk and was responsible for improving the help desk and residential networking software infrastructure. This task included setting up a ticketing system for the help desk. Initially he set up a Linux server to run "req", but later he identified that the command line interface was limiting usage. Over the next two years he created and maintained WebReq, a web-based interface for re
https://en.wikipedia.org/wiki/Odometry
Odometry is the use of data from motion sensors to estimate change in position over time. It is used in robotics by some legged or wheeled robots to estimate their position relative to a starting location. This method is sensitive to errors due to the integration of velocity measurements over time to give position estimates. Rapid and accurate data collection, instrument calibration, and processing are required in most cases for odometry to be used effectively. The word odometry is composed of the Greek words odos (meaning "route") and metron (meaning "measure"). Example Suppose a robot has rotary encoders on its wheels or on its legged joints. It drives forward for some time and then would like to know how far it has traveled. It can measure how far the wheels have rotated, and if it knows the circumference of its wheels, compute the distance. Train operations are also frequent users of odometry. Typically, a train gets an absolute position by passing over stationary sensors in the tracks, while odometry is used to calculate relative position while the train is between the sensors. More sophisticated example Suppose that a simple robot has two wheels which can both move forward or reverse and that they are positioned parallel to one another, and equidistant from the center of the robot. Further, assume that each motor has a rotary encoder, and so one can determine if either wheel has traveled one "unit" forward or reverse along the floor. This unit is the ratio of the circumference of the wheel to the resolution of the encoder. If the left wheel were to move forward one unit while the right wheel remained stationary, then the right wheel acts as a pivot, and the left wheel traces a circular arc in the clockwise direction. Since one's unit of distance is usually tiny, one can approximate by assuming that this arc is a line. Thus, the original position of the left wheel, the final position of the left wheel, and the position of the right wheel form a triang
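The two-wheel geometry described above reduces to a short pose-update routine. A minimal sketch (Python; the wheel_base parameter and the encoder-derived distances d_left and d_right are assumed inputs, not part of the excerpt):

import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Advance a differential-drive pose estimate by one encoder step.

    d_left/d_right are the distances each wheel covered (encoder ticks
    times the distance per tick); wheel_base is the wheel separation.
    """
    d_center = (d_left + d_right) / 2.0        # distance of robot center
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Approximate the short arc as a straight segment at the mid-heading.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Example: left wheel stationary, right wheel one small unit forward,
# so the robot pivots about the left wheel (counterclockwise here).
print(update_pose(0.0, 0.0, 0.0, 0.0, 0.01, 0.5))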
https://en.wikipedia.org/wiki/Mathematical%20methods%20in%20electronics
Mathematical methods are integral to the study of electronics. Mathematics in electronics Electronics engineering careers usually include courses in calculus (single and multivariable), complex analysis, differential equations (both ordinary and partial), linear algebra and probability. Fourier analysis and Z-transforms are also subjects which are usually included in electrical engineering programs. The Laplace transform can simplify computing RLC circuit behaviour. Basic applications A number of electrical laws apply to all electrical networks. These include Faraday's law of induction: any change in the magnetic environment of a coil of wire will cause a voltage (emf) to be "induced" in the coil. Gauss's law: the total electric flux out of a closed surface is equal to the charge enclosed divided by the permittivity. Kirchhoff's current law: the sum of all currents entering a node is equal to the sum of all currents leaving the node; equivalently, the total current at a junction sums to zero. Kirchhoff's voltage law: the directed sum of the electrical potential differences around a circuit must be zero. Ohm's law: the voltage across a resistor is the product of its resistance and the current flowing through it, at constant temperature. Norton's theorem: any two-terminal collection of voltage sources and resistors is electrically equivalent to an ideal current source in parallel with a single resistor. Thévenin's theorem: any two-terminal combination of voltage sources and resistors is electrically equivalent to a single voltage source in series with a single resistor. Millman's theorem: the voltage on the ends of branches in parallel is equal to the sum of the currents flowing in every branch divided by the total equivalent conductance. See also Analysis of resistive circuits. Circuit analysis is the study of methods to solve linear systems for an unknown variable. Components There are many electronic components currently used and they all have thei
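Several of these laws combine in nodal analysis, where Kirchhoff's current law plus Ohm's law yield a linear system in the node voltages. A small illustrative sketch (Python with NumPy; the two-node resistor network and its component values are invented for the example):

import numpy as np

# Hypothetical network: a 1 A source drives node 1; R1 ties node 1 to
# ground, R2 joins nodes 1 and 2, R3 ties node 2 to ground.
R1, R2, R3 = 100.0, 50.0, 200.0
G = np.array([
    [1/R1 + 1/R2, -1/R2],        # KCL at node 1
    [-1/R2,       1/R2 + 1/R3],  # KCL at node 2
])
I = np.array([1.0, 0.0])         # injected currents (amperes)
v = np.linalg.solve(G, I)        # node voltages from G v = I
print(v)                         # approx. [71.43, 57.14] volts

As a check, KCL at node 1 holds: 71.43/100 + (71.43 - 57.14)/50 = 1 A.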
https://en.wikipedia.org/wiki/RCA%20Lyra
Lyra is a series of MP3 and portable media players (PMP). Initially it was developed and sold by Indianapolis-based Thomson Consumer Electronics Inc., a part of Thomson Multimedia, from 1999 under its RCA brand in the United States and under the Thomson brand in Europe. There were also RCA/Thomson PMPs without the Lyra name, such as the RCA Kazoo (RD1000), RCA Opal and RCA Perl. In January 2008, Thomson sold its consumer electronics business, including the RCA brand and Lyra line, to Audiovox. RCA-branded PMPs are still being made today in its domestic market but no longer under the Lyra name. The Lyra was an early pioneer in digital audio players, although in later years most of its output was OEM products. Players Lyra (RD2201/RD2204) The first Lyra was released in 1999 as a CompactFlash (CF) based player. It was sold in two models: the RD2201 with a 32 MB CF card ($199.99 list price), and the RD2204 (sold as the Thomson PDP2201 outside the U.S.) with a 64 MB CF card ($249.99 list price). It was the first MP3 player that could be updated through software downloads. The Lyra was developed in partnership between Thomson Multimedia and RealNetworks: it has integration with the RealJukebox Windows software and, alongside encrypted MP3, can also play Real's G2 format audio files. A later firmware also allows WMA format playback. It has a 1" × 3/4", 6-line, backlit monochrome display which at that time (1999-2000) was relatively large. It has a software-based five-band graphic equalizer, and an external power jack. This series of players requires a proprietary CF reader used in conjunction with specific media players in Windows in order to write files to the card. A supported setup would take a blank CF card, recognize the correct reader attached to the PC, and then, while syncing songs to the device, convert them to an encrypted version of RealAudio, MP3, MP3Pro, and later WMA format that is unrecognizable to any other device. It also drops a folder titled 'Pmp' onto
https://en.wikipedia.org/wiki/Electrorotation
Electrorotation is the circular movement of an electrically polarized particle. Similar to the slip of an electric motor, it can arise from a phase lag between an applied rotating electric field and the respective relaxation processes and may thus be used to investigate the processes or, if these are known or can be accurately described by models, to determine particle properties. The method is popular in cellular biophysics, as it allows measuring cellular properties like conductivity and permittivity of cellular compartments and their surrounding membranes. See also Dielectric relaxation Dielectrophoresis Membrane potential Biophysics Electric and magnetic fields in matter
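For a spherical particle, the phase lag described above is conventionally captured by the imaginary part of the Clausius-Mossotti factor. The standard rotating-field torque expression from the electrorotation literature (not given in the excerpt, supplied here for concreteness) is, in LaTeX:

\[
\langle \Gamma_{\mathrm{ROT}} \rangle(\omega) = -4\pi \varepsilon_m r^3\, \mathrm{Im}\!\left[f_{\mathrm{CM}}(\omega)\right] |E|^2,
\qquad
f_{\mathrm{CM}}(\omega) = \frac{\varepsilon_p^* - \varepsilon_m^*}{\varepsilon_p^* + 2\varepsilon_m^*},
\qquad
\varepsilon^* = \varepsilon - i\,\frac{\sigma}{\omega},
\]

where r is the particle radius, the subscripts m and p denote the suspending medium and the particle, and sigma is conductivity. Fitting measured rotation spectra against Im[f_CM] is how the cellular conductivities and permittivities mentioned above are extracted.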
https://en.wikipedia.org/wiki/COPS%20%28software%29
The Computer Oracle and Password System (COPS) was the first vulnerability scanner for Unix operating systems to achieve widespread use. It was created by Dan Farmer while he was a student at Purdue University. Gene Spafford helped Farmer start the project in 1989. Features COPS is a software suite comprising at least 12 small vulnerability scanners, each programmed to audit one part of the operating system: File permissions, including device permissions/nodes Password strength Content, format, and security of password and group files (e.g., passwd) Programs and files run in /etc/rc* and cron(tab) files Root-SUID files: Which users can modify them? Are they shell scripts? A cyclic redundancy check of important files Writability of users' home directories and startup files Anonymous FTP configuration Unrestricted TFTP, decode alias in sendmail, SUID uudecode problems, hidden shells inside inetd.conf, rexd in inetd.conf Various root checks: Is the current directory in the search path? Is there a plus sign ("+") in the /etc/host.equiv file? Are NFS mounts unrestricted? Is root in /etc/ftpusers? Compare the modification dates of crucial files with dates of advisories from the CERT Coordination Center Kuang expert system After COPS, Farmer developed another vulnerability scanner called SATAN (Security Administrator Tool for Analyzing Networks). COPS is generally considered obsolete, but it is not uncommon to find systems which are set up in an insecure manner that COPS will identify. References External links COPS Citeseer entry for the COPS Usenix paper 1989 software Linux security software Unix security-related software
https://en.wikipedia.org/wiki/UniFLEX
UniFLEX is a Unix-like operating system developed by Technical Systems Consultants (TSC) for the Motorola 6809 family which allowed multitasking and multiprocessing. It was released for Motorola 6809 based computers with DMA-capable 8" floppy drives and extended memory addressing hardware (software-controlled 4 KiB paging of up to 768 KiB RAM). Examples included machines from SWTPC, Gimix and Goupil (France). On SWTPC machines, UniFLEX also supported a 20 MB, 14" hard drive (OEM'd from Century Data Systems) in 1979. Later on, it also supported larger 14" drives (up to 80 MB), 8" hard drives, and 5-1/4" floppies. In 1982 other machines also supported the first widely available 5-1/4" hard disks using the ST506 interface, such as the 5 MB BASF 6182 and the removable SyQuest SQ306RD of the same capacity. Due to the limited address space of the 6809 (64 KiB) and hardware limitations, the main memory space for the UniFLEX kernel as well as for any running process had to be smaller than 56 KiB of code plus data (processes could be up to 64 KiB minus 512 bytes). This was achieved by writing the kernel and most user space code entirely in assembly language, and by removing a few classic Unix features, such as group permissions for files. Otherwise, UniFLEX was very similar to Unix Version 7, though some command names were slightly different. There was no technical reason for the renaming apart from achieving some level of command-level compatibility with its single-user sibling FLEX. By simply restoring the Unix-style names, a considerable degree of "Unix look & feel" could be established, though due to memory limitations the command line interpreter (shell) was less capable than the Bourne shell known from Unix Version 7. Memory management included swapping to a dedicated portion of the system disk (even on floppies), but only whole processes could be swapped in and out, not individual pages. This made swapping a very big hit on system responsiveness, so memory had to be sized appropriate
https://en.wikipedia.org/wiki/Enterprise%20risk%20management
Enterprise risk management (ERM) in business includes the methods and processes used by organizations to manage risks and seize opportunities related to the achievement of their objectives. ERM provides a framework for risk management, which typically involves identifying particular events or circumstances relevant to the organization's objectives (threats and opportunities), assessing them in terms of likelihood and magnitude of impact, determining a response strategy, and monitoring the process. By identifying and proactively addressing risks and opportunities, business enterprises protect and create value for their stakeholders, including owners, employees, customers, regulators, and society overall. ERM can also be described as a risk-based approach to managing an enterprise, integrating concepts of internal control, the Sarbanes–Oxley Act, data protection and strategic planning. ERM is evolving to address the needs of various stakeholders, who want to understand the broad spectrum of risks facing complex organizations to ensure they are appropriately managed. Regulators and debt rating agencies have increased their scrutiny of the risk management processes of companies. According to Thomas Stanton of Johns Hopkins University, the point of enterprise risk management is not to create more bureaucracy, but to facilitate discussion on what the really big risks are. ERM frameworks defined There are various important ERM frameworks, each of which describes an approach for identifying, analyzing, responding to, and monitoring risks and opportunities within the internal and external environment facing the enterprise. Management selects a risk response strategy for specific risks identified and analyzed, which may include: Avoidance: exiting the activities giving rise to risk Reduction: taking action to reduce the likelihood or impact related to the risk Alternative actions: deciding on and considering other feasible steps to minimize risks Share or insure: transfer
https://en.wikipedia.org/wiki/Valentin%20Turchin
Valentin Fyodorovich Turchin (14 February 1931, Podolsk – 7 April 2010, Oakland, New Jersey) was a Soviet and American physicist, cybernetician, and computer scientist. He developed the Refal programming language, the theory of metasystem transitions and the notion of supercompilation. He was a pioneer in artificial intelligence and a proponent of the global brain hypothesis. Biography Turchin was born in 1931 in Podolsk, Soviet Union. In 1952, he graduated from Moscow University in theoretical physics, and got his Ph.D. in 1957. After working on neutron and solid-state physics at the Institute for Physics of Energy in Obninsk, in 1964 he accepted a position at the Keldysh Institute of Applied Mathematics in Moscow. There he worked on statistical regularization methods and authored REFAL, one of the first AI languages and the AI language of choice in the Soviet Union. In the 1960s, Turchin became politically active. In fall 1968, he wrote the pamphlet The Inertia of Fear, which was quite widely circulated in samizdat; from 1976 it circulated in Moscow under the title The Inertia of Fear: Socialism and Totalitarianism. Following its publication in the underground press, he lost his research laboratory. In 1970 he authored The Phenomenon of Science, a grand cybernetic meta-theory of universal evolution, which broadened and deepened the earlier work. By 1973, Turchin had founded the Moscow chapter of Amnesty International with Andrey Tverdokhlebov and was working closely with the well-known physicist and Soviet dissident Andrei Sakharov. In 1974 he lost his position at the Institute, and was persecuted by the KGB. Facing almost certain imprisonment, he and his family were forced to emigrate from the Soviet Union in 1977. He went to New York, where he joined the faculty of the City University of New York in 1979. In 1990, together with Cliff Joslyn and Francis Heylighen, he founded the Principia Cybernetica Project, a worldwide organiz
https://en.wikipedia.org/wiki/Windows%20Server%202008
Windows Server 2008, codenamed "Longhorn Server", is the fourth release of the Windows Server operating system produced by Microsoft as part of the Windows NT family of the operating systems. It was released to manufacturing on February 4, 2008, and generally to retail on February 27, 2008. Derived from Windows Vista, Windows Server 2008 is the successor of Windows Server 2003 and the predecessor to Windows Server 2008 R2. Windows Server 2008 removed support for processors without ACPI. It is the first version of Windows Server that includes Hyper-V and is also the final version of Windows Server that supports IA-32-based processors (also known as 32-bit processors). Its successor, Windows Server 2008 R2, requires a 64-bit processor in any supported architecture (x86-64 for x86 and Itanium). As of July 2019, 60% of Windows servers were running Windows Server 2008. History Microsoft had released Windows Vista to mixed reception, and their last Windows Server release was based on Windows XP. The operating system's working title was Windows Server Codename "Longhorn", but was later changed to Windows Server 2008 when Microsoft chairman Bill Gates announced it during his keynote address at WinHEC on May 16, 2007. Beta 1 was released on July 27, 2005; Beta 2 was announced and released on May 23, 2006, at WinHEC 2006 and Beta 3 was released publicly on April 25, 2007. Release Candidate 0 was released to the general public on September 24, 2007 and Release Candidate 1 was released to the general public on December 5, 2007. Windows Server 2008 was released to manufacturing on February 4, 2008, and officially launched on the 27th of that month. Features Windows Server 2008 is built from the same codebase as Windows Vista and thus it shares much of the same architecture and functionality. Since the codebase is common, Windows Server 2008 inherits most of the technical, security, management and administrative features new to Windows Vista such as the rewritten networki
https://en.wikipedia.org/wiki/Reconfigurable%20manufacturing%20system
A reconfigurable manufacturing system (RMS) is one designed at the outset for rapid change in its structure, as well as its hardware and software components, in order to quickly adjust its production capacity and functionality within a part family in response to sudden market changes or intrinsic system change. From 1996 to 2007, Yoram Koren received an NSF grant of $32.5 million to develop the RMS science base and its software and hardware tools, which were implemented in automotive, aerospace, and engine factories. The term reconfigurability in manufacturing was likely coined by Kusiak and Lee. The RMS, as well as one of its components, the reconfigurable machine tool (RMT), were invented in 1998 in the Engineering Research Center for Reconfigurable Manufacturing Systems (ERC/RMS) at the University of Michigan College of Engineering. The RMS goal is summarized by the statement: "Exactly the capacity and functionality needed, exactly when needed". Ideal reconfigurable manufacturing systems possess six core RMS characteristics: modularity, integrability, customized flexibility, scalability, convertibility, and diagnosability. A typical RMS will have several of these characteristics, though not necessarily all. When possessing these characteristics, RMS increases the speed of responsiveness of manufacturing systems to unpredicted events, such as sudden market demand changes or unexpected machine failures. The RMS facilitates a quick production launch of new products, and allows for adjustment of production quantities that might unexpectedly vary. The ideal reconfigurable system provides exactly the functionality and production capacity needed, and can be economically adjusted exactly when needed. These systems are designed and operated according to Yoram Koren's RMS principles. The components of RMS are CNC machines, reconfigurable machine tools, reconfigurable inspection machines and material transport systems (such as gantries and conveyors) that connect th
https://en.wikipedia.org/wiki/ThreadX
Azure RTOS ThreadX is a highly deterministic, embedded real-time operating system (RTOS) programmed mostly in the language C. Overview ThreadX was originally developed and marketed by Express Logic of San Diego, California, United States. The author of ThreadX is William Lamie, who was also the original author of the Nucleus RTOS in 1990. William Lamie was President and CEO of Express Logic. Express Logic was purchased for an undisclosed sum by Microsoft on April 18, 2019. The name ThreadX is derived from the threads that are used as the executable elements, and the letter X represents context switching, i.e., it switches threads. ThreadX provides priority-based, preemptive scheduling, fast interrupt response, memory management, interthread communication, mutual exclusion, event notification, and thread synchronization features. Major distinguishing technology characteristics of ThreadX include preemption-threshold, priority inheritance, efficient timer management, fast software timers, picokernel design, event-chaining, and small size: minimal size on an ARM architecture processor is about 2 KB. ThreadX supports multi-core processor environments via either asymmetric multiprocessing (AMP) or symmetric multiprocessing (SMP). Application thread isolation with memory management unit (MMU) or memory protection unit (MPU) memory protection is available with ThreadX Modules. ThreadX has extensive safety certifications from Technischer Überwachungsverein (TÜV, English: Technical Inspection Association) and UL (formerly Underwriters Laboratories) and is Motor Industry Software Reliability Association MISRA C compliant. ThreadX is the foundation of Express Logic's X-Ware Internet of things (IoT) platform, which also includes embedded file system support (FileX), embedded UI support (GUIX), embedded Internet protocol suite (TCP/IP) and cloud connectivity (NetX/NetX Duo), and Universal Serial Bus (USB) support (USBX). ThreadX has won high appraisal from developers and i
https://en.wikipedia.org/wiki/List%20of%20proteins
Proteins are a class of macromolecular organic compounds that are essential to life. They consist of a long polypeptide chain that usually adopts a single stable three-dimensional structure. They fulfill a wide variety of functions, including providing structural stability to cells, catalyzing chemical reactions that produce or store energy or synthesize other biomolecules (including nucleic acids and proteins), transporting essential nutrients, and serving other roles such as signal transduction. They are selectively transported to various compartments of the cell or, in some cases, secreted from the cell. This list aims to organize information on how proteins are most often classified: by structure, by function, or by location. Structure Proteins may be classified as to their three-dimensional structure (also known as protein fold). The two most widely used classification schemes are: CATH database Structural Classification of Proteins database (SCOP) Both classification schemes are based on a hierarchy of fold types. At the top level are all alpha proteins (domains consisting of alpha helices), all beta proteins (domains consisting of beta sheets), and mixed alpha helix/beta sheet proteins. While most proteins adopt a single stable fold, a few proteins can rapidly interconvert between one or more folds. These are referred to as metamorphic proteins. Finally, other proteins appear not to adopt any stable conformation and are referred to as intrinsically disordered. Proteins frequently contain two or more domains, each having a different fold, separated by intrinsically disordered regions. These are referred to as multi-domain proteins. Function Proteins may also be classified based on their cellular function. A widely used classification is the PANTHER (protein analysis through evolutionary relationships) classification system. Structural: structural proteins. Catalytic: enzymes classified according to their Enzyme Commission number (EC). Note that strictly speaki
https://en.wikipedia.org/wiki/IDEF0
IDEF0, a compound acronym ("Icam DEFinition for Function Modeling", where ICAM is an acronym for "Integrated Computer Aided Manufacturing"), is a function modeling methodology for describing manufacturing functions, which offers a functional modeling language for the analysis, development, reengineering and integration of information systems, business processes or software engineering analysis. IDEF0 is part of the IDEF family of modeling languages in the field of software engineering, and is built on the functional modeling language Structured Analysis and Design Technique (SADT). Overview The IDEF0 Functional Modeling method is designed to model the decisions, actions, and activities of an organization or system (Varun Grover and William J. Kettinger (2000), Process Think: Winning Perspectives for Business Change in the Information Age, p. 168). It was derived from the established graphic modeling language Structured Analysis and Design Technique (SADT) developed by Douglas T. Ross and SofTech, Inc. In its original form, IDEF0 includes both a definition of a graphical modeling language (syntax and semantics) and a description of a comprehensive methodology for developing models. The US Air Force commissioned the SADT developers "to develop a function model method for analyzing and communicating the functional perspective of a system. IDEF0 should assist in organizing system analysis and promote effective communication between the analyst and the customer through simplified graphical devices". Where the Functional flow block diagram is used to show the functional flow of a product, IDEF0 is used to show data flow, system control, and the functional flow of lifecycle processes. IDEF0 is capable of graphically representing a wide variety of business, manufacturing and other types of enterprise operations to any level of detail. It provides rigorous and precise description, and promotes consistency of usage and interpretation. It is well-tested
https://en.wikipedia.org/wiki/David%20E.%20Shaw
David Elliot Shaw (born March 29, 1951) is an American billionaire scientist and former hedge fund manager. He founded D. E. Shaw & Co., a hedge fund company which was once described by Fortune magazine as "the most intriguing and mysterious force on Wall Street". A former assistant professor in the computer science department at Columbia University, Shaw made his fortune exploiting inefficiencies in financial markets with the help of state-of-the-art high speed computer networks. In 1996, Fortune magazine referred to him as "King Quant" because of his firm's pioneering role in high-speed quantitative trading. In 2001, Shaw turned to full-time scientific research in computational biochemistry, more specifically molecular dynamics simulations of proteins. Early life and education Shaw was raised in Los Angeles, California. His father was a theoretical physicist who specialised in plasma and fluid flows, and his mother is an artist and educator. They divorced when he was 12. His stepfather, Irving Pfeffer, was professor of finance at University of California, Los Angeles, and the author of papers supporting the efficient market hypothesis. Shaw earned a bachelor's degree summa cum laude from the University of California, San Diego, a PhD from Stanford University in 1980, and then became an assistant professor of the department of computer science at Columbia University. While at Columbia, Shaw conducted research in massively parallel computing with the NON-VON supercomputer. This supercomputer was composed of processing elements in a tree structure meant to be used for fast relational database searches. Earlier in his career, he founded Stanford Systems Corporation. Investment career In 1986, he joined Morgan Stanley, as Vice President for Technology in Nunzio Tartaglia's automated proprietary trading group. In 1994, Shaw was appointed by President Clinton to the President's Council of Advisors on Science and Technology, where he was chairman of the Panel on Educat
https://en.wikipedia.org/wiki/Caml
Caml (originally an acronym for Categorical Abstract Machine Language) is a multi-paradigm, general-purpose programming language which is a dialect of the ML programming language family. Caml was developed in France at INRIA and ENS. Caml is statically typed, strictly evaluated, and uses automatic memory management. OCaml, the main descendant of Caml, adds many features to the language, including an object layer. Examples In the following, # represents the Caml prompt. Hello World print_endline "Hello, world!";; Factorial function (recursion and purely functional programming) Many mathematical functions, such as factorial, are most naturally represented in a purely functional form. The following recursive, purely functional Caml function implements factorial: let rec fact n = if n=0 then 1 else n * fact(n - 1);; The function can be written equivalently using pattern matching: let rec fact = function | 0 -> 1 | n -> n * fact(n - 1);; This latter form is the mathematical definition of factorial as a recurrence relation. Note that the compiler inferred the type of this function to be int -> int, meaning that this function maps ints onto ints. For example, 12! is: # fact 12;; - : int = 479001600 Numerical derivative (higher-order functions) Since Caml is a functional programming language, it is easy to create and pass around functions in Caml programs. This capability has an enormous number of applications. Calculating the numerical derivative of a function is one such application. The following Caml function computes the numerical derivative of a given function f at a given point x: let d delta f x = (f (x +. delta) -. f (x -. delta)) /. (2. *. delta);; This function requires a small value delta. A good choice for delta is the cube root of the machine epsilon. The type of the function indicates that it maps a float onto another function with the type (float -> float) -> float -> float. This allows us to partially apply arguments. This functional style is known as currying. In this case, it is useful to part
https://en.wikipedia.org/wiki/Operation%20RAFTER
RAFTER was a code name for the MI5 radio receiver detection technique, mostly used against clandestine Soviet agents and monitoring of domestic radio transmissions by foreign embassy personnel from the 1950s on. Explanation Most radio receivers of the period were of the AM superhet design, with local oscillators which generate a signal typically 455 kHz above or sometimes below the frequency to be received. There is always some oscillator radiation leakage from such receivers, and in the initial stages of RAFTER, MI5 simply attempted to locate clandestine receivers by detecting the leaked signal with a sensitive custom-built receiver. This was complicated by domestic radios in people's homes also leaking radiation. By accident, one such receiver for MI5 mobile radio transmissions was being monitored when a passing transmitter produced a powerful signal which overloaded the receiver, producing an audible change in the received signal. The agency realized that they could identify the actual frequency being monitored if they produced their own transmissions and listened for the change in the superhet tone. Soviet transmitters Soviet short-wave transmitters were extensively used to broadcast messages to clandestine agents, the transmissions consisting simply of number sequences read aloud and decoded using a one-time pad. It was realized that this new technique could be used to track down such agents. Specially equipped aircraft would fly over urban areas at times when the agents were receiving Soviet transmissions, and attempt to locate receivers tuned to the transmissions. Tactics Like many secret technologies, RAFTER's use was attended by the fear of over-use, alerting the quarry and causing a shift in tactics which would neutralize the technology. As a technical means of intelligence, it was also not well supported by the more traditional factions in MI5. Its part in the successes and failures of MI5 at the time is not entirely known. In his book Spycatcher,
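The frequency identification described above is simple arithmetic once the intermediate frequency is known. A tiny sketch (Python; the detected value is invented for illustration):

# Superhet receivers leak their local oscillator (LO); with a known
# intermediate frequency (IF), the tuned frequency follows directly.
IF_HZ = 455_000                       # common AM-era IF, per the text
detected_lo_hz = 1_455_000            # hypothetical leaked LO signal
tuned_if_lo_above = detected_lo_hz - IF_HZ  # LO running above the station
tuned_if_lo_below = detected_lo_hz + IF_HZ  # LO running below the station
print(tuned_if_lo_above, tuned_if_lo_below)  # candidate monitored frequencies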
https://en.wikipedia.org/wiki/Aaargh%21
Aaargh! is a single-player action video game in which the player controls a giant monster with the goal of obtaining eggs by destroying buildings in different cities across a lost island. It was designed for Mastertronic's Arcadia Systems, an arcade machine based on the custom hardware of the Amiga, and was released in 1987. It was ported to a range of other platforms and released on these across 1988 and 1989. Electronic Arts distributed the Amiga version of the game. Gameplay The goal of the game is to find the golden dragon's egg. The player controls one of two monsters who must destroy buildings in order to find Roc eggs, the discovery of each of which triggers a fight with a rival monster. When five eggs are found, the two monsters fight on a volcano to claim the dragon's egg. The game is an action game with fighting game elements. The player chooses to play as either a dragon-like lizard or an ogre (depicted as a cyclops in the game); the character that the player does not select becomes the player's rival to obtain the egg. In the arcade version of the game either one or two players could play simultaneously, whereas on the ports only one player could play at a time. Gameplay takes place across the ten cities of the Lost Island, each representing a different era of civilisation (such as ancient Egypt and the Wild West) and each comprising one level of the game. Each city is represented by a single static playing area that uses a form of 2.5D projection in order to give the impression of depth on the screen. Reception The game received mixed reviews from gaming magazines across the platforms to which it was ported, with scores ranging from around 2/10 (or equivalent) up to almost 9/10. While reviewers praised the graphics and sound, particularly on the Amiga port, they criticised the gameplay. ACE magazine said that although the game had "good graphics, atmospheric sound and good gameplay" there was not enough challenge to the game and that players woul
https://en.wikipedia.org/wiki/Mutation%20testing
Mutation testing (or mutation analysis or program mutation) is used to design new software tests and evaluate the quality of existing software tests. Mutation testing involves modifying a program in small ways. Each mutated version is called a mutant, and a test detects and rejects a mutant when the mutant's behaviour differs from that of the original version under the test. This is called killing the mutant. Test suites are measured by the percentage of mutants that they kill. New tests can be designed to kill additional mutants. Mutants are based on well-defined mutation operators that either mimic typical programming errors (such as using the wrong operator or variable name) or force the creation of valuable tests (such as dividing each expression by zero). The purpose is to help the tester develop effective tests or locate weaknesses in the test data used for the program or in sections of the code that are seldom or never accessed during execution. Mutation testing is a form of white-box testing. Introduction Most of this article is about "program mutation", in which the program is modified. A more general definition of mutation analysis is using well-defined rules defined on syntactic structures to make systematic changes to software artifacts. Mutation analysis has been applied to other problems, but is usually applied to testing. So mutation testing is defined as using mutation analysis to design new software tests or to evaluate existing software tests. Thus, mutation analysis and testing can be applied to design models, specifications, databases, tests, XML, and other types of software artifacts, although program mutation is the most common. Overview Tests can be created to verify the correctness of the implementation of a given software system, but the creation of tests still poses the question whether the tests are correct and sufficiently cover the requirements that have originated the implementation. (This technological problem is itself an instance of a deeper phi
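A minimal illustration of killing a mutant (Python; the function, the mutation operator applied, and the tests are all invented for the example):

# Original function under test, and a mutant produced by one small
# syntactic change: the "+" operator replaced by "-".
def add(a, b):
    return a + b

def add_mutant(a, b):
    return a - b            # mutation: arithmetic operator replaced

def test_add(impl):
    """A test kills the mutant if it fails on the mutant but passes
    on the original."""
    return impl(2, 2) == 4

print(test_add(add))         # True: the original passes
print(test_add(add_mutant))  # False: this test kills the mutant
# A weaker test such as impl(0, 0) == 0 would let the mutant survive,
# signalling that the test data does not actually exercise the addition.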
https://en.wikipedia.org/wiki/Hyperbolic%20growth
When a quantity grows towards a singularity under a finite variation (a "finite-time singularity"), it is said to undergo hyperbolic growth. More precisely, the reciprocal function x ↦ 1/x has a hyperbola as its graph, and has a singularity at 0, meaning that the limit as x → 0 is infinite: any similar graph is said to exhibit hyperbolic growth. Description If the output of a function is inversely proportional to its input, or inversely proportional to the difference from a given value x_0, the function will exhibit hyperbolic growth, with a singularity at x_0. In the real world hyperbolic growth is created by certain non-linear positive feedback mechanisms. Comparisons with other growth Like exponential growth and logistic growth, hyperbolic growth is highly nonlinear, but differs in important respects. These functions can be confused, as exponential growth, hyperbolic growth, and the first half of logistic growth are convex functions; however their asymptotic behavior (behavior as input gets large) differs dramatically: logistic growth is constrained (has a finite limit, even as time goes to infinity), exponential growth grows to infinity as time goes to infinity (but is always finite for finite time), hyperbolic growth has a singularity in finite time (grows to infinity at a finite time). Applications Population Certain mathematical models suggest that until the early 1970s the world population underwent hyperbolic growth (see, e.g., Introduction to Social Macrodynamics by Andrey Korotayev et al.). It was also shown that until the 1970s the hyperbolic growth of the world population was accompanied by quadratic-hyperbolic growth of the world GDP, and researchers developed a number of mathematical models describing both this phenomenon, and the World System withdrawal from the blow-up regime observed in the recent decades. The hyperbolic growth of the world population and quadratic-hyperbolic growth of the world GDP observed till the 1970s have been correlated by Andrey Korotayev and his
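The contrast drawn above can be made concrete with the simplest finite-time-singularity model (standard material, supplied here for illustration): a growth rate proportional to the square of the quantity. In LaTeX:

\[
\frac{dx}{dt} = a\,x^2, \qquad x(0) = x_0 > 0
\quad\Longrightarrow\quad
x(t) = \frac{x_0}{1 - a\,x_0\,t} = \frac{1}{a\,(t_c - t)}, \qquad t_c = \frac{1}{a\,x_0},
\]

so x(t) grows without bound as t approaches the finite time t_c, whereas the exponential e^{at} remains finite for every finite t.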
https://en.wikipedia.org/wiki/Rice-hull%20bagwall%20construction
Rice-hull bagwall construction is a system of building, with results aesthetically similar to the use of earthbag or cob construction. Woven polypropylene bags (or tubes) are tightly filled with raw rice-hulls, and these are stacked up, layer upon layer, with strands of four-pronged barbed wire between. A surrounding "cage", composed of mats of welded or woven steel mesh (remesh or "poultry wire") on both sides, is wired together between bag layers (with, for example, rebar tie-wire) and then stuccoed to form building walls. Fireproofing Mixing rice-hulls in a boric acid and borax solution results in fireproofing. A similar result can be achieved if the hulls are placed on top of a poured ingot, which applies direct heat until they are turned into ash. In addition, the ash form does not appeal to vermin. See also Earthbag construction Cob construction References External links The Rice Hull House, by Paul A. Olivier, Ph.D. Construction Building materials Sustainable building Appropriate technology
https://en.wikipedia.org/wiki/Algebra%20of%20physical%20space
In physics, the algebra of physical space (APS) is the use of the Clifford or geometric algebra Cl3,0(R) of the three-dimensional Euclidean space as a model for (3+1)-dimensional spacetime, representing a point in spacetime via a paravector (3-dimensional vector plus a 1-dimensional scalar). The Clifford algebra Cl3,0(R) has a faithful representation, generated by Pauli matrices, on the spin representation C2; further, Cl3,0(R) is isomorphic to the even subalgebra of the Clifford algebra Cl3,1(R). APS can be used to construct a compact, unified and geometrical formalism for both classical and quantum mechanics. APS should not be confused with spacetime algebra (STA), which concerns the Clifford algebra Cl1,3(R) of the four-dimensional Minkowski spacetime. Special relativity Spacetime position paravector In APS, the spacetime position is represented as the paravector x = t + x1 e1 + x2 e2 + x3 e3, where the time t is given by the scalar part, and e1, e2, e3 are the standard basis for position space. Throughout, units such that c = 1 are used, called natural units. In the Pauli matrix representation, the unit basis vectors are replaced by the Pauli matrices and the scalar part by the identity matrix, so that the Pauli matrix representation of the spacetime position is x → t σ0 + x1 σ1 + x2 σ2 + x3 σ3, with σ0 the 2×2 identity. Lorentz transformations and rotors The restricted Lorentz transformations that preserve the direction of time and include rotations and boosts can be performed by an exponentiation of the spacetime rotation biparavector W, giving the Lorentz rotor L = e^(W/2). In the matrix representation, the Lorentz rotor is seen to form an instance of the SL(2,C) group (special linear group of degree 2 over the complex numbers), which is the double cover of the Lorentz group. The unimodularity of the Lorentz rotor is translated into the following condition in terms of the product of the Lorentz rotor with its Clifford conjugation: L L̄ = 1. This Lorentz rotor can always be decomposed in two factors, one Hermitian B and the other unitary R, such that L = BR. The unitary element R is cal
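To make the paravector model concrete, here is the spacetime interval it encodes (standard APS material, filled in because the excerpt's displayed formulas were lost in extraction). In LaTeX, with the Clifford conjugate reversing the sign of the vector part:

\[
x = t + x^1 e_1 + x^2 e_2 + x^3 e_3, \qquad
\bar{x} = t - x^1 e_1 - x^2 e_2 - x^3 e_3,
\]
\[
x\,\bar{x} = t^2 - \left[(x^1)^2 + (x^2)^2 + (x^3)^2\right],
\]

which is the Minkowski interval; in the Pauli-matrix representation the same quantity appears as the determinant of the 2×2 matrix t σ0 + x^k σk, and Lorentz rotors preserve it since L L̄ = 1.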
https://en.wikipedia.org/wiki/DirectX%20Video%20Acceleration
DirectX Video Acceleration (DXVA) is a Microsoft API specification for the Microsoft Windows and Xbox 360 platforms that allows video decoding to be hardware-accelerated. The pipeline allows certain CPU-intensive operations such as iDCT, motion compensation and deinterlacing to be offloaded to the GPU. DXVA 2.0 allows more operations, including video capturing and processing operations, to be hardware-accelerated as well. DXVA works in conjunction with the video rendering model used by the video card. DXVA 1.0, which was introduced as a standardized API with Windows 2000 and is currently available on Windows 98 or later, can use either the overlay rendering mode or VMR 7/9. DXVA 2.0, available only on Windows Vista, Windows 7, Windows 8 and later OSs, integrates with Media Foundation (MF) and uses the Enhanced Video Renderer (EVR) present in MF. Overview The DXVA is used by software video decoders to define a codec-specific pipeline for hardware-accelerated decoding and rendering of the codec. The pipeline starts at the CPU which is used for parsing the media stream and conversion to DXVA-compatible structures. DXVA specifies a set of operations that can be hardware-accelerated and device driver interfaces (DDIs) that the graphic driver can implement to accelerate the operations. If the codec needs to do any of the defined operations, it can use these interfaces to access the hardware-accelerated implementation of these operations. If the graphic driver does not implement one or more of the interfaces, it is up to the codec to provide a software fallback for it. The decoded video is handed over to the hardware video renderer, where further video post-processing might be applied to it before being rendered to the device. The resulting pipeline is usable in a DirectShow-compatible application. DXVA specifies the Motion Compensation DDI, which specifies the interfaces for iDCT operations, Huffman coding, motion compensation, alpha blending, inverse quantization, col
https://en.wikipedia.org/wiki/GPRS%20Tunnelling%20Protocol
GPRS Tunnelling Protocol (GTP) is a group of IP-based communications protocols used to carry general packet radio service (GPRS) within GSM, UMTS, LTE and 5G NR radio networks. In 3GPP architectures, GTP and Proxy Mobile IPv6 based interfaces are specified on various interface points. GTP can be decomposed into separate protocols, GTP-C, GTP-U and GTP'. GTP-C is used within the GPRS core network for signaling between gateway GPRS support nodes (GGSN) and serving GPRS support nodes (SGSN). This allows the SGSN to activate a session on a user's behalf (PDP context activation), to deactivate the same session, to adjust quality of service parameters, or to update a session for a subscriber who has just arrived from another SGSN. GTP-U is used for carrying user data within the GPRS core network and between the radio access network and the core network. The user data transported can be packets in any of IPv4, IPv6, or PPP formats. GTP' (GTP prime) uses the same message structure as GTP-C and GTP-U, but has an independent function. It can be used for carrying charging data from the charging data function (CDF) of the GSM or UMTS network to the charging gateway function (CGF). In most cases, this should mean from many individual network elements such as the GGSNs to a centralized computer that delivers the charging data more conveniently to the network operator's billing center. Different GTP variants are implemented by RNCs, SGSNs, GGSNs and CGFs within 3GPP networks. GPRS mobile stations (MSs) are connected to an SGSN without being aware of GTP. GTP can be used with UDP or TCP. UDP is either recommended or mandatory, except for tunnelling X.25 in version 0. GTP version 1 is used only on UDP. General features All variants of GTP have certain features in common. The structure of the messages is the same, with a GTP header following the UDP/TCP header. Header GTP version 1 GTPv1 headers contain the following fields: Version It is a 3-bit field. For GTPv1, this field has a value of 1.
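To make the header layout concrete, the following sketch (Python; not from the article) unpacks the fixed eight-byte GTPv1 header. The bit layout follows the GTPv1 specification; the sample bytes are invented for illustration:

import struct

def parse_gtpv1_header(data: bytes) -> dict:
    # Fixed 8-byte GTPv1 header: flags, message type, length, TEID.
    flags, msg_type, length, teid = struct.unpack("!BBHI", data[:8])
    return {
        "version": flags >> 5,              # 3-bit version; 1 for GTPv1
        "protocol_type": (flags >> 4) & 1,  # 1 = GTP, 0 = GTP'
        "ext_hdr_flag": (flags >> 2) & 1,   # these three flags, when set,
        "seq_flag": (flags >> 1) & 1,       # add a 4-byte optional field
        "npdu_flag": flags & 1,             # after the fixed header
        "message_type": msg_type,
        "length": length,                   # payload bytes after the fixed header
        "teid": teid,                       # tunnel endpoint identifier
    }

# Invented example: an Echo Request (message type 1) with a sequence number
hdr = parse_gtpv1_header(bytes([0x32, 0x01, 0x00, 0x04, 0, 0, 0, 0]))
print(hdr["version"], hdr["message_type"], hdr["teid"])  # 1 1 0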
https://en.wikipedia.org/wiki/XACML
The eXtensible Access Control Markup Language (XACML) is an XML-based standard markup language for specifying access control policies. The standard, published by OASIS, defines a declarative, fine-grained, attribute-based access control policy language, an architecture, and a processing model describing how to evaluate access requests according to the rules defined in policies. XACML is primarily an attribute-based access control system. In XACML, attributes – information about the subject accessing a resource, the resource to be addressed, and the environment – act as inputs for the decision of whether access is granted or not. XACML can also be used to implement role-based access control. In XACML, access control decisions to be taken are expressed as Rules. Each Rule comprises a series of conditions which decide whether a given request is approved or not. If a Rule is applicable to a request but the conditions within the Rule fail to evaluate, the result is Indeterminate. Rules are grouped together in Policies, and a PolicySet contains Policies and possibly other PolicySets. Each of these also includes a Target, a simple condition that determines whether it should be evaluated for a given request. Combining algorithms can be used to combine Rules and Policies with potentially differing results in various ways. XACML also supports obligations and advice expressions. Obligations specify actions which must be executed during the processing of a request, for example for logging. Advice expressions are similar, but may be ignored. XACML separates access control functionality into several components. Each operating environment in which access control is used has a Policy Enforcement Point (PEP) which implements the functionality to demand authorization and to grant or deny access to resources. These refer to an environment-independent and central Policy Decision Point (PDP) which actually makes the decision on whether access is granted. The PDP refers to policies stored for this purpose; in the XACML architecture, policies are authored and managed through a Policy Administration Point (PAP).
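Since combining algorithms do much of the work here, a simplified sketch of the well-known deny-overrides algorithm may help (Python; not from the standard, and it ignores the extended Indeterminate variants that XACML 3.0 defines):

def deny_overrides(decisions):
    # Any Deny wins; otherwise an Indeterminate result taints the outcome.
    if "Deny" in decisions:
        return "Deny"
    if "Indeterminate" in decisions:
        return "Indeterminate"
    if "Permit" in decisions:
        return "Permit"
    return "NotApplicable"

print(deny_overrides(["Permit", "Deny"]))           # Deny
print(deny_overrides(["Permit", "NotApplicable"]))  # Permit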
https://en.wikipedia.org/wiki/KRCW-TV
KRCW-TV (channel 32) is a television station licensed to Salem, Oregon, United States, serving as the CW outlet for the Portland area. It is owned and operated by network majority owner Nexstar Media Group alongside CBS affiliate KOIN (channel 6). Both stations share studios in the basement of the KOIN Center skyscraper on Southwest Columbia Street in downtown Portland, while KRCW-TV's transmitter is located in the Sylvan-Highlands neighborhood of the city. Previously, KRCW-TV maintained separate studios on Southwest Arctic Drive in Beaverton, while KOIN's facilities only housed KRCW-TV's master control and some internal operations. Despite Salem being KRCW-TV's city of license, the station maintains no physical presence there. History Early history The station was launched on May 8, 1989, under the call sign KUTF (standing for "Keep Up the Faith"); its original transmitter was located outside Molalla. The station's original programming format almost entirely consisted of religious programs. It was originally operated by Dove Broadcasting, owner of Christian television station WGGS-TV in Greenville, South Carolina; local productions included a version of WGGS's popular Nite Line talk program. Despite its long legacy in Christian television (its flagship station has been on the air since 1972), Dove struggled to build a support base for KUTF. In May 1990, the station went dark. According to station insiders, the Jim Bakker and Jimmy Swaggart scandals gave potential supporters pause. It did not help matters that the station had received competition a few months after signing on from KNMT, which had wealthier ownership (Trinity Broadcasting Network, through subsidiary National Minority Television) and a stronger signal. KUTF resumed broadcasting a month later. Dove sold KUTF to Eagle Broadcasting on July 17, 1991. The call sign was changed to KEBN on February 11, 1992; the new owners then proceeded to relaunch the station as "Oregon's New Eagle 32", becoming a general entertainment station.
https://en.wikipedia.org/wiki/210%20%28number%29
210 (two hundred [and] ten) is the natural number following 209 and preceding 211. In mathematics 210 is a composite number, an abundant number, a Harshad number, and the product of the first four prime numbers (2, 3, 5, and 7), and thus a primorial. It is also the least common multiple of these four prime numbers. It is the sum of eight consecutive prime numbers (13 + 17 + 19 + 23 + 29 + 31 + 37 + 41 = 210). It is a triangular number (following 190 and preceding 231), a pentagonal number (following 176 and preceding 247), and the second smallest to be both triangular and pentagonal (the third is 40755). It is also an idoneal number, a pentatope number, a pronic number, and an untouchable number. 210 is also the third 71-gonal number, preceding 418. It is the first primorial number greater than 2 which is not adjacent to 2 primes (211 is prime, but 209 is not). It is the largest number n such that all primes between n/2 and n yield a representation as a sum of two primes. See also 210 BC AD 210 North American telephone area code area code 210 References Integers
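These properties are easy to verify mechanically; the following plain-Python check (not from the article) confirms the primorial, triangular, pentagonal, and consecutive-prime-sum claims:

from math import prod, isqrt

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, isqrt(n) + 1))

assert prod([2, 3, 5, 7]) == 210                            # primorial: 2*3*5*7
assert any(k * (k + 1) // 2 == 210 for k in range(30))      # triangular (k = 20)
assert any(k * (3 * k - 1) // 2 == 210 for k in range(30))  # pentagonal (k = 12)
primes = [p for p in range(2, 50) if is_prime(p)]
assert sum(primes[5:13]) == 210                             # 13 + 17 + ... + 41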
https://en.wikipedia.org/wiki/Tin%28IV%29%20oxide
Tin(IV) oxide, also known as stannic oxide, is the inorganic compound with the formula SnO2. The mineral form of SnO2 is called cassiterite, and this is the main ore of tin. With many other names, this oxide of tin is an important material in tin chemistry. It is a colourless, diamagnetic, amphoteric solid. Structure Tin(IV) oxide crystallises with the rutile structure. As such, the tin atoms are six-coordinate and the oxygen atoms three-coordinate. SnO2 is usually regarded as an oxygen-deficient n-type semiconductor. Hydrous forms of SnO2 have been described as stannic acid. Such materials appear to be hydrated particles of SnO2 where the composition reflects the particle size. Preparation Tin(IV) oxide occurs naturally. Synthetic tin(IV) oxide is produced by burning tin metal in air. Annual production is in the range of 10 kilotons. SnO2 is reduced industrially to the metal with carbon in a reverberatory furnace at 1200–1300 °C. Amphoterism Although SnO2 is insoluble in water, it is amphoteric, dissolving in base and acid. "Stannic acid" refers to hydrated tin(IV) oxide, SnO2, which is also called "stannic oxide." Tin oxides dissolve in acids. Halogen acids attack SnO2 to give hexahalostannates, such as [SnI6]2−. One report describes reacting a sample in refluxing HI for many hours. SnO2 + 6 HI → H2SnI6 + 2 H2O Similarly, SnO2 dissolves in sulfuric acid to give the sulfate: SnO2 + 2 H2SO4 → Sn(SO4)2 + 2 H2O SnO2 dissolves in strong bases to give "stannates," with the nominal formula Na2SnO3. Dissolving the solidified SnO2/NaOH melt in water gives Na2[Sn(OH)6], "preparing salt," which is used in the dye industry. Uses In conjunction with vanadium oxide, it is used as a catalyst for the oxidation of aromatic compounds in the synthesis of carboxylic acids and acid anhydrides. Ceramic glazes Tin(IV) oxide has long been used as an opacifier and as a white colorant in ceramic glazes (A. B. Searle, The Glazer's Book, 2nd edition, The Technical Press Limited, London).
https://en.wikipedia.org/wiki/Obstetrical%20dilemma
The obstetrical dilemma is a hypothesis to explain why humans often require assistance from other humans during childbirth to avoid complications, whereas most non-human primates give birth unassisted with relatively little difficulty. This occurs due to the tight fit of the fetal head to the maternal birth canal, which is additionally convoluted, meaning the head and therefore body of the infant must rotate during childbirth in order to fit, unlike in other, non-upright walking mammals. Consequently, there is usually a high incidence of cephalopelvic disproportion and obstructed labor in humans. The obstetrical dilemma claims that this difference is due to the biological trade-off imposed by two opposing evolutionary pressures in the development of the human pelvis: smaller birth canals in the mothers, and larger brains, and therefore skulls, in the babies. Proponents believe bipedal locomotion (the ability to walk upright) decreased the size of the bony parts of the birth canal. They also believe that as hominids' and humans' skull and brain sizes increased over the millennia, women needed wider hips to give birth, that these wider hips made women inherently less able to walk or run than men, and that babies had to be born earlier to fit through the birth canal, resulting in the so-called fourth trimester period for newborns (being born when the baby seems less developed than in other animals). Recent evidence has suggested bipedal locomotion is only a part of the strong evolutionary pressure constraining the expansion of the maternal birth canal. In addition to bipedal locomotion, the reduced strength of the pelvic floor due to a wider maternal pelvis also leads to fitness detriments in the mother, pressuring the birth canal to remain relatively narrow. This idea was widely accepted when first published in 1960, but has since been criticized by other scientists. History The term obstetrical dilemma was coined in 1960 by Sherwood Larned Washburn, a professor of anthropology at the University of California, Berkeley.
https://en.wikipedia.org/wiki/Cutting%20%28plant%29
A plant cutting is a piece of a plant that is used in horticulture for vegetative (asexual) propagation. A piece of the stem or root of the source plant is placed in a suitable medium such as moist soil. If the conditions are suitable, the plant piece will begin to grow as a new plant independent of the parent, a process known as striking. A stem cutting produces new roots, and a root cutting produces new stems. Some plants can be grown from leaf pieces, called leaf cuttings, which produce both stems and roots. The scions used in grafting are also called cuttings. Propagating plants from cuttings is an ancient form of cloning. There are several advantages of cuttings, mainly that the produced offspring are practically clones of their parent plants. If a plant has favorable traits, it can continue to pass down its advantageous genetic information to its offspring. This is especially economically advantageous as it allows commercial growers to clone a certain plant to ensure consistency throughout their crops. Evolutionary advantage: Succulents Cuttings are used as a method of asexual reproduction in succulent horticulture, commonly referred to as vegetative reproduction. A cutting can also be referred to as a propagule. Succulents have evolved with the ability to use adventitious root formation in reproduction to increase fitness in stressful environments. Succulents grow in shallow soils, rocky soils, and desert soils. Seedlings from sexual reproduction have a low survival rate; however, plantlets from the excised stem cuttings and leaf cuttings, broken off in the natural environment, are more successful. Cuttings have both water and carbon stored and available, which are resources needed for plant establishment. The detached part of the plant remains physiologically active, allowing mitotic activity and new root structures to form for water and nutrient uptake. Asexual reproduction of plants is also evolutionarily advantageous as it allows plantlets to be better adapted to their environment.
https://en.wikipedia.org/wiki/Radio%20Link%20Protocol
Radio Link Protocol (RLP) is an automatic repeat request (ARQ) fragmentation protocol used over a wireless (typically cellular) air interface. Most wireless air interfaces are tuned to provide 1% packet loss, and most vocoders are mutually tuned to sacrifice very little voice quality at 1% packet loss. However, 1% packet loss is intolerable to all variants of TCP, and so something must be done to improve reliability for voice networks carrying TCP/IP data. An RLP detects packet losses and performs retransmissions to bring packet loss down to .01%, or even .0001%, which is suitable for TCP/IP applications. RLP also implements stream fragmentation and reassembly, and sometimes, in-order delivery. Newer forms of RLP also provide framing and compression, while older forms of RLP rely upon higher-layer PPP protocols to provide these functions. An RLP transport cannot ask the air interface to provide a certain payload size. Instead, the air interface scheduler determines the packet size, based upon constantly changing channel conditions, and upcalls RLP with the chosen packet payload size, right before transmission. Most other fragmentation protocols, such as those of 802.11b and IP, use payload sizes determined by the upper layers, and call upon the MAC to create a payload of a certain size. These other protocols are not as flexible as RLP, and can sometimes fail to transmit during a deep fade in a wireless environment. Because an RLP payload size can be as little as 11 bytes, based upon a CDMA IS-95 network's smallest voice packet size, RLP headers must be very small, to minimize overhead. This is typically achieved by allowing both ends to negotiate a variable 'sequence number space', which is used to number each byte in the transmission stream. In some variants of RLP, this sequence counter can be as small as 6 bits. An RLP protocol can be ACK-based or NAK-based. Most RLPs are NAK-based, meaning that the forward-link sender assumes that each transmission got through, and the receiver sends a negative acknowledgment (NAK) only when it detects a gap in the received sequence numbers.
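As a sketch of the NAK-based idea (Python; illustrative only — real RLP variants number bytes or frames in a small modular sequence space with wraparound, which this toy receiver ignores):

class NakReceiver:
    # Toy NAK-based receiver: request retransmission of every gap seen.
    def __init__(self):
        self.expected = 0   # next sequence number we expect
        self.naks = []      # retransmission requests to send back

    def on_frame(self, seq):
        if seq > self.expected:
            # frames expected..seq-1 never arrived: NAK them
            self.naks.extend(range(self.expected, seq))
        self.expected = max(self.expected, seq + 1)

rx = NakReceiver()
for seq in [0, 1, 4, 5]:    # frames 2 and 3 were lost on the air link
    rx.on_frame(seq)
print(rx.naks)              # [2, 3]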
https://en.wikipedia.org/wiki/Microsoft%20Forefront%20Threat%20Management%20Gateway
Microsoft Forefront Threat Management Gateway (Forefront TMG), formerly known as Microsoft Internet Security and Acceleration Server (ISA Server), is a discontinued network router, firewall, antivirus program, VPN server and web cache from Microsoft Corporation. It ran on Windows Server and worked by inspecting all network traffic that passed through it. Features Microsoft Forefront TMG offers a set of features which include: Routing and remote access features: Microsoft Forefront TMG can act as a router, an Internet gateway, a virtual private network (VPN) server, a network address translation (NAT) server and a proxy server. Security features: Microsoft Forefront TMG is a firewall which can inspect network traffic (including web content, secure web content and emails) and filter out malware, attempts to exploit security vulnerabilities and content that does not match a predefined security policy. In a technical sense, Microsoft Forefront TMG offers application layer protection, stateful filtering, content filtering and anti-malware protection. Network performance features: Microsoft Forefront TMG can also improve network performance: It can compress web traffic to improve communication speed. It also offers web caching: It can cache frequently accessed web content so that users can access it faster from the local network cache. Microsoft Forefront TMG 2010 can also cache data received through Background Intelligent Transfer Service, such as updates of software published on the Microsoft Update website. History Microsoft Proxy Server The Microsoft Forefront Threat Management Gateway product line originated with Microsoft Proxy Server. Developed under the code-name "Catapult", Microsoft Proxy Server v1.0 was first launched in January 1997, and was designed to run on Windows NT 4.0. Microsoft Proxy Server v1.0 was a basic product designed to provide Internet access for clients in a LAN environment via TCP/IP. Support was also provided for IPX/SPX networks (primarily NetWare environments).
https://en.wikipedia.org/wiki/Air%20handler
An air handler, or air handling unit (often abbreviated to AHU), is a device used to regulate and circulate air as part of a heating, ventilating, and air-conditioning (HVAC) system. An air handler is usually a large metal box containing a blower, furnace or A/C elements, filter racks or chambers, sound attenuators, and dampers. Air handlers usually connect to a ductwork ventilation system that distributes the conditioned air through the building and returns it to the AHU, sometimes exhausting air to the atmosphere and bringing in fresh air. Sometimes AHUs discharge (supply) and admit (return) air directly to and from the space served without ductwork. Small air handlers, for local use, are called terminal units, and may only include an air filter, coil, and blower; these simple terminal units are called blower coils or fan coil units. A larger air handler that conditions 100% outside air, and no recirculated air, is known as a makeup air unit (MAU) or fresh air handling unit (FAHU). An air handler designed for outdoor use, typically on roofs, is known as a packaged unit (PU), heating and air conditioning unit (HCU), or rooftop unit (RTU). Construction The air handler is normally constructed around a framing system with metal infill panels as required to suit the configuration of the components. In its simplest form the frame may be made from metal channels or sections, with single-skin metal infill panels. The metalwork is normally galvanized for long-term protection. For outdoor units some form of weatherproof lid and additional sealing around joints is provided. Larger air handlers will be manufactured from a square-section steel framing system with double-skinned and insulated infill panels. Such constructions reduce heat loss or heat gain from the air handler, as well as providing acoustic attenuation. Larger air handlers may be several meters long and are manufactured in a sectional manner; therefore, for strength and rigidity, steel section base rails are used.
https://en.wikipedia.org/wiki/SYNOP
SYNOP (surface synoptic observations) is a numerical code (called FM-12 by WMO) used for reporting weather observations made by staffed and automated weather stations. SYNOP reports are typically sent every six hours by Deutscher Wetterdienst on shortwave and low frequency using RTTY. A report consists of groups of numbers (and slashes where data is not available) describing general weather information, such as the temperature, barometric pressure and visibility at a weather station. It can be decoded by open-source software such as seaTTY, metaf2xml or Fldigi. SYNOP information is collected by more than 7600 manned and unmanned meteorological stations and more than 2500 mobile stations around the world and is used for weather forecasting and climatic statistics. The format of the original messages is abbreviated, and some items are coded. Message format Following is the general structure of a SYNOP message. The message consists of a sequence of numeric groups, which may also contain slashes (indicating missing data) in addition to numeric digits. Leading numbers are fixed group indicators that indicate the type of observation following, and letters are replaced with numbers giving the weather data. Messages from shipboard weather stations, and in different regions of the world, use variations on this scheme. YYGGiw IIiii iRiXhVV Nddff (00fff) 1snTTT 2snTdTdTd 3PoPoPoPo 4PPPP 5appp 6RRRtR 7wwW1W2 8NhCLCMCH (9GGgg) YYGGiw: the date and time of the observation; YY for the day of the month, GG for the hour of the observation in UTC; iw for the manner of wind observation (a code number: 0 for estimated wind speed in meters per second, 1 for measured wind speed in meters per second, 2 and 3 likewise but in knots, or slash for no wind speed observations). IIiii: weather station identification code; II for a block number allocated (by the WMO) to a country or a region of the world, for example 02 for Scandinavia or 72 and 74 for the continental US; iii is the code identifying the particular station within that block.
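As an example of how such groups decode, here is a minimal Python sketch (not from the article) for the 1snTTT temperature group described above, where sn is 0 for a positive and 1 for a negative temperature and TTT is the value in tenths of a degree Celsius:

def decode_temperature(group: str) -> float:
    # Decode a SYNOP '1snTTT' group, e.g. '10123' -> +12.3 degC.
    assert len(group) == 5 and group[0] == "1"
    sign = -1 if group[1] == "1" else 1
    return sign * int(group[2:]) / 10.0

print(decode_temperature("10123"))  # 12.3
print(decode_temperature("11052"))  # -5.2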
https://en.wikipedia.org/wiki/Normal%20bundle
In differential geometry, a field of mathematics, a normal bundle is a particular kind of vector bundle, complementary to the tangent bundle, and coming from an embedding (or immersion). Definition Riemannian manifold Let (M, g) be a Riemannian manifold, and S ⊂ M a Riemannian submanifold. Define, for a given p ∈ S, a vector n ∈ TpM to be normal to S whenever g(n, v) = 0 for all v ∈ TpS (so that n is orthogonal to TpS). The set NpS of all such n is then called the normal space to S at p. Just as the total space of the tangent bundle to a manifold is constructed from all tangent spaces to the manifold, the total space of the normal bundle NS to S is defined as the union of the normal spaces NpS over all p ∈ S. The conormal bundle is defined as the dual bundle to the normal bundle. It can be realised naturally as a sub-bundle of the cotangent bundle. General definition More abstractly, given an immersion i: N → M (for instance an embedding), one can define a normal bundle of N in M, by at each point of N, taking the quotient space of the tangent space on M by the tangent space on N. For a Riemannian manifold one can identify this quotient with the orthogonal complement, but in general one cannot (such a choice is equivalent to a section of the projection V → V/W). Thus the normal bundle is in general a quotient of the tangent bundle of the ambient space restricted to the subspace. Formally, the normal bundle to N in M is a quotient bundle of the tangent bundle on M: one has the short exact sequence of vector bundles on N: 0 → TN → TM|N → TM/N = TM|N / TN → 0, where TM|N is the restriction of the tangent bundle on M to N (properly, the pullback i*TM of the tangent bundle on M to a vector bundle on N via the map i). The fiber of the normal bundle TM/N at p ∈ N is referred to as the normal space at p (of N in M). Conormal bundle If N is a smooth submanifold of a manifold M, we can pick local coordinates (x1, ..., xm) around p such that N is locally defined by xk+1 = ... = xm = 0; then with this choice of coordinates TpN is spanned by the first k coordinate directions, and the ideal sheaf I of functions vanishing on N is locally generated by xk+1, ..., xm. Therefore we can define a non-degenerate pairing between I/I² and the normal bundle TM/N that induces an isomorphism of sheaves between the conormal bundle and I/I². We can rephrase this by saying that the conormal bundle to N is given by I/I².
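A concrete low-dimensional illustration (Python with NumPy; not from the article): for the unit sphere S² in R³, realised as the zero set of f(x) = |x|² − 1, the normal space at p is spanned by the gradient of f, i.e. by p itself, and any ambient vector splits into tangential and normal components:

import numpy as np

p = np.array([0.6, 0.0, 0.8])   # a point on the unit sphere
n = p / np.linalg.norm(p)       # grad f(p) = 2p spans the normal space

v = np.array([1.0, 2.0, -1.0])  # an arbitrary ambient vector at p
v_normal = np.dot(v, n) * n     # component in the normal space N_p S
v_tangent = v - v_normal        # component in the tangent space T_p S

assert abs(np.dot(v_tangent, p)) < 1e-12  # tangent part is orthogonal to p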
https://en.wikipedia.org/wiki/FreeRADIUS
FreeRADIUS is a modular, high-performance free RADIUS suite developed and distributed under the GNU General Public License, version 2, and is free for download and use. The FreeRADIUS Suite includes a RADIUS server, a BSD-licensed RADIUS client library, a PAM library, an Apache module, and numerous additional RADIUS-related utilities and development libraries. In most cases, the word "FreeRADIUS" refers to the free open-source RADIUS server from this suite. FreeRADIUS is the most popular open source RADIUS server and the most widely deployed RADIUS server in the world. It supports all common authentication protocols, and the server comes with a PHP-based web user administration tool called dialupadmin. It is the basis for many commercial RADIUS products and services, such as embedded systems, RADIUS appliances that support Network Access Control, and WiMAX. It supplies the AAA needs of many Fortune-500 companies, telcos, and Tier 1 ISPs. It is also widely used in the academic community, including eduroam. The server is fast, feature-rich, modular, and scalable. History FreeRADIUS was started in August 1999 by Alan DeKok and Miquel van Smoorenburg. Miquel had previously written the Cistron RADIUS server, which had gained widespread usage once the Livingston server was no longer being maintained. FreeRADIUS was started to create a new RADIUS server, using a modular design that would encourage more active community involvement. As of November 2014, the FreeRADIUS Project has three Core Team members: Alan DeKok (Project Leader), Arran Cudbard-Bell (Principal Architect), and Matthew Newton. The latest major release is FreeRADIUS 3. FreeRADIUS 3 includes support for RADIUS over TLS, including RadSec, a completely rewritten rlm_ldap module, and hundreds of other minor consistency and usability enhancements. The latest mature version is maintained for stability rather than features. The previous major release, v2.2.x, has entered the final phase of its lifecycle and will receive only maintenance fixes.
https://en.wikipedia.org/wiki/Ion%20beam%20analysis
Ion beam analysis (IBA) is an important family of modern analytical techniques involving the use of MeV ion beams to probe the composition and obtain elemental depth profiles in the near-surface layer of solids. All IBA methods are highly sensitive and allow the detection of elements in the sub-monolayer range. The depth resolution is typically in the range of a few nanometers to a few tens of nanometers. Atomic depth resolution can be achieved, but requires special equipment. The analyzed depth ranges from a few tens of nanometers to a few tens of micrometers. IBA methods are always quantitative with an accuracy of a few percent. Channelling allows the depth profile of damage in single crystals to be determined. RBS: Rutherford backscattering is sensitive to heavy elements in a light matrix. EBS: Elastic (non-Rutherford) backscattering spectrometry can be sensitive even to light elements in a heavy matrix. The term EBS is used when the incident particle is going so fast that it exceeds the "Coulomb barrier" of the target nucleus, which therefore cannot be treated by Rutherford's approximation of a point charge. In this case Schrödinger's equation should be solved to obtain the scattering cross-section (see http://www-nds.iaea.org/sigmacalc/ ). ERD: Elastic recoil detection is sensitive to light elements in a heavy matrix. PIXE: Particle-induced X-ray emission gives the trace and minor elemental composition. NRA: Nuclear reaction analysis is sensitive to particular isotopes. Channelling: The fast ion beam can be aligned accurately with major axes of single crystals; then the strings of atoms "shadow" each other and the backscattering yield falls dramatically. Any atoms off their lattice sites will give visible extra scattering. Thus damage to the crystal is visible, and point defects (interstitials) can even be distinguished from dislocations. The quantitative evaluation of IBA methods requires the use of specialized simulation and data analysis software. SIMNRA and DataFurnace are widely used programs for the analysis of RBS, EBS and ERD spectra.
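For the elastic scattering underlying RBS and EBS, the standard two-body kinematic factor K = E1/E0 shows why backscattering distinguishes heavy target elements so well; the following sketch (Python; standard kinematics, not code from the article) evaluates it for 4He ions backscattered at 170°:

import numpy as np

def kinematic_factor(m1, m2, theta_deg):
    # K = E1/E0 for a projectile of mass m1 elastically backscattered
    # from a target atom of mass m2 at laboratory angle theta (m2 > m1).
    th = np.radians(theta_deg)
    return ((np.sqrt(m2**2 - (m1 * np.sin(th))**2) + m1 * np.cos(th))
            / (m1 + m2))**2

print(kinematic_factor(4.0, 197.0, 170.0))  # Au target: ~0.92
print(kinematic_factor(4.0, 28.1, 170.0))   # Si target: ~0.57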
https://en.wikipedia.org/wiki/Ambient%20intelligence
Ambient intelligence (AmI) is a term used in computing to refer to electronic environments that are sensitive and responsive to the presence of people. The term is generally applied to consumer electronics, telecommunications, and computing. Ambient intelligence is intended to enable devices to work in concert with people in carrying out their everyday life activities in an intuitive way by using information and intelligence hidden in the network connecting these devices. An example of ambient intelligence is the Internet of Things. A typical context of the ambient intelligence environment is home, but it may also be used in workspaces (offices, co-working), public spaces (based on technologies such as smart streetlights), and hospital environments. The concept of ambient intelligence was originally developed in the late 1990s by Eli Zelkha and his team at Palo Alto Ventures for the time frame 2010–2020. Developers theorize that as devices grow smaller, more connected, and more integrated into our environment, the technological framework behind them will disappear into our surroundings until only the user interface remains perceivable by people. Overview The ambient intelligence concept builds upon pervasive computing, ubiquitous computing, profiling, context awareness, and human-centric computer interaction design. It is characterized by systems and technologies that are: Embedded: Many networked devices are integrated into the environment. Context aware: These devices can recognize you and your situational context. Personalized: They can be tailored to your needs. Adaptive: They can change in response to you. Anticipatory: They can anticipate your desires without conscious mediation. Successful implementation of ambient intelligence requires several vital technologies to exist. These include hidden, user-friendly hardware such as miniaturization, nanotechnology, and smart devices, as well as human-centric computer interfaces (intelligent agents, multimodal interaction).
https://en.wikipedia.org/wiki/Pentagram%20%28video%20game%29
Pentagram is a ZX Spectrum and MSX video game released by Ultimate Play the Game in 1986. It is the fourth in the Sabreman series, following on from his adventures in Sabre Wulf, Underwurlde and Knight Lore. Similarly to Knight Lore, it uses the isometric Filmation engine. The game was written by either Tim and Chris Stamper or a U.S. Gold programming team. Introduction Typically for an Ultimate release, the inlay card provides little actual instruction for playing the game, but includes a cryptic short story as an introduction. This was Ultimate's way of describing the object of the game, which is to recover the lost Pentagram, an artifact of magical power. Firstly, Sabreman must locate one of the wells located in the maze of screens, shoot it several times with his spell and take the resultant bucket of water to one of the broken obelisks. When dropped on these, the water will "heal" the stone. This must be done with each of the four obelisks to make the titular Pentagram appear in one of the rooms. Once this is done, five magic runestones must be found and placed on the Pentagram itself. Gameplay Though the objective in Pentagram is more complex and obscure than the simple "find and fetch" gameplay of the two previous Filmation games Knight Lore and Alien 8, the gameplay is similar to those two titles. The main differences in this final revision of the Filmation engine are the new ability to shoot enemies with a projectile magic spell, and the ability of the enemies to respawn. The "directional control" system of the previous games was also removed because the Spectrum's single joystick button was now needed to fire Sabreman's spell, so it could no longer be used to jump (instead, "down" on the joystick is used to jump). The basic gameplay is the same as that of Sabreman's previous outing Knight Lore (without that game's day/night shapeshifting cycle), as he wanders a mazelike system of screens filled with enemies and pieces of movable scenery (often forming obstacles).
https://en.wikipedia.org/wiki/Surroundings
Surroundings are the area around a given physical or geographical point or place. The exact definition depends on the field. Surroundings can also be used in geography (when it is more precisely known as vicinity, or vicinage) and mathematics, as well as philosophy, with the literal or metaphorically extended definition. In thermodynamics, the term (and its synonym, environment) is used in a more restricted sense, meaning everything outside the thermodynamic system. Often, the simplifying assumptions are that energy and matter may move freely within the surroundings, and that the surroundings have a uniform composition. See also Distance Environment (biophysical) Environment (systems) Neighbourhood (mathematics) Social environment Proxemics Geography Thermodynamics
https://en.wikipedia.org/wiki/Takeuti%27s%20conjecture
In mathematics, Takeuti's conjecture is the conjecture of Gaisi Takeuti that a sequent formalisation of second-order logic has cut-elimination (Takeuti 1953). It was settled positively: By Tait, using a semantic technique for proving cut-elimination, based on work by Schütte (Tait 1966); Independently by Prawitz (Prawitz 1968) and Takahashi (Takahashi 1967) by a similar technique, although Prawitz's and Takahashi's proofs are not limited to second-order logic, but concern higher-order logics in general; It is a corollary of Jean-Yves Girard's syntactic proof of strong normalization for System F. Takeuti's conjecture is equivalent to the 1-consistency of second-order arithmetic, in the sense that each statement can be derived from the other in the weak system PRA. It is also equivalent to the strong normalization of the Girard–Reynolds System F. See also Hilbert's second problem References Dag Prawitz, 1968. Hauptsatz for higher order logic. J. Symb. Log., 33:452–457, 1968. William W. Tait, 1966. A nonconstructive proof of Gentzen's Hauptsatz for second order predicate logic. In Bulletin of the American Mathematical Society, 72:980–983. Gaisi Takeuti, 1953. On a generalized logic calculus. In Japanese Journal of Mathematics, 23:39–96. An erratum to this article was published in the same journal, 24:149–156, 1954. Moto-o Takahashi, 1967. A proof of cut-elimination in simple type theory. In Japanese Mathematical Society, 10:44–45. Proof theory Conjectures that have been proved
https://en.wikipedia.org/wiki/ISO/IEC%2011179
The ISO/IEC 11179 Metadata Registry (MDR) standard is an international ISO/IEC standard for representing metadata for an organization in a metadata registry. It documents the standardization and registration of metadata to make data understandable and shareable. Intended purpose Organizations exchange data between computer systems precisely using enterprise application integration technologies. Completed transactions are often transferred to separate data warehouse and business rules systems with structures designed to support data for analysis. A de facto standard model for data integration platforms is the Common Warehouse Metamodel (CWM). Data integration is often also solved as a problem of data, rather than metadata, with the use of so-called master data. ISO/IEC 11179 claims that it is a standard for metadata-driven exchange of data in a heterogeneous environment, based on exact definitions of data. Structure of an ISO/IEC 11179 metadata registry The ISO/IEC 11179 model is a result of two principles of semantic theory, combined with basic principles of data modelling. The first principle from semantic theory is the thesaurus-type relation between wider and more narrow (or specific) concepts, e.g. the wide concept "income" has a relation to the more narrow concept "net income". The second principle from semantic theory is the relation between a concept and its representation, e.g., "buy" and "purchase" are the same concept although different terms are used. A basic principle of data modelling is the combination of an object class and a characteristic. For example, "Person - hair color". When applied to data modelling, ISO/IEC 11179 combines a wide "concept" with an "object class" to form a more specific "data element concept". For example, the high-level concept "income" is combined with the object class "person" to form the data element concept "net income of person". Note that "net income" is more specific than "income". The different possible representations of a data element concept, such as particular units, data types or code lists, are then captured as data elements.
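The object class / property / representation layering can be sketched in a few lines of Python (an illustrative model, not an implementation of the standard; the class and field names are invented):

from dataclasses import dataclass

@dataclass
class DataElementConcept:
    object_class: str        # e.g. "Person"
    property: str            # e.g. "Net Income"

@dataclass
class DataElement:
    concept: DataElementConcept
    representation: str      # e.g. "Amount in euros, decimal"

dec = DataElementConcept("Person", "Net Income")
de = DataElement(dec, "Amount in euros, decimal")
print(de)  # a concrete data element built from concept + representation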
https://en.wikipedia.org/wiki/Bioorthogonal%20chemical%20reporter
In chemical biology, a bioorthogonal chemical reporter is a non-native chemical functionality that is introduced into the naturally occurring biomolecules of a living system, generally through metabolic or protein engineering. These functional groups are subsequently utilized for tagging and visualizing biomolecules. Jennifer Prescher and Carolyn R. Bertozzi, the developers of bioorthogonal chemistry, defined bioorthogonal chemical reporters as "non-native, non-perturbing chemical handles that can be modified in living systems through highly selective reactions with exogenously delivered probes." It has been used to enrich proteins and to conduct proteomic analysis. In the early development of the technique, chemical motifs had to fulfill criteria of biocompatibility and selective reactivity in order to qualify as bioorthogonal chemical reporters. Some combinations of proteinogenic amino acid side chains meet the criteria, as do ketone and aldehyde tags. Azides and alkynes are other examples of chemical reporters. A bioorthogonal chemical reporter must be incorporated into a biomolecule. This occurs via metabolism. The chemical reporter is linked to a substrate, which a cell can metabolize. References Biochemistry methods Chemical biology
https://en.wikipedia.org/wiki/Polymer%20science
Polymer science or macromolecular science is a subfield of materials science concerned with polymers, primarily synthetic polymers such as plastics and elastomers. The field of polymer science includes researchers in multiple disciplines including chemistry, physics, and engineering. Subdisciplines This science comprises three main sub-disciplines: Polymer chemistry or macromolecular chemistry is concerned with the chemical synthesis and chemical properties of polymers. Polymer physics is concerned with the physical properties of polymer materials and engineering applications. Specifically, it seeks to present the mechanical, thermal, electronic and optical properties of polymers with respect to the underlying physics governing a polymer microstructure. Despite originating as an application of statistical physics to chain structures, polymer physics has now evolved into a discipline in its own right. Polymer characterization is concerned with the analysis of chemical structure, morphology, and the determination of physical properties in relation to compositional and structural parameters. History of polymer science The first modern example of polymer science is Henri Braconnot's work in the 1830s. Henri, along with Christian Schönbein and others, developed derivatives of the natural polymer cellulose, producing new, semi-synthetic materials, such as celluloid and cellulose acetate. The term "polymer" was coined in 1833 by Jöns Jakob Berzelius, though Berzelius did little that would be considered polymer science in the modern sense. In the 1840s, Friedrich Ludersdorf and Nathaniel Hayward independently discovered that adding sulfur to raw natural rubber (polyisoprene) helped prevent the material from becoming sticky. In 1844 Charles Goodyear received a U.S. patent for vulcanizing natural rubber with sulfur and heat. Thomas Hancock had received a patent for the same process in the UK the year before. This process strengthened natural rubber and prevented it from melting with heat while retaining flexibility.
https://en.wikipedia.org/wiki/Gabriel%20Carroll
Gabriel Drew Carroll (born December 24, 1982) is a Professor of Economics at the University of Toronto. He was born to tech industry worker parents in Oakland. He graduated from Harvard University with a B.A. in mathematics and linguistics in 2005 and received his doctorate in economics from MIT in 2012. He was recognized as a child prodigy and received numerous awards in mathematics while a student. Carroll won two gold medals (1998, 2001) and a silver medal (1999) at the International Mathematical Olympiad, earning a perfect score at the 2001 International Mathematical Olympiad held in Washington, D.C., shared only with American teammate Reid W. Barton and Chinese teammates Liang Xiao and Zhiqiang Zhang. Carroll earned a place among the top five ranked competitors (who are themselves not ranked against each other) in the William Lowell Putnam Competition all four years that he was eligible (2000–2003), a feat matched by only seven others: Don Coppersmith (1968–1971), Arthur Rubin (1970–1973), Bjorn Poonen (1985–1988), Ravi Vakil (1988–1991), Reid W. Barton (2001–2004), Daniel Kane (2003–2006), and Brian R. Lawrence (2007–08, 2010–11). His top-5 performance in 2000 was particularly notable, as he was officially taking the exam in spite of only being a high school senior, thus forfeiting one of his years of eligibility in college. He was on the first-place Putnam team twice (2001–02) and the second-place team once (2003). He has earned awards in science and math, including the Intel Science Talent Search, has taught mathematics classes and tutorials, and plays the piano. He was a Research Science Institute scholar in 2000. Carroll proposed Problem 3 of IMO 2009 and Problem 3 of IMO 2010. He has also proposed problems for the USAMO, such as problem 3 in 2007, 2008 and 2010 and problem 6 in 2009. During the 2005–06 academic year, he taught English in Chaling, Hunan, China. He worked at the National Bureau of Economic Research from 2006 to 2007 and was an Assistant Professor of Economics at Stanford University before moving to Toronto.
https://en.wikipedia.org/wiki/E-plane%20and%20H-plane
The E-plane and H-plane are reference planes for linearly polarized waveguides, antennas and other microwave devices. In waveguide systems, as in electric circuits, it is often desirable to be able to split the circuit power into two or more fractions. In a waveguide system, an element called a junction is used for power division. In a low-frequency electrical network, it is possible to combine circuit elements in series or in parallel, thereby dividing the source power among several circuit components. In microwave circuits, a waveguide with three independent ports is called a tee junction. The output of an E-plane tee is 180° out of phase, whereas the output of an H-plane tee is in phase. E-Plane For a linearly polarized antenna, this is the plane containing the electric field vector (sometimes called the E aperture) and the direction of maximum radiation. The electric field or "E" plane determines the polarization or orientation of the radio wave. For a vertically polarized antenna, the E-plane usually coincides with the vertical/elevation plane. For a horizontally polarized antenna, the E-plane usually coincides with the horizontal/azimuth plane. The E-plane and H-plane are 90 degrees apart. H-plane In the case of the same linearly polarized antenna, this is the plane containing the magnetic field vector (sometimes called the H aperture) and the direction of maximum radiation. The magnetizing field or "H" plane lies at a right angle to the "E" plane. For a vertically polarized antenna, the H-plane usually coincides with the horizontal/azimuth plane. For a horizontally polarized antenna, the H-plane usually coincides with the vertical/elevation plane. Co- and cross-polarizations Co-polarization (co-pol) and cross-polarization (cross-pol) are defined for the radiating E and H planes. These directions are defined in spherical coordinates corresponding to the spherical wavefronts of the propagating wave. By convention, the co-pol direction is the direction of the intended radiated electric field, while the cross-pol direction is orthogonal to it.
https://en.wikipedia.org/wiki/CYPRIS%20%28microchip%29
CYPRIS (cryptographic RISC microprocessor) was a cryptographic processor developed by the Lockheed Martin Advanced Technology Laboratories. The device was designed to implement NSA encryption algorithms and had a similar intent to the AIM and Sierra crypto modules. However, the principal references date back to the late 1990s and it does not appear that the CYPRIS ever earned NSA's Type 1 certification, without which it could not be used to protect classified government traffic. According to a manufacturer presentation, References Cryptographic hardware
https://en.wikipedia.org/wiki/Modulated%20continuous%20wave
Modulated continuous wave (MCW) is Morse code telegraphy, transmitted using an audio tone to modulate a carrier wave. The Federal Communications Commission defines modulated continuous wave in 47 CFR §97.3(c)(4) as "Tone-modulated international Morse code telegraphy emissions having designators with A, C, D, F, G, H or R as the first symbol; 2 as the second symbol; A or B as the third symbol." See Types of radio emissions for a general explanation of these symbols. Types of Morse code radio transmissions (CW and MCW) discussed in this article include: A1A and A2A — Double-sideband amplitude modulation (AM); One channel containing digital information, no subcarrier (A1A) or using a subcarrier (A2A); Aural telegraphy (intended to be decoded by ear) F2A — Frequency modulation (FM); One channel containing digital information, using a subcarrier; Aural telegraphy J2A and J2B — Single-sideband with suppressed carrier; One channel containing digital information, using a subcarrier; Aural telegraphy (J2A) or Electronic telegraphy (intended to be decoded by machine) (J2B) Unlike A1A CW transmissions, A2A MCW will produce an audible audio tone from an AM radio receiver that is not equipped with a beat oscillator. MCW is commonly used by RDF beacons to transmit the station identifier. F2A MCW Morse can be heard on a normal FM radio receiver, and it is commonly used by both commercial and amateur repeater stations for identification. Also, F2A is sometimes used by other types of stations operating under automatic control, such as a telemetry transmitter or a remote base station. MCW can be generated by any AM or FM radio transmitter with audio input from an audio oscillator or equivalent audio source. When an SSB transmitter is modulated by Morse code of only a single audio frequency, the resulting radio frequency emission is J2A or J2B and therefore is CW by definition, not MCW. Within the United States, MCW transmission is not permitted to amateur radio operators in
https://en.wikipedia.org/wiki/Evolutionary%20capacitance
Evolutionary capacitance is the storage and release of variation, just as electric capacitors store and release charge. Living systems are robust to mutations. This means that living systems accumulate genetic variation without the variation having a phenotypic effect. But when the system is disturbed (perhaps by stress), robustness breaks down, and the variation has phenotypic effects and is subject to the full force of natural selection. An evolutionary capacitor is a molecular switch mechanism that can "toggle" genetic variation between hidden and revealed states. If some subset of newly revealed variation is adaptive, it becomes fixed by genetic assimilation. After that, the rest of the variation, most of which is presumably deleterious, can be switched off, leaving the population with a newly evolved advantageous trait, but no long-term handicap. For evolutionary capacitance to increase evolvability in this way, the switching rate should not be faster than the timescale of genetic assimilation. This mechanism would allow for rapid adaptation to new environmental conditions. Switching rates may be a function of stress, making genetic variation more likely to affect the phenotype at times when it is most likely to be useful for adaptation. In addition, strongly deleterious variation may be purged while in a partially cryptic state, so cryptic variation that remains is more likely to be adaptive than random mutations are. Capacitance can help cross "valleys" in the fitness landscape, where a combination of two mutations would be beneficial, even though each is deleterious on its own. There is currently no consensus about the extent to which capacitance might contribute to evolution in natural populations. The possibility of evolutionary capacitance is considered to be part of the extended evolutionary synthesis. Switches that turn robustness to phenotypic rather than genetic variation on and off do not fit the capacitance analogy, as their presence does not cause genetic variation to accumulate in a hidden state.
https://en.wikipedia.org/wiki/KASY-TV
KASY-TV (channel 50) is a television station in Albuquerque, New Mexico, United States, affiliated with MyNetworkTV. It is owned by Mission Broadcasting alongside Santa Fe–licensed CW affiliate KWBQ (channel 19) and its Roswell-based satellite, KRWB-TV (channel 21). The two stations share studios with dual CBS/Fox affiliate KRQE (channel 13) on Broadcast Plaza in Albuquerque; KASY-TV's transmitter is located atop Sandia Crest. Nexstar Media Group, which owns KRQE and holds a majority stake in The CW, provides master control, technical, engineering and accounting services for KASY-TV and KWBQ through a shared services agreement (SSA), though the two stations are otherwise operated separately from KRQE as Mission handles programming, advertising sales and retransmission consent negotiations. History KASY-TV first signed on the air on October 6, 1995, owned by Ramar Communications and managed by Lee Enterprises (then-owners of CBS affiliate KRQE) under a local marketing agreement (LMA). The station was primarily a UPN affiliate, but had a secondary affiliation with The WB; this was easy to do as neither network had more than a couple nights a week of programming at that time. Initially, KASY ran cartoons, old movies, talk shows, and classic and recent off-network sitcoms. In fall 1997, KASY dropped WB programming and became an exclusive UPN affiliate; The WB would return to the market when upstart KWBQ signed on in March 1999 with a similar general entertainment format. In the interim, WB programming was brought in out-of-market from KTLA in Los Angeles or Chicago-based superstation WGN on Albuquerque area cable providers. In June 1999, ACME Communications, KWBQ's owner, bought KASY from Ramar and terminated the local marketing agreement with Lee Enterprises, resulting in the creation of the first major television duopoly in the Albuquerque market. Most of the programming inventory airing on KASY was also acquired by ACME, while some of the shows that aired on KASY
https://en.wikipedia.org/wiki/IBM%20Global%20Mirror
Global Mirror is an IBM technology that provides data replication over extended distances between two sites for business continuity and disaster recovery. If adequate bandwidth exists, Global Mirror provides a recovery point objective (RPO) of as low as 3–5 seconds between the two sites at extended distances, with no performance impact on the application at the primary site. It replicates the data asynchronously and also forms a consistency group at a regular interval, allowing a clean recovery of the application. The two sites can be on separate continents or simply on different utility grids. IBM also provides a synchronous data replication called Metro Mirror, which is designed to support replication at "metropolitan" distances of (normally) less than 300 km. Global Mirror is based on IBM Copy Services functions: Global Copy and FlashCopy. Global Mirror periodically pauses updates of the primary volumes and swaps change recording bitmaps. It then uses the previous bitmap to drain updates from the primary volumes to the secondaries. After all primary updates have been drained, the secondary volumes are used as the source for a FlashCopy to tertiary volumes at the recovery site. This ensures that the tertiary copy of the volumes has point-in-time consistency. By grouping many volumes into one Global Mirror session, multiple volumes may be copied to the recovery site simultaneously while maintaining point-in-time consistency across those volumes. Global Mirror can be combined with a wide area network clustering product like Geographically Dispersed Parallel Sysplex (GDPS), HACMP/XD, or IBM TotalStorage Continuous Availability for Windows to provide for automated failover between sites. This combined solution provides a lower recovery time objective (RTO), because it allows most applications to automatically resume productive operation in 30–600 seconds. The Global Mirror function is available on IBM Storage devices including the DS8000 series (DS8100, DS8300, DS8700 and later models).
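The bitmap-swap-and-drain cycle can be sketched as a toy model (Python; a deliberately simplified illustration, not IBM's implementation — real Global Mirror operates on volume tracks and hardware bitmaps):

class GlobalMirror:
    # Toy consistency-group cycle: writes mark a change-recording set;
    # forming a group swaps sets, drains the old one to the secondary,
    # then takes a point-in-time copy ("FlashCopy") to the tertiary.
    def __init__(self):
        self.primary, self.secondary, self.tertiary = {}, {}, {}
        self.active = set()       # tracks changed since the last swap

    def write(self, track, data):
        self.primary[track] = data
        self.active.add(track)

    def form_consistency_group(self):
        draining, self.active = self.active, set()  # swap bitmaps
        for t in draining:                          # drain to secondary
            self.secondary[t] = self.primary[t]
        self.tertiary = dict(self.secondary)        # point-in-time copy

gm = GlobalMirror()
gm.write("vol1/track0", "A")
gm.write("vol2/track7", "B")
gm.form_consistency_group()
assert gm.tertiary == {"vol1/track0": "A", "vol2/track7": "B"}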
https://en.wikipedia.org/wiki/Minimalism%20%28computing%29
In computing, minimalism refers to the application of minimalist philosophies and principles in the design and use of hardware and software. Minimalism, in this sense, means designing systems that use the least hardware and software resources possible. History In the late 1970s and early 1980s, programmers worked within the confines of relatively expensive and limited resources of common platforms. Eight or sixteen kilobytes of RAM was common; 64 kilobytes was considered a vast amount and was the entire address space accessible to the 8-bit CPUs predominant during the earliest generations of personal computers. The most common storage medium was the 5.25 inch floppy disk holding from 88 to 170 kilobytes. Hard drives with capacities from five to ten megabytes cost thousands of dollars. Over time, personal-computer memory capacities expanded by orders of magnitude and mainstream programmers took advantage of the added storage to increase their software's capabilities and to make development easier by using higher-level languages. By contrast, system requirements for legacy software remained the same. As a result, even the most elaborate, feature-rich programs of yesteryear seem minimalist in comparison with current software. One example of a program whose system requirements once gave it a heavyweight reputation is the GNU Emacs text editor, which gained the backronym "Eight Megabytes And Constantly Swapping" in an era when 8 megabytes was a lot of RAM. Today, Emacs' mainly textual buffer-based paradigm uses far fewer resources than desktop metaphor GUI IDEs with comparable features such as Eclipse or Netbeans. In a speech at the 2002 International Lisp Conference, Richard Stallman indicated that minimalism was a concern in his development of GNU and Emacs, based on his experiences with Lisp and system specifications of low-end minicomputers at the time. As the capabilities and system requirements of common desktop software and operating systems grew throughout th
https://en.wikipedia.org/wiki/Densitometer
A densitometer is a device that measures the degree of darkness (the optical density) of a photographic or semitransparent material or of a reflecting surface. The densitometer is basically a light source aimed at a photoelectric cell. It determines the density of a sample placed between the light source and the photoelectric cell from differences in the readings. Modern densitometers have the same components, but also have electronic integrated circuitry for better reading. Types Transmission densitometers measure transparent materials, such as photographic film and other transparent substrates. Reflection densitometers measure light reflected from a surface. Photography applications Some are capable of both types of measurements, selectable by a switch. They are used in film photography to measure densities of negatives with the switch in the "T" (Transmission) position and the saturation of a resulting print in the "R" position. Such measurements enable the photographer to choose the right photo paper and the correct exposure, obviating experiments with test strips. Once the papers and darkroom have been calibrated, the first print from a previously measured negative is a success at once. Uses Measuring color saturation by print professionals Calibration of printing equipment Quantifying the radioactivity of a compound such as radiolabeled DNA as one of the molecular tools for gene study Making adjustments so that outputs are consistent with the colors desired in the finished products Ensuring x-ray films are within code-required density ranges and comparing relative material thicknesses in industrial radiography Process control of density, dot gain, dot area and ink trapping; densitometer readings will be different for different types of printing process and substrates See also Densitometry Density meter Microdensitometer
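The quantity being measured is the base-10 logarithm of the attenuation; a short Python sketch (the formula is standard, the function name is ours):

from math import log10

def optical_density(incident, transmitted):
    # OD = -log10(I_t / I_0): 10% transmission -> 1.0, 1% -> 2.0
    return -log10(transmitted / incident)

print(optical_density(100.0, 10.0))  # 1.0
print(optical_density(100.0, 1.0))   # 2.0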
https://en.wikipedia.org/wiki/KSCW-DT
KSCW-DT (channel 33) is a television station in Wichita, Kansas, United States, affiliated with The CW. It is owned by Gray Television alongside Hutchinson-licensed CBS affiliate KWCH-DT (channel 12). Both stations share studios on 37th Street in northeast Wichita, while KSCW-DT's transmitter is located in rural northeastern Reno County (east of Hutchinson). KSCW-DT also operates a digital replacement translator on UHF channel 33 (its previous analog signal allotment) from a transmitter in North Wichita, just north of the station's studio facility. History The station was first licensed on June 8, 1988, after LIN TV filed an application with the Federal Communications Commission (FCC), under the call letters KWCV. It was temporarily licensed as DKWCV on November 5, 1998, but on February 11, 1999, it was changed back to KWCV. The station first signed on the air on August 5, 1999, with LIN TV (which also briefly owned KAKE and its satellites in 2000 before selling KAKE to Benedek Broadcasting) forming and owning 50% of Banks Broadcasting, which would become the station's owner. Originally operating as a WB affiliate, it was branded on-air as "Kansas' WB". Prior to the station's launch, The WB's programming could only be viewed in the Wichita market through Chicago-based cable superstation WGN, which carried the network's programming nationwide from The WB's January 1995 launch until October 1999, or Denver's KWGN-TV on cable or satellite. The station's original transmitter was located on a tower near Colwich. On January 24, 2006, the Warner Bros. unit of Time Warner and CBS Corporation announced that the two companies would shut down The WB and UPN and combine the networks' respective programming onto a newly created "fifth" network called The CW. One month later, on February 22, 2006, News Corporation announced that it would launch another new network called MyNetworkTV. On March 21, not long after it was announced that the station would become a CW affiliate, the station's call letters were changed to KSCW.
https://en.wikipedia.org/wiki/KMTW
KMTW (channel 36) is a television station licensed to Hutchinson, Kansas, United States, serving the Wichita area as an affiliate of the digital multicast network Dabl. It is owned by the Mercury Broadcasting Company, which maintains a local marketing agreement (LMA) with Sinclair Broadcast Group, owner of dual Fox/MyNetworkTV affiliate KSAS-TV (channel 24), for the provision of certain services. Both stations share studios on North West Street in northwestern Wichita, while KMTW's transmitter is located in rural southwestern Harvey County (on the town limits of Halstead). History On June 27, 1997, Clear Channel Communications (owner of Fox affiliate KSAS-TV (channel 24)) entered into a local marketing agreement with Goddard-based Three Feathers Communications, Inc. to form a new television station in Hutchinson, Kansas. Initially bearing the name KAWJ, the construction permit took the KSCC ("Kansas Clear Channel") call letters on October 9, 1998. On July 30, 1999, Three Feathers filed an application to sell the license of KSCC to Viacom's Paramount Stations Group, with the application being granted by the Federal Communications Commission (FCC) on October 1 the same year. The station first signed on the air on January 5, 2001 (the station first appeared on Cox Cable starting in August 2000) affiliating with UPN as an owned-and-operated station, a rarity for a market of Wichita's size. However, just prior to the station's sign-on, its license assets were sold to San Antonio–based Mercury Broadcasting Company. In June 2001, Mercury Broadcasting would take over ownership of KSCC. Prior to the station's sign-on, UPN programming was seen on a secondary basis on sister station KSAS-TV. In 2003, Clear Channel attempted to buy the station outright, but was denied a "failing station" waiver by the FCC. This special approval for the sale was necessary because the Wichita–Hutchinson designated market area has only seven "unique" full-power television stations. The full-po
https://en.wikipedia.org/wiki/Dwang
In construction, a nogging or nogging piece (England and Australia), dwang (Scotland, South Island, New Zealand, and lower/central North Island, New Zealand), blocking (North America), noggin (Australia and Greater Auckland Region of New Zealand), or nog (New Zealand and Australia), is a horizontal bracing piece used between wall studs or floor joists to give rigidity to the wall or floor frames of a building. Noggings may be made of timber, steel, or aluminium. If made of timber they are cut slightly longer than the space they fit into, and are driven into place so they fit tightly or are rabbeted into the wall stud. Timber noggings are fixed to the perimeter, abutments, or for the purpose of framing any openings using suitable fixings. The interval between noggings is dictated by local building codes and by the type of timber used; a typical timber-framed house in a non-cyclonic area will have two or three noggings per storey between each pair of neighbouring studs. Additional noggings may be added as grounds for later fixings. Noggings between vertical studs generally brace the studs against buckling under load; noggings on floor joists prevent the joists from twisting or rotating under load (lateral-torsional buckling), and are often fixed at intervals, in pairs diagonally for that reason. In floors this type of bracing is also called herringbone strutting. It is also used in ceilings to prevent not only joist twisting but also ceiling damage. Noggings provide no bracing effect in shear and are generally supplemented by diagonal bracing to prevent the frame from racking. References See also Blocking (construction) Carpentry Building engineering Structural system Carpentry
https://en.wikipedia.org/wiki/Centring
Centring, centre, centering, or center is a type of formwork: the temporary structure upon which the stones of an arch or vault are laid during construction. Until the keystone is inserted an arch has no strength and needs the centring to keep the voussoirs in their correct relative positions. A simple centring without a truss is called a common centring. A cross piece connecting centring frames is called a lag or bolst. Centring is normally made of wood timbers, a relatively straightforward structure in a simple arch or vault; but with more complex shapes involving double curvature, such as a dome or the bottle-shaped flue in a Norman-period kitchen, clay or sand bound by a weak lime mortar would be used. Shaping could be done by eye, perhaps with the help of a template, then stones or bricks laid against it. On larger works like a 19th-century pottery kiln this was impractical. The structure would be built round a post acting as a datum, and each course of stonework would be set at a distance from the datum. When the centring is removed (as in "striking the centring"), pointing and other finishing continues. Gallery References Construction Arches and vaults
https://en.wikipedia.org/wiki/Personal%20Handy-phone%20System
The Personal Handy-phone System (PHS), also marketed as the Personal Communication Telephone (PCT) in Thailand and as the Personal Access System (PAS), commercially branded as Xiaolingtong, in Mainland China, was a mobile network system operating in the 1880–1930 MHz frequency band, used mainly in Japan, China, Taiwan, and some other Asian countries and regions. Outline Technology PHS is essentially a cordless telephone like DECT, with the capability to hand over from one cell to another. PHS cells are small, with a base-station transmission power of at most 500 mW and a range typically measured in tens or at most hundreds of metres (some can reach up to about 2 kilometres in line-of-sight), in contrast to the multi-kilometre ranges of CDMA and GSM. This makes PHS suitable for dense urban areas, but impractical for rural areas, and the small cell size also makes it difficult if not impossible to make calls from rapidly moving vehicles. PHS uses TDMA/TDD for its radio channel access method, and 32 kbit/s ADPCM for its voice codec. Modern PHS phones can also support many value-added services such as high-speed wireless data/Internet connection (64 kbit/s and higher), WWW access, e-mailing, and text messaging. PHS technology is also a popular option for providing a wireless local loop, where it is used for bridging the "last mile" gap between the POTS network and the subscriber's home. It was developed under the concept of providing a wireless front end to an ISDN network; thus a PHS base station is compatible with ISDN and is often connected directly to ISDN telephone exchange equipment, e.g. a digital switch. Thanks to its low-cost base stations, micro-cellular system and "Dynamic Cell Assignment" scheme, PHS offers higher frequency-use efficiency at lower cost (on a throughput-per-area basis) compared with typical 3G cellular telephone systems. This enables flat-rate wireless services such as AIR-EDGE throughout Japan. The speed of an AIR-EDGE
https://en.wikipedia.org/wiki/Species%E2%80%93area%20relationship
The species–area relationship or species–area curve describes the relationship between the area of a habitat, or of part of a habitat, and the number of species found within that area. Larger areas tend to contain larger numbers of species, and empirically, the relative numbers seem to follow systematic mathematical relationships. The species–area relationship is usually constructed for a single type of organism, such as all vascular plants or all species of a specific trophic level within a particular site. It is rarely, if ever, constructed for all types of organisms, simply because of the prodigious data requirements. It is related but not identical to the species discovery curve. Ecologists have proposed a wide range of factors determining the slope and elevation of the species–area relationship. These factors include the relative balance between immigration and extinction, rate and magnitude of disturbance on small vs. large areas, predator-prey dynamics, and clustering of individuals of the same species as a result of dispersal limitation or habitat heterogeneity. The species–area relationship has been reputed to follow from the 2nd law of thermodynamics. In contrast to these "mechanistic" explanations, others assert the need to test whether the pattern is simply the result of a random sampling process. Species–area relationships are often evaluated in conservation science in order to predict extinction rates in the case of habitat loss and habitat fragmentation. Authors have classified the species–area relationship according to the type of habitats being sampled and the census design used. Frank W. Preston, an early investigator of the theory of the species–area relationship, divided it into two types: samples (a census of a contiguous habitat that grows in the census area, also called "mainland" species–area relationships), and isolates (a census of discontiguous habitats, such as islands, also called "island" species–area relationships). Michael Rosenzweig
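The most common mathematical form is the Arrhenius power law S = cA^z, which becomes a straight line on log-log axes; the "slope" and "elevation" mentioned above are then z and log c. A minimal sketch of fitting it, using invented survey numbers purely for illustration:

```python
import numpy as np

# Hypothetical island survey: areas in km^2 and species counts.
areas = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
species = np.array([12, 25, 48, 95, 180])

# S = c * A^z is linear in log space: log S = log c + z * log A,
# so ordinary least squares on the logs recovers slope z and elevation log c.
z, log_c = np.polyfit(np.log10(areas), np.log10(species), 1)
print(f"z = {z:.3f}, c = {10 ** log_c:.2f}")  # z near 0.3, typical of islands
```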
https://en.wikipedia.org/wiki/Functional%20requirement
In software engineering and systems engineering, a functional requirement defines a function of a system or its component, where a function is described as a summary (or specification or statement) of behavior between inputs and outputs. Functional requirements may involve calculations, technical details, data manipulation and processing, and other specific functionality that define what a system is supposed to accomplish. Behavioral requirements describe all the cases where the system uses the functional requirements; these are captured in use cases. Functional requirements are supported by non-functional requirements (also known as "quality requirements"), which impose constraints on the design or implementation (such as performance requirements, security, or reliability). Generally, functional requirements are expressed in the form "system must do <requirement>," while non-functional requirements take the form "system shall be <requirement>." The plan for implementing functional requirements is detailed in the system design, whereas non-functional requirements are detailed in the system architecture. As defined in requirements engineering, functional requirements specify particular results of a system. This should be contrasted with non-functional requirements, which specify overall characteristics such as cost and reliability. Functional requirements drive the application architecture of a system, while non-functional requirements drive the technical architecture of a system. In some cases, a requirements analyst generates use cases after gathering and validating a set of functional requirements. The hierarchy of functional requirements collection and change, broadly speaking, is: user/stakeholder request → analyze → use case → incorporate. Stakeholders make a request; systems engineers attempt to discuss, observe, and understand the aspects of the requirement; use cases, entity relationship diagrams, and other models are built to validate the requirement;
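Because a functional requirement states a specific input-to-output behavior, it can be verified directly by an acceptance test. A minimal sketch (the requirement, the function, and the figures are all invented for illustration):

```python
# Functional requirement (invented): "The system must reject withdrawals
# that exceed the account balance."

def withdraw(balance: int, amount: int) -> int:
    """Return the new balance, refusing overdrafts as the requirement demands."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

# The requirement maps onto concrete, testable input/output cases:
assert withdraw(100, 30) == 70
try:
    withdraw(100, 200)
except ValueError:
    pass  # rejected, as required
# A non-functional requirement ("withdrawals shall complete in under 50 ms")
# would instead constrain how the system performs, not what it does.
```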
https://en.wikipedia.org/wiki/Oakley%20protocol
The Oakley Key Determination Protocol is a key-agreement protocol that allows authenticated parties to exchange keying material across an insecure connection using the Diffie–Hellman key exchange algorithm. The protocol was proposed by Hilarie K. Orman in 1998, and formed the basis for the more widely used Internet Key Exchange protocol. The Oakley protocol has also been implemented in Cisco Systems' ISAKMP daemon. References External links The OAKLEY Key Determination Protocol The Internet Key Exchange (IKE) Cryptographic protocols
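At its core the protocol manages a Diffie–Hellman exchange, in which each party combines its private exponent with the other's public value to reach the same shared secret. A toy sketch of that underlying arithmetic (the prime is far too small for real use; Oakley itself adds well-known large groups, cookies, nonces and authentication on top):

```python
import secrets

p = 0xFFFFFFFFFFFFFFC5  # a small prime (2**64 - 59), for illustration only
g = 5                   # toy generator

a = secrets.randbelow(p - 2) + 2   # one party's private exponent
b = secrets.randbelow(p - 2) + 2   # the other party's private exponent
A = pow(g, a, p)                   # public values, sent over the
B = pow(g, b, p)                   # insecure connection

# Both ends derive the same keying material without ever sending a or b.
assert pow(B, a, p) == pow(A, b, p)
```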
https://en.wikipedia.org/wiki/Automotive%20Electronics%20Council
The Automotive Electronics Council (AEC) is an organization originally established in the 1990s by Chrysler, Ford, and GM for the purpose of establishing common part-qualification and quality-system standards. The AEC Component Technical Committee is the standardization body for establishing standards for reliable, high quality electronic components. Components meeting these specifications are suitable for use in the harsh automotive environment without additional component-level qualification testing. The technical documents developed by the AEC Component Technical Committee are available at the AEC web site. Most commonly referenced AEC documents are: AEC-Q100 "Failure Mechanism Based Stress Test Qualification For Integrated Circuits" AEC-Q101 "Failure Mechanism Based Stress Test Qualification For Discrete Semiconductors" AEC-Q200 "Stress Test Qualification For Passive Components" References External links Automotive Electronics Council Automotive technologies Automotive electronics Electrical engineering organizations Motor trade associations
https://en.wikipedia.org/wiki/Islamic%20Society%20of%20Engineers
The Islamic Society of Engineers (ISE) (, ) is a principlist political organization of engineers in Iran. Formerly one of the parties aligned with the Combatant Clergy Association, it is close to the Islamic Coalition Party, whose decisions they mostly follow. It is questionable whether it is an independent and strong party. The Society was formed at the end of the Iran–Iraq War (1988) with the objective of elevating the Islamic, political, scientific and technical knowledge of the Muslim people of Iran, defending major freedoms such as freedom of expression and gatherings, as well as continued campaign against foreign cultural agents whether Eastern or Western materialism. Members Mahmoud Ahmadinejad, the sixth President of Iran, was an active member since its establishment but turned against the party after presidency. Mohammad Reza Bahonar, current Secretary-General and former Deputy Speaker of the Parliament of Iran Manouchehr Mottaki, former Minister of Foreign Affairs Mohammad Nazemi Ardakani, former Minister of Cooperatives Party leaders Current officeholders Morteza Nabavi, Member of Expediency Discernment Council Morteza Saghaiyannejad, Mayor of Qom Parliament members Hamidreza Fouladgar (Isfahan) Mohammad Mehdi Zahedi (Kerman and Ravar) Jabbar Kouchakinejad (Rasht) Mohammad Mehdi Mofatteh (Toiserkan) References External links ecoi.net's profile of ISE Hamshahri's report from the general congress of the Islamic Society of Engineers (in Persian) 1988 establishments in Iran Political parties established in 1988 Principlist political groups in Iran Engineering organizations
https://en.wikipedia.org/wiki/5S%20%28methodology%29
5S is a workplace organization method that uses a list of five Japanese words: , , , , and . These have been translated as 'sort', 'set in order', 'shine', 'standardize', and 'sustain'. The list describes how to organize a work space for efficiency and effectiveness by identifying and storing the items used, maintaining the area and items, and sustaining the new organizational system. The decision-making process usually comes from a dialogue about standardization, which builds understanding among employees of how they should do the work. In some quarters, 5S has become 6S, the sixth element being safety (safe). Other than a specific stand-alone methodology, 5S is frequently viewed as an element of a broader construct known as visual control, visual workplace, or visual factory. Under those (and similar) terminologies, Western companies were applying underlying concepts of 5S before publication, in English, of the formal 5S methodology. For example, a workplace-organization photo from Tennant Company (a Minneapolis-based manufacturer) quite similar to the one accompanying this article appeared in a manufacturing-management book in 1986. Origins 5S was developed in Japan and was identified as one of the techniques that enabled just-in-time manufacturing. Two major frameworks for understanding and applying 5S to business environments have arisen, one proposed by Takahashi and Osada, the other by Hiroyuki Hirano. Hirano provided a structure to improve programs with a series of identifiable steps, each building on its predecessor. Before this Japanese management framework, a similar "scientific management" was proposed by Alexey Gastev and the USSR Central Institute of Labour (CIT) in Moscow. Each S There are five 5S phases. They can be translated to English as 'sort', 'set in order', 'shine', 'standardize', and 'sustain'. Other translations are possible. Sort ( ) is sorting through all items in a location and removing all unnecessary items from the location.
https://en.wikipedia.org/wiki/Sky%2B
Sky+ (pronounced Sky Plus) is a discontinued personal video recorder (PVR) and subscription service from the satellite television provider Sky in the UK and Ireland. Launched in September 2001, it allows customers to record, pause and instantly rewind live TV. The system performs these functions using an internal hard drive inside the Sky+ set top box, an upgrade over the standard Digibox. On 25 August 2001, the Sky+ demonstration was added to the Sky Guide demonstration, and was shown on Sky Welcome (Channel 998), which lasted for 15 minutes. A demo of the Sky+ box was shown on the Sky Customer Channel (Channel 999). Originally a Sky+ subscription cost £10 per month - this fee was discontinued for subscribers from 1 July 2007. By July 2002 the service attracted 25,000 subscribers, and by 30 September 2009, there were 5.9 million customers with Sky+. Sky+ was also released in Italy, Germany and Austria. During its lifetime its chief competitors in the UK market were Freeview+, Freesat+, BT Vision, and Virgin Media's V+ and TiVo. In the Republic of Ireland, Sky+ competed with Virgin Media Horizon TV and Saorview. In October 2016, Sky stopped selling the Sky+ subscription service, replacing it with Sky Q, although users of Sky+ can continue using their legacy box. Technical information Combined digital satellite receiver/decoder and personal video recorder (PVR). Twin digital satellite tuners – for connection to identical independent feeds from Astra 28.2°E. Allows simultaneous recording/viewing or recording of 2 channels at once. The set-top box middleware is provided by OpenTV, but the EPG and all the software extensions that manage the PVR functions are produced by NDS under the name of XTV PVR. Sky+ has its own electronic programme guide made by Sky. From here, users can see what programmes are on in the next seven days. The current EPG software version (as of July 2010) is Sky+ 5.08.6. Versions There have been various versions of Sky+: Sky+ 40  GB (discont
https://en.wikipedia.org/wiki/Devolution%20%28biology%29
Devolution, de-evolution, or backward evolution (not to be confused with dysgenics) is the notion that species can revert to supposedly more primitive forms over time. The concept relates to the idea that evolution has a purpose (teleology) and is progressive (orthogenesis), for example that feet might be better than hooves or lungs than gills. However, evolutionary biology makes no such assumptions, and natural selection shapes adaptations with no foreknowledge of any kind. It is possible for small changes (such as in the frequency of a single gene) to be reversed by chance or selection, but this is no different from the normal course of evolution and as such de-evolution is not compatible with a proper understanding of evolution due to natural selection. In the 19th century, when belief in orthogenesis was widespread, zoologists (such as Ray Lankester and Anton Dohrn) and the palaeontologists Alpheus Hyatt and Carl H. Eigenmann advocated the idea of devolution. The concept appears in Kurt Vonnegut's 1985 novel Galápagos, which portrays a society that has evolved backwards to have small brains. Dollo's law of irreversibility, first stated in 1893 by the palaeontologist Louis Dollo, denies the possibility of devolution. The evolutionary biologist Richard Dawkins explains Dollo's law as being simply a statement about the improbability of evolution's following precisely the same path twice. Context The idea of devolution is based on the presumption of orthogenesis, the view that evolution has a purposeful direction towards increasing complexity. Modern evolutionary theory, beginning with Darwin at least, poses no such presumption, and the concept of evolutionary change is independent of either any increase in complexity of organisms sharing a gene pool, or any decrease, such as in vestigiality or in loss of genes. Earlier views that species are subject to "cultural decay", "drives to perfection", or "devolution" are practically meaningless in terms of current (neo
https://en.wikipedia.org/wiki/Integrated%20logistics%20support
Integrated logistics support (ILS) is a discipline within systems engineering intended to lower a product's life-cycle cost and decrease the demand for logistics by optimizing the maintenance system and easing product support. Although originally developed for military purposes, it is also widely used in commercial customer service organisations. ILS defined In general, ILS plans and directs the identification and development of logistics support and system requirements for military systems, with the goal of creating systems that last longer and require less support, thereby reducing costs and increasing return on investments. ILS therefore addresses these aspects of supportability not only during acquisition, but also throughout the operational life cycle of the system. The impact of ILS is often measured in terms of metrics such as reliability, availability, maintainability and testability (RAMT), and sometimes system safety (RAMS). ILS is the integrated planning and action of a number of disciplines in concert with one another to assure system availability. The planning of each element of ILS is ideally developed in coordination with the system engineering effort and with each other. Tradeoffs may be required between elements in order to acquire a system that is: affordable (lowest life cycle cost), operable, supportable, sustainable, transportable, and environmentally sound. In some cases, a deliberate process of Logistics Support Analysis will be used to identify tasks within each logistics support element. The most widely accepted list of ILS activities includes: Reliability engineering, maintainability engineering and maintenance (preventive, predictive and corrective) planning Supply (spare part) support and resource acquisition Support and test equipment/equipment support Manpower and personnel Training and training support Technical data/publications Computer resources support Facilities Packaging, handling, storage and transportation Design interface Decisions are d
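The RAM figures mentioned above reduce to simple formulas; inherent availability, for instance, is the uptime fraction implied by mean time between failures and mean time to repair. A minimal sketch (the MTBF/MTTR numbers are invented for illustration):

```python
def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """A_i = MTBF / (MTBF + MTTR): availability under corrective
    maintenance only, ignoring supply and administrative delays."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A unit failing every 500 hours and taking 4 hours to repair:
print(f"{inherent_availability(500, 4):.4f}")  # 0.9921
```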
https://en.wikipedia.org/wiki/KFOX-TV
KFOX-TV (channel 14) is a television station in El Paso, Texas, United States, affiliated with the Fox network. It is owned by Sinclair Broadcast Group alongside dual CBS/MyNetworkTV affiliate KDBC-TV (channel 4). Both stations share studios on South Alto Mesa Drive in northwest El Paso, while KFOX-TV's transmitter is located atop the Franklin Mountains on the El Paso city limits. Established as El Paso's first non-network TV station in 1979 after years of telecasting Christian programs on cable, the station as KCIK struggled financially and introduced secular entertainment programs. While it was owned in turn by two Christian groups, it continued this orientation and affiliated with Fox in 1986. It prospered with the new affiliation and introduced local news in 1997 after being sold to Cox Television. Sinclair acquired KFOX and KDBC in separate transactions in 2013, combining their operations. History Launch and early years Six years before a signal was broadcast on channel 14 in El Paso, the foundation was laid for the station that would occupy it with the launch of a Christian television station, known as International Christian Television (ICT), on El Paso's cable system in 1973. The station was operated by a company known as Missionary Radio Evangelism, Inc. (MRE), led by Pete Warren and Alex Blomerth, and began to telecast seven days a week on cable channel 8 in 1974. That year, it purchased its first mobile production van. As early as mid-1974, the group had its sights set on building UHF channel 14 in El Paso: its club of donors was the "1400 Club", and it was soliciting donations with an eye to building capacity to make the leap. Pledge drives were also held to raise funds. On May 24, 1976, Missionary Radio Evangelism filed a formal application with the Federal Communications Commission (FCC) for a channel 14 construction permit, which was granted on December 23. While ICT/MRE promised an Easter 1977 launch after getting the permit, viewers would have t
https://en.wikipedia.org/wiki/Discharge%20pressure
Discharge pressure (also called high side pressure or head pressure) is the pressure generated on the output side of a gas compressor in a refrigeration or air conditioning system. The discharge pressure is affected by several factors: size and speed of the condenser fan, condition and cleanliness of the condenser coil, and the size of the discharge line. An extremely high discharge pressure coupled with an extremely low suction pressure is an indicator of a refrigerant restriction. Cooling technology Hydraulics Hydrostatics Pressure
https://en.wikipedia.org/wiki/Capacity%20optimization
Capacity optimization is a general term for technologies used to improve storage use by shrinking stored data. The primary technologies used for capacity optimization are data deduplication and data compression. These are delivered as software or hardware, integrated with storage systems or delivered as standalone products. Deduplication algorithms look for redundancy in sequences of bytes across comparison windows. Typically, cryptographic hash functions are used as identifiers of unique sequences; each sequence is compared against the history of other such sequences, and where possible the first uniquely stored version of a sequence is referenced rather than stored again. Methods for selecting data windows range from fixed 4 KB blocks to whole-file comparison, the latter known as single-instance storage (SIS). Capacity optimization generally refers to the use of this kind of technology in a storage system. An example of this kind of system is the Venti file system in the Plan9 open source OS. There are also implementations in networking (especially wide-area networking), where they are sometimes called bandwidth optimization or WAN optimization. Commercial implementations of capacity optimization are most often found in backup/recovery storage, where storage of successive versions of backups from day to day creates an opportunity for reduction in space using this approach. The term was first used widely in 2005. References Capacity optimization through sensing threshold adaptation for cognitive radio networks (https://doi.org/10.1007%2Fs11590-011-0345-8) Software optimization
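The hash-and-reference scheme described above is compact enough to sketch directly. A simplified fixed-block deduplicator, assuming SHA-256 as the identifier of unique sequences (real products add indexing, compression and persistence):

```python
import hashlib

def deduplicate(data: bytes, block_size: int = 4096):
    """Keep one copy of each unique block; describe the stream as a
    list of block hashes referencing the store."""
    store = {}    # hash -> first stored copy of the block
    recipe = []   # hashes, in order, reconstructing the original stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # later duplicates are not re-stored
        recipe.append(digest)
    return store, recipe

# Ten identical 4 KB blocks shrink to one stored block plus ten references.
store, recipe = deduplicate(b"x" * 4096 * 10)
print(len(store), len(recipe))  # 1 10
```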
https://en.wikipedia.org/wiki/Sodium%20aluminosilicate
Sodium aluminosilicate refers to compounds which contain sodium, aluminium, silicon and oxygen, and which may also contain water. These include synthetic amorphous sodium aluminosilicate, a few naturally occurring minerals and synthetic zeolites. Synthetic amorphous sodium aluminosilicate is widely used as a food additive, E 554. Amorphous sodium aluminosilicate This substance is produced with a wide range of compositions and has many different applications. It is encountered as an additive E 554 in food where it acts as an anticaking (free flow) agent. As it is manufactured with a range of compositions it is not strictly a chemical compound with a fixed stoichiometry. One supplier quotes a typical analysis for one of their products as 14SiO2·Al2O3·Na2O·3H2O,(Na2Al2Si14O32·3H2O). The US FDA has as of April 1, 2012 approved sodium aluminosilicate (sodium silicoaluminate) for direct contact with consumable items under 21 CFR 182.2727. Sodium aluminosilicate is used as molecular sieve in medicinal containers to keep contents dry. Sodium aluminosilicate may also be listed as: aluminium sodium salt sodium silicoaluminate aluminosilicic acid, sodium salt sodium aluminium silicate aluminum sodium silicate sodium silico aluminate sasil As a problem in industrial processes The formation of sodium aluminosilicate makes the Bayer process uneconomical for bauxites high in silica. Minerals sometimes called sodium aluminosilicate Naturally occurring minerals that are sometimes given the chemical name, sodium aluminosilicate include albite (NaAlSi3O8, an end-member of the plagioclase series) and jadeite (NaAlSi2O6). Synthetic zeolites sometimes called sodium aluminosilicate Synthetic zeolites have complex structures and examples (with structural formulae) are: Na12Al12Si12O48·27H2O, zeolite A (Linde type A sodium form, NaA), used in laundry detergents Na16Al16Si32O96·16H2O, Analcime, IUPAC code ANA Na12Al12Si12O48·q H2O, Losod Na384Al384Si384O1536·518H2O, Lind
https://en.wikipedia.org/wiki/Inclusion%20bodies
Inclusion bodies are aggregates of specific types of protein found in neurons, a number of tissue cells including red blood cells, bacteria, viruses, and plants. Inclusion bodies of aggregations of multiple proteins are also found in muscle cells affected by inclusion body myositis and hereditary inclusion body myopathy. Inclusion bodies in neurons may be accumulated in the cytoplasm or nucleus, and are associated with many neurodegenerative diseases. Inclusion bodies in neurodegenerative diseases are aggregates of misfolded proteins (aggresomes) and are hallmarks of many of these diseases, including Lewy bodies in Lewy body dementias and Parkinson's disease, neuroserpin inclusion bodies called Collins bodies in familial encephalopathy with neuroserpin inclusion bodies, inclusion bodies in Huntington's disease, Papp–Lantos bodies in multiple system atrophy, and various inclusion bodies in frontotemporal dementia including Pick bodies. Bunina bodies in motor neurons are a core feature of amyotrophic lateral sclerosis. Other usual cell inclusions are often temporary inclusions of accumulated proteins, fats, secretory granules or other insoluble components. Inclusion bodies are found in bacteria as particles of aggregated protein. They have a higher density than many other cell components but are porous. They typically represent sites of viral multiplication in a bacterium or a eukaryotic cell and usually consist of viral capsid proteins. Inclusion bodies contain very little host protein, ribosomal components or DNA/RNA fragments. They often almost exclusively contain the over-expressed protein, and aggregation has been reported to be reversible. It has been suggested that inclusion bodies are dynamic structures formed by an unbalanced equilibrium between aggregated and soluble proteins of Escherichia coli. There is a growing body of information indicating that formation of inclusion bodies occurs as a result of intracellular accumulation of partially folded ex
https://en.wikipedia.org/wiki/Tanner%20graph
In coding theory, a Tanner graph, named after Michael Tanner, is a bipartite graph used to state constraints or equations which specify error correcting codes. Tanner graphs are used to construct longer codes from smaller ones, and both encoders and decoders employ these graphs extensively. Origins Tanner graphs were proposed by Michael Tanner as a means to create larger error correcting codes from smaller ones using recursive techniques. He generalized the techniques of Elias for product codes. Tanner discussed lower bounds on the codes obtained from these graphs irrespective of the specific characteristics of the codes which were being used to construct larger codes. Tanner graphs for linear block codes Tanner graphs are partitioned into subcode nodes and digit nodes. For linear block codes, the subcode nodes denote rows of the parity-check matrix H. The digit nodes represent the columns of the matrix H. An edge connects a subcode node to a digit node if a nonzero entry exists in the intersection of the corresponding row and column. Bounds proven by Tanner Tanner proved the following bound. Let R be the rate of the resulting linear code, let the degree of the digit nodes be m and the degree of the subcode nodes be n. If each subcode node is associated with a linear code (n, k) with rate r = k/n, then the rate of the code is bounded by R ≥ 1 − m(1 − r). Computational complexity of Tanner graph based methods The advantage of these recursive techniques is that they are computationally tractable. The coding algorithm for Tanner graphs is extremely efficient in practice, although it is not guaranteed to converge except for cycle-free graphs, which are known not to admit asymptotically good codes. Applications of Tanner graph Zemor's decoding algorithm, which is a recursive low-complexity approach to code construction, is based on Tanner graphs. Notes Michael Tanner's original paper Michael Tanner's page Coding theory Application-specific graphs
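The row/column construction is mechanical enough to show in a few lines. A sketch that builds the edge list of a Tanner graph from the parity-check matrix of the (7,4) Hamming code (the node names are illustrative):

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: its columns are all
# seven nonzero binary vectors of length 3.
H = np.array([
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
])

# Subcode (check) nodes come from rows, digit nodes from columns;
# an edge appears wherever H has a nonzero entry.
edges = [(f"subcode{r}", f"digit{c}")
         for r in range(H.shape[0])
         for c in range(H.shape[1])
         if H[r, c]]

print(len(edges), "edges")  # 12 -- one per nonzero entry of H
```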
https://en.wikipedia.org/wiki/Matrix%20congruence
In mathematics, two square matrices A and B over a field are called congruent if there exists an invertible matrix P over the same field such that PTAP = B where "T" denotes the matrix transpose. Matrix congruence is an equivalence relation. Matrix congruence arises when considering the effect of change of basis on the Gram matrix attached to a bilinear form or quadratic form on a finite-dimensional vector space: two matrices are congruent if and only if they represent the same bilinear form with respect to different bases. Note that Halmos defines congruence in terms of conjugate transpose (with respect to a complex inner product space) rather than transpose, but this definition has not been adopted by most other authors. Congruence over the reals Sylvester's law of inertia states that two congruent symmetric matrices with real entries have the same numbers of positive, negative, and zero eigenvalues. That is, the number of eigenvalues of each sign is an invariant of the associated quadratic form. See also Congruence relation Matrix similarity Matrix equivalence References Linear algebra Matrices Equivalence (mathematics)
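Sylvester's law can be checked numerically: applying a congruence transformation changes a symmetric matrix's eigenvalues but not how many are positive, negative, or zero. A small sketch (the matrices are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[2.0, 1.0],
              [1.0, -3.0]])            # symmetric, one eigenvalue of each sign
P = np.eye(2) + rng.random((2, 2))     # generically invertible
assert abs(np.linalg.det(P)) > 1e-9    # guard: P must be invertible

B = P.T @ A @ P                        # B is congruent to A

def signature(M):
    """(#positive, #negative, #zero) eigenvalues of a symmetric matrix."""
    w = np.linalg.eigvalsh(M)
    return (int((w > 1e-9).sum()), int((w < -1e-9).sum()),
            int((abs(w) <= 1e-9).sum()))

# Eigenvalues differ, but the signature -- Sylvester's invariant -- agrees.
print(signature(A) == signature(B))  # True
```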
https://en.wikipedia.org/wiki/4B3T
4B3T, which stands for 4 (four) binary 3 (three) ternary, is a line encoding scheme used for ISDN PRI interface. 4B3T represents four binary bits using three pulses. Description It uses three states: + (positive pulse), 0 (no pulse), and − (negative pulse). This means we have 2⁴ = 16 input combinations to represent, using 3³ = 27 output combinations. 000 is not used to avoid long periods without a transition. 4B3T uses a paired disparity code to achieve an overall zero DC bias: six triplets are used which have no DC component (0+−, 0−+, +0−, −0+, +−0, −+0), and the remaining 20 are grouped into 10 pairs with differing disparity (e.g. ++− and −−+). When transmitting, the DC bias is tracked and a combination chosen that has a DC component of the opposite sign to the running total. This mapping from 4 bits to three ternary states is given in a table known as Modified Monitoring State 43 (MMS43). A competing encoding technique, used for the ISDN basic rate interface where 4B3T is not used, is 2B1Q. The sync sequence used is the 11-symbol Barker code, +++−−−+−−+− or its reverse, −+−−+−−−+++. Encoding table Each 4-bit input group is encoded as a 3-symbol group (transmitted left to right) from the following table. Encoding requires keeping track of the accumulated DC offset, the number of + pulses minus the number of − pulses in all preceding groups. The starting value is arbitrary; here we use the values 1 through 4, although −1.5, −0.5, +0.5 and +1.5 are another possibility. This code forces a transition after at most five consecutive identical non-zero symbols, or four consecutive zero symbols. Decoding table Decoding is simpler, as the decoder does not need to keep track of the encoder state, although doing so allows greater error detection. The 000 triplet is not a legal encoded sequence, but is typically decoded as binary 0000. See also Other line codes that have 3 states: hybrid ternary code bipolar encoding MLT-3 encoding B3ZS References Line
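The disparity-pair selection lends itself to a short sketch. The running-offset logic below follows the description above, but the two-entry nibble table is invented for illustration; the real MMS43 table covers all sixteen 4-bit groups (the triplets themselves, 0−+ and the ++−/−−+ pair, are taken from the examples in the text):

```python
# Illustrative (not the real MMS43) mapping of two 4-bit groups.
TOY_TABLE = {
    0b0001: ["0-+"],          # DC-free triplet: a single option
    0b1011: ["++-", "--+"],   # disparity pair: weights +1 and -1
}

def dc_weight(triplet: str) -> int:
    return triplet.count("+") - triplet.count("-")

def encode(nibbles):
    offset, out = 0, []
    for n in nibbles:
        # Pick the candidate whose disparity best opposes the running offset.
        best = min(TOY_TABLE[n], key=lambda t: abs(offset + dc_weight(t)))
        offset += dc_weight(best)
        out.append(best)
    return " ".join(out), offset

print(encode([0b1011, 0b1011, 0b0001]))  # ('++- --+ 0-+', 0): zero DC bias
```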
https://en.wikipedia.org/wiki/Neopterin
Neopterin is an organic compound belonging to the pteridine class of heterocyclic compounds. Neopterin belongs to the chemical group known as pteridines. It is synthesised by human macrophages upon stimulation with the cytokine interferon-gamma and is indicative of a pro-inflammatory immune status. Neopterin serves as a marker of cellular immune system activation. In humans neopterin follows a circadian and circaseptan rhythm. Biosynthesis The biosynthesis of neopterin occurs in two steps from guanosine triphosphate (GTP). The first being catalyzed by GTP cyclohydrolase, which opens the ribose group. Phosphatases next catalyze the hydrolysis of the phosphate ester group. Neopterin as disease marker Measurement of neopterin concentrations in body fluids like blood serum, cerebrospinal fluid or urine provides information about activation of cellular immune activation in humans under the control of T helper cells type 1. High neopterin production is associated with increased production of reactive oxygen species, neopterin concentrations also allow to estimate the extent of oxidative stress elicited by the immune system. Increased neopterin production is found in, but not limited to, the following diseases: Viral infections including human immunodeficiency virus (HIV), hepatitis B and hepatitis C, SARS-CoV-1, SARS-CoV-2. Bacterial infections by intracellular living bacteria such as Borrelia (Lyme disease), Mycobacterium tuberculosis, and Helicobacter pylori. parasites such as Plasmodium (malaria) Autoimmune diseases such as rheumatoid arthritis (RA) and systemic lupus erythematosus (SLE) Malignant tumor diseases Allograft rejection episodes. A leukodystrophy called Aicardi-Goutieres syndrome Depression and somatization. Neopterin concentrations usually correlate with the extent and activity of the disease, and are also useful to monitor during therapy in these patients. Elevated neopterin concentrations are among the best predictors of adverse outcome in pati
https://en.wikipedia.org/wiki/Stokes%20phenomenon
In complex analysis the Stokes phenomenon, discovered by G. G. Stokes, is where the asymptotic behavior of functions can differ in different regions of the complex plane. This seemingly gives rise to a paradox when looking at the asymptotic expansion of an analytic function. Since an analytic function is continuous, you would expect the asymptotic expansion to be continuous. This paradox is the subject of Stokes' early research and is known as the Stokes phenomenon. The regions in the complex plane with different asymptotic behaviour are bounded by one or two types of curves, known as Stokes curves and anti-Stokes curves. This apparent paradox has since been resolved and the supposed discontinuous jump in the asymptotic expansions has been shown to be smooth and continuous. In order to resolve this paradox, the asymptotic expansion needs to be handled in a careful manner. More specifically, the asymptotic expansion must include additional exponentially small terms relative to the usual algebraic terms included in a usual asymptotic expansion. What happens in the Stokes phenomenon is that an asymptotic expansion in one region may contain an exponentially small contribution (neglecting this contribution still gives a correct asymptotic expansion for that region). However, this exponentially small term can become exponentially large in another region of the complex plane; this change occurs across the anti-Stokes curves. Furthermore, the exponentially small term may switch on or off other exponentially small terms; this change occurs across a Stokes curve. Including these exponentially small terms allows the asymptotic expansion to be written as a continuous expansion for the entire complex domain, which resolves the Stokes phenomenon paradox. Stokes curves and anti-Stokes curves Across a Stokes curve, an exponentially small term can switch on or off another exponentially small term. Across an anti-Stokes curve, a subdominant exponentially small term can switch to a dominant exponen
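The Airy function is the textbook illustration: its expansion on the positive real axis involves a single (recessive) exponential, yet on the negative axis both exponentials contribute, combining into a sine. A sketch of the standard asymptotics (the Stokes-curve placement follows the usual convention for Ai):

```latex
% On the positive axis Ai(z) is purely recessive,
\[
  \operatorname{Ai}(z) \sim \frac{e^{-\frac{2}{3} z^{3/2}}}{2\sqrt{\pi}\, z^{1/4}},
  \qquad z \to +\infty ,
\]
% while on the negative axis the two exponentials combine into an oscillation:
\[
  \operatorname{Ai}(-z) \sim \frac{1}{\sqrt{\pi}\, z^{1/4}}
  \sin\!\left( \tfrac{2}{3} z^{3/2} + \tfrac{\pi}{4} \right),
  \qquad z \to +\infty .
\]
% The subdominant exponential switches on across the Stokes curves
% \arg z = \pm 2\pi/3, where one exponential maximally dominates the other.
```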
https://en.wikipedia.org/wiki/Pick-and-place%20machine
Surface-mount technology (SMT) component placement systems, commonly called pick-and-place machines or P&Ps, are robotic machines which are used to place surface-mount devices (SMDs) onto a printed circuit board (PCB). They are used for high speed, high precision placing of a broad range of electronic components, for example capacitors, resistors, integrated circuits onto the PCBs which are in turn used in computers, consumer electronics as well as industrial, medical, automotive, military and telecommunications equipment. Similar equipment exists for through-hole components. This type of equipment is sometimes used to package microchips using the flip chip method. History 1980s and 1990s During this time, a typical SMT assembly line employed two different types of pick-and-place (P&P) machines arranged in sequence. The unpopulated board was fed into a rapid placement machine. These machines, sometimes called chip shooters, place mainly low-precision, simple package components such as resistors and capacitors. These high-speed P&P machines were built around a single turret design capable of mounting up to two dozen stations. As the turret spins, the stations passing the back of the machine pick up parts from tape feeders mounted on a moving carriage. As the station proceeds around the turret, it passes an optical station that calculates the angle at which the part was picked up, allowing the machine to compensate for drift. Then, as the station reaches the front of the turret, the board is moved into the proper position, the nozzle is spun to put the part in proper angular orientation, and the part is placed on the board. Typical chip shooters can, under optimal conditions, place up to 53,000 parts per hour, or almost 15 parts per second. Because the PCB is moved rather than the turret, only lightweight parts that will not be shaken loose by the violent motion of the PCB can be placed this way. From the high speed machine, the board transits to a precision pla
https://en.wikipedia.org/wiki/Storage%20clamp
A clamp is a compact heap, mound or pile of materials. A storage clamp is used in the agricultural industry for temporary storage of root crops such as potato, turnip, rutabaga, mangelwurzel, and sugar beet. A clamp is formed by excavating a shallow rectangular depression in a field to make a base for the clamp. Root crops are then stacked onto the base up to a height of about . When the clamp is full, the earth scraped from the field to make the base is then used to cover the root crops to a depth of several inches. Straw or old hay may be used to protect the upper surface from rain erosion. A well-made clamp will keep the vegetables cool and dry for many months. Most clamps are relatively long and narrow, allowing the crops to be progressively removed from one end without disturbing the remaining vegetables. The use of a clamp allows a farmer to feed vegetables into market over many months. See also Food preservation Root cellar Brick clamp Charcoal clamp CLAMP, an artist collective named after potato clamps References External links How to make a Storage Clamp - Green Chronicle Food preservation Agricultural terminology
https://en.wikipedia.org/wiki/Safety%20lamp
A safety lamp is any of several types of lamp that provides illumination in places such as coal mines where the air may carry coal dust or a build-up of inflammable gases, which may explode if ignited, possibly by an electric spark. Until the development of effective electric lamps in the early 1900s, miners used flame lamps to provide illumination. Open flame lamps could ignite flammable gases which collected in mines, causing explosions; safety lamps were developed to enclose the flame to prevent it from igniting the explosive gases. Flame safety lamps have been replaced for lighting in mining with sealed explosion-proof electric lights, but continue to be used to detect gases. Background Damps or gases Miners have traditionally referred to the various gases encountered during mining as damps, from the Middle Low German word dampf (meaning "vapour"). Damps are variable mixtures and are historic terms. Firedamp Naturally occurring flammable mixtures, principally methane. Blackdamp or Chokedamp Nitrogen and carbon dioxide with no oxygen. Formed by complete combustion of firedamp or occurring naturally. Coal in contact with air will oxidize slowly and, if unused workings are not ventilated, pockets of blackdamp may develop. Also referred to as azotic air in some 19th-century papers. Whitedamp Formed by the incomplete combustion of coal, or firedamp. The mixture may contain significant amounts of carbon monoxide, which is toxic and potentially explosive. Stinkdamp Naturally occurring hydrogen sulphide and other gases. The hydrogen sulphide is highly toxic, but easily detected by smell. The other gases with it may be firedamp or blackdamp. Afterdamp The gas from an explosion of firedamp or coal dust. Contains varying proportions of blackdamp and whitedamp and is therefore suffocating, toxic, or explosive, or any combination of these. Afterdamp may also contain stinkdamp. Afterdamp may be a bigger killer following an explosion than the explosion itself. Open-
https://en.wikipedia.org/wiki/Shooting%20ratio
The shooting ratio or "Bertolo code" in filmmaking and television production is the ratio between the total duration of the footage created for possible use in a project and that which appears in its final cut. A film with a shooting ratio of 2:1 would have shot twice the amount of footage that was used in the film. In real terms this means that 120 minutes of footage would have been shot to produce a film of 60 minutes in length. While shooting ratios can vary greatly between productions, a typical shooting ratio for a production using film stock will be between 6:1 and 10:1, whereas a similar production using video is likely to be much higher. This is a direct result of the significant difference in price between video tape stock and film stock and the necessary processing. Although the decisions, styles and preferences of the filmmakers can affect the shooting ratio of a project greatly, the nature of the production (genre, form, single camera, multi-camera, etc.) greatly affects the typical range of the ratios seen – documentary films typically have the highest (often exceeding 100:1 following the rise of video and digital media) and animated films have the lowest (typically as close to 1:1 as possible, since the creation of footage frame by frame makes the time costs of animation extremely high compared to live action). Animated productions will often shoot acting reference (by animators of themselves or others), location reference, and performance reference (taken of voice actors), but these pieces of reference footage are not regarded as counting towards the shooting ratio, as they were never intended to appear in the projects they were created for. Audition footage, screen tests, and location reference are similarly not counted towards a narrative film's shooting ratio, live action or animated, for the same reason. Since a documentary may potentially use any footage that is shot at any point for any reason, documentary productions do not have similar e
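The arithmetic in the definition is straightforward to mechanize. A tiny helper reproducing the examples above (names are illustrative):

```python
def footage_shot(final_cut_minutes: float, ratio: float) -> float:
    """Minutes of footage implied by a given shooting ratio (ratio:1)."""
    return final_cut_minutes * ratio

print(footage_shot(60, 2))    # 120.0 -- the 2:1 example in the text
print(footage_shot(90, 100))  # 9000.0 -- a documentary shot at 100:1
```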
https://en.wikipedia.org/wiki/Espacenet
Espacenet (formerly stylized as esp@cenet) is a free online service for searching patents and patent applications. Espacenet was developed by the European Patent Office (EPO) together with the member states of the European Patent Organisation. Most member states have an Espacenet service in their national language, and access to the EPO's worldwide database, most of which is in English. In 2022, the Espacenet worldwide service claimed to have records on more than 140 million patent publications. History By launching Espacenet in 1998, the EPO is said to have "revolutionized public access to international patent information, releasing patent data from its paper prisons and changing forever how patents are disseminated, organized, searched, and retrieved." In 2004, i.e. in the early years of Espacenet, Nancy Lambert considered that, although free, Espacenet, like the United States Patent and Trademark Office (USPTO) database of US patents, "still tend[ed] to have primitive search engines and in some cases rather cumbersome mechanisms to download patents." She reported it as being deliberate, on the part of the USPTO and EPO, "who have said they do not wish to compete unfairly with commercial vendors". In 2009, Espacenet offered the so-called SmartSearch which allows a query to be composed using a subset of Contextual Query Language (CQL). In 2012, the EPO launched "Patent Translate", a free online automatic translation service for patents. Created in partnership with Google, the translation engine was "specifically built to handle complex and technical patent vocabulary", using "millions of official, human-translated patent documents" to train the translation engine. It covers translations between English and 31 other languages. According to the Patent Information News' magazine published by the EPO, a 2013 independent study compared Espacenet with DepatisNet, Freepatentsonline, Google Patent and the public search facility at the USPTO. In that study, Espacenet
https://en.wikipedia.org/wiki/Comparison%20of%20source-code-hosting%20facilities
A source-code-hosting facility (also known as forge) is a file archive and web hosting facility for source code of software, documentation, web pages, and other works, accessible either publicly or privately. They are often used by open-source software projects and other multi-developer projects to maintain revision and version history, or version control. Many repositories provide a bug tracking system, and offer release management, mailing lists, and wiki-based project documentation. Software authors generally retain their copyright when software is posted to a code-hosting facility. General information Features Version control systems Popularity Discontinued: CodePlex, Gna!, Google Code. Specialized hosting facilities The following are open-source software hosting facilities that only serve a specific narrowly focused community or technology. Former hosting facilities Alioth (Debian) – In 2018, Alioth was replaced by a GitLab-based solution hosted on salsa.debian.org; Alioth was finally switched off in June 2018. BerliOS – abandoned in April 2014 Betavine – abandoned sometime in 2015. CodeHaus – shut down in May 2015 CodePlex – shut down in December 2017. Fedora Hosted – closed in March 2017 Gitorious – shut down in June 2015. Gna! – shut down in 2017. Google Code – closed in January 2016, all projects archived. See http://code.google.com/archive/. java.net – Java.net and kenai.com hosting closed April 2017. Phabricator – wound down operations on 1 June 2021; all projects continued to be hosted with very limited support after 31 August 2021. Tigris.org – shut down in July 2020. Mozdev.org – shut down in July 2020. See also Comparison of version-control software Distributed version control Forge (software) List of free software project directories List of version-control software Source code escrow for closed-source software Version control (source-code-management systems) Notes References External links Online servic
https://en.wikipedia.org/wiki/Cyberun
Cyberun is a ZX Spectrum video game by Ultimate Play the Game and published by U.S. Gold in 1986. Although not part of the Jetman series, it has similarities to Jetpac in that the player must construct their spaceship from parts, then seek out resources and power-ups. Gameplay The player controls a spaceship trapped on a planet inhabited by hostile aliens. The goal is to upgrade the spaceship with parts scattered around the planet and mine a valuable element called "Cybernite". The atmosphere above ground is populated by flying aliens and clouds that drip acid, damaging the ship's shields. The ship requires fuel to fly, and once exhausted will bounce along the ground of the planet unable to climb. A similar enemy ship is also on the planet attempting to mine the Cybernite before the player. Fuel can be replenished by tankers on the planet surface, but damaged shields cannot be repaired. The player must venture into caverns below the surface in order to mine the Cybernite, which can only be done once the ship has been upgraded to include a mining laser. Once sufficient Cybernite has been collected, the player can escape to the next planet in the Zebarema system. Reception The game was well received by critics, with Crash awarding it a 90% Crash Smash, and Your Spectrum giving it 8/10, describing the game as "a classic pick up the pieces and shoot em up with brilliant graphics". References External links Cyberun review at CRASH magazine 1986 video games Rare (company) games ZX Spectrum games Amstrad CPC games MSX games Science fiction video games Scrolling shooters Video games developed in the United Kingdom
https://en.wikipedia.org/wiki/Beetle%20%28ASIC%29
The Beetle ASIC is an analog readout chip. It is developed for the LHCb experiment at CERN. Overview The chip integrates 128 channels with low-noise charge-sensitive pre-amplifiers and shapers. The pulse shape can be chosen such that it complies with LHCb specifications: a peaking time of 25 ns with a remainder of the peak voltage after 25 ns of less than 30%. A comparator per channel with configurable polarity provides a binary signal. Four adjacent comparator channels are being ORed and brought off chip via LVDS drivers. Either the shaper or comparator output is sampled with the LHC bunch-crossing frequency of 40 MHz into an analog pipeline. This ring buffer has a programmable latency of a maximum of 160 sampling intervals and an integrated derandomising buffer of 16 stages. For analogue readout data is multiplexed with up to 40 MHz onto one or four ports. A binary readout mode operates at up to 80 MHz output rate on two ports. Current drivers bring the serialised data off chip. The chip can accept trigger rates up to 1.1 MHz to perform a dead-timeless readout within 900 ns per trigger. For testability and calibration purposes, a charge injector with adjustable pulse height is implemented. The bias settings and various other parameters can be controlled via a standard I²C-interface. The chip is radiation hardened to an accumulated dose of more than 100 Mrad. Robustness against single event upset is achieved by redundant logic. External links Beetle - a readout chip for LHCb The Large Hadron Collider beauty experiment Application-specific integrated circuits CERN
https://en.wikipedia.org/wiki/Bubbler%20%28video%20game%29
Bubbler is a ZX Spectrum video game developed and published by Ultimate Play the Game in 1987. It was Ultimate's final release for 8-bit home computers before the company evolved into Rare. The game is an isometric platform game in the style of Marble Madness (1984). Development A Commodore 64 version was outsourced to Lynsoft, but the release was cancelled as Ultimate thought the game was running too slowly. Reception Crash magazine reviewer Ricky disliked the imprecision of the controls. Sinclair User were more impressed by the game; they did not consider it one of Ultimate's most original games or particularly well presented, but thought it was very addictive. It was awarded a 5 star rating. References External links Bubbler at Ultimate Wurlde Review at CRASH Unreleased C64 port 1987 video games Rare (company) games Amstrad CPC games Marble video games MSX games ZX Spectrum games Cancelled Commodore 64 games Video games with isometric graphics Video games developed in the United Kingdom
https://en.wikipedia.org/wiki/Jerzy%20Neyman
Jerzy Neyman (April 16, 1894 – August 5, 1981; born Jerzy Spława-Neyman; ) was a Polish mathematician and statistician who spent the first part of his professional career at various institutions in Warsaw, Poland and then at University College London, and the second part at the University of California, Berkeley. Neyman first introduced the modern concept of a confidence interval into statistical hypothesis testing and co-revised Ronald Fisher's null hypothesis testing (in collaboration with Egon Pearson). Life and career He was born into a Polish family in Bendery, in the Bessarabia Governorate of the Russian Empire, the fourth of four children of Czesław Spława-Neyman and Kazimiera Lutosławska. His family was Roman Catholic, and Neyman served as an altar boy during his early childhood. Later, Neyman would become an agnostic. Neyman's family descended from a long line of Polish nobles and military heroes. He graduated from the Kamieniec Podolski gubernial gymnasium for boys in 1909 under the name Yuri Cheslavovich Neyman. He began studies at Kharkiv University in 1912, where he was taught by Ukrainian probabilist Sergei Natanovich Bernstein. After he read 'Lessons on the integration and the research of the primitive functions' by Henri Lebesgue, he was fascinated with measure and integration. In 1921, he returned to Poland in a program of repatriation of POWs after the Polish-Soviet War. He earned his Doctor of Philosophy degree at University of Warsaw in 1924 for a dissertation titled "On the Applications of the Theory of Probability to Agricultural Experiments". He was examined by Wacław Sierpiński and Stefan Mazurkiewicz, among others. He spent a couple of years in London and Paris on a fellowship to study statistics with Karl Pearson and Émile Borel. After his return to Poland, he established the Biometric Laboratory at the Nencki Institute of Experimental Biology in Warsaw. He published many books dealing with experiments and statistics, and devised the way
https://en.wikipedia.org/wiki/Avid%20DNxHD
Avid DNxHD ("Digital Nonlinear Extensible High Definition") is a lossy high-definition video post-production codec developed by Avid for multi-generation compositing with reduced storage and bandwidth requirements. It is an implementation of the SMPTE VC-3 standard.

Overview
DNxHD is a video codec intended to be usable both as an intermediate format suitable for editing and as a presentation format. DNxHD data is typically stored in an MXF container, although it can also be stored in a QuickTime container. On February 13, 2008, Avid reported that DNxHD was approved as compliant with the SMPTE VC-3 standard. DNxHD is intended to be an open standard but, as of March 2008, has remained effectively a proprietary Avid format. The source code for the Avid DNxHD codec is freely available from Avid for internal evaluation and review, although commercial use requires Avid licensing approval. It has been commercially licensed to a number of companies, including Ikegami, FilmLight, Harris Corporation, JVC, Seachange, and EVS Broadcast Equipment. On September 14, 2014, at the Avid Connect event in Amsterdam, Netherlands, Avid announced the DNxHR codec to support resolutions greater than 1080p, such as 2K and 4K. On December 22, 2014, Avid Technology released an update for Media Composer that added support for 4K resolution, the Rec. 2020 color space, and a bit rate of up to 3,730 Mbit/s with the DNxHR codec.

Implementations
DNxHD was first supported in Avid DS Nitris (September 2004), then in Avid Media Composer Adrenaline with the DNxcel option (December 2004), and finally in Avid Symphony Nitris (December 2005). Xpress Pro is limited to using DNxHD 8-bit compression, which is either imported from file or captured using a Media Composer with Adrenaline hardware. Media Composer 2.5 also allows editing of fully uncompressed HD material that was either imported or captured on a Symphony Nitris or DS Nitris system. Ikegami's Editcam camera system is unique in its support for DNxHD, and rec
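To put the quoted DNxHR figure in perspective, a back-of-envelope calculation (a sketch, assuming the 3,730 Mbit/s maximum from the text is sustained continuously) shows the storage such a stream consumes per hour:

```python
# Storage for one hour of video at the quoted DNxHR maximum bit rate.
bitrate_bits_per_s = 3730e6          # 3,730 Mbit/s, per the text
seconds_per_hour = 3600
total_bytes = bitrate_bits_per_s * seconds_per_hour / 8
print(f"{total_bytes / 1e12:.2f} TB per hour")   # ≈ 1.68 TB/hour
```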
https://en.wikipedia.org/wiki/Double%20tracking
Double tracking or doubling is an audio recording technique in which a performer sings or plays along with their own prerecorded performance, usually to produce a stronger or bigger sound than can be obtained with a single voice or instrument. It is a form of overdubbing; the distinction comes from the doubling of a part, as opposed to recording a different part to go with the first. The effect can be further enhanced by panning one of the performances hard left and the other hard right in the stereo field.

Automation
Artificial or automatic double tracking, also known as ADT, was developed at Abbey Road Studios by engineers recording The Beatles in the 1960s. It used variable-speed tape recorders connected in such a way as to mimic the effect created by double tracking. ADT produced a unique sound that could be imitated but not precisely duplicated by later analog and digital delay devices, which are capable of producing an effect called doubling echo. The effect is used to give one singer a fuller sound.

Examples
Double tracking was pioneered by Buddy Holly. John Lennon particularly enjoyed using the technique for his vocals while in the Beatles. Lennon's post-Beatles albums frequently employed doubling echo on his vocals in place of ADT. Some critics complained that the effect gave the impression that Lennon recorded all his vocals in a bathroom, but some performers, like Black Francis and Paul Simon, value the rich echo-chamber sound that it produces. Paul McCartney also commonly used this technique for his vocals while in the Beatles.

See also
Multitrack recording
Bleed-through

References

External links
Sauravb (September 2021). "Comprehensive guide to double tracking". Vstnation.

Sound recording
Audio engineering
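As an illustration of the idea behind ADT, the following Python sketch mixes a signal with a copy whose delay is slowly modulated, mimicking the pitch and timing drift of a second take or a variable-speed tape machine. All parameter values are hypothetical choices for the illustration, not Abbey Road's settings.

```python
import numpy as np

def double_track(signal, sr=44100, base_delay_ms=25.0, depth_ms=5.0, rate_hz=0.5):
    """Mix the dry signal with a copy delayed by a slowly varying amount,
    a rough software analogue of tape-based ADT (illustrative only)."""
    n = np.arange(len(signal))
    # Delay in samples, modulated like a variable-speed tape transport
    delay = (base_delay_ms + depth_ms * np.sin(2 * np.pi * rate_hz * n / sr)) * sr / 1000.0
    src = n - delay
    # Linear interpolation of the delayed copy
    i0 = np.clip(np.floor(src).astype(int), 0, len(signal) - 1)
    i1 = np.clip(i0 + 1, 0, len(signal) - 1)
    frac = np.clip(src - np.floor(src), 0.0, 1.0)
    delayed = (1.0 - frac) * signal[i0] + frac * signal[i1]
    return 0.5 * (signal + delayed)
```

Panning the dry signal and the delayed copy to opposite channels, as described above, strengthens the impression of two separate performances.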
https://en.wikipedia.org/wiki/Heliodisplay
The Heliodisplay is an air-based display that works principally with the air already present in the operating environment (room or space). The system, developed by IO2 Technology in 2001, uses a projection unit focused onto multiple layers of air and dry micron-size atomized particles in mid-air, resulting in a two-dimensional display that appears to float. This is similar in principle to the cinematic technique of rear projection, and the image can appear three-dimensional when using appropriate content. As dark areas of the image may appear invisible, the image may look more realistic than on a projection screen, although it is still not volumetric. The system does allow multiple viewers and dual (front and back) viewing when combined with two light sources. An oblique viewing angle of about ±30 degrees may be required in some configurations because of the rear-projection arrangement. The Heliodisplay can operate as a free-space touchscreen when ordered as an interactive unit with sensors embedded in the equipment. The original 2001 prototype used a PC that treated the Heliodisplay as a pointing device, like a mouse; with the supplied software installed, one could use a finger, pen, or another object for cursor control and navigate or interact with simple content. As of 2010, no computer or drivers are required: the interactive ("i") version of the Heliodisplay contains an embedded processor that controls these functions internally for single-touch or multi-touch interactivity, using an equipment-mounted arrangement but without the IR laser field found on earlier versions. The smaller Heliodisplay version is transportable and about the size of a lunchbox (30 cm x 30 cm x 12 cm), similar to the 2002 version. Larger systems, such as those that project life-size people with image diagonals of up to 2.3 m, have the same footprint, about the size of a sheet of paper. The air-based system is formed b
https://en.wikipedia.org/wiki/Instrument%20error
Instrument error refers to the error of a measuring instrument, or the difference between the actual value and the value indicated by the instrument. There can be errors of various types, and the overall error is the sum of the individual errors. Types of errors include:
systematic errors
random errors
absolute error
other error

Systematic errors
The size of the systematic error is sometimes referred to as the accuracy. For example, the instrument may always indicate a value 5% higher than the actual value, or the relationship between the indicated and actual values may be more complicated than that. A systematic error may arise because the instrument has been incorrectly calibrated, or because a defect has arisen in the instrument since it was calibrated. Instruments should be calibrated against a standard instrument that is known to be accurate, and ideally the calibration should be repeated at intervals. The most rigorous standards are those maintained by a standards organization such as NIST in the United States or the ISO in Europe. If the users know the amount of the systematic error, they may decide to adjust for it manually rather than having the instrument expensively adjusted to eliminate the error: e.g. in the above example they might manually reduce all the values read by about 4.8%.

Random errors
The range of possible random errors is sometimes referred to as the precision. Random errors may arise because of the design of the instrument. In particular, they may be subdivided into errors in the amount shown on the display and errors in how accurately the display can actually be read.

Amount shown on the display
Sometimes the effect of random error can be reduced by repeating the measurement a few times and taking the average result.

How accurately the display can be read
If the instrument has a needle which points to a scale graduated in steps of 0.1 units, then depending on the design of the instrument it is usually possi
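The distinction between the two error types can be made concrete with a small simulation: averaging many readings shrinks the random component but leaves the systematic 5% bias untouched, which then has to be corrected manually as described above. The instrument model and noise level here are hypothetical.

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 100.0

def read_instrument():
    """One reading: a 5% systematic (calibration) error plus Gaussian
    random noise. Both error magnitudes are illustrative."""
    return TRUE_VALUE * 1.05 + random.gauss(0.0, 0.8)

readings = [read_instrument() for _ in range(50)]
mean = statistics.fmean(readings)
print(f"mean of 50 readings: {mean:.2f}")             # near 105: averaging tames
                                                      # the noise, not the bias
print(f"after reducing by ~4.8%: {mean / 1.05:.2f}")  # manual bias correction
```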
https://en.wikipedia.org/wiki/Long%20branch%20attraction
In phylogenetics, long branch attraction (LBA) is a form of systematic error whereby distantly related lineages are incorrectly inferred to be closely related. LBA arises when the amount of molecular or morphological change accumulated within a lineage is sufficient to make that lineage appear similar (and thus closely related) to another long-branched lineage, solely because both have undergone a large amount of change, rather than because they are related by descent. Such bias is more common when the overall divergence of some taxa results in long branches within a phylogeny. Long branches are often attracted to the base of a phylogenetic tree, because the lineage included to represent an outgroup is often also long-branched. The frequency of true LBA is unclear and often debated, and some authors view it as untestable and therefore irrelevant to empirical phylogenetic inference. Although often viewed as a failing of parsimony-based methodology, LBA could in principle result from a variety of scenarios and be inferred under multiple analytical paradigms.

Causes
LBA was first recognized as problematic when analyzing discrete morphological character sets under parsimony criteria; however, maximum likelihood analyses of DNA or protein sequences are also susceptible. A simple hypothetical example can be found in Felsenstein (1978), where it is demonstrated that, for certain unknown "true" trees, some methods can show bias for grouping long branches, ultimately resulting in the inference of a false sister relationship. Often this is because convergent evolution of one or more characters included in the analysis has occurred in multiple taxa. Although they were derived independently, these shared traits can be misinterpreted in the analysis as being shared due to common ancestry. In phylogenetic and clustering analyses, LBA is a result of the way clustering algorithms work: terminals or taxa with many autapomorphies (character states unique to a single branch) may by
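A minimal simulation can make the Felsenstein-style scenario concrete. The sketch below evolves binary characters on a four-taxon tree in which taxa A and C sit on long branches, then scores each possible unrooted quartet topology by parsimony; with the hypothetical change probabilities chosen here, the false grouping of the two long branches typically receives the best score.

```python
import random

random.seed(0)

# True unrooted tree: AB|CD, with A and C on long branches (Felsenstein zone).
P_LONG, P_SHORT, N_SITES = 0.40, 0.05, 2000   # illustrative probabilities

def evolve(state, p):
    """Flip a binary character with probability p along a branch."""
    return state ^ (random.random() < p)

def simulate_site():
    root = 0
    left = evolve(root, P_SHORT)   # internal branch toward (A, B)
    a = evolve(left, P_LONG)       # A: long terminal branch
    b = evolve(left, P_SHORT)
    c = evolve(root, P_LONG)       # C: long terminal branch
    d = evolve(root, P_SHORT)
    return (a, b, c, d)

def quartet_score(site, split):
    """Minimum number of changes for one binary site on an unrooted
    quartet whose internal edge induces the given split of taxon indices."""
    if len(set(site)) == 1:
        return 0
    (i, j), (k, l) = split
    if site[i] == site[j] and site[k] == site[l] and site[i] != site[k]:
        return 1                              # pattern fits the split: one change
    return 1 if sum(site) in (1, 3) else 2    # singleton: 1; conflicting 2+2: 2

sites = [simulate_site() for _ in range(N_SITES)]
for name, split in [("AB|CD (true tree)", ((0, 1), (2, 3))),
                    ("AC|BD (long branches grouped)", ((0, 2), (1, 3))),
                    ("AD|BC", ((0, 3), (1, 2)))]:
    print(name, sum(quartet_score(s, split) for s in sites))
```

Because parallel changes on the two long branches are far more likely than a single change on the short internal branch, patterns uniting A and C dominate and parsimony prefers the wrong tree, which is the essence of long branch attraction.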
https://en.wikipedia.org/wiki/Bolster%20heath
Bolster heath or cushion moorland is a type of vegetation community featuring a patchwork of very low-growing, tightly packed plants, found at the limits of some alpine environments. Several different plants grow together to form smooth-surfaced 'cushions', hence the common name of cushion heath. The cushion growth habit provides protection against the desiccating wind and helps keep the cluster warm. Bolster heath is very slow-growing and thus very fragile. Most propagation is by slow expansion, although two species, Abrotanella forsteroides and Pterygopappus lawrencei, produce enough viable seed to survive fire; the other species are generally permanently destroyed by fire. The soil in bolster heath is generally quite poor, often gravel with a thin layer of peat.

Tasmanian bolster heaths
Asteraceae
Abrotanella forsteroides (Abrotanella)
Ewartia meredithiae (Ewartia)
Pterygopappus lawrencei (Pterygopappus)
Caryophyllaceae
Colobanthus pulvinatus (Colobanthus)
Scleranthus biflorus (Scleranthus)
Donatiaceae
Donatia novae-zelandiae (Donatia)
Epacridaceae
Dracophyllum minimum (Dracophyllum)
Loganiaceae
Mitrasacme archeri (Mitrasacme)
Scrophulariaceae
Chionohebe ciliolata (Chionohebe)
Stylidiaceae
Phyllachne colensoi (Phyllachne)
Thymelaeaceae
Pimelea pygmaea (Pimelea)
Centrolepidaceae
Centrolepis monogyna (Centrolepis)
Centrolepis muscoides (Centrolepis)
Gaimardia fitzgeraldii (Gaimardia)
Gaimardia setacea (Gaimardia)
Cyperaceae
Carpha rodwayi (Carpha)
Oreobolus acutifolius (Oreobolus)
Oreobolus oligocephalus (Oreobolus)
Oreobolus oxycarpus (Oreobolus)
Oreobolus pumilio (Oreobolus)

References
Forest Practices Authority (2007). Threatened Native Vegetation Community Information Sheet: Cushion moorland. Accessed online: 22 June 2008.

Biogeography
Heaths