https://en.wikipedia.org/wiki/Royal%20Military%20Canal
|
The Royal Military Canal is a canal running between Seabrook, near Folkestone, and Cliff End, near Hastings, following the old cliff line bordering Romney Marsh. It was constructed as a defence against a possible invasion of England during the Napoleonic Wars.
History
Origin and construction
The canal was conceived by Lieutenant-Colonel John Brown of the Royal Staff Corps of field engineers in 1804, during anti-invasion preparations, as a defensible barrier to ensure that a French force could not use the Romney Marsh as a bridgehead. It had previously been assumed that the marsh could be inundated in the event of an invasion, but Brown argued that this would take ten days to implement and would cause massive disruption in the event of a false alarm. At a meeting on 26 September 1804, the Prime Minister, William Pitt the Younger, and the Commander-in-Chief of the Forces, the Duke of York, both enthusiastically endorsed the scheme. John Rennie was appointed consultant engineer, and Pitt personally persuaded the local landowners to agree to the new canal.
Construction started at Seabrook, near Hythe in Kent, on 30 October 1804. By May 1805 only six miles of the canal had been completed; William Pitt intervened and the contractors and Rennie were dismissed. The work was resumed by the Quartermaster-General's department with Lt-Col. Brown in command. Civilian navvies dug the canal itself, while soldiers built the ramparts. It was constructed in two sections: the longer section starts at Hythe and ends at Iden Lock in East Sussex; the second, smaller section runs from the foot of Winchelsea Hill to Cliff End. The two sections are linked by the River Rother (Eastern) and River Brede. Artillery batteries were generally located at the points where the canal was staggered to create a salient, allowing the guns to enfilade the next stretch of water. A military road was built on the inland side of the canal, and crossings consisted of moveable wooden bridges. Any troops
|
https://en.wikipedia.org/wiki/Cathleen%20Synge%20Morawetz
|
Cathleen Synge Morawetz (May 5, 1923 – August 8, 2017) was a Canadian mathematician who spent much of her career in the United States. Morawetz's research was mainly in the study of the partial differential equations governing fluid flow, particularly those of mixed type occurring in transonic flow. She was professor emerita at the Courant Institute of Mathematical Sciences at New York University, where she had also served as director from 1984 to 1988. She was awarded the National Medal of Science in 1998.
Childhood
Morawetz's father, John Lighton Synge, nephew of John Millington Synge, was an Irish mathematician specializing in the geometry of general relativity. Her mother also studied mathematics for a time. Her uncle was Edward Hutchinson Synge, who is credited as the inventor of the near-field scanning optical microscope and of very large astronomical telescopes based on multiple mirrors.
Her childhood was split between Ireland and Canada. Both her parents were supportive of her interest in mathematics and science, and it was a woman mathematician, Cecilia Krieger, who had been a family friend for many years who later encouraged Morawetz to pursue a Ph.D. in mathematics. Morawetz said her father was influential in stimulating her interest in mathematics, but he wondered whether her studying mathematics would be wise (suggesting they might fight like the Bernoulli brothers).
Education
A graduate of the University of Toronto in 1945, Morawetz received her master's degree in 1946 at the Massachusetts Institute of Technology. She then took a job at New York University, where she edited Supersonic Flow and Shock Waves by Richard Courant and Kurt Otto Friedrichs. She earned her Ph.D. in 1951 at New York University under the supervision of Friedrichs, with a thesis on the stability of a spherical implosion entitled Contracting Spherical Shocks Treated by a Perturbation Method.
Career
After earning her doctorate, Morawetz spent a year as
|
https://en.wikipedia.org/wiki/Applied%20ecology
|
Applied ecology is a sub-field within ecology that considers the application of the science of ecology to real-world (usually management) questions. It is also described as a scientific field that focuses on the application of concepts, theories, models, or methods of fundamental ecology to environmental problems.
Concept
Applied ecology is an integrated treatment of the ecological, social, and biotechnological aspects of natural resource conservation and management. Applied ecology typically focuses on geomorphology, soils, and plant communities as the underpinnings for vegetation and wildlife (both game and non-game) management.
Applied ecology encompasses all disciplines related to human activities, covering not only agriculture, forestry, and fisheries but also global change. It has two study categories. The first involves the outputs, or the fields that address the use and management of the environment, particularly its ecosystem services and exploitable resources. The second involves the inputs, or the fields concerned with management strategies and human influences on ecosystems and biodiversity.
The discipline is often linked to ecological management on the grounds that the effective management of natural ecosystems depends on ecological knowledge. It often uses an ecological approach to solve problems of specific parts of the environment, which can involve the comparison of plausible options (e.g. best management options).
The role of applied science in agricultural production has been brought into greater focus as fluctuations in global food production feed through into prices and availability to consumers.
Approaches
Applied ecologists often use one or more of the following approaches, namely, observation, experimentation, and modeling. For example, a wildlife preservation project could involve: observational studies of the wildlife ecology; experiments to understand causal relationships; and the application of modeling to
|
https://en.wikipedia.org/wiki/PLS%20%28file%20format%29
|
PLS is a computer file format for a multimedia playlist. It is typically used by media players for streaming media over the Internet, but may also be used for playing local media.
For online streaming, the .PLS file is typically downloaded just once from the media source—such as an online radio station—for immediate or future use. While most computers and players automatically recognize the .PLS format, the first time a PLS file is used on a computer the media player's settings may need to be changed so that it recognizes (is "associated with") .PLS files.
PLS was originally developed for use with the museArc audio player software by codeArts, and was later used by SHOUTcast and Icecast for streaming media over the Internet. PLS is a more expressive playlist format than the basic M3U playlist, as it can store (cache) information on the song title and length (this is supported in extended M3U only).
File format
The format is case-sensitive and essentially that of an INI file, structured as follows:
Header
[playlist] : This tag indicates that it is a Playlist File
Track Entry
Assuming track entry #X
FileX : Variable defining location of media file/stream (like .m3u/.m3u8 playlists).
TitleX : Defines track title. (optional)
LengthX : Length in seconds of track. Value of -1 indicates indefinite (streaming). (optional)
Footer
NumberOfEntries : This variable indicates the number of tracks, and therefore equals the number used for the last track.
Version : Playlist version. Currently only a value of 2 is valid.
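Since the format is INI-structured, these fields can be read with a generic INI parser. Below is a minimal sketch in Python using the standard-library configparser; the sample playlist text is illustrative, and optionxform is overridden because PLS keys are case-sensitive:

import configparser

# PLS is INI-structured; override optionxform so key case is preserved,
# since the PLS format is case-sensitive.
parser = configparser.ConfigParser(interpolation=None)
parser.optionxform = str
parser.read_string("""\
[playlist]
File1=http://stream2.streamq.net:8020/
Title1=Example station
Length1=-1
NumberOfEntries=1
Version=2
""")

playlist = parser["playlist"]
for i in range(1, int(playlist["NumberOfEntries"]) + 1):
    # Title and Length are optional, so fall back to defaults.
    print(playlist[f"File{i}"],
          playlist.get(f"Title{i}", "(untitled)"),
          playlist.get(f"Length{i}", "-1"))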
Examples
Example of a complete PLS file used for streaming audio; in this case, to connect to a particular online radio station and receive its audio stream:
[playlist]
File1=http://stream2.streamq.net:8020/
Title1=Here enter name of the station
NumberOfEntries=1
An alternative example, mixing a stream URL with local paths:
[playlist]
File1=http://relay5.181.fm:8068
Length1=-1
File2=example2.mp3
Title2=Just some local audio that is 2mins long
Length2=120
File3=F:\Music\what
|
https://en.wikipedia.org/wiki/Numerical%20differentiation
|
In numerical analysis, numerical differentiation algorithms estimate the derivative of a mathematical function or function subroutine using values of the function and perhaps other knowledge about the function.
Finite differences
The simplest method is to use finite difference approximations.
A simple two-point estimation is to compute the slope of a nearby secant line through the points $(x, f(x))$ and $(x + h, f(x + h))$. Choosing a small number $h$, $h$ represents a small change in $x$, and it can be either positive or negative. The slope of this line is
$$\frac{f(x + h) - f(x)}{h}.$$
This expression is Newton's difference quotient (also known as a first-order divided difference).
The slope of this secant line differs from the slope of the tangent line by an amount that is approximately proportional to $h$. As $h$ approaches zero, the slope of the secant line approaches the slope of the tangent line. Therefore, the true derivative of $f$ at $x$ is the limit of the value of the difference quotient as the secant lines get closer and closer to being a tangent line:
$$f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}.$$
Since immediately substituting 0 for $h$ results in the indeterminate form $\frac{0}{0}$, calculating the derivative directly can be unintuitive.
Equivalently, the slope could be estimated by employing the positions $(x - h, f(x - h))$ and $(x, f(x))$.
Another two-point formula is to compute the slope of a nearby secant line through the points $(x - h, f(x - h))$ and $(x + h, f(x + h))$. The slope of this line is
$$\frac{f(x + h) - f(x - h)}{2h}.$$
This formula is known as the symmetric difference quotient. In this case the first-order errors cancel, so the slope of these secant lines differs from the slope of the tangent line by an amount that is approximately proportional to $h^2$. Hence for small values of $h$ this is a more accurate approximation to the tangent line than the one-sided estimation. However, although the slope is being computed at $x$, the value of the function at $x$ is not involved.
The estimation error is given by
$$R = \frac{-f^{(3)}(c)}{6} h^2,$$
where $c$ is some point between $x - h$ and $x + h$.
This error does not include the rounding error due to numbers being represented and calculations being performed in limited precision.
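To make these error orders concrete, here is a small Python check (not from the source) differentiating $\sin$ at $x = 1$, where the exact derivative is $\cos 1$; the one-sided error shrinks roughly in proportion to $h$ and the symmetric error in proportion to $h^2$:

import math

def forward_diff(f, x, h):
    # One-sided (Newton's) difference quotient: error ~ O(h)
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Symmetric difference quotient: error ~ O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
exact = math.cos(x)
for h in (1e-1, 1e-2, 1e-3):
    print(h,
          abs(forward_diff(math.sin, x, h) - exact),
          abs(central_diff(math.sin, x, h) - exact))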
The symmet
|
https://en.wikipedia.org/wiki/Arithmetic%20underflow
|
The term arithmetic underflow (also floating point underflow, or just underflow) is a condition in a computer program where the result of a calculation is a number of smaller absolute value than the computer can actually represent in memory on its central processing unit (CPU).
Arithmetic underflow can occur when the true result of a floating point operation is smaller in magnitude (that is, closer to zero) than the smallest value representable as a normal floating point number in the target datatype. Underflow can in part be regarded as negative overflow of the exponent of the floating point value. For example, if the exponent part can represent values from −128 to 127, then a result with a value less than −128 may cause underflow.
Storing values that are too low in an integer variable (e.g., attempting to store −1 in an unsigned integer) is properly referred to as integer overflow, or more broadly, integer wraparound. The term underflow normally refers to floating point numbers only, which is a separate issue. It is impossible in most floating-point designs to store a too-low value, as usually they are signed and have a negative infinity value.
Underflow gap
The interval between −fminN and fminN, where fminN is the smallest positive normal floating point value, is called the underflow gap. This is because the size of this interval is many orders of magnitude larger than the distance between adjacent normal floating point values just outside the gap. For instance, if the floating point datatype can represent 20 bits, the underflow gap is $2^{21}$ times larger than the absolute distance between adjacent floating point values just outside the gap.
In older designs, the underflow gap had just one usable value, zero. When an underflow occurred, the true result was replaced by zero (either directly by the hardware, or by system software handling the primary underflow condition). This replacement is called "flush to zero".
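The gap and the effect of gradual underflow can be observed directly with IEEE 754 double precision, the standard Python float; a short sketch:

import sys

fmin = sys.float_info.min     # smallest positive *normal* double
print(fmin)                   # 2.2250738585072014e-308

# Values below fmin fall into the underflow gap; IEEE 754 fills it
# with subnormal numbers instead of flushing straight to zero.
sub = fmin / 2**30
print(sub, sub > 0.0)         # a nonzero subnormal: gradual underflow

# Below the smallest subnormal (about 5e-324), the result is zero.
print(fmin / 2**1074)         # 0.0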
The 1984 edition of IEEE 754 introduced subnormal n
|
https://en.wikipedia.org/wiki/Mode%207
|
Mode 7 is a graphics mode on the Super Nintendo Entertainment System video game console that allows a background layer to be rotated and scaled on a scanline-by-scanline basis to create many different depth effects. It also supports wrapping effects such as translation and reflection.
The most famous of these effects is the application of a perspective effect on a background layer by scaling and rotating the background layer in this manner. This transforms the background layer into a two-dimensional horizontal texture-mapped plane that trades height for depth. Thus, an impression of three-dimensional graphics is achieved.
Mode 7 was one of Nintendo's prominent selling points for the Super NES platform in publications such as Nintendo Power and Super NES Player's Guide. Similar faux 3D techniques have been presented on a few 2D systems other than the Super NES, in select peripherals and games.
Overview
The Super NES console has eight graphics modes, numbered from 0 to 7, for displaying background layers. The last one (background mode 7) has a single layer that can be scaled and rotated. Two-dimensional affine transformations can produce any combination of translation, scaling, reflection, rotation, and shearing. However, many games create additional effects by setting a different transformation matrix for each scanline. In this way, pseudo-perspective, curved surface, and distortion effects can be achieved.
Mode 7 graphics are generated for each pixel by mapping screen coordinates to background coordinates using an affine transformation and sampling the corresponding background color. The 2D affine transformation is specified for each scanline by six parameters: $a$, $b$, $c$, and $d$ (which together define the matrix $M$), and $x_0$ and $y_0$ (which define the vector $\vec{r}_0$, the origin). Specifically, the screen coordinate $(x, y)$ is translated into the origin's coordinate system, the matrix is applied, and the result is translated back to the original coordinate system to obtain the background coordinate $(x', y')$.
In 2D matrix notation:
$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x - x_0 \\ y - y_0 \end{pmatrix} + \begin{pmatrix} x_0 \\ y_0 \end{pmatrix}$$
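A minimal Python sketch of this per-pixel mapping, with variable names following the notation above (an illustration of the arithmetic, not of the SNES hardware registers):

def mode7_map(x, y, a, b, c, d, x0, y0):
    # Translate into the origin's coordinate system, apply the 2x2
    # matrix, then translate back to get background coordinates.
    dx, dy = x - x0, y - y0
    return (a * dx + b * dy + x0,
            c * dx + d * dy + y0)

# With the identity matrix, the screen maps straight onto the background.
print(mode7_map(10, 20, 1, 0, 0, 1, 128, 112))   # (10, 20)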
|
https://en.wikipedia.org/wiki/CMoy
|
A CMoy is a pocket headphone amplifier originally designed by Pow Chu Moy.
The headphone amplifier is designed around single or dual-channel operational amplifiers (op-amps) such as Burr-Brown's OPA2134 or OPA2132PA; however, a wide variety of op-amps have been successfully used. As the op-amp directly drives the headphones, some care should be given when choosing an op-amp: some op-amps are not suitable for such low impedance loads and will give poor performance. (See op-amp swapping.)
The amplifier's design is quite simple. It consists of only a few components, can be assembled on a small section of protoboard, has a lower parts cost than other headphone amplifiers, and can run for many hours on a single 9 volt battery.
Circuit
A typical CMoy consists of two identical AC-coupled, non-inverting operational amplifier circuits, each with a 100 kΩ input impedance.
Power is supplied to the opamps using a dual power supply, which effectively divides the input voltage source in half to create a virtual ground. Many virtual ground circuit options are presented in the various CMoy tutorials found online.
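As a rough illustration of the supply splitter and input coupling, here is a short calculation with assumed (not canonical) component values: an equal-resistor divider places the virtual ground at half the battery voltage, and a 0.1 µF coupling capacitor into the 100 kΩ input impedance puts the high-pass corner below the audio band:

import math

V_batt = 9.0                    # a single 9 V battery
V_virtual_ground = V_batt / 2   # equal-resistor divider -> +/-4.5 V rails
print(V_virtual_ground)         # 4.5

# AC-coupling corner frequency: f_c = 1 / (2 * pi * R * C)
R, C = 100e3, 0.1e-6            # 100 kOhm input impedance, assumed 0.1 uF cap
f_c = 1 / (2 * math.pi * R * C)
print(round(f_c, 1), "Hz")      # ~15.9 Hz, below the audible range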
References
External links
How to Build the CMoy Pocket Amplifier
Joechei – Yet Another Cmoy Headphone Amplifier
Audio amplifiers
|
https://en.wikipedia.org/wiki/Phasor
|
In physics and engineering, a phasor (a portmanteau of phase vector) is a complex number representing a sinusoidal function whose amplitude ($A$), angular frequency ($\omega$), and initial phase ($\theta$) are time-invariant. It is related to a more general concept called analytic representation, which decomposes a sinusoid into the product of a complex constant and a factor depending on time and frequency. The complex constant, which depends on amplitude and phase, is known as a phasor, or complex amplitude, and (in older texts) sinor or even complexor.
A common situation in electrical networks powered by time varying current is the existence of multiple sinusoids all with the same frequency, but different amplitudes and phases. The only difference in their analytic representations is the complex amplitude (phasor). A linear combination of such functions can be represented as a linear combination of phasors (known as phasor arithmetic or phasor algebra) and the time/frequency dependent factor that they all have in common.
The origin of the term phasor rightfully suggests that a (diagrammatic) calculus somewhat similar to that possible for vectors is possible for phasors as well. An important additional feature of the phasor transform is that differentiation and integration of sinusoidal signals (having constant amplitude, period and phase) correspond to simple algebraic operations on the phasors; the phasor transform thus allows the analysis (calculation) of the AC steady state of RLC circuits by solving simple algebraic equations (albeit with complex coefficients) in the phasor domain instead of solving differential equations (with real coefficients) in the time domain. The originator of the phasor transform was Charles Proteus Steinmetz, working at General Electric in the late 19th century. He got his inspiration from Oliver Heaviside: Heaviside's operational calculus was modified so that the variable $p$ becomes $j\omega$. The complex number $j$ has a simple meaning: phase shift.
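As a numerical illustration of phasor arithmetic (the amplitudes and phases are chosen arbitrarily), two same-frequency sinusoids can be added by adding their complex amplitudes:

import cmath, math

# 3*cos(wt + pi/6) and 4*cos(wt - pi/3), represented as phasors A*exp(j*theta)
p1 = 3 * cmath.exp(1j * math.pi / 6)
p2 = 4 * cmath.exp(-1j * math.pi / 3)
psum = p1 + p2

A, theta = abs(psum), cmath.phase(psum)
print(A, math.degrees(theta))   # amplitude 5.0, phase about -23.1 degrees

# So the sum is 5*cos(wt - 0.404 rad), for any shared frequency w.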
Glossin
|
https://en.wikipedia.org/wiki/Calorie%20restriction
|
Calorie restriction (caloric restriction or energy restriction) is a dietary regimen that reduces the energy intake from foods and beverages without incurring malnutrition. The possible effect of calorie restriction on body weight management, longevity, and aging-associated diseases has been an active area of research.
Dietary guidelines
Caloric intake control, and reduction for overweight individuals, is recommended by US dietary guidelines and science-based societies.
Calorie restriction is recommended for people with diabetes and prediabetes, in combination with physical exercise and a weight loss goal of 5–15% for diabetes and 7–10% for prediabetes to prevent progression to diabetes. Mild calorie restriction may be beneficial for pregnant women to reduce weight gain (without weight loss) and reduce perinatal risks for both the mother and child. For overweight or obese individuals, calorie restriction may improve health through weight loss, although a gradual weight regain may occur over subsequent years.
Risks of malnutrition
The term "calorie restriction" as used in the study of aging refers to dietary regimens that reduce calorie intake without incurring malnutrition. If a restricted diet is not designed to include essential nutrients, malnutrition may result in serious deleterious effects, as shown in the Minnesota Starvation Experiment. This study was conducted during World War II on a group of lean men, who restricted their calorie intake by 45% for six months and composed roughly 77% of their diet with carbohydrates. As expected, this malnutrition resulted in metabolic adaptations, such as decreased body fat, improved lipid profile, and decreased resting heart rate. The experiment also caused negative effects, such as anemia, edema, muscle wasting, weakness, dizziness, irritability, lethargy, and depression.
Typical low-calorie diets may not supply the nutrient intake that a properly designed calorie restriction diet includes.
Possible side effects
Peop
|
https://en.wikipedia.org/wiki/Password%20policy
|
A password policy is a set of rules designed to enhance computer security by encouraging users to employ strong passwords and use them properly. A password policy is often part of an organization's official regulations and may be taught as part of security awareness training. Either the password policy is merely advisory, or the computer systems force users to comply with it. Some governments have national authentication frameworks that define requirements for user authentication to government services, including requirements for passwords.
NIST guidelines
The United States Department of Commerce's National Institute of Standards and Technology (NIST) has put out two standards for password policies which have been widely followed.
2004
From 2004, NIST Special Publication 800-63, Appendix A, advised people to use irregular capitalization, special characters, and at least one numeral. This was the advice most systems followed, and it was "baked into" a number of standards that businesses needed to follow.
2017
However, in 2017 a major update changed this advice, particularly the view on forced complexity and regular changes, which are now seen as bad practice.
The key points of the new guidelines are:
Verifiers should not impose composition rules (e.g., requiring mixtures of different character types or prohibiting consecutively repeated characters)
Verifiers should not require passwords to be changed arbitrarily or regularly (e.g., the previous 90-day rule)
Passwords must be at least 8 characters in length
Password systems should permit subscriber-chosen passwords at least 64 characters in length.
All printing ASCII characters, the space character, and Unicode characters should be acceptable in passwords
When establishing or changing passwords, the verifier shall advise the subscriber that they need to select a different password if they have chosen a weak or compromised password
Verifiers should offer guidance, such as a password-strength meter, to assist the user in choosing a strong password
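A minimal sketch of a verifier check in the spirit of this guidance: a length floor, no composition rules, and a lookup against compromised passwords (the tiny blocklist below is a stand-in for a real breach-derived corpus):

COMPROMISED = {"password", "12345678", "qwertyuiop"}   # stand-in blocklist

def check_password(candidate):
    # Return a reason to reject the password, or None to accept it.
    if len(candidate) < 8:
        return "too short: at least 8 characters required"
    if candidate.lower() in COMPROMISED:
        return "found in a breach corpus: choose a different password"
    # No composition rules; spaces and Unicode are acceptable.
    return None

print(check_password("correct horse battery staple"))   # None -> accepted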
|
https://en.wikipedia.org/wiki/V-model
|
The V-model is a graphical representation of a systems development lifecycle. It is used to produce rigorous development lifecycle models and project management models. The V-model falls into three broad categories, the German V-Modell, a general testing model, and the US government standard.
The V-model summarizes the main steps to be taken in conjunction with the corresponding deliverables within computerized system validation framework, or project life cycle development. It describes the activities to be performed and the results that have to be produced during product development.
The left side of the "V" represents the decomposition of requirements and the creation of system specifications. The right side of the "V" represents the integration of parts and their validation. However, requirements first need to be validated against the higher-level requirements or user needs. Furthermore, there is also such a thing as validation of system models, which can partially be done on the left side as well, so it is not quite correct to claim that validation occurs only on the right side. The simplest formulation is that verification is always against the requirements (technical terms), while validation is always against the real world or the user's needs. The aerospace standard RTCA DO-178B states that requirements are validated—confirmed to be true—and the end product is verified to ensure it satisfies those requirements.
Validation can be expressed with the query "Are you building the right thing?" and verification with "Are you building it right?"
Types
There are three general types of V-model.
V-Modell
"V-Modell" is the official project management method of the German government. It is roughly equivalent to PRINCE2, but more directly relevant to software development. The key attribute of using a "V" representation was to require proof that the products from the left-side of the V were acceptable by the appropriate test and integration organization implementing the right-side of
|
https://en.wikipedia.org/wiki/Zeno%20machine
|
In mathematics and computer science, Zeno machines (abbreviated ZM, and also called accelerated Turing machine, ATM) are a hypothetical computational model related to Turing machines that are capable of carrying out computations involving a countably infinite number of algorithmic steps. These machines are ruled out in most models of computation.
The idea of Zeno machines was first discussed by Hermann Weyl in 1927; the name refers to Zeno's paradoxes, attributed to the ancient Greek philosopher Zeno of Elea. Zeno machines play a crucial role in some theories. The theory of the Omega Point devised by physicist Frank J. Tipler, for instance, can only be valid if Zeno machines are possible.
Definition
A Zeno machine is a Turing machine that can take an infinite number of steps, and then continue to take more steps. This can be thought of as a supertask where $2^{-n}$ units of time are taken to perform the $n$-th step; thus, the first step takes 0.5 units of time, the second takes 0.25, the third 0.125, and so on, so that after one unit of time a countably infinite number of steps will have been performed.
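The timing can be checked numerically: the step durations form a geometric series whose partial sums stay below one time unit, so the whole countable sequence of steps fits within it:

# Step n takes 2**-n time units; any finite prefix finishes before t = 1.
total = 0.0
for n in range(1, 60):
    total += 2.0 ** -n
print(total)   # approaches 1.0 from below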
Infinite time Turing machines
A more formal model of the Zeno machine is the infinite time Turing machine. Defined first in unpublished work by Jeffrey Kidder and expanded upon by Joel Hamkins and Andy Lewis in Infinite Time Turing Machines, the infinite time Turing machine is an extension of the classical Turing machine model to include transfinite time, that is, time beyond all finite time. A classical Turing machine has a status at step $0$ (in the start state, with an empty tape, read head at cell 0) and a procedure for getting from one status to the successive status. In this way the status of a Turing machine is defined for every step corresponding to a natural number. An infinite time Turing machine maintains these properties, but also defines the status of the machine at limit ordinals, that is, ordinals that are neither $0$ nor the successor of any ordinal. The status of a Turing machine consis
|
https://en.wikipedia.org/wiki/Stratovision
|
Stratovision was an airborne television transmission relay system using aircraft flying at high altitudes. In 1945 the Glenn L. Martin Company and Westinghouse Electric Corporation originally proposed television coverage of small towns and rural areas, as well as the large metropolitan centers, by fourteen aircraft that would provide coverage for approximately 78% of the people in the United States. Although this was never implemented, the system has been used for domestic broadcasting in the United States, and by the U.S. military in South Vietnam and other countries.
Technology
Because the broadcasting antenna for Stratovision is hung beneath an aircraft in flight, it naturally commands a far greater line-of-sight propagation range than a ground-based antenna. Although transmission distances depend on atmospheric conditions, the line-of-sight distance of a transmitting antenna grows with its height above the Earth's surface.
A Stratovision 25 kW transmitter operating from altitude at 600 MHz was projected to achieve a field intensity of 2 millivolts per meter at a high receiving antenna a considerable distance from the aircraft.
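These reach figures follow from the standard radio-horizon approximation (a textbook formula, not one given in the Stratovision reports): with typical atmospheric refraction, an antenna at height $h$ meters can reach roughly $4.12\sqrt{h}$ kilometers. A sketch with illustrative heights:

import math

def radio_horizon_km(height_m):
    # 4/3-earth refraction model; the purely geometric factor would be ~3.57
    return 4.12 * math.sqrt(height_m)

for h in (100, 3000, 9000):   # a tall tower vs. typical aircraft altitudes
    print(h, round(radio_horizon_km(h)), "km")   # 41, 226, 391 km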
Early tests
Stratovision tests were undertaken between June 1948 and February 1949. The first phase was undertaken by the Glenn L. Martin Co. and Westinghouse using a twin-engine PV-2 aircraft, flying at altitude, that transmitted with 250 watts on 107.5 MHz and 5 kW on 514 MHz from Baltimore, Maryland, so that recordings could be made at various locations ranging from Norfolk, Virginia to Pittsburgh, Pennsylvania and Boston, Massachusetts.
The second phase of testing was undertaken by these companies using a stripped-down B-29 Superfortress flying at altitude. The plane was equipped to receive a relay transmission from WMAR-TV in Baltimore, which was then relayed over a 5 kW video transmitter and a 1 kW audio transmitter for reception on 82–88 MHz with a television set tuned to Channel 6.
The aircraft received its originating signals from circular dipoles attached to a streamlined eight-foot (2.5 m) mast
|
https://en.wikipedia.org/wiki/Descent%20II
|
Descent II is a 1996 first-person shooter game developed by Parallax Software and first published for DOS by Interplay Productions. For the PlayStation, it is known as Descent Maximum. It is the second installment in the Descent video game series and the sequel to Descent. With the fundamentals of the gameplay unchanged, the player controls a spaceship from the pilot's perspective and must navigate extrasolar underground mines to locate and destroy their reactors, escaping before being caught in the resulting self-destruction, while engaging and surviving infected robots, which will attempt to destroy the ship. Unlike most first-person shooters, its six-degrees-of-freedom scheme allows the player to move and rotate freely in three-dimensional space.
Descent II's development started as a project intended to expand the original using a compact disc's storage, which later became a standalone product. The game received very positive reviews from video game critics, who widely lauded the multiplayer mode and the inclusion of the Guide-Bot, a scouting robot that guides the player to their objectives. The PlayStation version's reception was rather mixed, with critics often disagreeing in their evaluations of its frame rate. A sequel, Descent 3, was released in 1999.
Gameplay
Like its predecessor, Descent II is a six-degrees-of-freedom shoot 'em up game in which the player pilots a fighter spaceship from a first-person perspective in zero gravity. It differs from standard first-person shooters in that it allows the player to move freely across three-dimensional planes and rotate on three axes, often termed pitch, yaw, and roll. Besides the keyboard, Descent II features a wide range of supported hardware configurations with which to play it, including the Gravis Gamepad and certain brands of joysticks, some of which support force feedback—making it one of the earliest PC games to support force feedback. Virtual reality and stereoscopic graphics are also officially supported.
|
https://en.wikipedia.org/wiki/Homeorhesis
|
Homeorhesis, derived from the Greek for "similar flow", is a concept encompassing dynamical systems which return to a trajectory, as opposed to systems which return to a particular state, which is termed homeostasis.
Biology
Homeorhesis is steady flow. Biological systems are often inaccurately described as homeostatic, that is, as being in a steady state. Steady state implies an equilibrium that is never actually reached, since organisms and ecosystems are not in a closed environment. During his tenure at the State University of New York at Oneonta, William Butts correctly applied the term homeorhesis to biological organisms. The term was first used in biology by C.H. Waddington around 1940, to describe the tendency of developing or changing organisms to continue developing or changing towards a given state.
Gaia hypothesis
In ecology the concept is important as an element of the Gaia hypothesis, where the system under consideration is the ecological balance of different forms of life on the planet. It was Lynn Margulis, the co-author of the Gaia hypothesis, who wrote in particular that only homeorhetic, and not homeostatic, balances are involved in the theory. That is, the composition of Earth's atmosphere, hydrosphere, and lithosphere is regulated around "set points", as in homeostasis, but those set points change with time.
References
Systems theory
Ecology
Homeostasis
|
https://en.wikipedia.org/wiki/Hard%20coding
|
Hard coding (also hard-coding or hardcoding) is the software development practice of embedding data directly into the source code of a program or other executable object, as opposed to obtaining the data from external sources or generating it at runtime.
Hard-coded data typically can only be modified by editing the source code and recompiling the executable, although it can be changed in memory or on disk using a debugger or hex editor.
Data that is hard-coded is best suited for unchanging pieces of information, such as physical constants, version numbers, and static text elements.
Softcoded data, on the other hand, encodes arbitrary information from sources such as user input, text files, INI files, HTTP server responses, configuration files, preprocessor macros, external constants, databases, or command-line arguments, and is determined at runtime.
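A minimal Python contrast of the two practices (the directory name and environment variable are illustrative):

import os

# Hard-coded: fixed in the source; changing it means editing the
# program and redistributing it.
OUTPUT_DIR = "/var/reports"

# Soft-coded: the same detail is read from the environment at runtime,
# with the hard-coded value retained only as a default.
output_dir = os.environ.get("REPORT_OUTPUT_DIR", OUTPUT_DIR)
print(output_dir)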
Overview
Hard coding requires the program's source code to be changed any time the input data or desired format changes, when it might be more convenient to the end user to change the detail by some means outside the program.
Hard coding is often required, but it can also be considered an anti-pattern. Programmers may not have a dynamic user interface solution worked out for the end user but must still deliver the feature or release the program. This is usually temporary, but it does resolve, in the short term, the pressure to deliver the code. Later, softcoding is done to allow the user to pass in parameters that give the end user a way to modify the results or outcome.
The term "hard-coded" was initially used as an analogy to hardwired circuits, and was meant to convey the inflexibility that results from its usage within software design and implementation.
In the context of run-time extensible collaborative development environments such as MUDs, hardcoding also refers to developing the core engine of the system responsible for low-level tasks and executing scripts, as opposed to softcoding which is developing the high-level sc
|
https://en.wikipedia.org/wiki/Social%20bookmarking
|
Social bookmarking is an online service which allows users to add, annotate, edit, and share bookmarks of web documents. Many online bookmark management services have launched since 1996; Delicious, founded in 2003, popularized the terms "social bookmarking" and "tagging". Tagging is a significant feature of social bookmarking systems, allowing users to organize their bookmarks and develop shared vocabularies known as folksonomies.
Common features
Unlike file sharing, social bookmarking does not save the resources themselves, merely bookmarks that reference them, i.e. a link to the bookmarked page. Descriptions may be added to these bookmarks in the form of metadata, so users may understand the content of the resource without first needing to download it for themselves. Such descriptions may be free text comments, votes in favor of or against its quality, or tags that collectively or collaboratively become a folksonomy. Folksonomy is also called social tagging, "the process by which many users add metadata in the form of keywords to shared content".
In a social bookmarking system, users save links to web pages that they want to remember and/or share. These bookmarks are usually public, but they can instead be saved privately, shared only with specified people or groups, shared only inside certain networks, or some other combination of public and private. Authorized people can usually view these bookmarks chronologically, by category or tags, or via a search engine.
Most social bookmark services encourage users to organize their bookmarks with informal tags instead of the traditional browser-based system of folders, although some services feature categories/folders or a combination of folders and tags. They also enable viewing bookmarks associated with a chosen tag, and include information about the number of users who have bookmarked them. Some social bookmarking services also draw inferences from the relationship of tags to create clusters of tags or bookmarks.
Many s
|
https://en.wikipedia.org/wiki/Near%E2%80%93far%20problem
|
The near–far problem or hearability problem is the effect of a strong signal from a near signal source in making it hard for a receiver to hear a weaker signal from a further source due to adjacent-channel interference, co-channel interference, distortion, capture effect, dynamic range limitation, or the like. Such a situation is common in wireless communication systems, in particular CDMA. In some signal jamming techniques, the near–far problem is exploited to disrupt ("jam") communications.
Analogies
Consider a receiver and two transmitters, one close to the receiver, the other far away. If both transmitters transmit simultaneously and at equal powers, then due to the inverse square law the receiver will receive more power from the nearer transmitter. Since one transmission's signal is the other's noise, the signal-to-noise ratio (SNR) for the further transmitter is much lower. This makes the farther transmitter more difficult, if not impossible, to understand. In short, the near–far problem is one of detecting or filtering out a weaker signal amongst stronger signals.
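The arithmetic can be sketched with the free-space (inverse-square) path-loss model; the transmit power, frequency, and distances below are illustrative:

import math

def received_power_dbm(tx_dbm, distance_m, freq_hz=900e6):
    # Friis free-space path loss: FSPL(dB) = 20log10(d) + 20log10(f) - 147.55
    fspl_db = 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55
    return tx_dbm - fspl_db

near = received_power_dbm(30, 10)      # transmitter 10 m from the receiver
far = received_power_dbm(30, 1000)     # equal-power transmitter 1 km away
print(near, far, near - far)           # the far signal arrives 40 dB weaker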
To place this problem in more common terms, imagine you are talking to someone 6 meters away. If the two of you are in a quiet, empty room then a conversation is quite easy to hold at normal voice levels. In a loud, crowded bar, it would be impossible to hear the same voice level, and the only solution (for that distance) is for both you and your friend to speak louder. Of course, this increases the overall noise level in the bar, and every other patron has to talk louder too (this is equivalent to power control runaway). Eventually, everyone has to shout to make themselves heard by a person standing right beside them, and it is impossible to communicate with anyone more than half a meter away. In general, however, a human is very capable of filtering out loud sounds; similar techniques can be deployed in signal processing where suitable criteria for distinguishing between signals can be establis
|
https://en.wikipedia.org/wiki/Peptide%20synthesis
|
In organic chemistry, peptide synthesis is the production of peptides, compounds where multiple amino acids are linked via amide bonds, also known as peptide bonds. Peptides are chemically synthesized by the condensation reaction of the carboxyl group of one amino acid to the amino group of another. Protecting group strategies are usually necessary to prevent undesirable side reactions with the various amino acid side chains. Chemical peptide synthesis most commonly starts at the carboxyl end of the peptide (C-terminus), and proceeds toward the amino-terminus (N-terminus). Protein biosynthesis (long peptides) in living organisms occurs in the opposite direction.
The chemical synthesis of peptides can be carried out using classical solution-phase techniques, although these have been replaced in most research and development settings by solid-phase methods (see below). Solution-phase synthesis nevertheless retains its usefulness in the large-scale production of peptides for industrial purposes.
Chemical synthesis facilitates the production of peptides that are difficult to express in bacteria, the incorporation of unnatural amino acids, peptide/protein backbone modification, and the synthesis of D-proteins, which consist of D-amino acids.
Solid-phase synthesis
The established method for the production of synthetic peptides in the lab is known as solid phase peptide synthesis (SPPS). Pioneered by Robert Bruce Merrifield, SPPS allows the rapid assembly of a peptide chain through successive reactions of amino acid derivatives on a macroscopically insoluble solvent-swollen beaded resin support.
The solid support consists of small, polymeric resin beads functionalized with reactive groups (such as amine or hydroxyl groups) that link to the nascent peptide chain. Since the peptide remains covalently attached to the support throughout the synthesis, excess reagents and side products can be removed by washing and filtration. This approach circumvents the comparatively time-consu
|
https://en.wikipedia.org/wiki/Transposase
|
A transposase is any of a class of enzymes capable of binding to the end of a transposon and catalysing its movement to another part of a genome, typically by a cut-and-paste mechanism or a replicative mechanism, in a process known as transposition. The word "transposase" was first coined by the individuals who cloned the enzyme required for transposition of the Tn3 transposon. The existence of transposons was postulated in the late 1940s by Barbara McClintock, who was studying the inheritance of maize, but the actual molecular basis for transposition was described by later groups. McClintock discovered that some segments of chromosomes changed their position, jumping between different loci or from one chromosome to another. The repositioning of these transposons (which coded for color) allowed other genes for pigment to be expressed. Transposition in maize causes changes in color; however, in other organisms, such as bacteria, it can cause antibiotic resistance. Transposition is also important in creating genetic diversity within species and generating adaptability to changing living conditions.
Transposases are classified under EC number EC 2.7.7. Genes encoding transposases are widespread in the genomes of most organisms and are the most abundant genes known. During the course of human evolution, as much as 40% of the human genome has moved around via methods such as transposition of transposons.
Transposase Tn5
Transposase (Tnp) Tn5 is a member of the RNase superfamily of proteins which includes retroviral integrases. Tn5 can be found in Shewanella and Escherichia bacteria. The transposon codes for antibiotic resistance to kanamycin and other aminoglycoside antibiotics.
Tn5 and other transposases are notably inactive. Because DNA transposition events are inherently mutagenic, the low activity of transposases is necessary to reduce the risk of causing a fatal mutation in the host, and thus eliminating the transposable element. One of the reasons Tn5 is so unr
|
https://en.wikipedia.org/wiki/Gunship%20%28video%20game%29
|
Gunship is a combat flight simulation video game developed and published by MicroProse in 1986. In the game, controlling a simulated AH-64 Apache helicopter, players navigate through missions to attack enemy targets and protect friendly forces. Commercially and critically successful, Gunship was followed by Gunship 2000 and Gunship!.
Gameplay
The game features missions in five regions, including the U.S. (training), Southeast Asia (1st Air Cavalry Division), Central America (82nd Airborne Division), Middle East (101st Airborne Division) and Western Europe (3rd Armored Division). After selection of region, style, and enemies, the pilot is assigned a primary mission and a secondary mission. These could include such objectives as "Destroy enemy headquarters" or "Support friendly troops" (i.e. destroy targets near friendly forces). The latter would be an easier mission, because the battle would be fought closer to friendly lines.
The pilot then arms the Apache helicopter gunship, usually selecting AGM-114 Hellfire air-to-ground missiles (guided missiles that destroy "hard" targets such as bunkers and tanks), FFARs (Folding Fin Aerial Rockets; unguided rockets that destroy "soft" targets such as infantry and installations), and HEDP (High-Explosive, Dual-Purpose) rounds for the 30 mm cannon (an all-purpose weapon with a maximum range of 1.5 km); in Central America, the Middle East, and Western Europe, AIM-9 Sidewinders would also be standard equipment, usually as a backup air-to-air weapon in case of cannon failure.
Patient players might move in short jumps, crouching behind hills to block the enemy's line of sight and suddenly popping up to attack. More aggressive players generally fly fast and erratically to evade enemy fire, flying in low to deliver devastating cannon attacks at close range. Since flight time is a component of the mission evaluation, either method has its advantages. The latter, however, can be rather dangerous against 1st Line enemies whose fast
|
https://en.wikipedia.org/wiki/Proof%20assistant
|
In computer science and mathematical logic, a proof assistant or interactive theorem prover is a software tool to assist with the development of formal proofs by human-machine collaboration. This involves some sort of interactive proof editor, or other interface, with which a human can guide the search for proofs, the details of which are stored in, and some steps provided by, a computer.
A recent effort within this field is making these tools use artificial intelligence to automate the formalization of ordinary mathematics.
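As a small taste of what these tools check, here is a Lean 4 example (a sketch; exact library names and tactic availability vary between Lean versions): the first proof reuses a library lemma in term mode, and the second hands the goal to built-in automation.

-- Term-mode proof: reuse the library lemma for commutativity of addition.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- The same goal discharged by `omega`, Lean's decision procedure for
-- linear arithmetic (bundled with recent Lean 4 distributions).
theorem add_comm_by_automation (a b : Nat) : a + b = b + a := by
  omega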
System comparison
ACL2 – a programming language, a first-order logical theory, and a theorem prover (with both interactive and automatic modes) in the Boyer–Moore tradition.
Coq – Allows the expression of mathematical assertions, mechanically checks proofs of these assertions, helps to find formal proofs, and extracts a certified program from the constructive proof of its formal specification.
HOL theorem provers – A family of tools ultimately derived from the LCF theorem prover. In these systems the logical core is a library of their programming language. Theorems represent new elements of the language and can only be introduced via "strategies" which guarantee logical correctness. Strategy composition gives users the ability to produce significant proofs with relatively few interactions with the system. Members of the family include:
HOL4 – The "primary descendant", still under active development. Support for both Moscow ML and Poly/ML. Has a BSD-style license.
HOL Light – A thriving "minimalist fork". OCaml based.
ProofPower – Went proprietary, then returned to open source. Based on Standard ML.
IMPS, An Interactive Mathematical Proof System
Isabelle is an interactive theorem prover, successor of HOL. The main code-base is BSD-licensed, but the Isabelle distribution bundles many add-on tools with different licenses.
Jape – Java based.
Lean
LEGO
Matita – A light system based on the Calculus of Inductive Constructions.
MINLO
|
https://en.wikipedia.org/wiki/Whiteprint
|
Whiteprint describes a document reproduction produced by using the diazo chemical process. It is also known as the blue-line process since the result is blue lines on a white background. It is a contact printing process which accurately reproduces the original in size, but cannot reproduce continuous tones or colors. The light-sensitivity of the chemicals used was known in the 1890s and several related printing processes were patented at that time. Whiteprinting replaced the blueprint process for reproducing architectural and engineering drawings because the process was simpler and involved fewer toxic chemicals. A blue-line print is not permanent and will fade if exposed to light for weeks or months, but a drawing print that lasts only a few months is sufficient for many purposes.
The diazo printing process
Two components underpin diazo printing:
diazonium salt: a light sensitive chemical
the coupler: a colorless chemical that combines with the salt to produce color.
Put differently, the process relies on two properties of diazonium compounds:
they are deactivated by light, i.e. they degrade irreversibly to products that cannot form deeply colored dyes
they (the diazonium compounds that were not degraded by light) react with a (colorless) coupling agent to give deeply colored product(s)
In a variety of combinations and strengths, these two chemicals are mixed together in water and coated onto paper. The resulting coating is then dried, yielding the specially treated paper commercially sold as diazo paper. This solution can also be applied to polyester film or to vellum.
The process starts with original documents that have been created on a translucent medium. Such media include polyester films, vellums, linens, and translucent bond papers (bonds). Any medium that allows some light to pass through typically works as a master; the desired durability of the master determines the choice. Depending on the thickness and type of the master, the intensity of th
|
https://en.wikipedia.org/wiki/FTPS
|
FTPS (also known as FTP-SSL and FTP Secure) is an extension to the commonly used File Transfer Protocol (FTP) that adds support for the Transport Layer Security (TLS) and, formerly, the Secure Sockets Layer (SSL, which is now prohibited by RFC7568) cryptographic protocols.
FTPS should not be confused with the SSH File Transfer Protocol (SFTP), a secure file transfer subsystem for the Secure Shell (SSH) protocol with which it is not compatible. It is also different from FTP over SSH, which is the practice of tunneling FTP through an SSH connection.
Background
The File Transfer Protocol was drafted in 1971 for use with the scientific and research network, ARPANET. Access to the ARPANET during this time was limited to a small number of military sites and universities and a narrow community of users who could operate without data security and privacy requirements within the protocol.
As the ARPANET gave way to the NSFNET and then the Internet, a broader population potentially had access to the data as it traversed increasingly longer paths from client to server. The opportunity for unauthorized third parties to eavesdrop on data transmissions increased proportionally.
In 1994, the Internet browser company Netscape developed and released the application layer wrapper, Secure Sockets Layer. This protocol enabled applications to communicate across a network in a private and secure fashion, discouraging eavesdropping, tampering, and message forgery. While it could add security to any protocol that uses reliable connections, such as TCP, it was most commonly used by Netscape with HTTP to form HTTPS.
The SSL protocol was eventually applied to FTP, with a draft Request for Comments (RFC) published in late 1996. An official IANA port was registered shortly thereafter. However, the RFC was not finalized until 2005.
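As a present-day illustration of FTPS from the client side, here is a minimal explicit-mode session using Python's standard-library ftplib (the host and credentials are placeholders):

from ftplib import FTP_TLS

ftps = FTP_TLS("ftp.example.com")   # connects on the normal FTP port (21)
ftps.login("user", "password")      # ftplib issues AUTH TLS before login
ftps.prot_p()                       # switch the data channel to TLS as well
ftps.retrlines("LIST")              # directory listing over the secured link
ftps.quit()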
Methods of invoking security
Two separate methods were developed to invoke client security for use with FTP clients: Implicit and Explicit. While the impl
|
https://en.wikipedia.org/wiki/Habitat%20fragmentation
|
Habitat fragmentation describes the emergence of discontinuities (fragmentation) in an organism's preferred environment (habitat), causing population fragmentation and ecosystem decay. Causes of habitat fragmentation include geological processes that slowly alter the layout of the physical environment (suspected of being one of the major causes of speciation), and human activity such as land conversion, which can alter the environment much faster and causes the extinction of many species. More specifically, habitat fragmentation is a process by which large and contiguous habitats get divided into smaller, isolated patches of habitats.
Definition
The term habitat fragmentation includes five discrete phenomena:
Reduction in the total area of the habitat
Decrease of the interior: edge ratio
Isolation of one habitat fragment from other areas of habitat
Breaking up of one patch of habitat into several smaller patches
Decrease in the average size of each patch of habitat
"fragmentation ... not only causes loss of the amount of habitat but by creating small, isolated patches it also changes the properties of the remaining habitat" (van den Berg et al. 2001). Habitat fragmentation is the landscape level of the phenomenon, and patch level process. Thus meaning, it covers; the patch areas, edge effects, and patch shape complexity.
In scientific literature, there is some debate whether the term "habitat fragmentation" applies in cases of habitat loss, or whether the term primarily applies to the phenomenon of habitat being cut into smaller pieces without significant reduction in habitat area. Scientists who use the stricter definition of "habitat fragmentation" per se would refer to the loss of habitat area as "habitat loss" and explicitly mention both terms if describing a situation where the habitat becomes less connected and there is less overall habitat.
Furthermore, habitat fragmentation is considered an invasive threat to biodiversity, due to its implication
|
https://en.wikipedia.org/wiki/Head-mounted%20display
|
A head-mounted display (HMD) is a display device, worn on the head or as part of a helmet (see Helmet-mounted display for aviation applications), that has a small display optic in front of one (monocular HMD) or each eye (binocular HMD). An HMD has many uses including gaming, aviation, engineering, and medicine. Virtual reality headsets are HMDs combined with IMUs. There is also an optical head-mounted display (OHMD), which is a wearable display that can reflect projected images and allows a user to see through it.
Overview
A typical HMD has one or two small displays, with lenses and semi-transparent mirrors embedded in eyeglasses (also termed data glasses), a visor, or a helmet. The display units are miniaturized and may include cathode ray tubes (CRTs), liquid-crystal displays (LCDs), liquid crystal on silicon (LCoS), or organic light-emitting diodes (OLEDs). Some vendors employ multiple micro-displays to increase total resolution and field of view.
HMDs differ in whether they can display only computer-generated imagery (CGI), only live imagery from the physical world, or a combination. Most HMDs can display only a computer-generated image, sometimes referred to as a virtual image. Some HMDs allow CGI to be superimposed on a real-world view. This is sometimes referred to as augmented reality (AR) or mixed reality (MR). Combining a real-world view with CGI can be done by projecting the CGI through a partially reflective mirror while viewing the real world directly, a method often called optical see-through. It can also be done electronically, by accepting video from a camera and mixing it with CGI.
Optical HMD
An optical head-mounted display uses an optical mixer which is made of partly silvered mirrors. It can reflect artificial images, and let real images cross the lens, and let a user look through it. Various methods have existed for see-through HMD's, most of which can be summarized into two main families b
|
https://en.wikipedia.org/wiki/Ectogenesis
|
Ectogenesis (from the Greek ἐκτός, "outside," and genesis) is the growth of an organism in an artificial environment outside the body in which it would normally be found, such as the growth of an embryo or fetus outside the mother's body, or the growth of bacteria outside the body of a host. The term was coined by British scientist J.B.S. Haldane in 1924.
Human embryos and fetuses
Ectogenesis of human embryos and fetuses would require an artificial uterus. An artificial uterus would have to be supplied with nutrients and oxygen from some source to nurture the fetus, as well as dispose of waste material, so there would likely be a need for an interface between such a supplier and the fetus, filling the function of the placenta. An artificial uterus, as a replacement organ, could be used to assist women with damaged, diseased, or removed uteri, allowing a fetus to be carried to term. It also has the potential to move the threshold of fetal viability to a much earlier stage of pregnancy. This would have implications for the ongoing controversy regarding human reproductive rights.
Ectogenesis could also be a means by which homosexual, impotent, disabled and single men and women could have genetic offspring without the use of surrogate pregnancy or a sperm donor, and allow women to have children without going through the pregnancy cycle.
Synthetic embryo
In 2022, Jacob Hanna and his team at the Weizmann Institute of Science created early "embryo-like structures" from mouse stem cells. The world's first synthetic embryos required no sperm, eggs, or fertilization, and were grown from embryonic stem cells (ESCs) alone or from stem cells other than ESCs. The structures had an intestinal tract, an early brain, a beating heart, and a placenta with a yolk sac around the embryo. The researchers said the work could lead to a better understanding of organ and tissue development, and to new sources of cells and tissues for human transplantation. Human synthetic embryos, however, remain a long way off.
|
https://en.wikipedia.org/wiki/BIOS%20interrupt%20call
|
BIOS implementations provide interrupts that can be invoked by operating systems and application programs to use the facilities of the firmware on IBM PC compatible computers. Traditionally, BIOS calls are mainly used by DOS programs and some other software such as boot loaders (including, mostly historically, relatively simple application software that boots directly and runs without an operating system—especially game software). BIOS runs in the real address mode (real mode) of the x86 CPU, so programs that call BIOS either must also run in real mode or must switch from protected mode to real mode before calling BIOS and then switch back again. For this reason, modern operating systems that use the CPU in protected mode or long mode generally do not use BIOS interrupt calls to support system functions, although they do use them to probe and initialize hardware during booting. Because real mode imposes a 1 MB memory limit, modern boot loaders (e.g., GRUB2, Windows Boot Manager) use unreal mode or protected mode (executing BIOS interrupt calls in virtual 8086 mode, and only during OS booting) to access up to 4 GB of memory.
In all computers, software instructions control the physical hardware (screen, disk, keyboard, etc.) from the moment the power is switched on. In a PC, the BIOS, pre-loaded in ROM on the motherboard, takes control immediately after the CPU is reset, including during power-up, when a hardware reset button is pressed, or when a critical software failure (a triple fault) causes the mainboard circuitry to automatically trigger a hardware reset. The BIOS tests the hardware and initializes its state; finds, loads, and runs the boot program (usually an OS boot loader or, historically, ROM BASIC); and provides basic hardware control to the software running on the machine, which is usually an operating system (with application programs) but may be a directly booting single software application.
For IBM's part, they provided all th
|
https://en.wikipedia.org/wiki/Flexure
|
A flexure is a flexible element (or combination of elements) engineered to be compliant in specific degrees of freedom. Flexures are a design feature used by design engineers (usually mechanical engineers) for providing adjustment or compliance in a design.
Flexure types
Most compound flexure designs are composed of 3 fundamental types of flexure:
Pin flexure - a thin bar or cylinder of material; constrains 3 degrees of freedom when its geometry matches a notch cutout.
Blade flexure - a thin sheet of material; constrains 3 degrees of freedom.
Notch flexure - a thin cutout on both sides of a thick piece of material; constrains 5 degrees of freedom.
Since single flexure features are limited both in travel capability and degrees of freedom available, compound flexure systems are designed using combinations of these component features. Using compound flexures, complex motion profiles with specific degrees of freedom and relatively long travel distances are possible.
Design aspects
In the field of precision engineering (especially high-precision motion control), flexures have several key advantages. High-precision alignment tasks might not be possible when friction or stiction is present. Additionally, conventional bearings or linear slides often exhibit positioning hysteresis due to backlash and friction. Flexures can achieve much lower resolution limits (in some cases measured at the nanometer scale) because they depend on bending and/or torsion of flexible elements rather than on the surface interaction of many parts (as with a ball bearing). This makes flexures a critical design feature in optical instrumentation such as interferometers.
Due to their mode of action, flexures are used for limited range motions and cannot replace long-travel or continuous-rotation adjustments. Additionally, special care must be taken to design the flexure to avoid material yielding or fatigue, both of which are potential failure modes in a flexure design.
Design examples
|
https://en.wikipedia.org/wiki/Zeitschrift%20f%C3%BCr%20Angewandte%20Mathematik%20und%20Physik
|
The Zeitschrift für Angewandte Mathematik und Physik (English: Journal of Applied Mathematics and Physics) is a bimonthly peer-reviewed scientific journal published by Birkhäuser Verlag. The editor-in-chief is Kaspar Nipp (ETH Zurich). It was established in 1950 and covers the fields of theoretical and applied mechanics, applied mathematics, and related topics. According to the Journal Citation Reports, the journal has a 2017 impact factor of 1.711.
References
External links
Mathematics journals
Physics journals
Academic journals established in 1950
Springer Science+Business Media academic journals
Bimonthly journals
English-language journals
|
https://en.wikipedia.org/wiki/Alan%20Nunn%20May
|
Alan Nunn May (2 May 1911 – 12 January 2003) was a British physicist and a confessed and convicted Soviet spy who supplied secrets of British and American atomic research to the Soviet Union during World War II. His first name is sometimes spelled Allan, with two l's, but both the Oxford Dictionary of National Biography and the Encyclopædia Britannica use Alan.
Early life and education
May was the youngest of four children of Walter Frederick Nunn May, a brassfounder, and Mary Annie, née Kendall. He was born in Bedruthan, Park Hill, Moseley, Birmingham, and educated at King Edward's School, Birmingham. As a scholarship student at Trinity Hall, Cambridge, he achieved a first in physics, which led to doctoral studies under Charles Ellis and a lectureship at King's College London.
Career
Early communist ties
May joined the Communist Party of Great Britain in the 1930s and was active in the Association of Scientific Workers. The Cambridge Five spy ring member Donald Duart Maclean was also at Trinity Hall during an overlapping period.
World War II
During World War II, May initially worked on radar in Suffolk and then with Cecil Powell in Bristol on a project that attempted to use photographic methods to detect fast particles from radioactive decay. James Chadwick recruited him to a Cambridge University team working on a possible heavy water reactor. The team was part of the British Tube Alloys directorate that was merged into the American Manhattan Project, the successful effort to create a nuclear weapon. In January 1943, the Cambridge team, including May, was transferred to the Montreal Laboratory, which was building a reactor at Chalk River, near Ottawa, Ontario, Canada. May's Canadian position ended in September 1945, and he returned to his lecturing post in London.
Soviet espionage
He had let his membership of the Communist Party lapse by 1940, but at Cambridge, when he saw an American report mentioning t
|
https://en.wikipedia.org/wiki/Moon%20tree
|
Moon trees are trees grown from seeds taken into orbit around the Moon, initially by Apollo 14 in 1971, and later by Artemis 1 in 2022. The idea was first proposed by Edward P. Cliff, then the Chief of the United States Forest Service, who convinced Stuart Roosa, the Command Module Pilot on the Apollo 14 mission, to bring a small canister containing about 500 seeds aboard the module in 1971. Seeds for the experiment were chosen from five species of tree: loblolly pine, sycamore, sweetgum, redwood, and Douglas fir. In 2022, NASA announced it would be reviving the Moon tree program by carrying 1,000 seeds aboard Artemis 1.
History
After the flight, the seeds were sent to the southern Forest Service station in Gulfport, Mississippi, and to the western station in Placerville, California, with the intent to germinate them. Nearly all the seeds germinated successfully, and after a few years, the Forest Service had about 420 seedlings. Some of these were planted alongside their Earth-bound counterparts, which were specifically set aside as controls. After more than 40 years, there was no discernible difference between the two classes of trees. Most of the Moon trees were given away in 1975 and 1976 to state forestry organizations, in order to be planted as part of the nation's bicentennial celebration. Since the trees were all of southern or western species, not all states received trees. A loblolly pine was planted at the White House, and trees were planted in Brazil and Switzerland and presented to Emperor Hirohito, among others.
The locations of many of the trees that were planted from these seeds were largely unknown for decades. In 1996, a third-grade teacher, Joan Goble, and her students found a tree in their local area with a plaque identifying it as a Moon tree. Goble sent an email to NASA, and reached employee Dave Williams. Williams was unaware of the trees' existence, as were most of his colleagues at NASA. Upon doing some research, Williams found some old news
|
https://en.wikipedia.org/wiki/Evolutionary%20ecology
|
Evolutionary ecology lies at the intersection of ecology and evolutionary biology. It approaches the study of ecology in a way that explicitly considers the evolutionary histories of species and the interactions between them. Conversely, it can be seen as an approach to the study of evolution that incorporates an understanding of the interactions between the species under consideration. The main subfields of evolutionary ecology are life history evolution, sociobiology (the evolution of social behavior), the evolution of interspecific interactions (e.g. cooperation, predator–prey interactions, parasitism, mutualism) and the evolution of biodiversity and of ecological communities.
Evolutionary ecology mostly considers two things: how interactions (both among species and between species and their physical environment) shape species through selection and adaptation, and the consequences of the resulting evolutionary change.
Evolutionary models
A large part of evolutionary ecology is about utilising models and finding empirical data as proof. Examples include the Lack clutch size model devised by David Lack and his study of Darwin's finches on the Galapagos Islands. Lack's study of Darwin's finches was important in analyzing the role of different ecological factors in speciation. Lack suggested that differences in species were adaptive and produced by natural selection, based on the assertion by G.F. Gause that two species cannot occupy the same niche.
Richard Levins introduced his model of the specialization of species in 1968, which investigated how habitat specialization evolved within heterogeneous environments using the fitness sets an organism or species possesses. This model developed the concept of spatial scales in specific environments, defining fine-grained spatial scales and coarse-grained spatial scales. The implications of this model include a rapid increase in environmental ecologists' understanding of how spatial scales impact species diversity in a
|
https://en.wikipedia.org/wiki/WSAZ-TV
|
WSAZ-TV (channel 3) is a television station licensed to Huntington, West Virginia, United States, affiliated with NBC. It serves the Charleston–Huntington market, the second-largest television market (in terms of geographical area) east of the Mississippi River; the station's coverage area includes 31 counties in central West Virginia, eastern Kentucky and southeastern Ohio. WSAZ-TV is owned by Gray Television alongside Portsmouth, Ohio-licensed CW affiliate WQCW (channel 30). Both stations share studios on 5th Avenue in Huntington, with an additional studio and newsroom on Columbia Avenue in Charleston. WSAZ-TV's transmitter is located on Barker Ridge near Milton, West Virginia.
History
Early years
The oldest television station in West Virginia, WSAZ-TV began broadcasting November 15, 1949, on VHF channel 5. The station was originally owned by the Huntington Publishing Company along with the Huntington Herald-Dispatch and WSAZ radio (930 AM, now WRVC), and carried programming from all four networks at the time (NBC, CBS, ABC, and DuMont). However, it was a primary NBC affiliate due to WSAZ radio's long affiliation with NBC Radio. When WCHS-TV (channel 8) signed on from Charleston in 1954, it took over the CBS affiliation and the two television stations shared ABC programming until WHTN-TV (channel 13, now WOWK-TV) signed on from Huntington a year later. In 1955, WSAZ-TV dropped DuMont after the network shut down. It is the only commercial station in the market that has never changed its primary affiliation.
One story of how the station's call letters originated dates from WSAZ radio's origins in 1923, when radio engineer Glenn Chase began airing semi-regular broadcasts from Pomeroy, Ohio. It moved across and down the Ohio River to Huntington in 1926, in part because Chase had his hands full keeping the station going. Chase later claimed that the station proved such a headache to him that he asked for the call letters WSAZ to signify that it was the "Worst Stati
|
https://en.wikipedia.org/wiki/Paul%20Davis%20%28programmer%29
|
Paul Davis (formerly known as Paul Barton-Davis) is a British-American software developer best known for his work on audio software (JACK) for the Linux operating system, and for his role as one of the first two programmers at Amazon.com.
Davis grew up in the English Midlands and in London. After studying molecular biology and biophysics, he did post-graduate studies in computational biology at the Weizmann Institute of Science in Rehovot and EMBL in Heidelberg.
He immigrated to the U.S. in 1989. He lived in Seattle for seven years, where he worked for the Computer Science and Engineering Department at the University of Washington and for several smaller software companies. While in Seattle, he helped to get Amazon.com off the ground during the period 1994–1996, making critical contributions to Amazon's backend systems alongside Shel Kaphan, before moving to Philadelphia in 1996. In 2019 he moved with his wife to Galisteo, New Mexico.
He went on to fund the development of various audio software for Linux, including Ardour and the JACK Audio Connection Kit. He works full-time on free software.
He is also an ultra-marathon runner and touring cyclist.
References
External links
Paul Davis' home page
Computer programmers
Living people
Year of birth missing (living people)
|
https://en.wikipedia.org/wiki/Cold%20trap
|
In vacuum applications, a cold trap is a device that condenses all vapors except the permanent gases into a liquid or solid. The most common objective is to prevent vapors being evacuated from an experiment from entering a vacuum pump where they would condense and contaminate it. Particularly large cold traps are necessary when removing large amounts of liquid as in freeze drying.
The term cold trap also refers to the application of cooled surfaces or baffles to prevent oil vapours from flowing from a pump into a chamber. In such a case, a baffle or a section of pipe containing a number of cooled vanes is attached to the inlet of an existing pumping system. By cooling the baffle, either with a cryogen such as a dry ice mixture or with an electrically driven Peltier element, oil vapour molecules that strike the baffle vanes will condense and thus be removed from the pumped cavity.
Applications
Pumps that use oil either as their working fluid (diffusion pumps), or as their lubricant (mechanical rotary pumps), are often the sources of contamination in vacuum systems. Placing a cold trap at the mouth of such a pump greatly lowers the risk that oil vapours will backstream into the cavity.
Cold traps can also be used for experiments involving vacuum lines, such as small-scale very low temperature distillations/condensations. This is accomplished through the use of a coolant such as liquid nitrogen or a freezing mixture of dry ice in acetone or a similar solvent with a low melting point. Liquid nitrogen is only used when dry ice or other cryogenic approaches will not condense the desired gases, since liquid nitrogen will also condense oxygen. Any oxygen gas content in the vacuum line, or any leak in the vacuum line, will result in liquid oxygen mixed with the target vapors, often with explosive results.
When performed on a larger scale, this technique is called freeze-drying, and the cold trap is referred to as the condenser.
Cold traps are also used in cryop
|
https://en.wikipedia.org/wiki/Rectangular%20function
|
The rectangular function (also known as the rectangle function, rect function, Pi function, Heaviside Pi function, gate function, unit pulse, or the normalized boxcar function) is defined as

$$\operatorname{rect}(t) = \Pi(t) = \begin{cases} 0 & \text{if } |t| > \frac{1}{2} \\ \frac{1}{2} & \text{if } |t| = \frac{1}{2} \\ 1 & \text{if } |t| < \frac{1}{2}. \end{cases}$$

Alternative definitions of the function define $\operatorname{rect}\left(\pm\frac{1}{2}\right)$ to be 0, 1, or undefined.
Its periodic version is called a rectangular wave.
History
The rect function was introduced by Woodward as an ideal cutout operator, together with the sinc function as an ideal interpolation operator, and their counter-operations, which are sampling (comb operator) and replicating (rep operator), respectively.
Relation to the boxcar function
The rectangular function is a special case of the more general boxcar function:

$$\operatorname{rect}\left(\frac{t-X}{Y}\right) = u\!\left(t - X + \frac{Y}{2}\right) - u\!\left(t - X - \frac{Y}{2}\right),$$

where $u$ is the Heaviside step function; the function is centered at $X$ and has duration $Y$, from $X - Y/2$ to $X + Y/2$.
Fourier transform of the rectangular function
The unitary Fourier transforms of the rectangular function are

$$\int_{-\infty}^{\infty} \operatorname{rect}(t)\, e^{-i 2\pi f t}\, dt = \frac{\sin(\pi f)}{\pi f} = \operatorname{sinc}(f),$$

using ordinary frequency $f$, where $\operatorname{sinc}$ is the normalized form of the sinc function, and

$$\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \operatorname{rect}(t)\, e^{-i \omega t}\, dt = \frac{1}{\sqrt{2\pi}}\, \frac{\sin(\omega/2)}{\omega/2},$$

using angular frequency $\omega$, where $\frac{\sin(\omega/2)}{\omega/2}$ is the unnormalized form of the sinc function.

For $\operatorname{rect}(x/a)$, the Fourier transform is

$$\int_{-\infty}^{\infty} \operatorname{rect}\!\left(\frac{x}{a}\right) e^{-i 2\pi f x}\, dx = a\, \frac{\sin(\pi a f)}{\pi a f} = a\, \operatorname{sinc}(a f).$$

Note that as long as the definition of the pulse function is motivated only by its time-domain behavior, there is no reason to expect the oscillatory interpretation (i.e. the Fourier transform) to be intuitive or directly understood by humans. However, some aspects of the theoretical result may be understood intuitively: finiteness in the time domain corresponds to an infinite frequency response (and, vice versa, a finite Fourier transform corresponds to an infinite time-domain response).
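The first transform pair can be verified by direct integration; the short derivation below is standard and included only for clarity:

$$\int_{-\infty}^{\infty} \operatorname{rect}(t)\, e^{-i 2\pi f t}\, dt = \int_{-1/2}^{1/2} e^{-i 2\pi f t}\, dt = \frac{e^{i\pi f} - e^{-i\pi f}}{2\pi i f} = \frac{\sin(\pi f)}{\pi f} = \operatorname{sinc}(f).$$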
Relation to the triangular function
We can define the triangular function as the convolution of two rectangular functions:

$$\operatorname{tri}(t) = \operatorname{rect}(t) * \operatorname{rect}(t).$$
Use in probability
Viewing the rectangular function as a probability density function, it is a special case of the continuous uniform distribution with $a = -\frac{1}{2}$ and $b = \frac{1}{2}$. The characteristic function is

$$\varphi(k) = \frac{\sin(k/2)}{k/2},$$

and its moment-generating function is

$$M(k) = \frac{\sinh(k/2)}{k/2},$$

where $\sinh$ is the hyperbolic sine function.
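Both formulas follow from one-line computations; for instance, the moment-generating function is

$$M(k) = \int_{-1/2}^{1/2} e^{kx}\, dx = \frac{e^{k/2} - e^{-k/2}}{k} = \frac{\sinh(k/2)}{k/2},$$

and the characteristic function is obtained the same way with $k$ replaced by $ik$.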
|
https://en.wikipedia.org/wiki/Ping-pong%20recording
|
Ping-pong recording (also called ping-ponging, bouncing tracks, or reduction mixing) is a method of sound recording. It involves combining multiple track stems into one, allowing more room for overdubbing when using tape recorders with a limited set of tracks. It is also used to simplify mixdowns.
The two most common methods consist of
Dubbing tracks between two tape recorders (or tracks on a multitrack recorder) connected through a mixing console
Dubbing tracks internally, through the onboard mixer of many machines, including Portastudios and similar multitrackers.
In both cases, a new instrument, voice, or other material may be added with each bounce, depending on the setup's mixing capabilities.
In analog recording, the audio quality normally decreases with each generation, while in digital recording, the quality is usually preserved. In either case, the most leeway comes with having the best possible source material.
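What a bounce does to the signal can be sketched in a few lines of C (illustrative only; the gain and the white-noise model for analog generation loss are assumptions, not details from the article):

#include <stddef.h>
#include <stdlib.h>

/* Bounce n_tracks source tracks down to a single destination track.
 * gain scales the sum to avoid clipping; noise_floor models the hiss
 * added by one analog generation (pass 0.0f for a digital bounce). */
void bounce(const float *const *tracks, size_t n_tracks,
            float *dest, size_t n_samples,
            float gain, float noise_floor)
{
    for (size_t i = 0; i < n_samples; i++) {
        float mix = 0.0f;
        for (size_t t = 0; t < n_tracks; t++)
            mix += tracks[t][i];
        /* uniform white noise in [-noise_floor, +noise_floor] */
        float noise = noise_floor *
            (2.0f * (float)rand() / (float)RAND_MAX - 1.0f);
        dest[i] = gain * mix + noise;
    }
}

Each additional overdub-and-bounce cycle applies this once more, which is why analog quality loss compounds per generation while a digital bounce (noise_floor = 0) is lossless.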
Early examples
The method was employed by Beach Boys co-founder Brian Wilson during the 1960s. For the recording of Pet Sounds (1966), Wilson created the instrumentals of songs using a 4-track recorder. He then bounced the material onto one track of an 8-track recorder, using the remaining tracks for vocal overdubs. This meant that the album could not be suitably mixed in stereo, because the instrumental parts were locked in monaural. In 1997, advances in recording technology allowed engineer Mark Linett to resync the original first-generation instrumental stems with the second-generation overdubbed vocals for the compilation The Pet Sounds Sessions and create a true stereo mix of the album.
Other terms
Ping pong is also a term of derision, in particular applied to early commercial stereo recordings of the late 1950s to mid-1960s which do not have a convincing stereo image or sound-stage. Such recordings were often made in two-track form for mixing in mono, but released as authentic stereo recordings.
References
Audio engineering
History
|
https://en.wikipedia.org/wiki/Bluesnarfing
|
Bluesnarfing is the unauthorized access of information from a wireless device through a Bluetooth connection, often between phones, desktops, laptops, and PDAs (personal digital assistant). This allows access to calendars, contact lists, emails and text messages, and on some phones, users can copy pictures and private videos. Both Bluesnarfing and Bluejacking exploit others' Bluetooth connections without their knowledge. While Bluejacking is essentially harmless as it only transmits data to the target device, Bluesnarfing is the theft of information from the target device.
Description
Current mobile software generally must allow a connection, using a temporary state initiated by the user, in order to be 'paired' with another device to copy content. In the past, however, there were reports of phones being Bluesnarfed without pairing being explicitly allowed. After the disclosure of this vulnerability, mobile phone vendors patched their Bluetooth implementations and, at the time of writing, no current phone models are known to be vulnerable to this attack.
Any device with its Bluetooth connection turned on and set to "discoverable" (able to be found by other Bluetooth devices in range) may be susceptible to Bluejacking and possibly to Bluesnarfing if there is a vulnerability in the vendor's software. By turning off this feature, the potential victim can be safer from the possibility of being Bluesnarfed; although a device that is set to "hidden" may be Bluesnarfable by guessing the device's MAC address via a brute force attack. As with all brute force attacks, the main obstacle to this approach is the sheer number of possible MAC addresses. Bluetooth uses a 48-bit unique MAC Address, of which the first 24 bits are common to a manufacturer. The remaining 24 bits have approximately 16.8 million possible combinations, requiring an average of 8.4 million attempts to guess by brute force.
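The arithmetic in that last sentence is easy to make concrete (a sketch; the OUI value below is a made-up placeholder, and nothing is actually probed):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t oui = 0x0012EF;            /* hypothetical known 24-bit OUI */
    uint32_t device_space = 1u << 24;   /* 2^24 = 16,777,216 suffixes */

    printf("suffixes to try: %u\n", (unsigned)device_space);
    printf("average attempts: %u\n", (unsigned)(device_space / 2));

    /* enumerating every candidate 48-bit address: */
    for (uint32_t suffix = 0; suffix < device_space; suffix++) {
        uint64_t mac = ((uint64_t)oui << 24) | suffix;
        (void)mac;  /* an attacker would probe this address here */
    }
    return 0;
}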
Prevalence
Attacks on wireless systems have increased along with
|
https://en.wikipedia.org/wiki/Hyperbolic%20coordinates
|
In mathematics, hyperbolic coordinates are a method of locating points in quadrant I of the Cartesian plane

$$Q = \{(x, y) : x > 0,\ y > 0\}.$$

Hyperbolic coordinates take values in the hyperbolic plane defined as:

$$HP = \{(u, v) : u \in \mathbb{R},\ v > 0\}.$$

These coordinates in HP are useful for studying logarithmic comparisons of direct proportion in Q and measuring deviations from direct proportion.

For $(x, y)$ in $Q$ take

$$u = \ln \sqrt{\frac{x}{y}}$$

and

$$v = \sqrt{xy}.$$

The parameter u is the hyperbolic angle to (x, y) and v is the geometric mean of x and y.

The inverse mapping is

$$x = v e^{u}, \quad y = v e^{-u}.$$

The mapping from Q to HP given by these formulas is continuous, but not an analytic function.
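The coordinate change and its inverse are simple to compute; a minimal sketch with a round-trip check (the function names are illustrative, not from the article):

#include <math.h>
#include <stdio.h>

/* (x, y) in quadrant I  ->  hyperbolic coordinates (u, v) */
void to_hyperbolic(double x, double y, double *u, double *v)
{
    *u = 0.5 * log(x / y);   /* ln sqrt(x/y), the hyperbolic angle */
    *v = sqrt(x * y);        /* the geometric mean of x and y      */
}

/* inverse mapping: (u, v) in HP  ->  (x, y) in Q */
void from_hyperbolic(double u, double v, double *x, double *y)
{
    *x = v * exp(u);
    *y = v * exp(-u);
}

int main(void)
{
    double u, v, x, y;
    to_hyperbolic(8.0, 2.0, &u, &v);   /* u = ln 2, v = 4 */
    from_hyperbolic(u, v, &x, &y);     /* recovers (8, 2) */
    printf("u = %f, v = %f, round trip = (%f, %f)\n", u, v, x, y);
    return 0;
}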
Alternative quadrant metric
Since HP carries the metric space structure of the Poincaré half-plane model of hyperbolic geometry, the bijective correspondence between Q and HP described above brings this structure to Q.
brings this structure to Q. It can be grasped using the notion of hyperbolic motions. Since geodesics in HP are semicircles with centers on the boundary, the geodesics in Q are obtained from the correspondence and turn out to be rays from the origin or petal-shaped curves leaving and re-entering the origin. And the hyperbolic motion of HP given by a left-right shift corresponds to a squeeze mapping applied to Q.
Since hyperbolas in Q correspond to lines parallel to the boundary of HP, they are horocycles in the metric geometry of Q.
If one only considers the Euclidean topology of the plane and the topology inherited by Q, then the lines bounding Q seem close to Q. Insight from the metric space HP shows that the open set Q has only the origin as boundary when viewed through the correspondence. Indeed, consider rays from the origin in Q, and their images, vertical rays from the boundary R of HP. Any point in HP is an infinite distance from the point p at the foot of the perpendicular to R, but a sequence of points on this perpendicular may tend in the direction of p. The corresponding sequence in Q tends along a ray toward the origin. The old Euclidean boundary of Q is no longer relevant.
Applications in physical science
Fundamental physical variables are sometimes related
|
https://en.wikipedia.org/wiki/Malignant%20transformation
|
Malignant transformation is the process by which cells acquire the properties of cancer. This may occur as a primary process in normal tissue, or secondarily as malignant degeneration of a previously existing benign tumor.
Causes
There are many causes of primary malignant transformation, or tumorigenesis. Most human cancers in the United States are caused by external factors, and these factors are largely avoidable. They were summarized by Doll and Peto in 1981, and were still considered valid in 2015. (The article's table of factors is not reproduced here; in it, "reproductive and sexual behaviors" covers number of partners, age at first menstruation, and zero versus one or more live births.)
Examples of diet-related malignant transformation
Diet and colon cancer
Colon cancer provides one example of the mechanisms by which diet, the top factor listed in the table, is an external factor in cancer. The Western diet of African Americans in the United States is associated with a yearly colon cancer rate of 65 per 100,000 individuals, while the high fiber/low fat diet of rural Native Africans in South Africa is associated with a yearly colon cancer rate of <5 per 100,000. Feeding the Western diet for two weeks to Native Africans increased their secondary bile acids, including carcinogenic deoxycholic acid, by 400%, and also changed the colonic microbiota. Evidence reviewed by Sun and Kato indicates that differences in human colonic microbiota play an important role in the progression of colon cancer.
Diet and lung cancer
A second example, relating a dietary component to a cancer, is illustrated by lung cancer. Two large population-based studies were performed, one in Italy and one in the United States. In Italy, the study population consisted of two cohorts: the first, 1721 individuals diagnosed with lung cancer and no severe disease, and the second, 1918 control individuals with absence of lung cancer history or any advanced diseases. All individuals filled out a
|
https://en.wikipedia.org/wiki/Xenobiology
|
Xenobiology (XB) is a subfield of synthetic biology, the study of synthesizing and manipulating biological devices and systems. The name "xenobiology" derives from the Greek word xenos, which means "stranger, alien". Xenobiology is a form of biology that is not (yet) familiar to science and is not found in nature. In practice, it describes novel biological systems and biochemistries that differ from the canonical DNA–RNA-20 amino acid system (see central dogma of molecular biology). For example, instead of DNA or RNA, XB explores nucleic acid analogues, termed xeno nucleic acid (XNA) as information carriers. It also focuses on an expanded genetic code and the incorporation of non-proteinogenic amino acids into proteins.
Difference between xeno-, exo-, and astro-biology
"Astro" means "star" and "exo" means "outside". Both exo- and astrobiology deal with the search for naturally evolved life in the Universe, mostly on other planets in the circumstellar habitable zone. (These are also occasionally referred to as xenobiology.) Whereas astrobiologists are concerned with the detection and analysis of life elsewhere in the Universe, xenobiology attempts to design forms of life with a different biochemistry or different genetic code than on planet Earth.
Aims
Xenobiology has the potential to reveal fundamental knowledge about biology and the origin of life. In order to better understand the origin of life, it is necessary to know why life evolved, seemingly via an early RNA world, to the DNA-RNA-protein system and its nearly universal genetic code. Was it an evolutionary "accident", or were there constraints that ruled out other types of chemistries? By testing alternative biochemical "primordial soups", researchers expect to better understand the principles that gave rise to life as we know it.
Xenobiology is an approach to develop industrial production systems with novel capabilities by means of biopolymer engineering and pathogen resistance. The genetic code encodes in all
|
https://en.wikipedia.org/wiki/John%20Derbyshire
|
John Derbyshire (born 3 June 1945) is a British-born American white supremacist political commentator, writer, journalist and computer programmer. He was noted for being one of the last paleoconservatives in the National Review, until he was fired in 2012 for writing an article for Taki's Magazine that was widely viewed as racist. Since 2012 he has written for white nationalist website VDARE.
In the article that caused his firing, Derbyshire suggested that white and Asian parents should talk to their children about the threats posed to their safety by black people. He also recommended that parents tell their children not to live in predominantly black communities. He included the line "If planning a trip to a beach or amusement park at some date, find out whether it is likely to be swamped with blacks on that date."
He has also written for the New English Review. His columns cover political-cultural topics, including immigration, China, history, mathematics, and race. Derbyshire's 1996 novel Seeing Calvin Coolidge in a Dream was a New York Times "Notable Book of the Year". His 2004 non-fiction book Prime Obsession won the Mathematical Association of America's inaugural Euler Book Prize. A political book, We Are Doomed: Reclaiming Conservative Pessimism, was released in September 2009.
Early life
Derbyshire attended the Northampton School for Boys and graduated from University College London, of the University of London, where he studied mathematics. Before turning to writing full-time, he worked on Wall Street as a computer programmer.
Career
National Review
Derbyshire worked as a writer at National Review until he was terminated in 2012 because of an article published in Taki's Magazine that was widely perceived as racist.
Derbyshire began writing for the far-right website VDARE in May 2012. In his first column for the website, Derbyshire wrote "White supremacy, in the sense of a society in which key decisions are made by white Europeans, is one of the bett
|
https://en.wikipedia.org/wiki/Prime%20Obsession
|
Prime Obsession: Bernhard Riemann and the Greatest Unsolved Problem in Mathematics (2003) is a historical book on mathematics by John Derbyshire, detailing the history of the Riemann hypothesis, named for Bernhard Riemann, and some of its applications.
The book was awarded the Mathematical Association of America's inaugural Euler Book Prize in 2007.
Overview
The book is written such that even-numbered chapters present historical elements related to the development of the conjecture, and odd-numbered chapters deal with the mathematical and technical aspects. Despite the title, the book provides biographical information on many iconic mathematicians including Euler, Gauss, and Lagrange.
In chapter 1, "Card Trick", Derbyshire introduces the idea of an infinite series and the ideas of convergence and divergence of these series. He imagines that there is a deck of cards stacked neatly together, and that one pulls off the top card so that it overhangs from the deck. Explaining that it can overhang only as far as the center of gravity allows, the card is pulled so that exactly half of it is overhanging. Then, without moving the top card, he slides the second card so that it is overhanging too at equilibrium. As he does this more and more, the fractional amount of overhanging cards as they accumulate becomes less and less. He explores various types of series such as the harmonic series.
In chapter 2, Bernhard Riemann is introduced and a brief historical account of Eastern Europe in the 18th Century is discussed.
In chapter 3, the Prime Number Theorem (PNT) is introduced. The function that mathematicians use to describe the number of primes up to N, π(N), is shown to behave logarithmically:

$$\pi(N) \sim \frac{N}{\log N},$$

where log is the natural logarithm.
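The claim is easy to check numerically; the standalone sketch below (the bound of one million is an arbitrary choice) counts primes with a sieve of Eratosthenes and compares the count to N/log N:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void)
{
    const int N = 1000000;
    char *composite = calloc(N + 1, 1);
    long count = 0;
    if (!composite) return 1;

    /* sieve of Eratosthenes: cross off multiples of each prime */
    for (int p = 2; p <= N; p++) {
        if (composite[p]) continue;
        count++;                              /* p is prime */
        for (long m = (long)p * p; m <= N; m += p)
            composite[m] = 1;
    }

    printf("pi(%d) = %ld, N/log N = %.0f\n", N, count, N / log(N));
    free(composite);
    return 0;
}

For N = 1,000,000 this prints pi(N) = 78498 against N/log N ≈ 72382; the ratio of the two tends to 1 as N grows, which is the content of the theorem.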
In chapter 4, Derbyshire gives a short biographical history of Carl Friedrich Gauss and Leonhard Euler, setting up their involvement in the Prime Number Theorem.
In chapter 5, the Riemann zeta function is introduced:

$$\zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} = 1 + \frac{1}{2^{s}} + \frac{1}{3^{s}} + \frac{1}{4^{s}} + \cdots$$
In cha
|
https://en.wikipedia.org/wiki/American%20Megatrends
|
American Megatrends International, LLC, doing business as AMI, is an international hardware and software company, specializing in PC hardware and firmware. The company was founded in 1985 by Pat Sarma and Subramonian Shankar. It is headquartered in Building 800 at 3095 Satellite Boulevard in unincorporated Gwinnett County, Georgia, United States, near the city of Duluth, and in the Atlanta metropolitan area.
The company started as a manufacturer of complete motherboards, positioning itself in the high-end segment. Its first customer was PCs Limited, later known as Dell Computer.
As hardware activity moved progressively to Taiwan-based original design manufacturers, AMI continued to develop BIOS firmware for major motherboard manufacturers. The company produced BIOS software for motherboards (1986), server motherboards (1992), storage controllers (1995) and remote-management cards (1998).
In 1993, AMI produced MegaRAID, a storage controller card. AMI sold its RAID assets to LSI Corporation in 2001, with only one employee from the RAID-division remaining with the AMI core team.
AMI continued to focus on OEM and ODM business and technology. Its product line includes AMIBIOS (a BIOS), Aptio (a successor to AMIBIOS8 based on the UEFI standard), diagnostic software, AMI EC (embedded controller firmware), MG-Series SGPIO backplane controllers (for SATA, SAS and NVMe storage devices), driver/firmware development, and MegaRAC (BMC firmware).
Founding
American Megatrends Inc. (AMI) was founded in 1985 by Subramonian Shankar and Pat Sarma with funds from a previous consulting venture, Access Methods Inc. (also AMI). Access Methods was a company run by Pat Sarma and his partner. After Access Methods successfully launched the AMIBIOS, there were legal issues among the owners of the company, resulting in Sarma buying out his partners. Access Methods still owned the rights to the AMIBIOS. Sarma had already started a company called Quintessential Consultants Inc. (QCI), and l
|
https://en.wikipedia.org/wiki/Alternative%20literature
|
Alternative literature (or alt-lit) is a literary movement strongly influenced by internet culture and online publishing. It includes various forms of prose, poetry, and new media. Alt-lit is characterized by self-publication and a presence on social media networks. Alternative literature brings together people with a common interest in the online publishing world.
Origins
The term was first used to refer to this community of writers in the summer of 2011, when Tumblr and Twitter accounts named "Alt Lit Gossip" emerged, created by Cory Stephens (@outmouth). The accounts covered writers from presses and publications such as Muumuu House, Pop Serial, and HTMLgiant in a style akin to celebrity gossip sources like TMZ. After a few months the original accounts were deleted; they were revived by Frank Hinton in the fall of 2011, and began to gain popularity.
Shared traits across Alt Lit
Alt Lit is often characterized by self-publication, self-promotion, and the maintenance of a presence on social media networks. Josh Soilker has said that Alt Lit is "in blog posts, videos, gchats and Facebook status updates. In PDFs and folded papers..." and that the movement's principal figures were Tao Lin, Noah Cicero and Brandon Scott Gorrell.
Alt Lit writers share Gmail chat logs, image macros, screenshots, and tweets, which are then self-published as poetry books and/or novels.
Writing for the New Yorker, Kenneth Goldsmith characterized Alt Lit writing as "marked by direct speech, expressions of aching desire, and wide-eyed sincerity". He also noted that Alt Lit is "usually written in the Internet vernacular of lowercase letters, inverted punctuation, abundant typos, and bad grammar".
Authors and works
Literary magazines and blogs
Online and print Alt Lit magazines include SWAY Press, Illuminati Girl Gang, New Wave Vomit, Pop Serial, Shabby Doll House, Have U Seen My Whale, The Bushwick Review, The Mall, Keep This Bag Away From Children, Everyday Genius, Metazen, Housefire, UP
|
https://en.wikipedia.org/wiki/California%20High-Speed%20Rail
|
California High-Speed Rail (also known as CAHSR or CHSR) is a publicly funded high-speed rail system currently being developed in California in the United States. In 1996, the California Legislature and Governor Pete Wilson established the California High-Speed Rail Authority with the task of creating a plan for the system and then presenting it to the voters of the state for approval. In 2008, voters approved the plan given in Proposition 1A, which specified a route connecting all the major population centers of the state, authorized the issuance of bonds for beginning implementation, and established other requirements.
The CAHSR system is currently being implemented in phased segments. Construction began in 2015 for the first of the dedicated HSR segments, the Interim Initial Operating Segment ("Interim IOS"), in the San Joaquin Valley portion of California's Central Valley. It will run from Merced to Bakersfield and is planned to begin operations in 2030 (or slightly later). Concurrently, in the major metropolitan areas of San Francisco Bay Area and Greater Los Angeles, the commuter rail systems are being upgraded for improved safety and service, and to support a "blended system" in the future, with CAHSR sharing upgraded tracks, power systems, train control systems, and stations. Proposition 1A did not specify the use of a "blended system" in the large metropolitan areas; however, cost (and other) considerations forced the Authority to adopt this approach in 2012. Extending the Interim IOS to connect to the northern and southern metropolitan segments is dependent on future funding, so it is uncertain when (or even if) the IOS will ever link to the metropolitan areas.
Maximum train speeds will be about 220 mph (350 km/h) in the dedicated HSR segments and about 110 mph (180 km/h) in the blended segments. Per Proposition 1A, the nonstop trains between San Francisco and Los Angeles – which are about 350 miles (560 km) apart by air – must not exceed 2 hours and 40 minutes travel time.
The proposed high-speed rail sys
|
https://en.wikipedia.org/wiki/List%20of%20systemic%20diseases%20with%20ocular%20manifestations
|
An ocular manifestation of a systemic disease is an eye condition that directly or indirectly results from a disease process in another part of the body. There are many diseases known to cause ocular or visual changes. Diabetes, for example, is the leading cause of new cases of blindness in those aged 20–74, with ocular manifestations such as diabetic retinopathy and macular edema affecting up to 80% of those who have had the disease for 15 years or more. Other diseases such as acquired immunodeficiency syndrome (AIDS) and hypertension are commonly found to have associated ocular symptoms.
Systemic allergic diseases
Asthma
Atopic dermatitis
Atopic eczema
Hay fever
Urticaria
Vernal conjunctivitis
Skin and mucous membrane diseases
Acne rosacea
Albinism
Atopic dermatitis
Behçet's disease
Cicatricial pemphigoid
Ehlers–Danlos syndrome
Epidermolysis bullosa
Erythema multiforme
Goltz–Gorlin syndrome
Ichthyosis
Incontinentia pigmenti
Nevus of Ota
Pemphigus
Pseudoxanthoma elasticum
Psoriasis
Stevens–Johnson syndrome (Erythema multiforme major)
Vogt–Koyanagi–Harada syndrome
Xeroderma pigmentosum
Phacomatoses
Angiomatosis retinae (Von Hippel–Lindau disease) (retinocerebellar capillary hemangiomatosis)
Ataxia telangiectasia (Louis–Bar syndrome)
Encephalotrigeminal angiomatosis (Sturge–Weber syndrome) (encephalofacial cavernous hemangiomatosis)
Neurofibromatosis (von Recklinghausen's disease)
Tuberous sclerosis (Bourneville's syndrome)
Wyburn–Mason syndrome (racemose hemangiomatosis)
Collagen diseases
Ankylosing spondylitis
Dermatomyositis
Periarteritis nodosa
Reactive arthritis
Rheumatoid arthritis
Ehlers–Danlos syndrome
Sarcoidosis
Scleroderma
Systemic lupus erythematosus
Temporal arteritis
Relapsing polychondritis
Granulomatosis with polyangiitis 50-60% have ophthalmologic manifestations, which can be a presenting feature in a minority of patients. Orbital disease is the most common manifestation, and may result in proptosis, restrictive ophthalmopathy, chronic orbital
|
https://en.wikipedia.org/wiki/Z/VM
|
z/VM is the current version in IBM's VM family of virtual machine operating systems. z/VM was first released in October 2000 and remains in active use and development. It is directly based on technology and concepts dating back to the 1960s, with IBM's CP/CMS on the IBM System/360-67 (see article History of CP/CMS for historical details). z/VM runs on IBM's IBM Z family of computers. It can be used to support large numbers (thousands) of Linux virtual machines. (See Linux on IBM Z.)
On 16 September 2022, IBM released z/VM Version 7.3 which requires z/Architecture, implemented in IBM's EC12, BC12 and later models.
See also
OpenSolaris for System z
PR/SM
Time-sharing system evolution
z/OS
z/TPF
z/VSE
References
Citations
External links
IBM z/VM Evaluation Edition (free download)
Virtualization software
IBM mainframe operating systems
|
https://en.wikipedia.org/wiki/Intelligent%20Peripheral%20Interface
|
Intelligent Peripheral Interface (IPI) was a server-centric storage interface used in the 1980s and early 1990s, standardized as ISO 9318.
The idea behind IPI is that the disk drives themselves are as simple as possible, containing only the lowest level control circuitry, while the IPI interface card encapsulates most of the disk control complexity. The IPI interface card, as a central point of control, is thus theoretically able to best coordinate accesses to the connected disks, as it "knows" more about the states of the connected disks than would, say, a SCSI interface.
An IPI-2 bus could provide a data transfer rate in the vicinity of 6 MB/s.
In practice, the theoretical advantages of IPI over SCSI were often not realized, as they only materialized when several disks were connected to the interface, which could then easily become a bandwidth bottleneck.
IPI systems were often shipped by Sun Microsystems on original sun4 architecture servers, but the above limitation and reliability problems made them unpopular with customers, and the technology basically disappeared by the second half of the 1990s.
See also
Enhanced Small Disk Interface (ESDI)
Storage Module Device (SMD)
SCSI
References
Computer storage buses
Interfaces
|
https://en.wikipedia.org/wiki/National%20Organic%20Program
|
The National Organic Program (NOP) is the federal regulatory framework in the United States of America governing organic food. It is also the name of the United States Department of Agriculture (USDA) Agricultural Marketing Service (AMS) program responsible for administering and enforcing the regulatory framework. The core mission of the NOP is to protect the integrity of the USDA organic seal. The seal is used for products adhering to USDA standards that contain at least 95% organic ingredients.
The Organic Foods Production Act of 1990 (OFPA) required that the USDA develop national standards for organic products, and the final rule establishing the NOP was first published in the Federal Register in 2000 and is codified in the Code of Federal Regulations at 7 CFR Part 205.
Overview
The NOP covers fresh and processed agricultural food products.
The National Organic Program grew from fewer than twelve total employees in 2008 to approximately 37 in 2019 and 82 in January 2023. This growth has been due to increased annual funding appropriated by Congress since 2018.
The key activities of the National Organic Program are to:
Maintain the Organic Integrity Database, a listing of certified organic operations, and help new farmers and businesses learn how to get certified
Develop regulations and guidance on organic standards
Manage the National List of Allowed and Prohibited Substances
Accredit certifying agents to certify organic producers and handlers
Facilitate the work of the National Organic Standards Board (NOSB), a Federal Advisory Committee
Provide training to certifying agents, USDA staff, and other stakeholders
Engage and serve the organic community
Investigate alleged violations of the organic standards and bring violators to justice
Regulation
The Organic Foods Production Act of 1990 "requires the Secretary of Agriculture to establish a National List of Allowed and Prohibited Substances which identifies synthetic substances that may be used, and the nonsynthetic s
|
https://en.wikipedia.org/wiki/Scancode
|
A scancode (or scan code) is the data that most computer keyboards send to a computer to report which keys have been pressed. A number, or sequence of numbers, is assigned to each key on the keyboard.
Variants
Mapping key positions by row and column requires less complex computer hardware; therefore, in the past, using software or firmware to translate the scancodes to text characters was less expensive than wiring the keyboard by text character. This cost difference is not as profound as it used to be. However, many types of computers still use their traditional scancodes to maintain backward compatibility.
Some keyboard standards include a scancode for each key being pressed and a different one for each key being released. In addition, many keyboard standards (for example, IBM PC compatible standards) allow the keyboard itself to generate "typematic" repeating keys by having the keyboard itself generate the pressed-key scancode repeatedly while the key is held down, with the release scancode sent once when the key is released.
Scancode sets
On some operating systems, one may discover a key's downpress scancode by holding the key down while the computer is booting. With luck, the scancode (or some part of it) will be specified in the resulting "stuck key" error message. (On Windows 7, only one byte of the scancode appears.)
PC compatibles
Scancodes on IBM PC compatible computer keyboards are sets of 1 to 3 bytes which are sent by the keyboard. Most character keys have a single byte scancode; keys that perform special functions have 2-byte or 3-byte scancodes, usually beginning with the byte (in hexadecimal) E0, E1, or E2. In addition, a few keys send longer scancodes, effectively emulating a series of keys to make it easier for different types of software to process.
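On the receiving side, interpreting a set-1 ("XT") byte stream reduces to two rules: a key release is reported as the make code with the high bit set, and extended keys carry an E0 prefix. A minimal decoding sketch (the helper itself is illustrative; the byte values for 'A' and the Up arrow are the standard set-1 codes):

#include <stdio.h>
#include <stdint.h>

/* Interpret a stream of set-1 ("XT") scancode bytes. */
void decode_set1(const uint8_t *bytes, int n)
{
    int extended = 0;
    for (int i = 0; i < n; i++) {
        uint8_t b = bytes[i];
        if (b == 0xE0) {              /* extended-key prefix */
            extended = 1;
            continue;
        }
        int released = (b & 0x80) != 0;
        uint8_t make = b & 0x7F;      /* strip the break bit */
        printf("%skey 0x%02X %s\n", extended ? "extended " : "",
               make, released ? "released" : "pressed");
        extended = 0;
    }
}

int main(void)
{
    /* 0x1E = make code for 'A', 0x9E = its break code;
     * E0 48 / E0 C8 = Up arrow pressed / released. */
    uint8_t stream[] = { 0x1E, 0x9E, 0xE0, 0x48, 0xE0, 0xC8 };
    decode_set1(stream, (int)sizeof stream);
    return 0;
}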
PC keyboards since the PS/2 keyboard support up to three scancode sets. The most commonly encountered are the "XT" ("set 1") scancodes, based on the 83-key keyboard used by the I
|
https://en.wikipedia.org/wiki/Hypervisor
|
A hypervisor (also known as a virtual machine monitor, VMM, or virtualizer) is a type of computer software, firmware or hardware that creates and runs virtual machines. A computer on which a hypervisor runs one or more virtual machines is called a host machine, and each virtual machine is called a guest machine. The hypervisor presents the guest operating systems with a virtual operating platform and manages the execution of the guest operating systems. Unlike an emulator, the guest executes most instructions on the native hardware. Multiple instances of a variety of operating systems may share the virtualized hardware resources: for example, Linux, Windows, and macOS instances can all run on a single physical x86 machine. This contrasts with operating-system–level virtualization, where all instances (usually called containers) must share a single kernel, though the guest operating systems can differ in user space, such as different Linux distributions with the same kernel.
The term hypervisor is a variant of supervisor, a traditional term for the kernel of an operating system: the hypervisor is the supervisor of the supervisors, with hyper- used as a stronger variant of super-. The term dates to circa 1970; IBM coined it for the 360/65 and later used it for the DIAG handler of CP-67. In the earlier CP/CMS (1967) system, the term Control Program was used instead.
Classification
In his 1973 thesis, "Architectural Principles for Virtual Computer Systems," Robert P. Goldberg classified two types of hypervisor:
Type-1, native or bare-metal hypervisors
These hypervisors run directly on the host's hardware to control the hardware and to manage guest operating systems. For this reason, they are sometimes called bare-metal hypervisors. The first hypervisors, which IBM developed in the 1960s, were native hypervisors. These included the test software SIMMON and the CP/CMS operating system, the predecessor of IBM's VM family of virtual machine operating systems.
Type-
|
https://en.wikipedia.org/wiki/Sympatry
|
In biology, two related species or populations are considered sympatric when they exist in the same geographic area and thus frequently encounter one another. An initially interbreeding population that splits into two or more distinct species sharing a common range exemplifies sympatric speciation. Such speciation may be a product of reproductive isolation – which prevents hybrid offspring from being viable or able to reproduce, thereby reducing gene flow – that results in genetic divergence. Sympatric speciation may, but need not, arise through secondary contact, which refers to speciation or divergence in allopatry followed by range expansions leading to an area of sympatry. Sympatric species or taxa in secondary contact may or may not interbreed.
Types of populations
Four main types of population pairs exist in nature. Sympatric populations (or species) contrast with parapatric populations, which contact one another in adjacent but not shared ranges and do not interbreed; peripatric species, which are separated only by areas in which neither organism occurs; and allopatric species, which occur in entirely distinct ranges that are neither adjacent nor overlapping. Allopatric populations isolated from one another by geographical factors (e.g., mountain ranges or bodies of water) may experience genetic—and, ultimately, phenotypic—changes in response to their varying environments. These may drive allopatric speciation, which is arguably the dominant mode of speciation.
Evolving definitions and controversy
The lack of geographic isolation as a definitive barrier between sympatric species has yielded controversy among ecologists, biologists, botanists, and zoologists regarding the validity of the term. As such, researchers have long debated the conditions under which sympatry truly applies, especially with respect to parasitism. Because parasitic organisms often inhabit multiple hosts during a life cycle, evolutionary biologist Ernst Mayr stated that internal parasit
|
https://en.wikipedia.org/wiki/Teredo%20tunneling
|
In computer networking, Teredo is a transition technology that gives full IPv6 connectivity for IPv6-capable hosts that are on the IPv4 Internet but have no native connection to an IPv6 network. Unlike similar protocols such as 6to4, it can perform its function even from behind network address translation (NAT) devices such as home routers.
Teredo operates using a platform independent tunneling protocol that provides IPv6 (Internet Protocol version 6) connectivity by encapsulating IPv6 datagram packets within IPv4 User Datagram Protocol (UDP) packets. Teredo routes these datagrams on the IPv4 Internet and through NAT devices. Teredo nodes elsewhere on the IPv6 network (called Teredo relays) receive the packets, un-encapsulate them, and pass them on.
Teredo is a temporary measure. In the long term, all IPv6 hosts should use native IPv6 connectivity. Teredo should be disabled when native IPv6 connectivity becomes available. Christian Huitema developed Teredo at Microsoft, and the IETF standardized it as RFC 4380. The Teredo server listens on UDP port 3544.
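Per RFC 4380, each Teredo client is assigned an IPv6 address under the 2001:0000::/32 prefix that embeds the Teredo server's IPv4 address, a flags field, and the client's NAT-mapped UDP port and public IPv4 address, the last two stored XORed with all-ones bits. A minimal decoding sketch (the function name is illustrative; the sample address is the format's common documentation example):

#include <stdio.h>
#include <stdint.h>

static void print_ipv4(uint32_t ip)
{
    printf("%u.%u.%u.%u", (unsigned)(ip >> 24), (unsigned)(ip >> 16) & 255,
           (unsigned)(ip >> 8) & 255, (unsigned)ip & 255);
}

/* Field layout of a Teredo IPv6 address (RFC 4380):
 * bytes 0-3   prefix 2001:0000
 * bytes 4-7   Teredo server IPv4
 * bytes 8-9   flags
 * bytes 10-11 client UDP port, XOR 0xFFFF
 * bytes 12-15 client public IPv4, XOR 0xFFFFFFFF */
void decode_teredo(const uint8_t a[16])
{
    uint32_t server = (uint32_t)a[4] << 24 | (uint32_t)a[5] << 16 |
                      (uint32_t)a[6] << 8  | a[7];
    uint16_t flags  = (uint16_t)(a[8] << 8 | a[9]);
    uint16_t port   = (uint16_t)((a[10] << 8 | a[11]) ^ 0xFFFF);
    uint32_t client = ((uint32_t)a[12] << 24 | (uint32_t)a[13] << 16 |
                       (uint32_t)a[14] << 8  | a[15]) ^ 0xFFFFFFFFu;

    printf("server ");
    print_ipv4(server);
    printf(", flags 0x%04X, client ", flags);
    print_ipv4(client);
    printf(" port %u\n", port);
}

int main(void)
{
    /* 2001:0000:4136:e378:8000:63bf:3fff:fdd2 decodes to
     * server 65.54.227.120, client 192.0.2.45, port 40000 */
    uint8_t addr[16] = { 0x20,0x01, 0x00,0x00, 0x41,0x36, 0xe3,0x78,
                         0x80,0x00, 0x63,0xbf, 0x3f,0xff, 0xfd,0xd2 };
    decode_teredo(addr);
    return 0;
}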
Purpose
6to4, the most common IPv6-over-IPv4 tunneling protocol, requires that the tunnel endpoint have a public IPv4 address. However, many hosts currently attach to the IPv4 Internet through one or several NAT devices, usually because of IPv4 address shortage. In such a situation, the only available public IPv4 address is assigned to the NAT device, and the 6to4 tunnel endpoint must be implemented on the NAT device itself. The problem is that many NAT devices currently deployed cannot be upgraded to implement 6to4, for technical or economic reasons.
Teredo alleviates this problem by encapsulating IPv6 packets within UDP/IPv4 datagrams, which most NATs can forward properly. Thus, IPv6-aware hosts behind NATs can serve as Teredo tunnel endpoints even when they don't have a dedicated public IPv4 address. In effect, a host that implements Teredo can gain IPv6 connectivity with no cooperation from the local
|
https://en.wikipedia.org/wiki/Check%20mark
|
A check or check mark (American English), checkmark (Philippine English), tickmark (Indian English) or tick (Australian, New Zealand and British English) is a mark (✓, ✔, etc.) used, primarily in the English-speaking world, to indicate the concept "yes" (e.g. "yes; this has been verified", "yes; that is the correct answer", "yes; this has been completed", or "yes; this [item or option] applies"). The x mark is also sometimes used for this purpose (most notably on election ballot papers, e.g. in the United Kingdom), but otherwise usually indicates "no", incorrectness, or failure. One of the earliest usages of a check mark as an indication of completion is on ancient Babylonian tablets "where small indentations were sometimes made with a stylus, usually placed at the left of a worker's name, presumably to indicate whether the listed ration has been issued."
As a verb, to check (off) or tick (off) means to add such a mark. Printed forms, printed documents, and computer software (see checkbox) commonly include squares in which to place check marks.
International differences
The check mark is a predominant affirmative symbol of convenience in the English-speaking world because of its instant and simple composition. In other language communities, there are different conventions.
It is common in Swedish schools for a ✓ to indicate that an answer is incorrect, while "R", from the Swedish rätt, i.e., "correct", is used to indicate that an answer is correct.
In Finnish, ✓ stands for väärin, i.e., "wrong", due to its similarity to a slanted v. The opposite, "correct", is marked with a slanted vertical line emphasized with two dots (see also commercial minus sign).
In Japan, the O mark is used instead of the check mark, and the X or ✓ mark is commonly used for wrong.
In the Netherlands, a 'V' is used to show that things are missing, while the flourish of approval (or krul) is used for approving a section or sum.
Unicode
Unicode provides various check marks, including U+2713 ✓ CHECK MARK, U+2714 ✔ HEAVY CHECK MARK, and U+2611 ☑ BALLOT BOX WITH CHECK.
See also
Bracket
O
|
https://en.wikipedia.org/wiki/Mathnet
|
Mathnet is a segment on the children's television show Square One Television that follows the adventures of pairs of police mathematicians. It is a pastiche of Dragnet.
Premise
Mathnet is a pastiche of Dragnet, in which the main characters are mathematicians who use their mathematical skills to solve various crimes and mysteries in the city, usually thefts, burglaries, frauds, and kidnappings. Each segment of the series aired on one episode of Square One, a production of the Children's Television Workshop (CTW) aimed at teaching math skills to young viewers. Five segments made up an episode (one for each weekday), with suspense building at the end of each segment.
Characters
Kate Monday (Beverly Leech) - A pastiche of Jack Webb's Dragnet character Joe Friday, Kate usually does not show her emotions when on the job and tackles almost every situation with a calm and rational mind. She appears in the first three seasons.
George Frankly (Joe Howard) - The partner of Kate Monday (and later Pat Tuesday), George takes his job seriously but is frequently prone to fits of comical mishaps and immature reactions. He appears in all five seasons. He has a wife named Martha whom he often mentions but who is never seen or heard.
Pat Tuesday (Toni DiBuono) - George's second partner, appearing in Seasons 4 and 5 to replace Kate. Like Kate, Pat shares the deadpan mannerisms and no-nonsense attitude of Joe Friday.
Los Angeles cast
Thad Green (James Earl Jones) - The Chief of the Los Angeles Police Department. He briefly appears in Season 4.
Debbie Williams (Mary Watson) - Technical analyst at the LAPD division where Kate and George work; she is frequently called upon to process data obtained during Mathnet investigations.
New York City cast
Joe Greco (Emilio Del Pozo) - Captain of the New York City precinct; he is the man to whom George, Kate, and later Pat report when they move to New York City starting in Season 3.
Benny Pill (Bari K. Willerford) - An undercover NYPD offi
|
https://en.wikipedia.org/wiki/Web%20Services%20Discovery
|
Web Services Discovery provides access to software systems over the Internet using standard protocols. In the most basic scenario there is a Web Service Provider that publishes a service and a Web Service Consumer that uses this service. Web Service Discovery is the process of finding suitable web services for a given task.
Publishing a web service involves creating a software artifact and making it accessible to potential consumers. Web service providers augment a service endpoint interface with an interface description using the Web Services Description Language (WSDL) so that a consumer can use the service.
Universal Description, Discovery, and Integration (UDDI) is an XML-based registry for business internet services. A provider can explicitly register a service with a Web Services Registry such as UDDI, or publish additional documents intended to facilitate discovery, such as Web Services Inspection Language (WSIL) documents. The service users or consumers can search web services manually or automatically. The implementation of UDDI servers and WSIL engines should provide simple search APIs or web-based GUIs to help find Web services.
Web services may also be discovered using multicast mechanisms like WS-Discovery, thus reducing the need for centralized registries in smaller networks.
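As a rough illustration of the multicast approach, the sketch below sends a WS-Discovery Probe over UDP and listens briefly for ProbeMatch replies. It is a minimal sketch in Python, assuming the WS-Discovery 1.1 conventions (IPv4 multicast group 239.255.255.250, UDP port 3702); the SOAP envelope shown is schematic, not a complete validated message.

```python
import socket, uuid

# Schematic WS-Discovery 1.1 Probe envelope (assumed namespaces and addressing).
PROBE = f"""<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
               xmlns:wsa="http://www.w3.org/2005/08/addressing"
               xmlns:wsd="http://docs.oasis-open.org/ws-dd/ns/discovery/2009/01">
  <soap:Header>
    <wsa:MessageID>urn:uuid:{uuid.uuid4()}</wsa:MessageID>
    <wsa:To>urn:docs-oasis-open-org:ws-dd:ns:discovery:2009:01</wsa:To>
    <wsa:Action>http://docs.oasis-open.org/ws-dd/ns/discovery/2009/01/Probe</wsa:Action>
  </soap:Header>
  <soap:Body><wsd:Probe/></soap:Body>
</soap:Envelope>"""

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(2.0)
sock.sendto(PROBE.encode(), ("239.255.255.250", 3702))  # assumed group/port
try:
    while True:
        data, addr = sock.recvfrom(65535)   # ProbeMatch replies, if any devices answer
        print(addr, data[:120])
except socket.timeout:
    pass
```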
Universal Description Discovery and Integration
Universal Description, Discovery and Integration (UDDI, pronounced ) is a platform-independent, Extensible Markup Language protocol that includes a (XML-based) registry by which businesses worldwide can list themselves on the Internet, and a mechanism to register and locate web service applications. UDDI is an open industry initiative, sponsored by the Organization for the Advancement of Structured Information Standards (OASIS), for enabling businesses to publish service listings and discover each other, and to define how the services or software applications interact over the Internet.
UDDI was originally proposed as a core Web service standard.
|
https://en.wikipedia.org/wiki/Hankel%20contour
|
In mathematics, a Hankel contour is a path in the complex plane which extends from
(+∞,δ), around the origin counter clockwise and back to
(+∞,−δ), where δ is an arbitrarily small positive number. The contour thus remains arbitrarily close to the real axis but without crossing the real axis except for negative values of x. The Hankel contour can also be represented by a path that has mirror images just above and below the real axis, connected to a circle of radius ε, centered at the origin, where ε is an arbitrarily small number. The two linear portions of the contour are said to be a distance of δ from the real axis. Thus, the total distance between the linear portions of the contour is 2δ. The contour is traversed in the positively-oriented sense, meaning that the circle around the origin is traversed counter-clockwise.
Use of Hankel contours is one of the methods of contour integration. This type of path for contour integrals was first used by Hermann Hankel in his investigations of the Gamma function.
The Hankel contour is used in deriving integral representations of functions such as the Gamma function, the Riemann zeta function, and the Hankel functions (which are Bessel functions of the third kind).
Applications
The Hankel contour and the Gamma function
The Hankel contour is helpful in expressing the Gamma function in the complex t-plane. The Gamma function can be defined for any complex value in the plane if we evaluate the integral along the Hankel contour. The Hankel contour is especially useful here because the end points of the contour vanish, which allows the fundamental property of the Gamma function, Γ(z + 1) = zΓ(z), to be satisfied.
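As a concreteness check, the sketch below numerically evaluates Hankel's classical integral 1/Γ(z) = (1/2πi) ∮ e^t t^(−z) dt, using the common convention in which the contour wraps around the negative real axis; the circle radius eps, cutoff R, and sample count n are arbitrary illustrative choices, and the result is compared against SciPy's Gamma function.

```python
import numpy as np
from scipy.special import gamma

def ctrapz(y, x):
    # complex trapezoidal rule along a parameterized path
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

def inv_gamma_hankel(z, R=40.0, eps=1.0, n=20000):
    """Approximate 1/Gamma(z) over a Hankel contour: in from -R below the
    negative real axis, counterclockwise around a circle of radius eps,
    then back out to -R above the axis. t^(-z) is computed with an explicit
    argument so each segment stays on the right branch."""
    segments = []
    r = np.linspace(R, eps, n)                       # lower edge, arg t = -pi
    segments.append((r * np.exp(-1j * np.pi), np.exp(-z * (np.log(r) - 1j * np.pi))))
    th = np.linspace(-np.pi, np.pi, n)               # circle around the origin
    segments.append((eps * np.exp(1j * th), np.exp(-z * (np.log(eps) + 1j * th))))
    r = np.linspace(eps, R, n)                       # upper edge, arg t = +pi
    segments.append((r * np.exp(1j * np.pi), np.exp(-z * (np.log(r) + 1j * np.pi))))
    total = sum(ctrapz(np.exp(t) * t_pow, t) for t, t_pow in segments)
    return total / (2j * np.pi)

z = 0.5 + 1.0j
print(inv_gamma_hankel(z))   # numerical contour integral
print(1 / gamma(z))          # reference value for comparison
```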
Derivation of the contour integral expression of the Gamma function
Note that the formal representation of the Gamma function is
$$\Gamma(z) = \int_0^\infty t^{z-1} e^{-t}\,dt.$$
To satisfy the fundamental property of the Gamma function, it follows that
$$z\,\Gamma(z) = z\int_0^\infty t^{z-1} e^{-t}\,dt$$
after multiplying both sides by z.
Thus, given that the endpoints of
|
https://en.wikipedia.org/wiki/General-purpose%20computing%20on%20graphics%20processing%20units
|
General-purpose computing on graphics processing units (GPGPU, or less often GPGP) is the use of a graphics processing unit (GPU), which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the central processing unit (CPU). The use of multiple video cards in one computer, or large numbers of graphics chips, further parallelizes the already parallel nature of graphics processing.
Essentially, a GPGPU pipeline is a kind of parallel processing between one or more GPUs and CPUs that analyzes data as if it were in image or other graphic form. While GPUs operate at lower frequencies, they typically have many times the number of cores. Thus, GPUs can process far more pictures and graphical data per second than a traditional CPU. Migrating data into graphical form and then using the GPU to scan and analyze it can create a large speedup.
GPGPU pipelines were developed at the beginning of the 21st century for graphics processing (e.g. for better shaders). These pipelines were found to fit scientific computing needs well, and have since been developed in this direction.
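A minimal sketch of the "migrate data to the GPU, compute there, copy the result back" workflow described above, written with CuPy, a NumPy-compatible GPU array library assumed to be installed alongside a CUDA-capable GPU; the array size and arithmetic are arbitrary illustrations.

```python
import numpy as np
import cupy as cp   # assumed installed; mirrors the NumPy API on the GPU

n = 10_000_000
x_cpu = np.random.rand(n).astype(np.float32)

# Migrate the data into GPU memory, run the same data-parallel computation
# there, and copy only the scalar result back to the host.
x_gpu = cp.asarray(x_cpu)
result_gpu = cp.sqrt(x_gpu * x_gpu + 1.0).sum()
print(float(result_gpu))

# The identical computation on the CPU, for comparison.
print(np.sqrt(x_cpu * x_cpu + 1.0).sum())
```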
History
In principle, any arbitrary boolean function, including addition, multiplication, and other mathematical functions, can be built up from a functionally complete set of logic operators. In 1987, Conway's Game of Life became one of the first examples of general-purpose computing using an early stream processor called a blitter to invoke a special sequence of logical operations on bit vectors.
General-purpose computing on GPUs became more practical and popular after about 2001, with the advent of both programmable shaders and floating point support on graphics processors. Notably, problems involving matrices and/or vectors especially two-, three-, or four-dimensional vectors were easy to translate to a GPU, which acts with native speed and support on those types. A significant milestone for GPGPU was the year 2003 when two research
|
https://en.wikipedia.org/wiki/Function%20of%20a%20real%20variable
|
In mathematical analysis, and applications in geometry, applied mathematics, engineering, and natural sciences, a function of a real variable is a function whose domain is the set ℝ of real numbers, or a subset of ℝ that contains an interval of positive length. Most real functions that are considered and studied are differentiable in some interval.
The most widely considered such functions are the real functions, which are the real-valued functions of a real variable, that is, the functions of a real variable whose codomain is the set of real numbers.
Nevertheless, the codomain of a function of a real variable may be any set. However, it is often assumed to have a structure of ℝ-vector space over the reals. That is, the codomain may be a Euclidean space, a coordinate vector space, the set of matrices of real numbers of a given size, or an ℝ-algebra, such as the complex numbers or the quaternions. The structure of ℝ-vector space of the codomain induces a structure of ℝ-vector space on the functions. If the codomain has a structure of ℝ-algebra, the same is true for the functions.
The image of a function of a real variable is a curve in the codomain. In this context, a function that defines such a curve is called a parametric equation of the curve.
When the codomain of a function of a real variable is a finite-dimensional vector space, the function may be viewed as a sequence of real functions. This is often used in applications.
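For instance, here is a small sketch of a function of a real variable with codomain ℝ²: the parametric equation f(t) = (cos t, sin t) traces the unit circle, and its two components are ordinary real functions of t. The function and sample points are illustrative choices, not taken from the article.

```python
import numpy as np

# A function of a real variable with codomain R^2; its image is the unit circle.
def f(t):
    return np.array([np.cos(t), np.sin(t)])

for t in np.linspace(0.0, np.pi, 3):
    x, y = f(t)   # the two real component functions evaluated at t
    print(f"t = {t:.3f} -> ({x:+.3f}, {y:+.3f})")
```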
Real function
A real function is a function from a subset of ℝ to ℝ, where ℝ denotes as usual the set of real numbers. That is, the domain of a real function is a subset D ⊆ ℝ, and its codomain is ℝ. It is generally assumed that the domain contains an interval of positive length.
Basic examples
For many commonly used real functions, the domain is the whole set of real numbers, and the function is continuous and differentiable at every point of the domain. One says that these functions are defined, continuous and differentiable everywhere. This is the case of:
All polynomial functions
|
https://en.wikipedia.org/wiki/Bioaugmentation
|
Biological augmentation is the addition of archaea or bacterial cultures required to speed up the rate of degradation of a contaminant. Organisms that originate from contaminated areas may already be able to break down waste, but perhaps inefficiently and slowly.
Bioaugmentation is a type of bioremediation that requires studying the indigenous varieties present in the location to determine if biostimulation is possible. If the indigenous bacteria found in the location can metabolize the contaminants, more of these indigenous bacterial cultures are introduced into the location to boost the degradation of the contaminants. Bioaugmentation is the introduction of more archaea or bacterial cultures to enhance contaminant degradation, whereas biostimulation is the addition of nutritional supplements for the indigenous bacteria to promote bacterial metabolism. If the indigenous varieties do not have the metabolic capability to perform the remediation process, exogenous varieties with such sophisticated pathways are introduced. The utilization of bioaugmentation has advanced the fields of microbial ecology and biology, immobilization, and bioreactor design.
Bioaugmentation is commonly used in municipal wastewater treatment to restart activated sludge bioreactors. Most cultures available contain microbial cultures, already containing all necessary microorganisms (B. licheniformis, B. thuringiensis, P. polymyxa, B. stearothermophilus, Penicillium sp., Aspergillus sp., Flavobacterium, Arthrobacter, Pseudomonas, Streptomyces, Saccharomyces, etc.). Activated sludge systems are generally based on microorganisms like bacteria, protozoa, nematodes, rotifers, and fungi, which are capable of degrading biodegradable organic matter. There are many positive outcomes from the use of bioaugmentation, such as the improvement in efficiency and speed of the process of breaking down substances and the reduction of toxic pa
|
https://en.wikipedia.org/wiki/Traditional%20engineering
|
Traditional engineering, also known as sequential engineering, is the process of marketing, engineering design, manufacturing, testing and production where each stage of the development process is carried out separately, and the next stage cannot start until the previous stage is finished. The information flow is therefore only in one direction, and it is not until the end of the chain that errors, changes and corrections can be relayed back to the start of the sequence, causing costs to exceed their original estimates.
This causes many problems, such as time lost to the many modifications made because each stage does not take the next into account. The method is rarely used today, as concurrent engineering is more efficient.
Traditional engineering is also known as over the wall engineering as each stage blindly throws the development to the next stage over the wall.
Lean manufacturing
Traditional manufacturing has been driven by sales forecasts that companies need to produce and stockpile inventory to support. Lean manufacturing is based on the concept that production should be driven by the actual customer demands and requirements. Instead of pushing product to the marketplace, it is pulled through by the customers' actual needs.
Sequential engineering stages
Research
Design
Manufacture
Quality Control
Distribution
Sales
Disadvantages of sequential engineering
This orderly step-by-step process will bring control to complex projects but is very slow.
In today’s highly competitive market place this can lead to product failures and lost sales.
See also
Waterfall model
References
Engineering concepts
Product management
Supply chain management
|
https://en.wikipedia.org/wiki/Legendre%20chi%20function
|
In mathematics, the Legendre chi function is a special function whose Taylor series is also a Dirichlet series, given by
$$\chi_\nu(z) = \sum_{k=0}^{\infty} \frac{z^{2k+1}}{(2k+1)^\nu}.$$
As such, it resembles the Dirichlet series for the polylogarithm, and, indeed, is trivially expressible in terms of the polylogarithm as
$$\chi_\nu(z) = \tfrac{1}{2}\left[\operatorname{Li}_\nu(z) - \operatorname{Li}_\nu(-z)\right].$$
The Legendre chi function appears as the discrete Fourier transform, with respect to the order ν, of the Hurwitz zeta function, and also of the Euler polynomials, with the explicit relationships given in those articles.
The Legendre chi function is a special case of the Lerch transcendent, and is given by
$$\chi_\nu(z) = 2^{-\nu} z\, \Phi(z^2, \nu, \tfrac{1}{2}).$$
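A quick numerical check of the series definition against the polylogarithm identity above, using the mpmath library; the truncation at 200 terms and the sample point z = 1/2, ν = 2 are arbitrary choices.

```python
import mpmath as mp

# chi_nu(z) = sum over k of z^(2k+1) / (2k+1)^nu, truncated for illustration.
def chi_series(nu, z, terms=200):
    return mp.fsum(z**(2*k + 1) / (2*k + 1)**nu for k in range(terms))

z, nu = mp.mpf("0.5"), 2
print(chi_series(nu, z))
print((mp.polylog(nu, z) - mp.polylog(nu, -z)) / 2)   # should agree closely
```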
Identities
Integral relations
References
Special functions
|
https://en.wikipedia.org/wiki/Catallaxy
|
Catallaxy or catallactics is an alternative expression for the word "economy". Whereas the word economy suggests that people in a community possess a common and congruent set of values and goals, catallaxy suggests that the emergent properties of a market (prices, division of labor, growth, etc.) are the outgrowths of the diverse and disparate goals of the individuals in a community.
Aristotle was the first person to define the word "economy" as ‘the art of household management’. As is still a common method of explanation today, Aristotle tried to explain complex market phenomena through an analogy between a household and a state; take, for example, the modern analogy between the national debt of a country's government and a simple consumer's credit card debt. Aristotle used a common Greek word 'oikonomia' that meant "to direct a single household," and used it to mean the management of an entire city-state. The word catallaxy aims to provide a more accurate and inclusive word for the market phenomenon of groups of households, in which participants are free to pursue diverse ends of their own.
After being discussed by Ludwig von Mises the term catallaxy was later made popular by Friedrich Hayek who defines it as "the order brought about by the mutual adjustment of many individual economies in a market".
Catallaxy may also be used to refer to a marketplace of ideas, especially a place where people holding diverse political ideologies come together to gain deeper understanding of a wide range of political orientations.
Catallaxy has also become a new dimension in software design and network architecture.
See also
Catallactics
Demonstrated preference
Partial knowledge
Flat organization
Invisible hand
Tax choice
The Fatal Conceit
The Use of Knowledge in Society
Wisdom of the crowd
References
Notes
Austrian School
Self-organization
|
https://en.wikipedia.org/wiki/Eisenstein%20series
|
Eisenstein series, named after German mathematician Gotthold Eisenstein, are particular modular forms with infinite series expansions that may be written down directly. Originally defined for the modular group, Eisenstein series can be generalized in the theory of automorphic forms.
Eisenstein series for the modular group
Let τ be a complex number with strictly positive imaginary part. Define the holomorphic Eisenstein series G_{2k}(τ) of weight 2k, where k ≥ 2 is an integer, by the following series:
$$G_{2k}(\tau) = \sum_{(m,n)\neq(0,0)} \frac{1}{(m + n\tau)^{2k}}.$$
This series absolutely converges to a holomorphic function of τ in the upper half-plane and its Fourier expansion given below shows that it extends to a holomorphic function at τ = i∞. It is a remarkable fact that the Eisenstein series is a modular form. Indeed, the key property is its SL(2, ℤ)-covariance. Explicitly, if a, b, c, d ∈ ℤ and ad − bc = 1, then
$$G_{2k}\!\left(\frac{a\tau + b}{c\tau + d}\right) = (c\tau + d)^{2k}\, G_{2k}(\tau).$$
Note that k ≥ 2 is necessary for the series to converge absolutely, whereas the weight 2k needs to be even, since otherwise the sum vanishes because the (m, n) and (−m, −n) terms cancel out. For 2k = 2 the series converges but it is not a modular form.
Relation to modular invariants
The modular invariants g₂ and g₃ of an elliptic curve are given by the first two Eisenstein series:
$$g_2 = 60\, G_4, \qquad g_3 = 140\, G_6.$$
The article on modular invariants provides expressions for these two functions in terms of theta functions.
Recurrence relation
Any holomorphic modular form for the modular group can be written as a polynomial in G₄ and G₆. Specifically, the higher-order G_{2k} can be written in terms of G₄ and G₆ through a recurrence relation whose coefficients are binomial coefficients.
The G_{2k} occur in the series expansion for Weierstrass's elliptic function:
$$\wp(z) = \frac{1}{z^2} + \sum_{k=1}^{\infty} (2k+1)\, G_{2k+2}\, z^{2k}.$$
Fourier series
Define q = e^{2πiτ}. (Some older books define q to be the nome q = e^{πiτ}, but q = e^{2πiτ} is now standard in number theory.) Then the Fourier series of the Eisenstein series is
$$G_{2k}(\tau) = 2\zeta(2k)\left(1 + c_{2k}\sum_{n=1}^{\infty} \sigma_{2k-1}(n)\, q^n\right)$$
where the coefficients c_{2k} are given by
$$c_{2k} = \frac{(2\pi i)^{2k}}{(2k-1)!\,\zeta(2k)} = -\frac{4k}{B_{2k}}.$$
Here, the B_{2k} are the Bernoulli numbers, ζ is Riemann's zeta function and σ_p(n) is the divisor sum function, the sum of the pth powers of the divisors of n.
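These expansions can be checked with exact integer arithmetic. The sketch below builds the normalized series E₄ = 1 + 240 Σ σ₃(n)qⁿ and E₆ = 1 − 504 Σ σ₅(n)qⁿ (the coefficients c₄ = 240 and c₆ = −504 from the formula above) as truncated coefficient lists, and verifies that (E₄³ − E₆²)/1728 reproduces the first few coefficients of the modular discriminant, namely Ramanujan's tau values 1, −24, 252, −1472, 4830. The truncation order N is an arbitrary choice.

```python
N = 6  # truncation order: keep coefficients of q^0 .. q^(N-1)

def sigma(p, n):
    # divisor sum function: sum of p-th powers of the divisors of n
    return sum(d**p for d in range(1, n + 1) if n % d == 0)

def mul(a, b):
    # truncated product of two q-series given as coefficient lists
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

E4 = [1] + [240 * sigma(3, n) for n in range(1, N)]
E6 = [1] + [-504 * sigma(5, n) for n in range(1, N)]

disc = [(x - y) // 1728 for x, y in zip(mul(mul(E4, E4), E4), mul(E6, E6))]
print(disc[1:])   # -> [1, -24, 252, -1472, 4830]
```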
|
https://en.wikipedia.org/wiki/Commodore%20900
|
The Commodore 900 (also known as the C900, Z-8000, and Z-Machine) was a prototype microcomputer originally intended for business computing and, later, as an affordable UNIX workstation. It was to replace the aging PET/CBM families of personal computers that had found success in Europe as business machines. The project was initiated in 1983 by Commodore systems engineers Frank Hughes, Robert Russell, and Shiraz Shivji.
In early 1983, Commodore announced an agreement with Zilog to adopt the Z8000 family of processors for its next generation of computers, conferring rights to Commodore to manufacture these processors and for Zilog to manufacture various Commodore-designed integrated circuit products. Zilog was to manufacture components for Commodore's computers, allowing Commodore to expand its own semiconductor operation. Commodore had reportedly been developing its own 16-bit microprocessor, abandoning this effort to adopt the Z8000.
Design
The C900 was a 16-bit computer based on the segmented version of the Zilog Z8000 CPU. Initial announcements indicated the use of a 10 MHz Z8001 processor, but earlier technical documentation suggested the use of a 6 MHz part and detailed the option of a Z8070 arithmetic processing unit (APU) running at 24 MHz. The specification as announced in 1984 featured 256 KB of RAM and a 10 MB hard drive, but subsequently settled on 512 KB of RAM and a 20 MB hard drive as the minimum configuration, with 40 MB and 67 MB hard drives offered as options. A minimum configuration system had been expected to provide only 128 KB of RAM and a 320 KB floppy drive, selling for under $1,000.
Two versions of the machine were developed: a workstation with pixel graphics and a multi-user system featuring a text-only display intended to act as a server for a number of connected character-based terminals. For the text-only configuration and for lower-resolution graphical output, the system employed the MOS Technology 8563 video controller, this support
|
https://en.wikipedia.org/wiki/Sottens%20transmitter
|
The Sottens Transmitter was the nationwide transmitter for French-speaking Switzerland. The transmitter is located at Sottens, Canton of Vaud, Switzerland. It was run on 765 kHz with a power of 600 kilowatts and was easily receivable during the night throughout the whole of Europe. Since 1989 the aerial used has been a centre-fed dipole fixed on the outside of a 188-metre-high grounded free-standing steel framework tower. Before 1989 a 190-metre-high self-radiating, free-standing steel framework tower was used as the transmission aerial.
The Sottens transmitter most recently broadcast the Option Musique radio programme from Radio Suisse Romande, up until 5 December 2010.
There is also a 125 metre tall free-standing lattice tower on the site. This tower was built in 1931 as one of a pair, which until 1958 carried between them a T-antenna for medium wave broadcasting. The second tower was dismantled in that year and rebuilt in Dole as a TV transmission tower. This tower is insulated from ground to form a tower radiator and is used as backup antenna.
After the shutdown of RSR on MW, the antenna was later used for ham radio experiments in February 2011, using both standard AM and DRM in the 80 m band.
See also
Lattice tower
External links
Sottens transmitter pictures on emetteurs.ch
http://www.skyscraperpage.com/diagrams/?b44084
http://www.skyscraperpage.com/diagrams/?b57952
The Sottens transmitter Retrieved 26 January 2006
Sottens transmitter Sottens
Lattice towers
Towers in Switzerland
Buildings and structures in the canton of Vaud
Broadcast transmitters
1931 establishments in Switzerland
Towers completed in 1931
20th-century architecture in Switzerland
|
https://en.wikipedia.org/wiki/Kirchhoff%27s%20theorem
|
In the mathematical field of graph theory, Kirchhoff's theorem or Kirchhoff's matrix tree theorem, named after Gustav Kirchhoff, is a theorem about the number of spanning trees in a graph, showing that this number can be computed in polynomial time from the determinant of a submatrix of the Laplacian matrix of the graph; specifically, the number is equal to any cofactor of the Laplacian matrix. Kirchhoff's theorem is a generalization of Cayley's formula, which provides the number of spanning trees in a complete graph.
Kirchhoff's theorem relies on the notion of the Laplacian matrix of a graph, which is equal to the difference between the graph's degree matrix (a diagonal matrix with vertex degrees on the diagonals) and its adjacency matrix (a (0,1)-matrix with 1's at places corresponding to entries where the vertices are adjacent and 0's otherwise).
For a given connected graph G with n labeled vertices, let λ₁, λ₂, ..., λ_{n−1} be the non-zero eigenvalues of its Laplacian matrix. Then the number of spanning trees of G is
$$t(G) = \frac{1}{n}\,\lambda_1 \lambda_2 \cdots \lambda_{n-1}.$$
An English translation of Kirchhoff's original 1847 paper was made by J. B. O'Toole and published in 1958.
An example using the matrix-tree theorem
First, construct the Laplacian matrix Q for the example diamond graph G (see image on the right):
Next, construct a matrix Q* by deleting any row and any column from Q. For example, deleting row 1 and column 1 yields
Finally, take the determinant of Q* to obtain t(G), which is 8 for the diamond graph. (Notice t(G) is the (1,1)-cofactor of Q in this example.)
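The example computation translates directly into code. Here is a minimal NumPy sketch for the diamond graph; the vertex labels and edge list are one way to encode it.

```python
import numpy as np

# Diamond graph: K4 minus one edge, encoded as an edge list on vertices 0..3.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
Q = np.diag(A.sum(axis=1)) - A     # Laplacian = degree matrix - adjacency matrix

# Delete any one row and the corresponding column, then take the determinant.
Q_star = np.delete(np.delete(Q, 0, axis=0), 0, axis=1)
print(round(np.linalg.det(Q_star)))   # -> 8 spanning trees
```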
Proof outline
(The proof below is based on the Cauchy–Binet formula. An elementary induction argument for Kirchhoff's theorem can be found on page 654 of Moore (2011).)
First notice that the Laplacian matrix has the property that the sum of its entries across any row and any column is 0. Thus we can transform any minor into any other minor by adding rows and columns, switching them, and multiplying a row or a column by −1. Thus the cofac
|
https://en.wikipedia.org/wiki/Acid-fastness
|
Acid-fastness is a physical property of certain bacterial and eukaryotic cells, as well as some sub-cellular structures, specifically their resistance to decolorization by acids during laboratory staining procedures. Once stained as part of a sample, these organisms can resist the acid and/or ethanol-based decolorization procedures common in many staining protocols, hence the name acid-fast.
The mechanisms of acid-fastness vary by species, although the most well-known example is in the genus Mycobacterium, which includes the species responsible for tuberculosis and leprosy. The acid-fastness of Mycobacteria is due to the high mycolic acid content of their cell walls, which is responsible for the staining pattern of poor absorption followed by high retention. Some bacteria may also be partially acid-fast, such as Nocardia.
Acid-fast organisms are difficult to characterize using standard microbiological techniques, though they can be stained using concentrated dyes, particularly when the staining process is combined with heat. Some, such as Mycobacteria, can be stained with the Gram stain, but they do not take the crystal violet well and thus appear light purple, which can still potentially result in an incorrect gram negative identification.
The most common staining technique used to identify acid-fast bacteria is the Ziehl–Neelsen stain, in which the acid-fast species are stained bright red and stand out clearly against a blue background. Another method is the Kinyoun method, in which the bacteria are stained bright red and stand out clearly against a green background. Acid-fast Mycobacteria can also be visualized by fluorescence microscopy using specific fluorescent dyes (auramine-rhodamine stain, for example). The eggs of the parasitic lung fluke Paragonimus westermani are actually destroyed by the stain, which can hinder diagnosis in patients who present with TB-like symptoms.
Some acid-fast staining techniques
Ziehl–Neelsen stain (classic and modified bleac
|
https://en.wikipedia.org/wiki/Content%20Addressable%20File%20Store
|
The Content Addressable File Store (CAFS) was a hardware device developed by International Computers Limited (ICL) that provided a disk storage with built-in search capability. The motivation for the device was the discrepancy between the high speed at which a disk could deliver data, and the much lower speed at which a general-purpose processor could filter the data looking for records that matched a search condition.
Development of CAFS started in ICL's Research and Advanced Development Centre under Gordon Scarrott in the late 1960s following research by George Coulouris and John Evans who had completed a field study at Imperial College and Queen Mary College on database systems and applications (Scarrott, 1995). Their study had revealed the potential for substantial performance improvements in large-scale database applications by the inclusion of search logic in the disk controller.
In its initial form, the search logic was built into the disk head. A standalone CAFS device was installed with a few customers, including BT Directory Enquiries, during the 1970s.
The device was subsequently productised and in 1982 was incorporated as a standard feature within ICL's 2900 series and Series 39 mainframes. By this stage, to reduce costs and to take advantage of increased hardware speeds, the search logic was incorporated into the disk controller. A query expressed in a high-level query language could be compiled into a search specification that was then sent to the disk controller for execution. Initially this capability was integrated into ICL's own Querymaster query language, which worked in conjunction with the IDMS database; subsequently it was integrated into the ICL VME port of the Ingres relational database.
ICL received the Queen's Award for Technological Achievement for CAFS in 1985.
One factor which limited the adoption of CAFS was that the device needed to know the layout of data on disk, and placed constraints on this layout. Integrating database product
|
https://en.wikipedia.org/wiki/Shape%20optimization
|
Shape optimization is part of the field of optimal control theory. The typical problem is to find the shape which is optimal in that it minimizes a certain cost functional while satisfying given constraints. In many cases, the functional being minimized depends on the solution of a given partial differential equation defined on the variable domain.
Topology optimization is, in addition, concerned with the number of connected components/boundaries belonging to the domain. Such methods are needed since typically shape optimization methods work in a subset of allowable shapes which have fixed topological properties, such as having a fixed number of holes in them. Topological optimization techniques can then help work around the limitations of pure shape optimization.
Definition
Mathematically, shape optimization can be posed as the problem of finding a bounded set Ω minimizing a functional F(Ω),
possibly subject to a constraint of the form G(Ω) = 0.
Usually we are interested in sets Ω which have a Lipschitz or C¹ boundary and consist of finitely many components, which is a way of saying that we would like to find a rather pleasing shape as a solution, not some jumble of rough bits and pieces. Sometimes additional constraints need to be imposed to that end to ensure well-posedness of the problem and uniqueness of the solution.
Shape optimization is an infinite-dimensional optimization problem. Furthermore, the space of allowable shapes over which the optimization is performed does not admit a vector space structure, making application of traditional optimization methods more difficult.
Examples
Techniques
Shape optimization problems are usually solved numerically, by using iterative methods. That is, one starts with an initial guess for a shape, and then gradually evolves it,
until it morphs into the optimal shape.
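A toy sketch of this iterative approach: the boundary is parameterized as r(θ) = r₀ + Σ a_k cos(kθ), and a general-purpose optimizer shortens the perimeter while a quadratic penalty holds the area at π. The parameterization, penalty weight, and initial guess are arbitrary illustrative choices; the known optimum is the unit circle.

```python
import numpy as np
from scipy.optimize import minimize

K, M = 4, 512
theta = np.linspace(0.0, 2*np.pi, M, endpoint=False)
dth = 2*np.pi / M
kk = np.arange(1, K + 1)[:, None]
cosk, sink = np.cos(kk * theta), np.sin(kk * theta)

def objective(p):
    r0, a = p[0], p[1:]
    r = r0 + a @ cosk                       # boundary radius r(theta)
    dr = -(a * np.arange(1, K + 1)) @ sink  # derivative r'(theta)
    perimeter = np.sum(np.sqrt(r**2 + dr**2)) * dth
    area = 0.5 * np.sum(r**2) * dth
    return perimeter + 100.0 * (area - np.pi)**2   # penalty enforces area = pi

p0 = np.array([1.3, 0.3, -0.2, 0.1, 0.05])  # initial guess: a wobbly blob
res = minimize(objective, p0, method="BFGS")
print(np.round(res.x, 3))   # approximately [1, 0, 0, 0, 0]: the circle
```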
Keeping track of the shape
To solve a shape optimization problem, one needs to find a way to represent a shape in the computer memory, and follow its evolution. Severa
|
https://en.wikipedia.org/wiki/RBBS-PC
|
RBBS-PC (acronym for Remote Bulletin Board System for the Personal Computer) was a public domain, open-source BBS software program. It was written entirely in BASIC by a large team of people, starting with Russell Lane and then later enhanced by Tom Mack, Ken Goosens and others.
It supported messaging conferences, questionnaires, doors (through the dropfile), and much more.
History
In 1982, Larry Jordan of the Capital PC Users Group started modifying some existing BBS software that had been ported from CP/M by Russell Lane. The first major release of this effort, RBBS-PC CPC09, in May 1983 was written in interpreted BASIC and included the Xmodem file transfer protocol added by Jordan. In June 1983, Jordan turned over maintenance and enhancements to Tom Mack and Ken Goosens. The first release under Mack, version 10.0, was released July 4, 1983. New versions and features were released steadily throughout the rest of the 1980s. The final complete version, 17.4, was released March 22, 1992.
Since version 17.4 at least four other code paths have developed. Some work has been done to unify the code paths and to develop version 18.0. Dan Drinnon's CDOR Mods and Mapleleaf versions were further enhanced by beta testers Mike Moore and Bob Manapeli, using Ken Goosens' LineBled program to manipulate the source code into endless variations of the program.
Philosophy
From the beginning of RBBS-PC's development, the authors of the software had two goals as stated in the RBBS-PC documentation:
To show what could be done with the BASIC language and that "real programmers can/do program in BASIC."
To open a new medium of communication giving anyone with a personal computer the ability to communicate freely. This idea was summarized as "Users helping users for free to help the free exchange of information."
References
External links
RBBS-PC files
The BBS Software Directory - RBBS
Bulletin board system software
DOS software
Pre–World Wide Web online services
Computer-rela
|
https://en.wikipedia.org/wiki/Projective%20line%20over%20a%20ring
|
In mathematics, the projective line over a ring is an extension of the concept of projective line over a field. Given a ring A with 1, the projective line P(A) over A consists of points identified by projective coordinates. Let U be the group of units of A; pairs (a, b) and (c, d) from A × A are related when there is a u in U such that ua = c and ub = d. This relation is an equivalence relation. A typical equivalence class is written U[a, b].
P(A) = { U[a, b] : aA + bA = A }, that is, U[a, b] is in the projective line if the ideal generated by a and b is all of A.
The projective line P(A) is equipped with a group of homographies. The homographies are expressed through use of the matrix ring over A and its group of units V as follows: If c is in Z(U), the center of U, then the group action of the scalar matrix cI on P(A) is the same as the action of the identity matrix. Such matrices represent a normal subgroup N of V. The homographies of P(A) correspond to elements of the quotient group V / N.
P(A) is considered an extension of the ring A since it contains a copy of A due to the embedding a ↦ U[a, 1]. The multiplicative inverse mapping u ↦ 1/u, ordinarily restricted to the group of units U of A, is expressed by a homography on P(A) that swaps the coordinates: U[a, 1] ↦ U[1, a].
Furthermore, for u in U, the multiplication mapping can be extended to a homography; since u is arbitrary, it may be substituted for u⁻¹.
Homographies on P(A) are called linear-fractional transformations since
Instances
Rings that are fields are most familiar: The projective line over GF(2) has three elements: U[0, 1], U[1, 0], and U[1, 1]. Its homography group is the permutation group on these three.
The ring Z / 3Z, or GF(3), has the elements 1, 0, and −1; its projective line has the four elements U[0, 1], U[1, 0], U[1, 1], and U[1, −1], since both 1 and −1 are units. The homography group on this projective line has 12 elements, also described with matrices or as permutations. For a finite field GF(q), the projective line is the Galois geometry PG(1, q). J. W. P. Hirschfeld has described the harmonic tetrads in the projective lines for q = 4, 5, 7, 8, 9.
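The definition can be made concrete by brute-force enumeration. The sketch below lists one representative per class U[a, b] of P(Z/nZ), using the ideal condition above; for Z/nZ that condition reduces to gcd(a, b, n) = 1.

```python
from math import gcd
from itertools import product

def projective_line_points(n):
    # Enumerate P(Z/nZ): classes U[a, b] with aZ_n + bZ_n = Z_n.
    units = [u for u in range(n) if gcd(u, n) == 1]
    seen, reps = set(), []
    for a, b in product(range(n), repeat=2):
        if gcd(gcd(a, b), n) != 1:   # ideal (a, b) must be all of Z/nZ
            continue
        orbit = frozenset(((u*a) % n, (u*b) % n) for u in units)
        if orbit not in seen:        # one representative per equivalence class
            seen.add(orbit)
            reps.append((a, b))
    return reps

print(projective_line_points(3))        # the 4 points over GF(3)
print(len(projective_line_points(6)))   # 12 points over Z/6Z
```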
Over discrete rings
Consider when n is
|
https://en.wikipedia.org/wiki/Lite-On
|
Lite-On (also known as LiteOn and LiteON) is a Taiwanese company that primarily manufactures consumer electronics, including LEDs, semiconductors, computer chassis, monitors, motherboards, optical disc drives, and other electronic components. The Lite-On group also consists of some non-electronic companies like a finance arm and a cultural company.
History
Lite-On was started in 1975 by several Taiwanese Texas Instruments ex-employees. The original line of business was optical products (LEDs). They then branched out into computer power supplies by starting the Power Conversion Division. Other divisions were soon to follow.
In 1983 Lite-On Electronics made its initial public offering, becoming the first technology company listed on the Taiwan Stock Exchange, with stock code 2301.
In 2003 Lite-On appointed Dragon Group as its sole distributor in Indonesia.
In 2006 Lite-On IT Corporation acquired BenQ Corporation's Optical Disk Drive Business to become one of the top 3 ODD manufacturers in the world.
In March 2007, Lite-On IT Corporation formed a joint venture with Koninklijke Philips Electronics N.V. for their optical disc drive division as Philips & Lite-On Digital Solutions Corporation (PLDS).
Kioxia (formerly Toshiba Memory) announced on August 30, 2019, that it had signed a definitive agreement to acquire Lite-On's SSD business. The transaction closed in 2020.
See also
List of companies of Taiwan
References
External links
Components, trading, and service site (US)
PLDS new site
Companies based in Taipei
Electronics companies established in 1975
Computer storage companies
Electronics companies of Taiwan
Taiwanese brands
1975 establishments in Taiwan
2020 mergers and acquisitions
|
https://en.wikipedia.org/wiki/Switched%20Multi-megabit%20Data%20Service
|
Switched Multi-megabit Data Service (SMDS) was a connectionless service used to connect LANs, MANs and WANs to exchange data in the early 1990s. In Europe, the service was known as Connectionless Broadband Data Service (CBDS).
SMDS was specified by Bellcore, and was based on the IEEE 802.6 metropolitan area network (MAN) standard, as implemented by Bellcore, and used cell relay transport, Distributed Queue Dual Bus layer-2 switching arbitrator, and standard SONET or G.703 as access interfaces.
It is a switching service that provides data transmission in the range between 1.544 Mbit/s (T1 or DS1) to 45 Mbit/s (T3 or DS3). SMDS was developed by Bellcore as an interim service until Asynchronous Transfer Mode matured. SMDS was notable for its initial introduction of the 53-byte cell and cell switching approaches, as well as the method of inserting 53-byte cells onto G.703 and SONET. In the mid-1990s, SMDS was replaced, largely by Frame Relay.
References
External links
Cisco guide to SMDS
SMDS | SIP Protocol
Computer networking
|
https://en.wikipedia.org/wiki/Available%20bit%20rate
|
Available bit rate (ABR) is a service used in ATM networks when the source and destination do not need to be synchronized. ABR does not guarantee against delay or data loss. ABR mechanisms allow the network to allocate the available bandwidth fairly over the present ABR sources. ABR is one of five service categories defined by the ATM Forum for use in an ATM network.
The network switches use locally available information to determine the explicit allowable rates or relative rate (increase/decrease) for the source. The newly calculated rates are then sent to the sources using resource management cells (RM-cells). RM-cells are generated by the source, travel along the data path to the destination, and are sent back. ABR sets a minimum cell rate (MCR) and a peak cell rate (PCR). When transfers exceed the PCR, cells are dropped.
Many implementers consider ABR to be overly complex, and its adoption has been modest.
See also
External links
Understanding the Available Bit Rate (ABR) Service Category for ATM VCs
Network protocols
Asynchronous Transfer Mode
|
https://en.wikipedia.org/wiki/Tunneling%20protocol
|
In computer networks, a tunneling protocol is a communication protocol which allows for the movement of data from one network to another. It involves allowing private network communications to be sent across a public network (such as the Internet) through a process called encapsulation.
Because tunneling repackages the traffic data into a different form, perhaps with encryption as standard, a tunnel can hide the nature of the traffic it carries.
The tunneling protocol works by using the data portion of a packet (the payload) to carry the packets that actually provide the service. Tunneling uses a layered protocol model such as those of the OSI or TCP/IP protocol suite, but usually violates the layering when using the payload to carry a service not normally provided by the network. Typically, the delivery protocol operates at an equal or higher level in the layered model than the payload protocol.
Uses
A tunneling protocol may, for example, allow a foreign protocol to run over a network that does not support that particular protocol, such as running IPv6 over IPv4.
Another important use is to provide services that are impractical or unsafe to be offered using only the underlying network services, such as providing a corporate network address to a remote user whose physical network address is not part of the corporate network.
Circumventing firewall policy
Users can also use tunneling to "sneak through" a firewall, using a protocol that the firewall would normally block, but "wrapped" inside a protocol that the firewall does not block, such as HTTP. If the firewall policy does not specifically exclude this kind of "wrapping", this trick can function to get around the intended firewall policy (or any set of interlocked firewall policies).
Another HTTP-based tunneling method uses the HTTP CONNECT method/command. A client issues the HTTP CONNECT command to an HTTP proxy. The proxy then makes a TCP connection to a particular server:port, an
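A minimal Python sketch of the CONNECT handshake just described; proxy.example.net:8080 and the target are placeholder values, and error handling is omitted.

```python
import socket

# Hypothetical proxy and target; CONNECT asks the proxy to open a raw TCP tunnel.
PROXY_HOST, PROXY_PORT = "proxy.example.net", 8080
TARGET = "example.com:443"

s = socket.create_connection((PROXY_HOST, PROXY_PORT))
s.sendall(f"CONNECT {TARGET} HTTP/1.1\r\nHost: {TARGET}\r\n\r\n".encode())
reply = s.recv(4096).decode(errors="replace")
print(reply.splitlines()[0])   # e.g. "HTTP/1.1 200 Connection established"
# After a 200 reply, bytes written to s flow through the proxy to the target,
# so a TLS handshake (or any other protocol) can proceed over the tunnel.
```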
|
https://en.wikipedia.org/wiki/Wireless%20device%20radiation%20and%20health
|
The antennas contained in mobile phones, including smartphones, emit radiofrequency (RF) radiation (non-ionizing "radio waves" such as microwaves); the parts of the head or body nearest to the antenna can absorb this energy and convert it to heat. Since at least the 1990s, scientists have researched whether the now-ubiquitous radiation associated with mobile phone antennas or cell phone towers is affecting human health. Mobile phone networks use various bands of RF radiation, some of which overlap with the microwave range. Other digital wireless systems, such as data communication networks, produce similar radiation.
In response to public concern, the World Health Organization (WHO) established the International EMF (Electric and Magnetic Fields) Project in 1996 to assess the scientific evidence of possible health effects of EMF in the frequency range from 0 to 300 GHz. They have stated that although extensive research has been conducted into possible health effects of exposure to many parts of the frequency spectrum, all reviews conducted so far have indicated that, as long as exposures are below the limits recommended in the ICNIRP (1998) EMF guidelines, which cover the full frequency range from 0–300 GHz, such exposures do not produce any known adverse health effect. In 2011, International Agency for Research on Cancer (IARC), an agency of the WHO, classified wireless radiation as Group 2B – possibly carcinogenic. That means that there "could be some risk" of carcinogenicity, so additional research into the long-term, heavy use of wireless devices needs to be conducted. The WHO states that "A large number of studies have been performed over the last two decades to assess whether mobile phones pose a potential health risk. To date, no adverse health effects have been established as being caused by mobile phone use."
International guidelines on exposure levels to microwave frequency EMFs such as ICNIRP limit the power levels of wireless devices and it is uncommon
|
https://en.wikipedia.org/wiki/Steam%20shovel
|
A steam shovel is a large steam-powered excavating machine designed for lifting and moving material such as rock and soil. It is the earliest type of power shovel or excavator. Steam shovels played a major role in public works in the 19th and early 20th century, being key to the construction of railroads and the Panama Canal. The development of simpler, cheaper diesel-powered shovels caused steam shovels to fall out of favor in the 1930s.
History
Origins and development
Grimshaw of Boulton & Watt devised the first steam-powered excavator in 1796. In 1833 William Brunton patented another steam-powered excavator, on which he provided further details in 1836. The steam shovel was invented by William Otis, who received a patent for his design in 1839. The first machines were known as 'partial-swing', since the boom could not rotate through 360 degrees. They were built on a railway chassis, on which the boiler and movement engines were mounted. The shovel arm and driving engines were mounted at one end of the chassis, which accounts for the limited swing. Bogies with flanged wheels were fitted, and power was taken to the wheels by a chain drive to the axles. Temporary rail tracks were laid by workers where the shovel was expected to work, and repositioned as required.
Steam shovels became more popular in the latter half of the nineteenth century. Originally configured with chain hoists, the advent of steel cable in the 1870s allowed for easier rigging to the winches.
Later machines were supplied with caterpillar tracks, obviating the need for rails.
The full-swing, 360° revolving shovel was developed in England in 1884, and became the preferred format for these machines.
Growth and uses
Expanding railway networks (in the US and the UK) fostered a demand for steam shovels. The extensive mileage of railways, and corresponding volume of material to be moved, forced the technological leap. As a result, steam shovels became commonplace.
American manufacturers include
|
https://en.wikipedia.org/wiki/C-One
|
The C-One is a single-board computer (SBC) created in 2002 as an enhanced version of the Commodore 64, a home computer popular in the 1980s. Designed by Jeri Ellsworth and Jens Schönfeld from Individual Computers, who manufactured the boards themselves, the C-One has been re-engineered to allow cloning of other 8-bit computers.
Design
The machine uses a combination of configurable Altera field-programmable gate array (FPGA) chips and modular CPU expansion cards to create compatibility modes that duplicate the function of many older home computers. The default CPU is the W65C816S (by Western Design Center) which is used in Commodore 64 compatibility mode as well as the C-One's native operating mode. The C-One is not merely a software emulator, it loads various core files from a card to configure the FPGA hardware to recreate the operation of the core logic chipsets found in vintage computers. This provides for a very accurate and customizable hardware emulation platform. The C-One is not limited to recreating historical computers: its programmable core logic can be used to create entirely new custom computer designs.
In 2004, the platform was expanded to include an Amstrad CPC core made by Tobias Gubener.
In 2006, Peter Wendrich ported his FPGA-64 project (originally intended for a Xilinx FPGA) and enhanced it for the C-One. This core supported both PAL and NTSC machine emulation, and aimed to be cycle-exact and emulate many of the bugs and quirks of the original hardware.
In 2008, after development of an "Extender" card which added a third FPGA, Tobias Gubener added Amiga 500 compatibility by porting Dennis van Weeren's Minimig code to the board. This core replaced the physical 68000 CPU and the PIC chip from the original with his own TG68 CPU core on the FPGA. Subsequent developments to this core include features not possible with the original Minimig board.
In 2009, Peter Wendrich released a "preview" of a next-generation C64 core called "Chameleon 64", with a great
|
https://en.wikipedia.org/wiki/Exponential%20stability
|
In control theory, a continuous linear time-invariant (LTI) system is exponentially stable if and only if the system has eigenvalues (i.e., the poles of input-to-output systems) with strictly negative real parts (i.e., in the left half of the complex plane). A discrete-time input-to-output LTI system is exponentially stable if and only if the poles of its transfer function lie strictly within the unit circle centered on the origin of the complex plane. Systems that are not LTI are exponentially stable if their convergence is bounded by exponential decay.
Exponential stability is a form of asymptotic stability, valid for more general dynamical systems.
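In the LTI case, the eigenvalue criterion is easy to test numerically. A small sketch, with two illustrative matrices of our own choosing:

```python
import numpy as np

# Test exponential stability of x' = A x by checking eigenvalue real parts.
def is_exponentially_stable(A):
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # poles at -1 and -2
print(is_exponentially_stable(A))           # True

B = np.array([[0.0, 1.0], [-1.0, 0.0]])    # poles at +i and -i: only marginal
print(is_exponentially_stable(B))           # False
```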
Practical consequences
An exponentially stable LTI system is one that will not "blow up" (i.e., give an unbounded output) when given a finite input or non-zero initial condition. Moreover, if the system is given a fixed, finite input (i.e., a step), then any resulting oscillations in the output will decay at an exponential rate, and the output will tend asymptotically to a new final, steady-state value. If the system is instead given a Dirac delta impulse as input, then induced oscillations will die away and the system will return to its previous value. If oscillations do not die away, or the system does not return to its original output when an impulse is applied, the system is instead marginally stable.
Example exponentially stable LTI systems
The graph on the right shows the impulse responses of two similar systems, one drawn in green and one in blue. Although one response is oscillatory, both return to the original value of 0 over time.
Real-world example
Imagine putting a marble in a ladle. It will settle itself into the lowest point of the ladle and, unless disturbed, will stay there. Now imagine giving the ball a push, which is an approximation to a Dirac delta impulse. The marble will roll back and forth but eventually resettle in the lowest point of the ladle.
|
https://en.wikipedia.org/wiki/Peter%20Durand
|
Peter Durand (21 October 1766 – 23 July 1822) was an English merchant who is widely credited with receiving the first patent for the idea of preserving food using tin cans. The patent (No 3372) was granted on August 25, 1810, by King George III of the United Kingdom.
The patent specifies that it was issued to Peter Durand, a merchant of Hoxton Square, Middlesex, United Kingdom, for a method of preserving animal food, vegetable food and other perishable articles using various vessels made of glass, pottery, tin or other suitable metals. The preservation procedure was to fill up a vessel with food and cap it. Vegetables were to be put in raw, whereas animal substances might either be raw or half-cooked. Then the whole item was to be heated by any means, such as an oven, stove or a steam bath, but most conveniently by immersing in water and boiling it. The boiling time was not specified, and was said to depend on the food and vessel size. Neither was the patent clear on the preservation time, which was merely said to be "long". The cap was to be partly open during the whole heating and cooling procedure, but right after that, the vessel should be sealed airtight by any means, such as a cork plug, a screw-cap with a rubber seal, cementing, etc.
In his patent, Durand clearly mentions that the idea of the invention was communicated to him more than a year earlier by a friend abroad. Extensive research in 19th century archives has revealed that this friend was the French inventor Philippe de Girard. The relation between Durand and Girard was never publicized, and the credit for the first canned food patent remains with Durand.
The patent itself consists of two distinct parts: first, the description of the original idea, and second, observations by Durand himself. Durand was clearly suspicious of the invention. However, having a curious mind, he performed a thorough test of it by himself, sealing meat, soups and milk, and boiling them as described. The original inventor had only exper
|
https://en.wikipedia.org/wiki/Marginal%20stability
|
In the theory of dynamical systems and control theory, a linear time-invariant system is marginally stable if it is neither asymptotically stable nor unstable. Roughly speaking, a system is stable if it always returns to and stays near a particular state (called the steady state), and is unstable if it goes farther and farther away from any state, without being bounded. A marginal system, sometimes referred to as having neutral stability, is between these two types: when displaced, it does not return to near a common steady state, nor does it go away from where it started without limit.
Marginal stability, like instability, is a feature that control theory seeks to avoid; we wish that, when perturbed by some external force, a system will return to a desired state. This necessitates the use of appropriately designed control algorithms.
In econometrics, the presence of a unit root in observed time series, rendering them marginally stable, can lead to invalid regression results regarding effects of the independent variables upon a dependent variable, unless appropriate techniques are used to convert the system to a stable system.
Continuous time
A homogeneous continuous linear time-invariant system is marginally stable if and only if the real part of every pole (eigenvalue) in the system's transfer-function is non-positive, one or more poles have zero real part, and all poles with zero real part are simple roots (i.e. the poles on the imaginary axis are all distinct from one another). In contrast, if all the poles have strictly negative real parts, the system is instead asymptotically stable. If the system is neither stable nor marginally stable, it is unstable.
If the system is in state space representation, marginal stability can be analyzed by deriving the Jordan normal form: the system is marginally stable if and only if the Jordan blocks corresponding to poles with zero real part are scalar.
Discrete time
A homogeneous discrete time linear time-invariant syst
|
https://en.wikipedia.org/wiki/Default%20route
|
In computer networking, the default route is a configuration of the Internet Protocol (IP) that establishes a forwarding rule for packets when no specific address of a next-hop host is available from the routing table or other routing mechanisms.
The default route is generally the address of another router, which treats the packet the same way: if a route matches, the packet is forwarded accordingly, otherwise the packet is forwarded to the default route of that router. The route evaluation process in each router uses the longest prefix match method to obtain the most specific route. The network with the longest subnet mask or network prefix that matches the destination IP address is the next-hop network gateway. The process repeats until a packet is delivered to the destination host, or earlier along the route, when a router has no default route available and cannot route the packet otherwise. In the latter case, the packet is dropped and an ICMP Destination Unreachable message may be returned. Each router traversal counts as one hop in the distance calculation for the transmission path.
The device to which the default route points is often called the default gateway, and it often carries out other functions such as packet filtering, firewalling, or proxy server operations.
The default route in Internet Protocol Version 4 (IPv4) is designated as the zero address 0.0.0.0/0 in CIDR notation. Similarly, in IPv6, the default route is specified by ::/0. The subnet mask is specified as /0, which effectively specifies all networks and is the shortest match possible. A route lookup that does not match any other rule falls back to this route.
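A sketch of longest-prefix matching with a default route, using Python's standard ipaddress module; the route table and interface names are invented for illustration.

```python
import ipaddress

# A tiny routing table: most-specific prefix wins; 0.0.0.0/0 is the default route.
routes = {
    ipaddress.ip_network("10.0.0.0/8"):  "eth1",
    ipaddress.ip_network("10.1.0.0/16"): "eth2",
    ipaddress.ip_network("0.0.0.0/0"):   "eth0",   # default route
}

def next_hop(dst):
    addr = ipaddress.ip_address(dst)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)  # longest prefix match
    return routes[best]

print(next_hop("10.1.2.3"))   # eth2 (the /16 is the most specific match)
print(next_hop("8.8.8.8"))    # eth0 (no specific route, falls back to default)
```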
In the highest-level segment of a network, administrators generally point the default route for a given host towards the router that has a connection to a network service provider. Therefore, packets with destinations outside the organization's LAN, typically destinations on the Internet or a wide area network, are forwarded to the router with t
|
https://en.wikipedia.org/wiki/Datacasting
|
Datacasting (data broadcasting) is the broadcasting of data over a wide area via radio waves. It most often refers to supplemental information sent by television stations along with digital terrestrial television (DTT), but may also be applied to digital signals on analog TV or radio. It generally does not apply to data inherent to the medium, such as PSIP data that defines virtual channels for DTT or direct broadcast satellite system, or to things like cable modems or satellite modems, which use a completely separate channel for data.
Overview
Datacasting often provides news, weather forecasting, traffic reporting, stock market, and other information which may or may not relate to the carried programs. It may also be interactive, such as gaming, shopping, or education. An electronic program guide is usually included, although it somewhat stretches the definition, as this is often considered inherent to the digital broadcast standard.
The ATSC, DVB and ISDB standards allow for broadband datacasting via DTT, though they do not necessarily define how. The overscan and VBI are used for analog TV, for moderate and low bandwidths (including closed captioning in the VBI) respectively. DirectBand and RDS/RBDS are medium and narrow subcarriers used for analog FM radio. The EUREKA 147 and HD Radio standards both allow for datacasting on digital radio, defining a few basics but also allowing for later expansion.
The term IP Datacasting (IPDC) is used in DVB-H for the technical elements required to send IP packets over DVB-H broadband downstream channel combined with a return channel over a mobile communications network such as GPRS or UMTS. The set of specifications for IP Datacast (phase1) was approved by the DVB project in October 2005.
Datacasting services around the world
North America
Ambient Information Network
Ambient Information Network, a datacasting network owned by Ambient Devices presently hosted by U.S.A. Mobility, a U.S. paging service which focuses on i
|
https://en.wikipedia.org/wiki/Cold%20hardening
|
Cold hardening is the physiological and biochemical process by which an organism prepares for cold weather.
Plants
Plants in temperate and polar regions adapt to winter and sub-zero temperatures by relocating nutrients from leaves and shoots to storage organs. Freezing temperatures induce dehydrative stress on plants, as water absorption in the root and water transport in the plant decrease. Water in and between cells in the plant freezes and expands, causing tissue damage. Cold hardening is a process in which a plant undergoes physiological changes to avoid or mitigate cellular injuries caused by sub-zero temperatures. Non-acclimatized individuals can survive −5 °C, while an acclimatized individual in the same species can survive −30 °C. Plants that originated in the tropics, like tomato or maize, do not go through cold hardening and are unable to survive freezing temperatures. The plant starts the adaptation by exposure to cold, yet still not freezing, temperatures. The process can be divided into three steps. First the plant perceives low temperature, then converts the signal to activate or repress expression of appropriate genes. Finally, it uses these genes to combat the stress that sub-zero temperatures cause in its living cells. Many of the genes and responses to low temperature stress are shared with other abiotic stresses, like drought or salinity.
When temperature drops, the membrane fluidity, RNA and DNA stability, and enzyme activity change. These, in turn, affect transcription, translation, intermediate metabolism, and photosynthesis, leading to an energy imbalance. This energy imbalance is thought to be one of the ways the plant detects low temperature. Experiments on arabidopsis show that the plant detects the change in temperature, rather than the absolute temperature. The rate of temperature drop is directly connected to the magnitude of calcium influx, from the space between cells, into the cell. Calcium channels in the cell membrane de
|
https://en.wikipedia.org/wiki/Cognitive%20development
|
Cognitive development is a field of study in neuroscience and psychology focusing on a child's development in terms of information processing, conceptual resources, perceptual skill, language learning, and other aspects of the developed adult brain and cognitive psychology. Qualitative differences between how a child processes their waking experience and how an adult processes their waking experience are acknowledged (such as object permanence, the understanding of logical relations, and cause-effect reasoning in school-age children). Cognitive development is defined as the emergence of the ability to consciously cognize, understand, and articulate this understanding in adult terms. Cognitive development is how a person perceives, thinks, and gains understanding of their world through the interaction of genetic and learning factors. There are four stages of cognitive information development: reasoning, intelligence, language, and memory. These stages start when the baby is about 18 months old; as they play with toys, listen to their parents speak, and watch TV, anything that catches their attention helps build their cognitive development.
Jean Piaget was a major force in establishing this field, forming his "theory of cognitive development". Piaget proposed four stages of cognitive development: the sensorimotor, preoperational, concrete operational, and formal operational periods. Many of Piaget's theoretical claims have since fallen out of favor, but his description of the most prominent changes in cognition with age is generally still accepted today (e.g., early perception depends on concrete, external actions; later, an abstract understanding of observable aspects of reality can be captured, leading to the discovery of underlying abstract rules and principles, usually starting in adolescence).
In recent years, however, alternative models have been advanced, including information-processing theory, neo-Piagetian theories of cognitive develo
|
https://en.wikipedia.org/wiki/Congressional%20Quarterly
|
Congressional Quarterly, Inc., or CQ, is part of a privately owned publishing company called CQ Roll Call that produces several publications reporting primarily on the United States Congress. CQ was acquired by the Economist Group and combined with Roll Call to form CQ Roll Call in 2009; CQ ceased to exist as a separate entity, and in July 2018, a deal was announced for the company to be acquired by FiscalNote.
History
CQ was founded in 1945 by Nelson Poynter and his wife, Henrietta Poynter, to provide a link between local newspapers and the complex politics within Washington, D.C.
Thomas N. Schroth, who had been managing editor of The Brooklyn Eagle, was elected in October 1955 as executive editor and vice president. Schroth built the publication's reputation for impartial coverage, with annual revenue growing from $150,000 when he started to $1.8 million. In addition to adding a book division, Schroth hired many staff members who went on to journalistic success, including David S. Broder, Neal R. Peirce, and Elizabeth Drew. He was fired from Congressional Quarterly in 1969, when festering disagreements with Poynter over editorial policy and over Schroth's advocacy of "more imaginative ways of doing things" reached a boil.
In 1965, Poynter summed up his reasons for founding CQ: "The federal government will never set up an adequate agency to check on itself, and a foundation is too timid for that. So it had to be a private enterprise beholden to its clients." Despite its name, CQ was published quarterly for only one year. Demand drove more frequent updates, first weekly, then daily. CQ was also an early leader in delivering information on a real-time basis, starting with a dial-up service in 1984. Its website dominates the online legislative tracking information market and has been nominated for several awards. In recent years, CQ has launched several web-only newsletters with a greater focus on particular areas, including CQ Homeland Security, CQ Bu
|
https://en.wikipedia.org/wiki/Barrel%20processor
|
A barrel processor is a CPU that switches between threads of execution on every cycle. This CPU design technique is also known as "interleaved" or "fine-grained" temporal multithreading. Unlike simultaneous multithreading in modern superscalar architectures, it generally does not allow execution of multiple instructions in one cycle.
As in preemptive multitasking, each thread of execution is assigned its own program counter and other hardware registers (its architectural state). A barrel processor can guarantee that each thread will execute one instruction every n cycles, unlike a preemptive multitasking machine, which typically runs one thread of execution for tens of millions of cycles while all other threads wait their turn.
A technique called C-slowing can automatically generate a corresponding barrel processor design from a single-tasking processor design. An n-way barrel processor generated this way acts much like n separate multiprocessing copies of the original single-tasking processor, each one running at roughly 1/n the original speed.
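The round-robin issue policy is simple enough to sketch in a few lines. The following Python toy is a minimal sketch, not drawn from any source on these machines; the thread-context class, the two-instruction toy ISA, and the demo programs are invented for illustration. It shows each thread keeping its own program counter and register while the barrel hands out exactly one issue slot per thread every n cycles:

class ThreadContext:
    """Per-thread architectural state: a program counter and a register."""
    def __init__(self, program):
        self.pc = 0
        self.acc = 0              # single accumulator register
        self.program = program    # list of (opcode, argument) pairs

def barrel_run(threads, cycles):
    """Issue one instruction per cycle, rotating across threads.

    Each thread gets one issue slot every len(threads) cycles,
    regardless of what the other threads are doing.
    """
    n = len(threads)
    for cycle in range(cycles):
        t = threads[cycle % n]    # barrel rotation: no scheduler logic
        if t.pc >= len(t.program):
            continue              # a finished thread burns its slot
        op, arg = t.program[t.pc]
        if op == "ADD":
            t.acc += arg
            t.pc += 1
        elif op == "JMP":
            t.pc = arg            # jump: set PC, no increment

threads = [ThreadContext([("ADD", i), ("ADD", i)]) for i in range(1, 4)]
barrel_run(threads, cycles=12)
print([t.acc for t in threads])   # -> [2, 4, 6]

Because the scheduler is nothing more than cycle % n, no arbitration hardware is needed at all, which mirrors the cost saving that motivated designs such as the CDC 6000 peripheral processors described below.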
History
One of the earliest examples of a barrel processor was the I/O processing system in the CDC 6000 series supercomputers. These executed one instruction (or a portion of an instruction) from each of 10 different virtual processors (called peripheral processors) before returning to the first processor. As described for the CDC 6000 series: "The peripheral processors are collectively implemented as a barrel processor. Each executes routines independently of the others. They are a loose predecessor of bus mastering or direct memory access."
One motivation for barrel processors was to reduce hardware costs. In the case of the CDC 6x00 PPUs, the digital logic of the processor was much faster than the core memory, so rather than having ten separate processors, there were ten separate core memory units for the PPUs, all sharing the single set of processor logic.
Another example is the Honeywell 800, which
|
https://en.wikipedia.org/wiki/Biovar
|
A biovar is a variant prokaryotic strain that differs physiologically or biochemically from other strains in a particular species. Morphovars (or morphotypes) are those strains that differ morphologically. Serovars (or serotypes) are those strains that have antigenic properties that differ from other strains.
List of biovars
This is a list of biovars and strains of biovars listed at the NCBI Taxonomy database:
Acinetobacter calcoaceticus biovar anitratus, a homotypic synonym for Acinetobacter calcoaceticus subsp. anitratus
Actinobacillus anseriformium biovar 1
Actinobacillus anseriformium biovar 2
Aeromonas veronii bv. sobria
Aeromonas veronii bv. veronii
Agrobacterium biovar 1, a synonym for Agrobacterium tumefaciens complex
Agrobacterium biovar 2, a synonym for Agrobacterium rhizogenes
Agrobacterium biovar 3, a synonym for Agrobacterium vitis
Bacillus cereus biovar anthracis
Bacillus cereus biovar anthracis str. CI
Bacillus thuringiensis biovar tenebrionis, a synonym for Bacillus thuringiensis serovar tenebrionis
Bacillus cereus biovar toyoi, a synonym for Bacillus toyonensis
Bacillus wiedmannii bv. thuringiensis
Pasteurella haemolytica biovar T, synonym of Bibersteinia trehalosi
Bifidobacterium longum bv. Suis, homotypic synonym of Bifidobacterium longum subsp. suis
Bisgaard taxon 3 biovar 1
Bradyrhizobium retamae bv. lupini
Bradyrhizobium valentinum bv. lupini
Brucella melitensis biovar Abortus, a synonym for Brucella abortus
Brucella melitensis biovar Abortus 2308, a synonym for Brucella abortus 2308
Brucella abortus bv. 1
Brucella abortus bv. 1 str. 9-941
Brucella abortus bv. 2
Brucella abortus bv. 3
Brucella abortus bv. 4
Brucella melitensis biovar Canis, a synonym for Brucella canis
Brucella melitensis biovar Melitensis
Brucella melitensis bv. 1
Brucella melitensis bv. 2
Brucella melitensis bv. 3
Brucella melitensis biovar Neotomae, synonym for Brucella neotomae
Brucella melitensis biovar Ovis, synonym for Brucella ovis
Brucella melitensis biovar Suis, syn
|
https://en.wikipedia.org/wiki/Windows%20Mobile
|
Windows Mobile was a family of mobile operating systems developed by Microsoft for smartphones and personal digital assistants.
Its origin dated back to Windows CE in 1996, though Windows Mobile itself first appeared in 2000 as Pocket PC 2000, which ran on Pocket PC PDAs. It was renamed "Windows Mobile" in 2003, at which point it came in several versions (similar to the desktop versions of Windows) and was aimed at business and enterprise consumers. When initially released in the mid-2000s, it was intended to be the portable equivalent of the Windows desktop OS: a major force in the then-emerging mobile/portable market.
Following the rise of newer smartphone operating systems (iOS and Android), Windows Mobile never matched their success and faded rapidly in the following years. In February 2010, Microsoft announced the more modern and consumer-focused Windows Phone to supersede Windows Mobile. As a result, Windows Mobile was deprecated, since existing devices and software are incompatible with Windows Phone. The last version of Windows Mobile, released after the announcement of Windows Phone, was 6.5.5. After this, Microsoft ceased development on Windows Mobile in order to concentrate on Windows Phone.
In 2015, Microsoft released the similarly named Windows 10 Mobile as part of the Windows Phone series; it is unrelated to the former Windows Mobile operating systems.
Features
Most versions of Windows Mobile have a standard set of features, such as multitasking and the ability to navigate a file system similar to that of Windows 9x and Windows NT, including support for many of the same file types. Similarly to its desktop counterpart, it comes bundled with a set of applications that perform basic tasks. Internet Explorer Mobile is the default web browser, and Windows Media Player is the default media player used for playing digital media. The mobile version of Microsoft Office is the default office suite.
Internet Connection Sharing, supported on compatible devices, allows
|
https://en.wikipedia.org/wiki/M.%20A.%20Foster
|
Michael Anthony Foster (July 2, 1939 – November 14, 2020) was an American science fiction writer from Greensboro, North Carolina. He spent over sixteen years as a Captain and Russian linguist in the United States Air Force.
"Ler" books
Foster wrote a loosely connected trilogy about an offshoot of humanity called the Ler: The Warriors of Dawn (1975), The Gameplayers of Zan (1977), and The Day of The Klesh (1979). The Gameplayers of Zan takes place much earlier than the others, on Earth, and details the Ler's departure from Earth; the other two books cover two separate human-Ler encounters on other planets. The "game" from the title of The Gameplayers of Zan is based on cellular automata, a more intricate version of Conway's Game of Life.
The Warriors of Dawn mostly concerns the relationship between a human man and a ler woman, and The Day of The Klesh represents the ler as a mostly inscrutable humanoid race. The Gameplayers of Zan, on the other hand, discusses the origins of the Ler as an engineered offshoot of humanity, and is as much about ler culture as about their interactions with humanity. Most of the action in this book takes place in Uwharrie National Forest, North Carolina.
Ler reproduce infrequently, only becoming fertile at age thirty after a long adolescence: they experience two fertility periods, five years apart. Hence most ler females have only two children, but occasionally they have a third, and twins are not unknown.
Ler family structure is organized around a "braid," which they have designed to preserve maximum genetic diversity to offset their low initial population and small birth rate. A braid starts with two "fore-parents". They mate and produce the "elder outsibling". Then each of the fore-parents goes forth and brings back another ler of the appropriate gender, the "after-parents". The fore-parents each mate with an after-parent and produce the "insiblings", five years younger than the elder outsibling. Then the after-parents mate and produc
|
https://en.wikipedia.org/wiki/Microprinting
|
Microprinting is the production of recognizable patterns or characters in a printed medium at a scale that typically requires magnification to read with the naked eye. To the unaided eye, the text may appear as a solid line. Attempts to reproduce it by photocopying, image scanning, or pantograph typically render it as a dotted or solid line, unless the reproduction method can identify and recreate patterns at such a scale. Microprint is predominantly used as an anti-counterfeiting technique, because it cannot be easily reproduced by widespread digital methods.
While microphotography precedes microprint, microprint was significantly influenced by Albert Boni in 1934, when he was inspired by his friend, the writer and editor Manuel Komroff, who was showing his experiments with enlarging photographs. It occurred to Boni that if he could reduce rather than enlarge photographs, this technology might enable publishing companies and libraries to access much greater quantities of data at a minimal cost in material and storage space. Over the following decade, Boni worked to develop microprint, a micro-opaque process in which pages were photographed using 35mm microfilm and printed on cards using offset lithography. This process produced a 6" by 9" index card that stored 100 pages of text from the normal-sized publications he was reproducing. Boni founded the Readex Microprint company to produce and license this technology. He also published an article, A Guide to the Literature of Photography and Related Subjects (1943), which appeared in a supplemental 18th issue of the Photo-Lab Index.
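As a rough back-of-the-envelope check of that storage figure, the implied reduction factor can be computed directly. This is a sketch under stated assumptions: the 10 x 10 grid layout and the 6" by 9" original page size are guesses for illustration, not details given above.

import math

pages = 100
grid = math.isqrt(pages)            # assume a square 10 x 10 layout
card_w, card_h = 6.0, 9.0           # card size in inches (from the text)
orig_w, orig_h = 6.0, 9.0           # assumed size of the original pages

cell_w, cell_h = card_w / grid, card_h / grid
linear_reduction = max(orig_w / cell_w, orig_h / cell_h)
print(f'{cell_w:.2f}" x {cell_h:.2f}" per page image, '
      f"{linear_reduction:.0f}x linear reduction")

At a roughly 10x linear reduction, 10-point body text shrinks to about 1 point, consistent with the observation that microprint requires magnification to read.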
Usage
Currency commonly exhibits the highest quality (smallest size) of microprint because it demands the highest level of counterfeiting deterrence. For example, on the series 2004 United States $20 bill, microprint is hidden within the border in the lower left corner of the obverse (front) side, as well as the Twenty USA background.
Bank cheques
|
https://en.wikipedia.org/wiki/Proper%20transfer%20function
|
In control theory, a proper transfer function is a transfer function in which the degree of the numerator does not exceed the degree of the denominator. A strictly proper transfer function is a transfer function where the degree of the numerator is less than the degree of the denominator.
The difference between the degree of the denominator (number of poles) and degree of the numerator (number of zeros) is the relative degree of the transfer function.
Example
The following transfer function:
$$\mathbf{G}(s) = \frac{\mathbf{N}(s)}{\mathbf{D}(s)} = \frac{s^{4} + n_{1}s^{3} + n_{2}s^{2} + n_{3}s + n_{4}}{s^{4} + d_{1}s^{3} + d_{2}s^{2} + d_{3}s + d_{4}}$$
is proper, because
$$\deg(\mathbf{N}(s)) = 4 \leq \deg(\mathbf{D}(s)) = 4,$$
and biproper, because
$$\deg(\mathbf{N}(s)) = 4 = \deg(\mathbf{D}(s)),$$
but is not strictly proper, because
$$\deg(\mathbf{N}(s)) = 4 \not< \deg(\mathbf{D}(s)) = 4.$$
The following transfer function is not proper (or strictly proper)
$$\mathbf{G}(s) = \frac{\mathbf{N}(s)}{\mathbf{D}(s)} = \frac{s^{4} + n_{1}s^{3} + n_{2}s^{2} + n_{3}s + n_{4}}{d_{1}s^{3} + d_{2}s^{2} + d_{3}s + d_{4}}$$
because
$$\deg(\mathbf{N}(s)) = 4 > \deg(\mathbf{D}(s)) = 3.$$
A transfer function that is not proper can be made proper by using the method of long division.
The following transfer function is strictly proper
$$\mathbf{G}(s) = \frac{\mathbf{N}(s)}{\mathbf{D}(s)} = \frac{n_{1}s^{3} + n_{2}s^{2} + n_{3}s + n_{4}}{s^{4} + d_{1}s^{3} + d_{2}s^{2} + d_{3}s + d_{4}}$$
because
$$\deg(\mathbf{N}(s)) = 3 < \deg(\mathbf{D}(s)) = 4.$$
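These degree comparisons are mechanical enough to script. The following Python sketch is an illustration, not from the article; the example polynomials are arbitrary. It uses sympy to classify a transfer function by degree and applies polynomial long division to split an improper one into a polynomial part plus a strictly proper remainder:

from sympy import div, Poly, symbols

s = symbols("s")

def classify(num, den):
    """Return the properness class of N(s)/D(s) by comparing degrees."""
    dn, dd = Poly(num, s).degree(), Poly(den, s).degree()
    if dn < dd:
        return "strictly proper"
    if dn == dd:
        return "biproper"        # proper, but not strictly proper
    return "not proper"

N = s**4 + 2*s + 1               # deg N = 4
D = s**3 + 3*s**2 + s + 2        # deg D = 3
print(classify(N, D))            # -> not proper

# Long division writes N/D = Q + R/D with deg R < deg D,
# so the remainder term R/D is strictly proper.
Q, R = div(N, D, s)
print(Q, R)                      # -> s - 3, 8*s**2 + 3*s + 7
print(classify(R, D))            # -> strictly proper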
Implications
A proper transfer function will never grow unbounded as the frequency approaches infinity:
$$|\mathbf{G}(\pm j\infty)| = K < \infty.$$
A strictly proper transfer function will approach zero as the frequency approaches infinity (which is true for all physical processes):
$$|\mathbf{G}(\pm j\infty)| = 0.$$
Also, the integral of the real part of a strictly proper transfer function is zero.
|
https://en.wikipedia.org/wiki/Bill%20Roscoe
|
Andrew William Roscoe is a Scottish computer scientist. He was Head of the Department of Computer Science, University of Oxford from 2003 to 2014, and is a Professor of Computer Science. He is also a Fellow of University College, Oxford.
Education and career
Roscoe was born in Dundee, Scotland. He studied for a degree in mathematics at University College, Oxford, from 1975 to 1978, graduating with the top mark in his year in the university. He went on to work at the Computing Laboratory and received his DPhil in 1982. He was appointed Tutorial Fellow at University College in 1983 and served as Senior Tutor from 1993 to 1997. He was head of the Department of Computer Science from 2003 to 2008 and from 2009 to 2014.
Research
Professor Roscoe works in the area of concurrency theory, in particular the semantic underpinnings of Communicating Sequential Processes (CSP) and the associated occam programming language, in collaboration with Sir Tony Hoare. He co-founded Formal Systems (Europe) Limited and worked on the algorithms for the Failures-Divergence Refinement (FDR) tool.
External links
Bill Roscoe home page
|
https://en.wikipedia.org/wiki/List%20of%20network%20protocols%20%28OSI%20model%29
|
This article lists protocols, categorized by the nearest layer in the Open Systems Interconnection model. The list is not limited to the OSI protocol family. Many of these protocols are originally based on the Internet protocol suite (TCP/IP) and other models, and they often do not fit neatly into OSI layers.
Layer 1 (physical layer)
Telephone network modems
IrDA physical layer
USB physical layer
EIA RS-232, EIA-422, EIA-423, RS-449, RS-485
Ethernet physical layer 10BASE-T, 10BASE2, 10BASE5, 100BASE-TX, 100BASE-FX, 1000BASE-T, 1000BASE-SX and other varieties
Varieties of 802.11 Wi-Fi physical layers
DSL
ISDN
T1 and other T-carrier links, and E1 and other E-carrier links
ITU Recommendations: see ITU-T
IEEE 1394 interfaces
TransferJet
Etherloop
ARINC 818 Avionics Digital Video Bus
G.hn/G.9960 physical layer
CAN bus (controller area network) physical layer
Mobile Industry Processor Interface physical layer
Infrared
Frame Relay
FO Fiber optics
X.25
Layer 2 (data link layer)
ARCnet Attached Resource Computer NETwork
ARP Address Resolution Protocol
ATM Asynchronous Transfer Mode
CHAP Challenge Handshake Authentication Protocol
CDP Cisco Discovery Protocol
DCAP Data Link Switching Client Access Protocol
Distributed Multi-Link Trunking
Distributed Split Multi-Link Trunking
DTP Dynamic Trunking Protocol
Econet
Ethernet
FDDI Fiber Distributed Data Interface
Frame Relay
ITU-T G.hn
HDLC High-Level Data Link Control
IEEE 802.11 WiFi
IEEE 802.16 WiMAX
LACP Link Aggregation Control Protocol
LattisNet
LocalTalk
L2F Layer 2 Forwarding Protocol
L2TP Layer 2 Tunneling Protocol
LLDP Link Layer Discovery Protocol
LLDP-MED Link Layer Discovery Protocol - Media Endpoint Discovery
MAC Media Access Control
Q.710 Simplified Message Transfer Part
Multi-link trunking Protocol
NDP Neighbor Discovery Protocol
PAgP - Cisco Systems proprietary link aggregation protocol
PPP Point-to-Point Protocol
PPTP Point-to-Point Tunneling Protocol
|
https://en.wikipedia.org/wiki/Mekong%20River%20Commission
|
The Mekong River Commission (MRC) is an "...inter-governmental organisation that works directly with the governments of Cambodia, Laos, Thailand, and Vietnam to jointly manage the shared water resources and the sustainable development of the Mekong River". Its mission is "To promote and coordinate sustainable management and development of water and related resources for the countries' mutual benefit and the people's well-being".
History
Mekong Committee (1957–1978)
The origins of the Mekong Committee are linked to the legacy of (de)colonialism in Indochina and subsequent geopolitical developments. The political, social, and economic conditions of the Mekong River basin countries have evolved dramatically since the 1950s, when the Mekong represented the "only large river left in the world, besides the Amazon, which remained virtually unexploited." The impetus for the creation of the Mekong cooperative regime progressed in tandem with the drive for the development of the lower Mekong, following the 1954 Geneva Conference, which granted Cambodia, Laos, and Vietnam independence from France. A 1957 United Nations Economic Commission for Asia and the Far East (ECAFE) report, Development of Water Resources in the Lower Mekong Basin, recommended development to the tune of 90,000 km2 of irrigation and 13.7 gigawatts (GW) of generating capacity from five dams. Based largely on the recommendations of ECAFE, the "Committee for Coordination on the Lower Mekong Basin" (known as the Mekong Committee) was established in September 1957 with the adoption of the Statute for the Committee for Coordination of Investigations into the Lower Mekong Basin. ECAFE's Bureau of Flood Control had prioritized the Mekong, of the 18 international waterways within its jurisdiction, in the hopes of creating a precedent for cooperation elsewhere. The committee has been called "one of the UN's earliest spin-offs", as the organization functioned under the aegis of the UN, with its Executive Agent (EA) chosen from the career staff of the United Nations Deve
|
https://en.wikipedia.org/wiki/NASA%20Astrobiology%20Institute
|
The NASA Astrobiology Institute (NAI) was established in 1998 by the National Aeronautics and Space Administration (NASA) "to develop the field of astrobiology and provide a scientific framework for flight missions." In December 2019 the institute's activities were suspended.
The NAI is a virtual, distributed organization that integrates astrobiology research and training programs in concert with the national and international science communities.
History
NASA had explored the idea of forming an astrobiology institute in the past, but when the Viking biological experiments returned negative results for life on Mars, the public lost interest and federal funds for exobiology dried up. In 1996, the announcement of possible traces of ancient life in the Allan Hills 84001 meteorite from Mars led to new interest in the subject. At the same time, NASA developed the Origins Program, broadening its reach from exobiology to astrobiology, the study of the origin, evolution, distribution, and future of life in the universe.
In 1998, $9 million was set aside to fund the NASA Astrobiology Institute (NAI), an interdisciplinary research effort drawing on the expertise of scientific research institutions and universities from across the country, centrally linked to Ames Research Center in Mountain View, California. Gerald Soffen, former Project Scientist with the Viking program, helped coordinate the new institute. In May, NASA selected eleven science teams, each with a Principal Investigator (PI). The NAI was established in July with Scott Hubbard as interim Director. Nobel laureate Baruch S. Blumberg was appointed the first Director of the institute and served from May 15, 1999, to October 14, 2002.
Program
The NASA Astrobiology Program includes the NAI as one of four components, alongside the Exobiology and Evolutionary Biology Program; the Astrobiology Science and Technology Instrument Development (ASTID) Program; and the Astrobiology Science and Technology for Explorin