Dataset columns: source (string, 32–199 characters) and text (string, 26–3k characters).
https://en.wikipedia.org/wiki/ASCII
ASCII, abbreviated from American Standard Code for Information Interchange, is a character encoding standard for electronic communication. ASCII codes represent text in computers, telecommunications equipment, and other devices. Because of technical limitations of computer systems at the time it was invented, ASCII has just 128 code points, of which only 95 are printable characters, which severely limited its scope. Modern computer systems have evolved to use Unicode, which has millions of code points, but the first 128 of these are the same as the ASCII set. The Internet Assigned Numbers Authority (IANA) prefers the name US-ASCII for this character encoding. ASCII is one of the IEEE milestones. Overview ASCII was developed from telegraph code. Its first commercial use was in the Teletype Model 33 and the Teletype Model 35 as a seven-bit teleprinter code promoted by Bell data services. Work on the ASCII standard began in May 1961, with the first meeting of the American Standards Association's (ASA) (now the American National Standards Institute or ANSI) X3.2 subcommittee. The first edition of the standard was published in 1963, underwent a major revision during 1967, and experienced its most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists and added features for devices other than teleprinters. The use of ASCII format for Network Interchange was described in 1969. That document was formally elevated to an Internet Standard in 2015. Originally based on the (modern) English alphabet, ASCII encodes 128 specified characters into seven-bit integers as shown by the ASCII chart in this article. Ninety-five of the encoded characters are printable: these include the digits 0 to 9, lowercase letters a to z, uppercase letters A to Z, and punctuation symbols. In addition, the original ASCII specification included 33 non-printing control codes which originated with Teletype machines; most of these are now obsolete, although a few are still commonly used, such as the carriage return, line feed, and tab codes. For example, lowercase i would be represented in the ASCII encoding by binary 1101001 = hexadecimal 69 (i is the ninth letter) = decimal 105. Despite being an American standard, ASCII does not have a code point for the cent (¢). It also does not support English terms with diacritical marks such as résumé and jalapeño, or proper nouns with diacritical marks such as Beyoncé. History The American Standard Code for Information Interchange (ASCII) was developed under the auspices of a committee of the American Standards Association (ASA), called the X3 committee, by its X3.2 (later X3L2) subcommittee, and later by that subcommittee's X3.2.4 working group (now INCITS). The ASA later became the United States of America Standards Institute (USASI) and ultimately became the American National Standards Institute (ANSI). With the other special characters and con
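As a small illustration of the "lowercase i" example above, the following Python sketch (not part of the source article) checks that binary 1101001, hexadecimal 69, and decimal 105 all name the same ASCII code point; it relies only on Python's built-in ord(), chr(), and format() functions.

    c = "i"
    code_point = ord(c)             # 105 (decimal)
    print(format(code_point, "x"))  # 69, the hexadecimal form
    print(format(code_point, "b"))  # 1101001, the seven-bit binary form
    assert chr(0x69) == "i"         # round trip back to the character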
https://en.wikipedia.org/wiki/Algorithms%20%28journal%29
Algorithms is a monthly peer-reviewed open-access scientific journal of mathematics, covering design, analysis, and experiments on algorithms. The journal is published by MDPI and was established in 2008. The founding editor-in-chief was Kazuo Iwama (Kyoto University). From May 2014 to September 2019, the editor-in-chief was Henning Fernau (Universität Trier). The current editor-in-chief is Frank Werner (Otto-von-Guericke-Universität Magdeburg). Abstracting and indexing According to the Journal Citation Reports, the journal has a 2022 impact factor of 2.3. The journal is abstracted and indexed in: See also Journals with similar scope include: ACM Transactions on Algorithms Algorithmica Journal of Algorithms (Elsevier) References External links Computer science journals Open access journals MDPI academic journals English-language journals Academic journals established in 2008 Mathematics journals Monthly journals
https://en.wikipedia.org/wiki/Algorithm
In mathematics and computer science, an algorithm () is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus". In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. History Ancient algorithms Since antiquity, step-by-step procedures for solving mathematical problems have been attested. This includes Babylonian mathematics (around 2500 BC), Egyptian mathematics (around 1550 BC), Indian mathematics (around 800 BC and later; e.g. Shulba Sutras, Kerala School, and Brāhmasphuṭasiddhānta), The Ifa Oracle (around 500 BC), Greek mathematics (around 240 BC, e.g. sieve of Eratosthenes and Euclidean algorithm), and Arabic mathematics (9th century, e.g. cryptographic algorithms for code-breaking based on frequency analysis). Al-Khwārizmī and the term algorithm Around 825, Muḥammad ibn Mūsā al-Khwārizmī wrote kitāb al-ḥisāb al-hindī ("Book of Indian computation") and kitab al-jam' wa'l-tafriq al-ḥisāb al-hindī ("Addition and subtraction in Indian arithmetic"). Both of these texts are lost in the original Arabic at this time. (However, his other book on algebra remains.) In the early 12th century, Latin translations of said al-Khwarizmi texts involving the Hindu–Arabic numeral system and arithmetic appeared: Liber Alghoarismi de practica arismetrice (attributed to John of Seville) and Liber Algorismi de numero Indorum (attributed to Adelard of Bath). Hereby, alghoarismi or algorismi is the Latinization of Al-Khwarizmi's name; the text starts with the phrase Dixit Algorismi ("Thus spoke Al-Khwarizmi"). In 1240, Alexander of Villedieu writes a Latin text titled Carmen de Algorismo. It begins with: which translates to: The poem is a few hundred lines long and s
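To make the abstract definition above concrete, here is a minimal Python sketch (not part of the source article) of the Euclidean algorithm mentioned in the history section: a finite sequence of well-defined state transitions that starts from an initial input and terminates with an output.

    def gcd(a: int, b: int) -> int:
        # Each loop iteration is one well-defined transition from state (a, b)
        # to state (b, a mod b); the loop terminates because the second
        # component strictly decreases toward zero.
        while b != 0:
            a, b = b, a % b
        return a  # terminating state: the greatest common divisor

    print(gcd(240, 46))  # 2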
https://en.wikipedia.org/wiki/Anime
Anime is hand-drawn and computer-generated animation originating from Japan. Outside Japan and in English, anime refers specifically to animation produced in Japan. However, in Japan and in Japanese, anime (a term derived from a shortening of the English word animation) describes all animated works, regardless of style or origin. Many works of animation with a similar style to Japanese animation are also produced outside Japan. Video games sometimes also feature themes and artstyles that can be considered as "anime". The earliest commercial Japanese animations date to 1917. A characteristic art style emerged in the 1960s with the works of cartoonist Osamu Tezuka and spread in following decades, developing a large domestic audience. Anime is distributed theatrically, through television broadcasts, directly to home media, and over the Internet. In addition to original works, anime are often adaptations of Japanese comics (manga), light novels, or video games. It is classified into numerous genres targeting various broad and niche audiences. Anime is a diverse medium with distinctive production methods that have adapted in response to emergent technologies. It combines graphic art, characterization, cinematography, and other forms of imaginative and individualistic techniques. Compared to Western animation, anime production generally focuses less on movement, and more on the detail of settings and use of "camera effects", such as panning, zooming, and angle shots. Diverse art styles are used, and character proportions and features can be quite varied, with a common characteristic feature being large and emotive eyes. The anime industry consists of over 430 production companies, including major studios such as Studio Ghibli, Kyoto Animation, Sunrise, Bones, Ufotable, MAPPA, Wit Studio, CoMix Wave Films, Production I.G and Toei Animation. Since the 1980s, the medium has also seen widespread international success with the rise of foreign dubbed, subtitled programming, and since the 2010s its increasing distribution through streaming services and a widening demographic embrace of anime culture, both within Japan and worldwide. Japanese animation accounted for 60% of the world's animated television shows. Etymology As a type of animation, anime is an art form that comprises many genres found in other mediums; it is sometimes mistakenly classified as a genre itself. In Japanese, the term anime is used to refer to all animated works, regardless of style or origin. English-language dictionaries typically define anime as "a style of Japanese animation" or as "a style of animation originating in Japan". Other definitions are based on origin, making production in Japan a requisite for a work to be considered "anime". The etymology of the term anime is disputed. The English word "animation" is written in Japanese katakana as アニメーション (animēshon) and as アニメ (anime) in its shortened form. Some sources claim that the term is derived from the French term for animation, dessin animé ("cartoon", literal
https://en.wikipedia.org/wiki/MessagePad
The MessagePad is a discontinued series of personal digital assistant devices developed by Apple Computer for the Newton platform in 1993. Some electronic engineering and the manufacture of Apple's MessagePad devices was undertaken in Japan by Sharp. The devices are based on the ARM 610 RISC processor and all featured handwriting recognition software and were developed and marketed by Apple. The devices run Newton OS. History The development of the Newton MessagePad first began with Apple's former senior vice president of research and development, Jean-Louis Gassée; his team included Steve Capps, co-writer of macOS Finder, and an employed engineer named Steve Sakoman. The development of the Newton MessagePad operated in secret until it was eventually revealed to the Apple Board of Directors in late 1990. When Gassée resigned from his position due to a significant disagreement with the board, Sakoman, seeing how his employer was treated, also stopped developing the MessagePad on March 2, 1990. Bill Atkinson, an Apple executive responsible for the company's Lisa graphical interface, invited Steve Capps, John Sculley, Andy Hertzfeld, Susan Kare, and Marc Porat to a meeting on March 11, 1990. There, they brainstormed a way of saving the MessagePad. Sculley suggested adding new features, including libraries, museums, databases, or institutional archives features, allowing customers to navigate through various window tabs or opened galleries/stacks. The Board later approved his suggestion; he then gave Newton its official and full backing. The first MessagePad was unveiled by Sculley on the 29th of May 1992 at the summer Consumer Electronics Show (CES) in Chicago. Sculley caved in to pressure to unveil the product early because the Newton did not officially ship for another 14 months, on August 2, 1993, starting at a price of . Over 50,000 units were sold by late November 1993. Details Screen and input With the MessagePad 120 with Newton OS 2.0, the Newton Keyboard by Apple became available, which can also be used via the dongle on Newton devices with a Newton InterConnect port, most notably the Apple MessagePad 2000/2100 series, as well as the Apple eMate 300. Newton devices featuring Newton OS 2.1 or higher can be used with the screen turned horizontally ("landscape") as well as vertically ("portrait"). A change of a setting rotates the contents of the display by 90, 180 or 270 degrees. Handwriting recognition still works properly with the display rotated, although display calibration is needed when rotation in any direction is used for the first time or when the Newton device is reset. Handwriting recognition In initial versions (Newton OS 1.x) the handwriting recognition gave extremely mixed results for users and was sometimes inaccurate. The original handwriting recognition engine was called Calligrapher, and was licensed from a Russian company called Paragraph International. Calligrapher's design was quite sophisticated; it attempted to
https://en.wikipedia.org/wiki/Algorithms%20for%20calculating%20variance
Algorithms for calculating variance play a major role in computational statistics. A key difficulty in the design of good algorithms for this problem is that formulas for the variance may involve sums of squares, which can lead to numerical instability as well as to arithmetic overflow when dealing with large values.

Naïve algorithm

A formula for calculating the variance of an entire population of size N is:

    σ² = (Σᵢ xᵢ² − (Σᵢ xᵢ)² / N) / N

Using Bessel's correction to calculate an unbiased estimate of the population variance from a finite sample of n observations, the formula is:

    s² = (Σᵢ xᵢ² − (Σᵢ xᵢ)² / n) / (n − 1)

Therefore, a naïve algorithm to calculate the estimated variance is given by the following:

    Let n ← 0, Sum ← 0, SumSq ← 0
    For each datum x:
        n ← n + 1
        Sum ← Sum + x
        SumSq ← SumSq + x × x
    Var = (SumSq − (Sum × Sum) / n) / (n − 1)

This algorithm can easily be adapted to compute the variance of a finite population: simply divide by n instead of n − 1 on the last line. Because SumSq and (Sum × Sum) / n can be very similar numbers, cancellation can lead to the precision of the result being much less than the inherent precision of the floating-point arithmetic used to perform the computation. Thus this algorithm should not be used in practice, and several alternate, numerically stable algorithms have been proposed. This is particularly bad if the standard deviation is small relative to the mean.

Computing shifted data

The variance is invariant with respect to changes in a location parameter, a property which can be used to avoid the catastrophic cancellation in this formula,

    Var(x − K) = Var(x)

with K any constant, which leads to the new formula

    s² = (Σᵢ (xᵢ − K)² − (Σᵢ (xᵢ − K))² / n) / (n − 1)

The closer K is to the mean value the more accurate the result will be, but just choosing a value inside the range of samples will guarantee the desired stability. If the values (xᵢ − K) are small then there are no problems with the sum of their squares; on the contrary, if they are large it necessarily means that the variance is large as well. In any case the second term in the formula is always smaller than the first one, therefore no cancellation can occur. If just the first sample is taken as K, the algorithm can be written in the Python programming language as

    def shifted_data_variance(data):
        if len(data) < 2:
            return 0.0
        K = data[0]
        n = Ex = Ex2 = 0.0
        for x in data:
            n += 1
            Ex += x - K
            Ex2 += (x - K) ** 2
        variance = (Ex2 - Ex**2 / n) / (n - 1)
        # use n instead of (n - 1) to compute the exact variance of the given data
        # use (n - 1) if the data are samples of a larger population
        return variance

This formula also facilitates the incremental computation that can be expressed as

    K = Ex = Ex2 = 0.0
    n = 0

    def add_variable(x):
        global K, n, Ex, Ex2
        if n == 0:
            K = x
        n += 1
        Ex += x - K
        Ex2 += (x - K) ** 2

    def remove_variable(x):
        global K, n, Ex, Ex2
        n -= 1
        Ex -= x - K
        Ex2 -= (x - K) ** 2

    def get_mean():
        global K, n, Ex
        return K + Ex / n

    def get_variance():
        global n, Ex, Ex2
        return (Ex2 - Ex**2 / n) / (n - 1)

Two-pass algorithm

An alternative approach, using a different formula for the variance, f
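As a usage sketch (not part of the source article), the following Python snippet shows the two-pass formulation that the truncated "Two-pass algorithm" paragraph begins to introduce, and compares it against shifted_data_variance() defined above on the same small data set; the function name two_pass_variance and the sample data are illustrative assumptions, not taken from the article.

    # Hedged sketch: two passes over the data, first the mean, then the
    # squared deviations from it; Bessel-corrected like the formulas above.
    def two_pass_variance(data):
        n = len(data)
        if n < 2:
            return 0.0
        mean = sum(data) / n                     # first pass
        ss = sum((x - mean) ** 2 for x in data)  # second pass
        return ss / (n - 1)

    data = [4.0, 7.0, 13.0, 16.0]
    print(two_pass_variance(data))      # 30.0
    print(shifted_data_variance(data))  # 30.0, agrees with the shifted-data form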
https://en.wikipedia.org/wiki/Artificial%20intelligence
Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of humans or animals. It is also the field of study in computer science that develops and studies intelligent machines. "AI" may also refer to the machines themselves. AI technology is widely used throughout industry, government and science. Some high-profile applications are: advanced web search engines (e.g., Google Search), recommendation systems (used by YouTube, Amazon, and Netflix), understanding human speech (such as Siri and Alexa), self-driving cars (e.g., Waymo), generative or creative tools (ChatGPT and AI art), and competing at the highest level in strategic games (such as chess and Go). Artificial intelligence was founded as an academic discipline in 1956. The field went through multiple cycles of optimism followed by disappointment and loss of funding, but after 2012, when deep learning surpassed all previous AI techniques, there was a vast increase in funding and interest. The various sub-fields of AI research are centered around particular goals and the use of particular tools. The traditional goals of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception, and support for robotics. General intelligence (the ability to solve an arbitrary problem) is among the field's long-term goals. To solve these problems, AI researchers have adapted and integrated a wide range of problem-solving techniques, including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics. AI also draws upon psychology, linguistics, philosophy, neuroscience and many other fields. Goals The general problem of simulating (or creating) intelligence has been broken down into sub-problems. These consist of particular traits or capabilities that researchers expect an intelligent system to display. The traits described below have received the most attention and cover the scope of AI research. Reasoning, problem-solving Early researchers developed algorithms that imitated step-by-step reasoning that humans use when they solve puzzles or make logical deductions. By the late 1980s and 1990s, methods were developed for dealing with uncertain or incomplete information, employing concepts from probability and economics. Many of these algorithms are insufficient for solving large reasoning problems because they experience a "combinatorial explosion": they became exponentially slower as the problems grew larger. Even humans rarely use the step-by-step deduction that early AI research could model. They solve most of their problems using fast, intuitive judgments. Accurate and efficient reasoning is an unsolved problem. Knowledge representation Knowledge representation and knowledge engineering allow AI programs to answer questions intelligently and make deductions about real-world facts. Formal knowledge represe
https://en.wikipedia.org/wiki/Applet
In computing, an applet is any small application that performs one specific task that runs within the scope of a dedicated widget engine or a larger program, often as a plug-in. The term is frequently used to refer to a Java applet, a program written in the Java programming language that is designed to be placed on a web page. Applets are typical examples of transient and auxiliary applications that do not monopolize the user's attention. Applets are not full-featured application programs, and are intended to be easily accessible. History The word applet was first used in 1990 in PC Magazine. However, the concept of an applet, or more broadly a small interpreted program downloaded and executed by the user, dates at least to RFC 5 (1969) by Jeff Rulifson, which described the Decode-Encode Language, which was designed to allow remote use of the oN-Line System over ARPANET, by downloading small programs to enhance the interaction. This has been specifically credited as a forerunner of Java's downloadable programs in RFC 2555. Applet as an extension of other software In some cases, an applet does not run independently. These applets must run either in a container provided by a host program, through a plugin, or a variety of other applications including mobile devices that support the applet programming model. Web-based applets Applets were used to provide interactive features to web applications that historically could not be provided by HTML alone. They could capture mouse input and also had controls like buttons or check boxes. In response to the user action, an applet could change the provided graphic content. This made applets well suited for demonstration, visualization, and teaching. There were online applet collections for studying various subjects, from physics to heart physiology. Applets were also used to create online game collections that allowed players to compete against live opponents in real-time. An applet could also be a text area only, providing, for instance, a cross-platform command-line interface to some remote system. If needed, an applet could leave the dedicated area and run as a separate window. However, applets had very little control over web page content outside the applet dedicated area, so they were less useful for improving the site appearance in general (while applets like news tickers or WYSIWYG editors are also known). Applets could also play media in formats that are not natively supported by the browser. HTML pages could embed parameters that were passed to the applet. Hence, the same applet could appear differently depending on the parameters that were passed. Examples of Web-based applets include: QuickTime movies Flash movies Windows Media Player applets, used to display embedded video files in Internet Explorer (and other browsers that supported the plugin) 3D modeling display applets, used to rotate and zoom a model Browser games that were applet-based, though some developed into fully functional
https://en.wikipedia.org/wiki/Alan%20Turing
Alan Mathison Turing (; 23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. He is widely considered to be the father of theoretical computer science and artificial intelligence. Born in Maida Vale, London, Turing was raised in southern England. He graduated at King's College, Cambridge, with a degree in mathematics. Whilst he was a fellow at Cambridge, he published a proof demonstrating that some purely mathematical yes–no questions can never be answered by computation. He defined a Turing machine and proved that the halting problem for Turing machines is undecidable. In 1938, he obtained his PhD from the Department of Mathematics at Princeton University. During the Second World War, Turing worked for the Government Code and Cypher School at Bletchley Park, Britain's codebreaking centre that produced Ultra intelligence. For a time he led Hut 8, the section that was responsible for German naval cryptanalysis. Here, he devised a number of techniques for speeding the breaking of German ciphers, including improvements to the pre-war Polish bomba method, an electromechanical machine that could find settings for the Enigma machine. Turing played a crucial role in cracking intercepted coded messages that enabled the Allies to defeat the Axis powers in many crucial engagements, including the Battle of the Atlantic. After the war, Turing worked at the National Physical Laboratory, where he designed the Automatic Computing Engine, one of the first designs for a stored-program computer. In 1948, Turing joined Max Newman's Computing Machine Laboratory at the Victoria University of Manchester, where he helped develop the Manchester computers and became interested in mathematical biology. He wrote a paper on the chemical basis of morphogenesis and predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, first observed in the 1960s. Despite these accomplishments, Turing was never fully recognised in Britain during his lifetime because much of his work was covered by the Official Secrets Act. Turing was prosecuted in 1952 for homosexual acts. He accepted hormone treatment with DES, a procedure commonly referred to as chemical castration, as an alternative to prison. Turing died on 7 June 1954, 16 days before his 42nd birthday, from cyanide poisoning. An inquest determined his death as a suicide, but it has been noted that the known evidence is also consistent with accidental poisoning. Following a public campaign in 2009, British prime minister Gordon Brown made an official public apology on behalf of the government for "the appalling way [Turing] was treated". Queen Elizabeth II granted a posthumous pardon
https://en.wikipedia.org/wiki/Ada%20%28programming%20language%29
Ada is a structured, statically typed, imperative, and object-oriented high-level programming language, inspired by Pascal and other languages. It has built-in language support for design by contract (DbC), extremely strong typing, explicit concurrency, tasks, synchronous message passing, protected objects, and non-determinism. Ada improves code safety and maintainability by using the compiler to find errors, in preference to discovering them as runtime errors. Ada is an international technical standard, jointly defined by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The standard, informally called Ada 2012, is ISO/IEC 8652:2012. Ada was originally designed by a team led by French computer scientist Jean Ichbiah of Honeywell under contract to the United States Department of Defense (DoD) from 1977 to 1983 to supersede over 450 programming languages used by the DoD at that time. Ada was named after Ada Lovelace (1815–1852), who has been credited as the first computer programmer. Features Ada was originally designed for embedded and real-time systems. The Ada 95 revision, designed by S. Tucker Taft of Intermetrics between 1992 and 1995, improved support for systems, numerical, financial, and object-oriented programming (OOP). Features of Ada include: strong typing, modular programming mechanisms (packages), run-time checking, parallel processing (tasks, synchronous message passing, protected objects, and nondeterministic select statements), exception handling, and generics. Ada 95 added support for object-oriented programming, including dynamic dispatch. The syntax of Ada minimizes choices of ways to perform basic operations, and prefers English keywords (such as "or else" and "and then") to symbols (such as "||" and "&&"). Ada uses the basic arithmetical operators "+", "-", "*", and "/", but avoids using other symbols. Code blocks are delimited by words such as "declare", "begin", and "end", where the "end" (in most cases) is followed by the identifier of the block it closes (e.g., if ... end if, loop ... end loop). In the case of conditional blocks this avoids a dangling else that could pair with the wrong nested if-expression in other languages like C or Java. Ada is designed for developing very large software systems. Ada packages can be compiled separately. Ada package specifications (the package interface) can also be compiled separately without the implementation to check for consistency. This makes it possible to detect problems early during the design phase, before implementation starts. A large number of compile-time checks are supported to help avoid bugs that would not be detectable until run-time in some other languages or would require explicit checks to be added to the source code. For example, the syntax requires explicitly named closing of blocks to prevent errors due to mismatched end tokens. The adherence to strong typing allows detecting many common software errors (wrong par
https://en.wikipedia.org/wiki/Advanced%20Encryption%20Standard
The Advanced Encryption Standard (AES), also known by its original name Rijndael, is a specification for the encryption of electronic data established by the U.S. National Institute of Standards and Technology (NIST) in 2001. AES is a variant of the Rijndael block cipher developed by two Belgian cryptographers, Joan Daemen and Vincent Rijmen, who submitted a proposal to NIST during the AES selection process. Rijndael is a family of ciphers with different key and block sizes. For AES, NIST selected three members of the Rijndael family, each with a block size of 128 bits, but three different key lengths: 128, 192 and 256 bits. AES has been adopted by the U.S. government. It supersedes the Data Encryption Standard (DES), which was published in 1977. The algorithm described by AES is a symmetric-key algorithm, meaning the same key is used for both encrypting and decrypting the data. In the United States, AES was announced by the NIST as U.S. FIPS PUB 197 (FIPS 197) on November 26, 2001. This announcement followed a five-year standardization process in which fifteen competing designs were presented and evaluated, before the Rijndael cipher was selected as the most suitable. AES is included in the ISO/IEC 18033-3 standard. AES became effective as a U.S. federal government standard on May 26, 2002, after approval by U.S. Secretary of Commerce Donald Evans. AES is available in many different encryption packages, and is the first (and only) publicly accessible cipher approved by the U.S. National Security Agency (NSA) for top secret information when used in an NSA approved cryptographic module. Definitive standards The Advanced Encryption Standard (AES) is defined in each of: FIPS PUB 197: Advanced Encryption Standard (AES) ISO/IEC 18033-3: Block ciphers Description of the ciphers AES is based on a design principle known as a substitution–permutation network, and is efficient in both software and hardware. Unlike its predecessor DES, AES does not use a Feistel network. AES is a variant of Rijndael, with a fixed block size of 128 bits, and a key size of 128, 192, or 256 bits. By contrast, Rijndael per se is specified with block and key sizes that may be any multiple of 32 bits, with a minimum of 128 and a maximum of 256 bits. Most AES calculations are done in a particular finite field. AES operates on a 4 × 4 column-major order array of 16 bytes termed the state: The key size used for an AES cipher specifies the number of transformation rounds that convert the input, called the plaintext, into the final output, called the ciphertext. The number of rounds is as follows: 10 rounds for 128-bit keys. 12 rounds for 192-bit keys. 14 rounds for 256-bit keys. Each round consists of several processing steps, including one that depends on the encryption key itself. A set of reverse rounds is applied to transform ciphertext back into the original plaintext using the same encryption key. High-level description of the algorithm round keys
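The round counts and the 4 × 4 column-major state described above can be illustrated with a short Python sketch (not part of the standard or the article); the names ROUNDS and to_state are illustrative assumptions, and the snippet only shows the data layout and round-count rule, not the cipher itself.

    # Key-size-to-round-count rule quoted above: 10/12/14 rounds.
    ROUNDS = {128: 10, 192: 12, 256: 14}

    def to_state(block: bytes):
        # Arrange 16 input bytes into the 4 x 4 state, filling column by column.
        assert len(block) == 16
        return [[block[row + 4 * col] for col in range(4)] for row in range(4)]

    for row in to_state(bytes(range(16))):
        print(row)   # [0, 4, 8, 12] / [1, 5, 9, 13] / [2, 6, 10, 14] / [3, 7, 11, 15]
    print(ROUNDS[256])  # 14 rounds for a 256-bit key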
https://en.wikipedia.org/wiki/Analytical%20engine
The analytical engine was a proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, which was a design for a simpler mechanical calculator. The analytical engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. In other words, the structure of the analytical engine was essentially the same as that which has dominated computer design in the electronic era. The analytical engine is one of the most successful achievements of Charles Babbage. Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until 1941 that Konrad Zuse built the first general-purpose computer, Z3, more than a century after Babbage had proposed the pioneering analytical engine in 1837. Design Babbage's first attempt at a mechanical computing device, the Difference Engine, was a special-purpose machine designed to tabulate logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials. Construction of this machine was never completed; Babbage had conflicts with his chief engineer, Joseph Clement, and ultimately the British government withdrew its funding for the project. During this project, Babbage realised that a much more general design, the analytical engine, was possible. The work on the design of the analytical engine started around 1833. The input, consisting of programs ("formulae") and data, was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter, and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 16.6 kB). An arithmetic unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially (1838) it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. Later drawings (1858) depict a regularised grid layout. Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify. The programming language to be employed by users was akin to modern day assembly languages. Loops and conditional branching were possible,
https://en.wikipedia.org/wiki/Apple%20I
The Apple Computer 1 (Apple-1), later known predominantly as the Apple I, is an 8-bit motherboard-only personal computer designed by Steve Wozniak and released by the Apple Computer Company (now Apple Inc.) in 1976. The company was initially formed to sell the Apple I – its first product – and would later become the world's largest technology company. The idea of starting a company and selling the computer came from Wozniak's friend and Apple co-founder Steve Jobs. To finance its development, Wozniak and Jobs sold some of their possessions for a few hundred dollars. Wozniak demonstrated the first prototype in July 1976 at the Homebrew Computer Club in Palo Alto, California, impressing an early computer retailer. After securing an order for 50 computers, Jobs was able to order the parts on credit and deliver the first Apple products after ten days. The Apple I was one of the first computers available that used the inexpensive MOS Technology 6502 microprocessor. An expansion included a BASIC interpreter, allowing users to utilize BASIC at home instead of at institutions with mainframe computers, greatly lowering the entry cost for computing with BASIC. Production was discontinued on September 30, 1977, after the June 10, 1977 introduction of its successor, the Apple II, which Byte magazine referred to as part of the "1977 Trinity" of personal computing (along with the PET 2001 from Commodore Business Machines and the TRS-80 Model I from Tandy Corporation). As relatively few computers were made before they were discontinued, coupled with their status as Apple's first product, surviving Apple I units are now displayed in computer museums. History Development In 1975, Steve Wozniak started attending meetings of the Homebrew Computer Club, which was a major source of inspiration for him. New microcomputers such as the Altair 8800 and the IMSAI 8080 inspired Wozniak to build a microprocessor into his video terminal circuit to make a complete computer. At the time the only microcomputer CPUs generally available were the $179 Intel 8080 and the $170 Motorola 6800. Wozniak preferred the 6800, but both were out of his price range. So he watched, and learned, and designed computers on paper, waiting for the day he could afford a CPU. When MOS Technology released its $20 6502 chip in 1976, Wozniak wrote a version of BASIC for it, then began to design a computer for it to run on. The 6502 was designed by the same people who designed the 6800, as many in Silicon Valley left employers to form their own companies. Wozniak's earlier 6800 paper-computer needed only minor changes to run on the new chip. By March 1, 1976, Wozniak completed the basic design of his computer. Wozniak originally offered the design to HP while working there, but was denied by the company on five occasions. When he demonstrated his computer at the Homebrew Computer Club, his friend and fellow club regular Steve Jobs was immediately interested in its commercial potential. Wozni
https://en.wikipedia.org/wiki/Atanasoff%E2%80%93Berry%20computer
The Atanasoff–Berry computer (ABC) was the first automatic electronic digital computer. Limited by the technology of the day, and execution, the device has remained somewhat obscure. The ABC's priority is debated among historians of computer technology, because it was neither programmable, nor Turing-complete. Conventionally, the ABC would be considered the first electronic ALU (arithmetic logic unit) which is integrated into every modern processor's design. Its unique contribution was to make computing faster by being the first to use vacuum tubes to do the arithmetic calculations. Prior to this, slower electro-mechanical methods were used by Konrad Zuse's Z1 computer, and the simultaneously developed Harvard Mark I. The first electronic, programmable, digital machine, the Colossus computer from 1943 to 1945, used similar tube-based technology as ABC. Overview Conceived in 1937, the machine was built by Iowa State College mathematics and physics professor John Vincent Atanasoff with the help of graduate student Clifford Berry. It was designed only to solve systems of linear equations and was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was not perfected, and when John Vincent Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers. The computer was designated an IEEE Milestone in 1990. Atanasoff and Berry's computer work was not widely known until it was rediscovered in the 1960s, amid patent disputes over the first instance of an electronic computer. At that time ENIAC, that had been created by John Mauchly and J. Presper Eckert, was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff. When, in the mid-1970s, the secrecy surrounding the British World War II development of the Colossus computers that pre-dated ENIAC, was lifted and Colossus was described at a conference in Los Alamos, New Mexico, in June 1976, John Mauchly and Konrad Zuse were reported to have been astonished. Design and construction According to Atanasoff's account, several key principles of the Atanasoff–Berry computer were conceived in a sudden insight after a long nighttime drive to Rock Island, Illinois, during the winter of 1937–38. The ABC innovations included electronic computation, binary arithmetic, parallel processing, regenerative capacitor memory, and a separation of memory and computing functions. The mechanical and logic design was worked out by Atanasoff over the next year. A grant application to build a proof of concept prototype was sub
https://en.wikipedia.org/wiki/Assembly%20language
In computer programming, assembly language (alternatively assembler language or symbolic machine code), often referred to simply as assembly and commonly abbreviated as ASM or asm, is any low-level programming language with a very strong correspondence between the instructions in the language and the architecture's machine code instructions. Assembly language usually has one statement per machine instruction (1:1), but constants, comments, assembler directives, symbolic labels of, e.g., memory locations, registers, and macros are generally also supported. The first assembly code in which a language is used to represent machine code instructions is found in Kathleen and Andrew Donald Booth's 1947 work, Coding for A.R.C.. Assembly code is converted into executable machine code by a utility program referred to as an assembler. The term "assembler" is generally attributed to Wilkes, Wheeler and Gill in their 1951 book The Preparation of Programs for an Electronic Digital Computer, who, however, used the term to mean "a program that assembles another program consisting of several sections into a single program". The conversion process is referred to as assembly, as in assembling the source code. The computational step when an assembler is processing a program is called assembly time. Because assembly depends on the machine code instructions, each assembly language is specific to a particular computer architecture. Sometimes there is more than one assembler for the same architecture, and sometimes an assembler is specific to an operating system or to particular operating systems. Most assembly languages do not provide specific syntax for operating system calls, and most assembly languages can be used universally with any operating system, as the language provides access to all the real capabilities of the processor, upon which all system call mechanisms ultimately rest. In contrast to assembly languages, most high-level programming languages are generally portable across multiple architectures but require interpreting or compiling, much more complicated tasks than assembling. In the first decades of computing, it was commonplace for both systems programming and application programming to take place entirely in assembly language. While still irreplaceable for some purposes, the majority of programming is now conducted in higher-level interpreted and compiled languages. In "No Silver Bullet", Fred Brooks summarised the effects of the switch away from assembly language programming: "Surely the most powerful stroke for software productivity, reliability, and simplicity has been the progressive use of high-level languages for programming. Most observers credit that development with at least a factor of five in productivity, and with concomitant gains in reliability, simplicity, and comprehensibility." Today, it is typical to use small amounts of assembly language code within larger systems implemented in a higher-level language, for performance rea
https://en.wikipedia.org/wiki/Alan%20Kay
Alan Curtis Kay (born May 17, 1940) is an American computer scientist best known for his pioneering work on object-oriented programming and windowing graphical user interface (GUI) design. At Xerox PARC he led the design and development of the first modern windowed computer desktop interface. There he also led the development of the influential object-oriented programming language Smalltalk, both personally designing most of the early versions of the language and coining the term "object-oriented." He has been elected a Fellow of the American Academy of Arts and Sciences, the National Academy of Engineering, and the Royal Society of Arts. He received the Turing award in 2003. Kay is also a former professional jazz guitarist, composer, and theatrical designer. He also is an amateur classical pipe organist. Early life and work In an interview on education in America with the Davis Group Ltd., Kay said: Originally from Springfield, Massachusetts, Kay's family relocated several times due to his father's career in physiology before ultimately settling in the New York metropolitan area. He attended Brooklyn Technical High School. Having accumulated enough credits to graduate, he then attended Bethany College in Bethany, West Virginia, where he majored in biology and minored in mathematics. Kay then taught guitar in Denver, Colorado for a year. He was drafted in the United States Army, then qualified for officer training in the United States Air Force, where he became a computer programmer after passing an aptitude test. After his discharge, he enrolled at the University of Colorado Boulder and earned a Bachelor of Science (B.S.) in mathematics and molecular biology in 1966. In the autumn of 1966, he began graduate school at the University of Utah College of Engineering. He earned a Master of Science in electrical engineering in 1968, then a Doctor of Philosophy in computer science in 1969. His doctoral dissertation, FLEX: A Flexible Extendable Language, described the invention of a computer language named FLEX. While there, he worked with "fathers of computer graphics" David C. Evans (who had recently been recruited from the University of California, Berkeley to start Utah's computer science department) and Ivan Sutherland (best known for writing such pioneering programs as Sketchpad). Kay credits Sutherland's 1963 thesis for influencing his views on objects and computer programming. As he grew busier with research for the Defense Advanced Research Projects Agency (DARPA), he ended his musical career. In 1968, he met Seymour Papert and learned of the programming language Logo, a dialect of Lisp optimized for educational purposes. This led him to learn of the work of Jean Piaget, Jerome Bruner, Lev Vygotsky, and of constructionist learning, further influencing his professional orientation. In 1969, Kay became a visiting researcher at the Stanford Artificial Intelligence Laboratory in anticipation of accepting a professorship at Carnegie Mel
https://en.wikipedia.org/wiki/APL%20%28programming%20language%29
APL (named after the book A Programming Language) is a programming language developed in the 1960s by Kenneth E. Iverson. Its central datatype is the multidimensional array. It uses a large range of special graphic symbols to represent most functions and operators, leading to very concise code. It has been an important influence on the development of concept modeling, spreadsheets, functional programming, and computer math packages. It has also inspired several other programming languages. History Mathematical notation A mathematical notation for manipulating arrays was developed by Kenneth E. Iverson, starting in 1957 at Harvard University. In 1960, he began work for IBM where he developed this notation with Adin Falkoff and published it in his book A Programming Language in 1962. The preface states its premise: This notation was used inside IBM for short research reports on computer systems, such as the Burroughs B5000 and its stack mechanism when stack machines versus register machines were being evaluated by IBM for upcoming computers. Iverson also used his notation in a draft of the chapter A Programming Language, written for a book he was writing with Fred Brooks, Automatic Data Processing, which would be published in 1963. In 1979, Iverson received the Turing Award for his work on APL. Development into a computer programming language As early as 1962, the first attempt to use the notation to describe a complete computer system happened after Falkoff discussed with William C. Carter his work to standardize the instruction set for the machines that later became the IBM System/360 family. In 1963, Herbert Hellerman, working at the IBM Systems Research Institute, implemented a part of the notation on an IBM 1620 computer, and it was used by students in a special high school course on calculating transcendental functions by series summation. Students tested their code in Hellerman's lab. This implementation of a part of the notation was called Personalized Array Translator (PAT). In 1963, Falkoff, Iverson, and Edward H. Sussenguth Jr., all working at IBM, used the notation for a formal description of the IBM System/360 series machine architecture and functionality, which resulted in a paper published in IBM Systems Journal in 1964. After this was published, the team turned their attention to an implementation of the notation on a computer system. One of the motivations for this focus of implementation was the interest of John L. Lawrence who had new duties with Science Research Associates, an educational company bought by IBM in 1964. Lawrence asked Iverson and his group to help use the language as a tool to develop and use computers in education. After Lawrence M. Breed and Philip S. Abrams of Stanford University joined the team at IBM Research, they continued their prior work on an implementation programmed in FORTRAN IV for a part of the notation which had been done for the IBM 7090 computer running on the IBSYS operating system. T
https://en.wikipedia.org/wiki/ALGOL
ALGOL (; short for "Algorithmic Language") is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years. In the sense that the syntax of most modern languages is "Algol-like", it was arguably more influential than three other high-level programming languages among which it was roughly contemporary: FORTRAN, Lisp, and COBOL. It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, and C. ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and through the Algol 60 Report introduced Backus–Naur form, a principal formal grammar notation for language design. There were three major specifications, named after the years they were first published: ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language. ALGOL 60 – first implemented as X1 ALGOL 60 in 1961. Revised 1963. ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973. ALGOL 68 is substantially different from ALGOL 60 and was not well received, so in general "Algol" means ALGOL 60 and its dialects. History ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58). It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax. The different syntaxes permitted it to use different keyword names and conventions for decimal points (commas vs periods) for different languages. ALGOL was used mostly by research computer scientists in the United States and in Europe. Its use in commercial applications was hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors other than Burroughs Corporation. ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development. John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form. Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959. In this capacity I was the editor of the
https://en.wikipedia.org/wiki/AWK
AWK (awk ) is a domain-specific language designed for text processing and typically used as a data extraction and reporting tool. Like sed and grep, it is a filter, and is a standard feature of most Unix-like operating systems. The AWK language is a data-driven scripting language consisting of a set of actions to be taken against streams of textual data – either run directly on files or used as part of a pipeline – for purposes of extracting or transforming text, such as producing formatted reports. The language extensively uses the string datatype, associative arrays (that is, arrays indexed by key strings), and regular expressions. While AWK has a limited intended application domain and was especially designed to support one-liner programs, the language is Turing-complete, and even the early Bell Labs users of AWK often wrote well-structured large AWK programs. AWK was created at Bell Labs in the 1970s, and its name is derived from the surnames of its authors: Alfred Aho, Peter Weinberger, and Brian Kernighan. The acronym is pronounced the same as the name of the bird species auk, which is illustrated on the cover of The AWK Programming Language. When written in all lowercase letters, as awk, it refers to the Unix or Plan 9 program that runs scripts written in the AWK programming language. History AWK was initially developed in 1977 by Alfred Aho (author of egrep), Peter J. Weinberger (who worked on tiny relational databases), and Brian Kernighan. AWK takes its name from their respective initials. According to Kernighan, one of the goals of AWK was to have a tool that would easily manipulate both numbers and strings. AWK was also inspired by Marc Rochkind's programming language that was used to search for patterns in input data, and was implemented using yacc. As one of the early tools to appear in Version 7 Unix, AWK added computational features to a Unix pipeline besides the Bourne shell, the only scripting language available in a standard Unix environment. It is one of the mandatory utilities of the Single UNIX Specification, and is required by the Linux Standard Base specification. AWK was significantly revised and expanded in 1985–88, resulting in the GNU AWK implementation written by Paul Rubin, Jay Fenlason, and Richard Stallman, released in 1988. GNU AWK may be the most widely deployed version because it is included with GNU-based Linux packages. GNU AWK has been maintained solely by Arnold Robbins since 1994. Brian Kernighan's nawk (New AWK) source was first released in 1993 unpublicized, and publicly since the late 1990s; many BSD systems use it to avoid the GPL license. AWK was preceded by sed (1974). Both were designed for text processing. They share the line-oriented, data-driven paradigm, and are particularly suited to writing one-liner programs, due to the implicit main loop and current line variables. The power and terseness of early AWK programs – notably the powerful regular expression handling and conciseness due to im
https://en.wikipedia.org/wiki/Kolmogorov%20complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. It is named after Andrey Kolmogorov, who first published on the subject in 1963 and is a generalization of classical information theory. The notion of Kolmogorov complexity can be used to state and prove impossibility results akin to Cantor's diagonal argument, Gödel's incompleteness theorem, and Turing's halting problem. In particular, no program P computing a lower bound for each text's Kolmogorov complexity can return a value essentially larger than P's own length (see section ); hence no single program can compute the exact Kolmogorov complexity for infinitely many texts. Definition Consider the following two strings of 32 lowercase letters and digits: abababababababababababababababab , and 4c1j5b2p0cv4w1x8rx2y39umgw5q85s7 The first string has a short English-language description, namely "write ab 16 times", which consists of 17 characters. The second one has no obvious simple description (using the same character set) other than writing down the string itself, i.e., "write 4c1j5b2p0cv4w1x8rx2y39umgw5q85s7" which has 38 characters. Hence the operation of writing the first string can be said to have "less complexity" than writing the second. More formally, the complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (the sensitivity of complexity relative to the choice of description language is discussed below). It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself. Strings like the abab example above, whose Kolmogorov complexity is small relative to the string's size, are not considered to be complex. The Kolmogorov complexity can be defined for any mathematical object, but for simplicity the scope of this article is restricted to strings. We must first specify a description language for strings. Such a description language can be based on any computer programming language, such as Lisp, Pascal, or Java. If P is a program which outputs a string x, then P is a description of x. The length of the description is just the length of P as a character string, multiplied by the number of bits in a character (e.g., 7 for ASCII). We could, alternatively, choose an encoding for Turing machines, where an encoding is a function which associates to each Turing Machine M a bitstring <M>. If M is a Turing Machine which, on input w, outputs string x, then the concatenated string <M> w is a
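The two example strings can be tried directly; the following Python sketch (not part of the source article) shows that the first string is reproduced by a program much shorter than the string itself, while the second has no obvious description shorter than the literal.

    s1 = "ab" * 16   # a roughly ten-character program regenerates the whole string
    s2 = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"   # shortest obvious "program": the literal
    print(s1)                 # abababababababababababababababab
    print(len(s1), len(s2))   # 32 32: same length, very different complexity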
https://en.wikipedia.org/wiki/ASCII%20art
ASCII art is a graphic design technique that uses computers for presentation and consists of pictures pieced together from the 95 printable (from a total of 128) characters defined by the ASCII Standard from 1963 and ASCII compliant character sets with proprietary extended characters (beyond the 128 characters of standard 7-bit ASCII). The term is also loosely used to refer to text-based visual art in general. ASCII art can be created with any text editor, and is often used with free-form languages. Most examples of ASCII art require a fixed-width font (non-proportional fonts, as on a traditional typewriter) such as Courier for presentation. Among the oldest known examples of ASCII art are the creations by computer-art pioneer Kenneth Knowlton from around 1966, who was working for Bell Labs at the time. "Studies in Perception I" by Knowlton and Leon Harmon from 1966 shows some examples of their early ASCII art. ASCII art was invented, in large part, because early printers often lacked graphics ability and thus, characters were used in place of graphic marks. Also, to mark divisions between different print jobs from different users, bulk printers often used ASCII art to print large banner pages, making the division easier to spot so that the results could be more easily separated by a computer operator or clerk. ASCII art was also used in early e-mail when images could not be embedded. History Typewriter art Since 1867, typewriters have been used for creating visual art. TTY and RTTY TTY stands for "TeleTYpe" or "TeleTYpewriter", and is also known as Teleprinter or Teletype. RTTY stands for Radioteletype; character sets such as Baudot code, which predated ASCII, were used. According to a chapter in the "RTTY Handbook", text images have been sent via teletypewriter as early as 1923. However, none of the "old" RTTY art has been discovered yet. What is known is that text images appeared frequently on radioteletype in the 1960s and the 1970s. Line-printer art In the 1960s, Andries van Dam published a representation of an electronic circuit produced on an IBM 1403 line printer. At the same time, Kenneth Knowlton was producing realistic images, also on line printers, by overprinting several characters on top of one another. Note that it was not ASCII art in a sense that the 1403 was driven by an EBCDIC-coded platform and the character sets and trains available on the 1403 were derived from EBCDIC rather than ASCII, despite some glyphs commonalities. ASCII art The widespread usage of ASCII art can be traced to the computer bulletin board systems of the late 1970s and early 1980s. The limitations of computers of that time period necessitated the use of text characters to represent images. Along with ASCII's use in communication, however, it also began to appear in the underground online art groups of the period. An ASCII comic is a form of webcomic which uses ASCII text to create images. In place of images in a regular comic, ASCII art is used, wi
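As a minimal sketch of the basic technique (mapping brightness to printable characters, intended to be viewed in a fixed-width font), the following Python fragment renders a synthetic radial gradient as ASCII art. The character ramp and the generated "image" are illustrative assumptions; real pictures would be read from an image file instead.

RAMP = " .:-=+*#%@"   # an arbitrary dark-to-bright ramp of printable ASCII characters

def to_ascii(rows):
    # rows: 2-D list of brightness values in [0, 1]; one output character per value.
    n = len(RAMP) - 1
    return "\n".join(
        "".join(RAMP[int(min(max(v, 0.0), 1.0) * n)] for v in row) for row in rows
    )

if __name__ == "__main__":
    w, h = 60, 24
    img = [[max(0.0, 1.0 - (((x - w / 2) / (w / 2)) ** 2 + ((y - h / 2) / (h / 2)) ** 2) ** 0.5)
            for x in range(w)] for y in range(h)]
    print(to_ascii(img))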
https://en.wikipedia.org/wiki/Adobe%20Inc.
Adobe Inc. ( ), formerly Adobe Systems Incorporated, is an American multinational computer software company incorporated in Delaware and headquartered in San Jose, California. It has historically specialized in software for the creation and publication of a wide range of content, including graphics, photography, illustration, animation, multimedia/video, motion pictures, and print. Its flagship products include Adobe Photoshop image editing software; Adobe Illustrator vector-based illustration software; Adobe Acrobat Reader and the Portable Document Format (PDF); and a host of tools primarily for audio-visual content creation, editing and publishing. Adobe offered a bundled solution of its products named Adobe Creative Suite, which evolved into a subscription software as a service (SaaS) offering named Adobe Creative Cloud. The company also expanded into digital marketing software and in 2021 was considered one of the top global leaders in Customer Experience Management (CXM). Adobe was founded in December 1982 by John Warnock and Charles Geschke, who established the company after leaving Xerox PARC to develop and sell the PostScript page description language. In 1985, Apple Computer licensed PostScript for use in its LaserWriter printers, which helped spark the desktop publishing revolution. Adobe later developed animation and multimedia through its acquisition of Macromedia, from which it acquired Macromedia Flash; video editing and compositing software with Adobe Premiere, later known as Adobe Premiere Pro; low-code web development with Adobe Muse; and a suite of software for digital marketing management. As of 2022, Adobe has more than 26,000 employees worldwide. Adobe also has major development operations in the United States in Newton, New York City, Arden Hills, Lehi, Seattle, Austin and San Francisco. It also has major development operations in Noida and Bangalore in India. History The company was started in John Warnock's garage. The name of the company, Adobe, comes from Adobe Creek in Los Altos, California, a stream which ran behind Warnock's house. That creek is so named because of the type of clay found there (Adobe being a Spanish word for Mudbrick), which alludes to the creative nature of the company's software. Adobe's corporate logo features a stylized "A" and was designed by graphic designer Marva Warnock, John Warnock's wife. In 2020, the company updated its visual identity, including updating its logo to a single color, an all-red logo. Steve Jobs attempted to buy the company for $5 million in 1982, but Warnock and Geschke refused. Their investors urged them to work something out with Jobs, so they agreed to sell him shares worth 19 percent of the company. Jobs paid a five-times multiple of their company's valuation at the time, plus a five-year license fee for PostScript, in advance. The purchase and advance made Adobe the first company in the history of Silicon Valley to become profitable in its first year. Warnock and
https://en.wikipedia.org/wiki/Amiga
Amiga is a family of personal computers introduced by Commodore in 1985. The original model is one of a number of mid-1980s computers with 16- or 16/32-bit processors, 256 KB or more of RAM, mouse-based GUIs, and significantly improved graphics and audio compared to previous 8-bit systems. These systems include the Atari ST—released earlier the same year—as well as the Macintosh and Acorn Archimedes. Based on the Motorola 68000 microprocessor, the Amiga differs from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS. The Amiga 1000 was released in July 1985, but production problems kept it from becoming widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 along with the more expandable Amiga 2000. The Amiga 3000 was introduced in 1990, followed by the Amiga 500 Plus, and Amiga 600 in March 1992. Finally, the Amiga 1200 and Amiga 4000 were released in late 1992. The Amiga line sold an estimated 4.85 million units. Although early advertisements cast the computer as an all-purpose business machine, especially when outfitted with the Sidecar IBM PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. The Video Toaster hardware and software suite helped Amiga find a prominent role in desktop video and video production. The Amiga's audio hardware made it a popular platform for music tracker software. The processor and memory capacity enabled 3D rendering packages, including LightWave 3D, Imagine, and Traces, a predecessor to Blender. Poor marketing and the failure of later models to repeat the technological advances of the first systems resulted in Commodore quickly losing market share to the rapidly dropping prices of IBM PC compatibles, which gained 256 color graphics in 1987, as well as the fourth generation of video game consoles. Commodore ultimately went bankrupt in April 1994 after a version of the Amiga packaged as a game console, the Amiga CD32, failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl and A-EON Technology. AmigaOS has influenced replacements, clones, and compatible systems such as MorphOS and AROS. Currently Belgian company Hyperion Entertainment maintains and develops AmigaOS 4, which is an official and direct descendant of AmigaOS 3.1 – the last system made by Commodore for the original Amiga Computers. History Concept and early development Jay Miner joined Atari, Inc. in the 1970s to develop custom integrated circuits, and led development of the Atari Video Computer System's TIA. When complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC and POKEY, that formed the basis of the Atari 8-bit family. With the 8-bit
https://en.wikipedia.org/wiki/Atomic%20semantics
Atomic semantics is a type of guarantee provided by a data register shared by several processors in a parallel machine or in a network of computers working together. Atomic semantics are very strong: an atomic register provides strong guarantees even in the presence of concurrency and failures. A read/write register R stores a value and is accessed by two basic operations: read and write(v). A read returns the value stored in R, and write(v) changes the value stored in R to v. A register is called atomic if it satisfies the following two properties: 1) Each invocation op of a read or write operation must appear as if it were executed at a single point τ(op) in time, where τ(op) satisfies τb(op) ≤ τ(op) ≤ τe(op), with τb(op) and τe(op) denoting the times at which the operation op begins and ends, and where op1 ≠ op2 implies τ(op1) ≠ τ(op2). 2) Each read operation returns the value written by the last write operation before the read, in the sequence in which all operations are ordered by their τ values. Atomic/linearizable register: Termination: if a node is correct, each of its read and write operations eventually completes. Safety property (linearization points for read, write, and failed operations): Read operation: it appears to take effect at all nodes at some point between its invocation and its response. Write operation: similarly, it appears to take effect at all nodes at some point between its invocation and its response. Failed operation (the term "atomic" comes from this notion): it appears either to have completed at every node or never to have happened at any node. Example: an atomic register is one that is linearizable to a sequential safe register. An atomic register may be defined for a variable with a single writer and multiple readers (SWMR), a single writer and a single reader (SWSR), or multiple writers and multiple readers (MWMR). As an example of a multi-reader multi-writer atomic register accessed by three processes (P1, P2, P3), note that R.read() → v means that the corresponding read operation returns v, the value of the register; the following execution of the register R satisfies the definition of an atomic register: R.write(1), R.read()→1, R.write(3), R.write(2), R.read()→2, R.read()→2. See also Regular semantics Safe semantics References Atomic semantics are defined formally in Lamport's "On Interprocess Communication", Distributed Computing 1, 2 (1986), 77–101. (Also appeared as SRC Research Report 8.) Concurrency control
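The definition above concerns registers shared over a network of possibly failing nodes; as a much simpler illustration of the ordering requirement alone, the following Python sketch implements an atomic read/write register for threads within one process by serializing operations under a lock, so that each operation appears to take effect at a single point (the moment the lock is held) between its invocation and its response. This is a minimal sketch of the semantics, not a fault-tolerant distributed implementation.

import threading

class AtomicRegister:
    def __init__(self, value=None):
        self._value = value
        self._lock = threading.Lock()

    def read(self):
        with self._lock:          # linearization point of the read
            return self._value

    def write(self, v):
        with self._lock:          # linearization point of the write
            self._value = v

if __name__ == "__main__":
    r = AtomicRegister(0)
    r.write(1); print(r.read())   # write(1), read() -> 1
    r.write(3); r.write(2)
    print(r.read(), r.read())     # read() -> 2, read() -> 2, as in the trace above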
https://en.wikipedia.org/wiki/Alpha%20compositing
In computer graphics, alpha compositing or alpha blending is the process of combining one image with a background to create the appearance of partial or full transparency. It is often useful to render picture elements (pixels) in separate passes or layers and then combine the resulting 2D images into a single, final image called the composite. Compositing is used extensively in film when combining computer-rendered image elements with live footage. Alpha blending is also used in 2D computer graphics to put rasterized foreground elements over a background. In order to combine the picture elements of the images correctly, it is necessary to keep an associated matte for each element in addition to its color. This matte layer contains the coverage information—the shape of the geometry being drawn—making it possible to distinguish between parts of the image where something was drawn and parts that are empty. Although the most basic operation of combining two images is to put one over the other, there are many operations, or blend modes, that are used. History The concept of an alpha channel was introduced by Alvy Ray Smith and in the late 1970s at the New York Institute of Technology Computer Graphics Lab. Bruce A. Wallace derived the same straight over operator based on a physical reflectance/transmittance model in 1981. A 1984 paper by Thomas Porter and Tom Duff introduced premultiplied alpha using a geometrical approach. The use of the term alpha is explained by Smith as follows: "We called it that because of the classic linear interpolation formula that uses the Greek letter (alpha) to control the amount of interpolation between, in this case, two images A and B". That is, when compositing image A atop image B, the value of in the formula is taken directly from A's alpha channel. Description In a 2D image a color combination is stored for each picture element (pixel), often a combination of red, green and blue (RGB). When alpha compositing is in use, each pixel has an additional numeric value stored in its alpha channel, with a value ranging from 0 to 1. A value of 0 means that the pixel is fully transparent and the color in the pixel beneath will show through. A value of 1 means that the pixel is fully opaque. With the existence of an alpha channel, it is possible to express compositing image operations using a compositing algebra. For example, given two images A and B, the most common compositing operation is to combine the images so that A appears in the foreground and B appears in the background. This can be expressed as A over B. In addition to over, Porter and Duff defined the compositing operators in, held out by (the phrase refers to holdout matting and is usually abbreviated out), atop, and xor (and the reverse operators rover, rin, rout, and ratop) from a consideration of choices in blending the colors of two pixels when their coverage is, conceptually, overlaid orthogonally: As an example, the over operator can be accomplis
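A minimal Python sketch of the over operator for straight (non-premultiplied) alpha follows; colors and alpha values are taken to lie in [0, 1], and the formula used is the standard one, with resulting alpha α_o = α_a + α_b·(1 − α_a) and resulting color (C_a·α_a + C_b·α_b·(1 − α_a)) / α_o.

def over(color_a, alpha_a, color_b, alpha_b):
    # "A over B" with straight (non-premultiplied) alpha; all components in [0, 1].
    alpha_o = alpha_a + alpha_b * (1.0 - alpha_a)
    if alpha_o == 0.0:
        return (0.0, 0.0, 0.0), 0.0            # both pixels fully transparent
    color_o = tuple((ca * alpha_a + cb * alpha_b * (1.0 - alpha_a)) / alpha_o
                    for ca, cb in zip(color_a, color_b))
    return color_o, alpha_o

if __name__ == "__main__":
    # 50%-opaque red composited over opaque blue yields an even red/blue mix.
    print(over((1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0), 1.0))   # ((0.5, 0.0, 0.5), 1.0)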
https://en.wikipedia.org/wiki/Array%20%28data%20structure%29
In computer science, an array is a data structure consisting of a collection of elements (values or variables), of same memory size, each identified by at least one array index or key. An array is stored such that the position of each element can be computed from its index tuple by a mathematical formula. The simplest type of data structure is a linear array, also called one-dimensional array. For example, an array of ten 32-bit (4-byte) integer variables, with indices 0 through 9, may be stored as ten words at memory addresses 2000, 2004, 2008, ..., 2036, (in hexadecimal: 0x7D0, 0x7D4, 0x7D8, ..., 0x7F4) so that the element with index i has the address 2000 + (i × 4). The memory address of the first element of an array is called first address, foundation address, or base address. Because the mathematical concept of a matrix can be represented as a two-dimensional grid, two-dimensional arrays are also sometimes called "matrices". In some cases the term "vector" is used in computing to refer to an array, although tuples rather than vectors are the more mathematically correct equivalent. Tables are often implemented in the form of arrays, especially lookup tables; the word "table" is sometimes used as a synonym of array. Arrays are among the oldest and most important data structures, and are used by almost every program. They are also used to implement many other data structures, such as lists and strings. They effectively exploit the addressing logic of computers. In most modern computers and many external storage devices, the memory is a one-dimensional array of words, whose indices are their addresses. Processors, especially vector processors, are often optimized for array operations. Arrays are useful mostly because the element indices can be computed at run time. Among other things, this feature allows a single iterative statement to process arbitrarily many elements of an array. For that reason, the elements of an array data structure are required to have the same size and should use the same data representation. The set of valid index tuples and the addresses of the elements (and hence the element addressing formula) are usually, but not always, fixed while the array is in use. The term "array" may also refer to an array data type, a kind of data type provided by most high-level programming languages that consists of a collection of values or variables that can be selected by one or more indices computed at run-time. Array types are often implemented by array structures; however, in some languages they may be implemented by hash tables, linked lists, search trees, or other data structures. The term is also used, especially in the description of algorithms, to mean associative array or "abstract array", a theoretical computer science model (an abstract data type or ADT) intended to capture the essential properties of arrays. History The first digital computers used machine-language programming to set up and access array structures for
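The address arithmetic in the example above (base address 2000, 4-byte elements, element i at 2000 + i × 4) can be written out directly; the following minimal Python sketch also extends it to a row-major two-dimensional array, with the column count as an assumed parameter.

BASE, ELEM_SIZE = 2000, 4    # base address and element size from the example above

def addr_1d(i):
    # Address of element i in a one-dimensional array.
    return BASE + i * ELEM_SIZE

def addr_2d(i, j, n_cols):
    # Address of element (i, j) in a row-major two-dimensional array.
    return BASE + (i * n_cols + j) * ELEM_SIZE

print([addr_1d(i) for i in range(10)])   # 2000, 2004, ..., 2036
print(addr_2d(2, 3, 10))                 # 2000 + (2*10 + 3)*4 = 2092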
https://en.wikipedia.org/wiki/Acorn%20Electron
The Acorn Electron (nicknamed the Elk inside Acorn and beyond) was a lower-cost alternative to the BBC Micro educational/home computer, also developed by Acorn Computers Ltd, to provide many of the features of that more expensive machine at a price more competitive with that of the ZX Spectrum. It had 32 kilobytes of RAM, and its ROM included BBC BASIC II together with the operating system. Announced in 1982 for a possible release the same year, it was eventually introduced on 25 August 1983 priced at £199. The Electron was able to save and load programs onto audio cassette via a supplied cable that connected it to any standard tape recorder that had the correct sockets. It was capable of bitmapped graphics, and could use either a television set, a colour (RGB) monitor or a monochrome monitor as its display. Several expansions were made available to provide many of the capabilities omitted from the BBC Micro. Acorn introduced a general-purpose expansion unit, the Plus 1, offering analogue joystick and parallel ports, together with cartridge slots into which ROM cartridges, providing software, or other kinds of hardware expansions, such as disc interfaces, could be inserted. Acorn also produced a dedicated disc expansion, the Plus 3, featuring a disc controller and 3.5-inch floppy drive. For a short period, the Electron was reportedly the best selling micro in the United Kingdom, with an estimated 200,000 to 250,000 machines sold over its entire commercial lifespan. With production effectively discontinued by Acorn as early as 1985, and with the machine offered in bundles with games and expansions, later being substantially discounted by retailers, a revival in demand for the Electron supported a market for software and expansions without Acorn's involvement, with its market for games also helping to sustain the continued viability of games production for the BBC Micro. History After Acorn Computers released the BBC Micro, executives believed that the company needed a less-expensive computer for the mass market. In May 1982, when asked about the recently announced Sinclair ZX Spectrum's potential to hurt sales of the BBC Micro, priced at £125 for the 16K model compared to around twice that price for the 16K BBC Model A, Acorn co-founder Hermann Hauser responded that in the third quarter of that year Acorn would release a new £120–150 computer which "will probably be called the Electron", a form of "miniaturised BBC Micro", having 32 KB of RAM and 32 KB of ROM, with "higher resolution graphics than those offered by the Spectrum". Acorn co-founder Chris Curry also emphasised the Electron's role as being "designed to compete with the Spectrum... to get the starting price very low, but not preclude expansion in the long term." In order to reduce component costs, and to prevent cloning, the company reduced the number of chips in the Electron from the 102 on the BBC Micro's motherboard to "something like 12 to 14 chips" with most functionality on a
https://en.wikipedia.org/wiki/Andrew%20Tridgell
Andrew "Tridge" Tridgell (born 28 February 1967) is an Australian computer programmer. He is the author of and a contributor to the Samba file server, and co-inventor of the rsync algorithm. He has analysed complex proprietary protocols and algorithms, to allow compatible free and open source software implementations. Projects Tridgell was a major developer of the Samba software, analyzing the Server Message Block protocol used for workgroup and network file sharing by Microsoft Windows products. He developed the hierarchical memory allocator, originally as part of Samba. For his PhD dissertation, he co-developed rsync, including the rsync algorithm, a highly efficient file transfer and synchronisation tool. He was also the original author of rzip, which uses a similar algorithm to rsync. He developed spamsum, based on locality-sensitive hashing algorithms. He is the author of KnightCap, a reinforcement-learning based chess engine. Tridgell was also a leader in hacking the TiVo to make it work in Australia, which uses the PAL video format. In April 2005, Tridgell tried to produce free software (now known as SourcePuller) that interoperated with the BitKeeper source code repository. This was cited as the reason that BitMover revoked a license allowing Linux developers free use of their BitKeeper product. Linus Torvalds, the creator of the Linux kernel, and Tridgell were thus involved in a public debate about the events, in which Tridgell stated that, not having bought or owned BitKeeper – and thus having never agreed to its license – he could not violate it, and was analyzing the protocol ethically, as he had done with Samba. Tridgell's involvement in the project resulted in Torvalds accusing him of playing dirty tricks with BitKeeper. Tridgell claimed his analysis started with simply telneting to a BitKeeper server and typing help. In 2011 Tridgell got involved with the software development of ArduPilot Mega, an open source Arduino-based UAV controller board, working on an entry for the UAV Challenge Outback Rescue. Academic achievements Tridgell completed a PhD at the Computer Sciences Laboratory of the Australian National University. His original doctorate work was in the area of speech recognition but was never completed. His submitted dissertation 'Efficient Algorithms for Sorting and Synchronization' was based on his work on the rsync algorithm. Awards and honours In October 2003, The Bulletin magazine judged Tridgell to be Australia's smartest Information and Communications Technology person. In July 2008, Tridgell was named "Best Interoperator" at the Google–O'Reilly Open Source Awards, for his work on Samba and rsync. Tridgell (along with Jeremy Allison and Volker Lendecke) has been called a "guru in its traditional Indian meaning" by IT writer, Sam Varghese. On 11 December 2018, Tridgell was awarded the degree of Doctor of Science (Honoris Causa) by the Australian National University, for authoring Samba, co-inventing rsyn
https://en.wikipedia.org/wiki/Applesoft%20BASIC
Applesoft BASIC is a dialect of Microsoft BASIC, developed by Marc McDonald and Ric Weiland, supplied with the Apple II series of computers. It supersedes Integer BASIC and is the BASIC in ROM in all Apple II series computers after the original Apple II model. It is also referred to as FP BASIC (from floating point) because of the Apple DOS command used to invoke it, instead of INT for Integer BASIC. Applesoft BASIC was supplied by Microsoft and its name is derived from the names of both Apple Computer and Microsoft. Apple employees, including Randy Wigginton, adapted Microsoft's interpreter for the Apple II and added several features. The first version of Applesoft was released in 1977 on cassette tape and lacked proper support for high-resolution graphics. Applesoft II, which was made available on cassette and disk and in the ROM of the Apple II Plus and subsequent models, was released in 1978. It is this latter version, which has some syntax differences and support for the Apple II high-resolution graphics modes, that is usually synonymous with the term "Applesoft." A compiler for Applesoft BASIC, TASC (The Applesoft Compiler), was released by Microsoft in 1981. History When Steve Wozniak wrote Integer BASIC for the Apple II, he did not implement support for floating-point arithmetic because he was primarily interested in writing games, a task for which integers alone were sufficient. In 1976, Microsoft had developed Microsoft BASIC for the MOS Technology 6502, but at the time there was no production computer that used it. Upon learning that Apple had a 6502 machine, Microsoft asked if the company were interested in licensing BASIC, but Steve Jobs replied that Apple already had one. The Apple II was unveiled to the public at the West Coast Computer Faire in April 1977 and became available for sale in June. One of the most common customer complaints about the computer was BASIC's lack of floating-point math. Making things more problematic was that the rival Commodore PET personal computer had a floating point-capable BASIC interpreter from the beginning. As Wozniak—the only person who understood Integer BASIC well enough to add floating point features—was busy with the Disk II drive and controller and with Apple DOS, Apple turned to Microsoft. Apple reportedly obtained an eight-year license for Applesoft BASIC from Microsoft for a flat fee of $31,000, renewing it in 1985 through an arrangement that gave Microsoft the rights and source code for Apple's Macintosh version of BASIC. Applesoft was designed to be backwards-compatible with Integer BASIC and uses the core of Microsoft's 6502 BASIC implementation, which includes using the GET command for detecting key presses and not requiring any spaces on program lines. While Applesoft BASIC is slower than Integer BASIC, it has many features that the older BASIC lacks: Atomic strings: A string is no longer an array of characters (as in Integer BASIC and C); it is instead a garbage-collected ob
https://en.wikipedia.org/wiki/IBM%20AIX
AIX (Advanced Interactive eXecutive, pronounced ,) is a series of proprietary Unix operating systems developed and sold by IBM for several of its computer platforms. Background Originally released for the IBM RT PC RISC workstation in 1986, AIX has supported a wide variety of hardware platforms, including the IBM RS/6000 series and later Power and PowerPC-based systems, IBM System i, System/370 mainframes, PS/2 personal computers, and the Apple Network Server. It is currently supported on IBM Power Systems alongside IBM i and Linux. AIX is based on UNIX System V with 4.3BSD-compatible extensions. It is certified to the UNIX 03 and UNIX V7 marks of the Single UNIX Specification, beginning with AIX versions 5.3 and 7.2 TL5 respectively. Older versions were previously certified to the UNIX 95 and UNIX 98 marks. AIX was the first operating system to have a journaling file system, and IBM has continuously enhanced the software with features such as processor, disk and network virtualization, dynamic hardware resource allocation (including fractional processor units), and reliability engineering ported from its mainframe designs. History Unix started life at AT&T's Bell Labs research center in the early 1970s, running on DEC minicomputers. By 1976, the operating system was in use at various academic institutions, including Princeton, where Tom Lyon and others ported it to the S/370, to run as a guest OS under VM/370. This port would later grow out to become UTS, a mainframe Unix offering by IBM's competitor Amdahl Corporation. IBM's own involvement in Unix can be dated to 1979, when it assisted Bell Labs in doing its own Unix port to the 370 (to be used as a build host for the 5ESS switch's software). In the process, IBM made modifications to the TSS/370 hypervisor to better support Unix. It took until 1985 for IBM to offer its own Unix on the S/370 platform, IX/370, which was developed by Interactive Systems Corporation and intended by IBM to compete with Amdahl UTS. The operating system offered special facilities for interoperating with PC/IX, Interactive/IBM's version of Unix for IBM PC compatible hardware, and was licensed at $10,000 per sixteen concurrent users. AIX Version 1, introduced in 1986 for the IBM RT PC workstation, was based on UNIX System V Releases 1 and 2. In developing AIX, IBM and Interactive Systems Corporation (whom IBM contracted) also incorporated source code from 4.2 and 4.3 BSD UNIX. Among other variants, IBM later produced AIX Version 3 (also known as AIX/6000), based on System V Release 3, for their POWER-based RS/6000 platform. Since 1990, AIX has served as the primary operating system for the RS/6000 series (later renamed IBM eServer pSeries, then IBM System p, and now IBM Power Systems). AIX Version 4, introduced in 1994, added symmetric multiprocessing with the introduction of the first RS/6000 SMP servers and continued to evolve through the 1990s, culminating with AIX 4.3.3 in 1999. Version 4.1, in a slightly
https://en.wikipedia.org/wiki/AppleTalk
AppleTalk is a discontinued proprietary suite of networking protocols developed by Apple Computer for their Macintosh computers. AppleTalk includes a number of features that allow local area networks to be connected with no prior setup or the need for a centralized router or server of any sort. Connected AppleTalk-equipped systems automatically assign addresses, update the distributed namespace, and configure any required inter-networking routing. AppleTalk was released in 1985 and was the primary protocol used by Apple devices through the 1980s and 1990s. Versions were also released for the IBM PC and compatibles and the Apple IIGS. AppleTalk support was also available in most networked printers (especially laser printers), some file servers, and a number of routers. The rise of TCP/IP during the 1990s led to a reimplementation of most of these types of support on that protocol, and AppleTalk became unsupported as of the release of Mac OS X v10.6 in 2009. Many of AppleTalk's more advanced autoconfiguration features have since been introduced in Bonjour, while Universal Plug and Play serves similar needs. History AppleNet After the release of the Apple Lisa computer in January 1983, Apple invested considerable effort in the development of a local area networking (LAN) system for the machines. Known as AppleNet, it was based on the seminal Xerox XNS protocol stack but running on a custom 1 Mbit/s coaxial cable system rather than Xerox's 2.94 Mbit/s Ethernet. AppleNet was announced early in 1983 with a full introduction at the target price of $500 for plug-in AppleNet cards for the Lisa and the Apple II. At that time, early LAN systems were just coming to market, including Ethernet, Token Ring, Econet, and ARCNET. This was a topic of major commercial effort at the time, dominating shows like the National Computer Conference (NCC) in Anaheim in May 1983. All of the systems were jockeying for position in the market, but even at this time, Ethernet's widespread acceptance suggested it was to become a de facto standard. It was at this show that Steve Jobs asked Gursharan Sidhu a seemingly innocuous question: "Why has networking not caught on?" Four months later, in October, AppleNet was cancelled. At the time, they announced that "Apple realized that it's not in the business to create a networking system. We built and used AppleNet in-house, but we realized that if we had shipped it, we would have seen new standards coming up." In January, Jobs announced that they would instead be supporting IBM's Token Ring, which he expected to come out in a "few months". AppleBus Through this period, Apple was deep in development of the Macintosh computer. During development, engineers had made the decision to use the Zilog 8530 serial controller chip (SCC) instead of the lower-cost and more common UART to provide serial port connections. The SCC cost about $5 more than a UART, but offered much higher speeds of up to 250 kilobits per second (or higher with ad
https://en.wikipedia.org/wiki/Apple%20II%20series
The Apple II series (trademarked with square brackets as "Apple ][" and rendered on later models as "Apple //") is a family of home computers, one of the first highly successful mass-produced microcomputer products, designed primarily by Steve Wozniak, manufactured by Apple Computer (now Apple Inc.), and launched in 1977 with the original Apple II. In terms of ease of use, features, and expandability, the Apple II was a major advancement over its predecessor, the Apple I, a limited-production bare circuit board computer for electronics hobbyists. Through 1988, a number of models were introduced, with the most popular, the Apple IIe, remaining relatively unchanged into the 1990s. A model with more advanced graphics and sound and a 16-bit processor, the Apple IIGS, was added in 1986. It remained compatible with earlier Apple II models, but the IIGS had more in common with mid-1980s systems like the Atari ST, Amiga, and Acorn Archimedes. The Apple II was first sold on June 10, 1977. By the end of production in 1993, somewhere between five and six million Apple II series computers (including about 1.25 million Apple IIGS models) had been produced. The Apple II was one of the longest running mass-produced home computer series, with models in production for just under 17 years. The Apple II became one of several recognizable and successful computers during the 1980s and early 1990s, although this was mainly limited to the US. It was aggressively marketed through volume discounts and manufacturing arrangements to educational institutions, which made it the first computer in widespread use in American secondary schools, displacing the early leader Commodore PET. The effort to develop educational and business software for the Apple II, including the 1979 release of the popular VisiCalc spreadsheet, made the computer especially popular with business users and families. Despite the introduction of the Motorola 68000-based Macintosh in 1984, the Apple II series still reportedly accounted for 85% of the company's hardware sales in the first quarter of fiscal 1985. Apple continued to sell Apple II systems alongside the Macintosh until terminating the IIGS in December 1992 and the IIe in November 1993. The last II-series Apple in production, the IIe card for Macintoshes, was discontinued on October 15, 1993. The total Apple II sales of all of its models during its 16-year production run were about 6 million units, with the peak occurring in 1983 when 1 million were sold. Hardware All the machines in the series, except the //c, shared similar overall design elements. The plastic case was designed to look more like a home appliance than a piece of electronic equipment, and the machine could be opened without the use of tools, allowing access to the computer's internals. The motherboard held eight expansion slots and an array of random access memory (RAM) sockets that could hold up to 48 kilobytes. Over the course of the Apple II series' life, an enormous
https://en.wikipedia.org/wiki/Apple%20III
The Apple III (styled as apple ///) is a business-oriented personal computer produced by Apple Computer and released in 1980. Running the Apple SOS operating system, it was intended as the successor to the Apple II series, but was largely considered a failure in the market. It was designed to provide key features business users wanted in a personal computer: a true typewriter-style upper/lowercase keyboard (the Apple II only supported uppercase) and an 80-column display. Work on the Apple III started in late 1978 under the guidance of Dr. Wendell Sander. It had the internal code name of "Sara", named after Sander's daughter. The system was announced on May 19, 1980 and released in late November that year. Serious stability issues required a design overhaul and a recall of the first 14,000 machines produced. The Apple III was formally reintroduced on November 9, 1981. Damage to the computer's reputation had already been done, however, and it failed to do well commercially. Development stopped, and the Apple III was discontinued on April 24, 1984. Its last successor, the III Plus, was dropped from the Apple product line in September 1985. An estimated 65,000–75,000 Apple III computers were sold. The Apple III Plus brought this up to approximately 120,000. Apple co-founder Steve Wozniak stated that the primary reason for the Apple III's failure was that the system was designed by Apple's marketing department, unlike Apple's previous engineering-driven projects. The Apple III's failure led Apple to reevaluate its plan to phase out the Apple II, prompting the eventual continuation of development of the older machine. As a result, later Apple II models incorporated some hardware and software technologies of the Apple III. Overview Design Steve Wozniak and Steve Jobs expected hobbyists to purchase the Apple II, but because of VisiCalc and Disk II, small businesses purchased 90% of the computers. The Apple III was designed to be a business computer and successor. Though the Apple II contributed to the inspirations of several important business products, such as VisiCalc, Multiplan, and Apple Writer, the computer's hardware architecture, operating system, and developer environment are limited. Apple management intended to clearly establish market segmentation by designing the Apple III to appeal to the 90% business market, leaving the Apple II to home and education users. Management believed that "once the Apple III was out, the Apple II would stop selling in six months", Wozniak said. The Apple III is powered by a 1.8-megahertz Synertek 6502A or 6502B 8-bit CPU and, like some of the later machines in the Apple II family, uses bank switching techniques to address memory beyond the 6502's traditional 64 KB limit, up to 256 KB in the III's case. Third-party vendors produced memory upgrade kits that allow the Apple III to reach up to 512 KB of random-access memory (RAM). Other Apple III built-in features include an 80-column, 24-line display with upper
https://en.wikipedia.org/wiki/AVL%20tree
In computer science, an AVL tree (named after inventors Adelson-Velsky and Landis) is a self-balancing binary search tree. In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property. Lookup, insertion, and deletion all take time in both the average and worst cases, where is the number of nodes in the tree prior to the operation. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations. The AVL tree is named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who published it in their 1962 paper "An algorithm for the organization of information". It is the oldest self-balancing binary search tree data structure to be invented. AVL trees are often compared with red–black trees because both support the same set of operations and take time for the basic operations. For lookup-intensive applications, AVL trees are faster than red–black trees because they are more strictly balanced. Similar to red–black trees, AVL trees are height-balanced. Both are, in general, neither weight-balanced nor -balanced for any ; that is, sibling nodes can have hugely differing numbers of descendants. Definition Balance factor In a binary tree the balance factor of a node X is defined to be the height difference of its two child sub-trees rooted by node X. A binary tree is defined to be an AVL tree if the invariant holds for every node X in the tree. A node X with is called "left-heavy", one with is called "right-heavy", and one with is sometimes simply called "balanced". Properties Balance factors can be kept up-to-date by knowing the previous balance factors and the change in height – it is not necessary to know the absolute height. For holding the AVL balance information, two bits per node are sufficient. The height (counted as the maximal number of levels) of an AVL tree with nodes lies in the interval: where   is the golden ratio and This is because an AVL tree of height contains at least nodes where is the Fibonacci sequence with the seed values Operations Read-only operations of an AVL tree involve carrying out the same actions as would be carried out on an unbalanced binary search tree, but modifications have to observe and restore the height balance of the sub-trees. Searching Searching for a specific key in an AVL tree can be done the same way as that of any balanced or unbalanced binary search tree. In order for search to work effectively it has to employ a comparison function which establishes a total order (or at least a total preorder) on the set of keys. The number of comparisons required for successful search is limited by the height and for unsuccessful search is very close to , so both are in . Traversal As a read-only operation the traversal of an AVL tree functions the same way as on any other binary tree. Exploring all nodes of the tree visits each
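As a minimal illustration of the balance-factor definition above, the following Python sketch computes node heights and balance factors (taken here as the height of the right subtree minus the height of the left) and checks the AVL invariant that every balance factor lies in {−1, 0, +1}; insertion, deletion, and the rebalancing rotations are omitted.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    # Height counted as the number of levels; an empty subtree has height 0.
    return 0 if node is None else 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    return height(node.right) - height(node.left)

def is_avl(node):
    # AVL invariant: every node's balance factor is -1, 0, or +1.
    if node is None:
        return True
    return abs(balance_factor(node)) <= 1 and is_avl(node.left) and is_avl(node.right)

if __name__ == "__main__":
    balanced = Node(2, Node(1), Node(4, Node(3), Node(5)))
    right_spine = Node(1, None, Node(2, None, Node(3)))   # degenerate, not height-balanced
    print(is_avl(balanced), is_avl(right_spine))          # True False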
https://en.wikipedia.org/wiki/Aster%20CT-80
The Aster CT-80 is a 1982 personal computer developed by the small Dutch company MCP (later renamed to Aster Computers), was sold in its first incarnation as a kit for hobbyists. Later it was sold ready to use. It consisted of several Eurocard PCB's with DIN 41612 connectors, and a backplane all based on a 19-inch rack configuration. It was the first commercially available Dutch personal/home computer. The Aster computer could use the software written for the popular Tandy TRS-80 computer while fixing many of the problems of that computer, but it could also run CP/M software, with a large amount of free memory Transient Program Area, (TPA) and a full 80×25 display, and it could be used as a Videotext terminal. Although the Aster was a clone of the TRS-80 Model I it was in fact more compatible with the TRS-80 Model III and ran all the software of these systems including games. It also had a built-in speaker which was compatible with such games software. Models Three models were sold. The first model (launched June 1982) looked like the IBM PC, a rectangular base unit with two floppy drives on the front, and a monitor on top with a separate detachable keyboard. The second incarnation was a much smaller unit the width of two 5" floppy drives stacked on top of each other, and the third incarnation looked like a flattened Apple with a built-in keyboard. All units ran much faster than the original TRS-80, at 4 MHz, (with a software selectable throttle to the original speed for compatibility purposes) and the display supported upper and lower case, hardware snow suppression (video ram bus arbitration logic), and an improved character font set. The floppy disk interface supported dual density, and disk capacities up to 800 KB, more than four times the capacity of the original TRS-80. A special version of NewDos/80, (an improved TRS-DOS compatible Disk operating system) was used to support these disk capacities when using the TRS-80 compatibility mode. For the educational market a version of the first model was produced with a new plastic enclosure (the First Asters had an all-metal enclosure) that also had an opening on the top in which a cassette recorder could be placed. This model was used in a cluster with one Aster (with disk drives) for the teacher, and eight disk less versions for the pupils. The pupils could download software from the teachers computer through a network based on a fast serial connection, as well as sending back their work to the teachers computer. There was also hardware in place through which the teacher could see the display of each pupils screen on his own monitor. Working modes The Aster used 64 KB of RAM and had the unique feature of supporting two fundamentally different internal architectures: when turned on without a boot floppy or with a TRS-DOS floppy, the Aster would be fully TRS-80 compatible, with 48 KB of RAM. When the boot loader detected a CP/M floppy, the Aster would reconfigure its internal memory architectu
https://en.wikipedia.org/wiki/Atari%20ST
The Atari ST is a line of personal computers from Atari Corporation and the successor to the Atari 8-bit family. The initial model, the Atari 520ST, had limited release in April–June 1985 and was widely available in July. It was the first personal computer with a bitmapped color GUI, using a version of Digital Research's GEM from February 1985. The Atari 1040ST, released in 1986 with 1 MB of RAM, was the first home computer with a cost-per-kilobyte of less than US$1. After Jack Tramiel purchased the assets of the Atari, Inc. consumer division to create Atari Corporation, the 520ST was designed in five months by a small team led by Shiraz Shivji. Alongside the Macintosh, Amiga, Apple IIGS, and Acorn Archimedes, the ST is part of a mid-1980s generation of computers with 16- or 32-bit processors, 256 KB or more of RAM, and mouse-controlled graphical user interfaces. "ST" officially stands for "Sixteen/Thirty-two", referring to the Motorola 68000's 16-bit external bus and 32-bit internals. The ST was sold with either Atari's color monitor or less expensive monochrome monitor. Color graphics modes are available only on the former while the highest-resolution mode requires the monochrome monitor. Some models can display the color modes on a TV. In Germany and some other markets, the ST gained a foothold for CAD and desktop publishing. With built-in MIDI ports, it was popular for music sequencing and as a controller of musical instruments among amateur and professional musicians. The primary competitor of the Atari ST was the Amiga from Commodore. The 520ST and 1040ST were followed by the Mega series, the STE, and the portable STacy. In the early 1990s, Atari released three final evolutions of the ST with significant technical differences from the original models: TT030 (1990), Mega STE (1991), and Falcon (1992). Atari discontinued the entire ST computer line in 1993, shifting the company's focus to the Jaguar video game console. Development The Atari ST was born from the rivalry between home computer makers Atari, Inc. and Commodore International. Jay Miner, one of the designers of the custom chips in the Atari 2600 and Atari 8-bit family, tried to convince Atari management to create a new chipset for a video game console and computer. When his idea was rejected, he left Atari to form a small think tank called Hi-Toro in 1982 and began designing the new "Lorraine" chipset. Amiga ran out of capital to complete Lorraine's development, and Atari, by then owned by Warner Communications, paid Amiga to continue its work. In return, Atari received exclusive use of the Lorraine design for one year as a video game console. After that time, Atari had the right to add a keyboard and market the complete computer, designated the 1850XLD. Tramel Technology After leaving Commodore International in January 1984, Jack Tramiel formed Tramel (without an "i") Technology, Ltd. with his sons and other ex-Commodore employees and, in April, began planning a new comp
https://en.wikipedia.org/wiki/List%20of%20artificial%20intelligence%20projects
The following is a list of current and past, non-classified notable artificial intelligence projects. Specialized projects Brain-inspired Blue Brain Project, an attempt to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level. Google Brain, a deep learning project part of Google X attempting to have intelligence similar or equal to human-level. Human Brain Project, ten-year scientific research project, based on exascale supercomputers. Cognitive architectures 4CAPS, developed at Carnegie Mellon University under Marcel A. Just ACT-R, developed at Carnegie Mellon University under John R. Anderson. AIXI, Universal Artificial Intelligence developed by Marcus Hutter at IDSIA and ANU. CALO, a DARPA-funded, 25-institution effort to integrate many artificial intelligence approaches (natural language processing, speech recognition, machine vision, probabilistic logic, planning, reasoning, many forms of machine learning) into an AI assistant that learns to help manage your office environment. CHREST, developed under Fernand Gobet at Brunel University and Peter C. Lane at the University of Hertfordshire. CLARION, developed under Ron Sun at Rensselaer Polytechnic Institute and University of Missouri. CoJACK, an ACT-R inspired extension to the JACK multi-agent system that adds a cognitive architecture to the agents for eliciting more realistic (human-like) behaviors in virtual environments. Copycat, by Douglas Hofstadter and Melanie Mitchell at the Indiana University. DUAL, developed at the New Bulgarian University under Boicho Kokinov. FORR developed by Susan L. Epstein at The City University of New York. IDA and LIDA, implementing Global Workspace Theory, developed under Stan Franklin at the University of Memphis. OpenCog Prime, developed using the OpenCog Framework. Procedural Reasoning System (PRS), developed by Michael Georgeff and Amy L. Lansky at SRI International. Psi-Theory developed under Dietrich Dörner at the Otto-Friedrich University in Bamberg, Germany. Soar, developed under Allen Newell and John Laird at Carnegie Mellon University and the University of Michigan. Society of Mind and its successor The Emotion Machine proposed by Marvin Minsky. Subsumption architectures, developed e.g. by Rodney Brooks (though it could be argued whether they are cognitive). Games AlphaGo, software developed by Google that plays the Chinese board game Go. Chinook, a computer program that plays English draughts; the first to win the world champion title in the competition against humans. Deep Blue, a chess-playing computer developed by IBM which beat Garry Kasparov in 1997. Halite, an artificial intelligence programming competition created by Two Sigma in 2016. Libratus, a poker AI that beat world-class poker players in 2017, intended to be generalisable to other applications. The Matchbox Educable Noughts and Crosses Engine (sometimes called the Machine Educable Noughts and Crosses Engine or M
https://en.wikipedia.org/wiki/Artistic%20License
The Artistic License is an open-source license used for certain free and open-source software packages, most notably the standard implementation of the Perl programming language and most CPAN modules, which are dual-licensed under the Artistic License and the GNU General Public License (GPL). History Artistic License 1.0 The original Artistic License was written by Larry Wall. The name of the license is a reference to the concept of artistic license. Whether or not the original Artistic License is a free software license is largely unsettled. The Free Software Foundation explicitly called the original Artistic License a non-free license, criticizing it as being "too vague; some passages are too clever for their own good, and their meaning is not clear". The FSF recommended that the license not be used on its own, but approved the common AL/GPL dual-licensing approach for Perl projects. In response to this, Bradley Kuhn, who later worked for the Free Software Foundation, made a minimal redraft to clarify the ambiguous passages. This was released as the Clarified Artistic License and was approved by the FSF. It is used by the Paros Proxy, the JavaFBP toolkit and NcFTP. The terms of the Artistic License 1.0 were at issue in Jacobsen v. Katzer in the initial 2009 ruling by the United States District Court for the Northern District of California declared that FOSS-like licenses could only be enforced through contract law rather than through copyright law, in contexts where contract damages would be difficult to establish. On appeal, a federal appellate court "determined that the terms of the Artistic License are enforceable copyright conditions". The case was remanded to the District Court, which did not apply the superior court's criteria on the grounds that, in the interim, the governing Supreme Court precedent applicable to the case had changed. However, this left undisturbed the finding that a free and open-source license nonetheless has economic value. Jacobsen ultimately prevailed in 2010, and the Case established a new standard making terms and conditions under Artistic License 1.0 enforceable through copyright statutes and relevant precedents. Artistic License 2.0 In response to the Request for Comments (RFC) process for improving the licensing position for Perl 6, Kuhn's draft was extensively rewritten by Roberta Cairney and Allison Randal for readability and legal clarity, with input from the Perl community. This resulted in the Artistic License 2.0, which has been approved as both a free software and open source license. The Artistic license 2.0 is also notable for its excellent license compatibility with other FOSS licenses due to a relicensing clause, a property other licenses like the GPL lack. It has been adopted by some of the Perl 6 implementations, the Mojolicious framework, NPM, and has been used by the Parrot virtual machine since version 0.4.13. It is also used by the SNEeSe emulator, which was formerly licensed under
https://en.wikipedia.org/wiki/Amstrad%20CPC
The Amstrad CPC (short for Colour Personal Computer) is a series of 8-bit home computers produced by Amstrad between 1984 and 1990. It was designed to compete in the mid-1980s home computer market dominated by the Commodore 64 and the ZX Spectrum, where it successfully established itself primarily in the United Kingdom, France, Spain, and the German-speaking parts of Europe. The series spawned a total of six distinct models: The CPC 464, CPC 664, and CPC 6128 were highly successful competitors in the European home computer market. The later 464plus and 6128plus, intended to prolong the system's lifecycle with hardware updates, were considerably less successful, as was the attempt to repackage the plus hardware into a game console as the GX4000. The CPC models' hardware is based on the Zilog Z80A CPU, complemented with either 64 or 128 KB of RAM. Their computer-in-a-keyboard design prominently features an integrated storage device, either a compact cassette deck or 3-inch floppy disk drive. The main units were only sold bundled with either a colour, green-screen or monochrome monitor that doubles as the main unit's power supply. Additionally, a wide range of first and third-party hardware extensions such as external disk drives, printers, and memory extensions, was available. The CPC series was pitched against other home computers primarily used to play video games and enjoyed a strong supply of game software. The comparatively low price for a complete computer system with dedicated monitor, its high-resolution monochrome text and graphic capabilities and the possibility to run CP/M software also rendered the system attractive for business users, which was reflected by a wide selection of application software. During its lifetime, the CPC series sold approximately three million units. Models The original range The philosophy behind the CPC series was twofold, firstly the concept was of an "all-in-one", where the computer, keyboard and its data storage device were combined in a single unit and sold with its own dedicated display monitor. Most home computers at that time such as ZX Spectrum series, Commodore 64, and BBC Micro relied on the use of the domestic television set and a separately connected tape recorder or disk drive. In itself, the all-in-one concept was not new, having been seen before on business-oriented machines and the Commodore PET. Secondly, Amstrad founder Alan Sugar wanted the machine to resemble a "real computer, similar to what someone would see being used to check them in at the airport for their holidays", and for the machine to not look like "a pregnant calculator" – in reference presumably to the ZX81 and ZX Spectrum with their low cost, membrane-type keyboards. CPC 464 The CPC 464 was one of the most successful computers in Europe and sold more than two million units. The CPC 464 featured 64 KB RAM and an internal cassette deck. It was introduced in June 1984 in the UK. Initial suggested retail prices for the CP
https://en.wikipedia.org/wiki/Analysis%20of%20algorithms
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best, worst and average case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst case inputs to the algorithm. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of a broader computational complexity theory, which provides theoretical estimates for the resources needed by any algorithm which solves a given computational problem. These estimates provide an insight into reasonable directions of search for efficient algorithms. In theoretical analysis of algorithms it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. Big O notation, Big-omega notation and Big-theta notation are used to this end. For instance, binary search is said to run in a number of steps proportional to the logarithm of the size of the sorted list being searched, or in O(log n), colloquially "in logarithmic time". Usually asymptotic estimates are used because different implementations of the same algorithm may differ in efficiency. However, the efficiencies of any two "reasonable" implementations of a given algorithm are related by a constant multiplicative factor called a hidden constant. Exact (not asymptotic) measures of efficiency can sometimes be computed but they usually require certain assumptions concerning the particular implementation of the algorithm, called model of computation. A model of computation may be defined in terms of an abstract computer, e.g. Turing machine, and/or by postulating that certain operations are executed in unit time. For example, if the sorted list to which we apply binary search has n elements, and we can guarantee that each lookup of an element in the list can be done in unit time, then at most log2(n) + 1 time units are needed to return an answer. Cost models Time efficiency estimates depend on what we define to be a step. For the analysis to correspond usefully to the actual run-time, the time required to perform a step must be guaranteed to be bounded above by a constant. One must be careful here; for instance, some analyses count an addition of two numbers as one step. This assumption may not be warranted in certain contexts. For example, if the numbers involved in
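A minimal sketch (Python; list sizes and the "last element" probe are illustrative choices, not from the article) of the kind of step counting described above: binary search is run over sorted lists of increasing size, and the number of probes grows roughly like log2(n).

    import math

    def binary_search(sorted_list, target):
        """Return (index_or_None, probes); probes counts list lookups, the 'steps' of the cost model."""
        lo, hi = 0, len(sorted_list) - 1
        probes = 0
        while lo <= hi:
            mid = (lo + hi) // 2
            probes += 1                      # one unit-time lookup per iteration
            if sorted_list[mid] == target:
                return mid, probes
            if sorted_list[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return None, probes

    for n in (10, 1_000, 1_000_000):
        data = list(range(n))
        _, probes = binary_search(data, n - 1)   # search near the worst case: the last element
        print(f"n={n:>9}  probes={probes:>2}  log2(n)+1={math.log2(n) + 1:.1f}")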
https://en.wikipedia.org/wiki/Apple%20II
The Apple II (stylized as ) is an 8-bit home computer and one of the world's first highly successful mass-produced microcomputer products. It was designed primarily by Steve Wozniak; Jerry Manock developed the design of Apple II's foam-molded plastic case, Rod Holt developed the switching power supply, while Steve Jobs's role in the design of the computer was limited to overseeing Jerry Manock's work on the plastic case. It was introduced by Jobs and Wozniak at the 1977 West Coast Computer Faire, and marks Apple's first launch of a personal computer aimed at a consumer market—branded toward American households rather than businessmen or computer hobbyists. Byte magazine referred to the Apple II, Commodore PET 2001, and TRS-80 as the "1977 Trinity". As the Apple II had the defining feature of being able to display color graphics, the Apple logo was redesigned to have a spectrum of colors. The Apple II is the first model in the Apple II series, followed by Apple II+, Apple IIe, Apple IIc, Apple IIc Plus, and the 16-bit Apple IIGS—all of which remained compatible. Production of the last available model, Apple IIe, ceased in November 1993. History By 1976, Steve Jobs had convinced product designer Jerry Manock (who had formerly worked at Hewlett Packard designing calculators) to create the "shell" for the Apple II—a smooth case inspired by kitchen appliances that concealed the internal mechanics. The earliest Apple II computers were assembled in Silicon Valley and later in Texas; printed circuit boards were manufactured in Ireland and Singapore. The first computers went on sale on June 10, 1977 with an MOS Technology 6502 microprocessor running at 1.023 MHz ( of the NTSC color subcarrier), two game paddles (bundled until 1980, when they were found to violate FCC regulations), 4 KiB of RAM, an audio cassette interface for loading programs and storing data, and the Integer BASIC programming language built into ROMs. The video controller displayed 24 lines by 40 columns of monochrome, uppercase-only text on the screen (the original character set matches ASCII characters 20h to 5Fh), with NTSC composite video output suitable for display on a TV monitor or on a regular TV set (by way of a separate RF modulator). The original retail price of the computer with 4 KiB of RAM was and with the maximum 48 KiB of RAM it was To reflect the computer's color graphics capability, the Apple logo on the casing has rainbow stripes, which remained a part of Apple's corporate logo until early 1998. Perhaps most significantly, the Apple II was a catalyst for personal computers across many industries; it opened the doors to software marketed at consumers. Certain aspects of the system's design were influenced by Atari's arcade video game Breakout (1976), which was designed by Wozniak, who said: "A lot of features of the Apple II went in because I had designed Breakout for Atari. I had designed it in hardware. I wanted to write it in software now". This included his d
https://en.wikipedia.org/wiki/Audio%20file%20format
An audio file format is a file format for storing digital audio data on a computer system. The bit layout of the audio data (excluding metadata) is called the audio coding format and can be uncompressed, or compressed to reduce the file size, often using lossy compression. The data can be a raw bitstream in an audio coding format, but it is usually embedded in a container format or an audio data format with defined storage layer. Format types It is important to distinguish between the audio coding format, the container containing the raw audio data, and an audio codec. A codec performs the encoding and decoding of the raw audio data while this encoded data is (usually) stored in a container file. Although most audio file formats support only one type of audio coding data (created with an audio coder), a multimedia container format (as Matroska or AVI) may support multiple types of audio and video data. There are three major groups of audio file formats: Uncompressed audio formats, such as WAV, AIFF, AU or raw header-less PCM; Formats with lossless compression, such as FLAC, Monkey's Audio (filename extension .ape), WavPack (filename extension .wv), TTA, ATRAC Advanced Lossless, ALAC (filename extension .m4a), MPEG-4 SLS, MPEG-4 ALS, MPEG-4 DST, Windows Media Audio Lossless (WMA Lossless), and Shorten (SHN). Formats with lossy compression, such as Opus, MP3, Vorbis, Musepack, AAC, ATRAC and Windows Media Audio Lossy (WMA lossy). Uncompressed audio format One major uncompressed audio format, LPCM, is the same variety of PCM as used in Compact Disc Digital Audio and is the format most commonly accepted by low level audio APIs and D/A converter hardware. Although LPCM can be stored on a computer as a raw audio format, it is usually stored in a .wav file on Windows or in a .aiff file on macOS. The Audio Interchange File Format (AIFF) format is based on the Interchange File Format (IFF), and the WAV format is based on the similar Resource Interchange File Format (RIFF). WAV and AIFF are designed to store a wide variety of audio formats, lossless and lossy; they just add a small, metadata-containing header before the audio data to declare the format of the audio data, such as LPCM with a particular sample rate, bit depth, endianness and number of channels. Since WAV and AIFF are widely supported and can store LPCM, they are suitable file formats for storing and archiving an original recording. BWF (Broadcast Wave Format) is a standard audio format created by the European Broadcasting Union as a successor to WAV. Among other enhancements, BWF allows more robust metadata to be stored in the file. See European Broadcasting Union: Specification of the Broadcast Wave Format (EBU Technical document 3285, July 1997). This is the primary recording format used in many professional audio workstations in the television and film industry. BWF files include a standardized timestamp reference which allows for easy synchronization with a separate picture elem
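As an illustration of the container-versus-coding distinction above, the following sketch (Python standard library only; the file path is a placeholder) reads the header of a .wav file: the sample rate, bit depth and channel count of the contained LPCM stream are declared in the small metadata header, not in the audio data itself.

    import wave

    # Placeholder path; any PCM-encoded .wav file will do.
    with wave.open("example.wav", "rb") as wav:
        channels = wav.getnchannels()        # e.g. 2 for stereo
        sample_width = wav.getsampwidth()    # bytes per sample, e.g. 2 -> 16-bit
        frame_rate = wav.getframerate()      # samples per second, e.g. 44100
        frames = wav.getnframes()

    duration = frames / frame_rate
    print(f"{channels} ch, {8 * sample_width}-bit, {frame_rate} Hz, {duration:.2f} s of LPCM")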
https://en.wikipedia.org/wiki/Amdahl%27s%20law
In computer architecture, Amdahl's law (or Amdahl's argument) is a formula which gives the theoretical speedup in latency of the execution of a task at fixed workload that can be expected of a system whose resources are improved. It states that "the overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used". It is named after computer scientist Gene Amdahl, and was presented at the American Federation of Information Processing Societies (AFIPS) Spring Joint Computer Conference in 1967. Amdahl's law is often used in parallel computing to predict the theoretical speedup when using multiple processors. For example, if a program needs 20 hours to complete using a single thread, but a one-hour portion of the program cannot be parallelized, therefore only the remaining 19 hours' (p = 0.95) execution time can be parallelized, then regardless of how many threads are devoted to a parallelized execution of this program, the minimum execution time is always more than 1 hour. Hence, the theoretical speedup is less than 20 times the single thread performance (1 / (1 − p) = 20). Definition Amdahl's law can be formulated in the following way: Slatency(s) = 1 / ((1 − p) + p/s), where Slatency is the theoretical speedup of the execution of the whole task; s is the speedup of the part of the task that benefits from improved system resources; p is the proportion of execution time that the part benefiting from improved resources originally occupied. Furthermore, the formula shows that the theoretical speedup of the execution of the whole task increases with the improvement of the resources of the system and that regardless of the magnitude of the improvement, the theoretical speedup is always limited by the part of the task that cannot benefit from the improvement. Amdahl's law applies only to the cases where the problem size is fixed. In practice, as more computing resources become available, they tend to get used on larger problems (larger datasets), and the time spent in the parallelizable part often grows much faster than the inherently serial work. In this case, Gustafson's law gives a less pessimistic and more realistic assessment of the parallel performance. Derivation A task executed by a system whose resources are improved compared to an initial similar system can be split up into two parts: a part that does not benefit from the improvement of the resources of the system; a part that benefits from the improvement of the resources of the system. An example is a computer program that processes files. A part of that program may scan the directory of the disk and create a list of files internally in memory. After that, another part of the program passes each file to a separate thread for processing. The part that scans the directory and creates the file list cannot be sped up on a parallel computer, but the part that processes the files can. The execution time of the whole task before the improvement of the resourc
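A short sketch (Python) of the formula above, re-checking the 20-hour example: with p = 0.95 the whole-task speedup approaches, but never reaches, 20, no matter how many threads are used.

    def amdahl_speedup(p, s):
        """Theoretical whole-task speedup: p = parallelizable fraction, s = speedup of that part."""
        return 1.0 / ((1.0 - p) + p / s)

    p = 19 / 20            # 19 of the 20 hours can be parallelized
    for threads in (2, 8, 64, 1024, 1_000_000):
        print(f"{threads:>9} threads -> speedup {amdahl_speedup(p, threads):6.2f}x")
    print(f"upper limit as s grows without bound: {1 / (1 - p):.0f}x")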
https://en.wikipedia.org/wiki/Abstract%20data%20type
In computer science, an abstract data type (ADT) is a mathematical model for data types, defined by its behavior (semantics) from the point of view of a user of the data, specifically in terms of possible values, possible operations on data of this type, and the behavior of these operations. This mathematical model contrasts with data structures, which are concrete representations of data, and are the point of view of an implementer, not a user. Formally, an ADT may be defined as a "class of objects whose logical behavior is defined by a set of values and a set of operations"; this is analogous to an algebraic structure in mathematics. What is meant by "behaviour" varies by author, with the two main types of formal specifications for behavior being axiomatic (algebraic) specification and an abstract model; these correspond to axiomatic semantics and operational semantics of an abstract machine, respectively. Some authors also include the computational complexity ("cost"), both in terms of time (for computing operations) and space (for representing values). In practice, many common data types are not ADTs, as the abstraction is not perfect, and users must be aware of issues like arithmetic overflow that are due to the representation. For example, integers are often stored as fixed-width values (32-bit or 64-bit binary numbers), and thus experience integer overflow if the maximum value is exceeded. ADTs are a theoretical concept, in computer science, used in the design and analysis of algorithms, data structures, and software systems, and do not correspond to specific features of computer languages—mainstream computer languages do not directly support formally specified ADTs. However, various language features correspond to certain aspects of ADTs, and are easily confused with ADTs proper; these include abstract types, opaque data types, protocols, and design by contract. ADTs were first proposed by Barbara Liskov and Stephen N. Zilles in 1974, as part of the development of the CLU language. Discussion For example, integers are an ADT, defined as the values ..., −2, −1, 0, 1, 2, ..., and by the operations of addition, subtraction, multiplication, and division, together with greater than, less than, etc., which behave according to familiar mathematics (with care for integer division), independently of how the integers are represented by the computer. Explicitly, "behavior" includes obeying various axioms (associativity and commutativity of addition, etc.), and preconditions on operations (cannot divide by zero). Typically integers are represented in a data structure as binary numbers, most often as two's complement, but might be binary-coded decimal or in ones' complement, but for most purposes, the user can work with the abstraction rather than the concrete choice of representation, and can simply use the data as if the type were truly abstract. An ADT consists not only of operations but also of a domain of values and of constraints on the def
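A small sketch (Python; the class names are illustrative, not from any particular text) of the user/implementer split described above: the Stack ADT is defined only by its operations and their expected behavior, while two interchangeable data structures realize it, and client code never depends on which one is used.

    from abc import ABC, abstractmethod

    class Stack(ABC):
        """The ADT: a set of operations obeying the usual axioms, e.g. pop after push(x) yields x."""
        @abstractmethod
        def push(self, item): ...
        @abstractmethod
        def pop(self): ...
        @abstractmethod
        def is_empty(self) -> bool: ...

    class ListStack(Stack):
        """Implementation 1: backed by a Python list."""
        def __init__(self): self._items = []
        def push(self, item): self._items.append(item)
        def pop(self): return self._items.pop()
        def is_empty(self): return not self._items

    class LinkedStack(Stack):
        """Implementation 2: backed by a chain of (value, rest) pairs."""
        def __init__(self): self._head = None
        def push(self, item): self._head = (item, self._head)
        def pop(self):
            item, self._head = self._head
            return item
        def is_empty(self): return self._head is None

    for stack in (ListStack(), LinkedStack()):    # client code sees only the ADT
        stack.push(1); stack.push(2)
        assert stack.pop() == 2 and stack.pop() == 1 and stack.is_empty()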
https://en.wikipedia.org/wiki/Accelerated%20Graphics%20Port
Accelerated Graphics Port (AGP) is a parallel expansion card standard, designed for attaching a video card to a computer system to assist in the acceleration of 3D computer graphics. It was originally designed as a successor to PCI-type connections for video cards. Since 2004, AGP was progressively phased out in favor of PCI Express (PCIe), which is serial, as opposed to parallel; by mid-2008, PCI Express cards dominated the market and only a few AGP models were available, with GPU manufacturers and add-in board partners eventually dropping support for the interface in favor of PCI Express. Advantages over PCI AGP is a superset of the PCI standard, designed to overcome PCI's limitations in serving the requirements of the era's high-performance graphics cards. The primary advantage of AGP is that it doesn't share the PCI bus, providing a dedicated, point-to-point pathway between the expansion slot(s) and the motherboard chipset. The direct connection also allows for higher clock speeds. The second major change is the use of split transactions, wherein the address and data phases are separated. The card may send many address phases so the host can process them in order, avoiding any long delays caused by the bus being idle during read operations. Third, PCI bus handshaking is simplified. Unlike PCI bus transactions whose length is negotiated on a cycle-by-cycle basis using the FRAME# and STOP# signals, AGP transfers are always a multiple of 8 bytes long, with the total length included in the request. Further, rather than using the IRDY# and TRDY# signals for each word, data is transferred in blocks of four clock cycles (32 words at AGP 8× speed), and pauses are allowed only between blocks. Finally, AGP allows (mandatory only in AGP 3.0) sideband addressing, meaning that the address and data buses are separated so the address phase does not use the main address/data (AD) lines at all. This is done by adding an extra 8-bit "SideBand Address" bus over which the graphics controller can issue new AGP requests while other AGP data is flowing over the main 32 address/data (AD) lines. This results in improved overall AGP data throughput. This great improvement in memory read performance makes it practical for an AGP card to read textures directly from system RAM, while a PCI graphics card must copy it from system RAM to the card's video memory. System memory is made available using the graphics address remapping table (GART), which apportions main memory as needed for texture storage. The maximum amount of system memory available to AGP is defined as the AGP aperture. History The AGP slot first appeared on x86-compatible system boards based on Socket 7 Intel P5 Pentium and Slot 1 P6 Pentium II processors. Intel introduced AGP support with the i440LX Slot 1 chipset on August 26, 1997, and a flood of products followed from all the major system board vendors. The first Socket 7 chipsets to support AGP were the VIA Apollo VP3, SiS 5591/5592, and
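A back-of-the-envelope calculation (Python) of the peak transfer rates implied by the 32-bit AD bus described above; the roughly 66 MHz base clock is an assumption taken from the AGP specification rather than from this text.

    BASE_CLOCK_HZ = 66_666_667   # assumed AGP base clock, ~66.67 MHz
    BUS_BYTES = 4                # 32-bit address/data bus

    for multiplier in (1, 2, 4, 8):
        rate = BASE_CLOCK_HZ * BUS_BYTES * multiplier
        print(f"AGP {multiplier}x: ~{rate / 1e6:.0f} MB/s peak")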
https://en.wikipedia.org/wiki/AMD
Advanced Micro Devices, Inc., commonly abbreviated as AMD, is an American multinational semiconductor company based in Santa Clara, California, that develops computer processors and related technologies for business and consumer markets. The company was founded in 1969 by Jerry Sanders and a group of other technology professionals. AMD's early products were primarily memory chips and other components for computers. The company later expanded into the microprocessor market, competing with Intel, its main rival in the industry. In the early 2000s, AMD experienced significant growth and success, thanks in part to its strong position in the PC market and the success of its Athlon and Opteron processors. However, the company faced challenges in the late 2000s and early 2010s, as it struggled to keep up with Intel in the race to produce faster and more powerful processors. In the late 2010s, AMD regained some of its market share thanks to the success of its Ryzen processors which are now widely regarded as superior to Intel products in business applications including cloud applications. AMD's processors are used in a wide range of computing devices, including personal computers, servers, laptops, and gaming consoles. While it initially manufactured its own processors, the company later outsourced its manufacturing, a practice known as going fabless, after GlobalFoundries was spun off in 2009. AMD's main products include microprocessors, motherboard chipsets, embedded processors, graphics processors, and FPGAs for servers, workstations, personal computers, and embedded system applications. The company has also expanded into new markets, such as the data center and gaming markets, and has announced plans to enter the high-performance computing market. History First twelve years Advanced Micro Devices was formally incorporated by Jerry Sanders, along with seven of his colleagues from Fairchild Semiconductor, on May 1, 1969. Sanders, an electrical engineer who was the director of marketing at Fairchild, had, like many Fairchild executives, grown frustrated with the increasing lack of support, opportunity, and flexibility within the company. He later decided to leave to start his own semiconductor company, following the footsteps of Robert Noyce (developer of the first silicon integrated circuit at Fairchild in 1959) and Gordon Moore, who together founded the semiconductor company Intel in July 1968. In September 1969, AMD moved from its temporary location in Santa Clara to Sunnyvale, California. To immediately secure a customer base, AMD initially became a second source supplier of microchips designed by Fairchild and National Semiconductor. AMD first focused on producing logic chips. The company guaranteed quality control to United States Military Standard, an advantage in the early computer industry since unreliability in microchips was a distinct problem that customers – including computer manufacturers, the telecommunications industry, and instru
https://en.wikipedia.org/wiki/Aon%20%28company%29
Aon PLC () is a British-American professional services and management consulting firm that offers a range of risk-mitigation products. The firm also provides data and analytics services, strategy consulting through Aon Inpoint and investment banking advisory through Aon Securities. Aon has approximately 50,000 employees across 120 countries. Founded in Chicago by Patrick Ryan, Aon was created in 1982 when the Ryan Insurance Group merged with the Combined Insurance Company of America. In 1987, that company was renamed Aon from aon, a Gaelic word meaning "one". The company is globally headquartered in London with its North America operations based in Chicago at the Aon Center. Aon is listed on the New York Stock Exchange under AON with a market cap of $65 billion in April 2023. History W. Clement Stone's mother bought a small Detroit insurance agency, and in 1918 brought her son into the business. Mr. Stone sold low-cost, low-benefit accident insurance, underwriting and issuing policies on-site. The next year he founded his own agency, the Combined Registry Co. As the Great Depression began, Stone reduced his workforce and improved training. Forced by his son's respiratory illness to winter in the South, Stone moved to Arkansas and Texas. In 1939 he bought American Casualty Insurance Co. of Dallas, Texas. It was consolidated with other purchases as the Combined Insurance Co. of America in 1947. The company continued through the 1950s and 1960s, continuing to sell health and accident policies. In the 1970s, Combined expanded overseas despite being hit hard by the recession. In 1982, after 10 years of stagnation under Clement Stone Jr., the elder Stone, then 79, resumed control until the completion of a merger with Ryan Insurance Co. allowed him to transfer control to Patrick Ryan. Ryan, the son of a Ford dealer in Wisconsin and a graduate of Northwestern University, had started his company as an auto credit insurer in 1964. In 1976, the company bought the insurance brokerage units of the Esmark conglomerate. Ryan focused on insurance brokering and added more upscale insurance products. He also trimmed staff and took other cost-cutting measures, and in 1987 he changed Combined's name to Aon. In 1992, he bought Dutch insurance broker Hudig-Langeveldt. In 1995, the company sold its remaining direct life insurance holdings to General Electric to focus on consulting. Aon built a global presence through purchases. In 1997, it bought The Minet Group, as well as insurance brokerage Alexander & Alexander Services, Inc. in a deal that made Aon (temporarily) the largest insurance broker worldwide. The firm made no US buys in 1998, but doubled its employee base with purchases including Spain's largest retail insurance broker, Gil y Carvajal, and the formation of Aon Korea. Responding to industry demands, Aon announced its new fee disclosure policy in 1999, and the company reorganised to focus on buying personal line insurance firms and to integrate its ac
https://en.wikipedia.org/wiki/Analog%20computer
An analog computer or analogue computer is a type of computer that uses the continuous variation aspect of physical phenomena such as electrical, mechanical, or hydraulic quantities (analog signals) to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude (digital signals). Analog computers can have a very wide range of complexity. Slide rules and nomograms are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated. Complex mechanisms for process control and protective relays used analog computation to perform control and protective functions. Analog computers were widely used in scientific and industrial applications even after the advent of digital computers, because at the time they were typically much faster, but they started to become obsolete as early as the 1950s and 1960s, although they remained in use in some specific applications, such as aircraft flight simulators, the flight computer in aircraft, and for teaching control systems in universities. Perhaps the most relatable example of analog computers are mechanical watches where the continuous and periodic rotation of interlinked gears drives the second, minute and hour needles in the clock. More complex applications, such as aircraft flight simulators and synthetic-aperture radar, remained the domain of analog computing (and hybrid computing) well into the 1980s, since digital computers were insufficient for the task. Timeline of analog computers Precursors This is a list of examples of early computation devices considered precursors of the modern computers. Some of them may even have been dubbed 'computers' by the press, though they may fail to fit modern definitions. The Antikythera mechanism, a type of device used to determine the positions of heavenly bodies known as an orrery, was described as an early mechanical analog computer by British physicist, information scientist, and historian of science Derek J. de Solla Price. It was discovered in 1901, in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to , during the Hellenistic period. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was first described by Ptolemy in the 2nd century AD. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, P
https://en.wikipedia.org/wiki/ATM
ATM or atm often refers to: Atmosphere (unit) or atm, a unit of atmospheric pressure Automated teller machine, a cash dispenser or cash machine ATM or atm may also refer to: Computing ATM (computer), a ZX Spectrum clone developed in Moscow in 1991 Adobe Type Manager, a computer program for managing fonts Accelerated Turing machine, or Zeno machine, a model of computation used in theoretical computer science Alternating Turing machine, a model of computation used in theoretical computer science Asynchronous Transfer Mode, a telecommunications protocol used in networking ATM adaptation layer ATM Adaptation Layer 5 Media Amateur Telescope Making, a series of books by Albert Graham Ingalls ATM (2012 film), an American film ATM: Er Rak Error, a 2012 Thai film Azhagiya Tamil Magan, a 2007 Indian film "ATM" (song), a 2018 song by J. Cole from KOD People and organizations Abiding Truth Ministries, anti-LGBT organization in Springfield, Massachusetts, US Association of Teachers of Mathematics, UK Acrylic Tank Manufacturing, US aquarium manufacturer, televised in Tanked ATM FA, a football club in Malaysia A. T. M. Wilson (1906–1978), British psychiatrist African Transformation Movement, South African political party founded in 2018 The a2 Milk Company (NZX ticker symbol ATM) Science Apollo Telescope Mount, a solar observatory ATM serine/threonine kinase, a serine/threonine kinase activated by DNA damage The Airborne Topographic Mapper, a laser altimeter among the instruments used by NASA's Operation IceBridge Transportation Active traffic management, a motorway scheme on the M42 in England Air traffic management, a concept in aviation Altamira Airport, in Brazil (IATA code ATM) Azienda Trasporti Milanesi, the municipal public transport company of Milan Airlines of Tasmania (ICAO code ATM) Catalonia, Spain Autoritat del Transport Metropolità (ATM Àrea de Barcelona), in the Barcelona metropolitan area Autoritat Territorial de la Mobilitat del Camp de Tarragona (ATM Camp de Tarragona), in the Camp de Tarragona area Autoritat Territorial de la Mobilitat de l'Àrea de Girona (ATM Àrea de Girona), in the Girona area Autoritat Territorial de la Mobilitat de l'Àrea de Lleida (ATM Àrea de Lleida), in the Lleida area Other uses Actun Tunichil Muknal, a cave in Belize Anti-tank missile, a missile designed to destroy tanks Ass to mouth, a sexual act At the money, moneyness where the strike price is the same as the current spot price At-the-market offering, a type of follow-on offering of stock Automatenmarken, a variable value stamp Contracted form of Atlético Madrid, football club in Spain Common abbreviation in SMS language for "at the moment"
https://en.wikipedia.org/wiki/Asynchronous%20communication
In telecommunications, asynchronous communication is transmission of data, generally without the use of an external clock signal, where data can be transmitted intermittently rather than in a steady stream. Any timing required to recover data from the communication symbols is encoded within the symbols. The most significant aspect of asynchronous communications is that data is not transmitted at regular intervals, thus making possible variable bit rate, and that the transmitter and receiver clock generators do not have to be exactly synchronized all the time. In asynchronous transmission, data is sent one byte at a time and each byte is preceded by start and stop bits. Physical layer In asynchronous serial communication in the physical protocol layer, the data blocks are code words of a certain word length, for example octets (bytes) or ASCII characters, delimited by start bits and stop bits. A variable length space can be inserted between the code words. No bit synchronization signal is required. This is sometimes called character oriented communication. Examples include MNP2 and modems older than V.2. Data link layer and higher Asynchronous communication at the data link layer or higher protocol layers is known as statistical multiplexing, for example Asynchronous Transfer Mode (ATM). In this case, the asynchronously transferred blocks are called data packets, for example ATM cells. The opposite is circuit switched communication, which provides constant bit rate, for example ISDN and SONET/SDH. The packets may be encapsulated in a data frame, with a frame synchronization bit sequence indicating the start of the frame, and sometimes also a bit synchronization bit sequence, typically 01010101, for identification of the bit transition times. Note that at the physical layer, this is considered as synchronous serial communication. Examples of packet mode data link protocols that can be/are transferred using synchronous serial communication are the HDLC, Ethernet, PPP and USB protocols. Application layer An asynchronous communication service or application does not require a constant bit rate. Examples are file transfer, email and the World Wide Web. An example of the opposite, a synchronous communication service, is realtime streaming media, for example IP telephony, IPTV and video conferencing. Electronically mediated communication Electronically mediated communication often happens asynchronously in that the participants do not communicate concurrently. Examples include email and bulletin-board systems, where participants send or post messages at different times than they read them. The term "asynchronous communication" acquired currency in the field of online learning, where teachers and students often exchange information asynchronously instead of synchronously (that is, simultaneously), as they would in face-to-face or in telephone conversations. See also Synchronization in telecommunications Asynchronous serial communication A
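An illustrative sketch (Python; a software model only, not a driver for real hardware) of the character-oriented framing described above: each byte is wrapped in a start bit and a stop bit (8N1), and because the receiver resynchronizes on every start bit, idle gaps of arbitrary length may separate characters.

    def frame_byte(byte):
        """8N1 framing: start bit (0), 8 data bits LSB first, stop bit (1)."""
        data_bits = [(byte >> i) & 1 for i in range(8)]
        return [0] + data_bits + [1]

    def deframe(bits):
        """Recover bytes from a bit stream, skipping idle (1) bits between frames."""
        out, i = [], 0
        while i < len(bits):
            if bits[i] == 1:          # idle line between characters
                i += 1
                continue
            frame = bits[i:i + 10]    # start + 8 data + stop
            byte = sum(bit << k for k, bit in enumerate(frame[1:9]))
            out.append(byte)
            i += 10
        return bytes(out)

    line = []
    for ch in b"OK":
        line += [1, 1, 1]             # arbitrary idle time between characters
        line += frame_byte(ch)
    print(deframe(line))              # b'OK'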
https://en.wikipedia.org/wiki/Automated%20theorem%20proving
Automated theorem proving (also known as ATP or automated deduction) is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs. Automated reasoning over mathematical proof was a major impetus for the development of computer science. Logical foundations While the roots of formalised logic go back to Aristotle, the end of the 19th and early 20th centuries saw the development of modern logic and formalised mathematics. Frege's Begriffsschrift (1879) introduced both a complete propositional calculus and what is essentially modern predicate logic. His Foundations of Arithmetic, published in 1884, expressed (parts of) mathematics in formal logic. This approach was continued by Russell and Whitehead in their influential Principia Mathematica, first published 1910–1913, and with a revised second edition in 1927. Russell and Whitehead thought they could derive all mathematical truth using axioms and inference rules of formal logic, in principle opening up the process to automatisation. In 1920, Thoralf Skolem simplified a previous result by Leopold Löwenheim, leading to the Löwenheim–Skolem theorem and, in 1930, to the notion of a Herbrand universe and a Herbrand interpretation that allowed (un)satisfiability of first-order formulas (and hence the validity of a theorem) to be reduced to (potentially infinitely many) propositional satisfiability problems. In 1929, Mojżesz Presburger showed that the first-order theory of the natural numbers with addition and equality (now called Presburger arithmetic in his honor) is decidable and gave an algorithm that could determine if a given sentence in the language was true or false. However, shortly after this positive result, Kurt Gödel published On Formally Undecidable Propositions of Principia Mathematica and Related Systems (1931), showing that in any sufficiently strong axiomatic system there are true statements that cannot be proved in the system. This topic was further developed in the 1930s by Alonzo Church and Alan Turing, who on the one hand gave two independent but equivalent definitions of computability, and on the other gave concrete examples of undecidable questions. First implementations Shortly after World War II, the first general-purpose computers became available. In 1954, Martin Davis programmed Presburger's algorithm for a JOHNNIAC vacuum-tube computer at the Institute for Advanced Study in Princeton, New Jersey. According to Davis, "Its great triumph was to prove that the sum of two even numbers is even". More ambitious was the Logic Theorist in 1956, a deduction system for the propositional logic of the Principia Mathematica, developed by Allen Newell, Herbert A. Simon and J. C. Shaw. Also running on a JOHNNIAC, the Logic Theorist constructed proofs from a small set of propositional axioms and three deduction rules: modus ponens, (propositional) variable substitution, and the replacement of formulas by their definition. T
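Modern provers use far more sophisticated calculi (resolution, superposition, SAT/SMT solving), but the reduction to propositional reasoning mentioned above can be illustrated with a brute-force validity checker (Python; the formula encoding is an ad hoc choice for this sketch): a propositional formula is a theorem exactly when it evaluates to true under every truth assignment.

    from itertools import product

    def is_tautology(formula, variables):
        """Check validity by enumerating all 2^n truth assignments."""
        for values in product([False, True], repeat=len(variables)):
            env = dict(zip(variables, values))
            if not formula(env):
                return False, env          # counterexample found
        return True, None

    # Modus ponens as a formula: ((p -> q) and p) -> q, with x -> y encoded as (not x) or y
    mp = lambda env: (not ((not env["p"] or env["q"]) and env["p"])) or env["q"]
    print(is_tautology(mp, ["p", "q"]))    # (True, None): valid, hence provable

    # A non-theorem: p -> q
    print(is_tautology(lambda env: (not env["p"]) or env["q"], ["p", "q"]))
    # (False, {'p': True, 'q': False})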
https://en.wikipedia.org/wiki/Au
Au, AU, au or a.u. may refer to: Science and technology Computing .au, the internet country code for Australia Au file format, Sun Microsystems' audio format Audio Units, a system level plug-in architecture from Apple Computer Adobe Audition, a sound editor program Windows Update or Automatic Updates, in Microsoft Windows Windows 10 Anniversary Update, of August 2016a Physics and chemistry Gold, symbol Au (from Latin ), a chemical element Absorbance unit, a reporting unit in spectroscopy Atomic units, a system of units convenient for atomic physics and other fields Ångström unit, a unit of length equal to 10−10 m or 0.1 nanometre. Astronomical unit, a unit of length often used in planetary systems astronomy, an approximation for the average distance between the Earth and the Sun Arbitrary unit, a relative placeholder unit for when the actual value of a measurement is unknown or unimportant ("a.u." is deprecated, use "arb. unit" instead) Arts and entertainment Music AU (band), an experimental pop group headed by Luke Wyland Au, a 2010 release by Scottish rock band Donaldson, Moir and Paterson Au a track on Some Time in New York City by an album by John Lennon & Yoko Ono and Elephant's Memory Magazines Alternative Ulster, a Northern Irish music magazine, now called AU A&U: America's AIDS Magazine, sponsor of the Christopher Hewitt Award Literature Alternative universe (fan fiction), fiction by fan authors that deliberately alters facts of the canonical universe written about. Other media Au Co, a fairy in Vietnamese mythology Age of Ultron, a 2013 series published by Marvel Comics A.U, a Chinese media franchise and brand Organizations au (mobile phone company), a mobile phone operator in Japan African Union, a continental union Americans United for Separation of Church and State Athletic Union, the union of sports clubs in a British university Austral Líneas Aéreas (IATA code AU) Auxiliary Units, specially trained, highly secret units created by the United Kingdom government during the Second World War AGROunia, an agrarian-socialist political party in Poland Universities Asia Ajou University in Suwon, Gyeonggi, South Korea Abasyn University in Peshawar, Khyber Pakhtunkhwa, Pakistan Andhra University in Visakhapatnam, AP, India Anhui University in Hefei, Anhui, China Aletheia University in New Taipei City, Taiwan Allahabad University in Allahabad, Uttar Pradesh, India Arellano University in Philippines Assumption University (Thailand) in Thailand Abhilashi University in Himachal Pradesh, India Adesh University in Bathinda, Punjab, India. Europe Aarhus University in Aarhus, Denmark Aberystwyth University in Aberystwyth, Wales, United Kingdom Akademia Umiejętności in Kraków, Poland Arden University in Coventry, England Oceania Auckland University in New Zealand North America Adelphi University in Garden City, New York Alfred University in Alfred, New York Algoma University in Sault Ste.
https://en.wikipedia.org/wiki/Atlas%20Autocode
Atlas Autocode (AA) is a programming language developed around 1963 at the University of Manchester. A variant of the language ALGOL, it was developed by Tony Brooker and Derrick Morris for the Atlas computer. The initial AA and AB compilers were written by Jeff Rohl and Tony Brooker using the Brooker-Morris Compiler-compiler, with a later hand-coded non-CC implementation (ABC) by Jeff Rohl. The word Autocode was basically an early term for programming language. Different autocodes could vary greatly. Features AA was a block structured language that featured explicitly typed variables, subroutines, and functions. It omitted some ALGOL features such as passing parameters by name, which in ALGOL 60 means passing the memory address of a short subroutine (a thunk) to recalculate a parameter each time it is mentioned. The AA compiler could generate range-checking for array accesses, and allowed an array to have dimensions that were determined at runtime, i.e., an array could be declared as integer array Thing (i:j), where i and j were calculated values. AA high-level routines could include machine code, either to make an inner loop more efficient or to effect some operation which otherwise cannot be done easily. AA included a complex data type to represent complex numbers, partly because of pressure from the electrical engineering department, as complex numbers are used to represent the behavior of alternating current. The imaginary unit square root of -1 was represented by i, which was treated as a fixed complex constant = i. The complex data type was dropped when Atlas Autocode later evolved into the language Edinburgh IMP. IMP was an extension of AA and was used to write the Edinburgh Multiple Access System (EMAS) operating system. AA's second-greatest claim to fame (after being the progenitor of IMP and EMAS) was that it had many of the features of the original Compiler Compiler. A variant of the AA compiler included run-time support for a top-down recursive descent parser. The style of parser used in the Compiler Compiler was in use continuously at Edinburgh from the 60's until almost the year 2000. Other Autocodes were developed for the Titan computer, a prototype Atlas 2 at Cambridge, and the Ferranti Mercury. Syntax Atlas Autocode's syntax was largely similar to ALGOL, though it was influenced by the output device which the author had available, a Friden Flexowriter. Thus, it allowed symbols like ½ for .5 and the superscript 2 for to the power of 2. The Flexowriter supported overstriking and thus, AA did also: up to three characters could be overstruck as a single symbol. For example, the character set had no ↑ symbol, so exponentiation was an overstrike of | and *. The aforementioned underlining of reserved words (keywords) could also be done using overstriking. The language is described in detail in the Atlas Autocode Reference Manual. Other Flexowriter characters that were found a use in AA were: α in floating-point numbers, e.g
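Atlas Autocode itself is long obsolete, so as a rough analogue the following sketch (Python) mimics the behavior of a declaration like integer array Thing (i:j) described above: bounds computed at run time, with every subscript range-checked in the way the AA compiler could arrange.

    class CheckedArray:
        """Array with arbitrary lower/upper bounds fixed at run time and checked on every access."""
        def __init__(self, lower, upper):
            self.lower, self.upper = lower, upper
            self._data = [0] * (upper - lower + 1)

        def _index(self, i):
            if not self.lower <= i <= self.upper:
                raise IndexError(f"subscript {i} outside declared bounds {self.lower}:{self.upper}")
            return i - self.lower

        def __getitem__(self, i): return self._data[self._index(i)]
        def __setitem__(self, i, value): self._data[self._index(i)] = value

    i, j = 5, 12                  # values computed at run time, as in: integer array Thing (i:j)
    thing = CheckedArray(i, j)
    thing[5] = 99
    print(thing[5])               # 99
    thing[13] = 1                 # raises IndexError: subscript 13 outside declared bounds 5:12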
https://en.wikipedia.org/wiki/AutoCAD
AutoCAD is a 2D and 3D computer-aided design (CAD) software application for desktop, web, and mobile developed by Autodesk. It was first released in December 1982 for the CP/M and IBM PC platforms as a desktop app running on microcomputers with internal graphics controllers. Initially a DOS application, subsequent versions were later released for other platforms including Classic Mac OS (1992), Microsoft Windows (1992), web browsers (2010), iOS (2010), macOS (2010), and Android (2011). AutoCAD is a general drafting and design application used in industry by architects, project managers, engineers, graphic designers, city planners and other professionals to prepare technical drawings. After discontinuing the sale of perpetual licenses in January 2016, commercial versions of AutoCAD are licensed through a term-based subscription. History Before AutoCAD was introduced, most commercial CAD programs ran on mainframe computers or minicomputers, with each CAD operator (user) working at a separate graphics terminal. Origins AutoCAD was derived from a program that began in 1977, and then released in 1979 called Interact CAD, also referred to in early Autodesk documents as MicroCAD, which was written prior to Autodesk's (then Marinchip Software Partners) formation by Autodesk cofounder Michael Riddle. The first version by Autodesk was demonstrated at the 1982 Comdex and released that December. AutoCAD supported CP/M-80 computers. As Autodesk's flagship product, by March 1986 AutoCAD had become the most ubiquitous CAD program worldwide. The 2022 release marked the 36th major release of AutoCAD for Windows and the 12th consecutive year of AutoCAD for Mac. The native file format of AutoCAD is .dwg. This and, to a lesser extent, its interchange file format DXF, have become de facto, if proprietary, standards for CAD data interoperability, particularly for 2D drawing exchange. AutoCAD has included support for .dwf, a format developed and promoted by Autodesk, for publishing CAD data. Features Compatibility with other software ESRI ArcMap 10 permits export as AutoCAD drawing files. Civil 3D permits export as AutoCAD objects and as LandXML. Third-party file converters exist for specific formats such as Bentley MX GENIO Extension, PISTE Extension (France), ISYBAU (Germany), OKSTRA and Microdrainage (UK); also, conversion of .pdf files is feasible, however, the accuracy of the results may be unpredictable or distorted. For example, jagged edges may appear. Several vendors provide online conversions for free such as Cometdocs. Language AutoCAD and AutoCAD LT are available for English, German, French, Italian, Spanish, Japanese, Korean, Chinese Simplified, Chinese Traditional, Brazilian Portuguese, Russian, Czech, Polish and Hungarian (also through additional language packs). The extent of localization varies from full translation of the product to documentation only. The AutoCAD command set is localized as a part of the software localization. Extensions
https://en.wikipedia.org/wiki/AutoCAD%20DXF
AutoCAD DXF (Drawing Interchange Format, or Drawing Exchange Format) is a CAD data file format developed by Autodesk for enabling data interoperability between AutoCAD and other programs. DXF was introduced in December 1982 as part of AutoCAD 1.0, and was intended to provide an exact representation of the data in the AutoCAD native file format, DWG (Drawing). For many years, Autodesk did not publish specifications, making correct creation of DXF files difficult. Autodesk now publishes the incomplete DXF specifications online. Versions of AutoCAD from Release 10 (October 1988) and up support both ASCII and binary forms of DXF. Earlier versions support only ASCII. As AutoCAD has become more powerful, supporting more complex object types, DXF has become less useful. Certain object types, including ACIS solids and regions, are not documented. Other object types, including AutoCAD 2006's dynamic blocks, and all of the objects specific to the vertical market versions of AutoCAD, are partially documented, but not well enough to allow other developers to support them. For these reasons many CAD applications use the DWG format which can be licensed from Autodesk or non-natively from the Open Design Alliance. DXF files do not specify the units of measurement used for their coordinates and dimensions. Most CAD systems and many vector graphics packages support the import and export of DXF files, notably Adobe products, Inkscape & Blender. Some CAD systems use DXF as their native format, notably QCAD and LibreCAD. File structure ASCII versions of DXF can be read with any text editor. The basic organization of a DXF file is as follows: HEADER section General information about the drawing. Each parameter has a variable name and an associated value. CLASSES section Holds the information for application-defined classes whose instances appear in the BLOCKS, ENTITIES, and OBJECTS sections of the database. Generally does not provide sufficient information to allow interoperability with other programs. TABLES section This section contains definitions of named items. Application ID (APPID) table Block Record (BLOCK_RECORD) table Dimension Style (DIMSTYLE) table Layer (LAYER) table Linetype (LTYPE) table Text style (STYLE) table User Coordinate System (UCS) table View (VIEW) table Viewport configuration (VPORT) table BLOCKS section This section contains Block Definition entities describing the entities comprising each Block in the drawing. ENTITIES section This section contains the drawing entities, including any Block References. OBJECTS section Contains the data that apply to nongraphical objects, used by AutoLISP and ObjectARX applications. THUMBNAILIMAGE section Contains the preview image for the DXF file. The data format of a DXF is called a "tagged data" format, which "means that each data element in the file is preceded by an integer number that is called a group code. A group code's value indicates what type of data element follows. This value also indicates the meaning of a data element for a given object (or record) type. Virtually all user-specif
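A simplified sketch (Python; it ignores binary DXF and many details of the format) of reading the "tagged data" layout described above: an ASCII DXF file is a sequence of two-line pairs, a group code followed by a value, from which one can, for example, tally the entity types found in the ENTITIES section.

    def read_pairs(path):
        """Yield (group_code, value) pairs from an ASCII DXF file."""
        with open(path, "r", errors="replace") as f:
            lines = [line.rstrip("\r\n") for line in f]
        for code_line, value_line in zip(lines[0::2], lines[1::2]):
            yield int(code_line), value_line.strip()

    def entity_types(path):
        """Count entity types (LINE, CIRCLE, ...) in the ENTITIES section."""
        counts, in_entities = {}, False
        for code, value in read_pairs(path):
            if code == 2 and value == "ENTITIES":       # group code 2 carries the section name
                in_entities = True
            elif code == 0 and value == "ENDSEC":
                in_entities = False
            elif in_entities and code == 0:             # group code 0 starts a new entity
                counts[value] = counts.get(value, 0) + 1
        return counts

    # print(entity_types("drawing.dxf"))   # e.g. {'LINE': 120, 'CIRCLE': 8, ...}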
https://en.wikipedia.org/wiki/Parallel%20ATA
Parallel ATA (PATA), originally , also known as IDE, is a standard interface designed for IBM PC-compatible computers. It was first developed by Western Digital and Compaq in 1986 for compatible hard drives and CD or DVD drives. The connection is used for storage devices such as hard disk drives, floppy disk drives, and optical disc drives in computers. The standard is maintained by the X3/INCITS committee. It uses the underlying (ATA) and Packet Interface (ATAPI) standards. The Parallel ATA standard is the result of a long history of incremental technical development, which began with the original AT Attachment interface, developed for use in early PC AT equipment. The ATA interface itself evolved in several stages from Western Digital's original Integrated Drive Electronics (IDE) interface. As a result, many near-synonyms for ATA/ATAPI and its previous incarnations are still in common informal use, in particular Extended IDE (EIDE) and Ultra ATA (UATA). After the introduction of SATA in 2003, the original ATA was renamed to Parallel ATA, or PATA for short. Parallel ATA cables have a maximum allowable length of . Because of this limit, the technology normally appears as an internal computer storage interface. For many years, ATA provided the most common and the least expensive interface for this application. It has largely been replaced by SATA in newer systems. History and terminology The standard was originally conceived as the "AT Bus Attachment," officially called "AT Attachment" and abbreviated "ATA" because its primary feature was a direct connection to the 16-bit ISA bus introduced with the IBM PC/AT. The original ATA specifications published by the standards committees use the name "AT Attachment". The "AT" in the IBM PC/AT referred to "Advanced Technology" so ATA has also been referred to as "Advanced Technology Attachment". When a newer Serial ATA (SATA) was introduced in 2003, the original ATA was renamed to Parallel ATA, or PATA for short. Physical ATA interfaces became a standard component in all PCs, initially on host bus adapters, sometimes on a sound card but ultimately as two physical interfaces embedded in a Southbridge chip on a motherboard. Called the "primary" and "secondary" ATA interfaces, they were assigned to base addresses 0x1F0 and 0x170 on ISA bus systems. They were replaced by SATA interfaces. IDE and ATA-1 The first version of what is now called the ATA/ATAPI interface was developed by Western Digital under the name Integrated Drive Electronics (IDE). Together with Compaq Computer (the initial customer), they worked with various disk drive manufacturers to develop and ship early products with the goal of remaining software compatible with the existing IBM PC hard drive interface. The first such drives appeared internally in Compaq PCs in 1986 and were first separately offered by Conner Peripherals as the CP342 in June 1987. The term Integrated Drive Electronics refers to the fact that the drive cont
https://en.wikipedia.org/wiki/Atari%205200
The Atari 5200 SuperSystem or simply Atari 5200 is a home video game console introduced in 1982 by Atari, Inc. as a higher-end complement for the popular Atari Video Computer System. The VCS was renamed to the Atari 2600 at the time of the 5200's launch. Created to compete with Mattel's Intellivision, the 5200 wound up a direct competitor of ColecoVision shortly after its release. While the Coleco system shipped with the first home version of Nintendo's Donkey Kong, the 5200 included the 1978 arcade game Super Breakout which had already appeared on the Atari 8-bit family and Atari VCS in 1979 and 1981 respectively. The CPU and the graphics and sound hardware are almost identical to that of the Atari 8-bit computers, although software is not directly compatible between the two systems. The 5200's controllers have an analog joystick and a numeric keypad along with start, pause, and reset buttons. The 360-degree non-centering joystick was touted as offering more control than the eight-way Atari CX40 joystick of the 2600, but was a focal point for criticism. On May 21, 1984, during a press conference at which the Atari 7800 was introduced, company executives revealed that the 5200 had been discontinued after less than two years on the market. Total sales of the 5200 were reportedly in excess of 1 million units, far short of its predecessor's sales of over 30 million. Hardware Much of the technology in the Atari 8-bit family of home computer was originally developed as a second-generation games console intended to replace the Atari Video Computer System console. However, as the system was reaching completion, the personal computer revolution was starting with the release of machines like the Commodore PET, TRS-80, and Apple II. These machines had less advanced hardware than the new Atari technology, but sold for much higher prices with associated higher profit margins. Atari's management decided to enter this market, and the technology was repackaged into the Atari 400 and 800. The chipset used in these machines was created with the mindset that the VCS would likely be obsolete by 1980. Atari later decided to re-enter the games market with a design that closely matched their original 1978 specifications. In its prototype stage, the Atari 5200 was originally called the "Atari Video System X – Advanced Video Computer System", and was codenamed "Pam" after a female employee at Atari, Inc. It is also rumored that PAM actually stood for "Personal Arcade Machine", as the majority of games for the system ended up being arcade conversions. Actual working Atari Video System X machines, whose hardware is 100% identical to the Atari 5200 do exist, but are extremely rare. The initial 1982 release of the system featured four controller ports, where nearly all other systems of the day had only one or two ports. The 5200 also featured a new style of controller with an analog joystick, numeric keypad, two fire buttons on each side of the controller and game fun
https://en.wikipedia.org/wiki/Active%20Directory
Active Directory (AD) is a directory service developed by Microsoft for Windows domain networks. Windows Server operating systems include it as a set of processes and services. Originally, only centralized domain management used Active Directory. However, it ultimately became an umbrella title for various directory-based identity-related services. A domain controller is a server running the Active Directory Domain Service (AD DS) role. It authenticates and authorizes all users and computers in a Windows domain-type network, assigning and enforcing security policies for all computers and installing or updating software. For example, when a user logs into a computer part of a Windows domain, Active Directory checks the submitted username and password and determines whether the user is a system administrator or a non-admin user. Furthermore, it allows the management and storage of information, provides authentication and authorization mechanisms, and establishes a framework to deploy other related services: Certificate Services, Active Directory Federation Services, Lightweight Directory Services, and Rights Management Services. Active Directory uses Lightweight Directory Access Protocol (LDAP) versions 2 and 3, Microsoft's version of Kerberos, and DNS. Robert R. King defined it in the following way: History Like many information-technology efforts, Active Directory originated out of a democratization of design using Requests for Comments (RFCs). The Internet Engineering Task Force (IETF) oversees the RFC process and has accepted numerous RFCs initiated by widespread participants. For example, LDAP underpins Active Directory. Also, X.500 directories and the Organizational Unit preceded the Active Directory concept that uses those methods. The LDAP concept began to emerge even before the founding of Microsoft in April 1975, with RFCs as early as 1971. RFCs contributing to LDAP include RFC 1823 (on the LDAP API, August 1995), RFC 2307, RFC 3062, and RFC 4533. Microsoft previewed Active Directory in 1999, released it first with Windows 2000 Server edition, and revised it to extend functionality and improve administration in Windows Server 2003. Active Directory support was also added to Windows 95, Windows 98, and Windows NT 4.0 via patch, with some unsupported features. Additional improvements came with subsequent versions of Windows Server. In Windows Server 2008, Microsoft added further services to Active Directory, such as Active Directory Federation Services. The part of the directory in charge of managing domains, which was a core part of the operating system, was renamed Active Directory Domain Services (ADDS) and became a server role like others. "Active Directory" became the umbrella title of a broader range of directory-based services. According to Byron Hynes, everything related to identity was brought under Active Directory's banner. Active Directory Services Active Directory Services consist of multiple directory services. The b
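Since Active Directory is queried over LDAP, a typical client-side lookup can be sketched as below (Python with the third-party ldap3 package; the server name, credentials and base DN are placeholders for an imaginary example.com domain, and sAMAccountName/memberOf are standard AD schema attributes).

    from ldap3 import Server, Connection, ALL

    # Placeholder connection details for a hypothetical domain controller.
    server = Server("dc01.example.com", get_info=ALL)
    conn = Connection(server,
                      user="svc_reader@example.com",
                      password="change-me",
                      auto_bind=True)          # performs the bind (authentication) step

    # Look up a user object and the groups it belongs to.
    conn.search(search_base="dc=example,dc=com",
                search_filter="(&(objectClass=user)(sAMAccountName=jdoe))",
                attributes=["displayName", "memberOf"])

    for entry in conn.entries:
        print(entry.displayName, list(entry.memberOf))
    conn.unbind()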
https://en.wikipedia.org/wiki/Ai
AI is artificial intelligence, intellectual ability in machines and robots. Ai, AI or A.I. may also refer to: Animals Ai (chimpanzee), an individual experimental subject in Japan Ai (sloth) or the pale-throated sloth, northern Amazonian mammal species Arts, entertainment and media Works Ai (album), a 2004 release by Seraphim A.I. Artificial Intelligence, a 2001 American film A.I. Rising, a 2018 Serbian film AI: The Somnium Files, a 2019 video game American Idol, televised singing contest The American Interest, a bimonthly magazine (2005–2020) I (2015 film), an Indian Tamil film (initial title: Ai) Other uses in arts and media A.i. (band), a Californian rock–electroclash group All in (poker), wagering one's entire stake Appreciation Index, a British measure of broadcast programme approval The Art Institutes, a chain of American art schools Non-player character, in gaming (colloquially, an AI) Business , a phrase in job titles Appreciative inquiry, an organizational development method All-inclusive, a full service at a vacation resort including meals and drinks Organizations and businesses Accuracy International, a firearms manufacturer Adventure International, a video game publisher Air India, the flag carrier airline of India, based in Delhi Alitalia, the former flag carrier airline of Italy Astra International, an Indonesian automotive company Alexis I. duPont High School, Delaware, U.S. Amnesty International, a human rights organisation Appraisal Institute, an association of real estate appraisers The Art Institutes, a chain of art schools People Ai (surname), a Chinese surname Ai (given name), a given name and list of people and characters with the name King Ai of Zhou (died 441 BC) Emperor Ai of Han (27–1 BC) Emperor Ai of Jin (341–365) Emperor Ai of Tang (892–908) Ai (poet) (1947–2010), American poet Ai (singer) (born 1981), Japanese-American singer and songwriter Allen Iverson (born 1975), American retired professional basketball player ("A.I.") Andre Iguodala (born 1984), American professional basketball player ("A.I. 2.0") Places Areas Anguilla, a Caribbean territory (by ISO 3166-1 code) Appenzell Innerrhoden, a Swiss canton Cities Ai (Canaan), Biblical city United States Ai, Alabama Ai, Georgia Ai, North Carolina Ai, Ohio Landforms Religion, philosophy and mythology Ái, a Norse god Ai (Canaan), Biblical city Ai (), Sinic concepts of love from Confucianism and Buddhism , colloquially , a Greek word for 'saint' Ai Toyon, the Yakut god of light Science and technology Agricultural science Active ingredient, part of a pesticide Artificial insemination of livestock and pets, in animal breeding Air force and aviation Airborne Internet, a proposed air-to-air data network Airborne Interception radar, a Royal Air Force air-to-air system Air interdiction, an aerial military capability Attitude indicator, a flight instrument on aircraft The Internet .ai, a top-level domain
https://en.wikipedia.org/wiki/AI-complete
In the field of artificial intelligence, the most difficult problems are informally known as AI-complete or AI-hard, implying that the difficulty of these computational problems, assuming intelligence is computational, is equivalent to that of solving the central artificial intelligence problem—making computers as intelligent as people, or strong AI. To call a problem AI-complete reflects an attitude that it would not be solved by a simple specific algorithm. AI-complete problems are hypothesised to include computer vision, natural language understanding, and dealing with unexpected circumstances while solving any real-world problem. Currently, AI-complete problems cannot be solved with modern computer technology alone, but would also require human computation. This property could be useful, for example, to test for the presence of humans as CAPTCHAs aim to do, and for computer security to circumvent brute-force attacks. History The term was coined by Fanya Montalvo by analogy with NP-complete and NP-hard in complexity theory, which formally describes the most famous class of difficult problems. Early uses of the term are in Erik Mueller's 1987 PhD dissertation and in Eric Raymond's 1991 Jargon File. AI-complete problems AI-complete problems are hypothesized to include: AI peer review (composite natural language understanding, automated reasoning, automated theorem proving, formalized logic expert system) Bongard problems Computer vision (and subproblems such as object recognition) Natural language understanding (and subproblems such as text mining, machine translation, and word-sense disambiguation) Autonomous driving Dealing with unexpected circumstances while solving any real world problem, whether it's navigation or planning or even the kind of reasoning done by expert systems. Machine translation To translate accurately, a machine must be able to understand the text. It must be able to follow the author's argument, so it must have some ability to reason. It must have extensive world knowledge so that it knows what is being discussed — it must at least be familiar with all the same commonsense facts that the average human translator knows. Some of this knowledge is in the form of facts that can be explicitly represented, but some knowledge is unconscious and closely tied to the human body: for example, the machine may need to understand how an ocean makes one feel to accurately translate a specific metaphor in the text. It must also model the authors' goals, intentions, and emotional states to accurately reproduce them in a new language. In short, the machine is required to have wide variety of human intellectual skills, including reason, commonsense knowledge and the intuitions that underlie motion and manipulation, perception, and social intelligence. Machine translation, therefore, is believed to be AI-complete: it may require strong AI to be done as well as humans can do it. Software brittleness Current AI systems can s
https://en.wikipedia.org/wiki/File%20archiver
A file archiver is a computer program that combines a number of files together into one archive file, or a series of archive files, for easier transportation or storage. File archivers may employ lossless data compression in their archive formats to reduce the size of the archive. Basic archivers just take a list of files and concatenate their contents sequentially into archives. The archive files need to store metadata, at least the names and lengths of the original files, so that proper reconstruction is possible. More advanced archivers store additional metadata, such as the original timestamps, file attributes or access control lists. The process of making an archive file is called archiving or packing. Reconstructing the original files from the archive is termed unarchiving, unpacking or extracting. History An early archiver was the Multics command archive, descended from the CTSS command of the same name, which was a basic archiver and performed no compression. Multics also had a "tape_archiver" command, abbreviated ta, which was perhaps the forerunner of the Unix command tar. Unix archivers The Unix tools ar, tar, and cpio act as archivers but not compressors. Users of the Unix tools use additional compression tools, such as gzip, bzip2, or xz, to compress the archive file after packing or remove compression before unpacking the archive file. The filename extensions are successively added at each step of this process. For example, archiving a collection of files with tar and then compressing the resulting archive file with gzip results in a file with a .tar.gz extension. This approach has two goals: It follows the Unix philosophy that each program should accomplish a single task to perfection, as opposed to attempting to accomplish everything with one tool. As compression technology progresses, users may use different compression programs without having to modify or abandon their archiver. The archives use solid compression. When the files are combined, the compressor can exploit redundancy across several archived files and achieve better compression than a compressor that compresses each file individually. This approach, however, has disadvantages too: Extracting or modifying one file is difficult. Extracting one file requires decompressing an entire archive, which can be time- and space-consuming. Modifying one means the file needs to be put back into the archive and the archive recompressed. This operation requires additional time and disk space. The archive becomes damage-prone. If the area holding shared data for several files is damaged, all those files are lost. It is impossible to take advantage of redundancy between files unless the compression window is larger than the size of an individual file. For example, gzip uses DEFLATE, which typically operates with a 32768-byte window, whereas bzip2 uses a Burrows–Wheeler transform roughly 27 times bigger. xz defaults to 8 MiB but supports significantly larger windows. Windows archi
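The tar-then-compress workflow described above can be reproduced with Python's standard tarfile module, which layers gzip on top of the archiver in a single call; the file names here are only illustrative.

    import tarfile

    # Pack: concatenate files into one archive and gzip-compress it (.tar.gz).
    with tarfile.open("project.tar.gz", "w:gz") as tar:
        for path in ["notes.txt", "data.csv", "src"]:   # illustrative paths
            tar.add(path)        # records name, size, timestamps, permissions

    # Unpack: decompress the stream and extract every member. As noted above,
    # pulling out a single file still means reading through the whole compressed archive.
    with tarfile.open("project.tar.gz", "r:gz") as tar:
        tar.extractall("restored")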
https://en.wikipedia.org/wiki/AIM%20%28software%29
AIM (AOL Instant Messenger) was an instant messaging and presence computer program created by AOL, which used the proprietary OSCAR instant messaging protocol and the TOC protocol to allow registered users to communicate in real time. AIM was popular by the late 1990s, in United States and other countries, and was the leading instant messaging application in that region into the following decade. Teens and college students were known to use the messenger's away message feature to keep in touch with friends, often frequently changing their away message throughout a day or leaving a message up with one's computer left on to inform buddies of their ongoings, location, parties, thoughts, or jokes. AIM's popularity declined as AOL subscribers started decreasing and steeply towards the 2010s, as Gmail's Google Talk, SMS, and Internet social networks, like Facebook gained popularity. Its fall has often been compared with other once-popular Internet services, such as Myspace. In June 2015, AOL was acquired by Verizon Communications. In June 2017, Verizon combined AOL and Yahoo into its subsidiary Oath Inc. (now called Yahoo). The company discontinued AIM as a service on December 15, 2017. History In May 1997, AIM was released unceremoniously as a stand-alone download for Microsoft Windows. AIM was an outgrowth of "online messages" in the original platform written in PL/1 on a Stratus computer by Dave Brown. At one time, the software had the largest share of the instant messaging market in North America, especially in the United States (with 52% of the total reported ). This does not include other instant messaging software related to or developed by AOL, such as ICQ and iChat. During its heyday, its main competitors were ICQ (which AOL acquired in 1998), Yahoo! Messenger and MSN Messenger. AOL particularly had a rivalry or "chat war" with PowWow and Microsoft, starting in 1999. There were several attempts from Microsoft to simultaneously log into their own and AIM's protocol servers. AOL was unhappy about this and started blocking MSN Messenger from being able to access AIM. This led to efforts by many companies to challenge the AOL and Time Warner merger on the grounds of antitrust behaviour, leading to the formation of the OpenNet Coalition. Official mobile versions of AIM appeared as early as 2001 on Palm OS through the AOL application. Third-party applications allowed it to be used in 2002 for the Sidekick. A version for Symbian OS was announced in 2003 as were others for BlackBerry and Windows Mobile After 2012, stand-alone official AIM client software included advertisements and was available for Microsoft Windows, Windows Mobile, Classic Mac OS, macOS, Android, iOS, and BlackBerry OS. Usage decline and product sunset Around 2011, AIM started to lose popularity rapidly, partly due to the quick rise of Gmail and its built-in real-time Google Chat instant messenger integration in 2011 and because many people migrated to SMS or iMessages text m
https://en.wikipedia.org/wiki/Association%20for%20Computing%20Machinery
The Association for Computing Machinery (ACM) is a US-based international learned society for computing. It was founded in 1947 and is the world's largest scientific and educational computing society. The ACM is a non-profit professional membership group, reporting nearly 110,000 student and professional members . Its headquarters are in New York City. The ACM is an umbrella organization for academic and scholarly interests in computer science (informatics). Its motto is "Advancing Computing as a Science & Profession". History In 1947, a notice was sent to various people: On January 10, 1947, at the Symposium on Large-Scale Digital Calculating Machinery at the Harvard computation Laboratory, Professor Samuel H. Caldwell of Massachusetts Institute of Technology spoke of the need for an association of those interested in computing machinery, and of the need for communication between them. [...] After making some inquiries during May and June, we believe there is ample interest to start an informal association of many of those interested in the new machinery for computing and reasoning. Since there has to be a beginning, we are acting as a temporary committee to start such an association: E. C. Berkeley, Prudential Insurance Co. of America, Newark, N. J. R. V. D. Campbell, Raytheon Manufacturing Co., Waltham, Mass. , Bureau of Standards, Washington, D.C. H. E. Goheen, Office of Naval Research, Boston, Mass. J. W. Mauchly, Electronic Control Co., Philadelphia, Pa. T. K. Sharpless, Moore School of Elec. Eng., Philadelphia, Pa. R. Taylor, Mass. Inst. of Tech., Cambridge, Mass. C. B. Tompkins, Engineering Research Associates, Washington, D.C. The committee (except for Curtiss) had gained experience with computers during World War II: Berkeley, Campbell, and Goheen helped build Harvard Mark I under Howard H. Aiken, Mauchly and Sharpless were involved in building ENIAC, Tompkins had used "the secret Navy code-breaking machines", and Taylor had worked on Bush's Differential analyzers. The ACM was then founded in 1947 under the name Eastern Association for Computing Machinery, which was changed the following year to the Association for Computing Machinery. The ACM History Committee since 2016 has published the A.M.Turing Oral History project, the ACM Key Award Winners Video Series, and the India Industry Leaders Video project. Activities ACM is organized into over 246 local professional chapters and 38 Special Interest Groups (SIGs), through which it conducts most of its activities. Additionally, there are over 833 college and university chapters. The first student chapter was founded in 1961 at the University of Louisiana at Lafayette. Many of the SIGs, such as SIGGRAPH, SIGDA, SIGPLAN, SIGCSE and SIGCOMM, sponsor regular conferences, which have become famous as the dominant venue for presenting innovations in certain fields. The groups also publish a large number of specialized journals, magazines, and newsletters. ACM also sponsors other comput
https://en.wikipedia.org/wiki/The%20Bush%20%28Alaska%29
In Alaska, the Bush typically refers to any region of the state that is not connected to the North American road network or does not have ready access to the state's ferry system. A large proportion of Alaska Native populations live in the Bush, often depending on subsistence hunting and fishing. Geographically, the Bush comprises the Alaska North Slope; Northwest Arctic; West, including the Baldwin and Seward Peninsulas; the Yukon-Kuskokwim Delta; Southwest Alaska; Bristol Bay; Alaska Peninsula; and remote areas of the Alaska Panhandle and Interior. Some of the hub communities in the bush, which typically can be reached by larger, commercial airplanes, include Bethel, Dillingham, King Salmon, Nome, Utqiagvik, Kodiak Island, Kotzebue, and Unalaska-Dutch Harbor. Most parts of Alaska that are off the road or ferry system can be reached by small bush airplanes. Travel between smaller communities or to and from hub communities is typically accomplished by snowmobiles, boats, or ATVs. References Regions of Alaska Rural geography Decolonization
https://en.wikipedia.org/wiki/AMOS%20%28programming%20language%29
AMOS BASIC is a dialect of the BASIC programming language for the Amiga computer. Following on from the successful STOS BASIC for the Atari ST, AMOS BASIC was written for the Amiga by François Lionet with Constantin Sotiropoulos and published by Europress Software in 1990. History AMOS competed on the Amiga platform with Acid Software's Blitz BASIC. Both BASICs differed from other dialects on different platforms, in that they allowed the easy creation of fairly demanding multimedia software, with full structured code and many high-level functions to load images, animations, sounds and display them in various ways. The original AMOS was a BASIC interpreter which, whilst working fine, suffered the same disadvantages of any language being run interpretively. By all accounts, AMOS was extremely fast among interpreted languages, being speedy enough that an extension called AMOS 3D could produce playable 3D games even on plain 7 MHz 68000 Amigas. Later, an AMOS compiler was developed that further increased speed. AMOS could also run MC68000 machine code, loaded into a program's memory banks. To simplify animation of sprites, AMOS included the AMOS Animation Language (AMAL), a compiled sprite scripting language which runs independently of the main AMOS BASIC program. It was also possible to control screen and "rainbow" effects using AMAL scripts. AMAL scripts in effect created CopperLists, small routines executed by the Amiga's Agnus chip. After the original version of AMOS, Europress released a compiler (AMOS Compiler), and two other versions of the language: Easy AMOS, a simpler version for beginners, and AMOS Professional, a more advanced version with added features, such as a better integrated development environment, ARexx support, a new user interface API and new flow control constructs. Neither of these new versions was significantly more popular than the original AMOS. AMOS was used mostly to make multimedia software, video games (platformers and graphical adventures) and educational software. The language was mildly successful within the Amiga community. Its ease of use made it especially attractive to beginners. Perhaps AMOS BASIC's biggest disadvantage, stemming from its Atari ST lineage, was its incompatibility with the Amiga's operating system functions and interfaces. Instead, AMOS BASIC controlled the computer directly, which caused programs written in it to have a non-standard user interface, and also caused compatibility problems with newer versions of hardware. Today, the language has declined in popularity along with the Amiga computer for which it was written. Despite this, a small community of enthusiasts are still using it. The source code to AMOS was released around 2001 under a BSD style license by Clickteam, a company that includes the original programmer. Software Software written using AMOS BASIC includes: Miggybyte Scorched Tanks Games by Vulcan Software, amongst which was the Valhalla trilogy Amiga version of
https://en.wikipedia.org/wiki/Optical%20audio%20disc
An audio optical disc is an optical disc that stores sound information such as music or speech. It may specifically refer to: Audio CDs Compact disc (CD), an optical disc used to store digital data (700 MB storage) Compact Disc Digital Audio (CD-DA), a CD that contains PCM encoded digital audio in the original "Red Book" CD-DA format 5.1 Music Disc, an extension to the Red Book standard that uses DTS Coherent Acoustics 5.1 surround sound Compressed audio optical disc, an optical disc storing MP3s and other compressed audio files as data, rather than in the Red Book format Audio DVDs DVD, 4 GB single layer, 8 GB double layer storage DVD-Audio, a DVD that plays audio Super Audio CD (SACD), a format which competes with DVD-Audio Audio Blu-rays Blu-ray, 25 GB single layer, 50 GB double layer BD-Audio, a Blu-ray disc that is capable of audio-only playback See also CD-4 or Compatible discrete four-channel sound, a variety of quadrophonic audio for vinyl records
https://en.wikipedia.org/wiki/Adrian%20Lamo
Adrián Alfonso Lamo Atwood (February 20, 1981 – March 14, 2018) was an American threat analyst and hacker. Lamo first gained media attention for breaking into several high-profile computer networks, including those of The New York Times, Yahoo!, and Microsoft, culminating in his 2003 arrest. Lamo was best known for reporting U.S. soldier Chelsea Manning to Army criminal investigators in 2010 for leaking hundreds of thousands of sensitive U.S. government documents to WikiLeaks. Lamo died on March 14, 2018, at the age of 37. Early life and education Adrian Lamo was born in Malden, Massachusetts His father, Mario Ricardo Lamo, was Colombian. Adrian Lamo attended high schools in Bogotá and San Francisco, from which he did not graduate, but received a GED and was court-ordered to take courses at American River College, a community college in Sacramento County, California. Lamo began his hacking efforts by hacking games on the Commodore 64 and through phone phreaking. Activities and legal issues Lamo first became known for operating AOL watchdog site Inside-AOL.com. Security compromise Lamo was a grey hat hacker who viewed the rise of the World Wide Web with a mixture of excitement and alarm. He felt that others failed to see the importance of internet security in the early days of the World Wide Web. Lamo would break into corporate computer systems, but he never caused damage to the systems involved. Instead, he would offer to fix the security flaws free of charge, and if the flaw was not fixed, he would alert the media. Lamo hoped to be hired by a corporation to attempt to break into systems and test their security, a practice that came to be known as red teaming. However, by the time this practice was common, his felony conviction prevented him from being hired. In December 2001, Lamo was praised by Worldcom for helping to fortify their corporate security. In February 2002, he broke into the internal computer network of The New York Times, added his name to the internal database of expert sources, and used the paper's LexisNexis account to conduct research on high-profile subjects. The New York Times filed a complaint, and a warrant for Lamo's arrest was issued in August 2003 following a 15-month investigation by federal prosecutors in New York. At 10:15 a.m. on September 9, after spending a few days in hiding, he surrendered to the US Marshals in Sacramento, California. He re-surrendered to the FBI in New York City on September 11, and pleaded guilty to one felony count of computer crimes against Microsoft, LexisNexis, and The New York Times on January 8, 2004. In July 2004, Lamo was sentenced to two years' probation, with six months to be served in home detention, and ordered to pay $65,000 in restitution. He was convicted of compromising security at The New York Times, Microsoft, Yahoo!, and WorldCom. When challenged for a response to allegations that he was glamorizing crime for the sake of publicity, his response was: "Anything I could s
https://en.wikipedia.org/wiki/Alexey%20Pajitnov
Alexey Leonidovich Pajitnov (born April 16, 1955) is a Russian computer engineer and video game designer who is best known for creating, designing, and developing Tetris in 1985 while working at the Dorodnitsyn Computing Centre under the Academy of Sciences of the Soviet Union (now the Russian Academy of Sciences). In 1991, he moved to the United States and later became a U.S. citizen. In 1996, Pajitnov founded The Tetris Company alongside Dutch video game designer Henk Rogers. Pajitnov did not receive royalties from Tetris prior to this time, despite the game's high popularity. Early life Pajitnov was born to parents who were both writers. His father was an art critic. His mother was a journalist who wrote for both newspapers and a film magazine. It was through his parents that Pajitnov gained exposure to the arts, eventually developing a passion for cinema. He accompanied his mother to many film screenings, including the Moscow Film Festival. Pajitnov was also mathematically inclined, enjoying puzzles and problem solving. In 1967, when he was 11 years old, Pajitnov's parents divorced. For several years, he lived with his mother in a one-bedroom apartment owned by the state. The two were eventually able to move into a private apartment at 49 Gertsen Street, when Pajitnov was 17. He later went on to study applied mathematics at the Moscow Aviation Institute. Career In 1977, Pajitnov worked as a summer intern at the Soviet Academy of Sciences. Once he graduated in 1979, he accepted a job there working on speech recognition at the Academy's Dorodnitsyn Computing Centre. When the Computing Centre received new equipment, its researchers would write a small program for it in order to test its computing capabilities. According to Pajitnov, this "became [his] excuse for making games". Computer games were fascinating to him because they offered a way to bridge the gap between logic and emotion, and Pajitnov held interests in both mathematics and puzzles, as well as the psychology of computing. Searching for inspiration, Pajitnov recalled his childhood memories of playing pentominoes, a game in which the user creates pictures using its shapes. Remembering the difficulty he had in putting the pieces back into their box, Pajitnov felt inspired to create a game based on that concept. Using an Electronika 60 in the Computing Centre, he began working on what would become the first version of Tetris. Building the first prototype in two weeks, Pajitnov spent longer playtesting and adding to the game, completing it on June 6, 1985. This primitive version did not have levels or a scoring system, but Pajitnov knew he had a potentially great game, since he could not stop playing it at work. The game attracted the interest of coworkers like fellow programmer Dmitri Pevlovsky, who helped Pajitnov connect with Vadim Gerasimov, a 16-year-old intern at the Soviet Academy. Pajitnov wanted to make a color version of Tetris for the IBM Personal Computer, and enlisted
https://en.wikipedia.org/wiki/Amiga%20500
The Amiga 500, also known as the A500, was the first popular version of the Amiga home computer, "redefining the home computer market and making so-called luxury features such as multitasking and colour a standard long before Microsoft or Apple sold these to the masses". It contains the same Motorola 68000 as the Amiga 1000, as well as the same graphics and sound coprocessors, but is in a smaller case similar to that of the Commodore 128. Commodore announced the Amiga 500 at the January 1987 winter Consumer Electronics Show at the same time as the high-end Amiga 2000. It was initially available in the Netherlands in April 1987, then the rest of Europe in May. In North America and the UK it was released in October 1987 with a list price. It competed directly against models in the Atari ST line. The Amiga 500 was sold in the same retail outlets as the Commodore 64, as opposed to the computer store-only Amiga 1000. It proved to be Commodore's best-selling model, particularly in Europe. Although popular with hobbyists, arguably its most widespread use was as a gaming machine, where its graphics and sound were of significant benefit. It was followed by a revised version of the computer, the Amiga 500 Plus, and the 500 series was discontinued in 1992. Releases In mid-1988, the Amiga 500 dropped its price from £499 to £399 (https://amr.abime.net/issue_535_pages page 7), and it was later bundled with the Batman Pack in the United Kingdom (from October 1989 to September 1990) which included the games Batman, F/A-18 Interceptor, The New Zealand Story and the bitmap graphics editor Deluxe Paint 2. Also included was the Amiga video connector which allows the A500 to be used with a conventional CRT television. In November 1991, the enhanced Amiga 500 Plus replaced the 500 in some markets. It was bundled with the Cartoon Classics pack in the United Kingdom at £399, although many stores still advertised it as an 'A500'. The Amiga 500 Plus was virtually identical except for its new operating system, different 'trap-door' expansion slot and slightly different keyboard, and in mid-1992, the two were discontinued and effectively replaced by the Amiga 600. In late 1992, Commodore released the Amiga 1200, a machine closer in concept to the original Amiga 500, but with significant technical improvements. Despite this, neither the A1200 nor the A600 replicated the commercial success of its predecessor. By this time, the home market was strongly shifting to IBM PC compatibles with VGA graphics and the "low-cost" Macintosh Classic, LC, and IIsi models. Description Outwardly resembling the Commodore 128 and codenamed "Rock Lobster" during development, the Amiga 500's base houses a keyboard and a CPU in one shell, unlike the Amiga 1000. The keyboard for Amiga 500s sold in the United States contains 94 keys, including ten function keys, four cursor keys, and a number pad. All European versions of the keyboard have an additional two keys, except for the British variety,
https://en.wikipedia.org/wiki/Amiga%201000
The Amiga 1000, also known as the A1000, is the first personal computer released by Commodore International in the Amiga line. It combines the 16/32-bit Motorola 68000 CPU which was powerful by 1985 standards with one of the most advanced graphics and sound systems in its class. It runs a preemptive multitasking operating system that fits into of read-only memory and was shipped with 256 KB of RAM. The primary memory can be expanded internally with a manufacturer-supplied 256 KB module for a total of 512 KB of RAM. Using the external slot the primary memory can be expanded up to Design The A1000 has a number of characteristics that distinguish it from later Amiga models: It is the only model to feature the short-lived Amiga check-mark logo on its case, the majority of the case is elevated slightly to give a storage area for the keyboard when not in use (a "keyboard garage"), and the inside of the case is engraved with the signatures of the Amiga designers (similar to the Macintosh); including Jay Miner and the paw print of his dog Mitchy. The A1000's case was designed by Howard Stolz. As Senior Industrial Designer at Commodore, Stolz was the mechanical lead and primary interface with Sanyo in Japan, the contract manufacturer for the A1000 casing. The Amiga 1000 was manufactured in two variations: One uses the NTSC television standard and the other uses the PAL television standard. The NTSC variant was the initial model manufactured and sold in North America. The later PAL model was manufactured in Germany and sold in countries using the PAL television standard. The first NTSC systems lack the EHB video mode which is present in all later Amiga models. Because AmigaOS was rather buggy at the time of the A1000's release, the OS was not placed in ROM then. Instead, the A1000 includes a daughterboard with 256 KB of RAM, dubbed the "writable control store" (WCS), into which the core of the operating system is loaded from floppy disk (this portion of the operating system is known as the "Kickstart"). The WCS is write-protected after loading, and system resets do not require a reload of the WCS. In Europe, the WCS was often referred to as WOM (Write Once Memory), a play on the more conventional term "ROM" (read-only memory). Technical information The preproduction Amiga (which was codenamed "Velvet") released to developers in early 1985 contained of RAM with an option to expand it to Commodore later increased the system memory to due to objections by the Amiga development team. The names of the custom chips were different; Denise and Paula were called Daphne and Portia respectively. The casing of the preproduction Amiga was almost identical to the production version: the main difference being an embossed Commodore logo in the top left corner. It did not have the developer signatures. The Amiga 1000 has a Motorola 68000 CPU running at 7.15909 MHz on NTSC systems or 7.09379 MHz on PAL systems, precisely double the video color carrier frequenc
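As a quick check of those clock figures: the NTSC value is exactly twice the 3.579545 MHz NTSC colour subcarrier, while the PAL value works out to 1.6 times the 4.43361875 MHz PAL subcarrier. The sentence above is cut off, so the PAL relationship is stated here only as the usual reading, not as a quote from the article.

    2 \times 3.579545\ \mathrm{MHz} = 7.159090\ \mathrm{MHz}\ \text{(NTSC)}, \qquad 1.6 \times 4.43361875\ \mathrm{MHz} = 7.093790\ \mathrm{MHz}\ \text{(PAL)}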
https://en.wikipedia.org/wiki/Andrew%20S.%20Tanenbaum
Andrew Stuart Tanenbaum (born March 16, 1944), sometimes referred to by the handle ast, is an American-Dutch computer scientist and professor emeritus of computer science at the Vrije Universiteit Amsterdam in the Netherlands. He is the author of MINIX, a free Unix-like operating system for teaching purposes, and has written multiple computer science textbooks regarded as standard texts in the field. He regards his teaching job as his most important work. Since 2004 he has operated Electoral-vote.com, a website dedicated to analysis of polling data in federal elections in the United States. Biography Tanenbaum was born in New York City and grew up in suburban White Plains, New York, where he attended the White Plains High School. He is Jewish. His paternal grandfather was born in Khorostkiv in the Austro-Hungarian empire. He received his Bachelor of Science degree in physics from MIT in 1965 and his PhD degree in astrophysics from the University of California, Berkeley in 1971. Tanenbaum also served as a lobbyist for the Sierra Club. He moved to the Netherlands to live with his wife, who is Dutch, but he retains his United States citizenship. He taught courses on Computer Organization and Operating Systems and supervised the work of PhD candidates at the VU University Amsterdam. On July 9, 2014, he announced his retirement. Teaching Books Tanenbaum's textbooks on computer science include: Structured Computer Organization (1976) Computer Networks, co-authored with David J. Wetherall and Nickolas Feamster (1981) Operating Systems: Design and Implementation, co-authored with Albert Woodhull (1987) Modern Operating Systems (1992) Distributed Operating Systems (1994) Distributed Systems: Principles and Paradigms, co-authored with Maarten van Steen (2001) His book, Operating Systems: Design and Implementation and MINIX were Linus Torvalds' inspiration for the Linux kernel. In his autobiography Just for Fun, Torvalds describes it as "the book that launched me to new heights". His books have been translated into many languages including Arabic, Basque, Bulgarian, Chinese, Dutch, French, German, Greek, Hebrew, Hungarian, Italian, Japanese, Korean, Macedonian, Mexican Spanish, Persian, Polish, Portuguese, Romanian, Russian, Serbian, and Spanish. They have appeared in over 175 editions and are used at universities around the world. Doctoral students Tanenbaum has had a number of PhD students who themselves have gone on to become widely known computer science researchers. These include: Henri Bal, professor at the Vrije Universiteit in Amsterdam Frans Kaashoek, professor at MIT Werner Vogels, Chief Technology Officer at Amazon.com Dean of the Advanced School for Computing and Imaging In the early 1990s, the Dutch government began setting up a number of thematically oriented research schools that spanned multiple universities. These schools were intended to bring professors and PhD students from different Dutch (and later, foreign) universi
https://en.wikipedia.org/wiki/Accumulator%20%28computing%29
In a computer's central processing unit (CPU), the accumulator is a register in which intermediate arithmetic logic unit results are stored. Without a register like an accumulator, it would be necessary to write the result of each calculation (addition, multiplication, shift, etc.) to main memory, perhaps only to be read right back again for use in the next operation. Access to main memory is slower than access to a register like an accumulator because the technology used for the large main memory is slower (but cheaper) than that used for a register. Early electronic computer systems were often split into two groups, those with accumulators and those without. Modern computer systems often have multiple general-purpose registers that can operate as accumulators, and the term is no longer as common as it once was. However, to simplify their design, a number of special-purpose processors still use a single accumulator. Basic concept Mathematical operations often take place in a stepwise fashion, using the results from one operation as the input to the next. For instance, a manual calculation of a worker's weekly payroll might look something like: look up the number of hours worked from the employee's time card look up the pay rate for that employee from a table multiply the hours by the pay rate to get their basic weekly pay multiply their basic pay by a fixed percentage to account for income tax subtract that number from their basic pay to get their weekly pay after tax multiply that result by another fixed percentage to account for retirement plans subtract that number from their basic pay to get their weekly pay after all deductions A computer program carrying out the same task would follow the same basic sequence of operations, although the values being looked up would all be stored in computer memory. In early computers, the number of hours would likely be held on a punch card and the pay rate in some other form of memory, perhaps a magnetic drum. Once the multiplication is complete, the result needs to be placed somewhere. On a "drum machine" this would likely be back to the drum, an operation that takes considerable time. And then the very next operation has to read that value back in, which introduces another considerable delay. Accumulators dramatically improve performance in systems like these by providing a scratchpad area where the results of one operation can be fed to the next one for little or no performance penalty. In the example above, the basic weekly pay would be calculated and placed in the accumulator, which could then immediately be used by the income tax calculation. This removes one save and one read operation from the sequence, operations that generally took tens to hundreds of times as long as the multiplication itself. Accumulator machines An accumulator machine, also called a 1-operand machine, or a CPU with accumulator-based architecture, is a kind of CPU where, although it may have several registers,
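The weekly-payroll sequence above maps naturally onto a 1-operand style, where every arithmetic step combines the single accumulator with one memory operand and intermediate results never have to round-trip through slow main memory. The following toy model is only a sketch of that idea, with made-up hours, rates, and percentages; it does not represent any particular machine's instruction set.

    # Toy 1-operand "accumulator machine" running the payroll example above.
    memory = {"hours": 42, "rate": 11.50, "tax_pct": 0.20, "pension_pct": 0.05}
    acc = 0.0                         # the single accumulator register

    def load(name):                   # LOAD name  : acc <- memory[name]
        global acc
        acc = memory[name]

    def mul(name):                    # MUL name   : acc <- acc * memory[name]
        global acc
        acc = acc * memory[name]

    def sub(name):                    # SUB name   : acc <- acc - memory[name]
        global acc
        acc = acc - memory[name]

    def store(name):                  # STORE name : memory[name] <- acc
        memory[name] = acc

    load("hours"); mul("rate");        store("gross")      # basic weekly pay
    mul("tax_pct");                    store("tax")        # acc still holds gross, so no reload needed
    load("gross"); sub("tax");         store("after_tax")  # pay after income tax
    mul("pension_pct");                store("pension")    # retirement deduction
    load("after_tax"); sub("pension"); store("net_pay")    # pay after all deductions
    print(memory["net_pay"])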
https://en.wikipedia.org/wiki/Advanced%20Power%20Management
Advanced power management (APM) is a technical standard for power management developed by Intel and Microsoft and released in 1992 which enables an operating system running an IBM-compatible personal computer to work with the BIOS (part of the computer's firmware) to achieve power management. Revision 1.2 was the last version of the APM specification, released in 1996. ACPI is the successor to APM. Microsoft dropped support for APM in Windows Vista. The Linux kernel still mostly supports APM, though support for APM CPU idle was dropped in version 3.0. Overview APM uses a layered approach to manage devices. APM-aware applications (which include device drivers) talk to an OS-specific APM driver. This driver communicates to the APM-aware BIOS, which controls the hardware. There is the ability to opt out of APM control on a device-by-device basis, which can be used if a driver wants to communicate directly with a hardware device. Communication occurs both ways; power management events are sent from the BIOS to the APM driver, and the APM driver sends information and requests to the BIOS via function calls. In this way the APM driver is an intermediary between the BIOS and the operating system. Power management happens in two ways; through the above-mentioned function calls from the APM driver to the BIOS requesting power state changes, and automatically based on device activity. In APM 1.0 and APM 1.1, power management is almost fully controlled by the BIOS. In APM 1.2, the operating system can control PM time (e.g. suspend timeout). Power management events There are 12 power events (such as standby, suspend and resume requests, and low battery notifications), plus OEM-defined events, that can be sent from the APM BIOS to the operating system. The APM driver regularly polls for event change notifications. Power Management Events: APM functions There are 21 APM function calls defined that the APM driver can use to query power management statuses, or request power state transitions. Example function calls include letting the BIOS know about current CPU usage (the BIOS may respond to such a call by placing the CPU in a low-power state, or returning it to its full-power state), retrieving the current power state of a device, or requesting a power state change. Power states The APM specification defines system power states and device power states. System power states APM defines five power states for the computer system: Full On: The computer is powered on, and no devices are in a power saving mode. APM Enabled: The computer is powered on, and APM is controlling device power management as needed. APM Standby: Most devices are in their low-power state, the CPU is slowed or stopped, and the system state is saved. The computer can be returned to its former state quickly (in response to activity such as the user pressing a key on the keyboard). APM Suspend: Most devices are powered off, but the system state is saved. The computer can be return
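To make the layered flow above concrete, here is a small, purely illustrative Python model of an APM-style driver polling a BIOS for power-management events and requesting state transitions. The real interface is a set of BIOS function calls, and the event names and methods below are invented, so this is a conceptual sketch rather than the actual APM API.

    from enum import Enum

    class PowerState(Enum):                   # the five APM system power states
        FULL_ON = "Full On"
        APM_ENABLED = "APM Enabled"
        APM_STANDBY = "APM Standby"
        APM_SUSPEND = "APM Suspend"
        OFF = "Off"

    class FakeApmBios:
        """Stand-in for the APM BIOS; event names and methods are invented."""
        def __init__(self):
            self.state = PowerState.APM_ENABLED
            self.pending = ["standby_request", "resume", "low_battery"]
        def get_pm_event(self):               # the driver polls for notifications
            return self.pending.pop(0) if self.pending else None
        def set_power_state(self, state):     # the driver requests a transition
            self.state = state

    def driver_poll_once(bios):
        event = bios.get_pm_event()
        if event == "standby_request":
            bios.set_power_state(PowerState.APM_STANDBY)
        elif event == "resume":
            bios.set_power_state(PowerState.APM_ENABLED)
        elif event == "low_battery":
            bios.set_power_state(PowerState.APM_SUSPEND)
        return event

    bios = FakeApmBios()
    while (event := driver_poll_once(bios)) is not None:
        print(f"{event:16} -> {bios.state.value}")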
https://en.wikipedia.org/wiki/ANSI%20C
ANSI C, ISO C, and Standard C are successive standards for the C programming language published by the American National Standards Institute (ANSI) and ISO/IEC JTC 1/SC 22/WG 14 of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). Historically, the names referred specifically to the original and best-supported version of the standard (known as C89 or C90). Software developers writing in C are encouraged to conform to the standards, as doing so helps portability between compilers. History and outlook The first standard for C was published by ANSI. Although this document was subsequently adopted by ISO/IEC and subsequent revisions published by ISO/IEC have been adopted by ANSI, "ANSI C" is still used to refer to the standard. While some software developers use the term ISO C, others are standards-body neutral and use Standard C. Informal specification: K&R C (C78) The language was informally specified in 1978 by the Brian Kernighan and Dennis Ritchie book The C Programming Language. Standardizing C In 1983, the American National Standards Institute formed a committee, X3J11, to establish a standard specification of C. In 1985, the first Standard Draft was released, sometimes referred to as C85. In 1986, another Draft Standard was released, sometimes referred to as C86. The prerelease Standard C was published in 1988, and sometimes referred to as C88. C89 The ANSI standard was completed in 1989 and ratified as ANSI X3.159-1989 "Programming Language C." This version of the language is often referred to as "ANSI C". Later, the label "C89" was sometimes used to distinguish it from C90, following the same labeling method. C90 The same standard as C89 was ratified by ISO/IEC as ISO/IEC 9899:1990, with only formatting changes, which is sometimes referred to as C90. Therefore, the terms "C89" and "C90" refer to essentially the same language. This standard has been withdrawn by both ANSI/INCITS and ISO/IEC. C95 In 1995, the ISO/IEC published an extension, called Amendment 1, for the ANSI-C standard. Its full name was ISO/IEC 9899:1990/AMD1:1995, commonly nicknamed C95. Aside from error correction there were further changes to the language capabilities, such as: Improved multi-byte and wide character support in the standard library, introducing <wchar.h> and <wctype.h> as well as multi-byte I/O Addition of digraphs to the language Specification of standard macros for the alternative spelling of operators, e.g. "and" for && Specification of the standard macro __STDC_VERSION__ In addition to the amendment, two technical corrigenda were published by ISO for C90: ISO/IEC 9899:1990/Cor 1:1994 TCOR1 in 1994 ISO/IEC 9899:1990/Cor 2:1996 in 1996 Preprocessor test for C95 compatibility #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199409L /* C95 compatible source code. */ #elif defined(__STDC__) /* C89 compatible source code. */ #endif C99 In March 2000, ANSI adopted the ISO/IEC 9899:1999 standard
https://en.wikipedia.org/wiki/Bit
The bit is the most basic unit of information in computing and digital communications. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either , but other representations such as true/false, yes/no, on/off, or +/− are also widely used. The relation between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. It may be physically implemented with a two-state device. A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight bits is called one byte, but historically the size of the byte is not strictly defined. Frequently, half, full, double and quadruple words consist of a number of bytes which is a low power of two. A string of four bits is a nibble. In information theory, one bit is the information entropy of a random binary variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. As a unit of information, the bit is also known as a shannon, named after Claude E. Shannon. The symbol for the binary digit is either "bit", per the IEC 80000-13:2008 standard, or the lowercase character "b", per the IEEE 1541-2002 standard. Use of the latter may create confusion with the capital "B" which is the international standard symbol for the byte. History The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semyon Korsakov, Charles Babbage, Herman Hollerith, and early computer manufacturers like IBM. A variant of that idea was the perforated paper tape. In all those systems, the medium (card or tape) conceptually carried an array of hole positions; each position could be either punched through or not, thus carrying one bit of information. The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870). Ralph Hartley suggested the use of a logarithmic measure of information in 1928. Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication". He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary information digit" to simply "bit". Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punched cards used in the mechanical computers of that time. The first programmable computer, built by Konrad Zuse, used binary notation for numbers. Physical representation A bit can be stored by a digital device or other physical system that exists in either of two possible distinct states. The
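A short, illustrative look at bits, nibbles, and bytes using Python's integer bit operations; the flag value below is arbitrary.

    # One bit has two values; n bits can represent 2**n distinct values.
    flags = 0b10110010                 # an 8-bit string (one byte) written in binary

    bit1 = (flags >> 1) & 1            # read bit 1 (counting from the least significant) -> 1
    set0 = flags | (1 << 0)            # set bit 0    -> 0b10110011
    clr7 = flags & ~(1 << 7)           # clear bit 7  -> 0b00110010

    low_nibble = flags & 0x0F          # a nibble is a 4-bit group -> 0b0010
    high_nibble = (flags >> 4) & 0x0F  #                           -> 0b1011
    print(format(flags, "08b"), bit1, format(set0, "08b"), format(clr7, "08b"))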
https://en.wikipedia.org/wiki/Byte
The byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer and for this reason it is the smallest addressable unit of memory in many computer architectures. To disambiguate arbitrarily sized bytes from the common 8-bit definition, network protocol documents such as the Internet Protocol () refer to an 8-bit byte as an octet. Those bits in an octet are usually counted with numbering from 0 to 7 or 7 to 0 depending on the bit endianness. The first bit is number 0, making the eighth bit number 7. The size of the byte has historically been hardware-dependent and no definitive standards existed that mandated the size. Sizes from 1 to 48 bits have been used. The six-bit character code was an often-used implementation in early encoding systems, and computers using six-bit and nine-bit bytes were common in the 1960s. These systems often had memory words of 12, 18, 24, 30, 36, 48, or 60 bits, corresponding to 2, 3, 4, 5, 6, 8, or 10 six-bit bytes. In this era, bit groupings in the instruction stream were often referred to as syllables or slab, before the term byte became common. The modern de facto standard of eight bits, as documented in ISO/IEC 2382-1:1993, is a convenient power of two permitting the binary-encoded values 0 through 255 for one byte, as 2 to the power of 8 is 256. The international standard IEC 80000-13 codified this common meaning. Many types of applications use information representable in eight or fewer bits and processor designers commonly optimize for this usage. The popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the 8-bit byte. Modern architectures typically use 32- or 64-bit words, built of four or eight bytes, respectively. The unit symbol for the byte was designated as the upper-case letter B by the International Electrotechnical Commission (IEC) and Institute of Electrical and Electronics Engineers (IEEE). Internationally, the unit octet, symbol o, explicitly defines a sequence of eight bits, eliminating the potential ambiguity of the term "byte". Etymology and history The term byte was coined by Werner Buchholz in June 1956, during the early design phase for the IBM Stretch computer, which had addressing to the bit and variable field length (VFL) instructions with a byte size encoded in the instruction. It is a deliberate respelling of bite to avoid accidental mutation to bit. Another origin of byte for bit groups smaller than a computer's word size, and in particular groups of four bits, is on record by Louis G. Dooley, who claimed he coined the term while working with Jules Schwartz and Dick Beeler on an air defense system called SAGE at MIT Lincoln Laboratory in 1956 or 1957, which was jointly developed by Rand, MIT, and IBM. Later on, Schwartz's language JOVIAL actually used the term, but the author recalled vaguely that it was derived from AN
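To illustrate the 8-bit byte and how larger words are built from bytes, here is a small Python example; the 32-bit value is arbitrary, and the order labels refer to byte (not bit) endianness.

    # An 8-bit byte (octet) can hold 2**8 = 256 values, 0 through 255.
    assert 2 ** 8 == 256

    word = 0x12345678                        # a 32-bit word = four bytes
    little = word.to_bytes(4, "little")      # b'\x78\x56\x34\x12'
    big = word.to_bytes(4, "big")            # b'\x12\x34\x56\x78'
    assert int.from_bytes(little, "little") == int.from_bytes(big, "big") == word

    octet = big[0]                           # a single byte as an integer: 0x12
    bit7 = (octet >> 7) & 1                  # its most significant bit (bit 7) -> 0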
https://en.wikipedia.org/wiki/Bulletin%20board%20system
A bulletin board system (BBS), also called a computer bulletin board service (CBBS), is a computer server running software that allows users to connect to the system using a terminal program. Once logged in, the user can perform functions such as uploading and downloading software and data, reading news and bulletins, and exchanging messages with other users through public message boards and sometimes via direct chatting. In the early 1980s, message networks such as FidoNet were developed to provide services such as NetMail, which is similar to internet-based email. Many BBSes also offer online games in which users can compete with each other. BBSes with multiple phone lines often provide chat rooms, allowing users to interact with each other. Bulletin board systems were in many ways a precursor to the modern form of the World Wide Web, social networks, and other aspects of the Internet. Low-cost, high-performance asynchronous modems drove the use of online services and BBSes through the early 1990s. InfoWorld estimated that there were 60,000 BBSes serving 17 million users in the United States alone in 1994, a collective market much larger than major online services such as CompuServe. The introduction of inexpensive dial-up internet service and the Mosaic web browser offered ease of use and global access that BBS and online systems did not provide, and led to a rapid crash in the market starting in late 1994 to early 1995. Over the next year, many of the leading BBS software providers went bankrupt and tens of thousands of BBSes disappeared. Today, BBSing survives largely as a nostalgic hobby in most parts of the world, but it is still an extremely popular form of communication for Taiwanese youth (see PTT Bulletin Board System). Most surviving BBSes are accessible over Telnet and typically offer free email accounts, FTP services, IRC and all the protocols commonly used on the Internet. Some offer access through packet switched networks or packet radio connections. History Precursors A precursor to the public bulletin board system was Community Memory, started in August 1973 in Berkeley, California. Useful microcomputers did not exist at that time, and modems were both expensive and slow. Community Memory therefore ran on a mainframe computer and was accessed through terminals located in several San Francisco Bay Area neighborhoods. The poor quality of the original modem connecting the terminals to the mainframe prompted Community Memory hardware person, Lee Felsenstein, to invent the Pennywhistle modem, whose design was highly influential in the mid-1970s. Community Memory allowed the user to type messages into a computer terminal after inserting a coin, and offered a "pure" bulletin board experience with public messages only (no email or other features). It did offer the ability to tag messages with keywords, which the user could use in searches. The system acted primarily in the form of a buy and sell system with the tags taking the pl
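Since the article notes that most surviving BBSes are reached over Telnet, here is a minimal, hedged sketch of opening such a connection and reading the welcome banner with Python's socket module; the host name is a placeholder, and real boards vary in encoding and login flow.

    import socket

    HOST, PORT = "bbs.example.net", 23       # placeholder address; 23 is the Telnet port
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        banner = sock.recv(4096)             # the board's ANSI/ASCII welcome screen
        print(banner.decode("cp437", errors="replace"))   # many boards use CP437 art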
https://en.wikipedia.org/wiki/Telecommunications%20in%20Belarus
Telecommunications in Belarus involves the availability and use of electronic devices and services, such as the telephone, television, radio or computer, for the purpose of communication. Telephone system Telephone lines in use: 3,9741 million (2011). Mobile/cellular: 11,559,473 subscribers (Q1 2019). The phone calling code for Belarus is +375. The Ministry of Telecommunications controls all telecommunications originating within the country through its carrier unitary enterprise, Beltelecom. Minsk has a digital metropolitan network; waiting lists for telephones are long; fixed line penetration is improving although rural areas continue to be underserved; intercity – Belarus has developed a fibre-optic backbone system presently serving at least 13 major cities (1998). Belarus's fibre optics form synchronous digital hierarchy rings through other countries' systems. International connection Belarus is a member of the Trans-European Line (TEL), Trans-Asia-Europe Fibre-Optic Line (TAE) and has access to the Trans-Siberia Line (TSL); three fibre-optic segments provide connectivity to Latvia, Poland, Russia, and Ukraine; worldwide service is available to Belarus through this infrastructure; Intelsat, Eutelsat, and Intersputnik earth stations. In 2006 it was announced that Belarus and Russia completed the second broadband link between the two countries, the Yartsevo-Vitebsk cable. The capacity of this high speed terrestrial link which based on DWDM and STM technology is 400Gbit/s with the ability to upgrade in the future. Cellular communications Belarus has 3 GSM/UMTS operators – A1, MTS, life:). For 4G data operators use the infrastructure managed by state operator beCloud, VoLTE service currently is offered only with A1. Radio and television Television broadcast stations: 100 of which 59 are privately owned. Belarus has switched from an analog to digital broadcast television. The process finished in May 2015. Belarus broadcasts according to the DVB-T2 standard with MPEG-4 compression. Radio broadcast stations: 173 with 24 privately owned, including 30 FM stations. Radios: 3.02 million (1997). Internet Country code: .by The state telecom monopoly, Beltelecom, holds the exclusive interconnection with Internet providers outside of Belarus. Beltelecom owns all the backbone channels that linked to the Lattelecom, TEO LT, Tata Communications (former Teleglobe), Synterra, Rostelecom, Transtelekom and MTS ISP's. Beltelecom is the only operator licensed to provide commercial VoIP services in Belarus. Until 2005–2006 broadband access (mostly using ADSL) was available only in a few major cities in Belarus. In Minsk there were a dozen privately owned ISP's and in some larger cities Beltelecom's broadband was available. Outside these cities the only options for Internet access were dial-up from Beltelecom or GPRS/cdma2000 from mobile operators. In 2006 Beltelecom introduced a new trademark, Byfly, for its ADSL access. As of 2008 Byfly was avail
https://en.wikipedia.org/wiki/Transport%20in%20Belgium
Transport in Belgium is facilitated with well-developed road, air, rail and water networks. The rail network has of electrified tracks. There are of roads, among which there are of motorways, of main roads and of other paved roads. There is also a well-developed urban rail network in Brussels, Antwerp, Ghent and Charleroi. The ports of Antwerp and Bruges-Zeebrugge are two of the biggest seaports in Europe. Brussels Airport is Belgium's biggest airport. Railways Rail transport in Belgium was historically managed by the National Railway Company of Belgium, known as SNCB in French and NMBS in Dutch. In 2005, the public company was split into 2 companies: Infrabel, which manages the rail network and SNCB/NMBS itself, which manages the freight and passenger services. There is a total of , ( double track (as of 1998)), of which are electrified, mainly at 3,000 volts DC but with at 25 kV 50 Hz AC (2004) and all on standard gauge of . In 2004 the National Railway Company of Belgium, carried 178.4 million passengers a total of 8,676 million passenger-kilometres. Due to the high population density, operations are relatively profitable, so tickets are cheap and the frequency of services is high. The SNCB/NMBS is continually updating its rolling stock. The network currently includes four high speed lines, three operating up to , and one up to . HSL 1 runs from just south of Brussels to the French border, where it continues to Lille, and from there to Paris or London. HSL 2 runs from Leuven to Liège. HSL 3 continues this route from Liège to the German border near Aachen. HSL 4 runs from Antwerp to Rotterdam by meeting HSL-Zuid at the border with Netherlands. Electrification is at 3 kV DC, with the exception of the new high-speed lines, and of two recently electrified lines in the south of the country which are at 25 kV AC. Trains, contrary to tram and road traffic, run on the left. Rail links with adjacent countries France — voltage change 3 kV DC – 25 kV AC LGV 1 — voltage remains at 25 kV AC. via France to the UK on HSL 1, LGV 1, Channel Tunnel and CTRL (Channel Tunnel Rail Link) — voltage remains at 25 kV AC. Germany — voltage change 3 kV DC – 15 kV AC HSL 3 — voltage remains at 25 kV AC. Netherlands — voltage change 3 kV DC – 1500 V DC HSL-Zuid — voltage remains at 25 kV AC. Luxembourg — no voltage change at the border (the line Hatrival (Libramont)-Luxembourg is at 25 kV AC and the line Gouvy-Luxembourg is at 25 kV AC) Urban rail An urban commuter rail network, Brussels RER (, ), is operational in the Brussels-Capital Region and surrounding areas. Metros and light rail In Belgium an extensive system of tram-like local railways called vicinal or buurtspoor lines crossed the country in the first half of the 20th century, and had a greater route length than the main-line railway system. The only survivors of the vicinal/buurtspoor system are the Kusttram (covering almost the entire coast from France to the Netherlands, bein
https://en.wikipedia.org/wiki/Telecommunications%20in%20Botswana
Telecommunications in Botswana include newspapers, radio, television, fixed and mobile telephones, and the Internet. In addition to the government-owned newspaper and national radio network, there is an active, independent press (six weekly newspapers). Foreign publications are sold without restriction in Botswana. Two privately owned radio stations began operations in 1999. Botswana's first national television station, the government-owned Botswana Television (BTV), was launched in July 2000. It began broadcasting with three hours of programming on weekdays and five on weekends, offering news in Setswana and English, entertainment, and sports, with plans to produce 60% of its programming locally. The cellular phone providers Orange and MTN cover most of the country. Radio stations 2 state-owned national radio stations; 3 privately owned radio stations broadcast locally (2007); AM 8, FM 13, shortwave 4 (2001). Television stations One state-owned and one privately owned; privately owned satellite TV subscription service is available (2007). Television sets in use: 101,713 (2001); 98,568 (2003). 173,327 (2006) 297,233 (2008) 297,971 (2011) 365,650 (2014). Telephones Main lines in use: 160,500 lines, 134th in the world (2012); 136,900 (2006). Mobile cellular in use: 3.1 million lines, 129th in the world (2012); Telephone system general assessment: Botswana is participating in regional development efforts; expanding fully digital system with fiber-optic cables linking the major population centers in the east as well as a system of open-wire lines, microwave radio relays links, and radiotelephone communication stations (2011); domestic: fixed-line teledensity has declined in recent years and now stands at roughly 7 telephones per 100 persons; mobile-cellular teledensity now pushing 140 telephones per 100 persons (2011); international: country code - 267; international calls are made via satellite, using international direct dialing; 2 international exchanges; digital microwave radio relay links to Namibia, Zambia, Zimbabwe, and South Africa; satellite earth station - 1 Intelsat (Indian Ocean) (2011). ISDB-T Features: Supports ISDB-T broadcast (13 segments). MPEG-2/ MPEG-4 AVC/ H.264 HD/ SD video. DiVX Compatible with 480i / 480p / 720p / 1080i/ 1080p video formats. Auto and manually scan all available TV and radio channels. Aspect ratio 16:9 and 4:3. 1000 channels memory. Parental control. Teletext / Bit map subtitle. Compliant with ETSI. Supported 7 days EPG function. VBI Teletext support 6 MHz software setting Auto / Manual program search. Multi language supported. Internet Internet top-level domain: .bw Internet users: 241,272 users, 148th in the world; 11.5% of the population, 166th in the world (2012); 120,000 users, 154th in the world (2009);   80,000 users (2007). Internet broadband: 16,407 fixed broadband subscriptions, 134th in the world; 0.8% of the population, 143rd in the world; 348,124 wir
https://en.wikipedia.org/wiki/Transport%20in%20Botswana
Transportation in Botswana is provided by an extensive network of railways, highways, ferry services and air routes that criss-cross the country. The transport sector in Botswana played an important role in economic growth following its independence in 1966. The country discovered natural resources which allowed it to finance the development of infrastructure, and policy ensured that the transport sector grew at an affordable pace commensurate with demands for services. Rail transport Rail services are provided by Botswana Railways, with most routes radiating from Gaborone. Botswana has the 93rd longest railway network in the world at 888 km, and it is one of the busiest railways in Africa. The track gauge is 1,067 mm (3 ft 6 in) (cape gauge). Botswana is an associate member of the International Union of Railways (UIC). Regional trains (BR Express) Botswana Railways runs two nightly passenger trains, one from Lobatse to Francistown and the other from Francistown to Lobatse, with stops in Gaborone, Mahalapye, Palapye, and Serule. The passenger train is termed the "BR Express" (Botswana Railways). Passenger services were suspended from 2009 to 2016, with the exception of an international link to Zimbabwe from Francistown. Commuter/suburban trains In Botswana, Botswana Railways' "BR Express" runs a commuter train between Lobatse and Gaborone. The train departs Lobatse at 0530hrs and arrives at Gaborone at 0649hrs. This train returns to Lobatse in the evening, departing Gaborone at 1800hrs; arrival time at Lobatse is 1934hrs. The train stops at Otse, Ramotswa, and Commerce Park Halt. BR Express Sleeping & Dining Department From the beginning, the BR decided to operate its own sleeping cars, building bigger berths and more comfortable surroundings. Providing and operating its own cars allowed better control of services and revenue. While food was served to passengers, the profits were never a result of serving the food. Those who could afford to travel great distances expected better facilities, and favorable opinions from the overall experience would attract others to Botswana and the BR's trains. Stations Freight trains Over half of BR's freight traffic is in coal, grain and intermodal freight, and it also ships automotive parts and assembled automobiles, sulphur, fertilizers, other chemicals, soda ash, forest products and other types of commodities. Locomotives Diesel locomotives As of March 2009: 8 General Electric UM 22C diesel-electric locomotives, 1982. 20 General Motors Model GT22LC-2 diesel-electric locomotives, 1986. 10 General Electric UI5C diesel-electric locomotives, 1990. 8 new gt142aces were delivered at the end of 2017. Network total: 888 km (since 2015); number of stations: 13; narrow gauge: 1,067 mm (3 ft 6 in) cape gauge. Railway links with adjacent countries Existing: South Africa (same gauge); Zimbabwe (same gauge). Currently under construction: Zambia - being built at Kazungula Bridg
https://en.wikipedia.org/wiki/Transport%20in%20Brazil
Transport infrastructure in Brazil is characterized by strong regional differences and lack of development of the national rail network. Brazil's fast-growing economy, and especially the growth in exports, will place increasing demands on the transport networks. However, sizeable new investments that are expected to address some of the issues are either planned or in progress. It is common to travel domestically by air because prices are low. Brazil has the second highest number of airports in the world, after the USA. Railways The Brazilian railway network has a total length of about . It is used mainly for transporting ores. Historically, the railway sector was treated as secondary in Brazil, owing to logistical, economic and political difficulties in building more railways. The Brazilian railroad system expanded greatly between 1875 and 1920. The heyday of rail transport was interrupted during the Getúlio Vargas government, which prioritized road transport. In the 1940s, the railway network was already facing several problems, from low-powered locomotives to uneconomical layouts. In 1957, a state-owned company, the National Railroad Network (RFFSA), was created and took over the management of 18 railroads in the Union. Several loss-making railways were closed under the promise of state investment in new projects, which did not materialize. Operations remained centralized in the government until the market was opened in 1990, when the National Privatization Plan was instituted and dozens of concessions were granted. However, these concessions ended up concentrating the railways mainly in three large business groups: América Latina Logística (ALL), Vale S.A. and MRS Logística. The refurbishment increased productivity (cargo transported increased by 30% on the same railway lines). However, the main problem was that the reform granted not only the railway lines but also geographical exclusivity, which removed competitive incentives for the expansion and renewal of the existing network. Because the State kept the opening of new railways a difficult, slow and bureaucratic process, retaining a total monopoly over the sector, the railways did not expand further in the country and the sector became very outdated. In 2021, a New Framework for Railways was created, allowing railways to be built under authorization, as is already done for infrastructure in sectors such as telecommunications, electricity and ports. It is also possible to authorize the operation of stretches that were never built, are idle, or are in the process of being returned or deactivated. With the change of rules in the sector, by December 2021 there were already requests to open of new tracks, in 64 requests for the implementation of new railways. Nine new railroads had already been authorized by the Federal Government, totalling of new tracks. Total current network: 29,888 km of railroad and 1,411 km of subway and light rail. Broad gauge:
https://en.wikipedia.org/wiki/Bjarne%20Stroustrup
Bjarne Stroustrup (; ; born 30 December 1950) is a computer scientist, most notable for the invention and development of the C++ programming language. Stroustrup served as a visiting professor of computer science at Columbia University in the City of New York beginning in 2014, where he has been a full professor since 2022. Early life and education Stroustrup was born in Aarhus, Denmark. His family was working class, and he attended local schools. He attended Aarhus University from 1969 to 1975 and graduated with a Candidatus Scientiarum in mathematics with computer science. His interests focused on microprogramming and machine architecture. He learned the fundamentals of object-oriented programming from its inventor, Kristen Nygaard, who frequently visited Aarhus. In 1979, he received his PhD in computer science from the University of Cambridge, where his research on distributed computing was supervised by David Wheeler. Career and research In 1979, Stroustrup began his career as a member of technical staff in the Computer Science Research Center of Bell Labs in Murray Hill, New Jersey, USA. There, he began his work on C++ and programming techniques. Stroustrup was the head of AT&T Bell Labs' Large-scale Programming Research department, from its creation until late 2002. In 1993, he was made a Bell Labs fellow and in 1996, an AT&T Fellow. From 2002 to 2014, Stroustrup was the College of Engineering Chair Professor in Computer Science at Texas A&M University. From 2011, he was made a University Distinguished Professor. From January 2014 to April 2022, Stroustrup was a technical fellow and managing director in the technology division of Morgan Stanley in New York City and a visiting professor in computer science at Columbia University. As of July 2022, Stroustrup is a full professor of Computer Science at Columbia University. C++ Stroustrup is best known for his work on C++. In 1979, he began developing C++ (initially called "C with Classes"). In his own words, he "invented C++, wrote its early definitions, and produced its first implementation [...] chose and formulated the design criteria for C++, designed all its major facilities, and was responsible for the processing of extension proposals in the C++ standards committee." C++ was made generally available in 1985. For non-commercial use, the source code of the compiler and the foundation libraries was the cost of shipping (US$75); this was before Internet access was common. Stroustrup also published a textbook for the language in 1985, The C++ Programming Language. The key language-technical areas of contribution of C++ are: A static type system with equal support for built-in types and user-defined types (that requires control of the construction, destruction, copying, and movement of objects; and operator overloading). Value and reference semantics. Systematic and general resource management (RAII): constructors, destructor, and exceptions relying on them. Support for effici
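The RAII idiom mentioned above can be shown with a minimal C++ sketch; the FileHandle class, the file name and the error message are illustrative assumptions, not code from the source. The constructor acquires the resource and the destructor releases it, so cleanup happens even when an exception unwinds the stack.

```cpp
#include <cstdio>
#include <stdexcept>

// Minimal RAII illustration: acquire in the constructor, release in the destructor.
class FileHandle {
public:
    explicit FileHandle(const char* path) : f_(std::fopen(path, "r")) {
        if (!f_) throw std::runtime_error("cannot open file");
    }
    ~FileHandle() { if (f_) std::fclose(f_); }      // released automatically

    // Non-copyable: the handle owns its resource exclusively.
    FileHandle(const FileHandle&) = delete;
    FileHandle& operator=(const FileHandle&) = delete;

    std::FILE* get() const { return f_; }

private:
    std::FILE* f_;
};

int main() {
    try {
        FileHandle in("example.txt");    // hypothetical file name
        // ... use in.get() ...
    } catch (const std::exception& e) {
        std::puts(e.what());
    }
}   // destructor runs here (or during stack unwinding), closing the file
```

Deleting the copy operations keeps ownership unique, which is the usual companion to RAII in modern C++ style.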
https://en.wikipedia.org/wiki/Bluetooth
Bluetooth is a short-range wireless technology standard that is used for exchanging data between fixed and mobile devices over short distances and building personal area networks (PANs). In the most widely used mode, transmission power is limited to 2.5 milliwatts, giving it a very short range of up to . It employs UHF radio waves in the ISM bands, from 2.402GHz to 2.48GHz. It is mainly used as an alternative to wire connections, to exchange files between nearby portable devices and connect cell phones and music players with wireless headphones. Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 35,000 member companies in the areas of telecommunication, computing, networking, and consumer electronics. The IEEE standardized Bluetooth as IEEE 802.15.1, but no longer maintains the standard. The Bluetooth SIG oversees development of the specification, manages the qualification program, and protects the trademarks. A manufacturer must meet Bluetooth SIG standards to market it as a Bluetooth device. A network of patents apply to the technology, which are licensed to individual qualifying devices. , 4.7 billion Bluetooth integrated circuit chips are shipped annually. Etymology The name "Bluetooth" was proposed in 1997 by Jim Kardach of Intel, one of the founders of the Bluetooth SIG. The name was inspired by a conversation with Sven Mattisson who related Scandinavian history through tales from Frans G. Bengtsson's The Long Ships, a historical novel about Vikings and the 10th-century Danish king Harald Bluetooth. Upon discovering a picture of the runestone of Harald Bluetooth in the book A History of the Vikings by Gwyn Jones, Kardach proposed Bluetooth as the codename for the short-range wireless program which is now called Bluetooth. According to Bluetooth's official website, Bluetooth is the Anglicised version of the Scandinavian Blåtand/Blåtann (or in Old Norse blátǫnn). It was the epithet of King Harald Bluetooth, who united the disparate Danish tribes into a single kingdom; Kardach chose the name to imply that Bluetooth similarly unites communication protocols. The Bluetooth logo is a bind rune merging the Younger Futhark runes  (ᚼ, Hagall) and  (ᛒ, Bjarkan), Harald's initials. History The development of the "short-link" radio technology, later named Bluetooth, was initiated in 1989 by Nils Rydbeck, CTO at Ericsson Mobile in Lund, Sweden. The purpose was to develop wireless headsets, according to two inventions by Johan Ullman, and . Nils Rydbeck tasked Tord Wingren with specifying and Dutchman Jaap Haartsen and Sven Mattisson with developing. Both were working for Ericsson in Lund. Principal design and development began in 1994 and by 1997 the team had a workable solution. From 1997 Örjan Johansson became the project leader and propelled the technology and standardization. In 1997, Adalio Sanchez, then head of IBM ThinkPad product R&D, approached Nils Rydbeck about collaborating on integrating a mo
https://en.wikipedia.org/wiki/Binary-coded%20decimal
In computing and electronic systems, binary-coded decimal (BCD) is a class of binary encodings of decimal numbers where each digit is represented by a fixed number of bits, usually four or eight. Sometimes, special bit patterns are used for a sign or other indications (e.g. error or overflow). In byte-oriented systems (i.e. most modern computers), the term unpacked BCD usually implies a full byte for each digit (often including a sign), whereas packed BCD typically encodes two digits within a single byte by taking advantage of the fact that four bits are enough to represent the range 0 to 9. The precise four-bit encoding, however, may vary for technical reasons (e.g. Excess-3). The ten states representing a BCD digit are sometimes called tetrades (the nibble typically needed to hold them is also known as a tetrade), while the unused, don't-care states are named pseudo-decimals or pseudo-decimal digits. BCD's main virtue, in comparison to binary positional systems, is its more accurate representation and rounding of decimal quantities, as well as its ease of conversion into conventional human-readable representations. Its principal drawbacks are a slight increase in the complexity of the circuits needed to implement basic arithmetic as well as slightly less dense storage. BCD was used in many early decimal computers, and is implemented in the instruction set of machines such as the IBM System/360 series and its descendants, Digital Equipment Corporation's VAX, the Burroughs B1700, and the Motorola 68000-series processors. BCD per se is not as widely used as in the past, and is unavailable or limited in newer instruction sets (e.g., ARM; x86 in long mode). However, decimal fixed-point and decimal floating-point formats are still important and continue to be used in financial, commercial, and industrial computing, where the subtle conversion and fractional rounding errors that are inherent in binary floating-point formats cannot be tolerated. Background BCD takes advantage of the fact that any one decimal numeral can be represented by a four-bit pattern. The most obvious way of encoding digits is Natural BCD (NBCD), where each decimal digit is represented by its corresponding four-bit binary value, as shown in the following table. This is also called "8421" encoding. This scheme can also be referred to as Simple Binary-Coded Decimal (SBCD) or BCD 8421, and is the most common encoding. Others include the so-called "4221" and "7421" encoding – named after the weighting used for the bits – and "Excess-3". For example, the BCD digit 6 is 0110 in 8421 notation, 1100 or 1010 in 4221 (two encodings are possible), and 0110 in 7421, while in Excess-3 it is 1001 (6 + 3 = 9). The following table represents decimal digits from 0 to 9 in various BCD encoding systems. In the headers, the "8421" indicates the weight of each bit. In the fifth column ("BCD 84−2−1"), two of the weights are negative. Both ASCII and EBCDIC character codes for the digits, which are examples of zoned BCD, are also
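As a concrete illustration of the packed BCD scheme described above, the following C++ sketch (the function names are my own, not from the source) packs two decimal digits into one byte using the plain 8421 encoding, one digit per nibble, and unpacks them again.

```cpp
#include <cstdint>
#include <iostream>

// Packed BCD (8421/NBCD): two decimal digits per byte, one per nibble.
// pack_bcd(42) -> 0x42; unpack_bcd(0x42) -> 42.
std::uint8_t pack_bcd(unsigned value) {            // value must be in 0..99
    return static_cast<std::uint8_t>(((value / 10) << 4) | (value % 10));
}

unsigned unpack_bcd(std::uint8_t bcd) {
    return (bcd >> 4) * 10u + (bcd & 0x0F);
}

int main() {
    const unsigned samples[] = {7, 42, 99};
    for (unsigned v : samples) {
        std::uint8_t b = pack_bcd(v);
        std::cout << v << " -> 0x" << std::hex << unsigned(b)
                  << std::dec << " -> " << unpack_bcd(b) << '\n';
    }
}
```

Because each nibble holds only 0 to 9, the hexadecimal rendering of a packed BCD byte reads the same as the decimal value it encodes, which is part of why BCD converts so easily to human-readable form.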
https://en.wikipedia.org/wiki/BCD
BCD may refer to: Computing Binary-coded decimal, a representation of decimal digits in binary BCD (character encoding), a 6-bit superset of binary-coded decimal derived from the binary encoding of the same name Boot Configuration Data, the configuration data required to boot Microsoft Windows Vista and later Bipolar-CMOS-DMOS, a type of BiCMOS semiconductor technology Organisations Basnahira Cricket Dundee, a Sri Lankan cricket team BCD Tofu House, a Los Angeles-based Korean restaurant chain BCD Travel, a provider of global corporate travel management Belarusian Christian Democracy, a Christian-democratic political party in Belarus Berkshire Country Day School, an independent school in Lenox, Massachusetts, US Bid Closing Date, the date (and usually the time) at which a bid is closed to further submissions, after which only proposals already submitted are considered eligible The British Columbia Dragoons, a Canadian Forces armoured regiment Places Bacolod–Silay International Airport (IATA code), Silay City, Philippines Beirut Central District, Beirut, Lebanon Other uses Bad conduct discharge, a form of discharge from US military service, sometimes referred to colloquially as a "big chicken dinner" Barrels per calendar day, a unit for measuring output of oil refineries Blue compact dwarf galaxy, a small galaxy which contains large clusters of young, hot, massive stars Board-certified diplomate, in the list of credentials in psychology Buoyancy control device, in scuba diving Bolt circle diameter, for example of a crankset, of a bicycle disc brake, or in wheel sizing "Behind closed doors", a marketing term for previewing a product to a select audience; also used in the sugar baby/sugar daddy (SBSD) community to mean Behind Closed Doors See also BCDS (disambiguation)
https://en.wikipedia.org/wiki/Bruce%20Perens
Bruce Perens (born around 1958) is an American computer programmer and advocate in the free software movement. He created The Open Source Definition and published the first formal announcement and manifesto of open source. He co-founded the Open Source Initiative (OSI) with Eric S. Raymond. In 2005, Perens represented Open Source at the United Nations World Summit on the Information Society, at the invitation of the United Nations Development Programme. He has appeared before national legislatures and is often quoted in the press, advocating for open source and the reform of national and international technology policy. Perens is also an amateur radio operator, with call sign K6BP. He promotes open radio communications standards and open-source hardware. In 2016 Perens, along with Boalt Hall (Berkeley Law) professor Lothar Determann, co-authored "Open Cars" which appeared in the Berkeley Technology Law Journal. In 2018 Perens founded the Open Research Institute (ORI), a non-profit research and development organization to address technologies involving Open Source, Open Hardware, Open Standards, Open Content, and Open Access to Research. In April 2022 he divorced himself from the organization and reported he was starting a new charity, HamOpen.org, to redirect his focus, and align with the ARRL organization for their liability insurance benefit. HamOpen has been most visible supporting the convention exhibitions of projects Perens supports, including M17 and FreeDV. Companies Perens is a partner at OSS Capital, and continues to operate two companies: Algoram is a start-up which is creating a software-defined radio transceiver. Legal Engineering is a legal-technical consultancy which specializes in resolving copyright infringement in relation to open source software. Early life Perens grew up in Long Island, New York. He was born with cerebral palsy, which caused him to have slurred speech as a child, a condition that led to a misdiagnosis of him as developmentally disabled in school and led the school to fail to teach him to read. He developed an interest in technology at an early age: besides his interest in amateur radio, he ran a pirate radio station in the town of Lido Beach and briefly engaged in phone phreaking. Career Computer graphics Perens worked for seven years at the New York Institute of Technology Computer Graphics Lab. After that, he worked at Pixar for 12 years, from 1987 to 1999. He is credited as a studio tools engineer on the Pixar films A Bug's Life (1998) and Toy Story 2 (1999). No-Code International Perens founded No-Code International in 1998 with the goal of ending the Morse Code test then required for an amateur radio license. His rationale was that amateur radio should be a tool for young people to learn advanced technology and networking, rather than something that preserved antiquity and required new hams to master outmoded technology before they were allowed on the air. Perens lobbied intensively on the Inter
https://en.wikipedia.org/wiki/Blowfish%20%28disambiguation%29
Blowfish are species of fish in the family Tetraodontidae. Blowfish may also refer to: Porcupinefish, belonging to the family Diodontidae Blowfish (cipher), an encryption algorithm Blowfish (company), an American erotic goods supplier The Blowfish, a satirical newspaper at Brandeis University Lexington County Blowfish, a baseball team Vice President Blowfish, a character in the animated series Adventure Time episode "President Porpoise Is Missing!" See also Hootie & the Blowfish, an American rock band
https://en.wikipedia.org/wiki/Braille
Braille ( , ) is a tactile writing system used by people who are visually impaired. It can be read either on embossed paper or by using refreshable braille displays that connect to computers and smartphone devices. Braille can be written using a slate and stylus, a braille writer, an electronic braille notetaker or with the use of a computer connected to a braille embosser. Braille is named after its creator, Louis Braille, a Frenchman who lost his sight as a result of a childhood accident. In 1824, at the age of fifteen, he developed the braille code based on the French alphabet as an improvement on night writing. He published his system, which subsequently included musical notation, in 1829. The second revision, published in 1837, was the first binary form of writing developed in the modern era. Braille characters are formed using a combination of six raised dots arranged in a 3 × 2 matrix, called the braille cell. The number and arrangement of these dots distinguishes one character from another. Since the various braille alphabets originated as transcription codes for printed writing, the mappings (sets of character designations) vary from language to language, and even within one; in English Braille there are 3 levels of braille: uncontracted braille a letter-by-letter transcription used for basic literacy; contracted braille an addition of abbreviations and contractions used as a space-saving mechanism; and grade 3 various non-standardized personal stenography that is less commonly used. In addition to braille text (letters, punctuation, contractions), it is also possible to create embossed illustrations and graphs, with the lines either solid or made of series of dots, arrows, and bullets that are larger than braille dots. A full braille cell includes six raised dots arranged in two columns, each column having three dots. The dot positions are identified by numbers from one to six. There are 64 possible combinations, including no dots at all for a word space. Dot configurations can be used to represent a letter, digit, punctuation mark, or even a word. Early braille education is crucial to literacy, education and employment among the blind. Despite the evolution of new technologies, including screen reader software that reads information aloud, braille provides blind people with access to spelling, punctuation and other aspects of written language less accessible through audio alone. While some have suggested that audio-based technologies will decrease the need for braille, technological advancements such as braille displays have continued to make braille more accessible and available. Braille users highlight that braille remains as essential as print is to the sighted. History Braille was based on a tactile code, now known as night writing, developed by Charles Barbier. (The name "night writing" was later given to it when it was considered as a means for soldiers to communicate silently at night and without a light source, but Ba
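One way to see the 64-combination arithmetic of the six-dot cell is through Unicode's Braille Patterns block, which starts at U+2800 and assigns dot k to bit k−1 of the code point, so the six-dot cells occupy U+2800 through U+283F. The short C++ sketch below (the helper name is an assumption, not from the source) computes the code point for a given set of raised dots.

```cpp
#include <cstdio>
#include <initializer_list>

// Unicode Braille Patterns start at U+2800; dot k (k = 1..6 for the classic
// six-dot cell) sets bit k-1, so all 64 six-dot combinations fit in U+2800..U+283F.
char32_t braille_codepoint(std::initializer_list<int> dots) {
    unsigned mask = 0;
    for (int d : dots) mask |= 1u << (d - 1);
    return char32_t(0x2800 + mask);
}

int main() {
    // Dots 1 and 5 form the letter "e" in English braille: U+2811.
    std::printf("U+%04X\n", unsigned(braille_codepoint({1, 5})));
    // All six dots raised, the "full cell": U+283F.
    std::printf("U+%04X\n", unsigned(braille_codepoint({1, 2, 3, 4, 5, 6})));
}
```

The empty set of dots maps to U+2800, the braille blank, which matches the article's point that "no dots at all" is itself one of the 64 combinations and stands for the word space.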
https://en.wikipedia.org/wiki/Bill%20Joy
William Nelson Joy (born November 8, 1954) is an American computer engineer and venture capitalist. He co-founded Sun Microsystems in 1982 along with Scott McNealy, Vinod Khosla, and Andy Bechtolsheim, and served as Chief Scientist and CTO at the company until 2003. He played an integral role in the early development of BSD UNIX while being a graduate student at Berkeley, and he is the original author of the vi text editor. He also wrote the 2000 essay "Why The Future Doesn't Need Us", in which he expressed deep concerns over the development of modern technologies. Joy was elected a member of the National Academy of Engineering (1999) for contributions to operating systems and networking software. Early career Joy was born in the Detroit suburb of Farmington Hills, Michigan, to William Joy, a school vice-principal and counselor, and Ruth Joy. He earned a Bachelor of Science in electrical engineering from the University of Michigan and a Master of Science in electrical engineering and computer science from the University of California, Berkeley, in 1979. While a graduate student at Berkeley, he worked for Fabry's Computer Systems Research Group CSRG on the Berkeley Software Distribution (BSD) version of the Unix operating system. He initially worked on a Pascal compiler left at Berkeley by Ken Thompson, who had been visiting the university when Joy had just started his graduate work. He later moved on to improving the Unix kernel, and also handled BSD distributions. Some of his most notable contributions were the ex and vi editors and the C shell. Joy's prowess as a computer programmer is legendary, with an oft-told anecdote that he wrote the vi editor in a weekend. Joy denies this assertion. A few of his other accomplishments have also been sometimes exaggerated; Eric Schmidt, CEO of Novell at the time, inaccurately reported during an interview in PBS's documentary Nerds 2.0.1 that Joy had personally rewritten the BSD kernel in a weekend. He also wrote cat -v in 1980 which Rob Pike and Brian W. Kernighan wrote went against Unix philosophy. According to a Salon article, during the early 1980s, DARPA had contracted the company Bolt, Beranek and Newman (BBN) to add TCP/IP to Berkeley UNIX. Joy had been instructed to plug BBN's stack into Berkeley Unix, but he refused to do so, as he had a low opinion of BBN's TCP/IP. So, Joy wrote his own high-performance TCP/IP stack. According to John Gage: Rob Gurwitz, who was working at BBN at the time, disputes this version of events. Sun Microsystems In 1982, after the firm had been going for six months, Joy, Sun's sixteenth employee, was brought in with full co-founder status at Sun Microsystems. At Sun, Joy was an inspiration for the development of NFS, the SPARC microprocessors, the Java programming language, Jini/JavaSpaces, and JXTA. In 1986, Joy was awarded a Grace Murray Hopper Award by the ACM for his work on the Berkeley UNIX Operating System. On September 9, 2003, Sun announced Joy was lea
https://en.wikipedia.org/wiki/BASIC
BASIC (Beginners' All-purpose Symbolic Instruction Code) is a family of general-purpose, high-level programming languages designed for ease of use. The original version was created by John G. Kemeny and Thomas E. Kurtz at Dartmouth College in 1963. They wanted to enable students in non-scientific fields to use computers. At the time, nearly all computers required writing custom software, which only scientists and mathematicians tended to learn. In addition to the program language, Kemeny and Kurtz developed the Dartmouth Time Sharing System (DTSS), which allowed multiple users to edit and run BASIC programs simultaneously on remote terminals. This general model became popular on minicomputer systems like the PDP-11 and Data General Nova in the late 1960s and early 1970s. Hewlett-Packard produced an entire computer line for this method of operation, introducing the HP2000 series in the late 1960s and continuing sales into the 1980s. Many early video games trace their history to one of these versions of BASIC. The emergence of microcomputers in the mid-1970s led to the development of multiple BASIC dialects, including Microsoft BASIC in 1975. Due to the tiny main memory available on these machines, often 4 KB, a variety of Tiny BASIC dialects were also created. BASIC was available for almost any system of the era, and became the de facto programming language for home computer systems that emerged in the late 1970s. These PCs almost always had a BASIC interpreter installed by default, often in the machine's firmware or sometimes on a ROM cartridge. BASIC declined in popularity in the 1990s, as more powerful microcomputers came to market and programming languages with advanced features (such as Pascal and C) became tenable on such computers. In 1991, Microsoft released Visual Basic, combining an updated version of BASIC with a visual forms builder. This reignited use of the language and "VB" remains a major programming language in the form of VB.NET, while a hobbyist scene for BASIC more broadly continues to exist. Origin John G. Kemeny was the math department chairman at Dartmouth College. Based largely on his reputation as an innovator in math teaching, in 1959 the school won an Alfred P. Sloan Foundation award for $500,000 to build a new department building. Thomas E. Kurtz had joined the department in 1956, and from the 1960s Kemeny and Kurtz agreed on the need for programming literacy among students outside the traditional STEM fields. Kemeny later noted that "Our vision was that every student on campus should have access to a computer, and any faculty member should be able to use a computer in the classroom whenever appropriate. It was as simple as that." Kemeny and Kurtz had made two previous experiments with simplified languages, DARSIMCO (Dartmouth Simplified Code) and DOPE (Dartmouth Oversimplified Programming Experiment). These did not progress past a single freshman class. New experiments using Fortran and ALGOL followed, but Kurtz
https://en.wikipedia.org/wiki/Borland
Borland Software Corporation was a computer technology company founded in 1983 by Niels Jensen, Ole Henriksen, Mogens Glad, and Philippe Kahn. Its main business was the development and sale of software development and software deployment products. Borland was first headquartered in Scotts Valley, California, then in Cupertino, California, and then in Austin, Texas. In 2009, the company became a full subsidiary of the British firm Micro Focus International plc. History The 1980s: Foundations Borland Ltd. was founded in August 1981 by three Danish citizens Niels Jensen, Ole Henriksen, and Mogens Glad to develop products like Word Index for the CP/M operating system using an off-the-shelf company. However, the response to the company's products at the CP/M-82 show in San Francisco showed that a U.S. company would be needed to reach the American market. They met Philippe Kahn, who had just moved to Silicon Valley and had been a key developer of the Micral. The three Danes had embarked, at first successfully, on marketing software first from Denmark, and later from Ireland, before running into some challenges when they met Philippe Kahn. Kahn was chairman, president, and CEO of Borland Inc. from its beginning in 1983 until 1995. The company name "Borland" was a creation of Kahn's, taking inspiration from the name of an American Astronaut and then-Eastern Air Lines chairperson Frank Borman. The main shareholders at the incorporation of Borland were Niels Jensen (250,000 shares), Ole Henriksen (160,000), Mogens Glad (100,000), and Kahn (80,000). Borland International, Inc. era Borland developed various software development tools. Its first product was Turbo Pascal in 1983, developed by Anders Hejlsberg (who later developed .NET and C# for Microsoft) and before Borland acquired the product which was sold in Scandinavia under the name of Compas Pascal. 1984 saw the launch of Borland Sidekick, a time organization, notebook, and calculator utility that was an early terminate-and-stay-resident program (TSR) for MS-DOS compatible operating systems. By the mid-1980s, the company had an exhibit at the 1985 West Coast Computer Faire other than IBM or AT&T. Bruce Webster reported that "the legend of Turbo Pascal has by now reached mythic proportions, as evidenced by the number of firms that, in marketing meetings, make plans to become 'the next Borland'". After Turbo Pascal and Sidekick, the company launched other applications such as SuperKey and Lightning, all developed in Denmark. While the Danes remained majority shareholders, board members included Kahn, Tim Berry, John Nash, and David Heller. With the assistance of John Nash and David Heller, both British members of the Borland Board, the company was taken public on London's Unlisted Securities Market (USM) in 1986. Schroders was the lead investment banker. According to the London IPO filings, the management team was Philippe Kahn as president, Spencer Ozawa as VP of Operations, Marie Bourget as CFO, an
https://en.wikipedia.org/wiki/Brian%20Kernighan
Brian Wilson Kernighan (; born January 30, 1942) is a Canadian computer scientist. He worked at Bell Labs and contributed to the development of Unix alongside Unix creators Ken Thompson and Dennis Ritchie. Kernighan's name became widely known through co-authorship of the first book on the C programming language (The C Programming Language) with Dennis Ritchie. Kernighan affirmed that he had no part in the design of the C language ("it's entirely Dennis Ritchie's work"). He authored many Unix programs, including ditroff. Kernighan is coauthor of the AWK and AMPL programming languages. The "K" of K&R C and of AWK both stand for "Kernighan". In collaboration with Shen Lin he devised well-known heuristics for two NP-complete optimization problems: graph partitioning and the travelling salesman problem. In a display of authorial equity, the former is usually called the Kernighan–Lin algorithm, while the latter is known as the Lin–Kernighan heuristic. Kernighan has been a professor of computer science at Princeton University since 2000 and is the director of undergraduate studies in the department of computer science. In 2015, he co-authored the book The Go Programming Language. Early life and education Kernighan was born in Toronto. He attended the University of Toronto between 1960 and 1964, earning his bachelor's degree in engineering physics. He received his Ph.D. in electrical engineering from Princeton University in 1969, completing a doctoral dissertation titled "Some graph partitioning problems related to program segmentation" under the supervision of Peter G. Weiner. Career and research Kernighan has held a professorship in the department of computer science at Princeton since 2000. Each fall he teaches a course called "Computers in Our World", which introduces the fundamentals of computing to non-majors. Kernighan was the software editor for Prentice Hall International. His "Software Tools" series spread the essence of "C/Unix thinking" with makeovers for BASIC, FORTRAN, and Pascal, and most notably his "Ratfor" (rational FORTRAN) was put in the public domain. He has said that if stranded on an island with only one programming language it would have to be C. Kernighan coined the term "Unix" and helped popularize Thompson's Unix philosophy. Kernighan is also known as a coiner of the expression "What You See Is All You Get" (WYSIAYG), which is a sarcastic variant of the original "What You See Is What You Get" (WYSIWYG). Kernighan's term is used to indicate that WYSIWYG systems might throw away information in a document that could be useful in other contexts. In 1972, Kernighan described memory management in strings using "hello" and "world", in the B programming language, which became the iconic example we know today. Kernighan's original 1978 implementation of Hello, World! was sold at The Algorithm Auction, the world's first auction of computer algorithms. In 1996, Kernighan taught CS50 which is the Harvard University introductory
https://en.wikipedia.org/wiki/BCPL
BCPL ("Basic Combined Programming Language") is a procedural, imperative, and structured programming language. Originally intended for writing compilers for other languages, BCPL is no longer in common use. However, its influence is still felt because a stripped down and syntactically changed version of BCPL, called B, was the language on which the C programming language was based. BCPL introduced several features of many modern programming languages, including using curly braces to delimit code blocks. BCPL was first implemented by Martin Richards of the University of Cambridge in 1967. Design BCPL was designed so that small and simple compilers could be written for it; reputedly some compilers could be run in 16 kilobytes. Furthermore, the original compiler, itself written in BCPL, was easily portable. BCPL was thus a popular choice for bootstrapping a system. A major reason for the compiler's portability lay in its structure. It was split into two parts: the front end parsed the source and generated O-code, an intermediate language. The back end took the O-code and translated it into the machine code for the target machine. Only of the compiler's code needed to be rewritten to support a new machine, a task that usually took between 2 and 5 person-months. This approach became common practice later (e.g. Pascal, Java). The language is unusual in having only one data type: a word, a fixed number of bits, usually chosen to align with the architecture's machine word and of adequate capacity to represent any valid storage address. For many machines of the time, this data type was a 16-bit word. This choice later proved to be a significant problem when BCPL was used on machines in which the smallest addressable item was not a word but a byte or on machines with larger word sizes such as 32-bit or 64-bit. The interpretation of any value was determined by the operators used to process the values. (For example, + added two values together, treating them as integers; ! indirected through a value, effectively treating it as a pointer.) In order for this to work, the implementation provided no type checking. The mismatch between BCPL's word orientation and byte-oriented hardware was addressed in several ways. One was by providing standard library routines for packing and unpacking words into byte strings. Later, two language features were added: the bit-field selection operator and the infix byte indirection operator (denoted by %). BCPL handles bindings spanning separate compilation units in a unique way. There are no user-declarable global variables; instead, there is a global vector, similar to "blank common" in Fortran. All data shared between different compilation units comprises scalars and pointers to vectors stored in a pre-arranged place in the global vector. Thus, the header files (files included during compilation using the "GET" directive) become the primary means of synchronizing global data between compilation units, containing "G
https://en.wikipedia.org/wiki/BPP%20%28complexity%29
In computational complexity theory, a branch of computer science, bounded-error probabilistic polynomial time (BPP) is the class of decision problems solvable by a probabilistic Turing machine in polynomial time with an error probability bounded by 1/3 for all instances. BPP is one of the largest practical classes of problems, meaning most problems of interest in BPP have efficient probabilistic algorithms that can be run quickly on real modern machines. BPP also contains P, the class of problems solvable in polynomial time with a deterministic machine, since a deterministic machine is a special case of a probabilistic machine. Informally, a problem is in BPP if there is an algorithm for it that has the following properties: It is allowed to flip coins and make random decisions It is guaranteed to run in polynomial time On any given run of the algorithm, it has a probability of at most 1/3 of giving the wrong answer, whether the answer is YES or NO. Definition A language L is in BPP if and only if there exists a probabilistic Turing machine M, such that M runs for polynomial time on all inputs For all x in L, M outputs 1 with probability greater than or equal to 2/3 For all x not in L, M outputs 1 with probability less than or equal to 1/3 Unlike the complexity class ZPP, the machine M is required to run for polynomial time on all inputs, regardless of the outcome of the random coin flips. Alternatively, BPP can be defined using only deterministic Turing machines. A language L is in BPP if and only if there exists a polynomial p and a deterministic Turing machine M, such that M runs for polynomial time on all inputs For all x in L, the fraction of strings y of length p(|x|) which satisfy M(x, y) = 1 is greater than or equal to 2/3 For all x not in L, the fraction of strings y of length p(|x|) which satisfy M(x, y) = 1 is less than or equal to 1/3 In this definition, the string y corresponds to the output of the random coin flips that the probabilistic Turing machine would have made. For some applications this definition is preferable since it does not mention probabilistic Turing machines. In practice, an error probability of 1/3 might not be acceptable; however, the choice of 1/3 in the definition is arbitrary. Modifying the definition to use any constant between 0 and 1/2 (exclusive) in place of 1/3 would not change the resulting set BPP. For example, if one defined the class with the restriction that the algorithm can be wrong with probability at most 1/2^100, this would result in the same class of problems. The error probability does not even have to be constant: the same class of problems is defined by allowing error as high as 1/2 − n^(−c) on the one hand, or requiring error as small as 2^(−n^c) on the other hand, where c is any positive constant, and n is the length of the input. This flexibility in the choice of error probability is based on the idea of running an error-prone algorithm many times, and using the majority result of the runs to obtain a more accur
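The majority-vote amplification idea in the last sentence can be sketched in code; the noisy_decide routine below is a hypothetical stand-in for any bounded-error procedure that answers correctly with probability 2/3, not a real decision algorithm, and the repetition count is chosen arbitrarily.

```cpp
#include <iostream>
#include <random>

// Hypothetical bounded-error procedure: returns the correct answer (true)
// with probability 2/3, mirroring the BPP error bound of 1/3.
bool noisy_decide(std::mt19937& rng) {
    return std::bernoulli_distribution(2.0 / 3.0)(rng);
}

// Run the procedure an odd number of times and take the majority answer;
// the error probability falls exponentially in the number of repetitions.
bool amplified_decide(std::mt19937& rng, int repetitions /* odd */) {
    int yes = 0;
    for (int i = 0; i < repetitions; ++i)
        if (noisy_decide(rng)) ++yes;
    return yes > repetitions / 2;
}

int main() {
    std::mt19937 rng(12345);
    const int trials = 10000;
    int correct_single = 0, correct_amplified = 0;
    for (int t = 0; t < trials; ++t) {
        if (noisy_decide(rng)) ++correct_single;
        if (amplified_decide(rng, 51)) ++correct_amplified;
    }
    std::cout << "single run accuracy:  " << double(correct_single) / trials << '\n'
              << "51-run majority vote: " << double(correct_amplified) / trials << '\n';
}
```

Running it shows the single-run accuracy hovering near 2/3 while the 51-run majority vote is correct almost always, which is the amplification the definition relies on.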
https://en.wikipedia.org/wiki/BQP
In computational complexity theory, bounded-error quantum polynomial time (BQP) is the class of decision problems solvable by a quantum computer in polynomial time, with an error probability of at most 1/3 for all instances. It is the quantum analogue to the complexity class BPP. A decision problem is a member of BQP if there exists a quantum algorithm (an algorithm that runs on a quantum computer) that solves the decision problem with high probability and is guaranteed to run in polynomial time. A run of the algorithm will correctly solve the decision problem with a probability of at least 2/3. Definition BQP can be viewed as the languages associated with certain bounded-error uniform families of quantum circuits. A language L is in BQP if and only if there exists a polynomial-time uniform family of quantum circuits {Qn}, such that For all n, Qn takes n qubits as input and outputs 1 bit For all x in L, Qn accepts x with probability at least 2/3 For all x not in L, Qn accepts x with probability at most 1/3 Alternatively, one can define BQP in terms of quantum Turing machines. A language L is in BQP if and only if there exists a polynomial quantum Turing machine that accepts L with an error probability of at most 1/3 for all instances. Similarly to other "bounded error" probabilistic classes, the choice of 1/3 in the definition is arbitrary. We can run the algorithm a constant number of times and take a majority vote to achieve any desired probability of correctness less than 1, using the Chernoff bound. The complexity class is unchanged by allowing error as high as 1/2 − n^(−c) on the one hand, or requiring error as small as 2^(−n^c) on the other hand, where c is any positive constant, and n is the length of the input. A complete problem for Promise-BQP Similar to the notion of NP-completeness and other complete problems, we can define a complete problem as a problem that is in Promise-BQP and such that every problem in Promise-BQP reduces to it in polynomial time. Here is an intuitive problem that is complete for efficient quantum computation, which stems directly from the definition of Promise-BQP. Note that for technical reasons, completeness proofs focus on the promise problem version of BQP. We show that the problem below is complete for the Promise-BQP complexity class (and not for the total BQP complexity class having a trivial promise, for which no complete problems are known). APPROX-QCIRCUIT-PROB problem Given a description of a quantum circuit C acting on n qubits with m gates, where m is a polynomial in n and each gate acts on one or two qubits, and two numbers α, β with α > β, distinguish between the following two cases: measuring the first qubit of the output state yields 1 with probability at least α measuring the first qubit of the output state yields 1 with probability at most β Here, there is a promise on the inputs, as the problem does not specify the behavior if an instance is not covered by these two cases. Claim. Any BQP problem reduces to APPROX-QCIRCUIT-PROB. Proof. Suppose we have an algorithm that solves APPROX-QCIRCUIT-PROB, i.e., given a quantum
https://en.wikipedia.org/wiki/Brainfuck
Brainfuck is an esoteric programming language created in 1993 by Urban Müller. Notable for its extreme minimalism, the language consists of only eight simple commands, a data pointer and an instruction pointer. While it is fully Turing complete, it is not intended for practical use, but to challenge and amuse programmers. Brainfuck requires one to break commands into microscopic steps. The language's name is a reference to the slang term brainfuck, which refers to things so complicated or unusual that they exceed the limits of one's understanding, as it was not meant or made for designing actual software but to challenge the boundaries of computer programming. History Müller designed Brainfuck with the goal of implementing the smallest possible compiler, inspired by the 1024-byte compiler for the FALSE programming language. Müller's original compiler was implemented in machine language and compiled to a binary with a size of 296 bytes. He uploaded the first Brainfuck compiler to Aminet in 1993. The program came with a "Readme" file, which briefly described the language, and challenged the reader "Who can program anything useful with it? :)". Müller also included an interpreter and some examples. A second version of the compiler used only 240 bytes. P′′ Except for its two I/O commands, Brainfuck is a minor variation of the formal programming language P′′ created by Corrado Böhm in 1964, which is explicitly based on the Turing machine. In fact, using six symbols equivalent to the respective Brainfuck commands +, -, <, >, [, ], Böhm provided an explicit program for each of the basic functions that together serve to compute any computable function. So the first "Brainfuck" programs appear in Böhm's 1964 paper – and they were sufficient to prove Turing completeness. Language design The language consists of eight commands. A brainfuck program is a sequence of these commands, possibly interspersed with other characters (which are ignored). The commands are executed sequentially, with some exceptions: an instruction pointer begins at the first command, and each command it points to is executed, after which it normally moves forward to the next command. The program terminates when the instruction pointer moves past the last command. The brainfuck language uses a simple machine model consisting of the program and instruction pointer, as well as a one-dimensional array of at least 30,000 byte cells initialized to zero; a movable data pointer (initialized to point to the leftmost byte of the array); and two streams of bytes for input and output (most often connected to a keyboard and a monitor respectively, and using the ASCII character encoding). The eight language commands each consist of a single character: [ and ] match as parentheses usually do: each [ matches exactly one ] and vice versa, the [ comes first, and there can be no unmatched [ or ] between the two. As the name suggests, Brainfuck programs tend to be difficult to comprehend. This
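A minimal interpreter for the machine model just described (30,000 zero-initialized byte cells, a data pointer, the eight commands, and bracket matching by scanning) can be sketched in C++ as follows; the end-of-input convention and the tiny demo program are my own choices, not part of the language definition.

```cpp
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// Minimal Brainfuck interpreter: non-command characters are ignored, and the
// matching bracket is found by scanning forward or backward with a depth counter.
void run(const std::string& prog) {
    std::vector<unsigned char> tape(30000, 0);
    std::size_t dp = 0;                                  // data pointer
    for (std::size_t ip = 0; ip < prog.size(); ++ip) {   // instruction pointer
        switch (prog[ip]) {
            case '>': ++dp; break;
            case '<': --dp; break;
            case '+': ++tape[dp]; break;
            case '-': --tape[dp]; break;
            case '.': std::cout << tape[dp]; break;
            case ',': {                                  // end of input stores 0 (one common convention)
                int c = std::cin.get();
                tape[dp] = (c < 0) ? 0 : static_cast<unsigned char>(c);
                break;
            }
            case '[':                                    // if cell is zero, jump past matching ']'
                if (tape[dp] == 0) {
                    int depth = 1;
                    while (depth != 0) {
                        ++ip;
                        if (prog[ip] == '[') ++depth;
                        else if (prog[ip] == ']') --depth;
                    }
                }
                break;
            case ']':                                    // if cell is nonzero, jump back to matching '['
                if (tape[dp] != 0) {
                    int depth = 1;
                    while (depth != 0) {
                        --ip;
                        if (prog[ip] == ']') ++depth;
                        else if (prog[ip] == '[') --depth;
                    }
                }
                break;
        }
    }
}

int main() {
    // A tiny, hand-checkable demo that prints "Hi": 8*9 = 72 ('H'), then +33 gives 105 ('i').
    std::string hi = std::string(8, '+') + "[>" + std::string(9, '+') + "<-]>."
                   + std::string(33, '+') + ".";
    run(hi);
    std::cout << '\n';
}
```

The sketch omits bounds checking on the data pointer and assumes well-formed bracket pairs, which matches the informal machine model above rather than a hardened implementation.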
https://en.wikipedia.org/wiki/Bill%20Atkinson
William "Bill" D. Atkinson (born March 17, 1951) is an American computer engineer and photographer. Atkinson worked at Apple Computer from 1978 to 1990. Atkinson was the principal designer and developer of the graphical user interface (GUI) of the Apple Lisa and, later, one of the first thirty members of the original Apple Macintosh development team, and was the creator of the MacPaint application. He also designed and implemented QuickDraw, the fundamental toolbox that the Lisa and Macintosh used for graphics. QuickDraw's performance was essential for the success of the Macintosh GUI. He also was one of the main designers of the Lisa and Macintosh user interfaces. Atkinson also conceived, designed and implemented HyperCard, an early and influential hypermedia system. HyperCard put the power of computer programming and database design into the hands of nonprogrammers. In 1994, Atkinson received the EFF Pioneer Award for his contributions. Education He received his undergraduate degree from the University of California, San Diego, where Apple Macintosh developer Jef Raskin was one of his professors. Atkinson continued his studies as a graduate student in neurochemistry at the University of Washington. Raskin invited Atkinson to visit him at Apple Computer; Steve Jobs persuaded him to join the company immediately as employee No. 51, and Atkinson never finished his PhD. Career Around 1990, General Magic's founding, with Bill Atkinson as one of the three cofounders, met the following press in Byte magazine: The obstacles to General Magic's success may appear daunting, but General Magic is not your typical start-up company. Its partners include some of the biggest players in the worlds of computing, communications, and consumer electronics, and it's loaded with top-notch engineers who have been given a clean slate to reinvent traditional approaches to ubiquitous worldwide communications. In 2007, Atkinson began working as an outside developer with Numenta, a startup working on computer intelligence. On his work there Atkinson said, "what Numenta is doing is more fundamentally important to society than the personal computer and the rise of the Internet." Currently, Atkinson has combined his passion for computer programming with his love of nature photography to create art images. He takes close-up photographs of stones that have been cut and polished. His works are highly regarded for their resemblance to miniature landscapes which are hidden within the stones. Atkinson's 2004 book Within the Stone features a collection of his close-up photographs. The highly intricate and detailed images he creates are made possible by the accuracy and creative control of the digital printing process that he helped create. Some of Atkinson's noteworthy contributions to the field of computing include: Macintosh QuickDraw and Lisa LisaGraf Atkinson independently discovered the midpoint circle algorithm for fast drawing of circles by using the sum of consecutiv
https://en.wikipedia.org/wiki/Beeb
Beeb or BEEB may refer to: BBC, the British Broadcasting Corporation, sometimes called the Beeb or Auntie Beeb BEEB, a BBC children's magazine published in 1985 BBC Micro, a home computer built for the BBC by Acorn Computers Ltd., nicknamed The Beeb Beeb.com or BBC online Beeb Birtles (born 1948), Dutch-Australian musician See also Bebe (disambiguation) Beebe (disambiguation) The Bieb
https://en.wikipedia.org/wiki/Bioinformatics
Bioinformatics () is an interdisciplinary field of science that develops methods and software tools for understanding biological data, especially when the data sets are large and complex. Bioinformatics uses biology, chemistry, physics, computer science, computer programming, information engineering, mathematics and statistics to analyze and interpret biological data. The subsequent process of analyzing and interpreting data is referred to as computational biology. Computational, statistical, and computer programming techniques have been used for computer simulation analyses of biological queries. They include reused specific analysis "pipelines", particularly in the field of genomics, such as by the identification of genes and single nucleotide polymorphisms (SNPs). These pipelines are used to better understand the genetic basis of disease, unique adaptations, desirable properties (esp. in agricultural species), or differences between populations. Bioinformatics also includes proteomics, which tries to understand the organizational principles within nucleic acid and protein sequences. Image and signal processing allow extraction of useful results from large amounts of raw data. In the field of genetics, it aids in sequencing and annotating genomes and their observed mutations. Bioinformatics includes text mining of biological literature and the development of biological and gene ontologies to organize and query biological data. It also plays a role in the analysis of gene and protein expression and regulation. Bioinformatics tools aid in comparing, analyzing and interpreting genetic and genomic data and more generally in the understanding of evolutionary aspects of molecular biology. At a more integrative level, it helps analyze and catalogue the biological pathways and networks that are an important part of systems biology. In structural biology, it aids in the simulation and modeling of DNA, RNA, proteins as well as biomolecular interactions. History The first definition of the term bioinformatics was coined by Paulien Hogeweg and Ben Hesper in 1970, to refer to the study of information processes in biotic systems. This definition placed bioinformatics as a field parallel to biochemistry (the study of chemical processes in biological systems). Bioinformatics and computational biology involved the analysis of biological data, particularly DNA, RNA, and protein sequences. The field of bioinformatics experienced explosive growth starting in the mid-1990s, driven largely by the Human Genome Project and by rapid advances in DNA sequencing technology. Analyzing biological data to produce meaningful information involves writing and running software programs that use algorithms from graph theory, artificial intelligence, soft computing, data mining, image processing, and computer simulation. The algorithms in turn depend on theoretical foundations such as discrete mathematics, control theory, system theory, information theory, and statistics.