https://en.wikipedia.org/wiki/ASCII
ASCII, abbreviated from American Standard Code for Information Interchange, is a character encoding standard for electronic communication. ASCII codes represent text in computers, telecommunications equipment, and other devices. Because of technical limitations of computer systems at the time it was invented, ASCII has just 128 code points, of which only 95 are printable characters, which severely limited its scope. Modern computer systems have evolved to use Unicode, which has millions of code points, but the first 128 of these are the same as the ASCII set. The Internet Assigned Numbers Authority (IANA) prefers the name US-ASCII for this character encoding. ASCII is one of the IEEE milestones. Overview ASCII was developed from telegraph code. Its first commercial use was in the Teletype Model 33 and the Teletype Model 35 as a seven-bit teleprinter code promoted by Bell data services. Work on the ASCII standard began in May 1961, with the first meeting of the American Standards Association's (ASA) (now the American National Standards Institute or ANSI) X3.2 subcommittee. The first edition of the standard was published in 1963, underwent a major revision during 1967, and experienced its most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists and added features for devices other than teleprinters. The use of ASCII format for Network Interchange was described in 1969. That document was formally elevated to an Internet Standard in 2015. Originally based on the (modern) English alphabet, ASCII encodes 128 specified characters into seven-bit integers as shown by the ASCII chart in this article. Ninety-five of the encoded characters are printable: these include the digits 0 to 9, lowercase letters a to z, uppercase letters A to Z, and punctuation symbols. In addition, the original ASCII specification included 33 non-printing control codes which originated with Teletype machines; most of these are now obsolete, although a few are still commonly used, such as the carriage return, line feed, and tab codes. For example, lowercase i would be represented in the ASCII encoding by binary 1101001 = hexadecimal 69 (i is the ninth letter) = decimal 105. Despite being an American standard, ASCII does not have a code point for the cent (¢). It also does not support English terms with diacritical marks such as résumé and jalapeño, or proper nouns with diacritical marks such as Beyoncé. History The American Standard Code for Information Interchange (ASCII) was developed under the auspices of a committee of the American Standards Association (ASA), called the X3 committee, by its X3.2 (later X3L2) subcommittee, and later by that subcommittee's X3.2.4 working group (now INCITS). The ASA later became the United States of America Standards Institute (USASI) and ultimately became the American National Standards Institute (ANSI). With the other special characters and con
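As a rough illustration of the seven-bit encoding example above (lowercase i = binary 1101001 = hexadecimal 69 = decimal 105), the mapping can be checked with a few lines of Python, the language used elsewhere in this collection; this is only a minimal sketch using standard built-ins:

# Check the ASCII code point of lowercase "i" described above.
code = ord("i")
print(code)                 # 105
print(hex(code))            # 0x69
print(format(code, "07b"))  # 1101001 (seven bits)
assert chr(0b1101001) == "i"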
https://en.wikipedia.org/wiki/Algorithms%20%28journal%29
Algorithms is a monthly peer-reviewed open-access scientific journal of mathematics, covering design, analysis, and experiments on algorithms. The journal is published by MDPI and was established in 2008. The founding editor-in-chief was Kazuo Iwama (Kyoto University). From May 2014 to September 2019, the editor-in-chief was Henning Fernau (Universität Trier). The current editor-in-chief is Frank Werner (Otto-von-Guericke-Universität Magdeburg). Abstracting and indexing According to the Journal Citation Reports, the journal has a 2022 impact factor of 2.3. The journal is abstracted and indexed in: See also Journals with similar scope include: ACM Transactions on Algorithms Algorithmica Journal of Algorithms (Elsevier) References External links Computer science journals Open access journals MDPI academic journals English-language journals Academic journals established in 2008 Mathematics journals Monthly journals
https://en.wikipedia.org/wiki/Algorithm
In mathematics and computer science, an algorithm () is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), achieving automation eventually. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus". In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where there is no well-defined correct or optimal result. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. History Ancient algorithms Since antiquity, step-by-step procedures for solving mathematical problems have been attested. This includes Babylonian mathematics (around 2500 BC), Egyptian mathematics (around 1550 BC), Indian mathematics (around 800 BC and later; e.g. Shulba Sutras, Kerala School, and Brāhmasphuṭasiddhānta), The Ifa Oracle (around 500 BC), Greek mathematics (around 240 BC, e.g. sieve of Eratosthenes and Euclidean algorithm), and Arabic mathematics (9th century, e.g. cryptographic algorithms for code-breaking based on frequency analysis). Al-Khwārizmī and the term algorithm Around 825, Muḥammad ibn Mūsā al-Khwārizmī wrote kitāb al-ḥisāb al-hindī ("Book of Indian computation") and kitab al-jam' wa'l-tafriq al-ḥisāb al-hindī ("Addition and subtraction in Indian arithmetic"). Both of these texts are lost in the original Arabic at this time. (However, his other book on algebra remains.) In the early 12th century, Latin translations of said al-Khwarizmi texts involving the Hindu–Arabic numeral system and arithmetic appeared: Liber Alghoarismi de practica arismetrice (attributed to John of Seville) and Liber Algorismi de numero Indorum (attributed to Adelard of Bath). Hereby, alghoarismi or algorismi is the Latinization of Al-Khwarizmi's name; the text starts with the phrase Dixit Algorismi ("Thus spoke Al-Khwarizmi"). In 1240, Alexander of Villedieu writes a Latin text titled Carmen de Algorismo. It begins with: which translates to: The poem is a few hundred lines long and s
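As a concrete instance of the definition above (a finite sequence of instructions that proceeds through well-defined states and terminates with an output), the Euclidean algorithm mentioned in the history section might be sketched in Python roughly as follows; the function name is illustrative, not taken from the article:

def gcd(a, b):
    # Repeatedly replace (a, b) with (b, a mod b); the second component
    # strictly decreases, so the procedure terminates with the answer in a.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(240, 46))  # 2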
https://en.wikipedia.org/wiki/Anime
Anime is hand-drawn and computer-generated animation originating from Japan. Outside Japan and in English, anime refers specifically to animation produced in Japan. However, in Japan and in Japanese, anime (a term derived from a shortening of the English word animation) describes all animated works, regardless of style or origin. Many works of animation with a similar style to Japanese animation are also produced outside Japan. Video games sometimes also feature themes and art styles that can be considered as "anime". The earliest commercial Japanese animations date to 1917. A characteristic art style emerged in the 1960s with the works of cartoonist Osamu Tezuka and spread in following decades, developing a large domestic audience. Anime is distributed theatrically, through television broadcasts, directly to home media, and over the Internet. In addition to original works, anime are often adaptations of Japanese comics (manga), light novels, or video games. It is classified into numerous genres targeting various broad and niche audiences. Anime is a diverse medium with distinctive production methods that have adapted in response to emergent technologies. It combines graphic art, characterization, cinematography, and other forms of imaginative and individualistic techniques. Compared to Western animation, anime production generally focuses less on movement, and more on the detail of settings and use of "camera effects", such as panning, zooming, and angle shots. Diverse art styles are used, and character proportions and features can be quite varied, with a common characteristic feature being large and emotive eyes. The anime industry consists of over 430 production companies, including major studios such as Studio Ghibli, Kyoto Animation, Sunrise, Bones, Ufotable, MAPPA, Wit Studio, CoMix Wave Films, Production I.G and Toei Animation. Since the 1980s, the medium has also seen widespread international success with the rise of foreign dubbed, subtitled programming, and since the 2010s its increasing distribution through streaming services and a widening demographic embrace of anime culture, both within Japan and worldwide. Japanese animation accounted for 60% of the world's animated television shows. Etymology As a type of animation, anime is an art form that comprises many genres found in other mediums; it is sometimes mistakenly classified as a genre itself. In Japanese, the term anime is used to refer to all animated works, regardless of style or origin. English-language dictionaries typically define anime as "a style of Japanese animation" or as "a style of animation originating in Japan". Other definitions are based on origin, making production in Japan a requisite for a work to be considered "anime". The etymology of the term anime is disputed. The English word "animation" is written in Japanese katakana as アニメーション (animēshon) and as アニメ (anime) in its shortened form. Some sources claim that the term is derived from the French term for animation ("cartoon", literal
https://en.wikipedia.org/wiki/MessagePad
The MessagePad is a discontinued series of personal digital assistant devices developed by Apple Computer for the Newton platform in 1993. Some electronic engineering and the manufacture of Apple's MessagePad devices was undertaken in Japan by Sharp. The devices are based on the ARM 610 RISC processor, all featured handwriting recognition software, and were developed and marketed by Apple. The devices run Newton OS. History The development of the Newton MessagePad first began with Apple's former senior vice president of research and development, Jean-Louis Gassée; his team included Steve Capps, co-writer of macOS Finder, and an employed engineer named Steve Sakoman. The development of the Newton MessagePad operated in secret until it was eventually revealed to the Apple Board of Directors in late 1990. When Gassée resigned from his position due to a significant disagreement with the board, Sakoman, seeing how his employer had been treated, also stopped developing the MessagePad on March 2, 1990. Bill Atkinson, an Apple executive responsible for the company's Lisa graphical interface, invited Steve Capps, John Sculley, Andy Hertzfeld, Susan Kare, and Marc Porat to a meeting on March 11, 1990. There, they brainstormed a way of saving the MessagePad. Sculley suggested adding new features, including libraries, museums, databases, or institutional archives features, allowing customers to navigate through various window tabs or opened galleries/stacks. The Board later approved his suggestion; he then gave Newton its official and full backing. The first MessagePad was unveiled by Sculley on May 29, 1992, at the summer Consumer Electronics Show (CES) in Chicago. Sculley had caved in to pressure to unveil the product early; the Newton did not officially ship for another 14 months, on August 2, 1993, starting at a price of . Over 50,000 units were sold by late November 1993. Details Screen and input With the MessagePad 120 with Newton OS 2.0, the Newton Keyboard by Apple became available, which can also be used via the dongle on Newton devices with a Newton InterConnect port, most notably the Apple MessagePad 2000/2100 series, as well as the Apple eMate 300. Newton devices featuring Newton OS 2.1 or higher can be used with the screen turned horizontally ("landscape") as well as vertically ("portrait"). A change of a setting rotates the contents of the display by 90, 180 or 270 degrees. Handwriting recognition still works properly with the display rotated, although display calibration is needed when rotation in any direction is used for the first time or when the Newton device is reset. Handwriting recognition In initial versions (Newton OS 1.x) the handwriting recognition gave extremely mixed results for users and was sometimes inaccurate. The original handwriting recognition engine was called Calligrapher, and was licensed from a Russian company called Paragraph International. Calligrapher's design was quite sophisticated; it attempted to
https://en.wikipedia.org/wiki/Algorithms%20for%20calculating%20variance
Algorithms for calculating variance play a major role in computational statistics. A key difficulty in the design of good algorithms for this problem is that formulas for the variance may involve sums of squares, which can lead to numerical instability as well as to arithmetic overflow when dealing with large values.

Naïve algorithm
A formula for calculating the variance of an entire population of size N is:
    σ² = (Σᵢ xᵢ² − (Σᵢ xᵢ)² / N) / N
Using Bessel's correction to calculate an unbiased estimate of the population variance from a finite sample of n observations, the formula is:
    s² = (Σᵢ xᵢ² − (Σᵢ xᵢ)² / n) / (n − 1)
Therefore, a naïve algorithm to calculate the estimated variance is given by the following:
    Let n ← 0, Sum ← 0, SumSq ← 0
    For each datum x:
        n ← n + 1
        Sum ← Sum + x
        SumSq ← SumSq + x × x
    Var = (SumSq − Sum × Sum / n) / (n − 1)
This algorithm can easily be adapted to compute the variance of a finite population: simply divide by n instead of n − 1 on the last line. Because SumSq and Sum × Sum / n can be very similar numbers, cancellation can lead to the precision of the result being much less than the inherent precision of the floating-point arithmetic used to perform the computation. Thus this algorithm should not be used in practice, and several alternate, numerically stable algorithms have been proposed. This is particularly bad if the standard deviation is small relative to the mean.

Computing shifted data
The variance is invariant with respect to changes in a location parameter, a property which can be used to avoid the catastrophic cancellation in this formula: Var(X − K) = Var(X) with K any constant, which leads to the new formula
    s² = (Σᵢ (xᵢ − K)² − (Σᵢ (xᵢ − K))² / n) / (n − 1)
The closer K is to the mean value, the more accurate the result will be, but just choosing a value inside the sample range will guarantee the desired stability. If the values (xᵢ − K) are small then there are no problems with the sum of their squares; on the contrary, if they are large it necessarily means that the variance is large as well. In any case the second term in the formula is always smaller than the first one, therefore no cancellation may occur. If just the first sample is taken as K, the algorithm can be written in the Python programming language as

def shifted_data_variance(data):
    if len(data) < 2:
        return 0.0
    K = data[0]
    n = Ex = Ex2 = 0.0
    for x in data:
        n += 1
        Ex += x - K
        Ex2 += (x - K) ** 2
    variance = (Ex2 - Ex**2 / n) / (n - 1)
    # use n instead of (n - 1) to compute the exact variance of the given data
    # use (n - 1) if the data are samples of a larger population
    return variance

This formula also facilitates the incremental computation that can be expressed as

K = Ex = Ex2 = 0.0
n = 0

def add_variable(x):
    global K, n, Ex, Ex2
    if n == 0:
        K = x
    n += 1
    Ex += x - K
    Ex2 += (x - K) ** 2

def remove_variable(x):
    global K, n, Ex, Ex2
    n -= 1
    Ex -= x - K
    Ex2 -= (x - K) ** 2

def get_mean():
    global K, n, Ex
    return K + Ex / n

def get_variance():
    global n, Ex, Ex2
    return (Ex2 - Ex**2 / n) / (n - 1)

Two-pass algorithm
An alternative approach, using a different formula for the variance, f
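For comparison with the shifted-data code above, the naïve accumulation described at the start of the article might be sketched in Python as follows; the function name is illustrative, and, as the article explains, the final subtraction can suffer catastrophic cancellation when the standard deviation is small relative to the mean:

def naive_variance(data):
    n = 0
    total = total_sq = 0.0
    for x in data:
        n += 1
        total += x
        total_sq += x * x
    # total_sq and total**2 / n can be nearly equal, so precision is lost here.
    return (total_sq - total**2 / n) / (n - 1)

print(naive_variance([4.0, 7.0, 13.0, 16.0]))  # 30.0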
https://en.wikipedia.org/wiki/Artificial%20intelligence
Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of humans or animals. It is also the field of study in computer science that develops and studies intelligent machines. "AI" may also refer to the machines themselves. AI technology is widely used throughout industry, government and science. Some high-profile applications are: advanced web search engines (e.g., Google Search), recommendation systems (used by YouTube, Amazon, and Netflix), understanding human speech (such as Siri and Alexa), self-driving cars (e.g., Waymo), generative or creative tools (ChatGPT and AI art), and competing at the highest level in strategic games (such as chess and Go). Artificial intelligence was founded as an academic discipline in 1956. The field went through multiple cycles of optimism followed by disappointment and loss of funding, but after 2012, when deep learning surpassed all previous AI techniques, there was a vast increase in funding and interest. The various sub-fields of AI research are centered around particular goals and the use of particular tools. The traditional goals of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception, and support for robotics. General intelligence (the ability to solve an arbitrary problem) is among the field's long-term goals. To solve these problems, AI researchers have adapted and integrated a wide range of problem-solving techniques, including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics. AI also draws upon psychology, linguistics, philosophy, neuroscience and many other fields. Goals The general problem of simulating (or creating) intelligence has been broken down into sub-problems. These consist of particular traits or capabilities that researchers expect an intelligent system to display. The traits described below have received the most attention and cover the scope of AI research. Reasoning, problem-solving Early researchers developed algorithms that imitated step-by-step reasoning that humans use when they solve puzzles or make logical deductions. By the late 1980s and 1990s, methods were developed for dealing with uncertain or incomplete information, employing concepts from probability and economics. Many of these algorithms are insufficient for solving large reasoning problems because they experience a "combinatorial explosion": they became exponentially slower as the problems grew larger. Even humans rarely use the step-by-step deduction that early AI research could model. They solve most of their problems using fast, intuitive judgments. Accurate and efficient reasoning is an unsolved problem. Knowledge representation Knowledge representation and knowledge engineering allow AI programs to answer questions intelligently and make deductions about real-world facts. Formal knowledge represe
https://en.wikipedia.org/wiki/Applet
In computing, an applet is any small application that performs one specific task that runs within the scope of a dedicated widget engine or a larger program, often as a plug-in. The term is frequently used to refer to a Java applet, a program written in the Java programming language that is designed to be placed on a web page. Applets are typical examples of transient and auxiliary applications that do not monopolize the user's attention. Applets are not full-featured application programs, and are intended to be easily accessible. History The word applet was first used in 1990 in PC Magazine. However, the concept of an applet, or more broadly a small interpreted program downloaded and executed by the user, dates at least to RFC 5 (1969) by Jeff Rulifson, which described the Decode-Encode Language, which was designed to allow remote use of the oN-Line System over ARPANET, by downloading small programs to enhance the interaction. This has been specifically credited as a forerunner of Java's downloadable programs in RFC 2555. Applet as an extension of other software In some cases, an applet does not run independently. These applets must run either in a container provided by a host program, through a plugin, or a variety of other applications including mobile devices that support the applet programming model. Web-based applets Applets were used to provide interactive features to web applications that historically could not be provided by HTML alone. They could capture mouse input and also had controls like buttons or check boxes. In response to the user action, an applet could change the provided graphic content. This made applets well suited for demonstration, visualization, and teaching. There were online applet collections for studying various subjects, from physics to heart physiology. Applets were also used to create online game collections that allowed players to compete against live opponents in real-time. An applet could also be a text area only, providing, for instance, a cross-platform command-line interface to some remote system. If needed, an applet could leave the dedicated area and run as a separate window. However, applets had very little control over web page content outside the applet dedicated area, so they were less useful for improving the site appearance in general (while applets like news tickers or WYSIWYG editors are also known). Applets could also play media in formats that are not natively supported by the browser. HTML pages could embed parameters that were passed to the applet. Hence, the same applet could appear differently depending on the parameters that were passed. Examples of Web-based applets include: QuickTime movies Flash movies Windows Media Player applets, used to display embedded video files in Internet Explorer (and other browsers that supported the plugin) 3D modeling display applets, used to rotate and zoom a model Browser games that were applet-based, though some developed into fully functional
https://en.wikipedia.org/wiki/Alan%20Turing
Alan Mathison Turing (; 23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. He is widely considered to be the father of theoretical computer science and artificial intelligence. Born in Maida Vale, London, Turing was raised in southern England. He graduated at King's College, Cambridge, with a degree in mathematics. Whilst he was a fellow at Cambridge, he published a proof demonstrating that some purely mathematical yes–no questions can never be answered by computation. He defined a Turing machine and proved that the halting problem for Turing machines is undecidable. In 1938, he obtained his PhD from the Department of Mathematics at Princeton University. During the Second World War, Turing worked for the Government Code and Cypher School at Bletchley Park, Britain's codebreaking centre that produced Ultra intelligence. For a time he led Hut 8, the section that was responsible for German naval cryptanalysis. Here, he devised a number of techniques for speeding the breaking of German ciphers, including improvements to the pre-war Polish bomba method, an electromechanical machine that could find settings for the Enigma machine. Turing played a crucial role in cracking intercepted coded messages that enabled the Allies to defeat the Axis powers in many crucial engagements, including the Battle of the Atlantic. After the war, Turing worked at the National Physical Laboratory, where he designed the Automatic Computing Engine, one of the first designs for a stored-program computer. In 1948, Turing joined Max Newman's Computing Machine Laboratory at the Victoria University of Manchester, where he helped develop the Manchester computers and became interested in mathematical biology. He wrote a paper on the chemical basis of morphogenesis and predicted oscillating chemical reactions such as the Belousov–Zhabotinsky reaction, first observed in the 1960s. Despite these accomplishments, Turing was never fully recognised in Britain during his lifetime because much of his work was covered by the Official Secrets Act. Turing was prosecuted in 1952 for homosexual acts. He accepted hormone treatment with DES, a procedure commonly referred to as chemical castration, as an alternative to prison. Turing died on 7 June 1954, 16 days before his 42nd birthday, from cyanide poisoning. An inquest determined his death as a suicide, but it has been noted that the known evidence is also consistent with accidental poisoning. Following a public campaign in 2009, British prime minister Gordon Brown made an official public apology on behalf of the government for "the appalling way [Turing] was treated". Queen Elizabeth II granted a posthumous pardon
https://en.wikipedia.org/wiki/Ada%20%28programming%20language%29
Ada is a structured, statically typed, imperative, and object-oriented high-level programming language, inspired by Pascal and other languages. It has built-in language support for design by contract (DbC), extremely strong typing, explicit concurrency, tasks, synchronous message passing, protected objects, and non-determinism. Ada improves code safety and maintainability by using the compiler to find errors in favor of runtime errors. Ada is an international technical standard, jointly defined by the International Organization for Standardization (ISO), and the International Electrotechnical Commission (IEC). , the standard, called Ada 2012 informally, is ISO/IEC 8652:2012. Ada was originally designed by a team led by French computer scientist Jean Ichbiah of Honeywell under contract to the United States Department of Defense (DoD) from 1977 to 1983 to supersede over 450 programming languages used by the DoD at that time. Ada was named after Ada Lovelace (1815–1852), who has been credited as the first computer programmer. Features Ada was originally designed for embedded and real-time systems. The Ada 95 revision, designed by S. Tucker Taft of Intermetrics between 1992 and 1995, improved support for systems, numerical, financial, and object-oriented programming (OOP). Features of Ada include: strong typing, modular programming mechanisms (packages), run-time checking, parallel processing (tasks, synchronous message passing, protected objects, and nondeterministic select statements), exception handling, and generics. Ada 95 added support for object-oriented programming, including dynamic dispatch. The syntax of Ada minimizes choices of ways to perform basic operations, and prefers English keywords (such as "or else" and "and then") to symbols (such as "||" and "&&"). Ada uses the basic arithmetical operators "+", "-", "*", and "/", but avoids using other symbols. Code blocks are delimited by words such as "declare", "begin", and "end", where the "end" (in most cases) is followed by the identifier of the block it closes (e.g., if ... end if, loop ... end loop). In the case of conditional blocks this avoids a dangling else that could pair with the wrong nested if-expression in other languages like C or Java. Ada is designed for developing very large software systems. Ada packages can be compiled separately. Ada package specifications (the package interface) can also be compiled separately without the implementation to check for consistency. This makes it possible to detect problems early during the design phase, before implementation starts. A large number of compile-time checks are supported to help avoid bugs that would not be detectable until run-time in some other languages or would require explicit checks to be added to the source code. For example, the syntax requires explicitly named closing of blocks to prevent errors due to mismatched end tokens. The adherence to strong typing allows detecting many common software errors (wrong par
https://en.wikipedia.org/wiki/Advanced%20Encryption%20Standard
The Advanced Encryption Standard (AES), also known by its original name Rijndael (), is a specification for the encryption of electronic data established by the U.S. National Institute of Standards and Technology (NIST) in 2001. AES is a variant of the Rijndael block cipher developed by two Belgian cryptographers, Joan Daemen and Vincent Rijmen, who submitted a proposal to NIST during the AES selection process. Rijndael is a family of ciphers with different key and block sizes. For AES, NIST selected three members of the Rijndael family, each with a block size of 128 bits, but three different key lengths: 128, 192 and 256 bits. AES has been adopted by the U.S. government. It supersedes the Data Encryption Standard (DES), which was published in 1977. The algorithm described by AES is a symmetric-key algorithm, meaning the same key is used for both encrypting and decrypting the data. In the United States, AES was announced by the NIST as U.S. FIPS PUB 197 (FIPS 197) on November 26, 2001. This announcement followed a five-year standardization process in which fifteen competing designs were presented and evaluated, before the Rijndael cipher was selected as the most suitable. AES is included in the ISO/IEC 18033-3 standard. AES became effective as a U.S. federal government standard on May 26, 2002, after approval by U.S. Secretary of Commerce Donald Evans. AES is available in many different encryption packages, and is the first (and only) publicly accessible cipher approved by the U.S. National Security Agency (NSA) for top secret information when used in an NSA approved cryptographic module. Definitive standards The Advanced Encryption Standard (AES) is defined in each of: FIPS PUB 197: Advanced Encryption Standard (AES) ISO/IEC 18033-3: Block ciphers Description of the ciphers AES is based on a design principle known as a substitution–permutation network, and is efficient in both software and hardware. Unlike its predecessor DES, AES does not use a Feistel network. AES is a variant of Rijndael, with a fixed block size of 128 bits, and a key size of 128, 192, or 256 bits. By contrast, Rijndael per se is specified with block and key sizes that may be any multiple of 32 bits, with a minimum of 128 and a maximum of 256 bits. Most AES calculations are done in a particular finite field. AES operates on a 4 × 4 column-major order array of 16 bytes termed the state: The key size used for an AES cipher specifies the number of transformation rounds that convert the input, called the plaintext, into the final output, called the ciphertext. The number of rounds are as follows: 10 rounds for 128-bit keys. 12 rounds for 192-bit keys. 14 rounds for 256-bit keys. Each round consists of several processing steps, including one that depends on the encryption key itself. A set of reverse rounds are applied to transform ciphertext back into the original plaintext using the same encryption key. High-level description of the algorithm round keys
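As a small illustration of the parameters just described (not an implementation of the cipher), the following Python sketch maps the three key lengths to their round counts and lays a 16-byte block out as the 4 × 4 column-major state array; the names ROUNDS and to_state are illustrative:

# Rounds per key length, as listed above.
ROUNDS = {128: 10, 192: 12, 256: 14}

def to_state(block: bytes):
    # The AES state is a 4 x 4 array of bytes filled column by column.
    assert len(block) == 16
    return [[block[row + 4 * col] for col in range(4)] for row in range(4)]

for row in to_state(bytes(range(16))):
    print(row)
# [0, 4, 8, 12]
# [1, 5, 9, 13]
# [2, 6, 10, 14]
# [3, 7, 11, 15]
print(ROUNDS[256])  # 14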
https://en.wikipedia.org/wiki/Analytical%20engine
The analytical engine was a proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, which was a design for a simpler mechanical calculator. The analytical engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. In other words, the structure of the analytical engine was essentially the same as that which has dominated computer design in the electronic era. The analytical engine is one of the most successful achievements of Charles Babbage. Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until 1941 that Konrad Zuse built the first general-purpose computer, Z3, more than a century after Babbage had proposed the pioneering analytical engine in 1837. Design Babbage's first attempt at a mechanical computing device, the Difference Engine, was a special-purpose machine designed to tabulate logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials. Construction of this machine was never completed; Babbage had conflicts with his chief engineer, Joseph Clement, and ultimately the British government withdrew its funding for the project. During this project, Babbage realised that a much more general design, the analytical engine, was possible. The work on the design of the analytical engine started around 1833. The input, consisting of programs ("formulae") and data, was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter, and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 16.6 kB). An arithmetic unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially (1838) it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. Later drawings (1858) depict a regularised grid layout. Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify. The programming language to be employed by users was akin to modern day assembly languages. Loops and conditional branching were possible,
https://en.wikipedia.org/wiki/Apple%20I
The Apple Computer 1 (Apple-1), later known predominantly as the Apple I, is an 8-bit motherboard-only personal computer designed by Steve Wozniak and released by the Apple Computer Company (now Apple Inc.) in 1976. The company was initially formed to sell the Apple I, its first product, and would later become the world's largest technology company. The idea of starting a company and selling the computer came from Wozniak's friend and Apple co-founder Steve Jobs. To finance its development, Wozniak and Jobs sold some of their possessions for a few hundred dollars. Wozniak demonstrated the first prototype in July 1976 at the Homebrew Computer Club in Palo Alto, California, impressing an early computer retailer. After securing an order for 50 computers, Jobs was able to order the parts on credit and deliver the first Apple products after ten days. The Apple I was one of the first computers available that used the inexpensive MOS Technology 6502 microprocessor. An expansion included a BASIC interpreter, allowing users to utilize BASIC at home instead of at institutions with mainframe computers, greatly lowering the entry cost for computing with BASIC. Production was discontinued on September 30, 1977, after the June 10, 1977 introduction of its successor, the Apple II, which Byte magazine referred to as part of the "1977 Trinity" of personal computing (along with the PET 2001 from Commodore Business Machines and the TRS-80 Model I from Tandy Corporation). As relatively few computers were made before they were discontinued, coupled with their status as Apple's first product, surviving Apple I units are now displayed in computer museums. History Development In 1975, Steve Wozniak started attending meetings of the Homebrew Computer Club, which was a major source of inspiration for him. New microcomputers such as the Altair 8800 and the IMSAI 8080 inspired Wozniak to build a microprocessor into his video terminal circuit to make a complete computer. At the time, the only microcomputer CPUs generally available were the $179 Intel 8080 and the $170 Motorola 6800. Wozniak preferred the 6800, but both were out of his price range. So he watched, and learned, and designed computers on paper, waiting for the day he could afford a CPU. When MOS Technology released its $20 6502 chip in 1976, Wozniak wrote a version of BASIC for it, then began to design a computer for it to run on. The 6502 was designed by the same people who designed the 6800, as many in Silicon Valley left employers to form their own companies. Wozniak's earlier 6800 paper-computer needed only minor changes to run on the new chip. By March 1, 1976, Wozniak completed the basic design of his computer. Wozniak originally offered the design to HP while working there, but was denied by the company on five occasions. When he demonstrated his computer at the Homebrew Computer Club, his friend and fellow club regular Steve Jobs was immediately interested in its commercial potential. Wozni
https://en.wikipedia.org/wiki/Atanasoff%E2%80%93Berry%20computer
The Atanasoff–Berry computer (ABC) was the first automatic electronic digital computer. Limited by the technology of the day, and execution, the device has remained somewhat obscure. The ABC's priority is debated among historians of computer technology, because it was neither programmable, nor Turing-complete. Conventionally, the ABC would be considered the first electronic ALU (arithmetic logic unit) which is integrated into every modern processor's design. Its unique contribution was to make computing faster by being the first to use vacuum tubes to do the arithmetic calculations. Prior to this, slower electro-mechanical methods were used by Konrad Zuse's Z1 computer, and the simultaneously developed Harvard Mark I. The first electronic, programmable, digital machine, the Colossus computer from 1943 to 1945, used similar tube-based technology as ABC. Overview Conceived in 1937, the machine was built by Iowa State College mathematics and physics professor John Vincent Atanasoff with the help of graduate student Clifford Berry. It was designed only to solve systems of linear equations and was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was not perfected, and when John Vincent Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers. The computer was designated an IEEE Milestone in 1990. Atanasoff and Berry's computer work was not widely known until it was rediscovered in the 1960s, amid patent disputes over the first instance of an electronic computer. At that time ENIAC, that had been created by John Mauchly and J. Presper Eckert, was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff. When, in the mid-1970s, the secrecy surrounding the British World War II development of the Colossus computers that pre-dated ENIAC, was lifted and Colossus was described at a conference in Los Alamos, New Mexico, in June 1976, John Mauchly and Konrad Zuse were reported to have been astonished. Design and construction According to Atanasoff's account, several key principles of the Atanasoff–Berry computer were conceived in a sudden insight after a long nighttime drive to Rock Island, Illinois, during the winter of 1937–38. The ABC innovations included electronic computation, binary arithmetic, parallel processing, regenerative capacitor memory, and a separation of memory and computing functions. The mechanical and logic design was worked out by Atanasoff over the next year. A grant application to build a proof of concept prototype was sub
https://en.wikipedia.org/wiki/Assembly%20language
In computer programming, assembly language (alternatively assembler language or symbolic machine code), often referred to simply as assembly and commonly abbreviated as ASM or asm, is any low-level programming language with a very strong correspondence between the instructions in the language and the architecture's machine code instructions. Assembly language usually has one statement per machine instruction (1:1), but constants, comments, assembler directives, symbolic labels of, e.g., memory locations, registers, and macros are generally also supported. The first assembly code in which a language is used to represent machine code instructions is found in Kathleen and Andrew Donald Booth's 1947 work, Coding for A.R.C.. Assembly code is converted into executable machine code by a utility program referred to as an assembler. The term "assembler" is generally attributed to Wilkes, Wheeler and Gill in their 1951 book The Preparation of Programs for an Electronic Digital Computer, who, however, used the term to mean "a program that assembles another program consisting of several sections into a single program". The conversion process is referred to as assembly, as in assembling the source code. The computational step when an assembler is processing a program is called assembly time. Because assembly depends on the machine code instructions, each assembly language is specific to a particular computer architecture. Sometimes there is more than one assembler for the same architecture, and sometimes an assembler is specific to an operating system or to particular operating systems. Most assembly languages do not provide specific syntax for operating system calls, and most assembly languages can be used universally with any operating system, as the language provides access to all the real capabilities of the processor, upon which all system call mechanisms ultimately rest. In contrast to assembly languages, most high-level programming languages are generally portable across multiple architectures but require interpreting or compiling, much more complicated tasks than assembling. In the first decades of computing, it was commonplace for both systems programming and application programming to take place entirely in assembly language. While still irreplaceable for some purposes, the majority of programming is now conducted in higher-level interpreted and compiled languages. In "No Silver Bullet", Fred Brooks summarised the effects of the switch away from assembly language programming: "Surely the most powerful stroke for software productivity, reliability, and simplicity has been the progressive use of high-level languages for programming. Most observers credit that development with at least a factor of five in productivity, and with concomitant gains in reliability, simplicity, and comprehensibility." Today, it is typical to use small amounts of assembly language code within larger systems implemented in a higher-level language, for performance rea
https://en.wikipedia.org/wiki/Alan%20Kay
Alan Curtis Kay (born May 17, 1940) is an American computer scientist best known for his pioneering work on object-oriented programming and windowing graphical user interface (GUI) design. At Xerox PARC he led the design and development of the first modern windowed computer desktop interface. There he also led the development of the influential object-oriented programming language Smalltalk, both personally designing most of the early versions of the language and coining the term "object-oriented." He has been elected a Fellow of the American Academy of Arts and Sciences, the National Academy of Engineering, and the Royal Society of Arts. He received the Turing Award in 2003. Kay is also a former professional jazz guitarist, composer, and theatrical designer. He is also an amateur classical pipe organist. Early life and work In an interview on education in America with the Davis Group Ltd., Kay said: Originally from Springfield, Massachusetts, Kay's family relocated several times due to his father's career in physiology before ultimately settling in the New York metropolitan area. He attended Brooklyn Technical High School. Having accumulated enough credits to graduate, he then attended Bethany College in Bethany, West Virginia, where he majored in biology and minored in mathematics. Kay then taught guitar in Denver, Colorado for a year. He was drafted into the United States Army, then qualified for officer training in the United States Air Force, where he became a computer programmer after passing an aptitude test. After his discharge, he enrolled at the University of Colorado Boulder and earned a Bachelor of Science (B.S.) in mathematics and molecular biology in 1966. In the autumn of 1966, he began graduate school at the University of Utah College of Engineering. He earned a Master of Science in electrical engineering in 1968, then a Doctor of Philosophy in computer science in 1969. His doctoral dissertation, FLEX: A Flexible Extendable Language, described the invention of a computer language named FLEX. While there, he worked with "fathers of computer graphics" David C. Evans (who had recently been recruited from the University of California, Berkeley to start Utah's computer science department) and Ivan Sutherland (best known for writing such pioneering programs as Sketchpad). Kay credits Sutherland's 1963 thesis for influencing his views on objects and computer programming. As he grew busier with research for the Defense Advanced Research Projects Agency (DARPA), he ended his musical career. In 1968, he met Seymour Papert and learned of the programming language Logo, a dialect of Lisp optimized for educational purposes. This led him to learn of the work of Jean Piaget, Jerome Bruner, Lev Vygotsky, and of constructionist learning, further influencing his professional orientation. In 1969, Kay became a visiting researcher at the Stanford Artificial Intelligence Laboratory in anticipation of accepting a professorship at Carnegie Mel
https://en.wikipedia.org/wiki/APL%20%28programming%20language%29
APL (named after the book A Programming Language) is a programming language developed in the 1960s by Kenneth E. Iverson. Its central datatype is the multidimensional array. It uses a large range of special graphic symbols to represent most functions and operators, leading to very concise code. It has been an important influence on the development of concept modeling, spreadsheets, functional programming, and computer math packages. It has also inspired several other programming languages. History Mathematical notation A mathematical notation for manipulating arrays was developed by Kenneth E. Iverson, starting in 1957 at Harvard University. In 1960, he began work for IBM where he developed this notation with Adin Falkoff and published it in his book A Programming Language in 1962. The preface states its premise: This notation was used inside IBM for short research reports on computer systems, such as the Burroughs B5000 and its stack mechanism when stack machines versus register machines were being evaluated by IBM for upcoming computers. Iverson also used his notation in a draft of the chapter A Programming Language, written for a book he was writing with Fred Brooks, Automatic Data Processing, which would be published in 1963. In 1979, Iverson received the Turing Award for his work on APL. Development into a computer programming language As early as 1962, the first attempt to use the notation to describe a complete computer system happened after Falkoff discussed with William C. Carter his work to standardize the instruction set for the machines that later became the IBM System/360 family. In 1963, Herbert Hellerman, working at the IBM Systems Research Institute, implemented a part of the notation on an IBM 1620 computer, and it was used by students in a special high school course on calculating transcendental functions by series summation. Students tested their code in Hellerman's lab. This implementation of a part of the notation was called Personalized Array Translator (PAT). In 1963, Falkoff, Iverson, and Edward H. Sussenguth Jr., all working at IBM, used the notation for a formal description of the IBM System/360 series machine architecture and functionality, which resulted in a paper published in IBM Systems Journal in 1964. After this was published, the team turned their attention to an implementation of the notation on a computer system. One of the motivations for this focus of implementation was the interest of John L. Lawrence who had new duties with Science Research Associates, an educational company bought by IBM in 1964. Lawrence asked Iverson and his group to help use the language as a tool to develop and use computers in education. After Lawrence M. Breed and Philip S. Abrams of Stanford University joined the team at IBM Research, they continued their prior work on an implementation programmed in FORTRAN IV for a part of the notation which had been done for the IBM 7090 computer running on the IBSYS operating system. T
https://en.wikipedia.org/wiki/ALGOL
ALGOL (; short for "Algorithmic Language") is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years. In the sense that the syntax of most modern languages is "Algol-like", it was arguably more influential than three other high-level programming languages among which it was roughly contemporary: FORTRAN, Lisp, and COBOL. It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, and C. ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and through the Algol 60 Report introduced Backus–Naur form, a principal formal grammar notation for language design. There were three major specifications, named after the years they were first published: ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language. ALGOL 60 – first implemented as X1 ALGOL 60 in 1961. Revised 1963. ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973. ALGOL 68 is substantially different from ALGOL 60 and was not well received, so in general "Algol" means ALGOL 60 and its dialects. History ALGOL was developed jointly by a committee of European and American computer scientists in a meeting in 1958 at the Swiss Federal Institute of Technology in Zurich (cf. ALGOL 58). It specified three different syntaxes: a reference syntax, a publication syntax, and an implementation syntax. The different syntaxes permitted it to use different keyword names and conventions for decimal points (commas vs periods) for different languages. ALGOL was used mostly by research computer scientists in the United States and in Europe. Its use in commercial applications was hindered by the absence of standard input/output facilities in its description and the lack of interest in the language by large computer vendors other than Burroughs Corporation. ALGOL 60 did however become the standard for the publication of algorithms and had a profound effect on future language development. John Backus developed the Backus normal form method of describing programming languages specifically for ALGOL 58. It was revised and expanded by Peter Naur for ALGOL 60, and at Donald Knuth's suggestion renamed Backus–Naur form. Peter Naur: "As editor of the ALGOL Bulletin I was drawn into the international discussions of the language and was selected to be member of the European language design group in November 1959. In this capacity I was the editor of the
https://en.wikipedia.org/wiki/AWK
AWK (awk ) is a domain-specific language designed for text processing and typically used as a data extraction and reporting tool. Like sed and grep, it is a filter, and is a standard feature of most Unix-like operating systems. The AWK language is a data-driven scripting language consisting of a set of actions to be taken against streams of textual data – either run directly on files or used as part of a pipeline – for purposes of extracting or transforming text, such as producing formatted reports. The language extensively uses the string datatype, associative arrays (that is, arrays indexed by key strings), and regular expressions. While AWK has a limited intended application domain and was especially designed to support one-liner programs, the language is Turing-complete, and even the early Bell Labs users of AWK often wrote well-structured large AWK programs. AWK was created at Bell Labs in the 1970s, and its name is derived from the surnames of its authors: Alfred Aho, Peter Weinberger, and Brian Kernighan. The acronym is pronounced the same as the name of the bird species auk, which is illustrated on the cover of The AWK Programming Language. When written in all lowercase letters, as awk, it refers to the Unix or Plan 9 program that runs scripts written in the AWK programming language. History AWK was initially developed in 1977 by Alfred Aho (author of egrep), Peter J. Weinberger (who worked on tiny relational databases), and Brian Kernighan. AWK takes its name from their respective initials. According to Kernighan, one of the goals of AWK was to have a tool that would easily manipulate both numbers and strings. AWK was also inspired by Marc Rochkind's programming language that was used to search for patterns in input data, and was implemented using yacc. As one of the early tools to appear in Version 7 Unix, AWK added computational features to a Unix pipeline besides the Bourne shell, the only scripting language available in a standard Unix environment. It is one of the mandatory utilities of the Single UNIX Specification, and is required by the Linux Standard Base specification. AWK was significantly revised and expanded in 1985–88, resulting in the GNU AWK implementation written by Paul Rubin, Jay Fenlason, and Richard Stallman, released in 1988. GNU AWK may be the most widely deployed version because it is included with GNU-based Linux packages. GNU AWK has been maintained solely by Arnold Robbins since 1994. Brian Kernighan's nawk (New AWK) source was first released in 1993 unpublicized, and publicly since the late 1990s; many BSD systems use it to avoid the GPL license. AWK was preceded by sed (1974). Both were designed for text processing. They share the line-oriented, data-driven paradigm, and are particularly suited to writing one-liner programs, due to the implicit main loop and current line variables. The power and terseness of early AWK programs – notably the powerful regular expression handling and conciseness due to im
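The data-driven, pattern-action model with an implicit main loop described above can be loosely imitated in Python (the language used elsewhere in this collection): an AWK program is essentially a list of pattern/action pairs applied to every input record. The helper name awk_like and the sample rule below are hypothetical illustrations, not drawn from the article:

import sys

def awk_like(lines, rules):
    # Implicit main loop: every record is split into fields ($1, $2, ... in AWK),
    # and each action runs when its pattern matches the record.
    for record in lines:
        fields = record.split()
        for pattern, action in rules:
            if pattern(record, fields):
                action(record, fields)

# Rough analogue of a one-liner that prints the second field of lines containing "error".
awk_like(sys.stdin,
         [(lambda line, f: "error" in line and len(f) >= 2,
           lambda line, f: print(f[1]))])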
https://en.wikipedia.org/wiki/Kolmogorov%20complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as algorithmic complexity, Solomonoff–Kolmogorov–Chaitin complexity, program-size complexity, descriptive complexity, or algorithmic entropy. It is named after Andrey Kolmogorov, who first published on the subject in 1963 and is a generalization of classical information theory. The notion of Kolmogorov complexity can be used to state and prove impossibility results akin to Cantor's diagonal argument, Gödel's incompleteness theorem, and Turing's halting problem. In particular, no program P computing a lower bound for each text's Kolmogorov complexity can return a value essentially larger than P's own length (see section ); hence no single program can compute the exact Kolmogorov complexity for infinitely many texts. Definition Consider the following two strings of 32 lowercase letters and digits: abababababababababababababababab , and 4c1j5b2p0cv4w1x8rx2y39umgw5q85s7 The first string has a short English-language description, namely "write ab 16 times", which consists of 17 characters. The second one has no obvious simple description (using the same character set) other than writing down the string itself, i.e., "write 4c1j5b2p0cv4w1x8rx2y39umgw5q85s7" which has 38 characters. Hence the operation of writing the first string can be said to have "less complexity" than writing the second. More formally, the complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (the sensitivity of complexity relative to the choice of description language is discussed below). It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself. Strings like the abab example above, whose Kolmogorov complexity is small relative to the string's size, are not considered to be complex. The Kolmogorov complexity can be defined for any mathematical object, but for simplicity the scope of this article is restricted to strings. We must first specify a description language for strings. Such a description language can be based on any computer programming language, such as Lisp, Pascal, or Java. If P is a program which outputs a string x, then P is a description of x. The length of the description is just the length of P as a character string, multiplied by the number of bits in a character (e.g., 7 for ASCII). We could, alternatively, choose an encoding for Turing machines, where an encoding is a function which associates to each Turing Machine M a bitstring <M>. If M is a Turing Machine which, on input w, outputs string x, then the concatenated string <M> w is a
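The contrast between the two example strings above can be made concrete with a short Python sketch, informally taking "Python expressions" as the description language; this only illustrates the idea and says nothing rigorous about the second string's true complexity:

s1 = "ab" * 16                           # a nine-character expression producing 32 characters
s2 = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"  # no obviously shorter description than the literal itself
print(s1)                 # abababababababababababababababab
print(len(s1), len(s2))   # 32 32
# The "program" for s1 is much shorter than s1, while the shortest known
# description of s2 is roughly the string written out verbatim.
print(len('"ab" * 16'), len('"' + s2 + '"'))  # 9 34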
https://en.wikipedia.org/wiki/ASCII%20art
ASCII art is a graphic design technique that uses computers for presentation and consists of pictures pieced together from the 95 printable (from a total of 128) characters defined by the ASCII Standard from 1963 and ASCII compliant character sets with proprietary extended characters (beyond the 128 characters of standard 7-bit ASCII). The term is also loosely used to refer to text-based visual art in general. ASCII art can be created with any text editor, and is often used with free-form languages. Most examples of ASCII art require a fixed-width font (non-proportional fonts, as on a traditional typewriter) such as Courier for presentation. Among the oldest known examples of ASCII art are the creations by computer-art pioneer Kenneth Knowlton from around 1966, who was working for Bell Labs at the time. "Studies in Perception I" by Knowlton and Leon Harmon from 1966 shows some examples of their early ASCII art. ASCII art was invented, in large part, because early printers often lacked graphics ability and thus, characters were used in place of graphic marks. Also, to mark divisions between different print jobs from different users, bulk printers often used ASCII art to print large banner pages, making the division easier to spot so that the results could be more easily separated by a computer operator or clerk. ASCII art was also used in early e-mail when images could not be embedded. History Typewriter art Since 1867, typewriters have been used for creating visual art. TTY and RTTY TTY stands for "TeleTYpe" or "TeleTYpewriter", and is also known as Teleprinter or Teletype. RTTY stands for Radioteletype; character sets such as Baudot code, which predated ASCII, were used. According to a chapter in the "RTTY Handbook", text images have been sent via teletypewriter as early as 1923. However, none of the "old" RTTY art has been discovered yet. What is known is that text images appeared frequently on radioteletype in the 1960s and the 1970s. Line-printer art In the 1960s, Andries van Dam published a representation of an electronic circuit produced on an IBM 1403 line printer. At the same time, Kenneth Knowlton was producing realistic images, also on line printers, by overprinting several characters on top of one another. Note that it was not ASCII art in a sense that the 1403 was driven by an EBCDIC-coded platform and the character sets and trains available on the 1403 were derived from EBCDIC rather than ASCII, despite some glyphs commonalities. ASCII art The widespread usage of ASCII art can be traced to the computer bulletin board systems of the late 1970s and early 1980s. The limitations of computers of that time period necessitated the use of text characters to represent images. Along with ASCII's use in communication, however, it also began to appear in the underground online art groups of the period. An ASCII comic is a form of webcomic which uses ASCII text to create images. In place of images in a regular comic, ASCII art is used, wi
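At its core the technique is a grid of printable characters viewed in a fixed-width font. As a toy illustration only (not from the article; the character ramp and the sample gradient below are arbitrary choices), the following Python sketch maps brightness values onto a ramp of ASCII characters.

```python
# Toy sketch of the basic idea behind ASCII art: map brightness onto a ramp
# of printable ASCII characters and rely on a fixed-width font so the grid
# lines up. The ramp and the sample "image" are arbitrary illustrations.

RAMP = " .:-=+*#%@"   # 10 printable characters, darkest last

def to_ascii(rows):
    """rows: grid of brightness values in [0.0, 1.0]; returns ASCII-art lines."""
    out = []
    for row in rows:
        out.append("".join(RAMP[min(int(v * len(RAMP)), len(RAMP) - 1)] for v in row))
    return out

if __name__ == "__main__":
    # A small radial gradient as sample input.
    size = 16
    centre = (size - 1) / 2
    image = [[max(0.0, 1.0 - (((x - centre) ** 2 + (y - centre) ** 2) ** 0.5) / centre)
              for x in range(size)] for y in range(size)]
    print("\n".join(to_ascii(image)))
```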
https://en.wikipedia.org/wiki/Adobe%20Inc.
Adobe Inc. ( ), formerly Adobe Systems Incorporated, is an American multinational computer software company incorporated in Delaware and headquartered in San Jose, California. It has historically specialized in software for the creation and publication of a wide range of content, including graphics, photography, illustration, animation, multimedia/video, motion pictures, and print. Its flagship products include Adobe Photoshop image editing software; Adobe Illustrator vector-based illustration software; Adobe Acrobat Reader and the Portable Document Format (PDF); and a host of tools primarily for audio-visual content creation, editing and publishing. Adobe offered a bundled solution of its products named Adobe Creative Suite, which evolved into a subscription software as a service (SaaS) offering named Adobe Creative Cloud. The company also expanded into digital marketing software and in 2021 was considered one of the top global leaders in Customer Experience Management (CXM). Adobe was founded in December 1982 by John Warnock and Charles Geschke, who established the company after leaving Xerox PARC to develop and sell the PostScript page description language. In 1985, Apple Computer licensed PostScript for use in its LaserWriter printers, which helped spark the desktop publishing revolution. Adobe later developed animation and multimedia through its acquisition of Macromedia, from which it acquired Macromedia Flash; video editing and compositing software with Adobe Premiere, later known as Adobe Premiere Pro; low-code web development with Adobe Muse; and a suite of software for digital marketing management. As of 2022, Adobe has more than 26,000 employees worldwide. Adobe also has major development operations in the United States in Newton, New York City, Arden Hills, Lehi, Seattle, Austin and San Francisco. It also has major development operations in Noida and Bangalore in India. History The company was started in John Warnock's garage. The name of the company, Adobe, comes from Adobe Creek in Los Altos, California, a stream which ran behind Warnock's house. That creek is so named because of the type of clay found there (Adobe being a Spanish word for Mudbrick), which alludes to the creative nature of the company's software. Adobe's corporate logo features a stylized "A" and was designed by graphic designer Marva Warnock, John Warnock's wife. In 2020, the company updated its visual identity, including updating its logo to a single color, an all-red logo. Steve Jobs attempted to buy the company for $5 million in 1982, but Warnock and Geschke refused. Their investors urged them to work something out with Jobs, so they agreed to sell him shares worth 19 percent of the company. Jobs paid a five-times multiple of their company's valuation at the time, plus a five-year license fee for PostScript, in advance. The purchase and advance made Adobe the first company in the history of Silicon Valley to become profitable in its first year. Warnock and
https://en.wikipedia.org/wiki/Amiga
Amiga is a family of personal computers introduced by Commodore in 1985. The original model is one of a number of mid-1980s computers with 16- or 16/32-bit processors, 256 KB or more of RAM, mouse-based GUIs, and significantly improved graphics and audio compared to previous 8-bit systems. These systems include the Atari ST—released earlier the same year—as well as the Macintosh and Acorn Archimedes. Based on the Motorola 68000 microprocessor, the Amiga differs from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS. The Amiga 1000 was released in July 1985, but production problems kept it from becoming widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 along with the more expandable Amiga 2000. The Amiga 3000 was introduced in 1990, followed by the Amiga 500 Plus, and Amiga 600 in March 1992. Finally, the Amiga 1200 and Amiga 4000 were released in late 1992. The Amiga line sold an estimated 4.85 million units. Although early advertisements cast the computer as an all-purpose business machine, especially when outfitted with the Sidecar IBM PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. The Video Toaster hardware and software suite helped Amiga find a prominent role in desktop video and video production. The Amiga's audio hardware made it a popular platform for music tracker software. The processor and memory capacity enabled 3D rendering packages, including LightWave 3D, Imagine, and Traces, a predecessor to Blender. Poor marketing and the failure of later models to repeat the technological advances of the first systems resulted in Commodore quickly losing market share to the rapidly dropping prices of IBM PC compatibles, which gained 256 color graphics in 1987, as well as the fourth generation of video game consoles. Commodore ultimately went bankrupt in April 1994 after a version of the Amiga packaged as a game console, the Amiga CD32, failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl and A-EON Technology. AmigaOS has influenced replacements, clones, and compatible systems such as MorphOS and AROS. Currently Belgian company Hyperion Entertainment maintains and develops AmigaOS 4, which is an official and direct descendant of AmigaOS 3.1 – the last system made by Commodore for the original Amiga Computers. History Concept and early development Jay Miner joined Atari, Inc. in the 1970s to develop custom integrated circuits, and led development of the Atari Video Computer System's TIA. When complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC and POKEY, that formed the basis of the Atari 8-bit family. With the 8-bit
https://en.wikipedia.org/wiki/Atomic%20semantics
Atomic semantics is a type of guarantee provided by a data register shared by several processors in a parallel machine or in a network of computers working together. Atomic semantics are very strong. An atomic register provides strong guarantees even in the presence of concurrency and failures. A read/write register R stores a value and is accessed by two basic operations: read and write(v). A read returns the value stored in R and write(v) changes the value stored in R to v. A register is called atomic if it satisfies the following two properties: 1) Each invocation op of a read or write operation: • must appear as if it were executed at a single point τ(op) in time; • τ(op) satisfies τ_b(op) ≤ τ(op) ≤ τ_e(op), where τ_b(op) and τ_e(op) indicate the times when the operation op begins and ends; • if op1 ≠ op2, then τ(op1) ≠ τ(op2). 2) Each read operation returns the value written by the last write operation before the read, in the sequence where all operations are ordered by their τ values. Atomic/linearizable register: Termination: when a node is correct, sooner or later each read and write operation will complete. Safety property (linearization points for read, write and failed operations): Read operation: it appears as if it happened at all nodes at some time between the invocation and the response time. Write operation: similarly to a read operation, it appears as if it happened at all nodes at some time between the invocation and the response time. Failed operation (the atomic term comes from this notion): it appears as if it either completed at every single node or never happened at any node. Example: We know that an atomic register is one that is linearizable to a sequential safe register. The following picture shows where we should put the linearization point for each operation: An atomic register could be defined for a variable with a single writer but multiple readers (SWMR), a single writer and a single reader (SWSR), or multiple writers and multiple readers (MWMR). Here is an example of a multi-reader multi-writer atomic register which is accessed by three processes (P1, P2, P3). Note that R.read() → v means that the corresponding read operation returns v, the value of the register. Therefore, the following execution of the register R satisfies the definition of an atomic register: R.write(1), R.read()→1, R.write(3), R.write(2), R.read()→2, R.read()→2. See also Regular semantics Safe semantics References Atomic semantics are defined formally in Lamport's "On Interprocess Communication", Distributed Computing 1, 2 (1986), 77–101. (Also appeared as SRC Research Report 8.) Concurrency control
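The linearization-point idea can be illustrated in a shared-memory setting. The sketch below is an illustration only — a single-process, lock-based register, not the distributed, failure-tolerant construction the article is about — but it shows each read and write taking effect at a single instant that lies between the operation's invocation and its response.

```python
import threading

# Minimal shared-memory sketch of an atomic (linearizable) read/write
# register: each read() and write(v) takes effect at a single point in time
# (while the lock is held), which lies between the operation's invocation
# and its response. Illustrative only; not the distributed construction.

class AtomicRegister:
    def __init__(self, initial=None):
        self._value = initial
        self._lock = threading.Lock()

    def read(self):
        with self._lock:              # linearization point: while the lock is held
            return self._value

    def write(self, v):
        with self._lock:              # linearization point: while the lock is held
            self._value = v

if __name__ == "__main__":
    R = AtomicRegister(0)
    R.write(1); print(R.read())       # -> 1
    R.write(3); R.write(2)
    print(R.read(), R.read())         # -> 2 2, matching the execution in the text
```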
https://en.wikipedia.org/wiki/Alpha%20compositing
In computer graphics, alpha compositing or alpha blending is the process of combining one image with a background to create the appearance of partial or full transparency. It is often useful to render picture elements (pixels) in separate passes or layers and then combine the resulting 2D images into a single, final image called the composite. Compositing is used extensively in film when combining computer-rendered image elements with live footage. Alpha blending is also used in 2D computer graphics to put rasterized foreground elements over a background. In order to combine the picture elements of the images correctly, it is necessary to keep an associated matte for each element in addition to its color. This matte layer contains the coverage information—the shape of the geometry being drawn—making it possible to distinguish between parts of the image where something was drawn and parts that are empty. Although the most basic operation of combining two images is to put one over the other, there are many operations, or blend modes, that are used. History The concept of an alpha channel was introduced by Alvy Ray Smith and in the late 1970s at the New York Institute of Technology Computer Graphics Lab. Bruce A. Wallace derived the same straight over operator based on a physical reflectance/transmittance model in 1981. A 1984 paper by Thomas Porter and Tom Duff introduced premultiplied alpha using a geometrical approach. The use of the term alpha is explained by Smith as follows: "We called it that because of the classic linear interpolation formula that uses the Greek letter (alpha) to control the amount of interpolation between, in this case, two images A and B". That is, when compositing image A atop image B, the value of in the formula is taken directly from A's alpha channel. Description In a 2D image a color combination is stored for each picture element (pixel), often a combination of red, green and blue (RGB). When alpha compositing is in use, each pixel has an additional numeric value stored in its alpha channel, with a value ranging from 0 to 1. A value of 0 means that the pixel is fully transparent and the color in the pixel beneath will show through. A value of 1 means that the pixel is fully opaque. With the existence of an alpha channel, it is possible to express compositing image operations using a compositing algebra. For example, given two images A and B, the most common compositing operation is to combine the images so that A appears in the foreground and B appears in the background. This can be expressed as A over B. In addition to over, Porter and Duff defined the compositing operators in, held out by (the phrase refers to holdout matting and is usually abbreviated out), atop, and xor (and the reverse operators rover, rin, rout, and ratop) from a consideration of choices in blending the colors of two pixels when their coverage is, conceptually, overlaid orthogonally: As an example, the over operator can be accomplis
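For straight (non-premultiplied) alpha, the usual form of the over operator combines the two pixels' colours weighted by their coverage. The sketch below is an illustrative implementation of that standard formula for a single pair of RGBA values; real compositors typically work with premultiplied alpha instead, as the Porter–Duff paper describes.

```python
# Sketch of the "A over B" operator for straight (non-premultiplied) RGBA
# pixels with channels in [0.0, 1.0], using the standard formulas:
#   alpha_out = alpha_A + alpha_B * (1 - alpha_A)
#   color_out = (color_A*alpha_A + color_B*alpha_B*(1 - alpha_A)) / alpha_out

def over(a, b):
    """a, b: (r, g, b, alpha) tuples; returns a composited over b."""
    ra, ga, ba, aa = a
    rb, gb, bb, ab = b
    ao = aa + ab * (1.0 - aa)
    if ao == 0.0:
        return (0.0, 0.0, 0.0, 0.0)       # fully transparent result

    def blend(ca, cb):
        return (ca * aa + cb * ab * (1.0 - aa)) / ao

    return (blend(ra, rb), blend(ga, gb), blend(ba, bb), ao)

if __name__ == "__main__":
    red_half = (1.0, 0.0, 0.0, 0.5)       # 50% opaque red foreground
    white    = (1.0, 1.0, 1.0, 1.0)       # opaque white background
    print(over(red_half, white))          # -> (1.0, 0.5, 0.5, 1.0)
```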
https://en.wikipedia.org/wiki/Array%20%28data%20structure%29
In computer science, an array is a data structure consisting of a collection of elements (values or variables), of same memory size, each identified by at least one array index or key. An array is stored such that the position of each element can be computed from its index tuple by a mathematical formula. The simplest type of data structure is a linear array, also called one-dimensional array. For example, an array of ten 32-bit (4-byte) integer variables, with indices 0 through 9, may be stored as ten words at memory addresses 2000, 2004, 2008, ..., 2036, (in hexadecimal: 0x7D0, 0x7D4, 0x7D8, ..., 0x7F4) so that the element with index i has the address 2000 + (i × 4). The memory address of the first element of an array is called first address, foundation address, or base address. Because the mathematical concept of a matrix can be represented as a two-dimensional grid, two-dimensional arrays are also sometimes called "matrices". In some cases the term "vector" is used in computing to refer to an array, although tuples rather than vectors are the more mathematically correct equivalent. Tables are often implemented in the form of arrays, especially lookup tables; the word "table" is sometimes used as a synonym of array. Arrays are among the oldest and most important data structures, and are used by almost every program. They are also used to implement many other data structures, such as lists and strings. They effectively exploit the addressing logic of computers. In most modern computers and many external storage devices, the memory is a one-dimensional array of words, whose indices are their addresses. Processors, especially vector processors, are often optimized for array operations. Arrays are useful mostly because the element indices can be computed at run time. Among other things, this feature allows a single iterative statement to process arbitrarily many elements of an array. For that reason, the elements of an array data structure are required to have the same size and should use the same data representation. The set of valid index tuples and the addresses of the elements (and hence the element addressing formula) are usually, but not always, fixed while the array is in use. The term "array" may also refer to an array data type, a kind of data type provided by most high-level programming languages that consists of a collection of values or variables that can be selected by one or more indices computed at run-time. Array types are often implemented by array structures; however, in some languages they may be implemented by hash tables, linked lists, search trees, or other data structures. The term is also used, especially in the description of algorithms, to mean associative array or "abstract array", a theoretical computer science model (an abstract data type or ADT) intended to capture the essential properties of arrays. History The first digital computers used machine-language programming to set up and access array structures for
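The ten-integer example above boils down to the element-addressing formula address = base + index × element_size. The following short sketch simply evaluates that formula and reproduces the addresses quoted in the text.

```python
# Addressing formula from the example above: a linear array of ten 4-byte
# integers stored at base address 2000, so element i lives at 2000 + i * 4.

BASE = 2000          # base (first) address from the example
ELEMENT_SIZE = 4     # bytes per 32-bit integer

def element_address(index: int) -> int:
    return BASE + index * ELEMENT_SIZE

for i in range(10):
    print(f"a[{i}] -> address {element_address(i)} (0x{element_address(i):X})")
# a[0] -> 2000 (0x7D0), ..., a[9] -> 2036 (0x7F4), matching the text.
```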
https://en.wikipedia.org/wiki/Acorn%20Electron
The Acorn Electron (nicknamed the Elk inside Acorn and beyond) was a lower-cost alternative to the BBC Micro educational/home computer, also developed by Acorn Computers Ltd, to provide many of the features of that more expensive machine at a price more competitive with that of the ZX Spectrum. It had 32 kilobytes of RAM, and its ROM included BBC BASIC II together with the operating system. Announced in 1982 for a possible release the same year, it was eventually introduced on 25 August 1983 priced at £199. The Electron was able to save and load programs onto audio cassette via a supplied cable that connected it to any standard tape recorder that had the correct sockets. It was capable of bitmapped graphics, and could use either a television set, a colour (RGB) monitor or a monochrome monitor as its display. Several expansions were made available to provide many of the capabilities omitted from the BBC Micro. Acorn introduced a general-purpose expansion unit, the Plus 1, offering analogue joystick and parallel ports, together with cartridge slots into which ROM cartridges, providing software, or other kinds of hardware expansions, such as disc interfaces, could be inserted. Acorn also produced a dedicated disc expansion, the Plus 3, featuring a disc controller and 3.5-inch floppy drive. For a short period, the Electron was reportedly the best selling micro in the United Kingdom, with an estimated 200,000 to 250,000 machines sold over its entire commercial lifespan. With production effectively discontinued by Acorn as early as 1985, and with the machine offered in bundles with games and expansions, later being substantially discounted by retailers, a revival in demand for the Electron supported a market for software and expansions without Acorn's involvement, with its market for games also helping to sustain the continued viability of games production for the BBC Micro. History After Acorn Computers released the BBC Micro, executives believed that the company needed a less-expensive computer for the mass market. In May 1982, when asked about the recently announced Sinclair ZX Spectrum's potential to hurt sales of the BBC Micro, priced at £125 for the 16K model compared to around twice that price for the 16K BBC Model A, Acorn co-founder Hermann Hauser responded that in the third quarter of that year Acorn would release a new £120–150 computer which "will probably be called the Electron", a form of "miniaturised BBC Micro", having 32 KB of RAM and 32 KB of ROM, with "higher resolution graphics than those offered by the Spectrum". Acorn co-founder Chris Curry also emphasised the Electron's role as being "designed to compete with the Spectrum... to get the starting price very low, but not preclude expansion in the long term." In order to reduce component costs, and to prevent cloning, the company reduced the number of chips in the Electron from the 102 on the BBC Micro's motherboard to "something like 12 to 14 chips" with most functionality on a
https://en.wikipedia.org/wiki/Andrew%20Tridgell
Andrew "Tridge" Tridgell (born 28 February 1967) is an Australian computer programmer. He is the author of and a contributor to the Samba file server, and co-inventor of the rsync algorithm. He has analysed complex proprietary protocols and algorithms, to allow compatible free and open source software implementations. Projects Tridgell was a major developer of the Samba software, analyzing the Server Message Block protocol used for workgroup and network file sharing by Microsoft Windows products. He developed the hierarchical memory allocator, originally as part of Samba. For his PhD dissertation, he co-developed rsync, including the rsync algorithm, a highly efficient file transfer and synchronisation tool. He was also the original author of rzip, which uses a similar algorithm to rsync. He developed spamsum, based on locality-sensitive hashing algorithms. He is the author of KnightCap, a reinforcement-learning based chess engine. Tridgell was also a leader in hacking the TiVo to make it work in Australia, which uses the PAL video format. In April 2005, Tridgell tried to produce free software (now known as SourcePuller) that interoperated with the BitKeeper source code repository. This was cited as the reason that BitMover revoked a license allowing Linux developers free use of their BitKeeper product. Linus Torvalds, the creator of the Linux kernel, and Tridgell were thus involved in a public debate about the events, in which Tridgell stated that, not having bought or owned BitKeeper – and thus having never agreed to its license – he could not violate it, and was analyzing the protocol ethically, as he had done with Samba. Tridgell's involvement in the project resulted in Torvalds accusing him of playing dirty tricks with BitKeeper. Tridgell claimed his analysis started with simply telneting to a BitKeeper server and typing help. In 2011 Tridgell got involved with the software development of ArduPilot Mega, an open source Arduino-based UAV controller board, working on an entry for the UAV Challenge Outback Rescue. Academic achievements Tridgell completed a PhD at the Computer Sciences Laboratory of the Australian National University. His original doctorate work was in the area of speech recognition but was never completed. His submitted dissertation 'Efficient Algorithms for Sorting and Synchronization' was based on his work on the rsync algorithm. Awards and honours In October 2003, The Bulletin magazine judged Tridgell to be Australia's smartest Information and Communications Technology person. In July 2008, Tridgell was named "Best Interoperator" at the Google–O'Reilly Open Source Awards, for his work on Samba and rsync. Tridgell (along with Jeremy Allison and Volker Lendecke) has been called a "guru in its traditional Indian meaning" by IT writer, Sam Varghese. On 11 December 2018, Tridgell was awarded the degree of Doctor of Science (Honoris Causa) by the Australian National University, for authoring Samba, co-inventing rsyn
https://en.wikipedia.org/wiki/Applesoft%20BASIC
Applesoft BASIC is a dialect of Microsoft BASIC, developed by Marc McDonald and Ric Weiland, supplied with the Apple II series of computers. It supersedes Integer BASIC and is the BASIC in ROM in all Apple II series computers after the original Apple II model. It is also referred to as FP BASIC (from floating point) because of the Apple DOS command used to invoke it, instead of INT for Integer BASIC. Applesoft BASIC was supplied by Microsoft and its name is derived from the names of both Apple Computer and Microsoft. Apple employees, including Randy Wigginton, adapted Microsoft's interpreter for the Apple II and added several features. The first version of Applesoft was released in 1977 on cassette tape and lacked proper support for high-resolution graphics. Applesoft II, which was made available on cassette and disk and in the ROM of the Apple II Plus and subsequent models, was released in 1978. It is this latter version, which has some syntax differences and support for the Apple II high-resolution graphics modes, that is usually synonymous with the term "Applesoft." A compiler for Applesoft BASIC, TASC (The Applesoft Compiler), was released by Microsoft in 1981. History When Steve Wozniak wrote Integer BASIC for the Apple II, he did not implement support for floating-point arithmetic because he was primarily interested in writing games, a task for which integers alone were sufficient. In 1976, Microsoft had developed Microsoft BASIC for the MOS Technology 6502, but at the time there was no production computer that used it. Upon learning that Apple had a 6502 machine, Microsoft asked if the company were interested in licensing BASIC, but Steve Jobs replied that Apple already had one. The Apple II was unveiled to the public at the West Coast Computer Faire in April 1977 and became available for sale in June. One of the most common customer complaints about the computer was BASIC's lack of floating-point math. Making things more problematic was that the rival Commodore PET personal computer had a floating point-capable BASIC interpreter from the beginning. As Wozniak—the only person who understood Integer BASIC well enough to add floating point features—was busy with the Disk II drive and controller and with Apple DOS, Apple turned to Microsoft. Apple reportedly obtained an eight-year license for Applesoft BASIC from Microsoft for a flat fee of $31,000, renewing it in 1985 through an arrangement that gave Microsoft the rights and source code for Apple's Macintosh version of BASIC. Applesoft was designed to be backwards-compatible with Integer BASIC and uses the core of Microsoft's 6502 BASIC implementation, which includes using the GET command for detecting key presses and not requiring any spaces on program lines. While Applesoft BASIC is slower than Integer BASIC, it has many features that the older BASIC lacks: Atomic strings: A string is no longer an array of characters (as in Integer BASIC and C); it is instead a garbage-collected ob
https://en.wikipedia.org/wiki/IBM%20AIX
AIX (Advanced Interactive eXecutive, pronounced ,) is a series of proprietary Unix operating systems developed and sold by IBM for several of its computer platforms. Background Originally released for the IBM RT PC RISC workstation in 1986, AIX has supported a wide variety of hardware platforms, including the IBM RS/6000 series and later Power and PowerPC-based systems, IBM System i, System/370 mainframes, PS/2 personal computers, and the Apple Network Server. It is currently supported on IBM Power Systems alongside IBM i and Linux. AIX is based on UNIX System V with 4.3BSD-compatible extensions. It is certified to the UNIX 03 and UNIX V7 marks of the Single UNIX Specification, beginning with AIX versions 5.3 and 7.2 TL5 respectively. Older versions were previously certified to the UNIX 95 and UNIX 98 marks. AIX was the first operating system to have a journaling file system, and IBM has continuously enhanced the software with features such as processor, disk and network virtualization, dynamic hardware resource allocation (including fractional processor units), and reliability engineering ported from its mainframe designs. History Unix started life at AT&T's Bell Labs research center in the early 1970s, running on DEC minicomputers. By 1976, the operating system was in use at various academic institutions, including Princeton, where Tom Lyon and others ported it to the S/370, to run as a guest OS under VM/370. This port would later grow out to become UTS, a mainframe Unix offering by IBM's competitor Amdahl Corporation. IBM's own involvement in Unix can be dated to 1979, when it assisted Bell Labs in doing its own Unix port to the 370 (to be used as a build host for the 5ESS switch's software). In the process, IBM made modifications to the TSS/370 hypervisor to better support Unix. It took until 1985 for IBM to offer its own Unix on the S/370 platform, IX/370, which was developed by Interactive Systems Corporation and intended by IBM to compete with Amdahl UTS. The operating system offered special facilities for interoperating with PC/IX, Interactive/IBM's version of Unix for IBM PC compatible hardware, and was licensed at $10,000 per sixteen concurrent users. AIX Version 1, introduced in 1986 for the IBM RT PC workstation, was based on UNIX System V Releases 1 and 2. In developing AIX, IBM and Interactive Systems Corporation (whom IBM contracted) also incorporated source code from 4.2 and 4.3 BSD UNIX. Among other variants, IBM later produced AIX Version 3 (also known as AIX/6000), based on System V Release 3, for their POWER-based RS/6000 platform. Since 1990, AIX has served as the primary operating system for the RS/6000 series (later renamed IBM eServer pSeries, then IBM System p, and now IBM Power Systems). AIX Version 4, introduced in 1994, added symmetric multiprocessing with the introduction of the first RS/6000 SMP servers and continued to evolve through the 1990s, culminating with AIX 4.3.3 in 1999. Version 4.1, in a slightly
https://en.wikipedia.org/wiki/AppleTalk
AppleTalk is a discontinued proprietary suite of networking protocols developed by Apple Computer for their Macintosh computers. AppleTalk includes a number of features that allow local area networks to be connected with no prior setup or the need for a centralized router or server of any sort. Connected AppleTalk-equipped systems automatically assign addresses, update the distributed namespace, and configure any required inter-networking routing. AppleTalk was released in 1985 and was the primary protocol used by Apple devices through the 1980s and 1990s. Versions were also released for the IBM PC and compatibles and the Apple IIGS. AppleTalk support was also available in most networked printers (especially laser printers), some file servers, and a number of routers. The rise of TCP/IP during the 1990s led to a reimplementation of most of these types of support on that protocol, and AppleTalk became unsupported as of the release of Mac OS X v10.6 in 2009. Many of AppleTalk's more advanced autoconfiguration features have since been introduced in Bonjour, while Universal Plug and Play serves similar needs. History AppleNet After the release of the Apple Lisa computer in January 1983, Apple invested considerable effort in the development of a local area networking (LAN) system for the machines. Known as AppleNet, it was based on the seminal Xerox XNS protocol stack but running on a custom 1 Mbit/s coaxial cable system rather than Xerox's 2.94 Mbit/s Ethernet. AppleNet was announced early in 1983 with a full introduction at the target price of $500 for plug-in AppleNet cards for the Lisa and the Apple II. At that time, early LAN systems were just coming to market, including Ethernet, Token Ring, Econet, and ARCNET. This was a topic of major commercial effort at the time, dominating shows like the National Computer Conference (NCC) in Anaheim in May 1983. All of the systems were jockeying for position in the market, but even at this time, Ethernet's widespread acceptance suggested it was to become a de facto standard. It was at this show that Steve Jobs asked Gursharan Sidhu a seemingly innocuous question: "Why has networking not caught on?" Four months later, in October, AppleNet was cancelled. At the time, they announced that "Apple realized that it's not in the business to create a networking system. We built and used AppleNet in-house, but we realized that if we had shipped it, we would have seen new standards coming up." In January, Jobs announced that they would instead be supporting IBM's Token Ring, which he expected to come out in a "few months". AppleBus Through this period, Apple was deep in development of the Macintosh computer. During development, engineers had made the decision to use the Zilog 8530 serial controller chip (SCC) instead of the lower-cost and more common UART to provide serial port connections. The SCC cost about $5 more than a UART, but offered much higher speeds of up to 250 kilobits per second (or higher with ad
https://en.wikipedia.org/wiki/Apple%20II%20series
The Apple II series (trademarked with square brackets as "Apple ][" and rendered on later models as "Apple //") is a family of home computers, one of the first highly successful mass-produced microcomputer products, designed primarily by Steve Wozniak, manufactured by Apple Computer (now Apple Inc.), and launched in 1977 with the original Apple II. In terms of ease of use, features, and expandability, the Apple II was a major advancement over its predecessor, the Apple I, a limited-production bare circuit board computer for electronics hobbyists. Through 1988, a number of models were introduced, with the most popular, the Apple IIe, remaining relatively unchanged into the 1990s. A model with more advanced graphics and sound and a 16-bit processor, the Apple IIGS, was added in 1986. It remained compatible with earlier Apple II models, but the IIGS had more in common with mid-1980s systems like the Atari ST, Amiga, and Acorn Archimedes. The Apple II was first sold on June 10, 1977. By the end of production in 1993, somewhere between five and six million Apple II series computers (including about 1.25 million Apple IIGS models) had been produced. The Apple II was one of the longest running mass-produced home computer series, with models in production for just under 17 years. The Apple II became one of several recognizable and successful computers during the 1980s and early 1990s, although this was mainly limited to the US. It was aggressively marketed through volume discounts and manufacturing arrangements to educational institutions, which made it the first computer in widespread use in American secondary schools, displacing the early leader Commodore PET. The effort to develop educational and business software for the Apple II, including the 1979 release of the popular VisiCalc spreadsheet, made the computer especially popular with business users and families. Despite the introduction of the Motorola 68000-based Macintosh in 1984, the Apple II series still reportedly accounted for 85% of the company's hardware sales in the first quarter of fiscal 1985. Apple continued to sell Apple II systems alongside the Macintosh until terminating the IIGS in December 1992 and the IIe in November 1993. The last II-series Apple in production, the IIe card for Macintoshes, was discontinued on October 15, 1993. The total Apple II sales of all of its models during its 16-year production run were about 6 million units, with the peak occurring in 1983 when 1 million were sold. Hardware All the machines in the series, except the //c, shared similar overall design elements. The plastic case was designed to look more like a home appliance than a piece of electronic equipment, and the machine could be opened without the use of tools, allowing access to the computer's internals. The motherboard held eight expansion slots and an array of random access memory (RAM) sockets that could hold up to 48 kilobytes. Over the course of the Apple II series' life, an enormous
https://en.wikipedia.org/wiki/Apple%20III
The Apple III (styled as apple ///) is a business-oriented personal computer produced by Apple Computer and released in 1980. Running the Apple SOS operating system, it was intended as the successor to the Apple II series, but was largely considered a failure in the market. It was designed to provide key features business users wanted in a personal computer: a true typewriter-style upper/lowercase keyboard (the Apple II only supported uppercase) and an 80-column display. Work on the Apple III started in late 1978 under the guidance of Dr. Wendell Sander. It had the internal code name of "Sara", named after Sander's daughter. The system was announced on May 19, 1980 and released in late November that year. Serious stability issues required a design overhaul and a recall of the first 14,000 machines produced. The Apple III was formally reintroduced on November 9, 1981. Damage to the computer's reputation had already been done, however, and it failed to do well commercially. Development stopped, and the Apple III was discontinued on April 24, 1984. Its last successor, the III Plus, was dropped from the Apple product line in September 1985. An estimated 65,000–75,000 Apple III computers were sold. The Apple III Plus brought this up to approximately 120,000. Apple co-founder Steve Wozniak stated that the primary reason for the Apple III's failure was that the system was designed by Apple's marketing department, unlike Apple's previous engineering-driven projects. The Apple III's failure led Apple to reevaluate its plan to phase out the Apple II, prompting the eventual continuation of development of the older machine. As a result, later Apple II models incorporated some hardware and software technologies of the Apple III. Overview Design Steve Wozniak and Steve Jobs expected hobbyists to purchase the Apple II, but because of VisiCalc and Disk II, small businesses purchased 90% of the computers. The Apple III was designed to be a business computer and successor. Though the Apple II contributed to the inspirations of several important business products, such as VisiCalc, Multiplan, and Apple Writer, the computer's hardware architecture, operating system, and developer environment are limited. Apple management intended to clearly establish market segmentation by designing the Apple III to appeal to the 90% business market, leaving the Apple II to home and education users. Management believed that "once the Apple III was out, the Apple II would stop selling in six months", Wozniak said. The Apple III is powered by a 1.8-megahertz Synertek 6502A or 6502B 8-bit CPU and, like some of the later machines in the Apple II family, uses bank switching techniques to address memory beyond the 6502's traditional 64 KB limit, up to 256 KB in the III's case. Third-party vendors produced memory upgrade kits that allow the Apple III to reach up to 512 KB of random-access memory (RAM). Other Apple III built-in features include an 80-column, 24-line display with upper
https://en.wikipedia.org/wiki/AVL%20tree
In computer science, an AVL tree (named after inventors Adelson-Velsky and Landis) is a self-balancing binary search tree. In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property. Lookup, insertion, and deletion all take time in both the average and worst cases, where is the number of nodes in the tree prior to the operation. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations. The AVL tree is named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who published it in their 1962 paper "An algorithm for the organization of information". It is the oldest self-balancing binary search tree data structure to be invented. AVL trees are often compared with red–black trees because both support the same set of operations and take time for the basic operations. For lookup-intensive applications, AVL trees are faster than red–black trees because they are more strictly balanced. Similar to red–black trees, AVL trees are height-balanced. Both are, in general, neither weight-balanced nor -balanced for any ; that is, sibling nodes can have hugely differing numbers of descendants. Definition Balance factor In a binary tree the balance factor of a node X is defined to be the height difference of its two child sub-trees rooted by node X. A binary tree is defined to be an AVL tree if the invariant holds for every node X in the tree. A node X with is called "left-heavy", one with is called "right-heavy", and one with is sometimes simply called "balanced". Properties Balance factors can be kept up-to-date by knowing the previous balance factors and the change in height – it is not necessary to know the absolute height. For holding the AVL balance information, two bits per node are sufficient. The height (counted as the maximal number of levels) of an AVL tree with nodes lies in the interval: where   is the golden ratio and This is because an AVL tree of height contains at least nodes where is the Fibonacci sequence with the seed values Operations Read-only operations of an AVL tree involve carrying out the same actions as would be carried out on an unbalanced binary search tree, but modifications have to observe and restore the height balance of the sub-trees. Searching Searching for a specific key in an AVL tree can be done the same way as that of any balanced or unbalanced binary search tree. In order for search to work effectively it has to employ a comparison function which establishes a total order (or at least a total preorder) on the set of keys. The number of comparisons required for successful search is limited by the height and for unsuccessful search is very close to , so both are in . Traversal As a read-only operation the traversal of an AVL tree functions the same way as on any other binary tree. Exploring all nodes of the tree visits each
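A compact way to see the balance-factor machinery in action is an insertion routine that caches node heights, recomputes balance factors on the way back up, and applies a single or double rotation whenever a factor leaves the range [−1, +1]. The sketch below is illustrative code in that spirit; the names and structure are choices made here, not taken from the 1962 paper or the article.

```python
# Sketch of AVL insertion: heights are cached, the balance factor
# (left height minus right height) is checked after each insertion,
# and a single or double rotation restores |balance| <= 1.

class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.height = key, None, None, 1

def height(n):     return n.height if n else 0
def balance(n):    return height(n.left) - height(n.right) if n else 0
def fix_height(n): n.height = 1 + max(height(n.left), height(n.right))

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    fix_height(y); fix_height(x)
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    fix_height(x); fix_height(y)
    return y

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    fix_height(root)
    b = balance(root)
    if b > 1:                                  # left-heavy
        if balance(root.left) < 0:             # left-right case
            root.left = rotate_left(root.left)
        return rotate_right(root)
    if b < -1:                                 # right-heavy
        if balance(root.right) > 0:            # right-left case
            root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

if __name__ == "__main__":
    root = None
    for k in range(1, 8):          # sorted input would skew an unbalanced BST
        root = insert(root, k)
    print(root.key, height(root))  # -> 4 3: the tree stays balanced
```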
https://en.wikipedia.org/wiki/Aster%20CT-80
The Aster CT-80 is a 1982 personal computer developed by the small Dutch company MCP (later renamed to Aster Computers), was sold in its first incarnation as a kit for hobbyists. Later it was sold ready to use. It consisted of several Eurocard PCB's with DIN 41612 connectors, and a backplane all based on a 19-inch rack configuration. It was the first commercially available Dutch personal/home computer. The Aster computer could use the software written for the popular Tandy TRS-80 computer while fixing many of the problems of that computer, but it could also run CP/M software, with a large amount of free memory Transient Program Area, (TPA) and a full 80×25 display, and it could be used as a Videotext terminal. Although the Aster was a clone of the TRS-80 Model I it was in fact more compatible with the TRS-80 Model III and ran all the software of these systems including games. It also had a built-in speaker which was compatible with such games software. Models Three models were sold. The first model (launched June 1982) looked like the IBM PC, a rectangular base unit with two floppy drives on the front, and a monitor on top with a separate detachable keyboard. The second incarnation was a much smaller unit the width of two 5" floppy drives stacked on top of each other, and the third incarnation looked like a flattened Apple with a built-in keyboard. All units ran much faster than the original TRS-80, at 4 MHz, (with a software selectable throttle to the original speed for compatibility purposes) and the display supported upper and lower case, hardware snow suppression (video ram bus arbitration logic), and an improved character font set. The floppy disk interface supported dual density, and disk capacities up to 800 KB, more than four times the capacity of the original TRS-80. A special version of NewDos/80, (an improved TRS-DOS compatible Disk operating system) was used to support these disk capacities when using the TRS-80 compatibility mode. For the educational market a version of the first model was produced with a new plastic enclosure (the First Asters had an all-metal enclosure) that also had an opening on the top in which a cassette recorder could be placed. This model was used in a cluster with one Aster (with disk drives) for the teacher, and eight disk less versions for the pupils. The pupils could download software from the teachers computer through a network based on a fast serial connection, as well as sending back their work to the teachers computer. There was also hardware in place through which the teacher could see the display of each pupils screen on his own monitor. Working modes The Aster used 64 KB of RAM and had the unique feature of supporting two fundamentally different internal architectures: when turned on without a boot floppy or with a TRS-DOS floppy, the Aster would be fully TRS-80 compatible, with 48 KB of RAM. When the boot loader detected a CP/M floppy, the Aster would reconfigure its internal memory architectu
https://en.wikipedia.org/wiki/Atari%20ST
The Atari ST is a line of personal computers from Atari Corporation and the successor to the Atari 8-bit family. The initial model, the Atari 520ST, had limited release in April–June 1985 and was widely available in July. It was the first personal computer with a bitmapped color GUI, using a version of Digital Research's GEM from February 1985. The Atari 1040ST, released in 1986 with 1 MB of RAM, was the first home computer with a cost-per-kilobyte of less than US$1. After Jack Tramiel purchased the assets of the Atari, Inc. consumer division to create Atari Corporation, the 520ST was designed in five months by a small team led by Shiraz Shivji. Alongside the Macintosh, Amiga, Apple IIGS, and Acorn Archimedes, the ST is part of a mid-1980s generation of computers with 16- or 32-bit processors, 256 KB or more of RAM, and mouse-controlled graphical user interfaces. "ST" officially stands for "Sixteen/Thirty-two", referring to the Motorola 68000's 16-bit external bus and 32-bit internals. The ST was sold with either Atari's color monitor or less expensive monochrome monitor. Color graphics modes are available only on the former while the highest-resolution mode requires the monochrome monitor. Some models can display the color modes on a TV. In Germany and some other markets, the ST gained a foothold for CAD and desktop publishing. With built-in MIDI ports, it was popular for music sequencing and as a controller of musical instruments among amateur and professional musicians. The primary competitor of the Atari ST was the Amiga from Commodore. The 520ST and 1040ST were followed by the Mega series, the STE, and the portable STacy. In the early 1990s, Atari released three final evolutions of the ST with significant technical differences from the original models: TT030 (1990), Mega STE (1991), and Falcon (1992). Atari discontinued the entire ST computer line in 1993, shifting the company's focus to the Jaguar video game console. Development The Atari ST was born from the rivalry between home computer makers Atari, Inc. and Commodore International. Jay Miner, one of the designers of the custom chips in the Atari 2600 and Atari 8-bit family, tried to convince Atari management to create a new chipset for a video game console and computer. When his idea was rejected, he left Atari to form a small think tank called Hi-Toro in 1982 and began designing the new "Lorraine" chipset. Amiga ran out of capital to complete Lorraine's development, and Atari, by then owned by Warner Communications, paid Amiga to continue its work. In return, Atari received exclusive use of the Lorraine design for one year as a video game console. After that time, Atari had the right to add a keyboard and market the complete computer, designated the 1850XLD. Tramel Technology After leaving Commodore International in January 1984, Jack Tramiel formed Tramel (without an "i") Technology, Ltd. with his sons and other ex-Commodore employees and, in April, began planning a new comp
https://en.wikipedia.org/wiki/List%20of%20artificial%20intelligence%20projects
The following is a list of current and past, non-classified notable artificial intelligence projects. Specialized projects Brain-inspired Blue Brain Project, an attempt to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level. Google Brain, a deep learning project part of Google X attempting to have intelligence similar or equal to human-level. Human Brain Project, ten-year scientific research project, based on exascale supercomputers. Cognitive architectures 4CAPS, developed at Carnegie Mellon University under Marcel A. Just ACT-R, developed at Carnegie Mellon University under John R. Anderson. AIXI, Universal Artificial Intelligence developed by Marcus Hutter at IDSIA and ANU. CALO, a DARPA-funded, 25-institution effort to integrate many artificial intelligence approaches (natural language processing, speech recognition, machine vision, probabilistic logic, planning, reasoning, many forms of machine learning) into an AI assistant that learns to help manage your office environment. CHREST, developed under Fernand Gobet at Brunel University and Peter C. Lane at the University of Hertfordshire. CLARION, developed under Ron Sun at Rensselaer Polytechnic Institute and University of Missouri. CoJACK, an ACT-R inspired extension to the JACK multi-agent system that adds a cognitive architecture to the agents for eliciting more realistic (human-like) behaviors in virtual environments. Copycat, by Douglas Hofstadter and Melanie Mitchell at the Indiana University. DUAL, developed at the New Bulgarian University under Boicho Kokinov. FORR developed by Susan L. Epstein at The City University of New York. IDA and LIDA, implementing Global Workspace Theory, developed under Stan Franklin at the University of Memphis. OpenCog Prime, developed using the OpenCog Framework. Procedural Reasoning System (PRS), developed by Michael Georgeff and Amy L. Lansky at SRI International. Psi-Theory developed under Dietrich Dörner at the Otto-Friedrich University in Bamberg, Germany. Soar, developed under Allen Newell and John Laird at Carnegie Mellon University and the University of Michigan. Society of Mind and its successor The Emotion Machine proposed by Marvin Minsky. Subsumption architectures, developed e.g. by Rodney Brooks (though it could be argued whether they are cognitive). Games AlphaGo, software developed by Google that plays the Chinese board game Go. Chinook, a computer program that plays English draughts; the first to win the world champion title in the competition against humans. Deep Blue, a chess-playing computer developed by IBM which beat Garry Kasparov in 1997. Halite, an artificial intelligence programming competition created by Two Sigma in 2016. Libratus, a poker AI that beat world-class poker players in 2017, intended to be generalisable to other applications. The Matchbox Educable Noughts and Crosses Engine (sometimes called the Machine Educable Noughts and Crosses Engine or M
https://en.wikipedia.org/wiki/Artistic%20License
The Artistic License is an open-source license used for certain free and open-source software packages, most notably the standard implementation of the Perl programming language and most CPAN modules, which are dual-licensed under the Artistic License and the GNU General Public License (GPL). History Artistic License 1.0 The original Artistic License was written by Larry Wall. The name of the license is a reference to the concept of artistic license. Whether or not the original Artistic License is a free software license is largely unsettled. The Free Software Foundation explicitly called the original Artistic License a non-free license, criticizing it as being "too vague; some passages are too clever for their own good, and their meaning is not clear". The FSF recommended that the license not be used on its own, but approved the common AL/GPL dual-licensing approach for Perl projects. In response to this, Bradley Kuhn, who later worked for the Free Software Foundation, made a minimal redraft to clarify the ambiguous passages. This was released as the Clarified Artistic License and was approved by the FSF. It is used by the Paros Proxy, the JavaFBP toolkit and NcFTP. The terms of the Artistic License 1.0 were at issue in Jacobsen v. Katzer in the initial 2009 ruling by the United States District Court for the Northern District of California declared that FOSS-like licenses could only be enforced through contract law rather than through copyright law, in contexts where contract damages would be difficult to establish. On appeal, a federal appellate court "determined that the terms of the Artistic License are enforceable copyright conditions". The case was remanded to the District Court, which did not apply the superior court's criteria on the grounds that, in the interim, the governing Supreme Court precedent applicable to the case had changed. However, this left undisturbed the finding that a free and open-source license nonetheless has economic value. Jacobsen ultimately prevailed in 2010, and the Case established a new standard making terms and conditions under Artistic License 1.0 enforceable through copyright statutes and relevant precedents. Artistic License 2.0 In response to the Request for Comments (RFC) process for improving the licensing position for Perl 6, Kuhn's draft was extensively rewritten by Roberta Cairney and Allison Randal for readability and legal clarity, with input from the Perl community. This resulted in the Artistic License 2.0, which has been approved as both a free software and open source license. The Artistic license 2.0 is also notable for its excellent license compatibility with other FOSS licenses due to a relicensing clause, a property other licenses like the GPL lack. It has been adopted by some of the Perl 6 implementations, the Mojolicious framework, NPM, and has been used by the Parrot virtual machine since version 0.4.13. It is also used by the SNEeSe emulator, which was formerly licensed under
https://en.wikipedia.org/wiki/Amstrad%20CPC
The Amstrad CPC (short for Colour Personal Computer) is a series of 8-bit home computers produced by Amstrad between 1984 and 1990. It was designed to compete in the mid-1980s home computer market dominated by the Commodore 64 and the ZX Spectrum, where it successfully established itself primarily in the United Kingdom, France, Spain, and the German-speaking parts of Europe. The series spawned a total of six distinct models: The CPC 464, CPC 664, and CPC 6128 were highly successful competitors in the European home computer market. The later 464plus and 6128plus, intended to prolong the system's lifecycle with hardware updates, were considerably less successful, as was the attempt to repackage the plus hardware into a game console as the GX4000. The CPC models' hardware is based on the Zilog Z80A CPU, complemented with either 64 or 128 KB of RAM. Their computer-in-a-keyboard design prominently features an integrated storage device, either a compact cassette deck or 3-inch floppy disk drive. The main units were only sold bundled with either a colour, green-screen or monochrome monitor that doubles as the main unit's power supply. Additionally, a wide range of first and third-party hardware extensions such as external disk drives, printers, and memory extensions, was available. The CPC series was pitched against other home computers primarily used to play video games and enjoyed a strong supply of game software. The comparatively low price for a complete computer system with dedicated monitor, its high-resolution monochrome text and graphic capabilities and the possibility to run CP/M software also rendered the system attractive for business users, which was reflected by a wide selection of application software. During its lifetime, the CPC series sold approximately three million units. Models The original range The philosophy behind the CPC series was twofold, firstly the concept was of an "all-in-one", where the computer, keyboard and its data storage device were combined in a single unit and sold with its own dedicated display monitor. Most home computers at that time such as ZX Spectrum series, Commodore 64, and BBC Micro relied on the use of the domestic television set and a separately connected tape recorder or disk drive. In itself, the all-in-one concept was not new, having been seen before on business-oriented machines and the Commodore PET. Secondly, Amstrad founder Alan Sugar wanted the machine to resemble a "real computer, similar to what someone would see being used to check them in at the airport for their holidays", and for the machine to not look like "a pregnant calculator" – in reference presumably to the ZX81 and ZX Spectrum with their low cost, membrane-type keyboards. CPC 464 The CPC 464 was one of the most successful computers in Europe and sold more than two million units. The CPC 464 featured 64 KB RAM and an internal cassette deck. It was introduced in June 1984 in the UK. Initial suggested retail prices for the CP
https://en.wikipedia.org/wiki/Analysis%20of%20algorithms
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best, worst and average case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst case inputs to the algorithm. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of a broader computational complexity theory, which provides theoretical estimates for the resources needed by any algorithm which solves a given computational problem. These estimates provide an insight into reasonable directions of search for efficient algorithms. In theoretical analysis of algorithms it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. Big O notation, Big-omega notation and Big-theta notation are used to this end. For instance, binary search is said to run in a number of steps proportional to the logarithm of the size of the sorted list being searched, or in O(log n), colloquially "in logarithmic time". Usually asymptotic estimates are used because different implementations of the same algorithm may differ in efficiency. However the efficiencies of any two "reasonable" implementations of a given algorithm are related by a constant multiplicative factor called a hidden constant. Exact (not asymptotic) measures of efficiency can sometimes be computed but they usually require certain assumptions concerning the particular implementation of the algorithm, called a model of computation. A model of computation may be defined in terms of an abstract computer, e.g. Turing machine, and/or by postulating that certain operations are executed in unit time. For example, if the sorted list to which we apply binary search has n elements, and we can guarantee that each lookup of an element in the list can be done in unit time, then at most log₂(n) + 1 time units are needed to return an answer. Cost models Time efficiency estimates depend on what we define to be a step. For the analysis to correspond usefully to the actual run-time, the time required to perform a step must be guaranteed to be bounded above by a constant. One must be careful here; for instance, some analyses count an addition of two numbers as one step. This assumption may not be warranted in certain contexts. For example, if the numbers involved in
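As a rough illustration of the kind of measurement discussed above, the following Python sketch counts probes made by binary search on sorted lists of growing size, showing the logarithmic growth; the function names and the probe-counting convention are choices made for this example, not part of any standard analysis.

```python
import math

def binary_search(sorted_list, target):
    """Return (index or None, number of probes, i.e. loop iterations)."""
    low, high = 0, len(sorted_list) - 1
    probes = 0
    while low <= high:
        mid = (low + high) // 2
        probes += 1
        if sorted_list[mid] == target:
            return mid, probes
        elif sorted_list[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None, probes

if __name__ == "__main__":
    for n in (10, 1_000, 100_000, 10_000_000):
        data = list(range(n))
        _, steps = binary_search(data, n - 1)   # search near the worst case
        print(f"n={n:>9}  probes={steps:>3}  log2(n)~{math.log2(n):.1f}")
```

The probe count tracks log₂(n) rather than n, which is what the asymptotic estimate O(log n) expresses.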
https://en.wikipedia.org/wiki/Apple%20II
The Apple II (stylized as ) is an 8-bit home computer and one of the world's first highly successful mass-produced microcomputer products. It was designed primarily by Steve Wozniak; Jerry Manock developed the design of Apple II's foam-molded plastic case, Rod Holt developed the switching power supply, while Steve Jobs's role in the design of the computer was limited to overseeing Jerry Manock's work on the plastic case. It was introduced by Jobs and Wozniak at the 1977 West Coast Computer Faire, and marks Apple's first launch of a personal computer aimed at a consumer market—branded toward American households rather than businessmen or computer hobbyists. Byte magazine referred to the Apple II, Commodore PET 2001, and TRS-80 as the "1977 Trinity". As the Apple II had the defining feature of being able to display color graphics, the Apple logo was redesigned to have a spectrum of colors. The Apple II is the first model in the Apple II series, followed by Apple II+, Apple IIe, Apple IIc, Apple IIc Plus, and the 16-bit Apple IIGS—all of which remained compatible. Production of the last available model, Apple IIe, ceased in November 1993. History By 1976, Steve Jobs had convinced product designer Jerry Manock (who had formerly worked at Hewlett Packard designing calculators) to create the "shell" for the Apple II—a smooth case inspired by kitchen appliances that concealed the internal mechanics. The earliest Apple II computers were assembled in Silicon Valley and later in Texas; printed circuit boards were manufactured in Ireland and Singapore. The first computers went on sale on June 10, 1977 with an MOS Technology 6502 microprocessor running at 1.023 MHz ( of the NTSC color subcarrier), two game paddles (bundled until 1980, when they were found to violate FCC regulations), 4 KiB of RAM, an audio cassette interface for loading programs and storing data, and the Integer BASIC programming language built into ROMs. The video controller displayed 24 lines by 40 columns of monochrome, uppercase-only text on the screen (the original character set matches ASCII characters 20h to 5Fh), with NTSC composite video output suitable for display on a TV monitor or on a regular TV set (by way of a separate RF modulator). The original retail price of the computer with 4 KiB of RAM was and with the maximum 48 KiB of RAM it was To reflect the computer's color graphics capability, the Apple logo on the casing has rainbow stripes, which remained a part of Apple's corporate logo until early 1998. Perhaps most significantly, the Apple II was a catalyst for personal computers across many industries; it opened the doors to software marketed at consumers. Certain aspects of the system's design were influenced by Atari's arcade video game Breakout (1976), which was designed by Wozniak, who said: "A lot of features of the Apple II went in because I had designed Breakout for Atari. I had designed it in hardware. I wanted to write it in software now". This included his d
https://en.wikipedia.org/wiki/Audio%20file%20format
An audio file format is a file format for storing digital audio data on a computer system. The bit layout of the audio data (excluding metadata) is called the audio coding format and can be uncompressed, or compressed to reduce the file size, often using lossy compression. The data can be a raw bitstream in an audio coding format, but it is usually embedded in a container format or an audio data format with defined storage layer. Format types It is important to distinguish between the audio coding format, the container containing the raw audio data, and an audio codec. A codec performs the encoding and decoding of the raw audio data while this encoded data is (usually) stored in a container file. Although most audio file formats support only one type of audio coding data (created with an audio coder), a multimedia container format (as Matroska or AVI) may support multiple types of audio and video data. There are three major groups of audio file formats: Uncompressed audio formats, such as WAV, AIFF, AU or raw header-less PCM; Formats with lossless compression, such as FLAC, Monkey's Audio (filename extension .ape), WavPack (filename extension .wv), TTA, ATRAC Advanced Lossless, ALAC (filename extension .m4a), MPEG-4 SLS, MPEG-4 ALS, MPEG-4 DST, Windows Media Audio Lossless (WMA Lossless), and Shorten (SHN). Formats with lossy compression, such as Opus, MP3, Vorbis, Musepack, AAC, ATRAC and Windows Media Audio Lossy (WMA lossy). Uncompressed audio format One major uncompressed audio format, LPCM, is the same variety of PCM as used in Compact Disc Digital Audio and is the format most commonly accepted by low level audio APIs and D/A converter hardware. Although LPCM can be stored on a computer as a raw audio format, it is usually stored in a .wav file on Windows or in a .aiff file on macOS. The Audio Interchange File Format (AIFF) format is based on the Interchange File Format (IFF), and the WAV format is based on the similar Resource Interchange File Format (RIFF). WAV and AIFF are designed to store a wide variety of audio formats, lossless and lossy; they just add a small, metadata-containing header before the audio data to declare the format of the audio data, such as LPCM with a particular sample rate, bit depth, endianness and number of channels. Since WAV and AIFF are widely supported and can store LPCM, they are suitable file formats for storing and archiving an original recording. BWF (Broadcast Wave Format) is a standard audio format created by the European Broadcasting Union as a successor to WAV. Among other enhancements, BWF allows more robust metadata to be stored in the file. See European Broadcasting Union: Specification of the Broadcast Wave Format (EBU Technical document 3285, July 1997). This is the primary recording format used in many professional audio workstations in the television and film industry. BWF files include a standardized timestamp reference which allows for easy synchronization with a separate picture elem
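As a small illustration of the distinction between container metadata and the audio payload itself, the sketch below reads the format parameters of a WAV (RIFF/LPCM) file using Python's standard-library wave module; the file name is a placeholder.

```python
import wave

# Inspect the header of an LPCM WAV file; "example.wav" is a placeholder path.
with wave.open("example.wav", "rb") as wav_file:
    channels = wav_file.getnchannels()      # e.g. 1 = mono, 2 = stereo
    sample_width = wav_file.getsampwidth()  # bytes per sample, e.g. 2 for 16-bit
    frame_rate = wav_file.getframerate()    # samples per second, e.g. 44100
    n_frames = wav_file.getnframes()        # total number of sample frames

    duration_s = n_frames / frame_rate
    raw_bytes = n_frames * channels * sample_width  # size of the raw PCM payload

    print(f"{channels} channel(s), {8 * sample_width}-bit, {frame_rate} Hz")
    print(f"duration: {duration_s:.2f} s, PCM payload: {raw_bytes} bytes")
```

Everything printed here comes from the small header the container adds in front of the uncompressed audio data.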
https://en.wikipedia.org/wiki/Amdahl%27s%20law
In computer architecture, Amdahl's law (or Amdahl's argument) is a formula which gives the theoretical speedup in latency of the execution of a task at fixed workload that can be expected of a system whose resources are improved. It states that "the overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used". It is named after computer scientist Gene Amdahl, and was presented at the American Federation of Information Processing Societies (AFIPS) Spring Joint Computer Conference in 1967. Amdahl's law is often used in parallel computing to predict the theoretical speedup when using multiple processors. For example, if a program needs 20 hours to complete using a single thread, but a one-hour portion of the program cannot be parallelized, therefore only the remaining 19 hours' (p = 0.95) execution time can be parallelized, then regardless of how many threads are devoted to a parallelized execution of this program, the minimum execution time is always more than 1 hour. Hence, the theoretical speedup is less than 20 times the single thread performance, since 1/(1 − p) = 20. Definition Amdahl's law can be formulated in the following way: Slatency(s) = 1 / ((1 − p) + p/s), where Slatency is the theoretical speedup of the execution of the whole task; s is the speedup of the part of the task that benefits from improved system resources; p is the proportion of execution time that the part benefiting from improved resources originally occupied. Furthermore, the formula shows that the theoretical speedup of the execution of the whole task increases with the improvement of the resources of the system and that regardless of the magnitude of the improvement, the theoretical speedup is always limited by the part of the task that cannot benefit from the improvement. Amdahl's law applies only to the cases where the problem size is fixed. In practice, as more computing resources become available, they tend to get used on larger problems (larger datasets), and the time spent in the parallelizable part often grows much faster than the inherently serial work. In this case, Gustafson's law gives a less pessimistic and more realistic assessment of the parallel performance. Derivation A task executed by a system whose resources are improved compared to an initial similar system can be split up into two parts: a part that does not benefit from the improvement of the resources of the system; a part that benefits from the improvement of the resources of the system. An example is a computer program that processes files. A part of that program may scan the directory of the disk and create a list of files internally in memory. After that, another part of the program passes each file to a separate thread for processing. The part that scans the directory and creates the file list cannot be sped up on a parallel computer, but the part that processes the files can. The execution time of the whole task before the improvement of the resourc
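The 20-hour example can be checked directly; the short sketch below evaluates the formula for a growing improvement factor (treating the number of processors as the speedup s of the parallel part), showing the overall speedup approaching but never reaching 20. The function name is chosen for this example.

```python
def amdahl_speedup(p, s):
    """Theoretical overall speedup for parallel fraction p and improvement factor s."""
    return 1.0 / ((1.0 - p) + p / s)

p = 0.95  # 19 of the 20 hours are parallelizable
for s in (1, 2, 8, 64, 1024, 1_000_000):
    print(f"s = {s:>9}: overall speedup = {amdahl_speedup(p, s):6.2f}x")
# As s grows without bound the speedup tends to 1 / (1 - p) = 20.
```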
https://en.wikipedia.org/wiki/Abstract%20data%20type
In computer science, an abstract data type (ADT) is a mathematical model for data types, defined by its behavior (semantics) from the point of view of a user of the data, specifically in terms of possible values, possible operations on data of this type, and the behavior of these operations. This mathematical model contrasts with data structures, which are concrete representations of data, and are the point of view of an implementer, not a user. Formally, an ADT may be defined as a "class of objects whose logical behavior is defined by a set of values and a set of operations"; this is analogous to an algebraic structure in mathematics. What is meant by "behaviour" varies by author, with the two main types of formal specifications for behavior being axiomatic (algebraic) specification and an abstract model; these correspond to axiomatic semantics and operational semantics of an abstract machine, respectively. Some authors also include the computational complexity ("cost"), both in terms of time (for computing operations) and space (for representing values). In practice, many common data types are not ADTs, as the abstraction is not perfect, and users must be aware of issues like arithmetic overflow that are due to the representation. For example, integers are often stored as fixed-width values (32-bit or 64-bit binary numbers), and thus experience integer overflow if the maximum value is exceeded. ADTs are a theoretical concept, in computer science, used in the design and analysis of algorithms, data structures, and software systems, and do not correspond to specific features of computer languages—mainstream computer languages do not directly support formally specified ADTs. However, various language features correspond to certain aspects of ADTs, and are easily confused with ADTs proper; these include abstract types, opaque data types, protocols, and design by contract. ADTs were first proposed by Barbara Liskov and Stephen N. Zilles in 1974, as part of the development of the CLU language. Discussion For example, integers are an ADT, defined as the values ..., −2, −1, 0, 1, 2, ..., and by the operations of addition, subtraction, multiplication, and division, together with greater than, less than, etc., which behave according to familiar mathematics (with care for integer division), independently of how the integers are represented by the computer. Explicitly, "behavior" includes obeying various axioms (associativity and commutativity of addition, etc.), and preconditions on operations (cannot divide by zero). Typically integers are represented in a data structure as binary numbers, most often as two's complement, but might be binary-coded decimal or in ones' complement, but for most purposes, the user can work with the abstraction rather than the concrete choice of representation, and can simply use the data as if the type were truly abstract. An ADT consists not only of operations but also of a domain of values and of constraints on the def
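A minimal sketch of the user/implementer split described above: the abstract interface below fixes the observable behaviour of a stack, while the concrete class is just one possible representation (a Python list); the class and method names are chosen for this example.

```python
from abc import ABC, abstractmethod

class Stack(ABC):
    """Abstract data type: behaviour only, no representation."""

    @abstractmethod
    def push(self, item) -> None: ...

    @abstractmethod
    def pop(self):
        """Remove and return the most recently pushed item."""

    @abstractmethod
    def is_empty(self) -> bool: ...

class ListStack(Stack):
    """One concrete data structure realizing the Stack ADT."""

    def __init__(self):
        self._items = []          # representation hidden from users of the ADT

    def push(self, item) -> None:
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def is_empty(self) -> bool:
        return not self._items

s = ListStack()
s.push(1); s.push(2)
assert s.pop() == 2 and s.pop() == 1 and s.is_empty()
```

Code written against the Stack interface keeps working if the list is later replaced by, say, a linked representation, which is exactly the abstraction the ADT provides.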
https://en.wikipedia.org/wiki/Accelerated%20Graphics%20Port
Accelerated Graphics Port (AGP) is a parallel expansion card standard, designed for attaching a video card to a computer system to assist in the acceleration of 3D computer graphics. It was originally designed as a successor to PCI-type connections for video cards. Since 2004, AGP was progressively phased out in favor of PCI Express (PCIe), which is serial, as opposed to parallel; by mid-2008, PCI Express cards dominated the market and only a few AGP models were available, with GPU manufacturers and add-in board partners eventually dropping support for the interface in favor of PCI Express. Advantages over PCI AGP is a superset of the PCI standard, designed to overcome PCI's limitations in serving the requirements of the era's high-performance graphics cards. The primary advantage of AGP is that it doesn't share the PCI bus, providing a dedicated, point-to-point pathway between the expansion slot(s) and the motherboard chipset. The direct connection also allows for higher clock speeds. The second major change is the use of split transactions, wherein the address and data phases are separated. The card may send many address phases so the host can process them in order, avoiding any long delays caused by the bus being idle during read operations. Third, PCI bus handshaking is simplified. Unlike PCI bus transactions whose length is negotiated on a cycle-by-cycle basis using the FRAME# and STOP# signals, AGP transfers are always a multiple of 8 bytes long, with the total length included in the request. Further, rather than using the IRDY# and TRDY# signals for each word, data is transferred in blocks of four clock cycles (32 words at AGP 8× speed), and pauses are allowed only between blocks. Finally, AGP allows (mandatory only in AGP 3.0) sideband addressing, meaning that the address and data buses are separated so the address phase does not use the main address/data (AD) lines at all. This is done by adding an extra 8-bit "SideBand Address" bus over which the graphics controller can issue new AGP requests while other AGP data is flowing over the main 32 address/data (AD) lines. This results in improved overall AGP data throughput. This great improvement in memory read performance makes it practical for an AGP card to read textures directly from system RAM, while a PCI graphics card must copy it from system RAM to the card's video memory. System memory is made available using the graphics address remapping table (GART), which apportions main memory as needed for texture storage. The maximum amount of system memory available to AGP is defined as the AGP aperture. History The AGP slot first appeared on x86-compatible system boards based on Socket 7 Intel P5 Pentium and Slot 1 P6 Pentium II processors. Intel introduced AGP support with the i440LX Slot 1 chipset on August 26, 1997, and a flood of products followed from all the major system board vendors. The first Socket 7 chipsets to support AGP were the VIA Apollo VP3, SiS 5591/5592, and
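As a worked example of what the signalling modes imply, the lines below compute approximate peak transfer rates, assuming the nominal 66 MHz AGP base clock and the 32-bit (4-byte) AD data path; real-world throughput is lower, and the constants here are stated assumptions rather than measured figures.

```python
BASE_CLOCK_HZ = 66_666_666   # nominal 66 MHz AGP clock (assumption for this sketch)
BUS_WIDTH_BYTES = 4          # 32-bit address/data bus

for multiplier in (1, 2, 4, 8):               # AGP 1x, 2x, 4x, 8x signalling
    transfers_per_second = BASE_CLOCK_HZ * multiplier
    peak_mb_s = transfers_per_second * BUS_WIDTH_BYTES / 1_000_000
    print(f"AGP {multiplier}x: ~{peak_mb_s:.0f} MB/s peak")
# Prints roughly 267, 533, 1067 and 2133 MB/s.
```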
https://en.wikipedia.org/wiki/AMD
Advanced Micro Devices, Inc., commonly abbreviated as AMD, is an American multinational semiconductor company based in Santa Clara, California, that develops computer processors and related technologies for business and consumer markets. The company was founded in 1969 by Jerry Sanders and a group of other technology professionals. AMD's early products were primarily memory chips and other components for computers. The company later expanded into the microprocessor market, competing with Intel, its main rival in the industry. In the early 2000s, AMD experienced significant growth and success, thanks in part to its strong position in the PC market and the success of its Athlon and Opteron processors. However, the company faced challenges in the late 2000s and early 2010s, as it struggled to keep up with Intel in the race to produce faster and more powerful processors. In the late 2010s, AMD regained some of its market share thanks to the success of its Ryzen processors which are now widely regarded as superior to Intel products in business applications including cloud applications. AMD's processors are used in a wide range of computing devices, including personal computers, servers, laptops, and gaming consoles. While it initially manufactured its own processors, the company later outsourced its manufacturing, a practice known as going fabless, after GlobalFoundries was spun off in 2009. AMD's main products include microprocessors, motherboard chipsets, embedded processors, graphics processors, and FPGAs for servers, workstations, personal computers, and embedded system applications. The company has also expanded into new markets, such as the data center and gaming markets, and has announced plans to enter the high-performance computing market. History First twelve years Advanced Micro Devices was formally incorporated by Jerry Sanders, along with seven of his colleagues from Fairchild Semiconductor, on May 1, 1969. Sanders, an electrical engineer who was the director of marketing at Fairchild, had, like many Fairchild executives, grown frustrated with the increasing lack of support, opportunity, and flexibility within the company. He later decided to leave to start his own semiconductor company, following the footsteps of Robert Noyce (developer of the first silicon integrated circuit at Fairchild in 1959) and Gordon Moore, who together founded the semiconductor company Intel in July 1968. In September 1969, AMD moved from its temporary location in Santa Clara to Sunnyvale, California. To immediately secure a customer base, AMD initially became a second source supplier of microchips designed by Fairchild and National Semiconductor. AMD first focused on producing logic chips. The company guaranteed quality control to United States Military Standard, an advantage in the early computer industry since unreliability in microchips was a distinct problem that customers – including computer manufacturers, the telecommunications industry, and instru
https://en.wikipedia.org/wiki/Aon%20%28company%29
Aon PLC () is a British-American professional services and management consulting firm that offers a range of risk-mitigation products. The firm also provides data and analytics services, strategy consulting through Aon Inpoint and investment banking advisory through Aon Securities. Aon has approximately 50,000 employees across 120 countries. Founded in Chicago by Patrick Ryan, Aon was created in 1982 when the Ryan Insurance Group merged with the Combined Insurance Company of America. In 1987, that company was renamed Aon from aon, a Gaelic word meaning "one". The company is globally headquartered in London with its North America operations based in Chicago at the Aon Center. Aon is listed on the New York Stock Exchange under AON with a market cap of $65 billion in April 2023. History W. Clement Stone's mother bought a small Detroit insurance agency, and in 1918 brought her son into the business. Mr. Stone sold low-cost, low-benefit accident insurance, underwriting and issuing policies on-site. The next year he founded his own agency, the Combined Registry Co. As the Great Depression began, Stone reduced his workforce and improved training. Forced by his son's respiratory illness to winter in the South, Stone moved to Arkansas and Texas. In 1939 he bought American Casualty Insurance Co. of Dallas, Texas. It was consolidated with other purchases as the Combined Insurance Co. of America in 1947. The company continued through the 1950s and 1960s, continuing to sell health and accident policies. In the 1970s, Combined expanded overseas despite being hit hard by the recession. In 1982, after 10 years of stagnation under Clement Stone Jr., the elder Stone, then 79, resumed control until the completion of a merger with Ryan Insurance Co. allowed him to transfer control to Patrick Ryan. Ryan, the son of a Ford dealer in Wisconsin and a graduate of Northwestern University, had started his company as an auto credit insurer in 1964. In 1976, the company bought the insurance brokerage units of the Esmark conglomerate. Ryan focused on insurance brokering and added more upscale insurance products. He also trimmed staff and took other cost-cutting measures, and in 1987 he changed Combined's name to Aon. In 1992, he bought Dutch insurance broker Hudig-Langeveldt. In 1995, the company sold its remaining direct life insurance holdings to General Electric to focus on consulting. Aon built a global presence through purchases. In 1997, it bought The Minet Group, as well as insurance brokerage Alexander & Alexander Services, Inc. in a deal that made Aon (temporarily) the largest insurance broker worldwide. The firm made no US buys in 1998, but doubled its employee base with purchases including Spain's largest retail insurance broker, Gil y Carvajal, and the formation of Aon Korea. Responding to industry demands, Aon announced its new fee disclosure policy in 1999, and the company reorganised to focus on buying personal line insurance firms and to integrate its ac
https://en.wikipedia.org/wiki/Analog%20computer
An analog computer or analogue computer is a type of computer that uses the continuous variation aspect of physical phenomena such as electrical, mechanical, or hydraulic quantities (analog signals) to model the problem being solved. In contrast, digital computers represent varying quantities symbolically and by discrete values of both time and amplitude (digital signals). Analog computers can have a very wide range of complexity. Slide rules and nomograms are the simplest, while naval gunfire control computers and large hybrid digital/analog computers were among the most complicated. Complex mechanisms for process control and protective relays used analog computation to perform control and protective functions. Analog computers were widely used in scientific and industrial applications even after the advent of digital computers, because at the time they were typically much faster, but they started to become obsolete as early as the 1950s and 1960s, although they remained in use in some specific applications, such as aircraft flight simulators, the flight computer in aircraft, and for teaching control systems in universities. Perhaps the most relatable examples of analog computers are mechanical watches, where the continuous and periodic rotation of interlinked gears drives the second, minute and hour hands of the clock. More complex applications, such as aircraft flight simulators and synthetic-aperture radar, remained the domain of analog computing (and hybrid computing) well into the 1980s, since digital computers were insufficient for the task. Timeline of analog computers Precursors This is a list of examples of early computation devices considered precursors of the modern computers. Some of them may even have been dubbed 'computers' by the press, though they may fail to fit modern definitions. The Antikythera mechanism, a type of device used to determine the positions of heavenly bodies known as an orrery, was described as an early mechanical analog computer by British physicist, information scientist, and historian of science Derek J. de Solla Price. It was discovered in 1901, in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to , during the Hellenistic period. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was first described by Ptolemy in the 2nd century AD. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, P
https://en.wikipedia.org/wiki/ATM
ATM or atm often refers to: Atmosphere (unit) or atm, a unit of atmospheric pressure Automated teller machine, a cash dispenser or cash machine ATM or atm may also refer to: Computing ATM (computer), a ZX Spectrum clone developed in Moscow in 1991 Adobe Type Manager, a computer program for managing fonts Accelerated Turing machine, or Zeno machine, a model of computation used in theoretical computer science Alternating Turing machine, a model of computation used in theoretical computer science Asynchronous Transfer Mode, a telecommunications protocol used in networking ATM adaptation layer ATM Adaptation Layer 5 Media Amateur Telescope Making, a series of books by Albert Graham Ingalls ATM (2012 film), an American film ATM: Er Rak Error, a 2012 Thai film Azhagiya Tamil Magan, a 2007 Indian film "ATM" (song), a 2018 song by J. Cole from KOD People and organizations Abiding Truth Ministries, anti-LGBT organization in Springfield, Massachusetts, US Association of Teachers of Mathematics, UK Acrylic Tank Manufacturing, US aquarium manufacturer, televised in Tanked ATM FA, a football club in Malaysia A. T. M. Wilson (1906–1978), British psychiatrist African Transformation Movement, South African political party founded in 2018 The a2 Milk Company (NZX ticker symbol ATM) Science Apollo Telescope Mount, a solar observatory ATM serine/threonine kinase, a serine/threonine kinase activated by DNA damage The Airborne Topographic Mapper, a laser altimeter among the instruments used by NASA's Operation IceBridge Transportation Active traffic management, a motorway scheme on the M42 in England Air traffic management, a concept in aviation Altamira Airport, in Brazil (IATA code ATM) Azienda Trasporti Milanesi, the municipal public transport company of Milan Airlines of Tasmania (ICAO code ATM) Catalonia, Spain Autoritat del Transport Metropolità (ATM Àrea de Barcelona), in the Barcelona metropolitan area Autoritat Territorial de la Mobilitat del Camp de Tarragona (ATM Camp de Tarragona), in the Camp de Tarragona area Autoritat Territorial de la Mobilitat de l'Àrea de Girona (ATM Àrea de Girona), in the Girona area Autoritat Territorial de la Mobilitat de l'Àrea de Lleida (ATM Àrea de Lleida), in the Lleida area Other uses Actun Tunichil Muknal, a cave in Belize Anti-tank missile, a missile designed to destroy tanks Ass to mouth, a sexual act At the money, moneyness where the strike price is the same as the current spot price At-the-market offering, a type of follow-on offering of stock Automatenmarken, a variable value stamp Contracted form of Atlético Madrid, football club in Spain Common abbreviation in SMS language for "at the moment"
https://en.wikipedia.org/wiki/Asynchronous%20communication
In telecommunications, asynchronous communication is transmission of data, generally without the use of an external clock signal, where data can be transmitted intermittently rather than in a steady stream. Any timing required to recover data from the communication symbols is encoded within the symbols. The most significant aspect of asynchronous communications is that data is not transmitted at regular intervals, thus making possible variable bit rate, and that the transmitter and receiver clock generators do not have to be exactly synchronized all the time. In asynchronous transmission, data is sent one byte at a time and each byte is preceded by start and stop bits. Physical layer In asynchronous serial communication in the physical protocol layer, the data blocks are code words of a certain word length, for example octets (bytes) or ASCII characters, delimited by start bits and stop bits. A variable length space can be inserted between the code words. No bit synchronization signal is required. This is sometimes called character oriented communication. Examples include MNP2 and modems older than V.2. Data link layer and higher Asynchronous communication at the data link layer or higher protocol layers is known as statistical multiplexing, for example Asynchronous Transfer Mode (ATM). In this case, the asynchronously transferred blocks are called data packets, for example ATM cells. The opposite is circuit switched communication, which provides constant bit rate, for example ISDN and SONET/SDH. The packets may be encapsulated in a data frame, with a frame synchronization bit sequence indicating the start of the frame, and sometimes also a bit synchronization bit sequence, typically 01010101, for identification of the bit transition times. Note that at the physical layer, this is considered as synchronous serial communication. Examples of packet mode data link protocols that can be/are transferred using synchronous serial communication are the HDLC, Ethernet, PPP and USB protocols. Application layer An asynchronous communication service or application does not require a constant bit rate. Examples are file transfer, email and the World Wide Web. An example of the opposite, a synchronous communication service, is realtime streaming media, for example IP telephony, IPTV and video conferencing. Electronically mediated communication Electronically mediated communication often happens asynchronously in that the participants do not communicate concurrently. Examples include email and bulletin-board systems, where participants send or post messages at different times than they read them. The term "asynchronous communication" acquired currency in the field of online learning, where teachers and students often exchange information asynchronously instead of synchronously (that is, simultaneously), as they would in face-to-face or in telephone conversations. See also Synchronization in telecommunications Asynchronous serial communication A
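A minimal sketch of the character-oriented framing described above: each data byte is wrapped in a start bit and a stop bit (no parity), as an asynchronous serial transmitter might do. The helper names, LSB-first bit order, and single idle bit between characters are assumptions made for this example, not a particular standard.

```python
def frame_byte(byte: int) -> list[int]:
    """Wrap one data byte in a start bit (0) and a stop bit (1), LSB first."""
    data_bits = [(byte >> i) & 1 for i in range(8)]   # least significant bit first
    return [0] + data_bits + [1]                      # start bit, 8 data bits, stop bit

def frame_message(data: bytes) -> list[int]:
    """Frame each byte independently; the idle line between characters is high (1)."""
    line = []
    for b in data:
        line.extend(frame_byte(b))
        line.append(1)   # arbitrary idle period between characters
    return line

bits = frame_message(b"Hi")
print(bits)  # 10 framing+data bits per character, plus the idle bits in between
```

Because each character carries its own start and stop bits, the gap between characters can vary freely, which is the defining property of asynchronous transmission.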
https://en.wikipedia.org/wiki/Automated%20theorem%20proving
Automated theorem proving (also known as ATP or automated deduction) is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs. Automated reasoning over mathematical proof was a major impetus for the development of computer science. Logical foundations While the roots of formalised logic go back to Aristotle, the end of the 19th and early 20th centuries saw the development of modern logic and formalised mathematics. Frege's Begriffsschrift (1879) introduced both a complete propositional calculus and what is essentially modern predicate logic. His Foundations of Arithmetic, published in 1884, expressed (parts of) mathematics in formal logic. This approach was continued by Russell and Whitehead in their influential Principia Mathematica, first published 1910–1913, and with a revised second edition in 1927. Russell and Whitehead thought they could derive all mathematical truth using axioms and inference rules of formal logic, in principle opening up the process to automatisation. In 1920, Thoralf Skolem simplified a previous result by Leopold Löwenheim, leading to the Löwenheim–Skolem theorem and, in 1930, to the notion of a Herbrand universe and a Herbrand interpretation that allowed (un)satisfiability of first-order formulas (and hence the validity of a theorem) to be reduced to (potentially infinitely many) propositional satisfiability problems. In 1929, Mojżesz Presburger showed that the first-order theory of the natural numbers with addition and equality (now called Presburger arithmetic in his honor) is decidable and gave an algorithm that could determine if a given sentence in the language was true or false. However, shortly after this positive result, Kurt Gödel published On Formally Undecidable Propositions of Principia Mathematica and Related Systems (1931), showing that in any sufficiently strong axiomatic system there are true statements that cannot be proved in the system. This topic was further developed in the 1930s by Alonzo Church and Alan Turing, who on the one hand gave two independent but equivalent definitions of computability, and on the other gave concrete examples of undecidable questions. First implementations Shortly after World War II, the first general-purpose computers became available. In 1954, Martin Davis programmed Presburger's algorithm for a JOHNNIAC vacuum-tube computer at the Institute for Advanced Study in Princeton, New Jersey. According to Davis, "Its great triumph was to prove that the sum of two even numbers is even". More ambitious was the Logic Theorist in 1956, a deduction system for the propositional logic of the Principia Mathematica, developed by Allen Newell, Herbert A. Simon and J. C. Shaw. Also running on a JOHNNIAC, the Logic Theorist constructed proofs from a small set of propositional axioms and three deduction rules: modus ponens, (propositional) variable substitution, and the replacement of formulas by their definition. T
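As a toy illustration of reducing validity to a propositional check (far simpler than the systems described above), the sketch below verifies a propositional formula by exhaustive truth-table enumeration; the formula and helper names are chosen for this example.

```python
from itertools import product

def is_valid(formula, variables):
    """True if `formula` evaluates to True under every assignment to `variables`."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if not formula(**assignment):
            return False
    return True

# ((p -> q) and p) -> q, i.e. modus ponens as a formula; "A implies B" is "not A or B".
modus_ponens = lambda p, q: (not ((not p or q) and p)) or q
print(is_valid(modus_ponens, ["p", "q"]))   # True: the formula is a tautology
```

Real provers avoid this exponential enumeration, but the example shows the basic object a propositional decision procedure works with.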
https://en.wikipedia.org/wiki/Au
Au, AU, au or a.u. may refer to: Science and technology Computing .au, the internet country code for Australia Au file format, Sun Microsystems' audio format Audio Units, a system level plug-in architecture from Apple Computer Adobe Audition, a sound editor program Windows Update or Automatic Updates, in Microsoft Windows Windows 10 Anniversary Update, of August 2016 Physics and chemistry Gold, symbol Au (from Latin ), a chemical element Absorbance unit, a reporting unit in spectroscopy Atomic units, a system of units convenient for atomic physics and other fields Ångström unit, a unit of length equal to 10⁻¹⁰ m or 0.1 nanometre. Astronomical unit, a unit of length often used in planetary systems astronomy, an approximation for the average distance between the Earth and the Sun Arbitrary unit, a relative placeholder unit for when the actual value of a measurement is unknown or unimportant ("a.u." is deprecated, use "arb. unit" instead) Arts and entertainment Music AU (band), an experimental pop group headed by Luke Wyland Au, a 2010 release by Scottish rock band Donaldson, Moir and Paterson Au, a track on Some Time in New York City, an album by John Lennon & Yoko Ono and Elephant's Memory Magazines Alternative Ulster, a Northern Irish music magazine, now called AU A&U: America's AIDS Magazine, sponsor of the Christopher Hewitt Award Literature Alternative universe (fan fiction), fiction by fan authors that deliberately alters facts of the canonical universe written about. Other media Au Co, a fairy in Vietnamese mythology Age of Ultron, a 2013 series published by Marvel Comics A.U, a Chinese media franchise and brand Organizations au (mobile phone company), a mobile phone operator in Japan African Union, a continental union Americans United for Separation of Church and State Athletic Union, the union of sports clubs in a British university Austral Líneas Aéreas (IATA code AU) Auxiliary Units, specially trained, highly secret units created by the United Kingdom government during the Second World War AGROunia, an agrarian-socialist political party in Poland Universities Asia Ajou University in Suwon, Gyeonggi, South Korea Abasyn University in Peshawar, Khyber Pakhtunkhwa, Pakistan Andhra University in Visakhapatnam, AP, India Anhui University in Hefei, Anhui, China Aletheia University in New Taipei City, Taiwan Allahabad University in Allahabad, Uttar Pradesh, India Arellano University in Philippines Assumption University (Thailand) in Thailand Abhilashi University in Himachal Pradesh, India Adesh University in Bathinda, Punjab, India. Europe Aarhus University in Aarhus, Denmark Aberystwyth University in Aberystwyth, Wales, United Kingdom Akademia Umiejętności in Kraków, Poland Arden University in Coventry, England Oceania Auckland University in New Zealand North America Adelphi University in Garden City, New York Alfred University in Alfred, New York Algoma University in Sault Ste.
https://en.wikipedia.org/wiki/Atlas%20Autocode
Atlas Autocode (AA) is a programming language developed around 1963 at the University of Manchester. A variant of the language ALGOL, it was developed by Tony Brooker and Derrick Morris for the Atlas computer. The initial AA and AB compilers were written by Jeff Rohl and Tony Brooker using the Brooker-Morris Compiler-compiler, with a later hand-coded non-CC implementation (ABC) by Jeff Rohl. The word Autocode was basically an early term for programming language. Different autocodes could vary greatly. Features AA was a block structured language that featured explicitly typed variables, subroutines, and functions. It omitted some ALGOL features such as passing parameters by name, which in ALGOL 60 means passing the memory address of a short subroutine (a thunk) to recalculate a parameter each time it is mentioned. The AA compiler could generate range-checking for array accesses, and allowed an array to have dimensions that were determined at runtime, i.e., an array could be declared as integer array Thing (i:j), where i and j were calculated values. AA high-level routines could include machine code, either to make an inner loop more efficient or to effect some operation which otherwise cannot be done easily. AA included a complex data type to represent complex numbers, partly because of pressure from the electrical engineering department, as complex numbers are used to represent the behavior of alternating current. The imaginary unit square root of -1 was represented by i, which was treated as a fixed complex constant = i. The complex data type was dropped when Atlas Autocode later evolved into the language Edinburgh IMP. IMP was an extension of AA and was used to write the Edinburgh Multiple Access System (EMAS) operating system. AA's second-greatest claim to fame (after being the progenitor of IMP and EMAS) was that it had many of the features of the original Compiler Compiler. A variant of the AA compiler included run-time support for a top-down recursive descent parser. The style of parser used in the Compiler Compiler was in use continuously at Edinburgh from the 60's until almost the year 2000. Other Autocodes were developed for the Titan computer, a prototype Atlas 2 at Cambridge, and the Ferranti Mercury. Syntax Atlas Autocode's syntax was largely similar to ALGOL, though it was influenced by the output device which the author had available, a Friden Flexowriter. Thus, it allowed symbols like ½ for .5 and the superscript 2 for to the power of 2. The Flexowriter supported overstriking and thus, AA did also: up to three characters could be overstruck as a single symbol. For example, the character set had no ↑ symbol, so exponentiation was an overstrike of | and *. The aforementioned underlining of reserved words (keywords) could also be done using overstriking. The language is described in detail in the Atlas Autocode Reference Manual. Other Flexowriter characters that were found a use in AA were: α in floating-point numbers, e.g
https://en.wikipedia.org/wiki/AutoCAD
AutoCAD is a 2D and 3D computer-aided design (CAD) software application for desktop, web, and mobile developed by Autodesk. It was first released in December 1982 for the CP/M and IBM PC platforms as a desktop app running on microcomputers with internal graphics controllers. Initially a DOS application, subsequent versions were later released for other platforms including Classic Mac OS (1992), Microsoft Windows (1992), web browsers (2010), iOS (2010), macOS (2010), and Android (2011). AutoCAD is a general drafting and design application used in industry by architects, project managers, engineers, graphic designers, city planners and other professionals to prepare technical drawings. After discontinuing the sale of perpetual licenses in January 2016, commercial versions of AutoCAD are licensed through a term-based subscription. History Before AutoCAD was introduced, most commercial CAD programs ran on mainframe computers or minicomputers, with each CAD operator (user) working at a separate graphics terminal. Origins AutoCAD was derived from a program that began in 1977, and then released in 1979 called Interact CAD, also referred to in early Autodesk documents as MicroCAD, which was written prior to Autodesk's (then Marinchip Software Partners) formation by Autodesk cofounder Michael Riddle. The first version by Autodesk was demonstrated at the 1982 Comdex and released that December. AutoCAD supported CP/M-80 computers. As Autodesk's flagship product, by March 1986 AutoCAD had become the most ubiquitous CAD program worldwide. The 2022 release marked the 36th major release of AutoCAD for Windows and the 12th consecutive year of AutoCAD for Mac. The native file format of AutoCAD is .dwg. This and, to a lesser extent, its interchange file format DXF, have become de facto, if proprietary, standards for CAD data interoperability, particularly for 2D drawing exchange. AutoCAD has included support for .dwf, a format developed and promoted by Autodesk, for publishing CAD data. Features Compatibility with other software ESRI ArcMap 10 permits export as AutoCAD drawing files. Civil 3D permits export as AutoCAD objects and as LandXML. Third-party file converters exist for specific formats such as Bentley MX GENIO Extension, PISTE Extension (France), ISYBAU (Germany), OKSTRA and Microdrainage (UK); also, conversion of .pdf files is feasible, however, the accuracy of the results may be unpredictable or distorted. For example, jagged edges may appear. Several vendors provide online conversions for free such as Cometdocs. Language AutoCAD and AutoCAD LT are available for English, German, French, Italian, Spanish, Japanese, Korean, Chinese Simplified, Chinese Traditional, Brazilian Portuguese, Russian, Czech, Polish and Hungarian (also through additional language packs). The extent of localization varies from full translation of the product to documentation only. The AutoCAD command set is localized as a part of the software localization. Extensions
https://en.wikipedia.org/wiki/AutoCAD%20DXF
AutoCAD DXF (Drawing Interchange Format, or Drawing Exchange Format) is a CAD data file format developed by Autodesk for enabling data interoperability between AutoCAD and other programs. DXF was introduced in December 1982 as part of AutoCAD 1.0, and was intended to provide an exact representation of the data in the AutoCAD native file format, DWG (Drawing). For many years, Autodesk did not publish specifications, making correct creation of DXF files difficult. Autodesk now publishes the incomplete DXF specifications online. Versions of AutoCAD from Release 10 (October 1988) and up support both ASCII and binary forms of DXF. Earlier versions support only ASCII. As AutoCAD has become more powerful, supporting more complex object types, DXF has become less useful. Certain object types, including ACIS solids and regions, are not documented. Other object types, including AutoCAD 2006's dynamic blocks, and all of the objects specific to the vertical market versions of AutoCAD, are partially documented, but not well enough to allow other developers to support them. For these reasons many CAD applications use the DWG format which can be licensed from Autodesk or non-natively from the Open Design Alliance. DXF files do not specify the units of measurement used for its coordinates and dimensions. Most CAD systems and many vector graphics packages support the import and export of DXF files, notably Adobe products, Inkscape & Blender. Some CAD systems use DXF as their native format, notably QCAD and LibreCAD. File structure ASCII versions of DXF can be read with any text editor. The basic organization of a DXF file is as follows: section General information about the drawing. Each parameter has a variable name and an associated value. section Holds the information for application-defined classes whose instances appear in the , , and sections of the database. Generally does not provide sufficient information to allow interoperability with other programs. section This section contains definitions of named items. Application ID () table Block Record () table Dimension Style () table Layer () table Linetype () table Text style () table User Coordinate System () table View () table Viewport configuration () table section This section contains Block Definition entities describing the entities comprising each Block in the drawing. section This section contains the drawing entities, including any Block References. section Contains the data that apply to nongraphical objects, used by AutoLISP, and ObjectARX applications. section Contains the preview image for the DXF file. The data format of a DXF is called a "tagged data" format, which "means that each data element in the file is preceded by an integer number that is called a group code. A group code's value indicates what type of data element follows. This value also indicates the meaning of a data element for a given object (or record) type. Virtually all user-specif
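The tagged-data layout described above (alternating group-code and value lines in the ASCII form) can be read with a few lines of code; the sketch below pairs each integer group code with the value line that follows it and stops at the end-of-file marker. The file name and function name are placeholders for this example.

```python
def read_dxf_tags(path):
    """Yield (group_code, value) pairs from an ASCII DXF file."""
    with open(path, "r", encoding="ascii", errors="replace") as f:
        while True:
            code_line = f.readline()
            value_line = f.readline()
            if not code_line or not value_line:
                break                      # truncated or empty file
            code = int(code_line.strip())  # group code: says how to interpret the value
            value = value_line.strip()
            yield code, value
            if code == 0 and value == "EOF":
                break                      # standard DXF end-of-file marker

# Example: list the first few group-code-0 values (section and entity markers).
# "drawing.dxf" is a placeholder path.
if __name__ == "__main__":
    zero_tags = [v for c, v in read_dxf_tags("drawing.dxf") if c == 0]
    print(zero_tags[:10])
```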
https://en.wikipedia.org/wiki/Parallel%20ATA
Parallel ATA (PATA), originally , also known as IDE, is a standard interface designed for IBM PC-compatible computers. It was first developed by Western Digital and Compaq in 1986 for compatible hard drives and CD or DVD drives. The connection is used for storage devices such as hard disk drives, floppy disk drives, and optical disc drives in computers. The standard is maintained by the X3/INCITS committee. It uses the underlying (ATA) and Packet Interface (ATAPI) standards. The Parallel ATA standard is the result of a long history of incremental technical development, which began with the original AT Attachment interface, developed for use in early PC AT equipment. The ATA interface itself evolved in several stages from Western Digital's original Integrated Drive Electronics (IDE) interface. As a result, many near-synonyms for ATA/ATAPI and its previous incarnations are still in common informal use, in particular Extended IDE (EIDE) and Ultra ATA (UATA). After the introduction of SATA in 2003, the original ATA was renamed to Parallel ATA, or PATA for short. Parallel ATA cables have a maximum allowable length of . Because of this limit, the technology normally appears as an internal computer storage interface. For many years, ATA provided the most common and the least expensive interface for this application. It has largely been replaced by SATA in newer systems. History and terminology The standard was originally conceived as the "AT Bus Attachment," officially called "AT Attachment" and abbreviated "ATA" because its primary feature was a direct connection to the 16-bit ISA bus introduced with the IBM PC/AT. The original ATA specifications published by the standards committees use the name "AT Attachment". The "AT" in the IBM PC/AT referred to "Advanced Technology" so ATA has also been referred to as "Advanced Technology Attachment". When a newer Serial ATA (SATA) was introduced in 2003, the original ATA was renamed to Parallel ATA, or PATA for short. Physical ATA interfaces became a standard component in all PCs, initially on host bus adapters, sometimes on a sound card but ultimately as two physical interfaces embedded in a Southbridge chip on a motherboard. Called the "primary" and "secondary" ATA interfaces, they were assigned to base addresses 0x1F0 and 0x170 on ISA bus systems. They were replaced by SATA interfaces. IDE and ATA-1 The first version of what is now called the ATA/ATAPI interface was developed by Western Digital under the name Integrated Drive Electronics (IDE). Together with Compaq Computer (the initial customer), they worked with various disk drive manufacturers to develop and ship early products with the goal of remaining software compatible with the existing IBM PC hard drive interface. The first such drives appeared internally in Compaq PCs in 1986 and were first separately offered by Conner Peripherals as the CP342 in June 1987. The term Integrated Drive Electronics refers to the fact that the drive cont
https://en.wikipedia.org/wiki/Atari%205200
The Atari 5200 SuperSystem or simply Atari 5200 is a home video game console introduced in 1982 by Atari, Inc. as a higher-end complement for the popular Atari Video Computer System. The VCS was renamed to the Atari 2600 at the time of the 5200's launch. Created to compete with Mattel's Intellivision, the 5200 wound up a direct competitor of ColecoVision shortly after its release. While the Coleco system shipped with the first home version of Nintendo's Donkey Kong, the 5200 included the 1978 arcade game Super Breakout which had already appeared on the Atari 8-bit family and Atari VCS in 1979 and 1981 respectively. The CPU and the graphics and sound hardware are almost identical to that of the Atari 8-bit computers, although software is not directly compatible between the two systems. The 5200's controllers have an analog joystick and a numeric keypad along with start, pause, and reset buttons. The 360-degree non-centering joystick was touted as offering more control than the eight-way Atari CX40 joystick of the 2600, but was a focal point for criticism. On May 21, 1984, during a press conference at which the Atari 7800 was introduced, company executives revealed that the 5200 had been discontinued after less than two years on the market. Total sales of the 5200 were reportedly in excess of 1 million units, far short of its predecessor's sales of over 30 million. Hardware Much of the technology in the Atari 8-bit family of home computer was originally developed as a second-generation games console intended to replace the Atari Video Computer System console. However, as the system was reaching completion, the personal computer revolution was starting with the release of machines like the Commodore PET, TRS-80, and Apple II. These machines had less advanced hardware than the new Atari technology, but sold for much higher prices with associated higher profit margins. Atari's management decided to enter this market, and the technology was repackaged into the Atari 400 and 800. The chipset used in these machines was created with the mindset that the VCS would likely be obsolete by 1980. Atari later decided to re-enter the games market with a design that closely matched their original 1978 specifications. In its prototype stage, the Atari 5200 was originally called the "Atari Video System X – Advanced Video Computer System", and was codenamed "Pam" after a female employee at Atari, Inc. It is also rumored that PAM actually stood for "Personal Arcade Machine", as the majority of games for the system ended up being arcade conversions. Actual working Atari Video System X machines, whose hardware is 100% identical to the Atari 5200 do exist, but are extremely rare. The initial 1982 release of the system featured four controller ports, where nearly all other systems of the day had only one or two ports. The 5200 also featured a new style of controller with an analog joystick, numeric keypad, two fire buttons on each side of the controller and game fun
https://en.wikipedia.org/wiki/Active%20Directory
Active Directory (AD) is a directory service developed by Microsoft for Windows domain networks. Windows Server operating systems include it as a set of processes and services. Originally, only centralized domain management used Active Directory. However, it ultimately became an umbrella title for various directory-based identity-related services. A domain controller is a server running the Active Directory Domain Service (AD DS) role. It authenticates and authorizes all users and computers in a Windows domain-type network, assigning and enforcing security policies for all computers and installing or updating software. For example, when a user logs into a computer part of a Windows domain, Active Directory checks the submitted username and password and determines whether the user is a system administrator or a non-admin user. Furthermore, it allows the management and storage of information, provides authentication and authorization mechanisms, and establishes a framework to deploy other related services: Certificate Services, Active Directory Federation Services, Lightweight Directory Services, and Rights Management Services. Active Directory uses Lightweight Directory Access Protocol (LDAP) versions 2 and 3, Microsoft's version of Kerberos, and DNS. Robert R. King defined it in the following way: History Like many information-technology efforts, Active Directory originated out of a democratization of design using Requests for Comments (RFCs). The Internet Engineering Task Force (IETF) oversees the RFC process and has accepted numerous RFCs initiated by widespread participants. For example, LDAP underpins Active Directory. Also, X.500 directories and the Organizational Unit preceded the Active Directory concept that uses those methods. The LDAP concept began to emerge even before the founding of Microsoft in April 1975, with RFCs as early as 1971. RFCs contributing to LDAP include RFC 1823 (on the LDAP API, August 1995), RFC 2307, RFC 3062, and RFC 4533. Microsoft previewed Active Directory in 1999, released it first with Windows 2000 Server edition, and revised it to extend functionality and improve administration in Windows Server 2003. Active Directory support was also added to Windows 95, Windows 98, and Windows NT 4.0 via patch, with some unsupported features. Additional improvements came with subsequent versions of Windows Server. In Windows Server 2008, Microsoft added further services to Active Directory, such as Active Directory Federation Services. The part of the directory in charge of managing domains, which was a core part of the operating system, was renamed Active Directory Domain Services (ADDS) and became a server role like others. "Active Directory" became the umbrella title of a broader range of directory-based services. According to Byron Hynes, everything related to identity was brought under Active Directory's banner. Active Directory Services Active Directory Services consist of multiple directory services. The b
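Because AD DS exposes its data over LDAP, directory objects can be queried with any LDAP client; the sketch below uses the third-party Python ldap3 package (an assumption for this example, not a component of Active Directory itself), with placeholder host, credentials, and base DN.

```python
from ldap3 import Server, Connection, ALL  # third-party package: pip install ldap3

# Placeholders: substitute a real domain controller, account, and base DN.
server = Server("dc01.example.com", get_info=ALL)
conn = Connection(server, user="reader@example.com", password="secret", auto_bind=True)

# Find user objects and read a couple of common attributes.
conn.search(
    search_base="dc=example,dc=com",
    search_filter="(&(objectClass=user)(objectCategory=person))",
    attributes=["sAMAccountName", "displayName"],
)
for entry in conn.entries:
    print(entry.sAMAccountName, "-", entry.displayName)
```

In production the connection would normally use LDAPS or signing rather than a simple bind over plain LDAP.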
https://en.wikipedia.org/wiki/Ai
AI is artificial intelligence, intellectual ability in machines and robots. Ai, AI or A.I. may also refer to: Animals Ai (chimpanzee), an individual experimental subject in Japan Ai (sloth) or the pale-throated sloth, northern Amazonian mammal species Arts, entertainment and media Works Ai (album), a 2004 release by Seraphim A.I. Artificial Intelligence, a 2001 American film A.I. Rising, a 2018 Serbian film AI: The Somnium Files, a 2019 video game American Idol, televised singing contest The American Interest, a bimonthly magazine (2005–2020) I (2015 film), an Indian Tamil film (initial title: Ai) Other uses in arts and media A.i. (band), a Californian rock–electroclash group All in (poker), wagering one's entire stake Appreciation Index, a British measure of broadcast programme approval The Art Institutes, a chain of American art schools Non-player character, in gaming (colloquially, an AI) Business , a phrase in job titles Appreciative inquiry, an organizational development method All-inclusive, a full service at a vacation resort including meals and drinks Organizations and businesses Accuracy International, a firearms manufacturer Adventure International, a video game publisher Air India, the flag carrier airline of India, based in Delhi Alitalia, the former flag carrier airline of Italy Astra International, an Indonesian automotive company Alexis I. duPont High School, Delaware, U.S. Amnesty International, a human rights organisation Appraisal Institute, an association of real estate appraisers The Art Institutes, a chain of art schools People Ai (surname), a Chinese surname Ai (given name), a given name and list of people and characters with the name King Ai of Zhou (died 441 BC) Emperor Ai of Han (27–1 BC) Emperor Ai of Jin (341–365) Emperor Ai of Tang (892–908) Ai (poet) (1947–2010), American poet Ai (singer) (born 1981), Japanese-American singer and songwriter Allen Iverson (born 1975), American retired professional basketball player ("A.I.") Andre Iguodala (born 1984), American professional basketball player ("A.I. 2.0") Places Areas Anguilla, a Caribbean territory (by ISO 3166-1 code) Appenzell Innerrhoden, a Swiss canton Cities Ai (Canaan), Biblical city United States Ai, Alabama Ai, Georgia Ai, North Carolina Ai, Ohio Landforms Religion, philosophy and mythology Ái, a Norse god Ai (Canaan), Biblical city Ai (), Sinic concepts of love from Confucianism and Buddhism , colloquially , a Greek word for 'saint' Ai Toyon, the Yakut god of light Science and technology Agricultural science Active ingredient, part of a pesticide Artificial insemination of livestock and pets, in animal breeding Air force and aviation Airborne Internet, a proposed air-to-air data network Airborne Interception radar, a Royal Air Force air-to-air system Air interdiction, an aerial military capability Attitude indicator, a flight instrument on aircraft The Internet .ai, a top-level domain
https://en.wikipedia.org/wiki/AI-complete
In the field of artificial intelligence, the most difficult problems are informally known as AI-complete or AI-hard, implying that the difficulty of these computational problems, assuming intelligence is computational, is equivalent to that of solving the central artificial intelligence problem—making computers as intelligent as people, or strong AI. To call a problem AI-complete reflects an attitude that it would not be solved by a simple specific algorithm. AI-complete problems are hypothesised to include computer vision, natural language understanding, and dealing with unexpected circumstances while solving any real-world problem. Currently, AI-complete problems cannot be solved with modern computer technology alone, but would also require human computation. This property could be useful, for example, to test for the presence of humans as CAPTCHAs aim to do, and for computer security to circumvent brute-force attacks. History The term was coined by Fanya Montalvo by analogy with NP-complete and NP-hard in complexity theory, which formally describes the most famous class of difficult problems. Early uses of the term are in Erik Mueller's 1987 PhD dissertation and in Eric Raymond's 1991 Jargon File. AI-complete problems AI-complete problems are hypothesized to include: AI peer review (composite natural language understanding, automated reasoning, automated theorem proving, formalized logic expert system) Bongard problems Computer vision (and subproblems such as object recognition) Natural language understanding (and subproblems such as text mining, machine translation, and word-sense disambiguation) Autonomous driving Dealing with unexpected circumstances while solving any real-world problem, whether it's navigation or planning or even the kind of reasoning done by expert systems. Machine translation To translate accurately, a machine must be able to understand the text. It must be able to follow the author's argument, so it must have some ability to reason. It must have extensive world knowledge so that it knows what is being discussed — it must at least be familiar with all the same commonsense facts that the average human translator knows. Some of this knowledge is in the form of facts that can be explicitly represented, but some knowledge is unconscious and closely tied to the human body: for example, the machine may need to understand how an ocean makes one feel to accurately translate a specific metaphor in the text. It must also model the authors' goals, intentions, and emotional states to accurately reproduce them in a new language. In short, the machine is required to have a wide variety of human intellectual skills, including reason, commonsense knowledge and the intuitions that underlie motion and manipulation, perception, and social intelligence. Machine translation, therefore, is believed to be AI-complete: it may require strong AI to be done as well as humans can do it. Software brittleness Current AI systems can s
https://en.wikipedia.org/wiki/File%20archiver
A file archiver is a computer program that combines a number of files together into one archive file, or a series of archive files, for easier transportation or storage. File archivers may employ lossless data compression in their archive formats to reduce the size of the archive. Basic archivers just take a list of files and concatenate their contents sequentially into archives. The archive files need to store metadata, at least the names and lengths of the original files, so that proper reconstruction is possible. More advanced archivers store additional metadata, such as the original timestamps, file attributes or access control lists. The process of making an archive file is called archiving or packing. Reconstructing the original files from the archive is termed unarchiving, unpacking or extracting. History An early archiver was the Multics command archive, descended from the CTSS command of the same name, which was a basic archiver and performed no compression. Multics also had a "tape_archiver" command, abbreviated ta, which was perhaps the forerunner of the Unix command tar. Unix archivers The Unix tools ar, tar, and cpio act as archivers but not compressors. Users of the Unix tools use additional compression tools, such as gzip, bzip2, or xz, to compress the archive file after packing, or to decompress it before unpacking. The filename extensions are successively added at each step of this process. For example, archiving a collection of files with tar and then compressing the resulting archive file with gzip results in a file with a .tar.gz extension. This approach has two goals: It follows the Unix philosophy that each program should accomplish a single task to perfection, as opposed to attempting to accomplish everything with one tool. As compression technology progresses, users may use different compression programs without having to modify or abandon their archiver. The archives use solid compression. When the files are combined, the compressor can exploit redundancy across several archived files and achieve better compression than a compressor that compresses each file individually. This approach, however, has disadvantages too: Extracting or modifying one file is difficult. Extracting one file requires decompressing an entire archive, which can be time- and space-consuming. Modifying one means the file needs to be put back into the archive and the archive recompressed. This operation requires additional time and disk space. The archive becomes damage-prone. If the area holding shared data for several files is damaged, all those files are lost. It is impossible to take advantage of redundancy between files unless the compression window is larger than the size of an individual file. For example, gzip uses DEFLATE, which typically operates with a 32768-byte window, whereas bzip2 uses a Burrows–Wheeler transform with a block roughly 27 times bigger. xz defaults to 8 MiB but supports significantly larger windows. Windows archi
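The tar-then-gzip workflow described in the row above can be illustrated with Python's standard-library tarfile module; this is a minimal sketch with made-up file names, not part of the original article.

```python
import tarfile

# Pack several files into one archive and gzip-compress it in a single step:
# "w:gz" writes a plain tar container and then applies gzip (DEFLATE) on top,
# which is what produces the conventional .tar.gz double extension.
with tarfile.open("backup.tar.gz", "w:gz") as archive:
    for name in ["notes.txt", "report.pdf", "data.csv"]:
        archive.add(name)

# Unpacking reverses both steps: gzip decompression, then tar extraction.
with tarfile.open("backup.tar.gz", "r:gz") as archive:
    archive.extractall("restored/")
```

The tar container is what carries the metadata the article mentions (names, sizes, timestamps, permissions), while gzip only compresses the resulting byte stream, which is why the two concerns stay in separate tools under the Unix approach.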
https://en.wikipedia.org/wiki/AIM%20%28software%29
AIM (AOL Instant Messenger) was an instant messaging and presence computer program created by AOL, which used the proprietary OSCAR instant messaging protocol and the TOC protocol to allow registered users to communicate in real time. AIM was popular by the late 1990s in the United States and other countries, and was the leading instant messaging application in that region into the following decade. Teens and college students were known to use the messenger's away message feature to keep in touch with friends, frequently changing their away message throughout the day or leaving a message up with the computer left on to inform buddies of their goings-on, location, parties, thoughts, or jokes. AIM's popularity declined as AOL's subscriber base shrank, falling steeply towards the 2010s as Gmail's Google Talk, SMS, and Internet social networks such as Facebook gained popularity. Its fall has often been compared with other once-popular Internet services, such as Myspace. In June 2015, AOL was acquired by Verizon Communications. In June 2017, Verizon combined AOL and Yahoo into its subsidiary Oath Inc. (now called Yahoo). The company discontinued AIM as a service on December 15, 2017. History In May 1997, AIM was released unceremoniously as a stand-alone download for Microsoft Windows. AIM was an outgrowth of "online messages" in the original platform written in PL/1 on a Stratus computer by Dave Brown. At one time, the software had the largest share of the instant messaging market in North America, especially in the United States (with 52% of the total reported). This does not include other instant messaging software related to or developed by AOL, such as ICQ and iChat. During its heyday, its main competitors were ICQ (which AOL acquired in 1998), Yahoo! Messenger and MSN Messenger. AOL particularly had a rivalry or "chat war" with PowWow and Microsoft, starting in 1999. There were several attempts by Microsoft to simultaneously log into their own and AIM's protocol servers. AOL was unhappy about this and started blocking MSN Messenger from being able to access AIM. This led to efforts by many companies to challenge the AOL and Time Warner merger on the grounds of antitrust behaviour, prompting the formation of the OpenNet Coalition. Official mobile versions of AIM appeared as early as 2001 on Palm OS through the AOL application. Third-party applications allowed it to be used in 2002 for the Sidekick. A version for Symbian OS was announced in 2003, as were others for BlackBerry and Windows Mobile. After 2012, stand-alone official AIM client software included advertisements and was available for Microsoft Windows, Windows Mobile, Classic Mac OS, macOS, Android, iOS, and BlackBerry OS. Usage decline and product sunset Around 2011, AIM started to lose popularity rapidly, partly due to the quick rise of Gmail and its built-in real-time Google Chat instant messenger integration in 2011, and because many people migrated to SMS or iMessages text m
https://en.wikipedia.org/wiki/Association%20for%20Computing%20Machinery
The Association for Computing Machinery (ACM) is a US-based international learned society for computing. It was founded in 1947 and is the world's largest scientific and educational computing society. The ACM is a non-profit professional membership group, reporting nearly 110,000 student and professional members. Its headquarters are in New York City. The ACM is an umbrella organization for academic and scholarly interests in computer science (informatics). Its motto is "Advancing Computing as a Science & Profession". History In 1947, a notice was sent to various people: On January 10, 1947, at the Symposium on Large-Scale Digital Calculating Machinery at the Harvard Computation Laboratory, Professor Samuel H. Caldwell of Massachusetts Institute of Technology spoke of the need for an association of those interested in computing machinery, and of the need for communication between them. [...] After making some inquiries during May and June, we believe there is ample interest to start an informal association of many of those interested in the new machinery for computing and reasoning. Since there has to be a beginning, we are acting as a temporary committee to start such an association: E. C. Berkeley, Prudential Insurance Co. of America, Newark, N. J. R. V. D. Campbell, Raytheon Manufacturing Co., Waltham, Mass. J. H. Curtiss, Bureau of Standards, Washington, D.C. H. E. Goheen, Office of Naval Research, Boston, Mass. J. W. Mauchly, Electronic Control Co., Philadelphia, Pa. T. K. Sharpless, Moore School of Elec. Eng., Philadelphia, Pa. R. Taylor, Mass. Inst. of Tech., Cambridge, Mass. C. B. Tompkins, Engineering Research Associates, Washington, D.C. The committee (except for Curtiss) had gained experience with computers during World War II: Berkeley, Campbell, and Goheen helped build Harvard Mark I under Howard H. Aiken, Mauchly and Sharpless were involved in building ENIAC, Tompkins had used "the secret Navy code-breaking machines", and Taylor had worked on Bush's differential analyzers. The ACM was then founded in 1947 under the name Eastern Association for Computing Machinery, which was changed the following year to the Association for Computing Machinery. The ACM History Committee since 2016 has published the A. M. Turing Oral History project, the ACM Key Award Winners Video Series, and the India Industry Leaders Video project. Activities ACM is organized into over 246 local professional chapters and 38 Special Interest Groups (SIGs), through which it conducts most of its activities. Additionally, there are over 833 college and university chapters. The first student chapter was founded in 1961 at the University of Louisiana at Lafayette. Many of the SIGs, such as SIGGRAPH, SIGDA, SIGPLAN, SIGCSE and SIGCOMM, sponsor regular conferences, which have become famous as the dominant venues for presenting innovations in certain fields. The groups also publish a large number of specialized journals, magazines, and newsletters. ACM also sponsors other comput
https://en.wikipedia.org/wiki/The%20Bush%20%28Alaska%29
In Alaska, the Bush typically refers to any region of the state that is not connected to the North American road network or does not have ready access to the state's ferry system. A large proportion of Alaska Native populations live in the Bush, often depending on subsistence hunting and fishing. Geographically, the Bush comprises the Alaska North Slope; Northwest Arctic; West, including the Baldwin and Seward Peninsulas; the Yukon-Kuskokwim Delta; Southwest Alaska; Bristol Bay; Alaska Peninsula; and remote areas of the Alaska Panhandle and Interior. Some of the hub communities in the Bush, which typically can be reached by larger, commercial airplanes, include Bethel, Dillingham, King Salmon, Nome, Utqiagvik, Kodiak Island, Kotzebue, and Unalaska-Dutch Harbor. Most parts of Alaska that are off the road or ferry system can be reached by small bush airplanes. Travel between smaller communities or to and from hub communities is typically accomplished by snowmobiles, boats, or ATVs.
https://en.wikipedia.org/wiki/AMOS%20%28programming%20language%29
AMOS BASIC is a dialect of the BASIC programming language for the Amiga computer. Following on from the successful STOS BASIC for the Atari ST, AMOS BASIC was written for the Amiga by François Lionet with Constantin Sotiropoulos and published by Europress Software in 1990. History AMOS competed on the Amiga platform with Acid Software's Blitz BASIC. Both BASICs differed from other dialects on different platforms in that they allowed the easy creation of fairly demanding multimedia software, with full structured code and many high-level functions to load images, animations, sounds and display them in various ways. The original AMOS was a BASIC interpreter which, whilst working fine, suffered the same disadvantages as any language that is run interpretively. By all accounts, AMOS was extremely fast among interpreted languages, being speedy enough that an extension called AMOS 3D could produce playable 3D games even on plain 7 MHz 68000 Amigas. Later, an AMOS compiler was developed that further increased speed. AMOS could also run MC68000 machine code, loaded into a program's memory banks. To simplify animation of sprites, AMOS included the AMOS Animation Language (AMAL), a compiled sprite scripting language which runs independently of the main AMOS BASIC program. It was also possible to control screen and "rainbow" effects using AMAL scripts. AMAL scripts in effect created CopperLists, small routines executed by the Amiga's Agnus chip. After the original version of AMOS, Europress released a compiler (AMOS Compiler), and two other versions of the language: Easy AMOS, a simpler version for beginners, and AMOS Professional, a more advanced version with added features, such as a better integrated development environment, ARexx support, a new user interface API, and new flow control constructs. Neither of these new versions was significantly more popular than the original AMOS. AMOS was used mostly to make multimedia software, video games (platformers and graphical adventures) and educational software. The language was mildly successful within the Amiga community. Its ease of use made it especially attractive to beginners. Perhaps AMOS BASIC's biggest disadvantage, stemming from its Atari ST lineage, was its incompatibility with the Amiga's operating system functions and interfaces. Instead, AMOS BASIC controlled the computer directly, which caused programs written in it to have a non-standard user interface, and also caused compatibility problems with newer versions of hardware. Today, the language has declined in popularity along with the Amiga computer for which it was written. Despite this, a small community of enthusiasts are still using it. The source code to AMOS was released around 2001 under a BSD-style license by Clickteam, a company that includes the original programmer. Software Software written using AMOS BASIC includes: Miggybyte Scorched Tanks Games by Vulcan Software, amongst which was the Valhalla trilogy Amiga version of