Thursday 28 February 2008

CHARLES BABBAGE


  • Born: 26 December 1792
  • Birthplace: Teignmouth, Devonshire, England
  • Died: 18 October 1871
  • Best Known As: Inventor of the Difference Engine

Charles Babbage was a 19th-century mathematician and inventor whose mechanical calculating machines earned him a top spot in the history of early computing. Babbage's early career was devoted to practical applications of science, particularly in manufacturing, but he is most famous for his work on what he called the Difference Engine and, later, the Analytical Engine. As early as 1822 he speculated that a machine could be used to compute complex mathematical problems and to calculate and correct errors in logarithm tables and astronomical charts. He obtained government grants and began work on the Difference Engine, only to decide later that it would be easier to scrap the work and start fresh on a new idea, the Analytical Engine. The British government withdrew funding in 1842 and stuck the incomplete Difference Engine in the Science Museum, where it still sits. Babbage, using his own money, spent the rest of his life working on the Analytical Engine, but never finished it. He was assisted by Lord Byron's daughter, Ada Augusta, Countess of Lovelace and an amateur mathematician. In spite of his failure to complete a working machine, Babbage (and Lady Lovelace) are legendary heroes of the prehistory of the computing age; he is sometimes called "the grandfather of modern computing."

Babbage created the first reliable actuarial tables, invented skeleton keys and the locomotive cowcatcher... In 1847 he invented an ophthalmoscope to study the retina, but didn't announce the invention and didn't get any credit for it... Lady Lovelace also joined Babbage in his failed attempts to create an infallible system of betting on horse races... The work of Babbage and Lady Lovelace is central to the speculative novel The Difference Engine (1992), written by Bruce Sterling and William Gibson.


Scientist: Charles Babbage


[b. Teignmouth, England, December 26, 1792, d. London, October 18, 1871]

Babbage designed and partially built the first mechanical computers. In 1832 he built a demonstration model of his first advanced calculator, the Difference Engine, designed to compute logarithms and other functions. This model worked to some degree, and Babbage's plans were later used to create fully functioning versions. Babbage also designed a device he called the Analytical Engine. This was supposed to use punched cards as input for problem-solving programs and have the equivalent of memory and a central processing system, but it was never built. Babbage was the first to use mathematics to study a complex system (the English postal system); his ideas led to the flat-rate postage stamp. He was also an inventor credited with the first ophthalmoscope, speedometer, skeleton key, and cowcatcher for locomotives.


Biography: Charles Babbage
Charles Babbage (1791-1871) was an English inventor and mathematician whose mathematical machines foreshadowed the modern computer. He was a pioneer in the scientific analysis of production systems.

Charles Babbage was born on Dec. 26, 1791, in Totnes, Devonshire. Much of his early education was under private tutors. In 1810 he matriculated at Trinity College, Cambridge. Appalled by the state of mathematical instruction there, Babbage helped to organize the Analytical Society, which played a decisive role in weakening the grip of blind Newton-worship at Cambridge and Oxford.

In 1814, the same year in which he took his degree, Babbage married Georgiana Whitmore. They had eight children, only three of whom survived to maturity. Mrs. Babbage died in 1827.

Mathematical Engines

In 1822 Babbage produced the first model of the calculating engine that would be the consuming interest of his life. The machine produced mathematical tables, and since its operation was based upon the mathematical theory of finite differences, he called it a "difference engine." The government was interested, and a vague promise of financial assistance encouraged Babbage to begin building a full-scale machine.
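The method of finite differences that gave the engine its name can be sketched in a few lines: once the first value and the difference rows of a polynomial are known, every further table entry follows from repeated addition alone, with no multiplication, which is exactly what made the method suitable for gears and wheels. (The code below is an illustrative sketch, not a description of the engine's actual mechanism.)

```python
# Sketch of the method of finite differences: tabulate a polynomial
# using only addition, the operation the Difference Engine mechanized.

def difference_table(poly, start, count):
    """Tabulate poly (coefficients, highest power first) at
    start, start+1, ... using only repeated addition."""
    degree = len(poly) - 1
    # Seed the difference columns from the first degree+1 exact values.
    values = [sum(c * (start + i) ** p
                  for c, p in zip(poly, range(degree, -1, -1)))
              for i in range(degree + 1)]
    diffs = [values[:]]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([b - a for a, b in zip(row, row[1:])])
    # The leading entry of each column drives the add-and-carry cycle.
    state = [col[0] for col in diffs]
    table = []
    for _ in range(count):
        table.append(state[0])
        for i in range(len(state) - 1):
            state[i] += state[i + 1]
    return table

# x^2 + x + 41 (Euler's prime-generating polynomial):
print(difference_table([1, 1, 41], 0, 5))  # [41, 43, 47, 53, 61]
```

After the seeding step, each new table entry costs only one addition per difference column, which is why a purely mechanical adder could produce entire mathematical tables.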

But he had underestimated the difficulties. Many of the precision machine tools needed to shape the wheels, gears, and cranks of the engine did not exist. Babbage and his craftsmen had to design them. The consequent delays worried the government, and the financial support was tied up in red tape.

Meanwhile the conception of a far grander engine had entered Babbage's restless brain, the "analytical engine." It would possess (in modern language) a feedback mechanism and would be able to perform any mathematical operation. Babbage asked the government for a decision on which engine to finish. After an 8-year pause for thought, the government indicated that it wanted neither.

Between bouts with the government and work on his engines, the versatile Babbage managed to squeeze in an incredible variety of activities. He wrote on mathematics, the decline of science in England, codes and ciphers, the rationalization of manufacturing processes, religion, archeology, tool design, and submarine navigation, among other subjects. He was Lucasian professor of mathematics at Cambridge for 10 years, but he was better known for his interminable campaign against organ-grinders in the streets of London.

Always he returned to his great engines, but none of them was ever finished. He died on Oct. 18, 1871, having played a prominent part in the 19th-century revival of British science.

Further Reading

The best source on Babbage is Philip Morrison and Emily Morrison, eds., Charles Babbage and His Calculating Engines: Selected Writings by Charles Babbage and Others (1961). It contains an excellent short biography by the Morrisons, a selection of Babbage's works, and associated material on the engines. For more details on Babbage's life see Maboth Moseley, Irascible Genius: A Life of Charles Babbage, Inventor (1964).

OUTPUT DEVICE

An output device is any piece of computer hardware equipment used to communicate the results of data processing carried out by an information processing system (such as a computer) to the outside world.

In computing, input/output, or I/O, refers to the communication between an information processing system (such as a computer), and the outside world. Inputs are the signals or data sent to the system, and outputs are the signals or data sent by the system to the outside.

The most common input devices used by the computer are the keyboard and mouse. The keyboard allows the entry of textual information while the mouse allows the selection of a point on the screen by moving a screen cursor to the point and pressing a mouse button. The most common outputs are monitors and speakers.

Some common output devices

Visual display unit
A visual display unit (also called VDU, monitor, or screen) offers a two-dimensional visual presentation of information.
Speaker
A speaker can be used for various sounds meant to alert the user, as well as music and spoken text.
Printer
Printers produce a permanent hard copy of the information on paper.

INPUT DEVICE

input device

An input device is a hardware mechanism that transforms information in the external world for consumption by a computer. Often, input devices are under direct control by a human user, who uses them to communicate commands or other information to be processed by the computer, which may then transmit feedback to the user through an output device. Input and output devices together make up the hardware interface between a computer and the user or external world. Typical examples of input devices include keyboards and mice. However, there are others which provide many more degrees of freedom. In general, any sensor which monitors, scans for and accepts information from the external world can be considered an input device, whether or not the information is under the direct control of a user.

History

The von Neumann architecture of 1945 already included the definition of an input device, though architectures with devices designed purely for input had been conceived as early as 1936. The von Neumann architecture describes a device for entering user data, which is kept separate from the algorithm's data and code; typical examples were the keyboard and the punched-card reader.

The computer mouse was invented by Doug Engelbart in the 1960s.

Classification

Many input devices can be classified according to:

  • the modality of input (e.g. mechanical motion, audio, visual, etc.)
  • whether the input is discrete (e.g. keypresses) or continuous (e.g. a mouse's position, though digitized into a discrete quantity, is high-resolution enough to be thought of as continuous)
  • the number of degrees of freedom involved (e.g. many mice allow 2D positional input, but some devices allow 3D input, such as the Logitech Magellan Space Mouse)

Pointing devices, which are input devices used to specify a position in space, can further be classified according to:

  • Whether the input is direct or indirect. With direct input, the input space coincides with the display space, i.e. pointing is done in the space where visual feedback or the cursor appears. Touchscreens and light pens involve direct input. Examples involving indirect input include the mouse and trackball.
  • Whether the positional information is absolute (e.g. on a touch screen) or relative (e.g. with a mouse that can be lifted and repositioned)

Note that direct input is almost necessarily absolute, but indirect input may be either absolute or relative. For example, digitizing graphics tablets that do not have an embedded screen involve indirect input; they sense absolute positions and are often run in an absolute input mode, but they may also be set up to simulate a relative input mode in which the stylus or puck can be lifted and repositioned.
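The absolute/relative distinction can be made concrete with a small sketch. The two functions below are illustrative, not taken from any real driver: in absolute mode each sample maps directly to a screen position, while in relative mode only the deltas between consecutive samples move the cursor, so lifting and repositioning the device (modeled here as a `None` gap in the sample stream) moves the cursor not at all.

```python
# Hypothetical sketch of absolute vs. relative pointing: the same
# stream of position samples drives the cursor differently per mode.

def absolute_mode(samples, screen_w, screen_h, tablet_w, tablet_h):
    """Each sample maps directly to a screen position (touchscreen-style)."""
    return [(x * screen_w // tablet_w, y * screen_h // tablet_h)
            for x, y in samples]

def relative_mode(samples, start=(0, 0)):
    """Only deltas between consecutive samples move the cursor
    (mouse-style); a lift/reposition (None) produces no motion."""
    cx, cy = start
    path = []
    prev = None
    for s in samples:
        if s is None:              # device lifted off the surface
            prev = None
            continue
        if prev is not None:       # first contact after a lift: no delta
            cx += s[0] - prev[0]
            cy += s[1] - prev[1]
        prev = s
        path.append((cx, cy))
    return path

# The jump from (12,10) to (50,50) happens during a lift, so the
# relative cursor never sees it:
print(relative_mode([(10, 10), (12, 10), None, (50, 50), (52, 52)]))
# [(0, 0), (2, 0), (2, 0), (4, 2)]
```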

Early devices

Keyboards


Main article: Computer keyboard


Pointing devices

An Apple Pro mouse
Touchpad and a pointing stick on an IBM laptop

A pointing device is any computer hardware component (specifically, a human interface device) that allows a user to input spatial (i.e., continuous and multi-dimensional) data to a computer. CAD systems and graphical user interfaces (GUIs) allow the user to control and provide data to the computer using physical gestures - point, click, and drag - typically by moving a hand-held mouse across the surface of the physical desktop and activating switches on the mouse. Movements of the pointing device are echoed on the screen by movements of the mouse pointer (or cursor) and other visual changes.

While the most common pointing device by far is the mouse, many more devices have been developed. However, "mouse" is commonly used as a metaphor for devices that move the cursor.

For most pointing devices, Fitts' law can be used to predict the speed with which users can point at a given target position.
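The Fitts' law prediction mentioned above is usually written in the Shannon formulation T = a + b · log2(D/W + 1), where D is the distance to the target, W is the target's width, and a and b are constants fitted empirically per device; the constant values in this sketch are purely illustrative.

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time (seconds) under the Shannon formulation
    of Fitts' law: T = a + b * log2(D/W + 1).  The intercept a and
    slope b are device-specific empirical constants; the defaults
    here are illustrative, not measured."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A target twice as far away (or half as wide) is harder to hit:
print(fitts_time(256, 32))   # D/W = 8,  ID ~ 3.17 bits
print(fitts_time(512, 32))   # D/W = 16, ID ~ 4.09 bits
```

The logarithmic term (the "index of difficulty") is what makes the law useful for comparing pointing devices: doubling the distance or halving the target width adds roughly a constant amount of predicted time.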


High-degree of freedom input devices

Some devices allow many continuous degrees of freedom to be input, and could sometimes be used as pointing devices, but could also be used in other ways that don't conceptually involve pointing at a location in space.

Composite devices

Wii Remote with attached strap

Input devices, such as buttons and joysticks, can be combined on a single physical device that could be thought of as a composite device. Many gaming devices have controllers like this.

Imaging and Video input devices

Audio input devices


DATA PROCESSING

Data processing is any computer process that converts data into information or knowledge. The processing is usually assumed to be automated and running on a computer. Because data are most useful when well-presented and actually informative, data-processing systems are often referred to as information systems to emphasize their practicality. Nevertheless, both terms are roughly synonymous, performing similar conversions; data-processing systems typically manipulate raw data into information, and likewise information systems typically take raw data as input to produce information as output.

To better market their profession, computer programmers and systems analysts who might once (during the 1970s, say) have referred to the computer systems they produce as data-processing systems nowadays more often use some other term that includes the word information, such as information systems, information technology systems, or management information systems.

In the context of data processing, data are defined as numbers or characters that represent measurements from observable phenomena. A single datum is a single measurement from an observable phenomenon. Information is then algorithmically derived, logically deduced, and/or statistically calculated from multiple data (evidence). Information is defined as either a meaningful answer to a query or a meaningful stimulus that can cascade into further queries.

More generally, the term data processing can apply to any process that converts data from one format to another, although data conversion would be the more precise term. From this perspective, data processing becomes the process of converting information into data and of converting data back into information. The distinction is that conversion does not require a question (query) to be answered. For example, information in the form of a string of characters forming an English sentence is encoded from a keyboard's key-presses, represented by hardware-oriented codes, into ASCII codes, after which it can be processed more easily by a computer: not as merely raw, amorphous data, but as meaningful characters in a natural language's set of graphemes. Finally it is decoded and displayed as characters, represented by a font on the computer display. In that example we can see the stage-by-stage conversion, from the presence and then absence of electrical conductivity at each key-press and release, of raw, substantially meaningless hardware-oriented data into ever-more-meaningful information as the processing proceeds toward the human being.
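The stage-by-stage conversion described above can be sketched as a short pipeline. The scancode table here is entirely made up for illustration; real keyboards use protocol-specific code sets, and the point is only the succession of representations, from hardware-oriented codes to ASCII integers to displayable characters.

```python
# Sketch of the keypress-to-display conversion chain, with a
# hypothetical scancode table standing in for real hardware codes.

SCANCODES = {0x1C: 'a', 0x32: 'b', 0x21: 'c'}   # illustrative only

def keyboard_to_display(scancodes):
    """Hardware codes -> characters -> ASCII codes -> display text."""
    chars = [SCANCODES[sc] for sc in scancodes]        # decode hw events
    ascii_codes = [ord(ch) for ch in chars]            # encode as ASCII
    text = ''.join(chr(c) for c in ascii_codes)        # decode for display
    return ascii_codes, text

codes, text = keyboard_to_display([0x1C, 0x32, 0x21])
print(codes, text)   # [97, 98, 99] abc
```

Each stage carries the same underlying key-presses, but each representation is more meaningful to the human at the end of the chain than the raw hardware codes at the start.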

Conversely, that simple pedagogical example is usually described as an embedded system (for the software resident in the keyboard itself) or as (operating-)systems programming, because the information is derived from a hardware interface and may involve overt control of the hardware through that interface by an operating system. Typically, control of hardware by a device driver manipulating ASIC or FPGA registers is not viewed as part of data processing proper or information systems proper, but rather as the domain of embedded systems or (operating-)systems programming. A more conventional example of the established use of the term data processing is a business that has collected numerous data concerning an aspect of its operations and must present that multitude of data in meaningful, easy-to-access form for the managers who then use the information to increase revenue or decrease cost. That conversion and presentation of data as information is typically performed by a data-processing application.

When the domain from which the data are harvested is a science or an engineering discipline, data processing and information systems are considered too broad as terms, and the more specialized term data analysis is typically used, focusing on the highly specialized and highly accurate algorithmic derivations and statistical calculations that are less often observed in the typical general business environment. This divergence of culture shows up in the typical numerical representations used in data processing versus data analysis: data processing's measurements are typically represented by integers or by fixed-point or binary-coded decimal representations of numbers, whereas the majority of data analysis's measurements are often represented by floating-point representations of rational numbers.

Practically all naturally occurring processes can be viewed as examples of data processing systems where "observable" information in the form of pressure, light, etc., is converted by human observers into electrical signals in the nervous system as the senses we recognize as touch, sound, and vision. Even the interaction of non-living systems may be viewed in this way as rudimentary information processing systems. Conventional usage of the terms data processing and information systems restricts their use to refer to the algorithmic derivations, logical deductions, and statistical calculations that recur perennially in general business environments, rather than in the more expansive sense of all conversions of real-world measurements into real-world information in, say, an organic biological system or even a scientific or engineering system.

PERSONAL COMPUTER SYSTEM



A small, relatively inexpensive computer designed for an individual user. In price, personal computers range anywhere from a few hundred dollars to thousands of dollars. All are based on the microprocessor technology that enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular use for personal computers is for playing games.

Personal computers first appeared in the late 1970s. One of the first and most popular personal computers was the Apple II, introduced in 1977 by Apple Computer. During the late 1970s and early 1980s, new models and competing operating systems seemed to appear daily. Then, in 1981, IBM entered the fray with its first personal computer, known as the IBM PC. The IBM PC quickly became the personal computer of choice, and most other personal computer manufacturers fell by the wayside. One of the few companies to survive IBM's onslaught was Apple Computer, which remains a major player in the personal computer marketplace.

Other companies adjusted to IBM's dominance by building IBM clones, computers that were internally almost the same as the IBM PC, but that cost less. Because IBM clones used the same microprocessors as IBM PCs, they were capable of running the same software. Over the years, IBM has lost much of its influence in directing the evolution of PCs. Many of its innovations, such as the MCA expansion bus and the OS/2 operating system, have not been accepted by the industry or the marketplace.

Today, the world of personal computers is basically divided between Apple Macintoshes and PCs. The principal characteristics of personal computers are that they are single-user systems and are based on microprocessors. However, although personal computers are designed as single-user systems, it is common to link them together to form a network. In terms of power, there is great variety. At the high end, the distinction between personal computers and workstations has faded. High-end models of the Macintosh and PC offer the same computing power and graphics capability as low-end workstations by Sun Microsystems, Hewlett-Packard, and DEC.

INFORMATION TECHNOLOGY

Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and retrieve information securely.

Recently it has become popular to broaden the term to explicitly include the field of electronic communication, so that people tend to use the abbreviation ICT (Information and Communications Technology).

Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term is more recognizable than ever before. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.

Information Technology

Information technology, as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." Encompassing the computer and information systems industries, information technology is the capability to electronically input, process, store, output, transmit, and receive data and information, including text, graphics, sound, and video, as well as the ability to control machines of all kinds electronically.

Information technology comprises computers, networks, satellite communications, robotics, videotext, cable television, electronic mail ("e-mail"), electronic games, and automated office equipment. The information industry consists of all computer, communications, and electronics-related organizations, including hardware, software, and services. Completing tasks using information technology results in rapid processing and information mobility, as well as improved reliability and integrity of processed information.

History of Information Technology

The term "information technology" evolved in the 1970s. Its basic concept, however, can be traced to the World War II alliance of the military and industry in the development of electronics, computers, and information theory. After the 1940s, the military remained the major source of research and development funding for the expansion of automation to replace manpower with machine power.

Since the 1950s, four generations of computers have evolved. Each generation reflected a change to hardware of decreased size but increased capabilities to control computer operations. The first generation used vacuum tubes, the second used transistors, the third used integrated circuits, and the fourth used integrated circuits on a single computer chip. Advances in artificial intelligence that will minimize the need for complex programming characterize the fifth generation of computers, still in the experimental stage.

The first commercial computer was the UNIVAC I, developed by John Eckert and John W. Mauchly in 1951. It was used by the Census Bureau to predict the outcome of the 1952 presidential election. For the next twenty-five years, mainframe computers were used in large corporations to do calculations and manipulate large amounts of information stored in databases. Supercomputers were used in science and engineering, for designing aircraft and nuclear reactors, and for predicting worldwide weather patterns. Minicomputers came onto the scene in the early 1980s in small businesses, manufacturing plants, and factories.

In 1975, Micro Instrumentation and Telemetry Systems (MITS) introduced the Altair 8800, one of the first microcomputers. In 1976, Tandy Corporation's first Radio Shack microcomputer followed; the Apple microcomputer was introduced in 1977. The market for microcomputers increased dramatically when IBM introduced its first personal computer in the fall of 1981. Because of dramatic improvements in computer components and manufacturing, personal computers today do more than the largest computers of the mid-1960s at about a thousandth of the cost.

Computers today are divided into four categories by size, cost, and processing ability. They are supercomputer, mainframe, minicomputer, and microcomputer, more commonly known as a personal computer. Personal computer categories include desktop, network, laptop, and handheld.

Information Technology's Role Today

Every day, people use computers in new ways. Computers are increasingly affordable; they continue to be more powerful as information-processing tools as well as easier to use.

Computers in Business One of the first and largest applications of computers is keeping and managing business and financial records. Most large companies keep the employment records of all their workers in large databases that are managed by computer programs. Similar programs and databases are used in such business functions as billing customers; tracking payments received and payments to be made; and tracking supplies needed and items produced, stored, shipped, and sold. In fact, practically all the information companies need to do business involves the use of computers and information technology.

On a smaller scale, many businesses have replaced cash registers with point-of-sale (POS) terminals. These POS terminals not only print a sales receipt for the customer but also send information to a computer database when each item is sold to maintain an inventory of items on hand and items to be ordered. Computers have also become very important in modern factories. Computer-controlled robots now do tasks that are hot, heavy, or hazardous. Robots are also used to do routine, repetitive tasks in which boredom or fatigue can lead to poor quality work.

Computers in Medicine Information technology plays an important role in medicine. For example, a scanner takes a series of pictures of the body by means of computerized axial tomography (CAT) or magnetic resonance imaging (MRI). A computer then combines the pictures to produce detailed three-dimensional images of the body's organs. In addition, the MRI produces images that show changes in body chemistry and blood flow.

Computers in Science and Engineering Using supercomputers, meteorologists predict future weather by using a combination of observations of weather conditions from many sources, a mathematical representation of the behavior of the atmosphere, and geographic data.

Computer-aided design and computer-aided manufacturing programs, often called CAD/CAM, have led to improved products in many fields, especially where designs tend to be very detailed. Computer programs make it possible for engineers to analyze designs of complex structures such as power plants and space stations.

Integrated Information Systems With today's sophisticated hardware, software, and communications technologies, it is often difficult to classify a system as belonging uniquely to one specific application program. Organizations increasingly are consolidating their information needs into a single, integrated information system. One example is SAP, a German software package that runs on mainframe computers and provides an enterprise-wide solution for information technologies. It is a powerful database that enables companies to organize all their data into a single database, then choose only the program modules or tables they want. The freestanding modules are customized to fit each customer's needs.

Software

Computer software consists of the programs, or lists of instructions, that control the operation of a computer. Application software can be used for the following purposes:

  • As a productivity/business tool
  • To assist with graphics and multimedia projects
  • To support household activities, for personal business, or for education
  • To facilitate communications

Productivity Software Productivity software is designed to make people more effective and efficient when performing daily activities. It includes applications such as word processing, spreadsheets, databases, presentation graphics, personal information management, graphics and multimedia, communications, and other related types of software.

Word-processing software is used to create documents such as letters, memos, reports, mailing labels, and newsletters. This software is used to create attractive and professional-looking documents that are stored electronically, allowing them to be retrieved and revised. The software provides tools to correct spelling and grammatical mistakes, permits copying and moving text without rekeying, and provides tools to enhance the format of documents.

Electronic spreadsheet software is used in business environments to perform numeric calculations rapidly and accurately. Data are keyed into rows and columns on a worksheet, and formulas and functions are used to make fast and accurate calculations. Spreadsheets are used for "what-if" analyses and for creating charts based on information in a worksheet.

A database is a collection of data organized in a manner that allows access, retrieval, and use of that data. A database management system (DBMS) is used to create a computerized database; add, change, and delete data; sort and retrieve data from the database; and create forms and reports using the data in the database.

Presentation graphics software is used to create presentations, which can include clip-art images, pictures, video clips, and audio clips as well as text. A personal information manager is a software application that includes an appointment calendar, address book, and notepad to help organize personal information such as appointments and task lists.

Engineers, architects, desktop publishers, and graphic artists often use graphics and multimedia software such as computer-aided design, desktop publishing, video and audio entertainment, and Web page authoring. Software for communications includes groupware, e-mail, and Web browsers.

Hardware

Information processing involves four phases: input, process, output, and storage. Each of these phases and the associated devices are discussed below.

Input devices: Input devices include the keyboard, pointing devices, scanners and reading devices, digital cameras, audio and video input devices, and input devices for physically challenged users. Input devices are used to capture data at the earliest possible point in the workflow, so that the data are accurate and readily available for processing.

Processing: After data are captured, they are processed. When data are processed, they are transformed from raw facts into meaningful information. A variety of processes may be performed on the data, such as adding, subtracting, dividing, multiplying, sorting, organizing, formatting, comparing, and graphing. After processing, information is output, as a printed report, for example, or stored as files.
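Several of the processes listed above (organizing, calculating, sorting) can be shown in a minimal sketch that turns raw facts into meaningful information. The record layout and field names below are illustrative, not drawn from any particular system.

```python
# Minimal sketch of the processing phase: raw sales records are
# grouped, summed, and sorted into a report a manager can act on.

from collections import defaultdict

raw_records = [
    ("widgets", 120), ("gadgets", 75), ("widgets", 30), ("gadgets", 25),
]

totals = defaultdict(int)
for product, quantity in raw_records:   # organize: group by product
    totals[product] += quantity         # calculate: running sum

report = sorted(totals.items())         # sort: present in stable order
print(report)                           # [('gadgets', 100), ('widgets', 150)]
```

The raw tuples by themselves answer no question; the grouped and sorted totals do, which is the data-to-information transformation the text describes.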

Output devices: Four common types of output are text, graphics, audio, and video. Once information has been processed, it can be listened to through speakers or a headset, printed onto paper, or displayed on a monitor. An output device is any computer component capable of conveying information to a user. Commonly used output devices include display devices, printers, speakers, headsets, data projectors, fax machines, and multifunction devices. A multifunction device is a single piece of equipment that looks like a copy machine but provides the functionality of a printer, scanner, copy machine, and perhaps a fax machine.

Storage devices: Storage devices retain items such as data, instructions, and information for retrieval and future use. They include floppy disks or diskettes, hard disks, compact discs (both read-only and disc-recordable), tapes, PC cards, Smart Cards, microfilm, and microfiche.

Information and Data Processing

Data processing is the input, verification, organization, storage, retrieval, transformation, and extraction of information from data. The term is usually associated with commercial applications such as inventory control or payroll.

An information system refers to business applications of computers and consists of the databases, application programs, and manual and machine procedures and computer systems that process data. Databases store the master files of the business and its transaction files. Application programs provide the data entry, updating, and query and report processing. Manual procedures document the workflow, showing how the data are obtained for input and how the system's output is distributed. Machine procedures instruct the computers how to perform batch-processing activities, in which the output of one program is automatically fed into another program.

Daily processing is the interactive, real-time processing of transactions. Batch-processing programs are run at the end of the day (or some other period) to update the master files that have not been updated since the last cycle. Reports are printed for the cycle's activities. Periodic processing of an information system involves updating of the master files: adding, deleting, and changing the information about customers, employees, vendors, and products.
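The classic batch cycle of applying a transaction file to a master file can be sketched as follows. The record layout (a dict keyed by customer ID) and the add/change/delete transaction format are illustrative assumptions, not a description of any real system.

```python
# Hedged sketch of the batch master-file update described above: at the
# end of the cycle, accumulated transactions are applied to the master.

def update_master(master, transactions):
    """Apply add/change/delete transactions to a master dict keyed by ID."""
    updated = dict(master)
    for action, key, value in transactions:
        if action == "add":
            updated[key] = value
        elif action == "change":
            updated[key] = value          # assumes the record exists
        elif action == "delete":
            updated.pop(key, None)
    return updated

master = {"C001": "Ada Lovelace", "C002": "Charles Babbage"}
txns = [("add", "C003", "Georgiana Whitmore"), ("delete", "C002", None)]
print(update_master(master, txns))
# {'C001': 'Ada Lovelace', 'C003': 'Georgiana Whitmore'}
```

Because the master is copied before the transactions are applied, the old master survives the run, mirroring the grandfather-father-son retention that batch shops traditionally relied on.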

INFORMATION


The ASCII codes for the word "Wikipedia" represented in binary, the numeral system most commonly used for encoding computer information.
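The figure's encoding is easy to reproduce. The sketch below renders each character of "Wikipedia" as its 8-bit ASCII code in binary:

```python
# Render each character of "Wikipedia" as its 8-bit ASCII code in binary,
# reproducing the encoding shown in the figure above.
word = "Wikipedia"
bits = [format(ord(ch), "08b") for ch in word]
for ch, b in zip(word, bits):
    print(ch, b)   # e.g. W 01010111
```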


Information is the result of processing, gathering, manipulating and organizing data in a way that adds to the knowledge of the receiver. In other words, it is data placed in a context that gives it meaning.

Information as a concept bears a diversity of meanings, from everyday usage to technical settings. Generally speaking, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, and representation.

Many people speak of the Information Age as the advent of the knowledge age or knowledge society, the information society, and information technologies. Even though informatics, information science, and computer science are often in the spotlight, the word "information" is frequently used without careful consideration of the various meanings it has acquired.

Etymology

According to the Oxford English Dictionary, the earliest historical meaning of the word information in English was the act of informing, or giving form or shape to the mind, as in education, instruction, or training. A quote from 1387: "Five books come down from heaven for information of mankind." It was also used for an item of training, e.g. a particular instruction. "Melibee had heard the great skills and reasons of Dame Prudence, and her wise information and techniques." (1386)

The English word was apparently derived by adding the common "noun of action" ending "-ation" (descended through French from Latin "-tio") to the earlier verb to inform, in the sense of to give form to the mind, to discipline, instruct, teach: "Men so wise should go and inform their kings." (1330) Inform itself comes (via French) from the Latin verb informare, to give form to, to form an idea of. Furthermore, Latin already contained the word informatio, meaning concept or idea, but the extent to which this may have influenced the development of the word information in English is unclear.

As a final note, the ancient Greek word for form was eidos, and this word was famously used in a technical philosophical sense by Plato (and later Aristotle) to denote the ideal identity or essence of something (see Theory of forms). "Eidos" can also be associated with thought, proposition or even concept.

Information as a message

Information is the state of a system of interest; a message is that information materialized.

Information is a quality of a message from a sender to one or more receivers. Information is always about something (size of a parameter, occurrence of an event, etc). Viewed in this manner, information does not have to be accurate. It may be a truth or a lie, or just the sound of a kiss. Even a disruptive noise used to inhibit the flow of communication and create misunderstanding would in this view be a form of information. However, generally speaking, if the amount of information in the received message increases, the message is more accurate.

This model assumes there is a definite sender and at least one receiver. Many refinements of the model assume the existence of a common language understood by the sender and at least one of the receivers. An important variation identifies information as that which would be communicated by a message if it were sent from a sender to a receiver capable of understanding the message. However, in requiring the existence of a definite sender, the "information as a message" model does not attach any significance to the idea that information is something that can be extracted from an environment, e.g., through observation, reading or measurement.

Information is a term with many meanings depending on context, but is as a rule closely related to such concepts as meaning, knowledge, instruction, communication, representation, and mental stimulus. Simply stated, information is a message received and understood. In terms of data, it can be defined as a collection of facts from which conclusions may be drawn. There are many other aspects of information since it is the knowledge acquired through study or experience or instruction. But overall, information is the result of processing, manipulating and organizing data in a way that adds to the knowledge of the person receiving it.

In communication theory, information is a numerical measure of the uncertainty of an outcome. For example, we can say that "the signal contained thousands of bits of information". Communication theory tends to use the concept of information entropy, generally attributed to C. E. Shannon (see below).

Another form of information is Fisher information, a concept of R. A. Fisher. This is used in the application of statistics to estimation theory and to science in general. Fisher information is thought of as the amount of information that a message carries about an unobservable parameter. It can be computed from knowledge of the likelihood function defining the system. For example, with a normal likelihood function, the Fisher information about the mean is the reciprocal of the variance of the law. In the absence of knowledge of the likelihood law, the Fisher information may be computed from score data as their second moment (the variance of the score).
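The normal-likelihood case can be checked numerically. The sketch below (illustrative only; the parameter values are arbitrary) estimates the Fisher information about the mean as the variance of the score and compares it with 1/σ²:

```python
import random
import statistics

# For X ~ N(mu, sigma^2), the score with respect to mu is (x - mu) / sigma^2.
# The Fisher information is the variance of the score, which equals 1/sigma^2.
random.seed(0)
mu, sigma = 0.0, 2.0
scores = [(random.gauss(mu, sigma) - mu) / sigma**2 for _ in range(100_000)]

estimate = statistics.pvariance(scores)
print(estimate)   # close to 1 / sigma**2 = 0.25
```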

Although information and data are often used interchangeably, they are quite different. Data is a set of unorganized facts and, as such, is of little use until it is properly evaluated. Upon evaluation, once some significant relation among the data is found and they show some relevance, they are converted into information. The same data can then be used for different purposes. Thus, until data convey some information, they are not useful.

Measuring information entropy

The view of information as a message came into prominence with the publication in 1948 of an influential paper by Claude Shannon, "A Mathematical Theory of Communication." This paper provides the foundations of information theory and endows the word information not only with a technical meaning but also with a measure. If the sending device is equally likely to send any one of a set of N messages, then the preferred measure of "the information produced when one message is chosen from the set" is the base two logarithm of N. (This measure is called self-information.) In this paper, Shannon continues:

The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information. N such devices can store N bits…[1]
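Shannon's measure is simple to compute. A minimal sketch of the base-two logarithm measure described above:

```python
import math

def self_information(n_messages: int) -> float:
    """Bits conveyed when one of n equally likely messages is chosen."""
    return math.log2(n_messages)

# A two-position device (relay, flip-flop) stores log2(2) = 1 bit;
# choosing one of 8 equally likely messages conveys log2(8) = 3 bits.
print(self_information(2))  # 1.0
print(self_information(8))  # 3.0
```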

A complementary way of measuring information is provided by algorithmic information theory. In brief, this measures the information content of a list of symbols based on how predictable they are, or more specifically how easy it is to compute the list through a program: the information content of a sequence is the number of bits of the shortest program that computes it. The sequence below would have a very low algorithmic information measurement since it is a very predictable pattern, and as the pattern continues the measurement would not change. Shannon information would give the same information measurement for each symbol, since they are statistically random, and each new symbol would increase the measurement.

123456789101112131415161718192021
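The contrast can be illustrated with a general-purpose compressor standing in, very roughly, for the "shortest program" (a sketch, not a true Kolmogorov-complexity computation):

```python
import os
import zlib

# A highly predictable pattern compresses to almost nothing, while
# statistically random bytes are essentially incompressible.
periodic = b"01" * 500    # a predictable pattern, 1000 bytes
noise = os.urandom(1000)  # random bytes of the same length

print(len(zlib.compress(periodic)))  # a few dozen bytes at most
print(len(zlib.compress(noise)))     # roughly 1000 bytes
```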

It is important to recognize the limitations of traditional information theory and algorithmic information theory from the perspective of human meaning. For example, when referring to the meaning content of a message Shannon noted “Frequently the messages have meaning… these semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages” (emphasis in original).

In information theory, signals are part of a process, not a substance; they do something, they do not contain any specific meaning. Combining algorithmic information theory and information theory, we can conclude that the most random signal contains the most information, as it can be interpreted in any way and cannot be compressed.

Michael Reddy noted that "'signals' of the mathematical theory are 'patterns that can be exchanged'. There is no message contained in the signal, the signals convey the ability to select from a set of possible messages." In information theory "the system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design".

Information as a pattern

Information is any represented pattern. This view assumes neither accuracy nor directly communicating parties, but instead assumes a separation between an object and its representation. Consider the following example: economic statistics represent an economy, however inaccurately. What are commonly referred to as data in computing, statistics, and other fields, are forms of information in this sense. The electromagnetic patterns in a computer network and connected devices are related to something other than the pattern itself, such as text characters to be displayed and keyboard input. Signals, signs, and symbols are also in this category. On the other hand, according to semiotics, data is symbols with certain syntax and information is data with a certain semantic. Painting and drawing contain information to the extent that they represent something such as an assortment of objects on a table, a profile, or a landscape. In other words, when a pattern of something is transposed to a pattern of something else, the latter is information. This would be the case whether or not there was anyone to perceive it.

But if information can be defined merely as a pattern, does that mean that neither utility nor meaning are necessary components of information? Arguably a distinction must be made between raw unprocessed data and information which possesses utility, value or some quantum of meaning. On this view, information may indeed be characterized as a pattern; but this is a necessary condition, not a sufficient one.

An individual entry in a telephone book, which follows a specific pattern formed by name, address, and telephone number, does not become "informative" in some sense unless and until it possesses some degree of utility, value, or meaning. For example, someone might look up a girlfriend's number or might order a takeaway. The vast majority of numbers will never be construed as "information" in any meaningful sense. The gap between data and information is only closed by a behavioral bridge whereby some value, utility, or meaning is added to transform mere data or pattern into information.

When one constructs a representation of an object, one can selectively extract from the object (sampling) or use a system of signs to replace it (encoding), or both. Sampling and encoding result in representation. An example of the former is a "sample" of a product; an example of the latter is a "verbal description" of a product. Both contain information about the product, however inaccurate. When one interprets a representation, one can predict a broader pattern from a limited number of observations (inference) or understand the relation between patterns of two different things (decoding). An example of the former is sipping a soup to tell whether it is spoiled; an example of the latter is examining footprints to determine the animal and its condition. In both cases, the information sources are not constructed or presented by some "sender" of information. Regardless, information is dependent upon, but usually unrelated to and separate from, the medium or media used to express it. In other words, the position of a theoretical series of bits, or even the output once interpreted by a computer or similar device, is unimportant, except when someone or something is present to interpret the information. Therefore, a quantity of information is totally distinct from its medium.

Information as sensory input

Often information is viewed as a type of input to an organism or designed device. Inputs are of two kinds. Some inputs are important to the function of the organism (for example, food) or device (energy) by themselves. In his book Sensory Ecology, Dusenbery called these causal inputs. Other inputs (information) are important only because they are associated with causal inputs and can be used to predict the occurrence of a causal input at a later time (and perhaps another place). Some information is important because of association with other information but eventually there must be a connection to a causal input. In practice, information is usually carried by weak stimuli that must be detected by specialized sensory systems and amplified by energy inputs before they can be functional to the organism or device. For example, light is often a causal input to plants but provides information to animals. The colored light reflected from a flower is too weak to do much photosynthetic work but the visual system of the bee detects it and the bee's nervous system uses the information to guide the bee to the flower, where the bee often finds nectar or pollen, which are causal inputs, serving a nutritional function.

Information is any type of sensory input. When an organism with a nervous system receives an input, it transforms the input into an electrical signal. This is regarded as information by some. The idea of representation is still relevant, but in a slightly different manner. That is, while an abstract painting does not represent anything concretely, when the viewer sees the painting, it is nevertheless transformed into electrical signals that create a representation of the painting. Defined this way, information does not have to be related to truth, communication, or representation of an object. Entertainment in general is not intended to be informative. Music, the performing arts, amusement parks, works of fiction and so on are thus forms of information in this sense, but they are not necessarily forms of information according to some definitions given above. Consider another example: food supplies both nutrition and taste for those who eat it. If information is equated to sensory input, then nutrition is not information but taste is.

Information as an influence which leads to a transformation

Information is any type of pattern that influences the formation or transformation of other patterns. In this sense, there is no need for a conscious mind to perceive, much less appreciate, the pattern. Consider, for example, DNA. The sequence of nucleotides is a pattern that influences the formation and development of an organism without any need for a conscious mind. Systems theory at times seems to refer to information in this sense, assuming information does not necessarily involve any conscious mind, and patterns circulating (due to feedback) in the system can be called information. In other words, it can be said that information in this sense is something potentially perceived as representation, though not created or presented for that purpose.

When Marshall McLuhan speaks of media and their effects on human cultures, he refers to the structure of artifacts that in turn shape our behaviors and mindsets. Also, pheromones are often said to be "information" in this sense.

(See also Gregory Bateson.)

Information as a property in physics


Main article: Physical information

In 2003, J. D. Bekenstein claimed there is a growing trend in physics to define the physical world as being made of information itself (and thus information is defined in this way). Information has a well-defined meaning in physics. Examples of this include the phenomenon of quantum entanglement, where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light, even if it is transmitted indirectly. One consequence is that all attempts at physically observing a particle with an "entangled" relationship to another are slowed down, even though the particles are not connected in any way other than by the information they carry.

Another link is demonstrated by the Maxwell's demon thought experiment. In this experiment, a direct relationship between information and another physical property, entropy, is demonstrated. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms this often means generating heat. Thus, in the study of logic gates, the theoretical lower bound of thermal energy released by an AND gate is higher than for the NOT gate (because information is destroyed in an AND gate and simply converted in a NOT gate). Physical information is of particular importance in the theory of quantum computers.
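This relationship between information and entropy is quantified by Landauer's principle: erasing one bit of information dissipates at least k_B · T · ln 2 of heat. A quick calculation at room temperature:

```python
import math

# Landauer limit: minimum heat released when one bit of information
# is destroyed, at absolute temperature T.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

landauer_limit = k_B * T * math.log(2)
print(landauer_limit)   # about 2.87e-21 joules per erased bit
```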

Information as records

Records are a specialized form of information. Essentially, records are information produced consciously or as by-products of business activities or transactions and retained because of their value. Primarily their value is as evidence of the activities of the organization but they may also be retained for their informational value. Sound records management ensures that the integrity of records is preserved for as long as they are required.

The international standard on records management, ISO 15489, defines records as "information created, received, and maintained as evidence and information by an organization or person, in pursuance of legal obligations or in the transaction of business". The International Council on Archives (ICA) Committee on Electronic Records defined a record as "a specific piece of recorded information generated, collected or received in the initiation, conduct or completion of an activity and that comprises sufficient content, context and structure to provide proof or evidence of that activity".

Records may be retained because of their business value, as part of the corporate memory of the organization or to meet legal, fiscal or accountability requirements imposed on the organization. Willis (2005) expressed the view that sound management of business records and information delivered "…six key requirements for good corporate governance…transparency; accountability; due process; compliance; meeting statutory and common law requirements; and security of personal and corporate information."