**Digitization**^{[1][2][3]} is the process of converting information into a digital (i.e. computer-readable) format, in which the information is organized into bits.^{[1][2]} The result is the representation of an object, image, sound, document, or signal (usually an analog signal) as a series of numbers that describe a discrete set of points or samples. The result is called a *digital representation* or, more specifically, a *digital image*, for the object, and *digital form*, for the signal.

In modern practice, the digitized data is in the form of binary numbers, which facilitate computer processing and other operations, but, strictly speaking, digitizing simply means the conversion of analog source material into a numerical format; decimal or any other number system can be used instead.

Digitization is of crucial importance to data processing, storage and transmission, because it "allows information of all kinds in all formats to be carried with the same efficiency and also intermingled".^{[4]} Though analog data is typically more stable, digital data can more easily be shared and accessed and can, in theory, be propagated indefinitely, without generation loss, provided it is migrated to new, stable formats as needed. This is why it is a favored way of preserving information for many organisations around the world.

## Process

At its core, the process is a compromise between the capturing device and the playback device, so that the rendered result represents the **original source** with as much fidelity as possible. The advantage of digitization is the speed and accuracy with which this form of information can be transmitted, with no degradation compared with analog information transmission.

Digital information exists as one of two digits, either 0 or 1. These are known as bits (a contraction of *binary digits*), and information is carried as sequences of 0s and 1s.

Analog signals are continuously variable, both in the number of possible values of the signal *at* a given time, as well as in the number of points in the signal *in* a given period of time. On the other hand, digital signals are discrete in both of those respects – generally a finite sequence of integers – therefore a digitization can, in practical terms, only ever be an approximation of the analog source it represents.

Digitization occurs in two parts:

**Discretization or Sampling**

Reading an analog signal and, at regular time intervals (the sampling frequency), recording the value of the signal at that instant. Each such reading is called a *sample*.

**Quantization**

Samples are rounded to a fixed set of numbers (such as integers), a process known as quantization. In general, the two steps can occur at the same time, though they are conceptually distinct.
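The two steps can be sketched in a few lines of Python. This is an illustrative toy, not a description of any particular device: the 5 Hz sine wave, the 40 Hz sampling rate, and the 4-bit depth are all arbitrary choices for the example.

```python
import math

SAMPLE_RATE = 40         # samples per second (discretization), arbitrary
BIT_DEPTH = 4            # bits per sample (quantization), arbitrary
LEVELS = 2 ** BIT_DEPTH  # 16 discrete amplitude levels

def analog(t):
    """Stand-in for a continuous analog signal with values in [-1, 1]."""
    return math.sin(2 * math.pi * 5 * t)

# Discretization/sampling: read the signal at regular time intervals.
samples = [analog(n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]

# Quantization: round each sample to the nearest of the fixed levels,
# mapping the amplitude range [-1, 1] onto the integers 0..15.
quantized = [round((s + 1) / 2 * (LEVELS - 1)) for s in samples]

print(quantized[:8])  # a finite sequence of small integers
```

Note that the continuous signal is reduced to a finite list of integers: this is exactly the approximation described above, and both the sampling rate and the number of levels bound how faithful it can be.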

A series of digital integers can be transformed into an analog output that approximates the original analog signal. Such a transformation is called a digital-to-analog (DA) conversion. The sampling rate and the number of bits used to represent each integer together determine how closely the digitization approximates the analog signal.
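The round trip, and the resulting approximation error, can be demonstrated with the same kind of toy signal (again, the signal and parameter values are arbitrary choices for illustration). Mapping each integer back to an amplitude is a crude stand-in for DA conversion; the point is that the worst-case error is bounded by half a quantization step, which shrinks as the bit depth grows.

```python
import math

SAMPLE_RATE = 40         # illustrative values, as above
BIT_DEPTH = 4
LEVELS = 2 ** BIT_DEPTH

def analog(t):
    """Stand-in for a continuous analog signal with values in [-1, 1]."""
    return math.sin(2 * math.pi * 5 * t)

# Digitize: sample at regular intervals, then quantize to integers 0..15.
quantized = [round((analog(n / SAMPLE_RATE) + 1) / 2 * (LEVELS - 1))
             for n in range(SAMPLE_RATE)]

# DA conversion: map each integer back to an approximate analog value.
reconstructed = [q / (LEVELS - 1) * 2 - 1 for q in quantized]

# Worst-case error at the sample points is half a quantization step,
# i.e. 1 / (LEVELS - 1) in these units; more bits means a smaller step.
max_error = max(abs(analog(n / SAMPLE_RATE) - r)
                for n, r in enumerate(reconstructed))
print(max_error)
```

Raising `BIT_DEPTH` shrinks the quantization step, and raising `SAMPLE_RATE` captures faster variation in the signal; together they govern the fidelity of the approximation.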

## Analog signals to digital

Analog signals are continuous electrical signals; digital signals are non-continuous. Analog signals can be converted to digital signals by using an analog-to-digital converter.^{[10]}

Nearly all recorded music has been digitized. About 12 percent of the 500,000+ movies listed on the Internet Movie Database are digitized on DVD.^{[11][12]}

Handling an analog signal becomes easier once it is digitized, because the signal is typically digitized before modulation and transmission. The conversion of analog to digital consists of two processes: sampling and quantizing.

## Analog texts to digital

Older print books are being scanned, and optical character recognition technologies applied, by academic and public libraries, foundations, and private companies such as Google.^{[15]}