
Units of information

Units of measurement exist in every area of human activity as generally accepted standards for quantifying something. Volume, length and weight are the most familiar. Across continents, languages and cultures we rely on the same measures. For people this is convenient and easy to understand, and it avoids conversions, which are a frequent source of errors.

With the arrival of the computer in everyday life came the need for a single unit for measuring the amount of information, one that a Chinese, an American or any other computer in the world would recognize, and that programmers everywhere could use as a common language. That unit became the bit, the binary digit.

A bit is the smallest unit of information and can take only two values: "on" and "off", written as 1 and 0. Six bits are enough to encode the English alphabet on a computer, but more complex tasks require larger units of information, which borrow their prefixes from mathematics. Physically, a bit inside a computer is an electrical pulse sent from a control unit to an executing component, with a response pulse reporting the work done; the 0 and 1 are distinguished by different voltage levels. All of this happens at very high speed, comparable to the exchange of signals between the brain and the nerve endings in the human body.
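
As a small illustration (a Python sketch, with the letter, the ASCII encoding and the 8-bit width chosen purely as an example), this is how a single character reduces to a pattern of bits:

    # Show the bit pattern behind a single character.
    letter = "A"
    code_point = ord(letter)          # numeric code of the character (65 for "A")
    bits = format(code_point, "08b")  # the same number written as eight binary digits

    print(letter, "->", code_point, "->", bits)  # prints: A -> 65 -> 01000001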

All programming languages are ultimately encoded in binary. A photo converted to digital form, games and other software, the control of automated production lines: all of it rests on the same single unit of information, the bit.

Grouped into chains of pulses, bits form larger units of information: the byte, kilobit, kilobyte, megabit, megabyte, gigabyte, terabyte, petabyte and so on.
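
The ladder of these units can be sketched in a few lines of Python (an illustrative example using the 1024-based multipliers this article works with):

    # Each unit in the ladder is 1024 times the previous one.
    units = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte"]

    for power, name in enumerate(units):
        size_in_bytes = 1024 ** power
        print(f"1 {name:9} = {size_in_bytes:>19,} bytes = {size_in_bytes * 8:,} bits")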

A byte is taken to be eight bits and has a second name, the octet. Computer history also knows 6-bit and even 32-bit bytes, but those were design choices of individual companies that never bore fruit. The 8-bit byte became the standard, which is why the name "octet" is commonly used for it.

To make working with information easier, characters and values are defined in standardized code tables such as Unicode, alongside encodings specific to Windows and other systems. Each table serves its own purpose. A black-and-white (grayscale) image uses 8 bits per pixel, while a color image uses 24; to write text in practically any language of the world, a 16-bit Unicode encoding is sufficient.
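
A short Python sketch makes these storage costs concrete (UTF-16 is used here as the modern analogue of the "16-bit Unicode" encoding the text refers to, and the sample string and image size are arbitrary):

    # Text: the same characters take different numbers of bytes in different encodings.
    text = "Hello, мир"                                      # mixed Latin and Cyrillic, 10 characters
    print(len(text.encode("utf-16-le")), "bytes in UTF-16")  # 2 bytes per character here
    print(len(text.encode("utf-8")), "bytes in UTF-8")       # 1 byte Latin, 2 bytes Cyrillic

    # Images: 8 bits per pixel for grayscale, 24 bits (3 bytes) per pixel for color.
    width, height = 640, 480
    print("grayscale image:", width * height * 1, "bytes")
    print("24-bit color image:", width * height * 3, "bytes")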

Today even an ordinary user can afford a unit of information as large as the terabyte. A terabyte is 1024 gigabytes, and each gigabyte is in turn 1024 megabytes; multiplying 1024 by 1024 gives the number of megabytes in one terabyte. Continue these simple multiplications all the way down to the original unit and you arrive at an enormous number, too long for an ordinary calculator display: the number of binary digits needed to fill such a volume.
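
Spelled out in Python (using the 1024-based multipliers of the paragraph above), the arithmetic looks like this:

    gigabytes_per_terabyte = 1024
    megabytes_per_terabyte = 1024 * 1024        # 1,048,576
    bytes_per_terabyte = 1024 ** 4              # 1,099,511,627,776
    bits_per_terabyte = bytes_per_terabyte * 8  # 8,796,093,022,208 binary digits

    print(f"{megabytes_per_terabyte:,} megabytes in a terabyte")
    print(f"{bits_per_terabyte:,} bits in a terabyte")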

Incidentally, many manufacturers use the literal decimal meaning of the prefixes kilo, mega and giga when labeling their components. This leads to noticeable discrepancies and, some would say, a veiled deception: a "1 terabyte" hard drive can physically store only about 931 gigabytes of data as the operating system counts them. The computer community long ago proposed a way to avoid this confusion.

The prefixes kilo, mega and so on keep their usual multipliers: kilo means a thousand, mega a million. For computer storage, the correct multiplier is not 1000 but 1024, and separate names were invented for it. The binary prefixes are kibi, mebi, gibi and so on.
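
The difference between the two conventions explains the "missing" space on a new drive; here is a minimal Python sketch of the calculation (the drive size is just an example):

    # A drive sold as "1 TB" uses the decimal prefix: 10**12 bytes.
    # Operating systems often report sizes with the binary multiplier 1024**3 (1 GiB).
    marketed_bytes = 10 ** 12
    gibibytes = marketed_bytes / 1024 ** 3

    print(f"1 TB (decimal) = {gibibytes:.0f} GiB (binary)")  # about 931 GiB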
