
Source of entropy

Entropy sources are used to accumulate entropy and then obtain from it an initial value (seed) that random number generators (RNG) need in order to produce random numbers. The difference from a pseudorandom number generator (PRNG) is that a PRNG derives all of its output from a single initial value, which is where its pseudo-randomness comes from, whereas a true RNG keeps producing output from fresh, high-quality random values supplied by various entropy sources. Random numbers have many uses in cryptography, for example in creating cryptographic keys and passwords.


Sources of Entropy

Information entropy is a measure of the randomness of information: the uncertainty about which symbol of the source alphabet will appear next.

Entropy is the amount of information per elementary message from a source that generates statistically independent messages. Entropy sources, as it turns out, are highly implementation-dependent. Once a sufficient amount of entropy has been accumulated, it can be used as an initial value (seed) from which the required amount of cryptographically strong randomness is produced. One can hope for a truly strong, portable source of randomness in the future; all that is needed is a physical source of unpredictable numbers. Thermal noise (Johnson noise), radioactive decay, or a free-running oscillator can serve as entropy sources, and such inexpensive hardware could easily become part of the standard computer architecture. Most video and audio input devices, ring oscillators, any system with a rotating disk, stable and accurate clocks, and dedicated hardware random number generators built into the motherboard or the processor of a personal computer (for example, the Intel i82802 RNG or the generator built into the VIA C3 microprocessor) can also serve as sources.
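
To make the definition concrete, the following sketch in Python estimates the entropy of a byte sample from the observed symbol frequencies; the function name and sample data are purely illustrative, and a frequency-based estimate of this kind understates the true entropy of short samples.

    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Estimate Shannon entropy of the sample in bits per byte:
        H = -sum(p_i * log2(p_i)) over the observed byte values."""
        if not data:
            return 0.0
        total = len(data)
        return -sum((c / total) * math.log2(c / total)
                    for c in Counter(data).values())

    # A repetitive sample has almost no entropy; OS-provided random bytes
    # approach the maximum of 8 bits per byte.
    print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))   # 0.0
    print(shannon_entropy(os.urandom(65536)))     # close to 8.0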

Required Amount of Entropy

Not much "unpredictability" is actually needed. For example, the AES (Advanced Encryption Standard) key length is 128 bits, and even very secure systems rarely use seed material longer than about 200 bits. If a whole set of keys is needed, they can be generated from a good random seed by a cryptographically strong sequence. When such sequences are used, a few hundred random bits obtained once at startup, or once a day, are sufficient.
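
As an illustration of deriving a set of keys from a single good seed by means of a cryptographically strong sequence, the sketch below expands a seed with HMAC-SHA-256 in counter mode; the function name derive_keys and the parameter choices are illustrative assumptions, not a construction prescribed by this article.

    import hashlib
    import hmac
    import os

    def derive_keys(seed: bytes, n_keys: int, key_len: int = 16) -> list:
        """Expand one high-entropy seed into n_keys keys of key_len bytes
        using HMAC-SHA-256 keyed with the seed and a running counter."""
        keys = []
        for i in range(n_keys):
            block = hmac.new(seed, i.to_bytes(4, "big"), hashlib.sha256).digest()
            keys.append(block[:key_len])
        return keys

    # A ~256-bit seed gathered once at startup is enough for many session keys.
    seed = os.urandom(32)          # stand-in for seed material from entropy sources
    for key in derive_keys(seed, 3):
        print(key.hex())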

Hardware Entropy Sources

Most computer systems already contain hardware from which, with proper handling, genuinely good random numbers can be obtained.

Video and audio inputs

Many computer systems already have inputs that digitize analog signals from the real world, such as sound from a microphone or the video signal from a camera. The quality of this method depends heavily on the hardware implementation. For example, input from an audio digitizer with no sound source nearby, or from a camera with its lens cap closed, is essentially thermal noise. If the system additionally has enough gain to detect such noise, the input can provide reasonably high-quality random bits. For example, on some UNIX-based systems one can read from the device /dev/audio with no sound source connected to the microphone input, picking up only a low level of background noise. Such data is essentially random noise, although it should not be trusted without some verification. In this way /dev/audio can yield a large amount of random data of moderate quality.
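
A minimal sketch of this idea, assuming the system actually exposes a readable /dev/audio device (many modern systems use other audio interfaces), could read background noise and compress it through a hash before use; the raw samples themselves must not be treated as uniformly random.

    import hashlib

    AUDIO_DEVICE = "/dev/audio"   # assumption: a legacy raw audio device is present

    def audio_noise_seed(n_bytes: int = 65536) -> bytes:
        """Read raw background noise from the audio device and whiten it with
        SHA-256; the raw samples are biased and correlated, so only the hash
        output is used as seed material."""
        with open(AUDIO_DEVICE, "rb") as dev:
            raw = dev.read(n_bytes)
        return hashlib.sha256(raw).digest()

    try:
        print(audio_noise_seed().hex())
    except OSError as err:
        print("no usable audio device:", err)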

Disk Drives

Disk drives (hard drives) exhibit small random fluctuations in rotational speed caused by chaotic air turbulence. If low-level, fine-grained timing of disk accesses is added, a series of measurements containing the desired randomness can be collected. Such data is usually highly correlated, so significant processing is required. Nevertheless, experiments conducted at the time showed that, with proper processing, even slow disk drives on the slow computers of that era could easily produce 100 or more excellent random bits per minute, and on more modern hardware generation rates above 10,000 random bits per second have been obtained. This technique is used in the random number generators available in many operating system libraries. It is worth noting that caching is not a problem for this technique, because cache hits complete so quickly that they can simply be ignored.
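
A rough sketch of the timing side of this technique, assuming a large and rarely cached file is available at the chosen path, is shown below; real implementations apply much more careful statistical processing than a single hash.

    import hashlib
    import os
    import random
    import time

    def disk_timing_seed(path: str, samples: int = 256, block: int = 4096) -> bytes:
        """Time reads at random offsets of a large file and hash the nanosecond
        latencies. Cache hits return almost instantly and contribute little,
        which is why caching can largely be ignored here."""
        size = os.path.getsize(path)
        fd = os.open(path, os.O_RDONLY)
        latencies = bytearray()
        try:
            for _ in range(samples):
                offset = random.randrange(0, max(size - block, 1))
                start = time.perf_counter_ns()
                os.pread(fd, block, offset)
                latencies += (time.perf_counter_ns() - start).to_bytes(8, "little")
        finally:
            os.close(fd)
        return hashlib.sha256(bytes(latencies)).digest()

    # Usage (the path is an arbitrary example of a large local file):
    # print(disk_timing_seed("/var/log/syslog").hex())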

Clock and serial numbers

Computer clocks and similar operating system and hardware values provide far fewer of the required random, unpredictable bits than their specifications suggest. Tests of the clocks of a large number of systems have shown that clock behavior can vary greatly, and in unpredictable ways. One version of an operating system running on a given set of hardware may provide microsecond clock resolution, while another configuration of the same operating system on the same hardware may always return the same low-order bits (that is, give a coarser effective resolution). This means that consecutive clock readings can return identical values even if enough time has passed that, based on the nominal resolution, the value should have changed. There are also cases where reading the clock very quickly produces artificial sequential values, because extra code notices that the clock has not changed between two readings and increments the returned value by 1. Writing portable application code that uses the system clock to generate unpredictable random values is therefore particularly difficult, because the application developer does not always know the properties of the system clock the code will run on.
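
Since the developer cannot know the clock's real behavior in advance, a portable application can at least probe it at run time; the following minimal sketch measures the smallest observed increment and how often consecutive readings repeat.

    import time

    def probe_clock(readings: int = 100_000):
        """Sample the clock in a tight loop and report the smallest non-zero step
        and the fraction of consecutive readings that were identical; coarse steps
        or many repeats mean fewer unpredictable low-order bits."""
        prev = time.perf_counter_ns()
        repeats = 0
        min_step = None
        for _ in range(readings):
            now = time.perf_counter_ns()
            step = now - prev
            if step == 0:
                repeats += 1
            elif min_step is None or step < min_step:
                min_step = step
            prev = now
        return min_step, repeats / readings

    step, repeat_ratio = probe_clock()
    print(f"smallest step: {step} ns, repeated consecutive readings: {repeat_ratio:.1%}")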

Using a hardware serial number (such as a MAC address) also provides fewer unpredictable bits than one might expect. Such values are usually highly structured, and individual fields may take only a limited set of possible values or may be easily guessed from the approximate release date or other data. For example, if a company manufactures both computers and Ethernet adapters, it will most likely (at least within the company) install its own adapters, which greatly limits the set of built-in addresses.

The problems described above make it difficult to write application code that uses clocks and hardware serial numbers to generate unpredictable values, because the code has to account for the variety of computer platforms and systems.

Time and magnitude of external events

One can measure the timing and magnitude of mouse movements, keystrokes and similar user-related external events, as well as memory allocation statistics inside the operating system, network activity, hardware and software interrupt counters, and readings of environmental sensors in the system (temperature, voltage, cooling fan speed). These are reasonable sources of high-quality unpredictable data. On some computer systems, input such as keystrokes is buffered, that is, recorded and accumulated; although the user's keystrokes vary enough to be unpredictable, there may be no easy way to access that variation. Another problem is that there is no standard method for capturing keystroke timing. All this complicates the use of this method in standard software intended for distribution across a wide range of different computer systems.

It is easier to obtain the magnitude of a mouse movement or the code of a keystroke than the corresponding event time, but such data may yield less randomness (unpredictability), because the user can produce highly repetitive input.

Some other external sources of random data can also be used, such as the arrival times of network packets [1] and their lengths, but only with care. Almost any external receiver can be a good source of entropy, for example an untuned radio receiver or a thermometer in a well-instrumented computer system. In every case, however, the data must be tested to determine how much entropy it can actually provide.
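
As a sketch of the network-timing idea referenced in note [1], the code below mixes the round-trip times of a few TCP connections into a hash; the host, port and number of rounds are arbitrary illustrative choices, and such timings are partly observable by an attacker on the network path, so they should only supplement other sources.

    import hashlib
    import socket
    import time

    def network_timing_seed(host: str = "example.com", port: int = 80,
                            rounds: int = 8) -> bytes:
        """Hash the nanosecond round-trip times of repeated TCP connections.
        Only a few bits of each measurement are genuinely unpredictable."""
        pool = hashlib.sha256()
        for _ in range(rounds):
            start = time.perf_counter_ns()
            try:
                with socket.create_connection((host, port), timeout=2):
                    pass
            except OSError:
                continue          # an unreachable host simply contributes nothing
            pool.update((time.perf_counter_ns() - start).to_bytes(8, "little"))
        return pool.digest()

    # print(network_timing_seed().hex())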

The methods described above are quite effective against attackers who are isolated from the measured quantity; for example, they work well against remote attackers who have no access to the user's environment. In all cases, the better the timing or magnitude of an external event is measured, the faster useful random bits are created and accumulated.

Non-hardware sources and some conclusions

The best entropy sources are hardware ones, such as disk drives, thermal noise or radioactive decay. If none of these are available, there are other possibilities: system clocks, system or I/O buffers; serial numbers, addresses and timing values of the user, system, hardware or network; and keyboard and mouse input from the user. Unfortunately, under some circumstances each of these sources can produce only a limited or predictable number of random values.

Some of these sources work quite well on a multi-user network, because every user is essentially a source of entropy; on a small single-user system, however, these methods may fail if the attacker uses sophisticated tools. Using several sources together with a strong mixing function is recommended, since it can overcome the weaknesses of the individual sources. Keystrokes can provide hundreds of random bits of timing and magnitude, but two things must be taken into account. If the interval between keystrokes coincides with the previous interval, then regardless of the key code the measurement adds no entropy. Similarly, if the key code matches the previous key code, no entropy is added even if the time interval differs from the previous one. The result of mixing the timing with the key code can be further combined with the system clock value and other inputs.
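
A minimal sketch of the rule just described is shown below; the class and its feed interface are hypothetical, and obtaining the keystroke events themselves is left out because, as noted above, there is no standard way to capture them.

    import hashlib
    import time

    class KeystrokeMixer:
        """Accumulate entropy from keystrokes: an event whose inter-key interval
        matches the previous interval, or whose key code matches the previous
        key code, is not credited with any new entropy."""

        def __init__(self):
            self._pool = hashlib.sha256()
            self._last_time = None
            self._last_interval = None
            self._last_code = None

        def feed(self, key_code: int) -> None:
            now = time.perf_counter_ns()
            first = self._last_time is None
            interval = None if first else now - self._last_time
            repeated = (not first) and (interval == self._last_interval
                                        or key_code == self._last_code)
            self._last_time = now
            self._last_interval = interval
            self._last_code = key_code
            if repeated:
                return            # adds no entropy, per the rule above
            # Mix the event time and the key code into the pool.
            self._pool.update(now.to_bytes(8, "little"))
            self._pool.update(key_code.to_bytes(4, "little"))

        def seed(self) -> bytes:
            """Current digest of everything mixed so far; it can be combined
            further with the system clock and other inputs."""
            return self._pool.digest()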

A similar strategy makes it possible to write portable applications that produce good random numbers, for example for cryptography, even if the individual entropy sources on a particular target system are very weak. Such applications may still fail on small single-user systems against a sophisticated attack, especially if the attacker has previously been able to observe the generation process. Hardware entropy sources therefore remain preferable.

Notes

  1. See, for example: Ping-based random number generator.

Links

  • RFC 4086 - Randomness Requirements for Security
  • Pseudo Random Number Generators. Cryptography. OpenBSD (May 11, 2009). Entropy sources and the use of random numbers in OpenBSD. Retrieved April 8, 2010. Archived April 13, 2012.

