Risk is a complex category generally linked with uncertainty about the future. Economic activities are strongly shaped by the optimal relation between expected profitability and the risk taken at the same time. Risk is an objective concept, so it must be measured; otherwise the management process cannot be carried out effectively. Both theory and practice have developed many methods of risk measurement. All of them, to a greater or lesser extent, refer to particular risk factors. The essential question is what, in a specific situation, should be regarded as the dominant risk factor. The fundamental assumption is that the general determinant of risk is a lack of information. Possessing complete information can be regarded as synonymous with certainty, while less information means more uncertainty (risk). Following this line of reasoning, a measure of missing information can be considered a measure of risk. In information theory, the measure of the uncertainty associated with a random variable is known as entropy. The term usually refers to Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message, typically in units such as bits. Shannon entropy is also a measure of the average information content that is missing when the value of the random variable is unknown. The authors attempt to describe some ways of understanding risk from the perspective of information. Besides the use of entropy, the problems of redundancy, information noise, and efficient coding are discussed in the context of risk management.
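For reference, the standard textbook definition of Shannon entropy referred to above (stated here as background, not as a formula taken from the paper itself) is

H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i),

where p(x_i) denotes the probability of outcome x_i of the random variable X; with the base-2 logarithm the result is expressed in bits, and a larger value of H(X) corresponds to more missing information and hence, in the authors' framing, to greater risk.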