
Measures and amount of information

The term "information" was first used by ancient philosophers: the Latin informatio means explanation, presentation, knowledge. In academic circles, however, debate continues over the most precise and complete definition of the word. Claude Shannon, who laid the foundations of information theory, defined information as the removed uncertainty in a subject's knowledge about something. The simplest definition of "information" is this: the degree of awareness about an object.

To determine the amount of information, one should first become familiar with the classification of information measures. There are three measures of information: syntactic, semantic, and pragmatic. Let us consider each measure in turn:

1. The syntactic measure works with data without regard to its meaning for the object. This measure deals with the type of medium, the method of presentation and coding, and the speed of transmission and processing of information.

In this case, the measure is the information volume: the amount of memory needed to store data about the object. The information volume equals the number of binary digits with which the message is encoded, and it is measured in bits.

To determine the syntactic amount of information, we turn to the concept of entropy: a measure of the uncertainty of the state of a system, that is, of our knowledge of the state of its elements and of the system as a whole. The amount of information is then the change in the uncertainty of the system, that is, the change (increase or decrease) in entropy.

2. The semantic measure determines the meaning content of data and relates the corresponding information parameters to the user's ability to process the message. This gives rise to the concept of the user's thesaurus: the body of knowledge about an object that a system or user possesses. From the semantic point of view, the amount of information is greatest when the entire body of data is understandable to the user or system, that is, can be processed with the available thesaurus. The semantic measure is therefore a relative concept.

3. The pragmatic measure evaluates the value of information for achieving a specific goal. This concept is also relative and depends directly on the ability of the system or user to apply a given body of data to a particular problem area. It is therefore advisable, from the pragmatic point of view, to measure information in the same units as the objective function.

Qualitative characteristics of information include the following indicators:

- Representativeness - the correct selection and presentation of information so as to best reflect the characteristics of the object.

- Content - the ratio of the amount of information in the semantic measure to the volume of data processed.

- Completeness - the presence in the message of the minimum set of information necessary to achieve the goal.

- Availability - the ability of a user or system to carry out the procedures for obtaining and converting the data.

- Relevance - the degree to which information retains its value from the moment it is received to the moment it is used.

- Timeliness - the arrival of information no later than the required time.

- Accuracy - the degree to which information corresponds to the actual state of the object.

- Reliability - the ability of information to reflect real objects with the required accuracy.

- Stability - the ability of information to withstand changes in the source data over time while maintaining the specified accuracy.

Remember: information matters more than ever today, so it is worth understanding it as thoroughly as possible!


Copyright © 2018 en.atomiyme.com. Theme powered by WordPress.