Photo (c) LF Academy

Representatives of the Central Bank, the Ministry of Finance, Rosfinmonitoring and the Ministry of Justice, as well as lawyers and academics, gathered on Thursday at the conference “Financial Technologies and Law: Focusing” to discuss the legal regulation of new financial technologies and the civil law issues arising around smart contracts, cryptocurrencies and blockchain.

The participants discussed the current state of regulation of these innovations in the financial sector in Russia and abroad, argued about the terms proposed in the draft laws (three relevant bills are currently before the State Duma), and also raised the question of whether cryptocurrencies and blockchain need to be regulated at all, since proponents of these technologies hold that the technologies themselves, without external control, give counterparties grounds to trust each other.

The question was also raised more than once whether cryptocurrencies should be brought under existing rules, for example those that operate in the securities market (as has been done in the USA). The participants did not reach a consensus; the discussion will continue.

“The issue is not just at the stage of elaboration; it is at the stage of formulation, first of all from the point of view of law. It is a huge field for work, and in fact only a few bushes rise in this field,” said the moderator, State Secretary and Deputy Director of the Federal Service for Financial Monitoring Pavel Livadny, summing up the main panel discussion.

Cryptocurrency bills

Three bills are currently under consideration by the State Duma; none of them has yet passed even its first reading.

According to the participants of the event, the controversy surrounding these documents has not subsided to this day (they were even called bills of “sleeping norms”, a reference to the abundant cross-references from these bills to other laws and legal acts), and it is possible that all three will be merged.

Positions of the Central Bank and the Ministry of Finance as the main regulators

In October last year, President Vladimir Putin instructed the government and the Central Bank to determine the status of cryptocurrencies and to regulate ICOs. According to the president, the use of cryptocurrencies carries serious risks, but he drew attention to the need to use the benefits that new technological solutions provide in the banking sector.

Recall that the Central Bank and the Ministry of Finance disagree over the draft law “On Digital Financial Assets” with regard to the envisaged possibility of exchanging cryptocurrencies for rubles, foreign currency and/or other property. According to the Bank of Russia, such transactions should be allowed only for tokens issued for the purpose of attracting funding (here “token” means cryptographically protected, purely digital obligations of the organization initiating the issue. - ed.).

On Thursday their positions were presented at the conference by Alexey Guznov, director of the legal department of the Central Bank, and Yana Pureskina, director of the Financial Policy Department of the Ministry of Finance.

According to the representative of the Central Bank, as the body that develops monetary policy, it is still premature to introduce into the legal field the concepts of digital law and digital assets, and all the more so cryptocurrency, as independent objects of civil law.

Guznov dwelt on the recent history of cryptocurrencies: where they came from and “how they got into our world”. One point of view is that cryptocurrencies owe their existence to gamers who used them to buy in-game artifacts. Another, which neither contradicts nor replaces the first: the philosophy of cryptocurrencies was born in the cypherpunk milieu and inherits the philosophy of anarchism. The list of possible explanations does not end there.

“Cryptocurrency is not a currency, it is something that tries to call itself a currency,” Guznov said.

“We treat digital currencies as a legalized means of payment very cautiously, but legally this is, in general, impossible,” he further noted, and suggested that if the concept of digital currencies is introduced into the legal field at all, it should be at the level of “good will” transactions that do not require state support. In that case regulation of the circulation of cryptocurrencies would not really be required.

Speaking about the positions of the central banks of other countries, he noted that digital currency is either prohibited or treated with a certain degree of apprehension.

Guznov noted that banks feel the influence of fintech primarily in the fact that more and more customers no longer visit the offices of credit institutions. But the representative of the Central Bank does not share the opinion, expressed by a number of fintech representatives two or three years ago, that soon there will be no banks, only fintech. “Now it turns out that banks are largely stimulating the development of fintech and drawing it into standard banking turnover.”

He called the legislation on remote identification of bank clients adopted at the end of last year a big step. “Important tasks have been solved there which will make it possible, while preserving personal data, to provide access to financial technologies according to the 24/7/365 formula.”

The representative of the Central Bank did not agree that an “uncontrollable wave of operations with cryptocurrency” could arise in the country. In regulating “digital entities”, in his opinion, one can address their turnover, whether it is free or restricted. For now, he believes, the state has no other point at which it can influence what is happening except the point of transition from one world [fiat currencies] to the other [cryptocurrencies], and back.

The representative of the Ministry of Finance spoke briefly, as the plenary session was running far behind schedule.

Yana Pureskina considers it right to follow the path of regulation, and she once again recalled the three legislative initiatives under consideration by the State Duma. The Ministry of Finance believes it is necessary to rely on already existing legal constructions, on the assumption that cryptocurrency is a temporary phenomenon (here the ministry's position is close to that of the Ministry of Justice): new subjects [of regulation] will keep appearing on the basis of new financial technologies, so it is inappropriate to define special rules for every such case.

In particular, the dispute about whether cryptocurrency is an object of civil rights (i.e., whether it can be claimed by creditors or included in an estate) can be resolved under existing legislation. It states that the objects of civil rights include things, including cash and documentary securities; other property, including non-cash funds, book-entry securities and property rights; results of work and the provision of services; protected results of intellectual activity and equivalent means of individualization (intellectual property); and intangible benefits. Cryptocurrencies may well be classified as “other property”.

The main idea of the new regulation is to ensure the protection of the parties involved in cryptocurrency transactions: “The phenomenon exists, it is growing in volume, and in the draft law on digital financial assets we are solving this main task [of protection].” The aim is to ensure that, in the event of disputes, the parties to a conflict, the participants in an ICO, can apply to the courts for legal protection.

“It is necessary to find a balance between the economy's need for new ways of attracting investment, and such needs exist now, and the need to facilitate the attraction of investment for small and medium-sized businesses, which have less access to bank lending and find it difficult to enter the exchange infrastructure,” Pureskina said. According to her, the question of the taxation of mining and of the conversion of cryptocurrencies into fiat money still remains open.


Plenary session (from left to right): Alexey Guznov, Pavel Livadny, German Klimenko, Nikolay Chernogor, Yana Pureskina. Photo (c) Tatyana Kostyleva

Special opinion

We also present the most interesting opinions of other participants in the discussion.

Pavel Livadny (Rosfinmonitoring): “Blockchain evangelists say that everyone sits and sees everything. Suppose that I did not sell my apartment, but my blockchain shows that I sold it. I went to the computer an hour later and saw this, and another 10-15 transactions had already been made with the apartment. How can I prove that I didn't do it? Especially when you consider that blockchain apologists do not want government regulation. Blockchain is a false idea.”

A representative of the Moscow Exchange (MICEX) said that the exchange is not yet ready to open a cryptocurrency section.

Director of the Department of Information and Communication Technologies and Analytical Implementation of External State Audit (Control) of the Office of the Accounts Chamber Alexey Sklyar: "In the public sector, blockchain technology can be used in very limited areas where there can be complete openness between government agencies - property accounting, for the formation of budget reporting."

Deputy Director of the Institute of Legislation and Comparative Law under the Government of the Russian Federation Nikolai Chernogor: “The emergence of fintech is a manifestation of the desire to escape strict state regulation. Right now, law seeks to invade every nook and cranny of social interaction.”

Associate Professor at the Department of Theory and History of Law, Faculty of Law, Higher School of Economics, and IBM legal counsel Alexander Saveliev, on the definition of digital law proposed in the bill: “A sign of [digital law] is the ability to get acquainted with the description of an object at any time. Bear in mind that even now many resources go down, so you cannot get acquainted with them at any time. It makes sense to clarify a number of points [in the bill]. It turns out that if even one requirement is not met, there is no judicial protection.”


The second session: lawyers argue about problems of terminology and about the exercise of rights and the performance of duties by citizens

Features of statistical methodology. Statistical aggregate. The law of large numbers.

Law of Large Numbers

The mass nature of social laws and the originality of their actions predetermine the need for the study of aggregate data.

The law of large numbers arises from special properties of mass phenomena. Mass phenomena, by virtue of their individuality, differ from one another on the one hand, and on the other hand have something in common, owing to their belonging to a certain class or species. Moreover, single phenomena are more susceptible to the influence of random factors than their aggregate.

The law of large numbers in its simplest form states that the quantitative regularities of mass phenomena are clearly manifested only in a sufficiently large number of them.

Thus, its essence lies in the fact that in the numbers obtained as a result of mass observation, certain regularities appear that cannot be detected in a small number of facts.

The law of large numbers expresses the dialectic of the accidental and the necessary. As a result of the mutual cancellation of random deviations, the averages calculated for quantities of the same type become typical, reflecting the action of constant and significant factors under the given conditions of place and time. The tendencies and regularities revealed by the law of large numbers hold only as mass tendencies, not as laws for each individual case.

Statistics studies its subject using various methods:

· The method of mass observations

· The method of statistical groupings

· The method of dynamic (time) series

· The index method of analysis

· The method of correlation-regression analysis of the relationships between indicators, etc.

The political arithmeticians studied social phenomena with the help of numerical characteristics. Representatives of this school were Graunt, who studied the regularities of mass phenomena; Petty, the founder of economic statistics; and Halley, who laid down the idea of the law of large numbers.

A statistical population is a set of varying phenomena of the same quality. The individual elements that make up the population are its units. A statistical population is called homogeneous if the features most significant for each of its units are basically the same, and heterogeneous if it combines different types of phenomena. Frequency is the recurrence of feature values in the population (in a distribution series).

A feature is a characteristic property or other distinguishing peculiarity of the units or objects of a phenomenon. Features are divided into: 1) quantitative (expressed in numbers; they play a predominant role in statistics; their individual values differ in magnitude); 2) qualitative (attributive), expressed in the form of concepts or definitions conveying their essence or qualitative state; 3) alternative (qualitative features that can take only one of two opposite values). The values of a feature in individual units of the population vary; this fluctuation of a feature is called variation.

Statistical population units and feature variation. Statistical indicators.

Phenomena and processes in the life of society are characterized by statistics with the help of statistical indicators. A statistical indicator is a quantitative assessment of the properties of the phenomenon under study. In the statistical indicator, the unity of the qualitative and quantitative aspects is manifested. If the qualitative side of the phenomenon is not defined, it is impossible to determine its quantitative side.

With the help of statistical indicators, statistics characterizes: the size of the phenomena under study; their particular features; the patterns of their development; and their interrelationships.

Statistical indicators are divided into accounting-evaluative and analytical.

Accounting-evaluative indicators reflect the volume or level of the phenomenon under study.

Analytical indicators are used to characterize the particular features of a phenomenon's development, its prevalence in space, the ratio of its parts, and its relationship with other phenomena. The following are used as analytical indicators: averages, indicators of structure, variation, dynamics, degrees of closeness of relationship, etc. Variation is the diversity, the changeability of the value of a feature among individual units of the observed population.

Variation of the trait - gender - male, female.

Variation of salary - 10000, 100000, 1000000.

The individual values of a feature are called the variants of that feature.

Each individual phenomenon subject to statistical study is called a unit of the population.

Stages of statistical observation. Statistical observation. Goals and objectives of statistical observation. Basic concepts.

Statistical observation is the collection of necessary data on phenomena, processes of social life.

Any statistical study consists of the following steps:

· Statistical observation - collection of data about the phenomenon under study.

· Summary and grouping - calculation of totals as a whole or by groups.

· Obtaining generalizing indicators and their analysis (conclusions).

The task of statistical observation is to obtain reliable initial information and obtain it in the shortest possible time.

The tasks facing the manager determine the purpose of observation. It may follow from decisions of government bodies, the regional administration, or the marketing strategy of a company. The general purpose of statistical observation is to provide information for management; it is made specific depending on many conditions.

The object of observation is a set of units of phenomena under study, about which data should be collected.

The unit of observation is the element of the object that has the feature under study.

Signs may be:

  • quantitative
  • Qualitative (attributive)

To register the collected data, a form is used: a specially prepared document, usually having a title part, an address part and a content part. The title part contains the name of the survey, the organization conducting the survey, and by whom and when the form was approved. The address part contains the name and location of the surveyed object and other details that allow it to be identified. Depending on the construction of the content part, there are two types of forms:

§ the form-card, which is compiled for each unit of observation;

§ the form-list, which is compiled for a group of observation units.

Each type of form has its own advantages and disadvantages.

The form-card is convenient for manual processing but entails additional costs for preparing the title and address parts.

The form-list is used for automated processing and saves on the preparation of the title and address parts.

To reduce the cost of the summary and of data entry, it is advisable to use machine-readable forms. Questions in the content part of the form should be formulated in such a way that unambiguous, objective answers can be obtained. The best question is one that can be answered "Yes" or "No". Questions that are difficult or undesirable to answer should not be included in the form. Two different questions must not be combined in one formulation. To help respondents understand the program and individual questions correctly, instructions are provided. They can appear either on the form itself or as a separate booklet.

To direct the respondent's answers in the right direction, statistical prompts are used, that is, ready-made answer options. They can be complete or incomplete; incomplete prompts give the respondent an opportunity to improvise.

Statistical tables. Subject and predicate of the table. Simple (list, territorial, chronological), group and combined tables. Simple and complex development of a predicate statistical table. Rules for constructing tables in statistics.

The results of the summary and grouping should be presented in such a way that they can be used.

There are 3 ways to present data:

1. data can be included in the text.

2. presentation in tables.

3. graphic way

Statistical table - a system of rows and columns in which statistical information on socio-economic phenomena is presented in a certain sequence.

Distinguish between subject and predicate of the table.

The subject is an object characterized by numbers, usually the subject is given on the left side of the table.

The predicate is a system of indicators by which the object is characterized.

The general title should reflect the content of the entire table, located above the table in the center.

Table rules.

1. if possible, the table should be small in size, easily visible

2. the general title of the table should briefly express its main content (territory, date)

3. numbering of columns and lines (subject) that are filled with data

4. when filling out tables, you need to use symbols

5. compliance with the rules for rounding numbers.

Statistical tables are divided into 3 types:

1. simple tables do not systematize the units of the statistical population in the subject; they merely enumerate the units of the population under study. By the nature of the material presented, these tables are list, territorial and chronological. Tables in whose subject a list of territories (districts, regions, etc.) is given are called territorial.

2. group statistics tables provide more informative material for the analysis of the phenomena under study due to the groups formed in their subject according to an essential attribute or the identification of a relationship between a number of indicators.

3. when constructing combination tables, each group of the subject formed according to one attribute is divided into subgroups according to a second attribute, each of those groups is divided according to a third attribute, and so on; i.e., the factor attributes are taken in certain combinations. A combination table makes it possible to establish their mutual effect on the resultant attributes and significant connections between the factor groupings.

Depending on the task of the study and the nature of the initial information, the predicate of a statistical table can be simple or complex. With simple development, the indicators of the predicate are arranged sequentially one after another. With complex development, the indicators are distributed into groups according to one or several attributes in a certain combination.

Statistical charts. Elements of a statistical graph: graphic image, graph field, spatial references, scale references, chart explication. Types of graphs by the form of the graphic image and by the method of construction.

Statistical graph - is a drawing on which statistical data is displayed using conditional geometric shapes (lines, dots or other symbolic signs).

The main elements of a statistical graph:

1. The graph field is the space on which the graph is drawn.

2. The graphic image consists of the symbolic signs with which the statistical data are depicted (points, lines, squares, circles, etc.)

3. Spatial references determine the placement of the graphic images on the graph field. They are set by a coordinate grid or contour lines and divide the graph field into parts corresponding to the values of the indicators under study.

4. Scale references give the graphic images quantitative meaning, which is conveyed by a system of scales. The scale of a graph is the measure of conversion of a numerical value into a graphic one. A scale bar is a line whose individual points are read as particular numbers. The scale of a graph can be rectilinear or curvilinear, uniform or non-uniform.

5. The explication of the graph is the explanation of its content; it includes the title of the graph, explanations of the scales, and explanations of individual elements of the graphic image. The title of the graph briefly and clearly explains the main content of the data displayed.

Also on the graph is given text that makes it possible to read the graph. Numerical designations of the scale are supplemented by an indication of the units of measurement.

Graph classification:

By way of construction:

1. A diagram is a drawing in which statistical information is depicted by means of geometric shapes or symbolic signs. The following types of diagrams are used in statistics:

§ linear

§ columnar

§ strip (strip) charts

§ circular

§ radial

2. A cartogram is a schematic (contour) map, or a plan of the area, on which individual territories, depending on the value of the displayed indicator, are indicated using graphic symbols (hatching, colors, dots). The cartogram is subdivided into:

§ Background

§ Spot

In the background cartograms, territories with different values ​​of the studied indicator have different shading.

In dot cartograms, dots of the same size, located within certain territorial units, are used as a graphic symbol.

3. A chart-diagram (statistical map) is a combination of a contour map (plan) of an area with a diagram.

According to the form of the applied graphic images:

1. In scatter plots, a set of points is used as the graphic image.

2. In line charts, lines serve as the graphic image.

3. In planar charts, the graphic images are geometric shapes: rectangles, squares, circles.

4. Figure (pictorial) charts.

By the nature of the graphics tasks to be solved:

Series of distribution; the structure of statistical populations; series of dynamics; indicators of relationship; indicators of fulfilment.

Feature variation. Absolute indicators of variation: range of variation, mean linear deviation, variance, standard deviation. Relative indicators of variation: coefficients of oscillation and variation.

Indicators of variation of averaged static features: range of variation, mean linear deviation, mean quadratic deviation (dispersion), coefficient of variation. Calculation formulas and procedure for calculating variation indicators.

Application of variation indicators in the analysis of statistical data in the activities of enterprises and organizations, institutions of the BR, macroeconomic indicators.

The average indicator gives a generalized, typical level of a trait, but does not show the degree of its fluctuation, variation.

Therefore, the average indicators must be supplemented with indicators of variation. The reliability of averages depends on the size and distribution of deviations.

It is important to know the main indicators of variation, to be able to calculate and use them correctly.

The main indicators of variation are: the range of variation, the average linear deviation, variance, standard deviation, coefficient of variation.

Variation indicator formulas:

1. Range of variation: R = Xmax - Xmin, where

Xmax is the maximum value of the feature;

Xmin is the minimum value of the feature.

The range of variation can serve only as an approximate measure of the variation of a feature, since it is calculated from its two extreme values while all the others are ignored; moreover, the extreme values of the feature in a given population may be purely random.

2. Mean linear deviation: d = Σ|xi - x̄| / n. The absolute-value bars mean that the deviations are taken without regard to their sign.

The mean linear deviation is rarely used in economic statistical analysis.

3. Variance (dispersion): σ² = Σ(xi - x̄)² / n.
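For illustration, here is a small Python sketch with made-up data that computes the variation indicators listed above, including the relative ones (coefficients of oscillation and variation); the data and variable names are hypothetical.

```python
import math

data = [12, 15, 11, 18, 14, 20, 13]  # hypothetical values of the feature

n = len(data)
mean = sum(data) / n

variation_range = max(data) - min(data)                 # R = Xmax - Xmin
mean_linear_dev = sum(abs(x - mean) for x in data) / n  # average of |x - x̄|
variance = sum((x - mean) ** 2 for x in data) / n       # σ²
std_dev = math.sqrt(variance)                           # σ
oscillation_coef = variation_range / mean * 100         # range relative to the mean, %
variation_coef = std_dev / mean * 100                   # σ relative to the mean, %

print(f"range = {variation_range}")
print(f"mean linear deviation = {mean_linear_dev:.3f}")
print(f"variance = {variance:.3f}, standard deviation = {std_dev:.3f}")
print(f"coefficient of oscillation = {oscillation_coef:.1f}%")
print(f"coefficient of variation = {variation_coef:.1f}%")
```

A common rule of thumb is that a population is considered reasonably homogeneous when the coefficient of variation does not exceed roughly 33%.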


The index method for comparing complex populations and its elements: the indexed value and the commensurator (weight). statistical index. Classification of indices according to the object of study: indices of prices, physical volume, cost and labor productivity.

The word "index" has several meanings:

Indicator,

Pointer,

Description, etc.

This word, as a concept, is used in mathematics, economics, and other sciences. In statistics, an index is understood as a relative indicator that expresses the ratio of the magnitudes of a phenomenon in time and space.

The following tasks are solved with the help of indexes:

1. Measurement of the dynamics, socio-economic phenomenon for 2 or more periods of time.

2. Measuring the dynamics of the average economic indicator.

3. Measuring the ratio of indicators for different regions.

According to the object of study, indices are divided into indices of:

prices

labor productivity

cost

the physical volume of output, etc.

1. the price index shows how the price of a good has changed in the current period compared with the base period

P1 is the price of a unit of the good in the current period

P0 is the price of a unit of the good in the base period

2. the volume index shows how the volume of production has changed in the current period compared to the base

q1- number of goods sold or produced in the current period

q0-number of goods sold or produced in the base period

3. The cost index shows how the cost of a unit of production has changed in the current period compared to the base one.

Z1- unit cost of production in the current period

Z0 - unit cost of production in the base period

4. The labor productivity index shows how the labor productivity of one worker has changed in the current period compared to the base period

t0 is the labor intensity of one worker in the base period

t1 is the labor intensity of one worker in the current period
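A small Python sketch with made-up figures may help here; it assumes the standard textbook formulas for individual indices, i_p = P1/P0, i_q = q1/q0, i_z = Z1/Z0 and, for labor productivity expressed through labor intensity, i_w = t0/t1.

```python
# Hypothetical data for one product
P0, P1 = 100.0, 110.0   # unit price in the base and current periods
q0, q1 = 500, 550       # quantity sold in the base and current periods
Z0, Z1 = 80.0, 76.0     # unit cost in the base and current periods
t0, t1 = 2.0, 1.6       # labor intensity (hours per unit) in the base and current periods

i_p = P1 / P0           # price index: prices grew by 10%
i_q = q1 / q0           # physical volume index: output grew by 10%
i_z = Z1 / Z0           # cost index: unit cost fell by 5%
i_w = t0 / t1           # labor productivity index: productivity grew by 25%

print(f"i_p = {i_p:.2f}, i_q = {i_q:.2f}, i_z = {i_z:.2f}, i_w = {i_w:.2f}")
```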

By selection method

Repeated

Non-repeated

With repeated sampling, the total number of units in the general population remains unchanged in the course of sampling. A unit that has been included in the sample is returned to the general population after registration ("selection according to the returned-ball scheme"). Repeated sampling is rare in socio-economic research; sampling is usually organized according to the non-repeated scheme.

With non-repeated sampling, a unit of the population that has entered the sample is not returned to the general population and does not participate in the sampling thereafter (selection according to the non-returned-ball scheme). Thus, with non-repeated sampling, the number of units in the general population decreases in the course of the study.

3. according to the degree of coverage of population units:

Large samples

Small samples (n < 20)

Small sampling in statistics.

A small sample is understood as a non-continuous statistical survey, in which the sample population is formed from a relatively small number of units in the general population. The volume of a small sample usually does not exceed 30 units and can reach up to 4-5 units.

In trade, a small sample is used when a large sample is either not possible or not feasible (for example, if the study involves damage or destruction of the samples being examined).

The value of the error of a small sample is determined by formulas different from the formulas for sample observation with a relatively large sample size (n>100). The average error of a small sample is calculated by the formula:


The marginal error of a small sample is determined by the formula Δ = t·μ, where

t is the confidence coefficient, which depends on the probability (P) with which the marginal error is determined;

μ is the average sampling error.

In this case, the value of the confidence coefficient t depends not only on the given confidence probability, but also on the number of sample units n.

By means of a small sample in trade, a number of practical problems are solved, first of all, the establishment of a limit in which the general average of the trait under study lies.

Selective observation. General and sample populations. Registration and representativeness errors. Sampling error. Mean and marginal sampling errors. Distribution of the results of sample observation to the general population.

In any statistical research there are two types of errors:

1. Registration errors can be random (unintentional) and systematic (tendentious) in nature. Random errors usually balance each other, since they do not have a predominant direction towards exaggeration or underestimation of the value of the studied feature. Systematic errors are directed in one direction due to deliberate violation of the selection rules. They can be avoided with proper organization and monitoring.

2. Representativeness errors are inherent only in sample observation and arise due to the fact that the sample does not fully reproduce the general population.


The notation distinguishes the characteristics of the sample and the general population: the sample share, the general variance, the general standard deviation, the sample variance and the sample standard deviation.

In selective observation, the randomness of the selection of units must be ensured.

The proportion of the sample is the ratio of the number of units in the sample to the number of units in the general population.

The sample share (or frequency) is the ratio of the number of units possessing the characteristic under study, m, to the total number of units in the sample population, n (w = m/n).

To characterize the reliability of sample indicators, the average and marginal sampling errors are distinguished.

1. average sampling error for re-sampling


For a share, the marginal error for re-selection is:


Share in non-recurring selection:

The values of the Laplace integral, i.e. the probability (P) for different values of t, are given in a special table:

at t=1 P=0.683

at t=2 P=0.954

at t=3 P=0.997

This means that with a probability of 0.683 it can be guaranteed that the deviation of the general mean from the sample mean will not exceed one mean error.
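As an illustration, assuming the standard formulas for repeated selection (mean error μ = sqrt(σ²/n) for the average, μ = sqrt(w(1-w)/n) for the share) and the marginal error Δ = t·μ, a hypothetical calculation might look like this:

```python
import math

# Hypothetical sample
n = 400
s2 = 225.0          # sample variance of the studied feature
w = 0.3             # sample share of units possessing the feature

mu_mean = math.sqrt(s2 / n)            # mean sampling error for the average (repeated selection)
mu_share = math.sqrt(w * (1 - w) / n)  # mean sampling error for the share (repeated selection)

# Marginal error Δ = t * μ; t = 1, 2, 3 correspond to P = 0.683, 0.954, 0.997
for t, p in [(1, 0.683), (2, 0.954), (3, 0.997)]:
    print(f"t={t} (P={p}): marginal error of mean = {t * mu_mean:.3f}, of share = {t * mu_share:.4f}")
```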

Causal relationships between phenomena. Stages of studying cause-and-effect relationships: qualitative analysis, building a relationship model, interpreting the results. Functional connection and stochastic dependence.

The study of objectively existing connections between phenomena is the most important task of the theory of statistics. In the process of the statistical study of dependencies, cause-and-effect relationships between phenomena are revealed, which makes it possible to identify the factors (features) having the main influence on the variation of the phenomena and processes under study. A cause-and-effect relationship is a connection of phenomena and processes in which a change in one of them, the cause, leads to a change in the other, the effect.

According to their significance for the study of a relationship, features are divided into two classes. Features that cause changes in other, related features are called factor features, or simply factors. Features that change under the influence of factor features are called resultant.

The concept of the relationship between the various features of the phenomena under study. Factor features and resultant features. Types of relationship: functional and correlational. The correlation field. Direct and inverse relationships. Linear and non-linear relationships.

Direct and inverse relationships.

Depending on the direction of action, functional and stochastic relationships can be direct or inverse. With a direct relationship, the direction of change of the resultant feature coincides with the direction of change of the factor feature, i.e. as the factor feature increases, the resultant feature also increases, and, conversely, as the factor feature decreases, the resultant feature decreases. Otherwise the relationship between the quantities is inverse. For example, the higher the qualification (grade) of a worker, the higher the level of labor productivity: a direct relationship. And the higher labor productivity is, the lower the unit cost of production: an inverse relationship.

Rectilinear and curvilinear connections.

According to the analytical expression (form), the connections can be rectilinear and curvilinear. With a straight-line relationship with an increase in the value of the factor attribute, there is a continuous increase (or decrease) in the values ​​of the resulting attribute. Mathematically, such a relationship is represented by a straight line equation, and graphically by a straight line. Hence its shorter name is linear connection.

With curvilinear relationships with an increase in the value of a factor attribute, the increase (or decrease) of the resulting attribute occurs unevenly, or the direction of its change is reversed. Geometrically, such connections are represented by curved lines (hyperbola, parabola, etc.).

The subject and tasks of statistics. The law of large numbers. Main categories of statistical methodology.

Currently, the term "statistics" is used in 3 meanings:

· Under the "statistics" understand the branch of activity, which is engaged in the collection, processing, analysis, publication of data on various phenomena of public life.

· Statistics is called digital material that serves to characterize general phenomena.

· Statistics is a branch of knowledge, an academic subject.

The subject of statistics is the quantitative side of mass general phenomena in close connection with their qualitative side. Statistics studies its subject with the help of def. categories:

· Statistical totality - totality of social-eq. objects and phenomena in general. Life, united. Some quality. The basis of eg, the totality of pre-ty, firms, families.

· A population unit is the primary element of a statistical population.

Sign - quality. Feature of the unit of the population.

· Statistical indicator - the concept reflects quantities. characteristics (sizes) of signs of total. phenomena.

· Statistical system. Indicators - a set of statistical. indicators, reflecting the relationship, to-rye creatures. between phenomena.

The main tasks of statistics are:

1. comprehensive study of the deep transformations of economic and social processes on the basis of scientifically grounded systems of indicators;

2. generalization and forecasting of development trends in the various sectors of the economy and in the economy as a whole;

3. timely provision of reliable information to state, economic and administrative bodies and to the general public.

The essence of the law of large numbers.

The regularities studied by statistics, the forms in which causal relationships manifest themselves, are expressed in the recurrence, with a certain regularity, of events with a sufficiently high degree of probability. The condition here is that the factors generating the events change insignificantly or not at all. Statistical regularity is revealed on the basis of the analysis of mass data and obeys the law of large numbers.

The essence of the law of large numbers lies in the fact that in the summary statistical characteristics (the total number obtained as a result of mass observation), the actions of the elements of chance are extinguished, and certain regularities (trends) appear in them that cannot be detected on a small number of facts.


Errors of statistical observation.

Deviations between the indicators calculated as a result of observation and the actual values ​​of the studied phenomena are called errors (errors) of statistical observations. There are 2 types of statistical observation errors:

1) registration errors (with both continuous and non-continuous observation):

a) random - errors made during registration, for example from the respondent's words (a wrong age);

b) systematic deliberate - intentional distortions of data in reports (the volume of output);

c) systematic unintentional - carelessness, technical malfunctions.

2) representativeness errors - arise only with non-continuous observation, when the composition of the population units selected for observation does not fully reflect the composition of the entire population:

a) random - when the set of selected units does not fully reproduce the whole population; they are evaluated by mathematical methods;

b) systematic - deviations caused by violation of the principle of random selection of population units; they cannot be quantified.

All registration errors can be checked, either arithmetically or logically.


Census as a specially organized statistical observation.

A census is a specially organized statistical observation whose main task is to count and characterize the composition of the phenomenon under study by making entries in a statistical form for each surveyed unit of the statistical population.



There are 2 types of censuses:

1) a census based on primary accounting materials - a one-time count: for example, an inventory of stocks of materials and equipment;

2) census based on specially organized registration of facts: population census.

Population census- scientifically organized statistical observation to obtain data on the size, composition and distribution of the population.

The census program is set out in the census form, drawn up either individually for one person or for several people (a family, an apartment). The census sheets of 1979 and 1989 also served as machine-readable input media for computers.

Census dates: 1939, 1959, 1979, 1989

Microcensuses, i.e. socio-demographic sample surveys, are now common.

The last one was carried out on February 14, 1994 at 12 o'clock and covered 5% of the population: over 10 days, specially trained enumerators surveyed every 20th enumeration area (an enumeration area, according to the 1989 census, comprises approximately 300 people, i.e. a block or a residential building).

A complete census of the population of Russia was planned for 1999, as of the situation on 10 November 1999. It was cancelled for financial reasons and rescheduled for October 9-16, 2002. The current and the permanent population will be counted, including temporarily absent and temporarily resident citizens of Russia.

For this purpose, the State Duma of the Russian Federation must adopt a federal law on the population census. Enumerators will be recruited through the employment services (financed from the republican budget), while other workers will be paid from local budgets.

Absolute values.

Absolute values are obtained as a result of statistical observation and summary. They express the physical dimensions of the phenomena and processes under study, that is, mass, area, volume, length, time characteristics, as well as the size of the population (the number of its units). For example, the territory of the Omsk region is 139.7 thousand square kilometers; the permanent population of the region as of 01.01.2000 was 2,164.0 thousand people; the volume of industrial production in 1999 was 16,995 million rubles.



Absolute indicators are always named numbers, that is, they have specific units of measurement. Depending on the nature of the phenomena under study and their physical properties, absolute values ​​are expressed in natural, labor and cost units.

In international practice, natural measurements are used: tons, kilograms, meters, square meters, cubic meters, kilometers, miles, liters, barrels, pieces, etc.

In cases where the product has several varieties and its total volume can be determined only on the basis of a single consumer property for all of them, conditionally natural meters are used (for example, various types of organic fuel are converted into conventional fuel with a calorific value of 29.3 mJ / kg (7000kcal/kg)). Translation into conventional units is made through special coefficients calculated as the ratio of consumer properties of product varieties to the reference value.
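A small illustration of this conversion (the fuels, calorific values and tonnages below are hypothetical and only roughly realistic):

```python
REFERENCE_CALORIFIC_VALUE = 29.3  # MJ/kg, conventional fuel (7000 kcal/kg)

# Hypothetical fuels: (name, calorific value in MJ/kg, physical tons)
fuels = [("hard coal", 26.4, 1000), ("fuel oil", 40.6, 500), ("peat", 14.6, 2000)]

total_conventional = 0.0
for name, calorific, tons in fuels:
    k = calorific / REFERENCE_CALORIFIC_VALUE   # conversion coefficient
    conventional_tons = tons * k
    total_conventional += conventional_tons
    print(f"{name}: k = {k:.3f}, {tons} t -> {conventional_tons:.1f} t of conventional fuel")

print(f"total: {total_conventional:.1f} t of conventional fuel")
```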

Labor units of measurement allow taking into account the total labor costs and the labor intensity of individual operations of the technological process, these include man-days and man-hours.

Cost units of measurement give a monetary assessment of the phenomena and processes under study, these include rubles, thousands of rubles, millions of rubles, currencies of other countries.

Relative values.

Relative indicators are widely used in statistical practice. A relative value is the result of dividing one absolute value by another; it characterizes the quantitative relationship between them. In relation to absolute indicators, relative values are derivative, secondary. The absolute indicator in the numerator of the ratio is called the current or compared indicator; the indicator in the denominator is called the base, or basis of comparison. Relative indicators can be expressed as coefficients, percentages (%, base = 100), per mille (‰, base = 1000), per ten thousand (base = 10,000), or be named numbers (for example, rubles per ruble).

Relative statistics are divided into the following types:

1) the relative value of the planned target;

2) the relative value of the implementation of the plan (contractual obligations);

3) the relative size of the structure;

4) the relative magnitude of the dynamics;

5) the relative value of the comparison;

6) the relative value of coordination;

7) the relative magnitude of the intensity.
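A brief sketch of the first, second and fourth of these, with made-up figures; it assumes the usual definitions (planned target = planned level / actual base level, plan fulfilment = actual level / planned level, dynamics = actual current level / actual base level):

```python
base_actual = 200.0     # actual output in the base period, hypothetical units
plan = 220.0            # planned output for the current period
current_actual = 231.0  # actual output in the current period

rv_planned_target = plan / base_actual        # 1.10: the plan called for 10% growth
rv_plan_fulfilment = current_actual / plan    # 1.05: the plan was exceeded by 5%
rv_dynamics = current_actual / base_actual    # 1.155: actual growth of 15.5%

# The three indicators are linked: dynamics = planned target * plan fulfilment
assert abs(rv_dynamics - rv_planned_target * rv_plan_fulfilment) < 1e-9
print(rv_planned_target, rv_plan_fulfilment, rv_dynamics)
```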


The concept of variation.

Each object under study is in specific conditions and develops with its own characteristics under the influence of various factors. This development is expressed by numerical levels of statistical indicators, in particular, by average characteristics.

Variation is a discrepancy between the levels of the same indicator for different objects. Variation of a feature is the difference between individual values of the feature within the population; it characterizes the homogeneity of the population. Indicators of variation serve to measure it: in particular, they measure the deviation (variation) of individual values of a feature within the population under study from the average values, and show how reliable the average characteristics are. Thus, when analyzing the population under study, the average values obtained must be supplemented with indicators that measure deviations from the average and show the degree of their reliability, i.e. indicators of variation.

Statistics does not study all differences in the values ​​of a particular trait, but only quantitative changes in the magnitude of a trait within a homogeneous population, which are caused by the intersecting influence of various factors.

Distinguish random and systematic trait variation. Statistics is the study of systematic variation. Its analysis allows assessing the degree of dependence of changes in the studied trait on various factors that cause these changes.

Having determined the nature of the variation in the studied population, we can say how homogeneous it is, and therefore how characteristic the calculated average value is.

The degree of proximity of individual units to the average is measured by a number of absolute, average and relative indicators of variation.

The concept of sampling error.

The generalizing indicators for some of the population units will not coincide with the corresponding indicators for the population of all units. One of the tasks of sample observation is to determine the limits of deviations of the characteristics of the sample population and the general population.

The possible limits of deviations of the general and sample shares, as well as general and sample means, are called the sampling error (representativeness error). The smaller it is, the more accurately the sample observation indicators reflect the general population.

Sampling errors are:

1) tendentious- these are deliberate errors, if the worst units of the population are specially selected;

2) random - arise because of random selection, since the units that enter the sample may overstate or understate the characteristics of the general population.

The sampling error depends on the sample size and on the degree of variation of the feature under study. All possible discrepancies between the characteristics of the sample and of the general population are accumulated in the mean sampling error formula. It is calculated differently depending on the selection method: repeated or non-repeated.

During re-selection, each unit that fell into the sample, after fixing the value of the studied trait, returns to the general population and can again be randomly selected.

In practice, non-repetitive selection is more often used, when the selected units are not returned to the general population.

Reselection:

1) for the indicator of the average value of a quantitative variable trait: (1),

2) for the indicator of the share of an alternative feature: (2),

Non-repeated selection.

With this method of selection, the number of units of the general population is reduced in the sampling process, therefore:

1) for the indicator of the average value of a quantitative trait: (3),

2) for the indicator of the share of an alternative feature: (4)

According to the rules of mathematical statistics, the value of the average sampling error should be determined not through the sample variance, but through the general variance, but, most often, in practice, when conducting a sample survey, it is unknown.

It has been proved that (5)

for a sufficiently large value of n the ratio n/(n-1) is close to unity, i.e. if the principle of random selection is observed, the variance of a large sample is close to the variance in the general population. Therefore, in practice the sample variance is usually used to determine the mean sampling error.

Formulas (1), (2), (3) and (4) above make it possible to determine the average deviation of the characteristics of the general population from the sample characteristics. It has been proved that the general characteristics deviate from the sample ones by ±μ with a probability of 0.683. This means that in 683 cases out of 1000 the general share (general mean) will lie within ±μ of the sample share (sample mean), and in 317 cases it will go beyond these limits.

The probability of judgments can be increased, and the boundaries of the characteristics of the general population can be expanded if the average sampling error is increased several times (t times, t=2,3,4...).

The value obtained as the product of t and the average sampling error is called the marginal sampling error, i.e.

(6) and (7), where

t is the confidence coefficient, it depends on the probability with which it can be guaranteed that the marginal error does not exceed the t-fold average error, is found from ready-made tables of the function F (t), defined by the Russian mathematician A.M. Lyapunov in relation to the normal distribution.
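A hypothetical calculation in Python, assuming the standard formulas: mean error sqrt(s²/n) for repeated selection, the same with the factor (1 - n/N) under the square root for non-repeated selection, and marginal error Δ = t·μ with t taken from the normal distribution:

```python
import math
from statistics import NormalDist

N = 10_000          # size of the general population (hypothetical)
n = 500             # sample size
s2 = 64.0           # sample variance
x_bar = 250.0       # sample mean

mu_repeated = math.sqrt(s2 / n)
mu_nonrepeated = math.sqrt(s2 / n * (1 - n / N))   # finite population correction

P = 0.954                               # required confidence probability
t = NormalDist().inv_cdf((1 + P) / 2)   # ≈ 2 for P = 0.954
delta = t * mu_nonrepeated              # marginal sampling error

print(f"mean error (repeated) = {mu_repeated:.3f}")
print(f"mean error (non-repeated) = {mu_nonrepeated:.3f}")
print(f"the general mean lies within {x_bar - delta:.2f} .. {x_bar + delta:.2f} with P = {P}")
```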

In practice, a non-continuous survey is often used, in which the sample is formed from a small number of units of the general population, usually no more than 30 units. Such a selection is called small sample.

The average error of a small sample is determined by the formula: (8)

Since this ratio is significant for a small sample, the variance of a small sample is determined with allowance for the number of degrees of freedom. This is understood as the number of variants that can take arbitrary values without changing the value of the average; for a small sample it is usually n - 1:

(9), (10) Knowing the confidence probability of a small sample (usually 0.95 or 0.99) and the sample size n, we can determine the value of t using a special Student's table.
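A sketch of a small-sample estimate (my own example; it assumes the small-sample variance is computed with n - 1 degrees of freedom, the average error is sqrt(s²/n), and t is taken from Student's table via scipy):

```python
import math
from scipy import stats

sample = [5.1, 4.8, 5.4, 5.0, 4.7, 5.3, 4.9, 5.2]   # hypothetical small sample, n = 8
n = len(sample)
x_bar = sum(sample) / n

# Small-sample variance uses n - 1 degrees of freedom
s2 = sum((x - x_bar) ** 2 for x in sample) / (n - 1)
mu = math.sqrt(s2 / n)                      # average error of the small sample

P = 0.95                                    # confidence probability
t = stats.t.ppf((1 + P) / 2, df=n - 1)      # value from Student's table
delta = t * mu                              # marginal error of the small sample

print(f"sample mean = {x_bar:.3f}, mu = {mu:.3f}, t = {t:.3f}")
print(f"the general mean lies within {x_bar - delta:.3f} .. {x_bar + delta:.3f} with P = {P}")
```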


Average indexes.

Any general index can be represented as a weighted average of individual indices (the second form of expression of general indices). In this case, the form of the average must be chosen so that the resulting average index is identical to the original aggregate index. Two forms are used for calculating general indices: the arithmetic mean form and the harmonic mean form.

1) In cases where there is no data on the quantity of goods (products) in natural meters, but there is information on the cost of goods sold (manufactured products) and individual indices of changes in the volumes of goods (products), it is possible to determine the aggregate index of the physical volume of trade (products) by arithmetic mean form.
(24) , where

In order for the arithmetic mean index to be identical to the aggregate index, the weights of the individual indices in it must be taken from the terms of the denominator of the original aggregate index.

2) In cases where there is no information on the quantity of goods (products) in kind, but there is accounting for the sale of goods (production) in value terms and individual prices for goods (products), the average harmonic form is used to determine the summary indicators of price changes .
(25) , where

For the average harmonic index to be identical to the aggregate index, the weights of the individual indices in it must be taken from the terms of the numerator of the original aggregate index.
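A short Python sketch with made-up data, assuming the standard second forms: the arithmetic-mean volume index I_q = Σ(i_q·q0p0) / Σq0p0 and the harmonic-mean price index I_p = Σp1q1 / Σ(p1q1/i_p):

```python
# Hypothetical data for three goods: value in the base period (q0*p0),
# value in the current period (q1*p1), and individual indices i_q, i_p.
goods = [
    {"q0p0": 1000.0, "q1p1": 1320.0, "i_q": 1.10, "i_p": 1.05},
    {"q0p0":  500.0, "q1p1":  570.0, "i_q": 0.95, "i_p": 1.20},
    {"q0p0":  800.0, "q1p1":  880.0, "i_q": 1.00, "i_p": 1.10},
]

# Arithmetic-mean form of the volume index:
# weights are the terms of the denominator of the aggregate index (q0*p0)
I_q = sum(g["i_q"] * g["q0p0"] for g in goods) / sum(g["q0p0"] for g in goods)

# Harmonic-mean form of the price index:
# weights are the terms of the numerator of the aggregate index (q1*p1)
I_p = sum(g["q1p1"] for g in goods) / sum(g["q1p1"] / g["i_p"] for g in goods)

print(f"general volume index I_q = {I_q:.3f}")
print(f"general price index  I_p = {I_p:.3f}")
```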

Territorial indexes.

Territorial indices serve to compare indicators in space, that is, by enterprises, cities, regions, etc.

The construction of territorial indices is determined by the choice of the base of comparison and of the weights, i.e. the level at which the weights are fixed. In two-way comparisons, each territory can serve both as the compared territory (the numerator of the index) and as the base of comparison (the denominator). The weights of either the first or the second territory can be used in calculating the index, but this can lead to inconsistent results. Therefore, two methods for calculating territorial indices have been proposed.

1) The volumes of goods sold (manufactured products) for two regions taken together are taken as weights: (33)

The territorial price index then has the form:

(34), where Pa and Pb are the prices of a unit of the good (product) in territories a and b.

As weights here, you can use the structure of the sale of these goods (products) over a larger territory (republic, for example).

2) In the second method of calculation, the ratio of the weights of the compared territories is taken into account. The average price of each product for two territories is calculated together:

(35) , then price index (36)

This approach to the calculation of the territorial price index provides the relationship:

The index of the physical volume of trade (production) has the form:

Then the index system looks like:

(38)
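A sketch of the first method with hypothetical data, assuming the usual form I_p(a/b) = Σ(Pa·Q) / Σ(Pb·Q) with the combined quantities Q = qa + qb as weights:

```python
# Hypothetical data for two goods sold in territories a and b
goods = [
    {"p_a": 50.0, "p_b": 55.0, "q_a": 120, "q_b": 180},
    {"p_a": 30.0, "p_b": 28.0, "q_a": 400, "q_b": 350},
]

# First method: weights are the combined quantities Q = q_a + q_b
num = sum(g["p_a"] * (g["q_a"] + g["q_b"]) for g in goods)
den = sum(g["p_b"] * (g["q_a"] + g["q_b"]) for g in goods)
I_p_ab = num / den   # territorial price index of territory a relative to territory b

print(f"territorial price index (a relative to b) = {I_p_ab:.3f}")
```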


Chain and basic indices.

When studying the dynamics of socio-economic phenomena, comparisons are often made for more than two periods.

If it is necessary to analyze the change in the phenomenon in all recent periods compared to the initial (basic) one, the basic indices are calculated.

If it is required to characterize the successive change of a phenomenon from period to period, then chain indices are calculated.

Depending on the nature of the initial information and the objectives of the study, both individual and general indices can be calculated.

Individual chain and basic indices are calculated in the same way as relative values of dynamics (growth rates).

General indices are calculated with variable and constant weights, depending on their economic content.

General indices of quality indicators (prices, cost, labor productivity) are calculated as indexes with variable weights (that is, the weights are taken at the level of the current - reporting period).

General indices of quantitative indicators (physical volume) are calculated as indices with constant weights taken at the level of the base (initial period).

At the same time, common chain and basic indices with constant weights are interconnected:

a) The product of the chain indices gives the basic index of the last period;

b) Dividing the subsequent basic index by the previous basic index gives the chain index of the subsequent period.

In these indices, weights - co-measurements are taken at the level of the same base period.

General chain and basic indices with variable weights do not have such a relationship, since in them the weights - co-measurements are taken at the levels of different periods. For all individual indices, the relationship between chain and basic indices is preserved.

Example for individual indices: chain indices of 1.25 and 1.2 give a basic index of 1.25 × 1.2 = 1.5, i.e. the relationship is preserved.
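The same relationships can be checked in a few lines of Python (the prices are hypothetical):

```python
prices = [100.0, 125.0, 150.0]   # hypothetical prices of one good in periods 0, 1, 2

# Individual chain indices: each period compared with the previous one
chain = [prices[i] / prices[i - 1] for i in range(1, len(prices))]   # [1.25, 1.20]

# Individual basic indices: each period compared with period 0
basic = [prices[i] / prices[0] for i in range(1, len(prices))]       # [1.25, 1.50]

# a) the product of the chain indices equals the basic index of the last period
product = 1.0
for c in chain:
    product *= c
assert abs(product - basic[-1]) < 1e-9        # 1.25 * 1.20 = 1.50

# b) dividing a basic index by the preceding basic index gives the chain index
assert abs(basic[1] / basic[0] - chain[1]) < 1e-9

print(chain, basic)
```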

The law of large numbers is generated by the connections of mass phenomena. It must be remembered that the tendencies and regularities revealed with the help of the law of large numbers are valid only as mass tendencies, but not as laws for individual units, for individual cases.



Interacting daily with figures and numbers at work or in study, many of us do not even suspect that there is a very interesting law of large numbers, used, for example, in statistics, economics and even psychological and pedagogical research. It belongs to probability theory and says that the arithmetic mean of any large sample from a fixed distribution is close to the mathematical expectation of that distribution.

You have probably noticed that it is not easy to grasp the essence of this law, especially for those who are not on particularly friendly terms with mathematics. For this reason we would like to talk about it in simple language (as far as possible, of course), so that everyone can understand, at least approximately, what it is. This knowledge will help you better understand some mathematical regularities, become more erudite, and benefit your thinking.

Concepts of the law of large numbers and its interpretation

In addition to the above definition of the law of large numbers in probability theory, we can give its economic interpretation. In this case, it represents the principle that the frequency of a particular type of financial loss can be predicted with a high degree of certainty when there is a high level of losses of such types in general.

In addition, depending on the type of convergence, the weak and the strong laws of large numbers are distinguished. We speak of the weak law when convergence holds in probability, and of the strong law when convergence holds almost everywhere (almost surely).

If we interpret it a little differently, we should say this: it is always possible to find a number of trials such that, with any probability fixed in advance and less than one, the relative frequency of occurrence of some event will differ as little as desired from its probability.

Thus, the general essence of the law of large numbers can be expressed as follows: the combined action of a large number of identical and independent random factors leads to a result that does not depend on chance. And putting it even more simply: in the law of large numbers, the quantitative regularities of mass phenomena manifest themselves clearly only when the number of those phenomena is large (which is why it is called the law of large numbers).

From this we can conclude that the essence of the law lies in the fact that in the numbers obtained by mass observation certain regularities appear that cannot be detected in a small number of facts.

The essence of the law of large numbers and its examples

The law of large numbers expresses the most general regularities of the accidental and the necessary. When random deviations "extinguish" one another, the averages calculated for a population of the same type become typical. They reflect the action of essential and permanent factors under the specific conditions of time and place.

Regularities defined by the law of large numbers are strong only when they represent mass tendencies, and they cannot be laws for individual cases. Thus, the principle of mathematical statistics comes into force, which says that the complex action of a number of random factors can cause a non-random result. And the most striking example of the operation of this principle is the convergence of the frequency of occurrence of a random event and its probability when the number of trials increases.

Let's recall the usual coin toss. Theoretically, heads and tails fall with the same probability. This means that if, for example, a coin is tossed 10 times, 5 of them should come up heads and 5 should come up tails. But everyone knows that this almost never happens: the ratio of heads to tails may turn out to be 4 to 6, or 9 to 1, or 2 to 8, and so on. However, as the number of coin tosses increases, say to 100, the relative frequency of heads (or tails) approaches 50%. And if, theoretically, an infinite number of such trials were carried out, the proportion of tosses landing on either side would tend to 50%.

How exactly the coin falls is influenced by a huge number of random factors: the position of the coin in the palm of your hand, the force with which the throw is made, the height and speed of the fall, and so on. But if there are many trials, then regardless of how these factors act, it can always be argued that the observed frequency is close to the theoretical probability.
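
A minimal simulation sketch in Python (standard library only; the trial counts are illustrative and not taken from the text) shows how the relative frequency of heads settles near 0.5 as the number of tosses grows:

import random

def heads_frequency(n_tosses: int) -> float:
    """Toss a fair coin n_tosses times and return the relative frequency of heads."""
    heads = sum(random.randint(0, 1) for _ in range(n_tosses))
    return heads / n_tosses

for n in (10, 100, 10_000, 1_000_000):
    # With few tosses the frequency fluctuates widely; with many it clusters near 0.5.
    print(n, round(heads_frequency(n), 4))

With 10 tosses the printed frequency may easily be 0.3 or 0.8; with a million tosses it rarely strays far from 0.5, which is exactly the mass tendency described above.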

And here is another example that helps to grasp the essence of the law of large numbers: suppose we need to estimate the level of earnings of people in a certain region. If we take 10 observations, where 9 people receive 20 thousand rubles and 1 person receives 500 thousand rubles, the arithmetic mean will be 68 thousand rubles, which, of course, is hardly representative. But if we take 100 observations, where 99 people receive 20 thousand rubles and 1 person receives 500 thousand rubles, then the arithmetic mean works out to 24.8 thousand rubles, which is already much closer to the real state of affairs. By increasing the number of observations, we force the average to tend toward the true value.
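
The same arithmetic can be checked in a couple of lines of Python (the figures are the ones used in the example above):

# 10 observations: 9 people at 20 thousand rubles, 1 person at 500 thousand
small_sample = [20] * 9 + [500]
print(sum(small_sample) / len(small_sample))    # 68.0 thousand rubles

# 100 observations: 99 people at 20 thousand rubles, 1 person at 500 thousand
large_sample = [20] * 99 + [500]
print(sum(large_sample) / len(large_sample))    # 24.8 thousand rubles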

It is for this reason that, to apply the law of large numbers, one must first collect statistical material covering a large number of observations in order to obtain reliable results. That is also why this law finds convenient use in statistics and in social economics.

Summing up

The importance of the law of large numbers is difficult to overestimate for any field of scientific knowledge, and especially for developments in the theory of statistics and the methods of statistical cognition. The action of the law also matters greatly for the objects under study themselves, with their mass regularities. Almost all methods of statistical observation rest on the law of large numbers and the principles of mathematical statistics.

But even setting science and statistics aside, we can safely conclude that the law of large numbers is not just a phenomenon from probability theory, but one that we encounter almost every day in our lives.

We hope that the essence of the law of large numbers has now become clearer to you, and that you can easily and simply explain it to someone else.

Features of statistical methodology

The peculiarity of statistics as a science and as a method of cognition lies in the fact that it studies quantitative relations in close connection with the qualitative side of the processes of social life. An inevitable consequence of this specificity is a set of peculiarities of statistical analysis. These lie, in particular, in the fact that statistical research methods are organically combined with the methods of the science whose subject matter is being studied, so that the researcher is, as a rule, a specialist in two scientific disciplines. The concreteness of the analysis of social phenomena, conditioned by a definite place and time, makes it necessary to combine statistical methods with the specific methods of the relevant scientific discipline for the given sphere of objects.

There are at least three features of statistical methodology, or principles of the statistical study of phenomena:

Statistics studies mass quantitative relations in close connection with the qualitative features of processes and phenomena;

Statistics considers any process as a whole with the help of summary indicators, in the totality of the factors, features and aspects of the phenomena under study;

Statistics seeks to show the totality of phenomena and their development in a differentiated way, in the variety of their types, and to examine the interconnections and relationships between them.

Particularly important for revealing the specifics of statistics is the interpretation of such philosophical categories as quality and quantity, causality, necessity and chance, the general, the particular and the individual, law and regularity.

QUALITY is an internal certainty, or the essence of a phenomenon directly related to the law of its development. Quality is found in a combination of diverse aspects, properties, signs of a phenomenon.

QUANTITY is the external certainty of a phenomenon, acting as a magnitude, number, degree of manifestation of one or another of its properties. Quantity is in unity with quality, making up the measure of the phenomenon.

ESSENCE is the internal, relatively stable side of reality, hidden behind the surface of the phenomenon.

PHENOMENON is the external, more mobile side of reality, acting as a form of expression of the essence.

RANDOMNESS is that which has its cause not in itself, not in the essence of phenomena, but in secondary, external connections; therefore it may or may not occur, and may occur in one way or another.

NECESSITY is that which has its cause in itself and follows inevitably from the essence, from the internal connections of phenomena; therefore, in its essentials, it must happen in this way and not otherwise.



Essence and phenomenon do not coincide with each other. The task of any science, including statistics, is to see and reveal the essence behind the phenomena that alone fall within the researcher's field of view.

Thus, guided by the named categories of objective reality, the statistician has the right to expect that, on the basis of the statistical data he collects, he can reveal the essence of the social process under study.

One of the expressions of the connection between chance and necessity is the LAW OF LARGE NUMBERS.

Stated generally, the law of large numbers expresses a principle by virtue of which, in a large mass of phenomena and under certain general conditions, the influence of random factors is almost eliminated.

The law of large numbers received its mathematical proof in probability theory and has been confirmed in numerous experimental tests. Thus, the French naturalist A. Buffon carried out the following experiment: he tossed a coin 4040 times, with heads coming up 2048 times and tails 1992 times. Hence the relative frequency of heads was 2048/4040 = 0.507, which deviated from the probability of heads on each individual toss, equal to 1/2, by only 0.007 (0.507 - 0.500).
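
The quoted figures are easy to reproduce (a quick check in Python using only the counts cited above):

heads, tosses = 2048, 4040
frequency = heads / tosses                              # about 0.5069
print(round(frequency, 3), round(frequency - 0.5, 3))   # 0.507 0.007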

This suggests that in the experiment under consideration the influence of the main, permanent causes manifested itself almost completely, while random causes shifted the result only by a very small amount. That is, as a result of the mutual cancellation of random deviations, the averages calculated for quantities of the same type become typical, reflecting the action of constant and significant factors under the given conditions of time and place.

By virtue of the law of large numbers, random deviations and measurement errors cancel each other out in the mass of phenomena. By virtue of the same property, the main regularities must be studied on a large set of objects rather than on individual ones, whose values are affected, in addition to the main regularity, by two kinds of deviations: the individual characteristics of the given object (event) and the inaccuracies involved in measuring its value.

Under certain conditions, the value of an individual element in the aggregate can be considered as a random variable, bearing in mind that it is not only an automatic result of some general pattern, but at the same time is itself determined by the action of many factors that do not depend on this general pattern.

Therefore, the basis of statistical research is always the mass observation of facts. But we emphasize that the law of large numbers is not a regulator of the processes studied by statistics. It is a mistake to consider it the basic law of statistics. It characterizes only one of the forms of manifestation of regularities in mass quantitative relations, which in science is called a statistical regularity.

STATISTICAL REGULARITY is one of the forms in which the universal connection of phenomena in nature and society manifests itself. The term was first used in the natural sciences in opposition to the concept of dynamic regularity, i.e. a form of regularity in which strictly defined values of certain factors always correspond to strictly defined values of the quantities that depend on them. With a dynamic regularity, the quantitative relationships between quantities hold for each individual case, for each element of the population covered by the law in question. Such relationships can be expressed by strictly defined mathematical formulas, systems of equations, and so on.

An example of a dynamic regularity is the relationship between gravitational force, the masses of bodies and the distance between them, as determined by the law of universal gravitation.
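
For illustration (a standard formulation, not taken from the text): the force of attraction is fixed exactly by

F = G \frac{m_1 m_2}{r^2},

where m_1 and m_2 are the masses of the bodies, r is the distance between them and G is the gravitational constant. The same inputs always yield the same output, with no scatter, which is what distinguishes a dynamic regularity from a statistical one.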

A STATISTICAL REGULARITY, by contrast, is a form of regularity in which a rule, law or quantitative relation is revealed only in a sufficiently large number of elements of the population and finds its expression only in the mass of phenomena. The occurrence of an individual event under such a regularity is associated only with a certain probability (i.e. the individual event may or may not occur). But in the mass of cases the general regularity must find its manifestation. The relationship between the number of events and the degree to which the general regularity acts is characterized, under certain simple conditions, by the law of large numbers. One of the forms of mathematical expression of cause-and-effect relations under a statistical regularity is the regression (correlation) equation.
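
As a sketch of how such a statistical regularity is extracted in practice (using numpy's least-squares fit on synthetic data; the data and the coefficients 2 and 1 are illustrative assumptions, not taken from the text):

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a linear dependence y = 2x + 1 hidden behind random noise.
x = rng.uniform(0, 10, size=1000)
y = 2 * x + 1 + rng.normal(0, 3, size=1000)

# Least-squares fit of y = a*x + b; the regularity emerges only across the mass of points.
a, b = np.polyfit(x, y, deg=1)
print(round(a, 2), round(b, 2))   # close to 2 and 1, though no single point lies exactly on the line

No individual observation obeys the fitted line exactly, yet across a thousand points the underlying relation is recovered with good accuracy.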

