General theory of statistics: lecture notes (N.V.

Statistics is a social science that studies the quantitative side of qualitatively defined mass socio-economic phenomena and processes: their structure and distribution, their location in space and their movement in time. It identifies the quantitative dependencies, trends and patterns that exist in specific conditions of place and time.

Statistics includes:

    General theory of statistics

    Economic statistics and its branches

    Socio-demographic statistics and its branches.

Statistics is related to history, sociology, mathematics, and economics.

The object of study is society.

Translated from Latin, the word “status” means a certain state of affairs. The term “statistics” was first used by the German scientist G. Achenwall in 1749, in his book on the state.

In the 17th century, the school of political arithmetic of Petty and Graunt emerged.

19th century: the statistical-mathematical school of Quetelet, Pearson and Galton.

The Russian descriptive school of the 18th century: Kirilov, Lomonosov, Chulkov. Radishchev and Herzen influenced the development of statistical thought; Chebyshev and Markov made great contributions.

Statistics is a tool of knowledge.

There are 4 concepts of statistics:

    A set of educational disciplines that have certain specifics and study the quantitative aspects of mass phenomena and processes.

    A branch of practical activity: statistical accounting, which in Russia is carried out by Rosstat.

    A set of numerical information: statistical data published in collections, directories and enterprise reports.

    Statistical methods used to study socio-economic phenomena and processes.

Features of statistics:

1) statistical data are reported in quantitative terms;

2) statistical science is interested in conclusions drawn from the analysis of collected and processed numerical data;

3) statistical data reflect the state of the phenomenon being studied at a certain stage of its development, in specific conditions of place and time.

    Subject of statistics.

Statistics is a social science that studies the quantitative side of qualitatively defined mass socio-economic phenomena and processes: their structure and distribution, their location in space and their movement in time. It identifies the quantitative dependencies, trends and patterns that exist in specific conditions of place and time.

Subject of statistics – the dimensions and quantitative relationships of qualitatively defined socio-economic phenomena, and the patterns of their connection and development in specific conditions of place and time.

Object of statistics – society.

The object of statistical research in statistics is called a statistical population.

Statistical population – a set of units that possess mass character, homogeneity, a certain integrity, interdependence of the states of individual units, and the presence of variation.

The subject of statistics is the study of social phenomena and of the dynamics and directions of their development. With the help of statistical indicators, statistics establishes the quantitative side of a social phenomenon and observes the transition from quantity to quality on the example of that phenomenon. On the basis of the observations made, statistics analyzes the data obtained in specific conditions of place and time.

Statistics deals with the study of socio-economic phenomena and processes that are widespread in nature, and also studies the many factors that determine them.

To derive and confirm their theoretical laws, most social sciences use statistics.

    Basic concepts of statistical methodology

Currently, it is difficult to name a science that does not study mass processes in a particular area. In the knowledge of any mass phenomena of a specific type (i.e., any science), the general provisions of statistics as a science are used: data on a variety of objects (elements) of the phenomenon being studied are accumulated, these results are described (summarized) using a set of specific characteristics (indicators) in compliance with requirements (conditions, rules) developed by statistics. When applied to different areas of phenomena, the statistical method takes into account their characteristics. The specific techniques with which statistics studies mass phenomena form a statistical methodology (or method of statistics).

Statistical methodology – a system of techniques, ways and methods aimed at studying the quantitative patterns manifested in the structure, dynamics and interrelations of socio-economic phenomena.

Statistical research – the process of collecting, processing and analyzing statistical information.

Statistical information – primary statistical material about socio-economic phenomena, formed in the process of statistical observation and then subjected to systematization, analysis and generalization.

A statistical study consists of three stages:

    statistical observation;

Statistical observation – mass, systematic, scientifically organized observation of the phenomena of social and economic life, consisting in recording selected characteristics of each unit of the population.

    summary and grouping of observation results;

Summary – a complex of sequential operations for generalizing the individual facts that form the population, in order to identify the typical features and patterns inherent in the phenomenon as a whole.

Grouping – the division of the units of the studied population into homogeneous groups according to characteristics that are essential for them.

The results of the statistical summary and grouping are presented in the form of statistical tables.

Statistical table – a table containing a summary numerical characteristic of the population under study according to one or more essential characteristics.

    analysis of the obtained general indicators.

Statistical analysis is the final stage of statistical research. In its process, the structure, dynamics and relationships of social phenomena and processes are explored. The following main stages of analysis are distinguished:

    Statement of facts and their assessment;

    Establishing the characteristic features and causes of the phenomenon;

    Comparison of a phenomenon with other phenomena;

    Formulation of hypotheses, conclusions and assumptions;

    Statistical testing of proposed hypotheses using special statistical indicators.

    The concept of a statistical indicator

Statistical indicator – a quantitative characteristic of socio-economic phenomena and processes under conditions of qualitative certainty.

Statistical indicators are classified according to:

degree of population coverage:

        Individual – characterize one object or one unit of the population.

        Summary – characterize a group of units or the entire population as a whole. They are subdivided into:

            volumetric indicators, obtained by adding up the values of a characteristic over the individual units of the population;

            estimated (calculated) indicators, determined using various formulas.

by form of expression:

    Absolute indicators – reflect the physical dimensions of the processes and phenomena studied by statistics (their mass, area, volume, extent, time characteristics) and can also represent the volume of the population, i.e. the number of units composing it.

Absolute statistical quantities are always named numbers.

Depending on the socio-economic essence of the phenomena under study and their physical properties, the following units of measurement are distinguished:

    natural units of measurement: tons, kilograms, square, cubic and simple meters, kilometers, miles, liters, barrels, pieces.

    cost (monetary) units of measurement, which allow giving a monetary valuation of socio-economic objects and phenomena;

    labor units of measurement (man-days and man-hours), which make it possible to account both for total labor costs at an enterprise and for the labor intensity of individual operations of the technological process.

    Relative indicators - represent the result of dividing one absolute indicator by another and express the relationship between the quantitative characteristics of socio-economic processes and phenomena.

When a relative indicator is calculated, the absolute indicator in the numerator of the ratio is called the current or compared indicator, and the indicator in the denominator is the comparison base.

    Averages

    Purpose and application of statistical indicators

Statistical indicator – a quantitative characteristic of socio-economic phenomena and processes under conditions of qualitative certainty.

Each statistical indicator has a qualitative socio-economic content and an associated measurement methodology. A statistical indicator also has one or another statistical form (structure). An indicator can express the total number of units in a population, the total sum of the values of a quantitative characteristic of these units, the average value of a characteristic, the value of one characteristic relative to the value of another, etc.

The main function of specific statistical indicators and their systems is the cognitive (information) function. Without statistical information it is impossible to understand the patterns of natural and social mass phenomena, to predict them, and therefore to regulate or directly manage them, whether at the level of an individual enterprise or farm, a city or region, or at the state or interstate level. The condition for statistical indicators to fulfil their informational and cognitive function is their scientific justification and their sufficiently accurate, reliable and timely quantitative determination.

    Types of statistical indicators.

Statistical indicator – a quantitative characteristic of socio-economic phenomena and processes under conditions of qualitative certainty.

The indicators used in statistical practice and science are divided into groups according to the following criteria:

1) according to the essence of the phenomena being studied, they are volumetric and qualitative;

2) according to the degree of aggregation of phenomena - these are individual and generalizing;

3) depending on the nature of the phenomena being studied - interval and momentary;

4) depending on spatial definition, indicators are distinguished: federal, regional and local;

5) depending on the properties of specific objects and the form of expressions, statistical indicators are divided into relative, absolute and average.

A system of statistical indicators is formed by a set of interrelated indicators that have a single-level or multi-level structure. The system of statistical indicators is aimed at solving a specific problem.

Statistical indicators have interconnected quantitative and qualitative sides. The qualitative side of a statistical indicator is reflected in its content, regardless of the specific size of the attribute. The quantitative side of an indicator is its numerical value.

A number of functions that statistical indicators perform are primarily cognitive, managerial (control and organizational) and stimulating functions.

In their cognitive function, statistical indicators characterize the state and development of the phenomena under study and the direction and intensity of the processes occurring in society. Summary indicators are the basis for analyzing and forecasting the socio-economic development of individual districts, regions and the country as a whole. The quantitative side of phenomena helps to analyze the qualitative side of an object and to penetrate into its essence.

    Three stages of statistical research.

Statistical research– the process of collecting, processing and analyzing statistical information.

Statistical information– primary statistical material about socio-economic phenomena, formed in the process of statistical observation, which is subject to systematization, analysis and generalization.

Statistical research consists of three stages:

1) statistical observation;

2) summary and grouping of observation results;

3) analysis of the obtained general indicators.

    Statistical observation- mass, systematic, scientifically organized observation of the phenomena of social and economic life, which consists in recording selected characteristics of each unit of the population.

During statistical observation, primary statistical data (initial statistical information) is generated, which forms the basis of statistical research. If an error is made during the collection of primary data, or if the material turns out to be of poor quality, this will affect the correctness and reliability of both theoretical and practical conclusions;

    Summary and grouping of data: at this stage the population is divided according to differences and combined according to similarities, and total indicators are calculated for the groups and for the whole. Using the grouping method, the phenomena under study are divided into types, groups and subgroups according to their essential characteristics. The grouping method makes it possible to delineate populations that are qualitatively homogeneous in essential respects, which is a prerequisite for defining and applying generalizing indicators;

Summary- this is a complex of sequential operations to generalize specific individual facts that form a set in order to identify typical features and patterns inherent in the phenomenon being studied as a whole.

Grouping- division of units of the studied population into homogeneous groups according to certain characteristics that are essential for them.

    Processing and analysis of the obtained data, identification of patterns. At this stage, with the help of generalizing indicators, relative and average values are calculated, a summary assessment of the variation of characteristics is given, the dynamics of phenomena are characterized, indices and balances are applied, and indicators characterizing the closeness of connections between changing characteristics are calculated. For the most rational and visual presentation, the numerical material is set out in tables and graphs.

    Structure of statistical science

The structure of statistical science includes:

general theory of statistics

General theory of statistics is the science of the most general principles and methods of statistical research of mass socio-economic phenomena and processes. It defines the system of concepts and categories of statistical science, develops the scientific foundations of methods for collecting, summarizing and analyzing statistical data, and establishes the conditions for the application of these methods. Being the methodological basis of economic and socio-demographic statistics, as well as all industry statistics, the general theory of statistics creates a scientific foundation for the application of statistical methods of analysis to specific objects of research.

economic statistics

Economic statistics engages in the comprehensive study of economic phenomena and processes at the macro level, i.e. in the economy of the country as a whole and of its large regions. It reveals the essence, methods of calculation and analysis of macroeconomic (synthetic) indicators characterizing the state of the national economy; the scale, level and pace of its development; the structure, proportions and relationships of industries; the features of the location of productive forces; the availability and composition of material, labor and financial resources and the achieved level of their use. Macroeconomic indicators include gross national wealth, gross domestic product (GDP), gross profit of the economy, gross national income (GNI), gross national product (GNP), etc.

All macroeconomic indicators are determined on the basis of the system of national accounts (SNA). This is a system of interconnected statistical indicators corresponding to a national market economy, built in the form of a certain set of accounts and balance sheets that characterize the results of economic activity, the structure of the economy and the most important relationships between its links. Being consistent with the standard methodology for constructing the SNA adopted by the UN and the European Union, the Russian SNA allows for in-depth analysis of the national economy in a variety of areas in accordance with international statistical standards.

socio-demographic statistics

Socio-demographic statistics forms and analyzes a system of indicators for a comprehensive description of the lifestyle of the population and various social aspects of society. It studies the size and composition of the population (by age, gender, nationality, etc.), the structure of families and households, income and expenses of the population, employment and unemployment, level and quality of life, consumption of material goods and services by the population, the state of healthcare, education, culture, crime rate, etc.

industry and special statistics. Within the sectoral statistics of large industries, sub-sectors are distinguished: for example, within industrial statistics there are statistics of mechanical engineering, metallurgy, chemistry, etc., and within population statistics there are statistics of population size and composition and statistics of vital events and migration.

Industry (sectoral) statistics covers the essence and methods of calculating the indicators that characterize the state and dynamics of development of the corresponding sector of the economy or social sphere.

All industry statistics are formed on the basis of indicators of economic or socio-demographic statistics, using methods and techniques developed in the general theory of statistics. At the same time, the development of each sectoral statistics contributes to the improvement of statistical science as a whole.

Each of the components of statistical science has its own object of study, uses a specific system of indicators, develops rules and methods for their calculation and application in various areas of economic activity and the social sphere.

There is a close relationship and interdependence between statistical science and statistical practice. The theoretical principles of statistical science are applied in practice to solve specific statistical problems. In turn, statistical science generalizes the experience of practical work, draws new ideas and provisions from it, and improves the methods of conducting statistical research.

    The concept of statistical observation, its goals .

The first stage of the study is statistical observation.

It represents mass, systematic, scientifically organized observation of the phenomena of social and economic life, consisting in the registration of selected characteristics in each unit of the population.

Statistical observation consists of recording selected characteristics of each unit of the population. It must be massive, systematic and carried out according to a developed program on a scientific basis.

There are stages of statistical observation:

    Observation preparation;

    conducting mass data collection;

    Control of the quality of the information received.

Observation object – a set of social phenomena and processes subject to observation.

Unit of observation – an element of the object that is the carrier of the characteristics subject to registration.

Reporting unit – the subject from which data about the unit of observation is received.

Observation program – a list of characteristics (indicators) to be registered.

Organizational plan for observation- this is a document that records all the most important organizational activities, the implementation of which is necessary for the successful implementation of observation.

Observation Toolkit– a set of documents used during observation.

Forms of statistical observation:

    reporting;

    specially organized observation;

    registers.

Purpose of observation – obtaining reliable information in order to identify patterns in the development of phenomena and processes.

    Program and organization of statistical observation

Statistical observation- mass, systematic, scientifically organized observation of the phenomena of social and economic life, which consists in recording selected characteristics of each unit of the population.

Purpose of observation– obtaining reliable information to identify patterns of development of phenomena and processes.

Observation object– a set of social phenomena and processes that are subject to observation.

Unit of observation- an element of an object that is a carrier of characteristics subject to registration.

Reporting unit– this is the subject from which data about the observation unit comes.

Stages of statistical observation:

    Preparation of the observation: the goals and object of observation and the characteristics to be registered are determined; documents for data collection are developed; the methods and means of obtaining the data are chosen; personnel are selected and trained; a work schedule for the preparation and conduct of the observation is drawn up; the materials to be used in the observation are prepared;

    Conducting the mass collection of data: the most important stage of statistical observation, during which statistical information is accumulated;

    Control of the quality of the information received: at this stage the observation data are checked, and conclusions and proposals are made regarding the observation carried out.

Observation program- this is a list of indicators to be registered.

The statistical observation program must contain a list of characteristics that will characterize individual units of the population.

Requirements for the program: the characteristics must be significant; the program should include only questions to which truthful and reliable answers can be given; the questions must be precise and unambiguous; there should be questions that allow the answers to be verified; the questions should follow a definite sequence; open and closed questions may be present.

The organizational plan of observation is a document that records all the most important organizational measures whose implementation is necessary for the successful conduct of the observation.

    Classification of statistical observation. 12. Continuous and non-continuous statistical observation. 13. Survey of the main array, sample and monographic observation. 14. Classification of statistical observations by time. 15. Classification of statistical observations by source of information.

Statistical observation- mass, systematic, scientifically organized observation of the phenomena of social and economic life, which consists in recording selected characteristics of each unit of the population.

Types of statistical observation are most often classified according to the following three criteria:

a) observation coverage of population units subject to statistical research;

    Continuous (all units are examined completely)

    Non-continuous, which includes:

    Sample - based on collecting information on part of the population units and distributing the observation results to the entire population. The size of the sample depends on the nature of the phenomenon being studied. The sample population must represent all types of units that are present in the population.

    Main array - data collection is carried out only for those units of the population that make the main contribution to the characteristics of the phenomenon under study.

    Monographic is a description of individual units of a population for their in-depth study, which cannot be so effective with mass observation. Monographic observation is carried out in order to identify development trends, to study and disseminate the best practices of farms or enterprises.

b) by the time of registration of the facts:

    Continuous (register)

    Intermittent

    Periodic (as needed)

    One-time (housing census)

c) the source of information on the basis of which the facts to be recorded during the observation process are established.

    Direct (the registrars themselves establish the fact to be recorded by measuring, weighing, counting)

    Documented (based on the use of accounting documents as a source of information)

    Survey (information is obtained from the words of the respondent. Used to obtain information about phenomena and processes that are not directly observable)

    Self-registration

    Appearance method

    Correspondent method

    Questionnaire

d) by form of organization:

    Statistical reporting– this is a form of organizing statistical observation of the activities of enterprises and organizations, according to which state statistics bodies receive information in the form of reporting documents signed by persons responsible for the accuracy of the information.

    Specially organized observation is the collection of information through censuses and one-time surveys.

    Register is a form of continuous statistical observation of long-term processes that have a fixed beginning, a stage of development and a fixed end. It is a system that constantly monitors the state of the observation units and evaluates the influence of various factors on the indicators being studied. Each unit in the register is characterized by a set of indicators: some remain unchanged throughout the observation period, while others, whose frequency of change is not known in advance, are updated as changes occur.

Every observation is subject to error.

Observation errors– errors that appear during the observation process:

    Registration errors– all errors that arise during continuous observation.

    Random errors– these are errors made when filling out forms, a reservation in the answers, vagueness in the question and, accordingly, in the answer, etc.

    Systematic errors:

    Intentional (conscious) errors arise when, knowing the actual state (value) of the characteristic, the respondent deliberately reports incorrect data.

    Unintentional errors are errors caused by accidental factors: for example, faulty measuring instruments, inattention of registrars, etc.

    Errors of representativeness - arise as a result of the fact that the composition of the part of the mass phenomenon selected for the survey does not fully reflect the features and essence of the entire population being studied.

Material quality control:

    Logical – checking the consistency of the obtained data with each other or comparison with previous periods.

    Arithmetic – arithmetic verification of final and calculated indicators.

Completeness control- this is a check of how completely the object is covered by observation, in other words, whether information has been collected about all observation units.
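A minimal sketch in Python of how the logical, arithmetic and completeness checks described above might be automated; the report structure, unit names and figures are hypothetical and serve only to illustrate the idea.

```python
# Hypothetical collected forms: each report gives the number of employees
# by sex and a reported total for one unit of observation.
reports = [
    {"unit": "shop 1", "male": 40, "female": 35, "total": 75},
    {"unit": "shop 2", "male": 52, "female": 50, "total": 103},  # deliberate arithmetic error
    {"unit": "shop 3", "male": 30, "female": 28, "total": 58},
]
expected_units = {"shop 1", "shop 2", "shop 3", "shop 4"}  # units that should have reported

# Arithmetic control: the final (total) indicator must equal the sum of its parts.
for r in reports:
    if r["male"] + r["female"] != r["total"]:
        print(f'{r["unit"]}: reported total {r["total"]} != {r["male"] + r["female"]}')

# Completeness control: has information been collected about all observation units?
missing = expected_units - {r["unit"] for r in reports}
if missing:
    print("no reports received from:", ", ".join(sorted(missing)))
```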

    Reporting as the most important form of statistical observation. Classification of statistical reporting.

Statistical observation is carried out in 2 forms:

1) by the provision of reports;

2) by conducting specially organized statistical observations.

Reporting is an organized form of statistical observation in which information is received in the form of mandatory reports within certain deadlines and in approved forms. Reporting as a form of statistical observation is based on primary accounting and is its generalization.

Primary accounting is a registration of various facts (events, processes, etc.) produced as they occur and, as a rule, on a primary document.

The management of statistical reporting and its organization are entrusted to the state statistics bodies. All forms of statistical reporting are approved by state statistics bodies. Submitting reports on unapproved forms is considered a violation of reporting discipline, for which heads of enterprises and departments are held accountable.

The list of reporting is a list of reporting forms indicating their most important details.

Reporting program- system of performance indicators of a trading enterprise.

General reporting is reporting that contains the same data both for a particular sector of the national economy and for enterprises (institutions, etc.) of the national economy as a whole.

Specialized reporting contains specific indicators for individual industries, agriculture, etc.

Based on the period of time covered and its duration, a distinction is made between current and annual reporting. If information is presented for a year, the reporting is called annual. Reporting for all other periods of less than a year (quarterly, monthly, weekly, etc.) is called current.

According to the method of presentation, a distinction is made between urgent reporting, in which information is submitted by teletype or telegraph, and postal reporting, sent by mail.

In commercial practice, reporting is subdivided into:

1) nationwide, provided both to a higher organization and to the relevant state statistics bodies;

2) intradepartmental - which is submitted only to higher trade authorities;

3) current - presented during the year;

4) annual - the most complete in terms of the composition of displayed indicators.

    Grouping. Concept and application.

The most common method of processing and analyzing primary statistical information is grouping.

Grouping- division of units of the studied population into homogeneous groups according to certain characteristics that are essential for them.

Grouping functions:

    identification of socio-economic types of phenomena;

    study of the structure and structural changes occurring in socio-economic phenomena;

    analysis of relationships between phenomena.

Types of grouping:

Typological grouping- this is the division of a qualitatively heterogeneous population into separate qualitatively homogeneous groups and the identification on this basis of economic types of phenomena.

Structural grouping is the identification of the patterns of distribution of the units of a homogeneous population according to the varying values of the characteristic under study.

Analytical grouping is the study of the relationships between varying characteristics within a homogeneous population. In this case one characteristic is resultant and the other (or others) are factorial. Characteristics that influence the change of the result are called factorial; characteristics that change under the influence of factor characteristics are called resultant (effective).

A type of structural grouping is distribution series.

Stages of building a grouping:

    The choice of a grouping characteristic, i.e. the characteristic by which the units of the population under study are combined into groups.

    Determining the number of groups and the size of the interval

(n is the number of groups, R the range of variation, a the size of the interval, N the number of units in the population)

R = x_max - x_min

n = 1 + 3.322·lg N (Sturges' rule)

a = R / n

(a short numerical sketch of these formulas is given in the code example after this list)

    Establishing a list of indicators that should characterize the groups.

    Creating a table layout based on grouping results

    Calculation of absolute, average, relative indicators, filling out tables and drawing graphs.
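As a numerical sketch of the grouping formulas above (Sturges' rule and equal intervals), the following Python fragment builds an equal-interval grouping for a small set of made-up values; the data are purely illustrative.

```python
import math

values = [12, 15, 17, 21, 22, 25, 26, 29, 31, 34, 35, 38, 40, 44, 47, 51]

N = len(values)                       # number of units in the population
R = max(values) - min(values)         # range of variation: R = x_max - x_min
n = round(1 + 3.322 * math.log10(N))  # Sturges' rule: n = 1 + 3.322 * lg N
a = R / n                             # size of an equal interval: a = R / n

# Count how many units fall into each interval.
x_min = min(values)
for k in range(n):
    lower = x_min + k * a
    upper = lower + a
    # the last interval is closed on the right so that x_max is not lost
    if k == n - 1:
        count = sum(lower <= v <= upper for v in values)
    else:
        count = sum(lower <= v < upper for v in values)
    right = "]" if k == n - 1 else ")"
    print(f"group {k + 1}: [{lower:.1f}; {upper:.1f}{right} - {count} units")
```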

By the number of grouping characteristics, groupings are:

    Simple (one attribute)

    Combinative

    Multidimensional

Secondary grouping- an operation to form new groups based on a previously carried out grouping.

Secondary grouping methods:

    changing (enlarging) the initial intervals;

    share (proportional) regrouping.

Classification – a special type of grouping; a stable nomenclature of classes and groups formed on the basis of the similarities and differences of the units of the object being studied.

Types of classification: product nomenclatures and classifiers.

    Types of groupings.

Statistical groupings have the following purposes:

    Identification of qualitatively homogeneous populations;

    Study of population structure

    Research existing dependencies

Each of these goals corresponds to a special type of grouping:

    Typological is the division of a population into groups that are homogeneous in quality and conditions of development (solves the problem of identifying and characterizing socio-economic types). There are two ways to form typological groupings:

The method of sequential partitioning, which consists in forming groups whose objects all have the same values of the classification characteristics (the whole population is first divided according to one characteristic, the resulting parts are then divided according to another, and so on).

The method of multidimensional classification, in which the objects forming a group may have different values of the classification characteristics (groups are formed on the basis of the proximity of objects simultaneously over a large number of characteristics; the method became widely used with the development of pattern-recognition methods and the advent of computers).

    Structural – used to study the structure of a population, characteristics of its structure and structural shifts. Structural groupings are built either on the basis of a previously conducted typological grouping, or on the basis of primary data

    Analytical (factorial) - designed to establish the close relationship between interacting characteristics - factorial and resultant. It allows you to identify the presence and direction of a connection, as well as measure its closeness and strength. Therefore, a factor characteristic identified on the basis of an analysis of the phenomenon being studied is most often used as a grouping characteristic.

In cases where a qualitative characteristic has a large number of varieties, a classification is developed.

Classification – a special type of grouping; this is a stable nomenclature of classes and groups formed on the basis of the similarities and differences of the units of the object being studied. Classification is the distribution of phenomena and objects into certain groups, classes, categories.

Types of classification:

Product nomenclatures as a systematic list of objects and groups.

Classifiers are a classification where each attribute value is assigned a code, i.e. conventional digital designation.

Depending on the number of characteristics underlying the grouping, the following groups are distinguished:

    simple, made according to one characteristic. Among simple groupings, distribution series stand out. A distribution series is a grouping in which a single indicator, the size of the group, is used to characterize groups arranged in order by the value of the characteristic. Series constructed according to an attributive characteristic are called attributive distribution series; series constructed according to a quantitative characteristic are called variation series.

    Complex ones, which are divided into:

    a combinational grouping, based on two or more characteristics taken in interrelation, in combination; in this case classification is carried out by sequential logical division of the population according to the individual characteristics;

    multidimensional groupings, carried out simultaneously according to several characteristics.

According to the relationships between the characteristics, the following are distinguished:

    hierarchical groupings performed according to two or more characteristics, with the values ​​of the second characteristic determined by the range of values ​​of the first (for example, classification of industries by sub-sectors);

    non-hierarchical groupings that are constructed when there is no strict dependence of the values ​​of the second characteristic on the first.

According to the order in which information is processed, the groups are:

    primary (compiled on the basis of primary data);

    secondary, resulting from the regrouping of previously grouped material.

In accordance with the time criterion, they distinguish:

    static groupings that characterize the population at a certain point in time or for a certain period;

    dynamic - groupings showing the transitions of units from one group to another (as well as entry and exit from the aggregate).

    Statistical tables

Statistical table– a table that contains a summary numerical characteristic of the population under study according to one or more essential characteristics, interconnected by the logic of economic analysis.

Elements of a table:

Skeleton (frame) – a table without numbers and headings.

Layout– table with headings.

The subject of a statistical table is the object that is characterized in it by numbers (the population, individual units of the population listed in order, territorial units, groups formed by one or several characteristics, time periods, etc.).

Depending on the structure of the subject, statistical tables are distinguished as:

    simple, in whose subject a plain list of the units of the population is given (list tables) or only one unit, identified according to a specific characteristic (monographic tables);

    complex, whose subject contains groups of units of the population formed according to one (group tables) or several (combinational tables) quantitative or attributive characteristics.

Predicate of a statistical table – the system of indicators that characterize the object of study, i.e. the subject of the table. The predicate forms the headings of the columns and makes up their content.

According to the structure of the predicate, statistical tables are distinguished with:

    simple development of the predicate – the indicator defining it is obtained by simply summing the values of each characteristic separately, independently of the others;

    complex predicate development involves dividing the characteristic that forms it into groups.

A matrix is a rectangular table of numerical information consisting of m rows and n columns.

    Application of multidimensional grouping and data classification methods. Cluster analysis.

Grouping- division of units of the studied population into homogeneous groups according to certain characteristics that are essential for them.

By the number of grouping characteristics, groupings are:

    Simple (one attribute)

    Complex (according to two or more characteristics)

    Combinative

    Multidimensional

Let us consider the use of multidimensional groupings. It is often difficult to choose any single characteristic as the basis for a grouping, and it is even more difficult to group according to several characteristics. A combination of two characteristics still keeps the table readable, but a combination of three or four characteristics gives a completely unsatisfactory result: even if we identify only three categories for each grouping characteristic, we obtain 27 or 81 subgroups. A uniform distribution of units among the groups is impossible in principle, so we get groups that include only 1-2 observations. Methods of multidimensional grouping make it possible to preserve the complexity of describing groups and at the same time overcome the shortcomings of combinational grouping. They are often called methods of multidimensional classification.

Classification – a special type of grouping; this is a stable nomenclature of classes and groups formed on the basis of the similarities and differences of the units of the object being studied. Classification is the distribution of phenomena and objects into certain groups, classes, categories.

These methods became widespread with the use of computers and application software packages. Their purpose is the classification of data, in other words, grouping on the basis of many characteristics. Such problems are widespread in the sciences of nature and society and in practical activities for the control of mass processes. For example, the identification of types of enterprises by financial status and economic efficiency is carried out on the basis of many characteristics; so are the identification and study of types of people by their degree of suitability for a certain profession (professional aptitude), the diagnosis of diseases on the basis of many objective signs (symptoms), etc.

The simplest version of multivariate classification is grouping based on multivariate averages.

A multidimensional average is the average value of several characteristics for one unit of the population.

A more reasonable method of multidimensional classification is cluster analysis. The name of the method comes from the same root as the words “class” and “classification”. The English word cluster means a group, bunch, concentration, i.e. an aggregation of homogeneous phenomena. In this sense it is close to the mathematical concept of a set; like a set, a cluster may contain only a single phenomenon, but unlike a set it cannot be empty.

Each population unit in cluster analysis is considered as a point in a given feature space.
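A minimal sketch, with made-up figures, of the two approaches just described: a grouping by multidimensional averages (each characteristic is first divided by its population mean to remove differences in units of measurement) and a cluster analysis of the same standardized data. KMeans from scikit-learn is used here only as one possible clustering tool and is assumed to be installed.

```python
import numpy as np

# Made-up data: rows are population units (e.g. enterprises),
# columns are characteristics (e.g. output, profit, assets).
X = np.array([
    [120.0, 15.0, 300.0],
    [ 80.0, 10.0, 210.0],
    [200.0, 30.0, 520.0],
    [ 60.0,  5.0, 150.0],
])

# Multidimensional average of unit i: the mean of its characteristics after
# each characteristic has been divided by its average over the whole population.
ratios = X / X.mean(axis=0)
multi_avg = ratios.mean(axis=1)
print("multidimensional averages:", np.round(multi_avg, 2))

# The units can then be grouped by comparing these averages, or a cluster
# analysis can be run on the standardized data; each unit is treated
# as a point in the feature space.
from sklearn.cluster import KMeans

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(ratios)
print("cluster labels:", labels)
```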

    The concept of statistical graphs, the rules for their construction

Graphical method – a method of conventionally depicting statistical data using geometric shapes, lines, points and other images.

Graph – a means of summarizing statistical data and identifying connections between phenomena.

When constructing a graphic image, a number of requirements must be observed. First of all, the graph must be sufficiently visual, since the whole point of a graphic image is to depict statistical indicators clearly. In addition, the graph must be expressive, intelligible and understandable. To meet these requirements, every graph should include a number of basic elements:

    Graphic image

    Graph field

    Spatial orientation

    Scale guidelines

    Explication of the graph (explanation)

Graphic image- these are geometric signs, i.e. a set of points, lines, figures with the help of which statistical indicators are depicted.

Graph field – the part of the plane where the graphic images are located. The graph field has certain dimensions, which depend on its purpose; a width-to-height ratio of about 2:3 is considered optimal.

The spatial orientation of a graph is specified in the form of a system of coordinate grids. A coordinate system is necessary to place the geometric signs in the graph field. Two coordinate systems are used: the rectangular coordinate system and the polar coordinate system.

Scale guidelines statistical graphics are determined by the scale and system of scales. The scale of a statistical graph is a measure of the conversion of a numerical value into a graphic one. A scale is a line whose individual points can be read as specific numbers. The scale is of great importance in graphics and includes three elements: a line (or scale carrier), a certain number of points marked with dashes, which are located on the scale carrier in a certain order, and a digital designation of numbers corresponding to individual marked points.

Explication of the graph – the names of the axes and of the graph, and the symbols (legend).

The most important part of constructing a graph is choosing the right composition, i.e.:

What data should be depicted from the many available,

What type of chart to use.

Charts are intended for:

Monitoring the reliability of information,

Studying the patterns of development of phenomena,

Identification of possible relationships between phenomena.

    Classification of statistical graphs.

Modern science cannot be imagined without graphic methods. The use of graphs to present statistical indicators makes it possible to provide clarity and expressiveness, facilitate their perception, and in many cases helps to understand the essence of the phenomenon being studied, its patterns and features, to see the trends in its development, the relationship of indicators characterizing it.

Graphical method – a method of conventionally depicting statistical data using geometric shapes, lines, points and other images.

Graph – a means of summarizing statistical data and identifying connections between phenomena.

Classification of graphs:

-according to the method of constructing a graphic image:

1) diagrams – depiction of statistical data using lines, figures, etc.

2) statistical maps – image of a feature on a map

    Cartogram - image of a feature by coloring or shading

    Cartodiagram – a combination of a map and a diagram

-according to geometric characteristics

1) linear

2) planar

3) volumetric

-by type of problems solved using graphs

1) comparison charts

2) structure diagrams

3) dynamic charts

Diagrams

    linear - this is an image of data using lines in a rectangular coordinate system

    columnar - image of data in the form of columns of the same width, but different in height in relation to the scale

    tape (strip) - these are columns placed horizontally. They can be bilateral and directional.

    square - the value of the attribute is proportional to the area of ​​the square. Therefore, to construct them, the square root of the attribute value is extracted.

    circular

    sectoral - used to characterize the structure of a phenomenon. The circle is divided into sectors, the areas of which are proportional to the parts of the phenomenon. Absolute values ​​are converted to percentages.

    The Varzar sign is a rectangle whose length and width represent two interrelated characteristics; the area of the figure then corresponds to the product of these characteristics.

    A Lorenz curve is a graph that shows the distribution of one characteristic among certain groups. The Lorenz curve is constructed using relative indicators (their accumulated values). The larger the area of ​​the figure, the more uneven the distribution.

    radial diagrams - used to visually depict a phenomenon over time. The circle is divided into 12 equal parts. Each ray corresponds to a specific month. On the radii, starting from the center, segments are laid out, depicting the value of the characteristic by month on a scale. The resulting figure characterizes the seasonal fluctuations of the phenomenon.

Graphs that characterize distribution series

    polygon - broken line. Constructed for discrete distribution series

    histogram - used for interval series. The columns should fit tightly to each other

    cumulate - used for distribution series, for accumulated series

    ogive – constructed similarly to the cumulate, but with the abscissa and ordinate axes swapped
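A minimal sketch of how the graphs of a distribution series listed above (histogram, polygon, cumulate) might be drawn with matplotlib; the interval series below is invented for illustration, and matplotlib is assumed to be available.

```python
import matplotlib.pyplot as plt

edges = [10, 20, 30, 40, 50]            # interval boundaries
freqs = [5, 12, 18, 9]                  # frequencies of the interval series
mids = [(a + b) / 2 for a, b in zip(edges[:-1], edges[1:])]
cum = [sum(freqs[:k + 1]) for k in range(len(freqs))]  # accumulated frequencies

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
axes[0].bar(edges[:-1], freqs, width=10, align="edge", edgecolor="black")
axes[0].set_title("histogram")
axes[1].plot(mids, freqs, marker="o")    # polygon: frequencies at interval midpoints
axes[1].set_title("polygon")
axes[2].plot(edges[1:], cum, marker="o")  # cumulate: accumulated frequencies
axes[2].set_title("cumulate")
plt.tight_layout()
plt.show()
```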

    Classification and purpose of relative quantities.

Statistical indicator- represents a quantitative characteristic of socio-economic phenomena and processes in conditions of qualitative certainty.

Statistical indicators are distinguished by form:

    Absolute

    Relative

Relative values ​​represent various coefficients or percentages.

Relative statistics- these are indicators that provide a numerical measure of the relationship between two comparable quantities.

Relative indicators - represent the result of dividing one absolute indicator by another and express the relationship between the quantitative characteristics of socio-economic processes and phenomena.

When calculating a relative indicator, the absolute indicator in the numerator of the resulting ratio is called the current or compared indicator, and the indicator in the denominator is the comparison base.

The main condition for the correct calculation of relative values ​​is the comparability of the compared values ​​and the presence of real connections between the phenomena being studied.

Relative value = compared value / basis

According to the method of obtaining, relative quantities are always derivative (secondary) quantities.

They can be expressed: in coefficients, in percentages, in ppm, in prodecimille.

The following types of relative statistical quantities are distinguished:

The relative dynamics indicator (OPD) is the ratio of the level of the process or phenomenon under study in a given period (at a given moment) of time to the level of the same process or phenomenon in the past:

OPD = Current level / Previous or baseline level

OPD = OPP * OPRP

OPD can be calculated with a constant base (basic indicators) or with a variable base (chain indicators).

The relative plan target indicator (OPP) characterizes the intensity of the plan target, i.e. how many times the planned volume of production (or any other financial result of the enterprise's activity) exceeds the level already achieved, or what percentage of that level it constitutes.

OPP = level planned for the (i+1)-th period / level achieved in the i-th period

The relative plan fulfilment indicator (OPRP) reflects the actual volume of production as a percentage of, or a coefficient relative to, the planned level.

OPRP = level achieved in the (i+1)-th period / level planned for the (i+1)-th period

The relative structure indicator (OPS) represents the relationship between the structural parts of the object being studied and the whole:

OPS = indicator characterizing part of the population / indicator for the entire population as a whole (*100%)

The relative coordination indicator (OPC) is the ratio of one part of a population to another part of the same population:

OPC = indicator characterizing the i-th part of the population / indicator characterizing the part of the population chosen as the comparison base

The relative intensity indicator (OPI) characterizes the degree of prevalence of the process or phenomenon under study and is the ratio of the indicator under study to the size of the environment in which it occurs:

OPI = indicator characterizing phenomenon A / indicator characterizing the environment of distribution of phenomenon A

A type of OPI is the relative indicator of the level of economic development, which characterizes output per capita and plays an important role in assessing the development of the state's economy.

The relative comparison indicator (OPSr) is the ratio of the same absolute indicator characterizing different objects (enterprises, firms, districts, regions, countries, etc.):

OPSr = indicator characterizing object A / indicator characterizing object B
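A minimal sketch, with hypothetical figures for a single enterprise, showing how the relative indicators above are computed and how the identity OPD = OPP · OPRP holds.

```python
# Hypothetical levels of output for one enterprise.
base_level    = 200.0   # level achieved in period i
planned_level = 220.0   # level planned for period i + 1
actual_level  = 231.0   # level actually achieved in period i + 1

OPP  = planned_level / base_level     # relative plan target indicator
OPRP = actual_level / planned_level   # relative plan fulfilment indicator
OPD  = actual_level / base_level      # relative dynamics indicator
assert abs(OPD - OPP * OPRP) < 1e-9   # the identity OPD = OPP * OPRP

part, whole = 45.0, 150.0
OPS = part / whole * 100              # relative structure indicator, in percent

print(f"OPP = {OPP:.3f}, OPRP = {OPRP:.3f}, OPD = {OPD:.3f}, OPS = {OPS:.1f}%")
```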

One of the most pressing problems of modern natural science, and of physics in particular, remains the question of the nature of causality and causal relationships in the world. In physics this question is formulated more concretely as the problem of the relationship between dynamic and statistical laws. In solving this problem two philosophical directions arose, determinism and indeterminism, which occupy directly opposite positions.
Determinism - the doctrine of the causal material conditionality of natural, social and mental phenomena. The essence of determinism is the idea that everything that exists in the world arises and is destroyed naturally, as a result of the action of certain causes.
Indeterminism - a doctrine that denies the objective causality of natural phenomena, society and the human psyche.
In modern physics, the idea of ​​determinism is expressed in the recognition of the existence of objective physical laws and finds its more complete and general reflection in fundamental physical theories.
Fundamental physical theories (laws) represent the body of the most essential knowledge about physical regularities. This knowledge is not exhaustive, but today it most fully reflects the physical processes in nature. In turn, on the basis of the fundamental theories, particular physical laws such as Archimedes' law, Ohm's law and the law of electromagnetic induction are formulated.
Scientists are unanimous in the opinion that the basis of any physical theory consists of three main elements:
1) a set of physical quantities with the help of which the objects of the theory are described (for example, in Newtonian mechanics: coordinates, momenta, energy, forces); 2) the concept of state; 3) equations of motion, that is, equations that describe the evolution of the state of the system under consideration.
In addition, to solve the problem of causality, the division of physical laws and theories into dynamic and statistical (probabilistic) is important.

DYNAMIC LAWS AND THEORIES AND MECHANICAL DETERMINISM

A dynamic law is a physical law that reflects an objective pattern in the form of an unambiguous connection between physical quantities expressed quantitatively. A dynamic theory is a physical theory that represents a set of dynamic laws. Historically, the first and simplest theory of this kind was Newton's classical mechanics. It claimed to describe mechanical motion, that is, the movement in space over time of any bodies or parts of bodies relative to each other with any accuracy.
The laws of mechanics formulated by Newton relate directly to a physical body whose dimensions can be neglected, that is, to a material point. But any body of macroscopic dimensions can always be considered as a collection of material points, and therefore its motion can be described quite accurately.
Therefore, in modern physics, classical mechanics is understood as the mechanics of a material point or system of material points and the mechanics of an absolutely rigid body.
To calculate motion, the dependence of the interaction between particles on their coordinates and velocities must be known. Then, based on the given values ​​of the coordinates and momenta of all particles of the system at the initial moment of time, Newton’s second law makes it possible to unambiguously determine the coordinates and momenta at any subsequent moment in time. This allows us to assert that the coordinates and momenta of the particles of the system completely determine its state in mechanics. Any mechanical quantity of interest to us (energy, angular momentum, etc.) is expressed through coordinates and momentum. Thus, all three elements of the fundamental theory, which is classical mechanics, are determined.
Another example of a fundamental physical theory of a dynamic nature is Maxwell's electrodynamics. Here the object of study is the electromagnetic field. Maxwell's equations are then the equations of motion for the electromagnetic form of matter. Moreover, the structure of electrodynamics in the most general terms repeats the structure of Newtonian mechanics. Maxwell's equations make it possible to unambiguously determine the electromagnetic field at any subsequent time based on given initial values ​​of the electric and magnetic fields inside a certain volume.
Other fundamental theories of a dynamic nature have the same structure as Newtonian mechanics and Maxwellian electrodynamics. These include: continuum mechanics, thermodynamics and general relativity (theory of gravity).
Metaphysical philosophy believed that all objective physical laws (and not only physical ones) have exactly the same character as dynamic laws. In other words, no other types of objective laws were recognized, except for dynamic laws that express unambiguous connections between physical objects and describe them absolutely accurately through certain physical quantities. The lack of such a complete description was interpreted as a lack of our cognitive abilities.
The absolutization of dynamic laws, and therefore of mechanical determinism, is usually associated with P. Laplace, to whom belongs the famous statement, already quoted above, that if there existed a sufficiently vast mind that knew, for any given moment, all the forces acting on all the bodies of the Universe (from its largest bodies to the smallest atoms) as well as their positions, and if it could analyze these data in a single formula of motion, then nothing would remain uncertain for it, and both the past and the future of the Universe would be open to its view.
According to the principle proclaimed by Laplace, all phenomena in nature are predetermined with “iron” necessity. Randomness, as an objective category, has no place in the picture of the world drawn by Laplace. Only the limitations of our cognitive abilities force us to consider individual events in the world as random. For these reasons, as well as noting the role of Laplace, classical mechanical determinism is also called hard or Laplace determinism.
The need to abandon classical determinism in physics became obvious after it became clear that dynamic laws are neither universal nor unique, and that the deeper laws of nature are not dynamic but statistical laws, discovered in the second half of the 19th century, especially after the statistical nature of the laws of the microworld became clear.
But even when describing the motion of individual macroscopic bodies, the implementation of ideal classical determinism is practically impossible. This is clearly seen from the description of constantly changing systems. In general, the initial parameters of any mechanical systems cannot be fixed with absolute accuracy, therefore the accuracy of predicting physical quantities decreases over time. For every mechanical system there is a certain critical time, from which it is impossible to accurately predict its behavior.
There is no doubt that Laplace's determinism, with a certain degree of idealization, reflects the real movement of bodies and in this regard it cannot be considered false. But its absolutization as a completely accurate reflection of reality is unacceptable.
With the establishment of the dominant importance of statistical laws in physics, the idea of an omniscient consciousness for which the fate of the world is determined absolutely precisely and unambiguously disappears, and with it the ideal that the concept of absolute determinism had set before science.

STATISTICAL LAWS AND THEORIES AND PROBABILISTIC DETERMINISM

The dynamic laws described above are universal in nature, that is, they apply to all objects under study without exception. A distinctive feature of this kind of laws is that the predictions obtained on their basis are reliable and unambiguous.
Along with them, in natural science in the middle of the last century, laws were formulated whose predictions are not definite, but only probable. These laws received their name from the nature of the information that was used to formulate them. They were called probabilistic because the conclusions based on them do not follow logically from the available information, and therefore are not reliable and unambiguous. Since the information itself is statistical in nature, such laws are often also called statistical, and this name has become much more widespread in natural science.
The idea of ​​laws of a special type, in which the connections between the quantities included in the theory are ambiguous, was first introduced by Maxwell in 1859. He was the first to understand that when considering systems consisting of a huge number of particles, it is necessary to pose the problem completely differently than it was done in Newtonian mechanics. To do this, Maxwell introduced into physics the concept of probability, previously developed by mathematicians in the analysis of random phenomena, in particular gambling.
Numerous physical and chemical experiments have shown that, in principle, it is impossible not only to trace changes in the momentum or position of one molecule over a large time interval, but also to determine accurately the momenta and coordinates of all molecules of a gas or other macroscopic body at a given moment in time. After all, the number of molecules or atoms in a macroscopic body is of the order of 10^23. From the macroscopic conditions in which the gas is located (a certain temperature, volume, pressure, etc.), definite values of the momenta and coordinates of the molecules do not follow. They should be considered as random variables that, under given macroscopic conditions, can take different values, just as any number of points from 1 to 6 can appear when a die is thrown. It is impossible to predict what number of points will appear in a given throw of the die, but the probability of rolling, for example, a 5 can be calculated.
This probability has an objective character, since it expresses objective relations of reality, and its introduction is not due merely to our ignorance of the details of the course of objective processes. Thus, for a die, the probability of getting any particular number of points from 1 to 6 is 1/6; this does not depend on our knowledge of the process and is therefore an objective fact.
Against the background of many random events, a certain pattern is revealed, expressed by a number. This number, the probability of an event, allows one to determine statistical averages (the sum of the individual values of a quantity divided by their number). Thus, if you roll a die 300 times, the average number of fives obtained will be 300 · 1/6 = 50. Moreover, it makes no difference whether you throw the same die 300 times or throw 300 identical dice at the same time.
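To make the example concrete, here is a minimal simulation sketch in Python (standard library only; the seed and the number of repetitions are arbitrary choices). It shows that the average number of fives per series of 300 throws settles near the theoretical value 300 · 1/6 = 50.

```python
import random

random.seed(0)            # fixed seed so the illustration is reproducible

n_throws = 300            # throws per series, as in the text
n_series = 10_000         # how many times the 300-throw experiment is repeated

fives_per_series = []
for _ in range(n_series):
    fives = sum(1 for _ in range(n_throws) if random.randint(1, 6) == 5)
    fives_per_series.append(fives)

average_fives = sum(fives_per_series) / n_series
print(f"theoretical expectation: {n_throws / 6:.1f}")   # 50.0
print(f"simulated average:       {average_fives:.1f}")  # close to 50
```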
There is no doubt that the behavior of gas molecules in a vessel is much more complex than a thrown dice. But here, too, certain quantitative patterns can be found that make it possible to calculate statistical average values, if only the problem is posed in the same way as in game theory, and not as in classical mechanics. It is necessary to abandon, for example, the insoluble problem of determining the exact value of the momentum of a molecule at a given moment, and try to find the probability of a certain value of this momentum.
Maxwell managed to solve this problem. The statistical law of the distribution of molecules over momenta turned out to be simple. But Maxwell's main merit was not in the solution, but in the very formulation of the new problem. He clearly realized that the random behavior of individual molecules under given macroscopic conditions is subject to a certain probabilistic (or statistical) law.
After the impetus given by Maxwell, molecular kinetic theory (or statistical mechanics, as it was later called) began to develop rapidly.
Statistical laws and theories have the following characteristic features.
1. In statistical theories, any state is a probabilistic characteristic of the system. This means that the state in statistical theories is determined not by the values ​​of physical quantities, but by the statistical (probability) distributions of these quantities. This is a fundamentally different characteristic of the state than in dynamic theories, where the state is specified by the values ​​of the physical quantities themselves.
2. In statistical theories, based on a known initial state, it is not the values ​​of physical quantities themselves that are unambiguously determined as a result, but the probabilities of these values ​​within given intervals. In this way, the average values ​​of physical quantities are determined unambiguously. These average values ​​in statistical theories play the same role as the physical quantities themselves in dynamic theories. Finding average values ​​of physical quantities is the main task of statistical theory.
The probabilistic characteristics of a state in statistical theories are completely different from the characteristics of a state in dynamic theories. Nevertheless, the dynamic and statistical theories exhibit, in the most essential respects, a remarkable unity. The evolution of a state in statistical theories is uniquely determined by the equations of motion, as in dynamic theories. Based on a given statistical distribution (by a given probability) at the initial moment of time, the equation of motion uniquely determines the statistical distribution (probability) at any subsequent moment in time, if the energy of interaction of particles with each other and with external bodies is known. The average values ​​of all physical quantities are determined unambiguously, respectively. There is no difference here from dynamic theories regarding the uniqueness of the results. After all, statistical theories, like dynamic ones, express the necessary connections in nature, and they generally cannot be expressed otherwise than through an unambiguous connection of states.
At the level of statistical laws and patterns, we also encounter causality. But determinism in statistical laws represents a deeper form of determinism in nature. In contrast to hard classical determinism, it can be called probabilistic (or modern) determinism.
Statistical laws and theories are a more advanced form of description of physical laws; any currently known process in nature is more accurately described by statistical laws than by dynamic ones. The unambiguous connection of states in statistical theories indicates their commonality with dynamic theories. The difference between them is in one thing - the method of recording (describing) the state of the system.
The true, comprehensive meaning of probabilistic determinism became obvious after the creation of quantum mechanics - a statistical theory that describes phenomena on an atomic scale, that is, the movement of elementary particles and systems consisting of them (other statistical theories are: the statistical theory of nonequilibrium processes, electronic theory, quantum electrodynamics). Despite the fact that quantum mechanics differs significantly from classical theories, the structure common to fundamental theories is preserved here. Physical quantities (coordinates, impulses, energy, angular momentum, etc.) remain generally the same as in classical mechanics. The main quantity characterizing the state is the complex wave function. Knowing it, you can calculate the probability of detecting a certain value not only of a coordinate, but also of any other physical quantity, as well as the average values ​​of all quantities. The basic equation of nonrelativistic quantum mechanics - the Schrödinger equation - uniquely determines the evolution of the state of the system in time.

RELATIONSHIP OF DYNAMIC AND STATISTICAL LAWS

Immediately after the appearance of the concept of a statistical law in physics, the problem of the existence of statistical laws and their relationship with dynamic laws arose.
With the development of science, the approach to this problem and even its formulation changed. Initially, the main issue in the problem of correlation was the question of substantiating classical statistical mechanics on the basis of Newton's dynamic laws. Researchers tried to find out how statistical mechanics, an essential feature of which is the probabilistic nature of predicting the values ​​of physical quantities, should relate to Newton's laws with their unambiguous connections between the values ​​of all quantities.
Statistical laws, as a new type of description of patterns, were originally formulated on the basis of the dynamic equations of classical mechanics. For a long time, dynamic laws were considered the main, primary type of reflection of physical laws, and statistical laws were considered to a large extent as a consequence of the limitations of our cognitive abilities.
But today it is known that the patterns of behavior of objects in the microworld and the laws of quantum mechanics are statistical. It was then that the question was posed this way: is the statistical description of microprocesses the only possible one, or are there dynamic laws that more deeply determine the movement of elementary particles, but are hidden under the veil of the statistical laws of quantum mechanics?
The emergence and development of quantum theory gradually led to a revision of ideas about the role of dynamic and statistical laws in reflecting the laws of nature. The statistical nature of the behavior of individual elementary particles was discovered. At the same time, no dynamic laws were discovered behind the laws of quantum mechanics describing this behavior. Therefore, major scientists, such as N. Bohr, W. Heisenberg, M. Born, P. Langevin and others, put forward the thesis about the primacy of statistical laws. True, the acceptance of this thesis at that time was difficult due to the fact that some of the above-mentioned scientists associated the position on the primacy of statistical laws with indeterminism. Since the usual model of determinism in the microworld was unattainable, they concluded that there was no causality in the microworld at all. But most scientists did not agree with this conclusion and began to insist on the need to find dynamic laws to describe the microworld, perceiving statistical laws as an intermediate stage that makes it possible to describe the behavior of a set of microobjects, but does not yet provide the opportunity to accurately describe the behavior of individual microobjects.
When it became obvious that the role of statistical laws in the description of physical phenomena cannot be denied (all experimental data were fully consistent with theoretical calculations based on probabilities), the theory of the "equality" of statistical and dynamic laws was put forward. Both kinds of laws were considered equally valid, but as relating to different phenomena, each with its own scope of application, not reducible to one another but mutually complementary.
This point of view does not take into account the indisputable fact that all fundamental statistical theories of modern physics (quantum mechanics, quantum electrodynamics, statistical thermodynamics, etc.) contain corresponding dynamic theories as their approximations. Therefore, today many prominent scientists tend to consider statistical laws as the deepest, most general form of description of all physical laws.
There is no reason to draw a conclusion about indeterminism in nature because the laws of the microworld are fundamentally statistical. Since determinism insists on the existence of objective laws, indeterminism must mean the absence of such laws. This is certainly not the case. Statistical patterns are no less objective than dynamic ones, and reflect the interconnection of phenomena in the material world. The dominant significance of statistical laws means a transition to a higher level of determinism, and not a rejection of it altogether.
When considering the relationship between dynamic and statistical laws, we encounter two aspects of this problem.
In the aspect that arose historically first, the relationship between dynamic and statistical laws appears in the following way: laws reflecting the behavior of individual objects are dynamic, and laws describing the behavior of a large collection of these objects are statistical. This is, for example, the relationship between classical mechanics and statistical mechanics. Essential for this aspect is that here dynamic and statistical laws describe different forms of motion of matter that are not reducible to each other. They have different objects of description, and therefore the analysis of theories does not reveal what is essential in their relationship to each other. This aspect cannot be considered the main one when analyzing their relationship.
The second aspect of the problem studies the relationship between dynamic and statistical laws that describe the same form of motion of matter. Examples include thermodynamics and statistical mechanics, Maxwellian electrodynamics and electron theory, etc.
Before the advent of quantum mechanics, it was believed that the behavior of individual objects always obeys dynamic laws, and the behavior of a collection of objects always obeys statistical laws; the lower, simplest forms of movement are subject to dynamic laws, and the higher, more complex forms are subject to statistical laws. But with the advent of quantum mechanics, it was established that both “lower” and “higher” forms of matter motion can be described by both dynamic and statistical laws. For example, quantum mechanics and quantum statistics describe different forms of matter, but both are statistical theories.
After the creation of quantum mechanics, we can rightfully assert that dynamic laws represent the first, lower stage in the knowledge of the world around us and that statistical laws more fully reflect the objective relationships in nature, being a higher stage of knowledge. Throughout the history of the development of science, we see how the initially emerging dynamic theories, covering a certain range of phenomena, are replaced, as science develops, by statistical theories that describe the same range of issues from a new, deeper point of view.
The replacement of dynamic theories with statistical ones does not mean that the old dynamic theories are obsolete and forgotten. Their practical value, within certain limits, is in no way diminished by the fact that new statistical theories have been created. When we talk about a change in theories, we primarily mean the replacement of less profound physical ideas with more profound ideas about the essence of phenomena. Simultaneously with the change in physical concepts, the range of applicability of theories expands. Statistical theories extend to a wider range of phenomena that are inaccessible to dynamic theories. Statistical theories are in better quantitative agreement with experiment than dynamic ones. But under certain conditions, the statistical theory leads to the same results as the simpler dynamic theory (the correspondence principle comes into play - we will discuss it below).
The connection between the necessary and the accidental cannot be revealed within the framework of dynamic laws, since they ignore the accidental. The dynamic law displays the average necessary result to which the flow of processes leads, but does not reflect the complex nature of determining this result. When considering a fairly wide range of issues, when deviations from the required average value are negligible, such a description of the processes is quite satisfactory. But even in this case, it can be considered sufficient provided that we are not interested in those complex relationships that lead to the necessary connections, and we limit ourselves to only stating these connections. We must clearly understand that absolutely precise, unambiguous connections between the physical quantities that dynamic theories speak of simply do not exist in nature. In real processes, inevitable deviations from the required average values ​​always occur - random fluctuations, which only under certain conditions do not play a significant role and may not be taken into account.
Dynamic theories are not able to describe phenomena when fluctuations are significant, and are not able to predict under what conditions we can no longer consider the necessary in isolation from the random. In dynamic laws, necessity appears in a form that coarsens its connection with chance. But it is precisely the latter circumstance that statistical laws take into account. It follows that statistical laws reflect real physical processes more deeply than dynamic ones. It is no coincidence that statistical laws are learned after dynamic ones.
Returning to the problems of causality, we can conclude that dynamic and probabilistic causality arises on the basis of dynamic and statistical laws. And just as statistical laws reflect the objective connections of nature more deeply than dynamic ones, so probabilistic causation is more general, and dynamic causation is only its special case.

Seminar lesson plan (2 hours)

1. Dynamic laws and mechanical determinism.
2. Statistical laws and probabilistic determinism.
3. Relationship between dynamic and statistical laws.

Topics of reports and abstracts

LITERATURE

1. Myakishev G.Ya. Dynamic and statistical patterns in physics. M., 1973.
2. Svechnikov G.A. Causality and connection of states in physics. M., 1971.
3. Philosophical problems of natural science. M., 1985.

Fundamentality of statistical theories

As already mentioned, in classical natural science there was a belief that the most fundamental knowledge should be clothed in the form of a dynamic theory - accurate, unambiguous, not allowing any uncertainty. The first statistical theories were considered only as approximations, acceptable temporarily, until the development of “rigorous” methods.

However, time passed, new, more and more effective scientific theories were developed - and it turned out that almost all of them were statistical. In physics, the last fundamental dynamic theory - the general theory of relativity - was created at the beginning of the 20th century. The situation was similar in chemistry and biology.

Since knowledge moves forward and not backward, it became obvious that the thesis about the fundamental nature of dynamic theories and the subordinate role of statistical ones was subject to revision. A compromise point of view has emerged, according to which dynamic and statistical theories are equally fundamental, but describe reality from different points of view, complementing each other. However, at present, the prevailing idea is that the most fundamental, that is, the most deeply and completely describing reality, are statistical theories.

The most compelling arguments in favor of this concept rely on the correspondence principle (Section 2.3.5).

For each of the fundamental physical theories of the dynamic kind there exists a statistical analogue describing the same range of phenomena: for classical mechanics it is quantum mechanics, for thermodynamics it is statistical mechanics, for electrodynamics and the special theory of relativity it is quantum electrodynamics. The only exception is the general theory of relativity, whose statistical analogue, the quantum theory of gravity, has not yet been created, since quantum gravitational effects must manifest themselves under conditions that are virtually impossible to create in a laboratory or to find anywhere in the modern Universe.

On the other hand, a number of fundamental statistical theories do not have and are not expected to have dynamic analogues. Such are, for example, quantum chromodynamics (a discipline that studies strongly interacting particles) or Darwinian evolutionary theory. Removing the factor of chance from the latter gives Lamarck's theory (Section 4.2), the fallacy of which is now beyond doubt.

What is even more significant is that in each of the listed pairs the statistical theory invariably covers a wider range of phenomena and provides a more complete and detailed description of them than its dynamic analogue. For example, in molecular kinetic theory the same gas laws of Boyle-Mariotte, Charles and Gay-Lussac hold as in thermodynamics, but in addition it also describes viscosity, thermal conductivity and diffusion, which thermodynamics cannot do. With the help of quantum mechanics we can, if desired, describe the motion of macroscopic bodies: after simplifications we obtain the same equations of motion as in Newtonian mechanics. But the behavior of micro-objects, for example electrons in atoms, can be described only quantum mechanically; attempts to apply classical mechanics produce meaningless and contradictory results.

Dynamic theory always plays the role of an approximation, a simplification of the corresponding statistical theory.

A statistical theory considers and takes into account fluctuations, random deviations from the average. If the situation is such that these deviations are insignificant, then, neglecting them, we obtain an approximate theory that describes the behavior of the average values, and this theory will already be dynamic.

For example, if we are interested in the air pressure on a window pane, then with good accuracy we can assume that all molecules move at the same speed: deviations upward and downward cancel each other out when the impacts of myriad molecules on the glass add up. Thermodynamics is applicable here. However, if we are interested in the rate at which planets lose their atmospheres, a statistical approach becomes necessary, because it is the fastest molecules, those whose speed exceeds the average, that escape into space, and here we cannot do without a statistical analysis of fluctuations.
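As an illustration of the kind of statistical calculation involved, the sketch below estimates the fraction of molecules whose speed exceeds a given threshold under the Maxwell speed distribution. The numerical inputs (nitrogen and hydrogen at 300 K, an escape speed of about 11.2 km/s) are illustrative assumptions added here, not data from the text; real atmospheric escape also depends on conditions in the upper atmosphere.

```python
import math

def fraction_above(speed, temperature_k, molar_mass_kg_per_mol):
    """Fraction of molecules faster than `speed` for a Maxwell speed distribution."""
    k_b = 1.380649e-23            # Boltzmann constant, J/K
    n_a = 6.02214076e23           # Avogadro constant, 1/mol
    m = molar_mass_kg_per_mol / n_a
    sigma = math.sqrt(k_b * temperature_k / m)
    x = speed / sigma
    # Cumulative distribution function of the Maxwell speed distribution
    cdf = math.erf(x / math.sqrt(2)) - math.sqrt(2 / math.pi) * x * math.exp(-x * x / 2)
    return 1.0 - cdf

# Illustrative assumptions: T = 300 K, escape speed ~ 11.2 km/s
print(fraction_above(11_200, 300, 28e-3))  # nitrogen: a vanishingly small fraction
print(fraction_above(11_200, 300, 2e-3))   # hydrogen: still tiny, but many orders of magnitude larger
```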

The characteristic magnitude of quantum fluctuations is determined by Planck's constant ħ. On the macroscopic scales familiar to us this value is negligibly small, so quantum fluctuations can be neglected and the motion of bodies can be described dynamically, using Newton's laws. However, on scales where Planck's constant is not small, Newtonian mechanics fails: it cannot take into account the quantum fluctuations that become significant. In other words, classical mechanics is suitable only when, without major error, one can set ħ = 0.

GENERAL THEORY OF STATISTICS

1.1. Subject, method, objectives and organization

Statistics is a science that studies the quantitative side of mass phenomena in inextricable connection with their qualitative side, the quantitative expression of the laws of social development.

Statistics as a science has five features.

The first feature of statistics is that it studies not individual facts but mass socio-economic phenomena and processes, which act as sets of individual facts possessing both individual and general characteristics. The task of statistical research is to obtain generalizing indicators and to identify the patterns of social life in specific conditions of place and time, patterns that manifest themselves only in a large mass of phenomena, through the overcoming of the randomness inherent in their individual elements.

The second feature of statistics is that it studies above all the quantitative side of social phenomena and processes, but, unlike mathematics, in specific conditions of place and time; that is, the subject of statistics is the size and quantitative relationships of socio-economic phenomena and the patterns of their connection and development. The qualitative nature of individual phenomena is usually determined by the related sciences.

The third feature of statistics is that it characterizes the structure, i.e. the internal composition, of mass phenomena (a statistical population) by means of statistical indicators.

The fourth feature of statistics is the study of changes in social phenomena in space and time. Changes in space (i.e., in statics) are revealed by analyzing the structure of a social phenomenon, while changes in time (i.e., in dynamics) are revealed by studying the level and structure of the phenomenon over time.

The fifth feature of statistics is the identification of cause-and-effect relationships between individual phenomena of social life.

Statistical methodology is understood as a system of techniques and methods aimed at studying the quantitative patterns manifested in the structure, dynamics and interrelations of socio-economic phenomena.

1.2. Statistical observation

The full cycle of statistical research includes the following stages:

1) collection of primary information (method of statistical observation);

2) preliminary data processing (grouping method, graphical method);

3) calculation and interpretation of individual and summary indicators (level, structure and variation, relationships and dynamics);

4) modeling and forecasting the relationship and dynamics of the processes and phenomena under study.

Statistical observation is a planned, systematic, scientifically based collection of data on the phenomena and processes of social life, carried out by recording their most important features in accordance with an observation program.

The statistical observation plan includes programmatic, methodological and organizational parts. The program and methodological part indicates: the purpose, objectives and program of observation, the object and unit of observation, a set of characteristics of the unit of observation and observation tools (instructions for conducting observation and a statistical form containing the program and results of observation). The organizational part indicates: place and time of observation; a list of institutions and organizations responsible for organizing and performing observations, training and placement of personnel; selection of methods and registration of information, list of preparatory activities, etc.

Statistical observations are classified according to the form, type and method of observation.

The most common forms of statistical observation are: reporting (of enterprises, organizations, institutions, etc.) and specially organized observations in order to obtain information not included in reporting (censuses, surveys, one-time records).

Types of observation are distinguished: by the time of observation (continuous, periodic and one-time) and by the completeness of coverage of units of the statistical population (continuous and non-continuous).

According to the methods of statistical observation, they are distinguished: direct, documentary observation and survey. In statistics, the following types of surveys are used: oral (expeditionary), self-registration (when the forms are filled out by the respondents themselves), correspondent, questionnaire and personal surveys, using modern computer technology.

The indicators used in economic-statistical analysis characterize certain categories and concepts, and the calculation of such indicators should be carried out through a theoretical analysis of the phenomenon being studied. Therefore, in each specific area of ​​application of statistics, its own system of statistical indicators is developed.

1.3. Methods of continuous and selective observation of socio-economic phenomena and processes

The task of continuous observation is to obtain information about all units of the population under study. When conducting continuous observation, an important task is therefore to formulate the list of characteristics to be recorded; the quality and reliability of the survey results ultimately depend on it.

Until recently, Russian statistics relied primarily on continuous observation. However, this type of observation has serious disadvantages: the high cost of obtaining and processing the entire volume of information; high labor costs; insufficient timeliness of the information, since it takes a long time to collect and process. Finally, even continuous observation, as a rule, does not provide complete coverage of all units of the population without exception: a greater or smaller number of units inevitably remain unobserved, both in one-time surveys and when information is obtained through such a form of observation as reporting.

For example, when a comprehensive statistical survey of small enterprises based on the results of their work in 2000 was conducted, completed forms (questionnaires) were received from 61% of the enterprises to which questionnaires had been sent. The reasons for non-response are summarized in Table 1.

Table 1

The number and proportion of units not covered depend on many factors: the type of survey (by mail, by oral interview); reporting unit type; registrar qualifications; the content of the questions provided for in the observation program; time of day or year of the survey, etc.

A partial survey initially assumes that only a portion of the units in the population being studied are subject to survey. When conducting it, it is necessary to determine in advance what part of the population should be subjected to observation and how to select those units that should be surveyed.

One of the advantages of non-continuous observations is the ability to obtain information in a shorter time and with less resources than with continuous observation. This is due to a smaller volume of collected information, and therefore lower costs for its acquisition, verification, processing and analysis.

There are many types of non-continuous observation. One of them is sample observation, in which characteristics are recorded for individual units of the population under study, selected by special methods, and the results obtained in the survey are extended, with a certain level of probability, to the entire original population.

The advantages of sample observation are achieved through:

1) saving financial resources spent on data collection and processing,

2) saving material and technical resources (stationery, office equipment, consumables, transport services, etc.),

3) saving labor resources involved at all stages of sample observation,

4) reducing the time spent both on obtaining primary information and on its subsequent processing until the publication of the final materials.

The main problem when conducting a sample study is how confidently one can judge the actual properties of the general population from the properties of the selected objects. Any such judgment inevitably has a probabilistic nature, and the task comes down to ensuring the greatest possible probability that the judgment is correct.
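A minimal sketch of such a probabilistic judgment, using hypothetical data: a sample mean is computed and an approximate 95% confidence interval for the general-population mean is attached to it (normal approximation).

```python
import math
import random
import statistics

random.seed(1)
# Hypothetical general population of 100 000 values
population = [random.gauss(100, 15) for _ in range(100_000)]

sample = random.sample(population, 400)        # non-repeated (without-replacement) selection
mean = statistics.mean(sample)
std_err = statistics.stdev(sample) / math.sqrt(len(sample))

# Approximate 95% confidence interval for the general-population mean
low, high = mean - 1.96 * std_err, mean + 1.96 * std_err
print(f"sample mean {mean:.2f}, 95% interval [{low:.2f}; {high:.2f}]")
print(f"actual population mean {statistics.mean(population):.2f}")
```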

The population from which the selection is made is called the general population. The selected part of the data is called the sample population, or sample. In order for the sample to reflect the properties of the general population fully and adequately, it must be representative. Representativeness of the sample is ensured only if the selection of data is objective.

There are two types of selection: repeated (with replacement) and non-repeated (without replacement) sampling.

In repeated selection, the probability of each individual unit being included in the sample remains constant, because after selection the chosen unit is returned to the population and can be selected again (the "returned ball" scheme).

In non-repeated selection, the chosen unit is not returned, so the probability of the remaining units getting into the sample changes all the time (the "unreturned ball" scheme). A code sketch of both schemes follows.
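In Python terms the two schemes can be sketched as follows: random.choices draws with replacement (repeated selection), random.sample draws without replacement (non-repeated selection). The list of units is hypothetical.

```python
import random

units = list(range(1, 21))   # hypothetical general population of 20 units

with_replacement = random.choices(units, k=5)    # repeated selection: a unit may be drawn twice
without_replacement = random.sample(units, k=5)  # non-repeated selection: each unit at most once

print(with_replacement)
print(without_replacement)
```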

The following ways of selecting units from the general population are distinguished:

a) individual selection, when individual units are selected for the sample,

b) group selection, when the sample includes qualitatively homogeneous groups or series of studied units,

c) combined selection, which is a combination of the first two methods.

The following methods of selecting units to form the sample population are possible (a sketch of several of them follows the list):

1) random (unintentional) selection, when the sample population is formed by drawing lots or using a table of random numbers,

2) mechanical selection, when the sample population is determined from the general population divided into equal intervals (groups),

3) typical (stratified) selection, with a preliminary division of the general population into qualitatively homogeneous typical groups (not necessarily equal),

4) serial or cluster selection, when not individual units, but series are selected from the general population, and within each series included in the sample, all units without exception are examined.
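A hedged sketch of three of the listed methods on a hypothetical list of units: simple random selection, mechanical (systematic) selection of every k-th unit, and typical (stratified) selection proportional to group sizes.

```python
import random

random.seed(2)
# Hypothetical general population: (unit id, group label)
population = [(i, "A" if i % 3 else "B") for i in range(1, 301)]
n = 30                                   # required sample size

# 1) Random selection: drawing lots / random numbers
random_sample = random.sample(population, n)

# 2) Mechanical (systematic) selection: every k-th unit after a random start
k = len(population) // n
start = random.randrange(k)
mechanical_sample = population[start::k][:n]

# 3) Typical (stratified) selection: proportional draws from each group
groups = {}
for unit in population:
    groups.setdefault(unit[1], []).append(unit)
typical_sample = []
for label, members in groups.items():
    share = round(n * len(members) / len(population))
    typical_sample.extend(random.sample(members, share))

print(len(random_sample), len(mechanical_sample), len(typical_sample))
```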

1.4. Statistical groupings

One of the main and most common methods of processing and analyzing primary statistical information is grouping. The concept of statistical grouping in the broad sense of the word covers a whole range of statistical operations. First of all, these include combining the individual cases recorded during observation into groups that are similar in one respect or another, since the holistic characteristics of the population must be combined with the characteristics of its main parts, classes, etc. The results of the summary and grouping of statistical observation data are presented in the form of statistical distribution series and tables.

The significance of groupings lies in the fact that this method, firstly, provides systematization and generalization of observation results, and secondly, the grouping method is the basis for the use of other methods of statistical analysis of the main aspects and characteristic features of the phenomena being studied.

The purpose of statistical grouping is to divide population units into a number of groups for the calculation and analysis of generalizing group indicators, which make it possible to obtain an idea of ​​the composition, structure and relationships of the object or phenomenon being studied.

Generalizing statistical indicators characterizing each selected group can be presented in the form of absolute, relative and average values.

Table 2 summarizes the various types of statistical groupings, which differ according to the grouping task:

Table 2

The basis of a grouping is formed by the grouping characteristics, according to which the units of the population under study are assigned to particular groups. If the grouping is performed according to one characteristic, it is considered simple; if according to two or more characteristics, it is combinational (or combined).
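A small illustration of the two cases, using hypothetical records: a simple grouping by one characteristic (form of ownership) and a combinational grouping by two characteristics (form of ownership and size class), with the group size and a group average computed for each group.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical units of observation: (form of ownership, size class, output)
units = [
    ("private", "small", 12), ("private", "large", 80),
    ("state",   "small", 15), ("state",   "large", 95),
    ("private", "small", 10), ("state",   "large", 90),
]

simple = defaultdict(list)          # simple grouping: one grouping characteristic
combinational = defaultdict(list)   # combinational grouping: two characteristics
for ownership, size, output in units:
    simple[ownership].append(output)
    combinational[(ownership, size)].append(output)

for group, values in simple.items():
    print(group, "count:", len(values), "average output:", round(mean(values), 1))
for group, values in combinational.items():
    print(group, "count:", len(values), "average output:", round(mean(values), 1))
```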

A primary grouping is one formed on the basis of the primary data collected in the process of statistical observation.

A secondary grouping is performed on the basis of a primary grouping when it is necessary to obtain a smaller number of larger groups, or to bring data grouped with different interval sizes into a comparable form for the purpose of comparison.

The classification and characteristics of the grouping characteristics are presented in Table 3.

The tasks of typological grouping, which usually involves the division of a heterogeneous population into qualitatively homogeneous groups, are closely related to two other grouping tasks: the study of the structure and structural shifts in the homogeneous population under study and the identification of the relationship of individual features of the phenomenon under study in it.

Examples of typological groupings include the grouping of economic objects by type of ownership, the division of the economically active population into employed and unemployed, and workers into those engaged primarily in physical and mental labor.

The methodology of typological groupings is determined by how clearly the qualitative differences between the phenomena being studied are manifested (see the example after Table 3).

Table 3

By content (essence):
– essential characteristics express the main content of the phenomena being studied;
– minor characteristics are important for describing the phenomena being studied but are not classified as essential.
By the possibility of quantitative measurement:
– quantitative characteristics reflect a property of the phenomenon that can be measured; discrete (discontinuous) quantitative characteristics are expressed only as whole numbers, continuous ones as whole or fractional numbers;
– attributive (qualitative) characteristics cannot be measured quantitatively and are recorded in text form; an alternative characteristic occurs in only two mutually exclusive variants (either/or).

For example, when industries are grouped by the economic purpose of their products, industries producing means of production and industries producing consumer goods are distinguished; in the macrostructure of retail trade turnover, production and non-production goods are distinguished. In most cases, however, qualitative differences between phenomena do not appear so clearly. For example, distinguishing large, medium and small enterprises within industries is a methodologically rather complex problem.

1.5. Methods for processing and analyzing statistical information

In the process of statistical observation, data is obtained on the values ​​of certain characteristics that characterize each unit of the population under study. To characterize the population as a whole or its parts, data on individual units of the population are summarized and, as a result, generalized indicators are obtained, which reflect the results of knowledge of the quantitative side of the phenomena being studied.

A statistical indicator is a generalizing quantitative-qualitative measure that characterizes socio-economic phenomena and processes.

Individual values observed for units of a population are characteristics, while a quantitative-qualitative measure of some property of a population (or group) is a statistical indicator. For example, the academic performance of a particular student is a characteristic; the average academic performance of the students of a university is an indicator.

Summary indicators can be presented as absolute, relative and average values, which are widely used in planning and in analyzing the activities of enterprises and firms, of industries and of the economy as a whole.

Absolute indicators are obtained by summing the primary data. They can be individual and general (total). Individual absolute values express the size of a quantitative characteristic for individual units of the population being studied. General and group absolute values are the total and group quantitative characteristics of the attributes. Absolute values characterize the absolute dimensions of the phenomena being studied: volume, mass, area, length, etc. Absolute indicators are always named numbers (they have units of measurement), which can be natural, conditionally natural (for comparing homogeneous but different-quality products, physical units are converted into conventional units using special coefficients) or cost (monetary) units.

To compare absolute values with one another in time, in space and in other respects, relative values are used, i.e. generalizing indicators expressing the quantitative relation of two absolute values to each other.

Relative values can be the result of comparison (a small numeric illustration follows the list):

- statistical indicators of the same name (comparison with a past period gives relative values of dynamics and of plan targets; comparison with the plan gives relative values of plan fulfilment; comparison of parts with the whole or of parts with one another gives relative values of structure and of coordination, respectively; comparison across space gives relative values of spatial comparison);

– different statistical indicators (relative intensity values).
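A small numeric illustration, with hypothetical figures, of two of the relative values just listed: a relative value of dynamics (the current level compared with the previous period) and relative values of structure (the shares of parts in the whole).

```python
# Hypothetical output of an enterprise, million rubles
output_previous_year = 200.0
output_current_year = 230.0

# Relative value of dynamics: current level compared with the previous period
dynamics = output_current_year / output_previous_year * 100          # 115.0 %

# Relative values of structure: share of each part in the whole
parts = {"goods": 150.0, "services": 80.0}
total = sum(parts.values())
structure = {name: value / total * 100 for name, value in parts.items()}

print(f"dynamics: {dynamics:.1f}%")
print({name: f"{share:.1f}%" for name, share in structure.items()})
```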

1.5.1. Method of averages

The average value is a generalizing indicator expressing the typical level of a characteristic, i.e. the level characteristic of the majority of units in the population. The method of averages makes it possible to replace a large number of varying values of a characteristic by one averaged value.

Averages are divided into power averages and structural averages.

Formulas for calculating the power averages are presented in Table 4.

Table 4 uses the following notation: x_i is the value of the characteristic for the i-th unit of the population (or the i-th variant of the characteristic for the weighted average); n is the volume of the population; f_i is the weight (frequency) of the i-th variant; k is the number of variants of the characteristic being averaged.

Whether the unweighted (simple) or the weighted average is used depends on whether the variants of the characteristic are repeated:

Table 4

Type of average: formula for the unweighted (simple) average; formula for the weighted average
Arithmetic mean: x̄ = Σx_i / n; x̄ = Σ(x_i · f_i) / Σf_i
Harmonic mean: x̄ = n / Σ(1/x_i); x̄ = Σf_i / Σ(f_i / x_i)
Geometric mean: x̄ = (x_1 · x_2 · … · x_n)^(1/n); x̄ = (Π x_i^f_i)^(1/Σf_i)
Mean square: x̄ = √(Σx_i² / n); x̄ = √(Σ(x_i² · f_i) / Σf_i)
Mean cubic: x̄ = (Σx_i³ / n)^(1/3); x̄ = (Σ(x_i³ · f_i) / Σf_i)^(1/3)

– if there are no such repetitions, or individual variants are repeated only a limited number of times, the unweighted average is used;

– if all or almost all variants are repeated many times, the weighted average is used (see the sketch below).
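A short sketch contrasting the unweighted and weighted arithmetic means from Table 4; the values and frequencies are hypothetical.

```python
from statistics import fmean

# Ungrouped observations: each value occurs once or only a few times
values = [3.0, 4.0, 4.0, 5.0, 2.0]
unweighted_mean = fmean(values)                    # sum(x_i) / n

# Grouped data: each variant x_i is repeated f_i times
variants = [2.0, 3.0, 4.0, 5.0]
frequencies = [10, 25, 45, 20]
weighted_mean = sum(x * f for x, f in zip(variants, frequencies)) / sum(frequencies)

print(round(unweighted_mean, 2), round(weighted_mean, 2))
```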

Calculation of average values ​​is used when:

– assessing the characteristics of a typical level for a given population;

– comparison of typical levels for two or more populations;

– calculating the norm when establishing plan targets and contractual obligations.

In practice, the arithmetic mean is most often used. The harmonic mean is used in cases where the numerator is known, but the denominator of the original mean ratio is unknown. Basically, the geometric mean is used to average individual indicators over time. Power averages of the second and higher orders are used when calculating indicators of variation, correlation, structural changes, asymmetry and kurtosis.
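A sketch of the two cases mentioned above, with hypothetical numbers: the harmonic mean when only the numerators of the underlying ratio are recorded (for instance, the same sum of money is spent at each of several prices and the quantities bought are unknown), and the geometric mean for averaging growth coefficients over time.

```python
from statistics import geometric_mean, harmonic_mean

# Harmonic mean: the same sum (say 100 rubles) is spent at each of three prices;
# quantities (denominators) are not recorded, so the average price is the harmonic mean.
prices = [20.0, 25.0, 50.0]
average_price = harmonic_mean(prices)       # equals total spend / total quantity

# Geometric mean: average annual growth coefficient over three years
growth_coefficients = [1.05, 1.12, 1.02]
average_growth = geometric_mean(growth_coefficients)

print(round(average_price, 2), round(average_growth, 4))
```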

Structural averages include two main characteristics of the variation series of a distribution – mode and median.

The mode is the value of the characteristic that occurs most often in a given population, i.e. it reflects the value of the characteristic that is the most typical, predominant, dominant one. With a large number of observations, a population may be characterized by two or more modal variants.

The median is the variant of the characteristic being studied that divides the ranked series of data into two equal parts: 50% of the units of the population under study have values of the characteristic less than the median, and 50% have values greater than the median.

When determining the median from ungrouped (primary) data, you first need to arrange them in ascending order (rank). Then you need to determine the “position” of the median or determine the number of the unit whose attribute value will correspond to the median:

N_Me = (n + 1) / 2, where n is the number of units in the population under study.
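A minimal example of finding the median and the mode of ungrouped data; the observations are hypothetical.

```python
from statistics import median, mode

observations = [5, 3, 4, 5, 2, 5, 4, 3, 4, 5]   # hypothetical ungrouped data

ranked = sorted(observations)                   # ranking the primary data
median_position = (len(ranked) + 1) / 2         # "position" of the median, here 5.5
print(ranked)
print("median position:", median_position)
print("median:", median(observations))          # average of the 5th and 6th ranked values
print("mode:", mode(observations))              # the most frequently occurring value
```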

1.5.2. Variational analysis

Variation is the difference in the individual values of a characteristic among the units of the population being studied. Variation indicators make it possible to assess:

Dispersion of attribute values ​​among units of a statistical population;

Stability of development of the processes under study over time;

The influence of a factor characteristic on changes in the performance characteristic;

Various types of risks (insurance, systematic, etc.).

There are absolute and relative indicators of variation. The absolute measures of variation include: the range of variation, the average linear deviation, the dispersion and the standard deviation. The formulas for calculating these indicators are summarized in Table 5.

Table 5

Indicator: formula for ungrouped data; formula for grouped data
Range of variation (oscillation): R = x_max − x_min (the same in both cases)
Average linear deviation: d̄ = Σ|x_i − x̄| / n; d̄ = Σ(|x_i − x̄| · f_i) / Σf_i
Dispersion: σ² = Σ(x_i − x̄)² / n; σ² = Σ((x_i − x̄)² · f_i) / Σf_i
Standard deviation: σ = √σ² (the square root of the dispersion in both cases)

where x_i is the value of the characteristic; x_max and x_min are, respectively, the maximum and minimum values of the characteristic in the population; x̄ is the arithmetic mean; n is the volume of the population; f_i is the weight (frequency) of the i-th variant.

Determining the range of variation is a necessary stage in grouping primary statistical information. However, this variation indicator has two significant drawbacks: a) it depends strongly on extreme, anomalous values of the characteristic, and b) it does not take into account the "internal" variation between the boundaries set by the maximum and minimum values. It therefore does not provide an exhaustive description of variation.

The indicator of average linear deviation gives a generalized characteristic of the degree of dispersion of a characteristic in the aggregate, however, it is less often used compared to dispersion and standard deviation, since when calculating it one has to make actions that are incorrect from a mathematical point of view and violate the laws of algebra.

The dispersion is expressed in the square of the units in which the characteristic is measured, so this indicator is rather difficult to interpret directly. For this reason the standard deviation has been introduced, which is measured in the same units as the individual values of the characteristic.

Relative indicators of variation are calculated as percentages (relative to the arithmetic mean or median of the series). The following relative measures of variation are used in statistics:

1) the oscillation coefficient, K_R = R / x̄ · 100%,

shows the relative spread of extreme values ​​of characteristics around the arithmetic mean;

2) the relative linear deviation, K_d = d̄ / x̄ · 100%,

characterizes the share of the average value of absolute deviations from the arithmetic mean;

3) the coefficient of variation, V = σ / x̄ · 100%,

most often used, as it characterizes the degree of homogeneity of the population. The population is considered homogeneous if the coefficient of variation does not exceed 33% (for distributions close to normal).
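A sketch computing the absolute and relative variation indicators listed above for hypothetical ungrouped data, including the 33% homogeneity check based on the coefficient of variation.

```python
from statistics import fmean, pstdev

x = [12.0, 14.0, 15.0, 15.0, 16.0, 18.0, 20.0]   # hypothetical ungrouped data

x_mean = fmean(x)
r = max(x) - min(x)                               # range of variation
d = fmean([abs(v - x_mean) for v in x])           # average linear deviation
sigma = pstdev(x)                                 # standard deviation
variance = sigma ** 2                             # dispersion

oscillation = r / x_mean * 100                    # oscillation coefficient, %
relative_linear = d / x_mean * 100                # relative linear deviation, %
variation = sigma / x_mean * 100                  # coefficient of variation, %

print(round(variance, 2), round(sigma, 2))
print(round(oscillation, 1), round(relative_linear, 1), round(variation, 1))
print("homogeneous population" if variation <= 33 else "heterogeneous population")
```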

1.5.3. Correlation analysis

The most important task of the general theory of statistics is to study objectively existing connections between phenomena. In the process of statistical research, cause-and-effect relationships between phenomena are clarified, which makes it possible to identify factors (signs) that have a significant impact on the variation of the phenomena and processes being studied.

In statistics, a distinction is made between functional connection and stochastic dependence. A functional relationship is one in which a given value of the factor characteristic corresponds to one and only one value of the resultant characteristic. Such a connection manifests itself in every case of observation and for every individual unit of the population under study.

If a causal relationship does not appear in each individual case, but only in general, on average over a large number of observations, such a relationship is called stochastic. A special case of a stochastic relationship is a correlation relationship, in which a change in the average value of the resultant characteristic is caused by changes in the factor characteristics.

When specific dependencies are studied, some characteristics act as factors determining changes in other characteristics. The characteristics of the first group are called factor characteristics, and the characteristics that result from the influence of these factors are called resultant characteristics.

Statistics does not always require a quantitative assessment of the relationship; often it is important to determine only its direction and nature, to identify the form of the influence of some factors on others. One of the main methods of identifying the presence of a connection is the correlation method, which aims to quantify the closeness of the relationship between two characteristics (in a pairwise relationship) or between the resultant characteristic and several factor characteristics (in a multifactorial relationship).

Correlation is a statistical relationship between random variables that do not have a strictly functional nature, in which a change in one of the random variables leads to a change in the mathematical expectation of the other.

In statistics, the following dependency options are distinguished:

1) pair correlation – the connection between two characteristics (the resultant and a factor characteristic, or two factor characteristics);

2) partial correlation – the dependence between the resultant characteristic and one factor characteristic with the values of the other factor characteristics fixed;

3) multiple correlation - the dependence of the resultant and two or more factor characteristics included in the study.

The main method for identifying the presence of a correlation is the method of analytical grouping and determining group averages. It consists in the fact that all units of the population are divided into groups according to the value of the factor characteristic and for each group the average value of the resulting characteristic is determined.
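A sketch of both approaches on hypothetical paired observations: group averages of the resultant characteristic over intervals of the factor characteristic (a crude analytical grouping), and the Pearson linear correlation coefficient for the pairwise relationship.

```python
from collections import defaultdict
from statistics import fmean, pstdev

# Hypothetical paired observations: x is the factor, y the resultant characteristic
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [2.1, 2.9, 3.2, 4.8, 5.1, 5.9, 7.2, 7.8]

# Analytical grouping: average of y within intervals of x
groups = defaultdict(list)
for xi, yi in zip(x, y):
    interval = "x <= 4" if xi <= 4 else "x > 4"
    groups[interval].append(yi)
for interval, values in groups.items():
    print(interval, "average y:", round(fmean(values), 2))

# Pearson linear correlation coefficient
mx, my = fmean(x), fmean(y)
cov = fmean([(xi - mx) * (yi - my) for xi, yi in zip(x, y)])
r = cov / (pstdev(x) * pstdev(y))
print("r =", round(r, 3))
```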
