Researcher: Fahimeh Masihi Bidgoli

Entropy of Unequal Probability Sampling Designs
M.Sc. thesis, Ministry of Science, Research and Technology - Isfahan University of Technology - Faculty of Mathematics, 1390
Fahimeh Masihi Bidgoli   Soroush Alimoradi

The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some variable of interest. A simple way to take a sample of size n is to let all possible samples have the same probability of being selected. This is called simple random sampling, and then all units have the same probability of being chosen. When units vary considerably in size, simple random sampling does not seem to be an appropriate procedure, since it does not take into account the possible importance of the size of the units. In these circumstances, selecting units with unequal probabilities is suitable. When the units in the population do not have the same probabilities of being included in a sample, the sampling is called unequal probability sampling. When unequal probability sampling is applicable, it generally gives much better estimates than sampling with equal probabilities.

A random sample is selected according to some specified random mechanism called the sampling design. For unequal probability sampling there exist many different sampling designs, such as Poisson, conditional Poisson, Sampford, Pareto, and splitting sampling. A sampling design without replacement in which the inclusion probabilities are proportional to the size of an auxiliary variable is called a πps sampling design. The choice of sampling design is important, since it determines the properties of the estimator that is used. The comparison of different designs is therefore a central problem in sampling.

In this thesis, entropy, which is a measure of the level of randomization of a design, is used to compare πps designs. In general, a sampling design should have a high level of randomization. A design called adjusted conditional Poisson has maximum entropy among all fixed-size πps designs, but in this thesis it is shown, using different populations, that several πps designs are close in terms of entropy. The top four designs are adjusted conditional Poisson, adjusted Pareto, a design called Brewer's method, and the Sampford design. A few designs yield low entropy and should therefore in general be avoided. In order to compare different designs, it is also possible to look at some measure of the distance between designs; one such measure is the Hellinger distance. Some of the designs with high entropy are compared, and it is shown that their probability functions are close to each other.
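For reference, the two quantities used in the comparisons above can be stated in their standard form from sampling theory. Writing p(s) for the probability that a design p selects the sample s from the set S of all possible samples, the entropy of the design is the Shannon entropy of its probability function; the Hellinger distance between two designs p and q is given below with one common normalization, which may differ from the constant used in the thesis.

\[
H(p) \;=\; -\sum_{s \in \mathcal{S}} p(s)\,\log p(s), \qquad \text{with the convention } 0\log 0 := 0,
\]
\[
d_H(p, q) \;=\; \Biggl(\frac{1}{2}\sum_{s \in \mathcal{S}} \Bigl(\sqrt{p(s)} - \sqrt{q(s)}\Bigr)^{2}\Biggr)^{1/2}.
\]

A design concentrating its probability on a few samples has low entropy, while a design spreading probability over many samples has high entropy; two designs whose probability functions nearly coincide have a Hellinger distance close to zero.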