Search results for: estimates were obtained with restricted maximum likelihood method via AI algorithm
Number of results: 10,779,391
This study investigates the missing data problem in the Japan Meteorological Agency catalog of the Kumamoto aftershock sequence, which began on April 15, 2016, in Japan. Based on the assumption that earthquake magnitudes are independent of their occurrence times, we replenish the short-term missing records of small earthquakes using a bi-scale transformation and study their influence on t...
The issue of normalization arises whenever two different values of a vector of unknown parameters imply the identical economic model. A normalization not only supplies a rule for selecting which among equivalent points to call the maximum likelihood estimate (MLE); it also governs the topography of the set of points that go into a small-sample confidence interval associated with that MLE. A po...
A model-based multiple imputation approach for analyzing sample data with non-detects is proposed. The imputation approach involves randomly generating observations below the detection limit using the detected sample values and then analyzing the data using complete sample techniques, along with suitable adjustments to account for the imputation. The method is described for the normal case and ...
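The imputation step can be sketched in a few lines. This is a minimal illustration, not the paper's exact procedure: it fits a normal to the detected values only (a crude stand-in for a proper censored-data fit) and imputes each non-detect by rejection sampling below the detection limit.

```python
import random
import statistics

def impute_nondetects(detected, n_nondetect, det_limit, m=20, seed=0):
    """Toy multiple imputation for left-censored (non-detect) data.

    Fits a normal to the detected sample (a simplification; a proper
    analysis would use a censored-data MLE), then draws each non-detect
    from that normal truncated above at the detection limit via
    rejection sampling. Returns the mean pooled over m imputed datasets.
    """
    rng = random.Random(seed)
    mu = statistics.mean(detected)
    sigma = statistics.stdev(detected)
    pooled_means = []
    for _ in range(m):
        imputed = []
        while len(imputed) < n_nondetect:
            x = rng.gauss(mu, sigma)
            if x < det_limit:  # keep only draws below the detection limit
                imputed.append(x)
        pooled_means.append(statistics.mean(detected + imputed))
    return statistics.mean(pooled_means)
```

Note that rejection sampling becomes slow when the detection limit sits far in the lower tail of the fitted normal; inverse-CDF sampling from the truncated normal would then be preferable.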
A Bayesian Nominal Regression Model with Random Effects for Analysing Tehran Labor Force Survey Data
Large survey data are often accompanied by sampling weights that reflect the unequal probabilities of selecting samples in complex sampling designs. Sampling weights act as an expansion factor that, by scaling the subjects, turns the sample into a representative of the population. The quasi-maximum likelihood method is one of the approaches for incorporating sampling weights in the frequentist framewo...
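As a minimal illustration of the weighted (pseudo-) likelihood idea, assuming a toy Bernoulli model rather than the survey regression models of the paper, the weighted MLE has a closed form:

```python
def weighted_bernoulli_mle(y, w):
    """Pseudo-maximum-likelihood estimate of a Bernoulli success
    probability p under sampling weights w.

    Maximizes the weighted log-likelihood
        sum_i w_i * (y_i*log(p) + (1 - y_i)*log(1 - p)),
    whose maximizer is the weighted mean of the responses: each subject
    counts in proportion to the population units it represents.
    """
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
```

With unit weights this reduces to the ordinary sample proportion, which is the unweighted MLE.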
This report summarizes the theory and some main applications of a new non-monotonic algorithm for maximizing a Poisson Likelihood, which for Positron Emission Tomography (PET) is equivalent to minimizing the associated Kullback-Leibler Divergence, and for Transmission Tomography is similar to maximizing the dual of a maximum entropy problem. We call our method non-monotonic maximum likelihood (...
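For context, the classic monotonic MLEM (Shepp-Vardi) update that maximizes the Poisson likelihood looks as follows; the report's algorithm is a non-monotonic variant of this kind of scheme, and this sketch shows only the standard baseline.

```python
def mlem_step(lam, A, y):
    """One Shepp-Vardi MLEM update for Poisson counts y ~ Poisson(A @ lam).

    lam: current nonnegative image estimate (length J)
    A:   system matrix as rows A[i][j] >= 0 (I x J)
    y:   measured counts (length I)
    Each step increases the Poisson log-likelihood, i.e. decreases the
    Kullback-Leibler divergence between y and the forward projection.
    """
    I, J = len(y), len(lam)
    proj = [sum(A[i][j] * lam[j] for j in range(J)) for i in range(I)]
    sens = [sum(A[i][j] for i in range(I)) for j in range(J)]
    return [
        lam[j] / sens[j] * sum(A[i][j] * y[i] / proj[i] for i in range(I))
        for j in range(J)
    ]
```

With an identity system matrix the update converges in a single step, since the MLE of a Poisson mean is the observed count.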
The data cloning method is a new computational tool for computing maximum likelihood estimates in complex statistical models such as mixed models. Here it is combined with integrated nested Laplace approximation to compute maximum likelihood estimates efficiently via a fast implementation in generalized linear mixed models. Asymptotic normality of the hybrid data cloning based di...
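The data-cloning idea can be seen in a toy conjugate Beta-Bernoulli model (an illustration only; the paper applies the idea with INLA inside generalized linear mixed models): cloning the data k times makes the posterior collapse onto the MLE.

```python
def data_cloning_posterior_mean(successes, n, k, a=2.0, b=2.0):
    """Posterior mean for a Bernoulli p with a Beta(a, b) prior after
    cloning the data k times.

    Beta-Bernoulli conjugacy gives the k-cloned posterior in closed form
    as Beta(a + k*s, b + k*(n - s)). As k grows the posterior mean tends
    to the MLE s/n, which is the point of the data-cloning trick:
    Bayesian machinery (here a toy conjugate model, in practice MCMC or
    INLA) is used to recover maximum likelihood estimates.
    """
    return (a + k * successes) / (a + b + k * n)
```

For example, with 7 successes in 10 trials the posterior mean moves from 9/14 at k = 1 toward the MLE 0.7 as k increases.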
We present a novel, fast, and exact algorithm to compute maximum likelihood estimates of the number of defects initially contained in a software system, using the hypergeometric software reliability model. The algorithm is based on a rigorous and comprehensive mathematical analysis of the growth behavior of the likelihood function for the hypergeometric model. We also study a numerical example taken fr...
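A sketch of maximizing a hypergeometric likelihood over an unknown population size is given below, using the classical two-round capture-recapture setup as a stand-in; the report's software reliability model is more elaborate (it accumulates counts over many test rounds), so this shows only the general style of computation.

```python
from math import lgamma

def mle_population_size(n1, n2, m, n_max):
    """MLE of a population size N by scanning a hypergeometric
    likelihood: n1 items marked, n2 sampled later, m of the sample
    found marked. Log-binomials via lgamma avoid huge integers.
    The likelihood L(N) = C(n1, m) * C(N - n1, n2 - m) / C(N, n2)
    is unimodal in N, so a scan finds the maximizer.
    """
    def log_comb(n, k):
        return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

    def loglik(N):
        # The constant factor C(n1, m) is dropped: it does not depend on N.
        return log_comb(N - n1, n2 - m) - log_comb(N, n2)

    return max(range(n1 + n2 - m, n_max + 1), key=loglik)
```

The scan reproduces the known closed form for this setup, the integer part of n1*n2/m.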
A new two-parameter distribution with decreasing failure rate is introduced. Various properties of the introduced distribution are discussed. The EM algorithm is used to determine the maximum likelihood estimates and the asymptotic variances and covariance of these estimates are obtained. Simulation studies are performed in order to assess the accuracy of the approximation of the variances and ...
The performance of many traffic control strategies depends on how accurately the traffic flow models are calibrated. One of the most widely applied traffic flow models in traffic control and management is the LWR or METANET model. In practice, key parameters of the LWR model, including free-flow speed and critical density, are calibrated using flow and speed measurements gathered by inductive loop d...
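A minimal calibration sketch, assuming the simple Greenshields fundamental diagram rather than the full LWR/METANET setup: density is recovered from flow and speed as rho = q/v, and free-flow speed and critical density then follow from an ordinary least-squares line fit.

```python
def calibrate_greenshields(flows, speeds):
    """Estimate free-flow speed and critical density from loop-detector
    flow (veh/h) and speed (km/h) measurements, assuming the Greenshields
    fundamental diagram v = vf * (1 - rho / rho_jam).

    Density is recovered as rho = q / v; the linear relation
    v = vf - (vf / rho_jam) * rho is then fit by least squares.
    Under Greenshields, flow peaks at the critical density rho_jam / 2.
    """
    rho = [q / v for q, v in zip(flows, speeds)]
    n = len(rho)
    mx = sum(rho) / n
    mv = sum(speeds) / n
    sxy = sum((x - mx) * (y - mv) for x, y in zip(rho, speeds))
    sxx = sum((x - mx) ** 2 for x in rho)
    slope = sxy / sxx            # = -vf / rho_jam
    vf = mv - slope * mx         # intercept = free-flow speed
    rho_jam = -vf / slope
    return vf, rho_jam / 2.0     # (free-flow speed, critical density)
```

Real detector data are noisy and rarely Greenshields-shaped, so in practice a robust or nonlinear fit over a richer fundamental diagram would replace the straight-line regression.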