Abstract: True-value theory (Bechtel, 2010), as an extension of randomization theory, allows arbitrary measurement errors to pervade a survey score as well as its predictor scores. This implies that true scores need not be expectations of observed scores and that expected errors need not be zero within a respondent. Rather, weaker assumptions about measurement errors over respondents enable the regression of true scores on true predictor scores. The present paper incorporates Särndal-Lundström (2005) weight calibration into true-value regression. This correction for non-response is illustrated with data from the fourth round of the European Social Survey (ESS). The results show that a true-value regression coefficient can be corrected even with a severely unrepresentative sample. They also demonstrate that this regression slope is attenuated more by measurement error than by non-response. Substantively, this ESS analysis establishes economic anxiety as an important predictor of quality of life in the financially stressful year of 2008.
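The weight-calibration step can be sketched numerically. Below is a minimal illustration of linear (GREG-type) calibration in the spirit of Särndal-Lundström: design weights are adjusted so that the weighted sample reproduces known population totals of an auxiliary variable. The sample, design weights, and population totals are all invented for the example, not ESS data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical respondent sample: design weights d_i and auxiliary
# variables x_i (intercept plus a 0/1 group indicator) whose
# population totals t_x are assumed known from external sources.
n = 200
d = np.full(n, 5.0)                      # design weights (N/n for SRS)
x = np.column_stack([np.ones(n), rng.integers(0, 2, n).astype(float)])
t_x = np.array([1000.0, 620.0])          # assumed known population totals

# Linear calibration: w_i = d_i * (1 + x_i' lam), where lam solves the
# calibration equations  sum_i w_i x_i = t_x.
A = (d[:, None] * x).T @ x               # sum_i d_i x_i x_i'
lam = np.linalg.solve(A, t_x - d @ x)    # non-response correction factor
w = d * (1.0 + x @ lam)

# The calibrated weights now reproduce the known totals exactly.
print(np.allclose(w @ x, t_x))  # True
```

The calibrated weights would then replace the design weights in the true-value regression, down- or up-weighting respondents so the sample matches the population on the auxiliaries.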
Abstract: Markov chain Monte Carlo (MCMC) simulation techniques enable the application of Bayesian methods to a variety of models where the posterior density of interest is too difficult to explore analytically. In practice, however, multivariate posterior densities often have characteristics which make implementation of MCMC methods more difficult. A number of techniques have been explored to help speed the convergence of a Markov chain. This paper presents a new algorithm which employs some of these techniques for cases where the target density is bounded. The algorithm is tested on several known distributions to empirically examine convergence properties. It is then applied to a wildlife disease model to demonstrate real-world applicability.
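As a generic illustration of sampling a bounded target, here is a minimal random-walk Metropolis sketch that reflects out-of-range proposals back into the support, so no proposal mass is wasted outside the bounds. This is one common convergence-speeding trick for bounded densities, not necessarily the algorithm the paper proposes; the Beta(2, 5)-shaped target and all tuning constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def target(x):
    # A bounded target on [0, 1]: Beta(2, 5) density up to a constant.
    return x * (1 - x) ** 4 if 0.0 <= x <= 1.0 else 0.0

def reflect(x, lo=0.0, hi=1.0):
    # Fold an out-of-range proposal back into [lo, hi].  Reflection
    # keeps the proposal kernel symmetric, so the plain Metropolis
    # acceptance ratio still applies.
    while x < lo or x > hi:
        if x < lo:
            x = 2 * lo - x
        if x > hi:
            x = 2 * hi - x
    return x

def metropolis(n_iter=20000, step=0.2):
    x = 0.5
    out = np.empty(n_iter)
    for i in range(n_iter):
        prop = reflect(x + step * rng.normal())
        if rng.random() < target(prop) / target(x):
            x = prop
        out[i] = x
    return out

draws = metropolis()
print(round(draws[5000:].mean(), 2))  # close to the Beta(2,5) mean 2/7
```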
Abstract: In this article, a Bayesian model averaging approach for hierarchical log-linear models is considered. Posterior model probabilities are calculated approximately for hierarchical log-linear models. The dimension of the model space of interest is reduced using the Occam's window and Occam's razor approaches. Road traffic accident data from Turkey for 2002 are analyzed using the considered approach.
Abstract: The detection of slope change points in wind curves depends on linear curve-fitting. Hall and Titterington's smoothing-based algorithm is adapted and compared to a Bayesian method of curve-fitting. After prior spline smoothing of the data, the algorithms are tested and the errors between the split-linear fitted wind curve and the real one are estimated. In our case, the adaptation of the edge-preserving smoothing algorithm matches the good performance of automatic Bayesian curve-fitting based on a Markov chain Monte Carlo algorithm while saving computation time.
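The split-linear fitting idea can be sketched with an exhaustive least-squares search for a single slope change point. This is a generic illustration of the change-point concept, not the Hall-Titterington or Bayesian algorithms discussed in the abstract, and the "wind curve" below is simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic wind curve with one slope change at t = 50 (illustrative).
t = np.arange(100, dtype=float)
y = np.where(t < 50, 0.5 * t, 25 + 2.0 * (t - 50)) + rng.normal(0, 1.0, 100)

def split_linear_fit(t, y):
    # Exhaustive search: fit one line on each side of every candidate
    # break point and keep the split with the smallest total SSE.
    best_k, best_sse = None, np.inf
    for k in range(5, len(t) - 5):        # keep a few points per segment
        sse = 0.0
        for tt, yy in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(tt, yy, 1)
            sse += np.sum((yy - np.polyval(coef, tt)) ** 2)
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

k = split_linear_fit(t, y)
print(t[k])  # detected change point, close to the true break at t = 50
```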
COVID-19 is spreading quickly around the world and carries with it a significant threat to public health. This study applies meta-analysis to estimate the basic reproduction number (R0) more accurately, because prior estimates of R0 range widely, from 1.95 to 6.47, in the existing literature. Meta-analysis yields a more robust estimate of R0, which is substantially larger than that provided by the World Health Organization (WHO). A Susceptible-Infectious-Removed (SIR) model for new infection cases, based on the R0 from the meta-analysis, is proposed to estimate the effective reproduction number Rt. The curves of estimated Rt values over time illustrate that the isolation measures enforced in China and South Korea were substantially more effective in controlling COVID-19 than the measures enacted early in Italy and the United States. Finally, we present the daily standardized infection cases per million population over time across countries, a useful index of the effectiveness of isolation measures in preventing COVID-19. This standardized measure indicates whether the current severity of infection exceeds the national health system's capacity to care for patients.
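The link between R0 and the effective reproduction number in a closed-population SIR model can be sketched as follows: Rt = R0 * S(t)/N, so Rt falls as susceptibles are depleted. The pooled R0, population size, and infectious period below are assumed values for illustration, not the paper's estimates.

```python
# Minimal SIR sketch of the effective reproduction number
#     R_t = R0 * S(t) / N.
# All numbers are illustrative, not this study's data.
R0 = 3.3                 # assumed pooled R0 (not the paper's estimate)
N = 1_000_000            # illustrative population size
gamma = 1.0 / 7.0        # assumed mean infectious period of 7 days
beta = R0 * gamma        # transmission rate implied by R0

S, I, R = N - 100.0, 100.0, 0.0
Rt = []
for day in range(120):   # daily Euler steps of the SIR equations
    new_inf = beta * S * I / N
    new_rec = gamma * I
    S -= new_inf
    I += new_inf - new_rec
    R += new_rec
    Rt.append(R0 * S / N)

print(round(Rt[0], 2), round(Rt[-1], 2))  # starts near R0, ends below 1
```

In practice Rt is estimated from observed incidence rather than simulated, but the same depletion-of-susceptibles logic drives the decline that effective isolation measures accelerate.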
Recent decades have witnessed a series of losses in the financial sector due to adverse price movements beyond certain limits. Such movements are commonly termed financial bubbles. The formation and burst of a bubble can inflict severe damage on financial markets, so detecting and modeling financial bubbles is essential to preventing it. We propose improved test procedures for detecting financial bubbles by combining the existing Max test and the Supremum Augmented Dickey-Fuller (SADF) test commonly used for bubble detection. The performance of the proposed test is compared with the existing tests via Monte Carlo simulation. The proposed test has higher power than the existing tests for detecting collapsing bubbles, irrespective of window length and collapse probability. Further, the power of the proposed test increases as the window size decreases. An empirical study of monthly S&P 500 data from January 2006 to December 2010 demonstrates the advantages of the proposed test procedures over the existing tests.
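The SADF statistic underlying these procedures can be sketched as the supremum of ADF t-statistics over expanding windows. The implementation below is a simplified illustration (constant-only ADF regression, no lag augmentation) rather than the proposed combined Max/SADF procedure, and the series are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

def adf_stat(y):
    # ADF t-statistic (constant, no lag augmentation): regress diff(y)
    # on [1, y_lagged] and return the t-ratio on the y_lagged coefficient.
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    beta = np.linalg.lstsq(X, dy, rcond=None)[0]
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, r0=40):
    # Supremum ADF: the max ADF statistic over expanding windows that
    # all start at the first observation (minimum window length r0).
    return max(adf_stat(y[:k]) for k in range(r0, len(y) + 1))

n = 200
rw = np.cumsum(rng.normal(size=n))     # unit-root null: no bubble
expl = np.empty(n)
expl[0] = 1.0
for t in range(1, n):                  # mildly explosive alternative
    expl[t] = 1.05 * expl[t - 1] + rng.normal()

print(sadf(expl) > sadf(rw))  # True: the explosive root inflates SADF
```

A bubble is flagged when the SADF statistic exceeds a right-tail critical value obtained by simulating the unit-root null, which is where the Monte Carlo comparison of power comes in.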
Abstract: Crude oil, the primary source of energy, has unquestionably been the main driving engine of every country in the world, whether an oil-producing or an oil-consuming economy. As one of the key strategic products in the global market, crude oil may influence the economies of both exporting and importing countries. Iran is one of the major crude oil exporting members of the Organization of the Petroleum Exporting Countries (OPEC). Analysis of the risk measures associated with Iranian oil price data is of strategic importance to the Iranian government and policy makers, particularly for short- and long-term planning in setting oil production targets. Oil price risk management focuses mainly on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly accepted risk measure and is evaluated by analysing the negative tail of the probability distribution of returns/profit and loss. Among the several approaches for calculating VaR, the most common are the variance-covariance approach, historical simulation, and Monte Carlo simulation. Recently, copula functions have emerged as a powerful tool to model and simulate multivariate probability distributions. Copula applications have been noted predominantly in finance, actuarial science, economics, and health and clinical studies. In addition, copulas are useful devices for dealing with the non-normality and non-linearity issues frequently observed in financial time series data. In this paper we apply three copulas, namely the Frank, Clayton, and Gumbel copulas, to analyse the time series of Iranian crude oil prices relative to OPEC prices. The data considered are: (i) monthly average prices for a barrel of Iranian and OPEC crude oil, from January 1997 to December 2008; (ii) seasonal numbers of barrels of Iran's crude oil exports, from January 1997 to December 2008.
The results demonstrate that the copula-simulated data yield higher and lower relative change values in the upper and lower tails, respectively, compared with the original data.
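The copula simulation step can be sketched for the Clayton case, which concentrates dependence in the lower tail, the region that matters when VaR is driven by joint losses. The conditional-inversion sampler and the parameter value below are generic illustrations, not fitted to the Iranian/OPEC data.

```python
import numpy as np

rng = np.random.default_rng(4)

def clayton_sample(theta, n):
    # Conditional-inversion sampler for the Clayton copula:
    # given U1 = u and an independent W ~ Uniform(0, 1),
    #   U2 = ((w**(-theta/(1+theta)) - 1) * u**(-theta) + 1) ** (-1/theta)
    # has the Clayton conditional distribution given U1 = u.
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

theta = 2.0                 # illustrative; Kendall's tau = theta/(theta+2)
u, v = clayton_sample(theta, 50_000)

# Joint lower-tail exceedances far exceed the independence benchmark q,
# reflecting the Clayton copula's lower-tail dependence.
q = 0.05
lower = np.mean((u < q) & (v < q)) / q
print(lower > 5 * q)  # True
```

In a VaR application the uniform margins (u, v) would be mapped through the fitted marginal quantile functions of the two return series before computing tail quantiles.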
Social phenomena involving human beings cannot be studied under controlled conditions, which makes it difficult for policy planners to anticipate future conditions in society under varying situations and to form policies accordingly. Modelling, however, can be genuinely helpful to planners in these situations. The present paper attempts to find the distributions of a woman's age at last conception with the help of stochastic modelling of human fertility, taking into consideration different parity-progression behaviours among couples. This may give planners at least a rough idea of the estimated proportion of women in different age groups who will complete their childbearing and be willing to undergo sterilization after marriage under different stopping rules regarding desired family size and sex composition of children. These estimates will in turn help planners optimize the cost and provision of sterilization services for women.
Abstract: A prospective, multi-institutional, randomized surgical trial involving 724 early-stage melanoma patients was conducted to determine whether excision margins for intermediate-thickness melanomas (1.0 to 4.0 mm) could be safely reduced from the standard 4-cm radius. Patients with 1- to 4-mm-thick melanomas on the trunk or proximal extremities were randomly assigned to receive either a 2- or 4-cm surgical margin, with or without immediate node dissection (i.e. immediate vs. later, within 6 months). The median follow-up time was 6 years. Recurrence rates did not correlate with surgical margins, even among stratified thickness groups. The hospital stay was shortened from 7.0 days for patients receiving 4-cm surgical margins to 5.2 days for those receiving 2-cm margins (p = 0.0001), largely because of the reduced need for skin grafting in the 2-cm group. The overall conclusion was that the narrower margins significantly reduced the need for skin grafting and shortened the hospital stay. Given the adequacy of subject follow-up, a recent statistical focus has been on which prognostic factors, usually called covariates, actually determined recurrence. As anticipated, the thickness of the lesion (p = 0.0091) and whether or not the lesion was ulcerated (p = 0.0079) were found to be significantly associated with recurrence events using the logistic regression model. This type of fixed-effect analysis is rather routine. The authors have determined that a Bayesian treatment of the results would afford a more coherent interpretation of the model, assuming a random effect of the covariates of thickness and ulceration. Thus, using a Markov chain Monte Carlo method of parameter estimation with non-informative priors, one can obtain posterior estimates and credible regions for these effects, as well as their interaction, on the recurrence outcome.
Graphical displays of the convergence history and posterior densities affirm the stability of the results. We demonstrate how the model performs under relevant clinical conditions. These conditions are all tested using a Bayesian statistical approach, allowing robust testing of the model parameters under the various recursive-partitioning conditions of the covariates and hyperparameters that we introduce into the model. The convergence of the parameters to stable values is seen in trace plots that follow the expected convergence patterns. This allows precise estimation of the clinical conditions under which the response pattern will change.
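The MCMC estimation described above can be sketched with a random-walk Metropolis sampler for a logistic model of recurrence on thickness and ulceration under a flat (non-informative) prior. The data below are simulated to mimic the trial's setting, and all coefficient values are illustrative, not the trial's estimates.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated data mimicking the abstract's setting: recurrence (0/1)
# modelled on tumour thickness (mm) and ulceration.  Not trial data.
n = 724
thick = rng.uniform(1.0, 4.0, n)
ulcer = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), thick, ulcer])
true_beta = np.array([-2.0, 0.5, 0.8])      # illustrative coefficients
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p).astype(float)

def log_post(beta):
    # Log-posterior under a flat prior: the Bernoulli log-likelihood
    # of the logistic regression model.
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))

def metropolis(n_iter=6000, step=0.05):
    beta = np.zeros(3)
    lp = log_post(beta)
    chain = np.empty((n_iter, 3))
    for i in range(n_iter):
        prop = beta + step * rng.normal(size=3)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance
            beta, lp = prop, lp_prop
        chain[i] = beta
    return chain

chain = metropolis()
post_mean = chain[2000:].mean(axis=0)             # discard burn-in
print(np.round(post_mean, 1))  # should land near the generating coefficients
```

Trace plots of `chain` would show the drift toward the posterior mode during burn-in followed by stable mixing, matching the convergence patterns described above.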