This question evaluates understanding of random variate generation and Markov chain Monte Carlo methods: inverse-transform sampling for the Logistic(μ, s) distribution and Gibbs sampling for a bivariate joint density. It tests competency in probability theory, distribution properties, and statistical computing within the Statistics & Math domain. It is commonly asked to assess the ability to derive inverse CDFs and full conditional distributions, to verify sample correctness with goodness-of-fit tests and convergence diagnostics, and to reason about sample size, burn-in, and thinning, reflecting a mix of conceptual understanding and practical application.
Inverse transform:
(a) Derive an algorithm to simulate from the Logistic(μ, s) distribution using its CDF and inverse CDF; show how to obtain samples for μ = 0, s = 1.
(b) Give acceptance criteria to verify sample correctness (e.g., a KS test) and discuss the sample size needed.
Gibbs sampling: Consider the joint density proportional to exp(−(x−y)^2/2)·exp(−x^2/2) for x, y ∈ R.
(c) Derive the full conditional distributions p(x|y) and p(y|x) and specify a Gibbs sampler.
(d) Explain how to diagnose convergence and how to choose burn-in and thinning.
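A minimal Python sketch of one possible answer to parts (a) and (b), assuming NumPy and SciPy are available; the function name, the sample size of 10,000, and the 5% significance level are illustrative choices, not part of the question. The inverse CDF comes from solving u = 1/(1 + exp(−(x − μ)/s)) for x, which gives x = μ + s·log(u/(1 − u)).

```python
import numpy as np
from scipy import stats

def logistic_inverse_transform(n, mu=0.0, s=1.0, rng=None):
    """Draw n Logistic(mu, s) variates via the inverse CDF.

    CDF:     F(x) = 1 / (1 + exp(-(x - mu) / s))
    Inverse: F^{-1}(u) = mu + s * log(u / (1 - u))
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=n)                    # U ~ Uniform(0, 1)
    return mu + s * np.log(u / (1.0 - u))      # X = F^{-1}(U) ~ Logistic(mu, s)

# Part (a): samples for mu = 0, s = 1.
samples = logistic_inverse_transform(10_000, mu=0.0, s=1.0,
                                     rng=np.random.default_rng(42))

# Part (b): Kolmogorov-Smirnov goodness-of-fit check against Logistic(0, 1).
ks_stat, p_value = stats.kstest(samples, "logistic", args=(0.0, 1.0))
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
# Accept the sampler if the p-value is not small (e.g. > 0.05 at the 5% level).
```

A larger sample gives the KS test more power; with only a few hundred draws, modest errors in the location or scale can go undetected.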
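For part (c), completing the square in the exponent of exp(−(x−y)^2/2 − x^2/2) gives the full conditionals x | y ~ N(y/2, 1/2) and y | x ~ N(x, 1). A minimal Gibbs-sampler sketch follows; the chain length, burn-in of 2,000, and thinning interval of 5 are illustrative defaults, not values prescribed by the question.

```python
import numpy as np

def gibbs_sampler(n_iter=20_000, burn_in=2_000, thin=5, seed=0):
    """Gibbs sampler for p(x, y) proportional to exp(-(x-y)^2/2) * exp(-x^2/2).

    Full conditionals (completing the square in the exponent):
        x | y ~ Normal(mean = y / 2, variance = 1 / 2)
        y | x ~ Normal(mean = x,     variance = 1)
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0                             # arbitrary starting values
    draws = []
    for t in range(n_iter):
        x = rng.normal(y / 2.0, np.sqrt(0.5))   # update x | y
        y = rng.normal(x, 1.0)                  # update y | x
        if t >= burn_in and (t - burn_in) % thin == 0:
            draws.append((x, y))
    return np.asarray(draws)

draws = gibbs_sampler()
# Sanity check: the target is bivariate normal with Var(x) = 1, Var(y) = 2,
# Cov(x, y) = 1, so the empirical covariance should be close to [[1, 1], [1, 2]].
print("empirical covariance:\n", np.cov(draws, rowvar=False))
```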
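For part (d), a rough sketch of two standard diagnostics, again in Python: a Gelman-Rubin R-hat computed across chains started from over-dispersed points (to judge convergence and burn-in) and the sample autocorrelation (to guide thinning or an effective-sample-size calculation). The number of chains, the starting points, and the thresholds quoted in comments are conventional rules of thumb rather than requirements; the Gibbs updates are the same ones derived above.

```python
import numpy as np

def run_chain(n_iter, x0, y0, seed):
    """Raw (unthinned) x-draws from the Gibbs sampler for the target above."""
    rng = np.random.default_rng(seed)
    x, y = x0, y0
    xs = np.empty(n_iter)
    for t in range(n_iter):
        x = rng.normal(y / 2.0, np.sqrt(0.5))   # x | y ~ N(y/2, 1/2)
        y = rng.normal(x, 1.0)                  # y | x ~ N(x, 1)
        xs[t] = x
    return xs

def autocorr(chain, max_lag=30):
    """Sample autocorrelation of a 1-D chain up to max_lag."""
    c = chain - chain.mean()
    var = np.dot(c, c) / len(c)
    return np.array([np.dot(c[: len(c) - k], c[k:]) / (len(c) * var)
                     for k in range(max_lag + 1)])

def gelman_rubin(chains):
    """Gelman-Rubin R-hat for a (n_chains, n_draws) array of one scalar."""
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)     # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()       # within-chain variance
    return np.sqrt(((n - 1) / n * W + B / n) / W)

# Four chains started from over-dispersed points.
chains = np.stack([run_chain(5_000, x0, y0, seed)
                   for seed, (x0, y0) in enumerate([(-5, -5), (-5, 5),
                                                    (5, -5), (5, 5)])])
burned = chains[:, 1_000:]                      # discard burn-in
print("R-hat for x:", gelman_rubin(burned))     # want R-hat close to 1 (< 1.01)
print("ACF of x at lags 1-5:", autocorr(burned[0])[1:6].round(3))
# Thin so that retained draws are roughly uncorrelated (ACF near 0 at the
# chosen lag), or keep all draws and report an effective sample size instead.
```

Trace plots of the chains over iterations complement these numbers: visually overlapping, stationary traces from all starting points suggest the burn-in was sufficient.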