Abstract
A life test on $N$ items is considered in which the common underlying distribution of the length of life of a single item is given by the density \begin{equation*}\tag{1} p(x; \theta, A) = \begin{cases}\frac{1}{\theta} e^{-(x-A)/\theta}, & \text{for } x \geqq A, \\ 0, & \text{otherwise,}\end{cases}\end{equation*} where $\theta > 0$ is unknown but is the same for all items and $A \geqq 0.$ Several lemmas are given concerning the first $r$ out of $n$ observations when the underlying p.d.f. is given by (1). These results are then used to estimate $\theta$ when the $N$ items are divided into $k$ sets $S_j$ (each containing $n_j > 0$ items, $\sum^k_{j=1} n_j = N$) and each set $S_j$ is observed only until the first $r_j$ failures occur $(0 < r_j \leqq n_j)$. The constants $r_j$ and $n_j$ are fixed and preassigned. Three different cases are considered: 1. The $n_j$ items in each set $S_j$ have a common known $A_j$ $(j = 1, 2, \cdots, k)$. 2. All $N$ items have a common unknown $A$. 3. The $n_j$ items in each set $S_j$ have a common unknown $A_j$ $(j = 1, 2, \cdots, k)$. The results for these three cases are such that the results for any intermediate situation (i.e., some $A_j$ values known, the others unknown) can be written down at will. The particular case $k = 1$ and $A = 0$ is treated in [2]. The constant $A$ in (1) can be interpreted in two different ways: (i) $A$ is the minimum life, that is, life is measured from the beginning of time, which is taken as zero; (ii) $A$ is the "time of birth", that is, life is measured from time $A$. Under interpretation (ii) the parameter $\theta,$ which we are trying to estimate, represents the expected length of life.
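To make the censoring scheme concrete, the following sketch (not from the paper; all variable names are hypothetical) illustrates how an estimate of $\theta$ could be computed for case 1, where each set $S_j$ has a known $A_j$. It assumes the standard Type-II censored exponential likelihood, under which the pooled estimate is the total (shifted) time on test divided by the total number of observed failures.

```python
# Illustrative sketch only: pooled estimate of theta for case 1,
# where each set S_j has a known location A_j and only the first r_j
# ordered failure times out of n_j items on test are observed.
# Assumes the usual Type-II censored exponential form:
# theta_hat = (total time on test, measured from A_j) / (total failures).

def estimate_theta(sets):
    """sets: list of dicts with keys
         'A'        -- known location parameter A_j of the set,
         'n'        -- number of items n_j placed on test,
         'failures' -- the first r_j ordered failure times (r_j <= n_j).
    Returns the pooled estimate of theta."""
    total_time_on_test = 0.0
    total_failures = 0
    for s in sets:
        A, n = s['A'], s['n']
        x = sorted(s['failures'])          # x_(1) <= ... <= x_(r_j)
        r = len(x)
        # time accumulated by the r_j observed failures, measured from A_j
        total_time_on_test += sum(xi - A for xi in x)
        # the n_j - r_j surviving items each lasted at least until x_(r_j)
        total_time_on_test += (n - r) * (x[-1] - A)
        total_failures += r
    return total_time_on_test / total_failures

# Example with k = 2 sets and hypothetical data, A_1 = 0.0 and A_2 = 1.5:
sets = [
    {'A': 0.0, 'n': 10, 'failures': [0.8, 1.1, 2.3]},   # r_1 = 3
    {'A': 1.5, 'n': 8,  'failures': [1.9, 2.4]},        # r_2 = 2
]
print(estimate_theta(sets))
```

In the unknown-$A$ cases (2 and 3) the smallest observed failure time in each set plays the role of the location parameter; the exact estimators for all three cases are derived in the paper.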
Related Publications
An Extremal Problem in Probability Theory
Let $\xi_1, \xi_2, \cdots, \xi_n$ be independent random variables satisfying the following condition: \[ \mathbf{M}\xi_k = 0, \quad |\xi_k| \leqq c, \quad 1 \le...
Modified Randomization Tests for Nonparametric Hypotheses
Suppose $X_1, \cdots, X_m, Y_1, \cdots, Y_n$ are $m + n = N$ independent random variables, the $X$'s identically distributed and the $Y$'s identically distributed, each with a...
A Class of Statistics with Asymptotically Normal Distribution
Let $X_1, \cdots, X_n$ be $n$ independent random vectors, $X_\nu = (X^{(1)}_\nu, \cdots, X^{(r)}_\nu),$ and $\Phi(x_1, \cdots, x_m)$ a function of $m (\leq n)$ vectors $x_...
Laws of Large Numbers for Sums of Extreme Values
Let $X_1, X_2, \cdots$ be a sequence of nonnegative i.i.d. random variables with common distribution $F$, and for each $n \geq 1$ let $X_{1n} \leq \cdots \leq X_{nn}$ deno...
Achievable rates for multiple descriptions
Consider a sequence of independent identically distributed (i.i.d.) random variables ...
Publication Info
- Year: 1954
- Type: article
- Volume: 25
- Issue: 2
- Pages: 373-381
- Citations: 314
- Access: Closed
Identifiers
- DOI: 10.1214/aoms/1177728793