
RELIABILITY ASSESSMENTS USING STOCHASTIC DEGRADATION PROCESS FOR CURRENT TIME ANALYSIS CUMULATIVE DAMAGE MODELS

G. Sathya Priyanka1*, S. Rita2, M. Iyappan3

1* Ph.D. Research Scholar, Department of Statistics, Periyar University, Salem-11, India
2 Associate Professor & Head, Department of Statistics, Periyar University, Salem-11, India
3 Assistant Professor, Department of Statistics, St. Francis College, Bengaluru-34, India
1* sathyapriyankastat@gmail.com, 2 ritasamikannu@gmail.com, 3 iyappastat@gmail.com

Abstract

Failure-time data analysis and testing methodologies are often inappropriate for the reliability study of highly reliable items. Degradation data can provide more trustworthy information than standard censored failure-time data, especially in cases where few or no failures are anticipated. High-power white light-emitting diodes (HPWLEDs) have received a great deal of attention in the lighting market, but as one of the more reliable electronic products, an HPWLED may not be expected to fail in either a traditional or even an accelerated life test. A data-driven degradation methodology (DDDM) is used in this research: based on the general degradation path model, the reliability of HPWLEDs was predicted using lumen maintenance data gathered under the IES LM-80-08 lumen maintenance test standard. Testing such devices under typical working conditions, and occasionally even under harsher conditions, is difficult enough without trying to collect an adequate amount of time-to-failure data. Modern items are made with superb quality and high reliability in mind, and some safety-critical parts and systems are even made to last for an extremely long time in order to prevent the disastrous effects of possible breakdowns. A cumulative damage model based on stochastic degradation processes has been developed in this paper, and the analytical findings are supported by a suitable numerical illustration. As a result, the degradation analysis approach has been developed to address reliability modeling issues using product degradation data gleaned from historical records or degradation testing.

Keywords: Reliability, Lumen Maintenance Data, Degradation Data, High-Power White Light, Failure-Time, Highly Reliable Products

I. Introduction

Present-day manufacturers face strong pressure to develop new, higher-technology products in record time, while improving productivity, product field reliability, and overall quality. This has motivated the development of techniques such as concurrent engineering and encouraged wider use of designed experiments for product and process improvement. The requirements for higher reliability have increased the need for more up-front testing of materials, components, and systems. Engineers in the manufacturing industries have used accelerated test (AT) experiments for many years. The purpose of AT experiments is to acquire reliability information quickly. Test units of a material, component, subsystem, or entire system are subjected to higher-than-usual levels of one or more accelerating factors such as temperature or stress. The AT results are then used to predict the life of the units at use conditions. The extrapolation is typically justified (correctly or incorrectly) on the basis of physically motivated models, or a combination of empirical model fitting with a sufficient amount of previous experience in testing similar units. The need to extrapolate in both time and the accelerating factors generally requires the use of fully parametric models [16]. Researchers have made important contributions to the development of appropriate stochastic models for AT data [typically a distribution for the response and regression relationships between the parameters of this distribution and the accelerating variable(s)], statistical methods for AT planning (choice of accelerating-factor levels and allocation of available test units to those levels), and methods for estimation of suitable reliability metrics. This paper gives a review of many of the AT models that have been used successfully in this area.

Accelerated life tests are commonly used in product design processes. Since there is limited time to launch new products, engineers use accelerated tests to obtain the required reliability information by raising the levels of certain acceleration factors such as temperature, voltage, humidity, stress, and strain. For highly reliable modern products, it often takes far too long to obtain lifetime and degradation data under typical use conditions, and this forces one to use accelerated tests [4]. Accelerated tests expose the products to higher environmental stress levels so that lifetime and degradation measurements can be obtained in a more timely fashion. Procedures for carrying out accelerated life testing (ALT) plans include constant stress, step stress, and ramp stress, among others. Evaluation of the variance of an estimator of a log-location-scale distribution quantile under varying stress has many practical applications [20]. It is important to develop useful, accurate probability models for inference on the lifetime of the devices or systems under study. Such models should reasonably incorporate the acceleration factors and the degradation measurements, as well as any actual failures observed. In many engineering reliability tests, measures of degradation or wear toward failure can be observed over some period of time before failure occurs. Since the degradation values provide additional information beyond that given by the failure observations, both sets of observations should be taken into account when carrying out inference on the statistical parameters of the product or system lifetime distributions, as discussed in [24]. The objective of the present paper is to extend existing results by developing general failure models based on stochastic processes for degradation which incorporate several accelerating factors, and to use both degradation measurements and observed failures; different choices of test stress levels and test length can result in different precision of the estimate of the reliability of the product at normal use conditions. We want to find a test plan that gives minimum variance of the maximum likelihood estimates (MLEs) of the unknown location and scale parameters of the log-location-scale family of distributions at specified stress levels, by suitably determining the test length [8].

II. Methods

I. Degradation Models in Reliability Analysis

The major idea underlying the general degradation path models is to restrict the sample space of the degradation process and to assume that all sample functions admit the same functional form but with different parameters; refer to Lio (2004). The general degradation path model fits the degradation observations by a regression model with random coefficients [16]. Both simple linear regression and nonlinear regression models are commonly used in degradation path modeling [24]. Linear degradation arises in some simple wear processes, for example automobile tire wear. However, degradation paths are often nonlinear functions of time, and sometimes linearization is infeasible. A general nonlinear mixed-effects model and a two-stage approach to estimating the model parameters, which are assumed multivariate normally distributed, are presented in [18], together with a Monte Carlo simulation method to compute an estimate of the distribution function of the time-to-failure [18]. A parametric bootstrap method to set confidence intervals is proposed in [18] under the following assumptions (a simulation sketch follows the model specification below):

• Sample assets are randomly selected from a population or production process, and random measurement errors are independent across time and assets

• Sample assets are tested in a particular homogeneous environment, such as the same constant temperature

• Measurement (or inspection) times are pre-specified, the same across all the test assets, and may or may not be equally spaced in time. This assumption is used for constructing confidence intervals for the time-to-failure distribution via the bootstrap simulation technique.

A general degradation path model can be expressed as:

y_ij = η_ij + ε_ij = η(t_j; φ, θ_i) + ε_ij,  i = 1,2,...,n; j = 1,2,...,m_i,   (1)

ε_ij ~ N(0, σ_ε²),   (2)

where

t_j is the time of the j-th measurement or inspection;

ε_ij is the measurement error with constant variance σ_ε²;

η_ij is the actual path of the i-th asset at time t_j, with unknown parameters as listed below;

φ is the vector of fixed-effect parameters, common for all assets;

θ_i is the vector of the i-th asset's random-effect parameters, representing individual asset characteristics;

θ_i and ε_ij are independent of each other (i = 1,2,...,n; j = 1,2,...,m_i);

m is the total number of possible inspections in the experiment, and m_i is the total number of inspections on the i-th asset, a function of θ_i.

It is assumed that θ_i, i = 1,2,...,n, follows a multivariate distribution function G(·), which may depend on some unknown parameters that must be estimated from the data. The distribution function of T, the failure time, can then be written as F_T(t) = P(T ≤ t) = P(η(t; φ, Θ) ≥ D_f), where D_f is the critical degradation threshold at which failure is declared.
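To make Equations (1)-(2) concrete, the following minimal sketch simulates noisy linear degradation paths with normally distributed random slopes and estimates F_T(t) by Monte Carlo, in the spirit of the simulation and bootstrap approach of [18]. All numerical settings (phi, the slope distribution G(·), sigma_eps, and the threshold D_f) are hypothetical illustrations, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical settings (illustrative only): linear actual path
# eta(t; phi, theta_i) = phi + theta_i * t, failure when eta(t) >= D_f.
phi = 1.0                          # fixed-effect intercept, common to all assets
theta_mean, theta_sd = 0.20, 0.04  # random-effect (slope) distribution G(.)
sigma_eps = 0.05                   # measurement-error standard deviation
D_f = 5.0                          # assumed critical degradation threshold

def simulate_observations(n_assets, t_grid):
    """Noisy degradation data y_ij = eta(t_j; phi, theta_i) + eps_ij, as in (1)-(2)."""
    theta = rng.normal(theta_mean, theta_sd, size=(n_assets, 1))
    eps = rng.normal(0.0, sigma_eps, size=(n_assets, t_grid.size))
    return phi + theta * t_grid + eps

def F_T(t, n_mc=100_000):
    """Monte Carlo estimate of F_T(t) = P(T <= t): fraction of random paths
    whose noise-free degradation reaches D_f by time t."""
    theta = rng.normal(theta_mean, theta_sd, size=n_mc)
    # a non-positive slope never crosses the threshold; map it to +infinity
    t_cross = np.where(theta > 0, (D_f - phi) / np.maximum(theta, 1e-300), np.inf)
    return np.mean(t_cross <= t)

y = simulate_observations(n_assets=10, t_grid=np.linspace(0.0, 30.0, 16))
print(y.shape, F_T(18.0), F_T(22.0))
```

A parametric bootstrap confidence interval for F_T(t) would repeat this recipe with parameters re-estimated from each resampled set of paths.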

II. Stochastic Models for Degradation Process

The aleatory uncertainties of a degradation process can be described using various types of probabilistic models. Traditionally, lifetime distribution models are used, in which the uncertainty of the degradation is described from the perspective of the uncertain failure time of the component. The lifetime distribution model is widely applied in age-based maintenance strategies, where a component is replaced when its operating time reaches a certain threshold [16]. When the inspection and replacement cost is prohibitively high, for example in the case of a nuclear power plant, age-based maintenance strategies are usually inefficient, since inspection and replacement of the component take place regardless of its actual state of degradation. In such cases, condition-based maintenance strategies are often used, which require direct modeling of the degradation progress.

Stochastic models are in general more flexible in modeling these complex patterns of the degradation process. Consequently, the use of stochastic models in degradation assessment and prediction has become increasingly popular in recent years. Reliability prediction based on degradation modeling can be an efficient method for assessing the reliability of systems when observations of failures are rare [29]. Current research shows a rising interest in the application of stochastic degradation models in reliability prediction and survival analysis.

Models that describe the process of deterioration or degradation in units or systems are of interest in their own right, and are also key ingredients in processes that determine "failure" events. Relative to failure-based reliability, degradation-based reliability has received a modest amount of attention in the open literature. Degradation as a measure to assess component lifetime was addressed in early work [2]. More recently, a useful summary of degradation models is given in [19], emphasizing the use of linear models with assumed log-normal rates of degradation. In such a case, the full lifetime distribution can be computed analytically [14]. Other recent models encountered in the literature deal with degradation of materials, for example those due to Celina (2001), which discussed general approaches to estimating lifetime distributions in accelerated life tests under highly variable conditions. The models presented therein center on the specification of the degradation path as it depends explicitly on the operating environment [10]. For highly reliable or expensive devices, however, lifetime data may be difficult to obtain because of the length of time required, or the cost of observation. Accelerated life testing can often be used to hasten the failure of highly reliable devices, and hence it is proposed to extend existing results by developing general failure models based on stochastic processes for degradation which incorporate several accelerating factors, and to use both degradation measurements and actual failures in inference procedures [22]. Degradation models and accelerated test models for inference on reliability have been studied by several authors.

Accelerated tests decrease the strength or time to failure and the cost of testing by exposing the test specimens to higher levels of stress conditions (increased magnitudes or levels of environmental factors), which cause earlier breakdowns and shorter lifetimes than the normal use condition [24]. These environmental factors and levels of stress conditions are referred to as the "accelerating factors" in the statistics and reliability literature. Accelerated life testing (ALT) is a quick way to obtain information about the life distribution of a material, component, or product. In ALT, items are subjected to conditions that are more severe than the normal ones, which yields shorter lives but, ideally, does not change the failure mechanisms [24]. Some assumptions are required to relate the life at high stress levels to the life at normal stress levels in use. Based on these assumptions, the life distribution under normal stress levels can be estimated. Such an approach to testing reduces both time and cost.

Three kinds of stress loadings are typically applied in accelerated life tests: constant stress, step stress, and progressive stress. Constant stress is the most common type of stress loading. Each item is tested under a constant level of the stress, which is higher than the normal level. In this type of testing, we may have several stress levels, which are applied to different groups of the tested items. This means that each item is subjected to only a single stress level until the item fails or the test is stopped for other reasons. In step-stress loading, the test items are subjected to successively higher levels of stress at pre-assigned test times. All items are first subjected to a specified constant stress for a specified period of time [7]. Items that do not fail are subjected to a higher level of stress for another specified time. The level of stress is increased step by step until all items have failed or the test stops for other reasons. Progressive stress loading is quite similar to step-stress testing, with the difference that the stress level increases continuously. Step-stress testing is a very common type of accelerated testing based on accelerating factors. It is an effective way to obtain failures in a relatively short amount of time. There are many variations of step-stress testing. A typical one is that in which the units are tested at a given stress level for a certain amount of time. At the end of that time, if there are surviving units, the stress level is increased and held for another amount of time [21]. The data that result from such tests can be analyzed using the cumulative damage model; for a detailed study of the cumulative damage model, see [30].

For highly reliable products, it is not an easy task to assess the lifetime distribution of the products using traditional life-testing procedures, which record only time-to-failure data. Even using procedures that combine censoring and accelerating techniques, the information about the lifetime distribution remains very limited. In this situation, an alternative approach is to collect "degradation" data at higher levels of stress for predicting a product's lifetime at a particular use stress level. Such an investigation is called an accelerated degradation test (ADT) [3].

For successful application of ADT, many approaches to modeling degradation of products have been proposed. In particular, Markov processes such as Brownian motion with drift, the compound Poisson process, and the gamma process are widely used owing to their independent-increments property. For the stochastic modeling of monotone and gradual degradation over time in a sequence of tiny increments, the gamma process has found application as a degradation model in many studies [17]. The main objective of this paper is to extend existing results by developing general failure models based on a stochastic process for degradation which incorporates several accelerating factors, and to use both degradation measurements and actual failures in the inferential procedure. A small simulation sketch of a gamma-process degradation model is given below.
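The sketch simulates monotone sample paths built from independent gamma-distributed increments and estimates the reliability function R(t) as the probability that the cumulative degradation has not yet crossed a failure threshold; for a monotone process this can be checked against the exact gamma CDF. The shape, scale, and threshold values are assumptions chosen for illustration:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

# Hypothetical gamma-process settings (illustrative only): degradation X(t) has
# independent increments X(t+dt) - X(t) ~ Gamma(shape = alpha*dt, scale = beta),
# and the unit fails when X(t) first exceeds a fixed threshold.
alpha, beta, threshold = 2.0, 0.5, 10.0

def simulate_first_passage(n_paths=20_000, dt=0.1, t_max=25.0):
    """Simulate gamma-process sample paths; return first-passage times."""
    n_steps = int(t_max / dt)
    inc = rng.gamma(alpha * dt, beta, size=(n_paths, n_steps))
    paths = np.cumsum(inc, axis=1)           # monotone degradation paths
    crossed = paths >= threshold
    first = np.where(crossed.any(axis=1), crossed.argmax(axis=1), n_steps - 1)
    return (first + 1) * dt

t_fail = simulate_first_passage()
for t in (5.0, 10.0, 15.0):
    # Monotone paths: R(t) = P(X(t) < threshold) = GammaCDF(threshold; alpha*t, beta)
    exact = gamma.cdf(threshold, a=alpha * t, scale=beta)
    print(f"R({t:4.1f}): simulated {np.mean(t_fail > t):.3f}, exact {exact:.3f}")
```

The closed-form check is available precisely because the gamma process is monotone, so the first-passage event {T > t} coincides with {X(t) < threshold}.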

III. Model Description

At stress level x_i (i = 1,2,...,k), the lifetime Y of a test unit is assumed to follow a log-location-scale distribution with cumulative distribution function (CDF)

F_i(y; μ_i, σ) = Φ((log(y) − μ_i)/σ),  y > 0,   (5)

where Φ(·) is the standard log-location-scale CDF, the location parameter is μ_i = μ(x_i) = β_0 + β_1 x_i, and σ is the unknown scale parameter. Here, the regression parameters β_0 and β_1 (< 0) are unknown and need to be estimated, and the scale parameter σ is assumed to be free of the stress levels. The CDF of the lifetime of a test unit under the k-level step-stress ALT is given by

F(y) = F_i(y − τ_{i−1} + s_{i−1}; μ_i, σ),  τ_{i−1} ≤ y < τ_i,  i = 1,2,...,k,   (6)

where τ_0 = 0, s_0 = 0, and s_{i−1} = (τ_{i−1} + s_{i−2} − τ_{i−2}) exp(μ_i − μ_{i−1}) is the solution of the equation F_i(s_{i−1}; μ_i, σ) = F_{i−1}(τ_{i−1} + s_{i−2} − τ_{i−2}; μ_{i−1}, σ), i = 2,3,...,k. The corresponding probability density function (PDF) of the lifetime of a test unit is given by

f(y) = φ((log(y − τ_{i−1} + s_{i−1}) − μ_i)/σ) / (σ (y − τ_{i−1} + s_{i−1})),  τ_{i−1} ≤ y < τ_i,  i = 1,2,...,k,   (7)

where φ(·) is the PDF corresponding to Φ(·).
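A small sketch of Equations (5)-(7), taking the lognormal distribution as the log-location-scale member (so Φ is the standard normal CDF on the log scale), may be useful. The plan below (three stress levels, the change times τ_i, and the parameter values) is a hypothetical illustration:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 3-level step-stress plan (illustrative values), with the
# lognormal as the log-location-scale member: Phi = standard normal CDF.
beta0, beta1, sigma = 2.5, -1.0, 0.5
x = np.array([0.5, 1.0, 1.5])      # stress levels x_1, x_2, x_3
tau = np.array([0.0, 5.0, 10.0])   # segment start times tau_0, tau_1, tau_2
mu = beta0 + beta1 * x             # mu_i = beta_0 + beta_1 * x_i

# Equation (6) recursion: s_0 = 0 and
# s_{i-1} = (tau_{i-1} + s_{i-2} - tau_{i-2}) * exp(mu_i - mu_{i-1}).
s = np.zeros(len(x))
for i in range(1, len(x)):
    s[i] = (tau[i] + s[i - 1] - tau[i - 1]) * np.exp(mu[i] - mu[i - 1])

def step_stress_cdf(y):
    """Cumulative-exposure CDF of Equation (6) for scalar y > 0; the last
    stress segment is taken to extend indefinitely in this sketch."""
    i = np.searchsorted(tau, y, side="right") - 1   # segment with tau_{i-1} <= y < tau_i
    return norm.cdf((np.log(y - tau[i] + s[i]) - mu[i]) / sigma)

print([round(step_stress_cdf(y), 4) for y in (2.0, 6.0, 12.0)])
```

The recursion makes F(y) continuous at each change time τ_i, which is the defining property of the cumulative exposure model.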

IV. Maximum Likelihood Estimation

From Equations (6) and (7), the joint PDF of the observed data n = (n_1, n_2, ..., n_k) and y = (y_1, y_2, ..., y_k), with y_i = (y_{i1}, y_{i2}, ..., y_{i n_i}), is given by

f(y, n) ∝ ∏_{i=1}^{k} { ∏_{j=1}^{n_i} [σ (y_{ij} − τ_{i−1} + s_{i−1})]^{−1} φ(z_{ij}) } [1 − Φ(z_k)]^{n_c},   (8)

where z_{ij} = (log(y_{ij} − τ_{i−1} + s_{i−1}) − μ_i)/σ, z_k = (log(τ_k − τ_{k−1} + s_{k−1}) − μ_k)/σ, and n_c = n − ∑_{i=1}^{k} n_i is the number of units censored at time τ_k.

Note that the MLEs of β_0, β_1, and σ exist only if n_i ≥ 1, i = 2, ..., k, in Equation (8). The following expressions are used:

∂z_{ij}/∂β_0 = −1/σ,  ∂z_{ij}/∂β_1 = −x_i/σ,  ∂z_{ij}/∂σ = −z_{ij}/σ,

for i = 1,2,...,k and j = 1,2,...,n_i, and, for the censoring term,

∂z_k/∂β_0 = −1/σ,  ∂z_k/∂β_1 = −x_k/σ,  ∂z_k/∂σ = −z_k/σ.

The MLEs of β_0, β_1, and σ can be obtained by solving the following likelihood equations:

∂ log L/∂β_0 = 0,  ∂ log L/∂β_1 = 0,  ∂ log L/∂σ = 0.   (9)

The second derivatives of the log-likelihood function, which yield the observed Fisher information matrix, are obtained by differentiating the expressions in Equation (9) once more with respect to β_0, β_1, and σ.   (10)

Since these equations cannot be solved analytically, numerical techniques, for example the simulated annealing algorithm or some other iterative procedure, must be used for this estimation problem. An advantage of using the simulated annealing algorithm is that it allows us to find a global optimum without depending on the choice of the initial values, which is one of the main drawbacks of commonly used numerical techniques such as Newton-Raphson. A sketch of this estimation step is given below.
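The following sketch simulates Type-I censored data from a 2-level cumulative-exposure lognormal model and maximizes the log-likelihood of Equation (8) with SciPy's dual_annealing routine, which plays the role of the simulated annealing algorithm discussed above. The design values, sample size, and search bounds are assumptions for illustration, not the paper's settings:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import dual_annealing

rng = np.random.default_rng(1)

# Hypothetical 2-level step-stress design (illustrative values, not the paper's).
x = np.array([1.0, 2.0])    # stress levels x_1, x_2
tau = np.array([0.0, 8.0])  # segment start times (tau_0 = 0, change time tau_1 = 8)
t_c = 20.0                  # Type-I censoring time
true = (3.0, -0.6, 0.8)     # (beta_0, beta_1, sigma)

def offsets(mu):
    """s_0 = 0 and the recursion of Equation (6) for the remaining s_{i-1}."""
    s = np.zeros(len(mu))
    for i in range(1, len(mu)):
        s[i] = (tau[i] + s[i - 1] - tau[i - 1]) * np.exp(mu[i] - mu[i - 1])
    return s

def simulate(n, beta0, beta1, sigma):
    """Draw lifetimes from the cumulative-exposure lognormal model by CDF inversion."""
    mu = beta0 + beta1 * x
    s = offsets(mu)
    z = norm.ppf(rng.uniform(size=n))
    y = np.exp(mu[0] + sigma * z)                       # candidate in segment 1
    late = y >= tau[1]                                  # crosses the change time
    y[late] = np.exp(mu[1] + sigma * z[late]) + tau[1] - s[1]
    return np.minimum(y, t_c), y < t_c                  # censor at t_c

def neg_loglik(theta, y, delta):
    """Negative log of the joint density in Equation (8)."""
    beta0, beta1, sigma = theta
    mu = beta0 + beta1 * x
    s = offsets(mu)
    i = np.searchsorted(tau, y, side="right") - 1       # active stress segment
    w = y - tau[i] + s[i]
    z = (np.log(w) - mu[i]) / sigma
    ll = np.where(delta,
                  -np.log(sigma * w) + norm.logpdf(z),  # observed failures
                  norm.logsf(z))                        # right-censored units
    return -np.sum(ll)

y, delta = simulate(200, *true)
bounds = [(0.0, 6.0), (-3.0, 0.0), (0.1, 3.0)]
res = dual_annealing(neg_loglik, bounds, args=(y, delta), seed=3, maxiter=200)
print(res.x)  # MLEs of (beta_0, beta_1, sigma); compare with `true`
```

Because dual_annealing searches the whole bounded region, no starting values are needed, which mirrors the robustness argument made above for simulated annealing.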

I. Numerical Illustration

The optimal solutions were obtained by the simulated annealing algorithm as proposed by Corana et al. (1987). It can readily be seen that the estimated asymptotic variances based on complete data are the smallest, followed by those based on censored data within the search range (0, 50], and then by those with inspection intervals chosen according to certain ratios. The aim is to determine the optimal unequal time points (τ_1 < τ_2 < ... < τ_k) that minimize the large-sample approximate variance of the MLE of the 100p-th quantile (0 < p < 1) of the log-lifetime distribution at the normal-use stress x_0. The MLE of the 100p-th quantile at the normal-use stress x_0 can be expressed as ŷ_p = β̂_0 + β̂_1 x_0 + z_p σ̂, where z_p is the 100p-th percentile of the standard log-location-scale distribution. Thus, if x_0 = 0, the asymptotic variance of the estimator ŷ_p at the normal-use stress x_0 is given by

AVar(ŷ_p) = AVar(β̂_0 + z_p σ̂) = V_11 + z_p² V_33 + 2 z_p V_13,   (12)

where V_ij denotes the (i, j)-th element of the inverse of the Fisher information matrix. It is verified that, under the C-optimality criterion, the 2-level step-stress ALT plan is preferable whenever the general k-level step-stress ALT plan is optimized under the considered censoring scheme; the results are shown in Table 1. The ratio 2:1 means that the inspection interval for the first stage is twice as long as that for the second stage.
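Given an estimated Fisher information matrix for (β_0, β_1, σ), Equation (12) is a one-line computation. The matrix below is a hypothetical illustration; in practice it would be assembled from the second derivatives indicated in Equation (10):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical Fisher information matrix for (beta_0, beta_1, sigma);
# illustrative numbers only, not values from the paper.
I = np.array([[120.0, 60.0, 10.0],
              [ 60.0, 40.0,  6.0],
              [ 10.0,  6.0, 80.0]])
V = np.linalg.inv(I)  # asymptotic covariance matrix, entries V_ij

def avar_quantile(p):
    """Equation (12): AVar(y_p) = V_11 + z_p^2 V_33 + 2 z_p V_13 at x_0 = 0,
    with z_p the standard normal percentile for the lognormal member."""
    z_p = norm.ppf(p)
    return V[0, 0] + z_p**2 * V[2, 2] + 2.0 * z_p * V[0, 2]

for p in (0.5, 0.95):
    print(p, avar_quantile(p))
```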

IV. Discussion

Table 1: Censoring scheme: MLEs, relative absolute biases (RAB), and mean squared errors (MSE) for several choices of the true parameters (c_0, β_0, γ_0, θ_0)

(c_0, β_0, γ_0, θ_0)     Parameter   MLE       RAB       MSE
(1.0, 1.0, 1.25, 0.7)    c           1.10902   0.10902   0.01189
                         β           0.98314   0.01686   0.00028
                         γ           1.22511   0.01991   0.00062
                         θ           0.88398   0.11602   0.01346
                         α_1         1.38040   0.10486   0.01717
                         α_2         0.69831   0.11785   0.00542
                         α_3         0.46874   0.12552   0.00273
(1.0, 1.0, 1.3, 1.0)     c           1.15695   0.15695   0.02463
                         β           0.98460   0.01540   0.00024
                         γ           1.23098   0.05309   0.00476
                         θ           0.87707   0.12293   0.12446
                         α_1         1.44053   0.15299   0.03654
                         α_2         0.72799   0.16536   0.01067
                         α_3         0.48837   0.17266   0.00517
(1.0, 1.0, 1.5, 1.0)     c           1.35169   0.35169   0.12369
                         β           0.99012   0.00988   0.00010
                         γ           1.25504   0.16331   0.06001
                         θ           0.84992   0.15008   0.02252
                         α_1         1.68507   0.34872   0.18982
                         α_2         0.84832   0.35799   0.05001
                         α_3         0.56782   0.36344   0.02291
(1.0, 1.1, 1.4, 1.0)     c           1.26473   0.26473   0.07008
                         β           0.97595   0.11277   0.01539
                         γ           1.37052   0.02106   0.00087
                         θ           0.84740   0.15260   0.02329
                         α_1         1.57170   0.23028   0.08654
                         α_2         0.79906   0.34074   0.04124
                         α_3         0.53793   0.40990   0.02446
(1.25, 1.1, 1.25, 1.0)   c           1.14000   0.08800   0.01210
                         β           0.95459   0.13219   0.02114
                         γ           1.37506   0.10005   0.01564
                         θ           1.01030   0.01030   0.00011
                         α_1         1.40997   0.11705   0.03494
                         α_2         0.72753   0.02342   0.00030
                         α_3         0.49403   0.03589   0.00029
(1.4, 1.0, 1.0, 0.7)     c           0.91213   0.34848   0.23802
                         β           0.68352   0.31648   0.10016
                         γ           1.25991   0.25991   0.06755
                         θ           1.10442   0.57774   0.16356
                         α_1         1.06206   0.39281   0.47207
                         α_2         0.66129   0.24387   0.04549
                         α_3         0.50122   0.14034   0.00670
(1.4, 1.2, 1.0, 0.9)     c           0.91672   0.34520   0.23356
                         β           0.84089   0.29926   0.12896
                         γ           1.49253   0.49253   0.24259
                         θ           1.10045   0.22272   0.04018
                         α_1         1.10547   0.39552   0.52319
                         α_2         0.61718   0.22467   0.03198
                         α_3         0.43888   0.10314   0.00255

Table 2: Variance-optimality under the step-stress setting

p     x_1   x_2   Complete data   Censored data:        Ratio-based plans: τ_1 (% increase in AVar)
                  AVar            optimal (τ_1, τ_2)
0.5   0.2   0.5   10.2895         (10.0470, 20.0000)    10.0312 (3.48)    10.0000 (3.35)    6.6667 (3.99)
0.5   0.2   1.0   11.0011         (10.7887, 20.0000)    10.8267 (2.45)    10.8267 (2.50)    6.6667 (3.10)
0.5   0.4   0.5    6.8724         ( 6.8408, 20.0000)     6.5817 (15.81)    6.5817 (16.02)   6.6667 (15.83)
0.5   0.4   1.0    7.3327         ( 7.3107, 20.0000)     7.1133 (8.71)     7.1133 (8.32)    6.6667 (8.38)
0.95  0.2   0.5   13.6090         (12.6071, 20.0000)    13.3333 (7.48)    10.0000 (7.87)    6.6667 (11.03)
0.95  0.2   1.0   14.9424         (14.0111, 20.0000)    13.3333 (5.87)    10.0000 (6.56)    6.6667 (9.42)
0.95  0.4   0.5    8.1548         ( 8.0935, 20.0000)     7.8435 (26.76)    8.0765 (25.03)   6.6667 (25.91)
0.95  0.4   1.0    9.0548         ( 9.0018, 20.0000)     8.8341 (14.17)    9.0146 (14.08)   6.6667 (16.19)

The stress levels are x_l, l = 1, ..., k, with k = 2, β_0 = 2.5, β_1 = −1.0, and σ = 0.5. The table reports the optimal change points leading to variance-optimality, together with the associated asymptotic variance based on the censored data when the lengths of the inspection intervals were chosen according to a certain ratio.

V. Conclusion

Numerous stochastic models of equipment deterioration have been put forth based on the physics of failure and the operational environment. The essential idea of these models is that they are created by modeling the underlying mechanisms that lead to failure, like degradation and wear, using the appropriate stochastic processes. A unified theory of predictive maintenance must be developed through the creation and analysis of these stochastic deterioration models.

These models produce time-to-failure and residual-life distributions that are theoretically challenging to work with. Our study's goal is to determine whether conventional time-to-failure distributions, like the Weibull, can be used to approximate the time-to-first-failure distributions that arise from stochastic deterioration models. We initially built a discrete-event simulation model that simulates the stochastic deterioration and failure of the target system, which is a single-unit system subject to a random operating environment with a variable instantaneous rate of degradation. The typical time-to-failure distribution is then fitted to the simulated data using a predetermined methodology. The quality of this fit is assessed using a large-scale numerical experiment with a variety of system characteristics. According to the findings of the goodness-of-fit tests, a truncated, three-parameter Weibull distribution is a fair approximation for the scenario discussed in the study. A rough sketch of this simulate-and-fit workflow is given below.
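In the sketch, the deterioration mechanism, its numerical settings, and the use of an ordinary (non-truncated) three-parameter Weibull fit are simplifying assumptions for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical discrete-event sketch: a single unit degrades at a random
# instantaneous rate that is re-drawn at random environment-change epochs,
# and fails when cumulative degradation crosses a threshold.
def one_failure_time(threshold=10.0, dt=0.05, rate_mean=0.5, p_change=0.02):
    level, t = 0.0, 0.0
    rate = rng.exponential(rate_mean)
    while level < threshold:
        if rng.uniform() < p_change:  # environment (and degradation rate) changes
            rate = rng.exponential(rate_mean)
        level += rate * dt
        t += dt
    return t

t_fail = np.array([one_failure_time() for _ in range(500)])

# Fit a three-parameter Weibull (shape c, location, scale) and check the fit.
c, loc, scale = stats.weibull_min.fit(t_fail)
ks = stats.kstest(t_fail, "weibull_min", args=(c, loc, scale))
print(f"shape={c:.3f}, loc={loc:.3f}, scale={scale:.3f}, KS p-value={ks.pvalue:.3f}")
```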

In accelerated life testing analysis (ALTA), the analysis is carried out at elevated stress conditions, and the extrapolation from the heightened stress levels to the usage stress level is based on the life-stress relationship. Product performance can be forecast at any level at which the life-stress relationship holds, including the usage stress level.

References

[1] Aly, H. M. and Ismail, A. A. (2008). Optimum Simple Time-Step Stress Plans for Partially Accelerated Life Testing with Censoring. Far East Journal of Theoretical Statistics, 24: 175-200.

[2] Abdel-Ghaly, A. A., Attia, A. F., and Aly, H. M. (1998). Estimation of the parameters of the Pareto distribution and the reliability function using accelerated life testing with censoring. Communications in Statistics Part B, 27: 469-484.

[3] Alhadeed, A. A. and Yang, S.-S. (2005). Optimal Simple Step-Stress Plan for Cumulative Exposure Model Using Log-Normal Distribution. IEEE Transactions on Reliability, 54: 64-68.

[4] Bai, D. S. and Chung, S. W. (1989). An accelerated life test model with the inverse power law. Reliability Engineering and System Safety, 24: 223-230.

[5] Bugaighis, M. M. (1990). Properties of the MLE for parameters of a Weibull regression model under type I censoring. IEEE Transactions on Reliability, 39: 102-105.

[6] Bai, D. S., Kim, M. S., and Lee, S. H. (1989). Optimum Simple Step-Stress Accelerated Life Tests with Censoring. IEEE Transactions on Reliability, 38: 528-532.

[7] Bhattacharyya, G. K. and Soejoeti, Z. (1989). A tampered failure rate model for step-stress accelerated life test. Communications in Statistics - Theory and Methods, 18: 1627-1643.

[8] Liao, C.-M. and Tseng, S.-T. (2006). Optimal Design for Step-Stress Accelerated Degradation Tests. IEEE Transactions on Reliability, 55: 59-66.

[9] Xiong, C. and Milliken, G. A. (1999). Step-Stress Life-Testing with Random Stress-Change Times for Exponential Data. IEEE Transactions on Reliability, 48: 141-148.

[10] Park, C. and Padgett, W. J. (2005). New Cumulative Damage Models for Failure Using Stochastic Processes as Initial Damage. IEEE Transactions on Reliability, 54: 530-540.

[11] El-Dessouky, E. A. (2001). On the Use of Bayesian Approach in Accelerated Life Testing. M.S. thesis, Institute of Statistical Studies and Research, Cairo University, Egypt.

[12] Gouno, E. (2007). Optimum step-stress for temperature accelerated life testing. Quality and Reliability Engineering International, 23: 915-924.

[13] Lawless, J. and Crowder, M. (2004). Covariates and Random Effects in a Gamma Process Model with Application to Degradation and Failure. Lifetime Data Analysis, 10: 213-227.

[14] Johnson, N. L., Kotz, S., and Balakrishnan, N. (1995). Continuous Univariate Distributions. Wiley Series in Probability and Mathematical Statistics.

[15] Lu, C. J. and Meeker, W. Q. (1993). Using Degradation Measures to Estimate a Time-to-Failure Distribution. Technometrics, 35: 161-174.

[16] Lawless, J. F. (1976). Confidence interval estimation in the inverse power law model. Journal of the Royal Statistical Society: Series C, 25: 128-138.

[17] Lu, C. J. and Meeker, W. Q. (1993). Using Degradation Measures to Estimate a Time-to-Failure Distribution. Technometrics, 35: 161-174.

[18] McCool, J. I. (1980). Confidence limits for Weibull regression with censored data. IEEE Transactions on Reliability, 29: 145-150.

[19] Meeker, W. Q., Escobar, L. A., and Lu, C. J. (1998). Accelerated degradation tests: modeling and analysis. Technometrics, 40: 89-99.

[20] Mathai, A. M. and Provost, S. B. (2004). On the distribution of order statistics from generalized logistic samples. International Journal of Statistics, 62: 63-71.

[21] Carey, M. B. and Koenig, R. H. (1991). Reliability Assessment Based on Accelerated Degradation: A Case Study. IEEE Transactions on Reliability, 40: 499-506.

[22] Ahmad, N., Islam, A., and Salam, A. (2006). Analysis of optimal accelerated life test plans for periodic inspection. International Journal of Quality & Reliability Management, 23: 1019-1046.

[23] Nelson, W. and Kielpinski, T. J. (1976). Theory for optimum censored accelerated life tests for normal and lognormal life distributions. Technometrics, 18: 105-114.

[24] Park, C. and Padgett, W. J. (2005). Accelerated Degradation Models for Failure Based on Geometric Brownian Motion and Gamma Processes. Lifetime Data Analysis, 11: 511-527.

[25] Rai, R. K., Sarkar, S., and Singh, V. P. (2009). Evaluation of the adequacy of statistical distribution functions for deriving unit hydrograph. Water Resources Management, 23: 899-929.

[26] Miller, R. and Nelson, W. (1983). Optimum Simple Step-Stress Plans for Accelerated Life Testing. IEEE Transactions on Reliability, 32: 59-65.

[27] Shabri, A., Ahmad, U. N., and Zakaria, Z. A. (2011). TL-moments and L-moments estimation of the generalized logistic distribution. Journal of Mathematical Research, 10: 97-106.

[28] Whitmore, G. A. and Schenkelberg, F. (1997). Modeling Accelerated Degradation Data Using Wiener Diffusion with a Time Scale Transformation. Lifetime Data Analysis, 3: 27-45.

[29] Meeker, W. Q., Escobar, L. A., and Lu, C. J. (1998). Accelerated Degradation Tests: Modeling and Analysis. Technometrics, 40: 89-99.

[30] Watkins, A. J. (1991). On the analysis of accelerated life-testing experiments. IEEE Transactions on Reliability, 40: 98-101.
