
Kuiper's Test

The Kuiper goodness-of-fit test [1] (the name is also transliterated as "Cooper") is a development of the Kolmogorov goodness-of-fit test. It was proposed for testing simple hypotheses that the analyzed sample belongs to a fully specified distribution law, that is, hypotheses of the form H_0: F_n(x) = F(x, θ) with a known parameter vector θ of the theoretical law.

The Kuiper test uses a statistic of the form

    V_n = D_n^+ + D_n^-,

where

    D_n^+ = max_i ( i/n − F(x_i, θ) ),
    D_n^- = max_i ( F(x_i, θ) − (i − 1)/n ),    i = 1, …, n,

n is the sample size, and x_1, x_2, …, x_n are the sample values sorted in increasing order.
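As a minimal sketch of how this statistic can be computed (the function name `kuiper_statistic` and the NumPy-based implementation are illustrative assumptions, not part of the cited sources):

```python
import numpy as np

def kuiper_statistic(sample, cdf):
    """Kuiper statistic V_n = D_n^+ + D_n^- for a sample against a
    fully specified theoretical CDF F(x, theta) (a sketch)."""
    x = np.sort(np.asarray(sample, dtype=float))  # order statistics x_1 <= ... <= x_n
    n = len(x)
    f = cdf(x)                          # F(x_i, theta) at each order statistic
    i = np.arange(1, n + 1)
    d_plus = np.max(i / n - f)          # D_n^+ = max( i/n - F(x_i) )
    d_minus = np.max(f - (i - 1) / n)   # D_n^- = max( F(x_i) - (i-1)/n )
    return d_plus + d_minus

# Example: test a sample against the uniform law on [0, 1], whose CDF is F(x) = x.
rng = np.random.default_rng(0)
sample = rng.uniform(size=100)
v_n = kuiper_statistic(sample, lambda x: x)
```

For a single observation x_1 = 0.5 under the uniform law, D_n^+ = D_n^- = 0.5, so V_n = 1.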

When the simple hypothesis under test is true, the statistic √n · V_n obeys, in the limit [1], the distribution

    G(v) = 1 − Σ_{m=1}^{∞} 2 (4 m² v² − 1) e^{−2 m² v²}.
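The limiting distribution can be evaluated numerically by truncating the series; the helper below is a hypothetical sketch (the truncation length `terms=100` is an assumption, chosen because the series converges very quickly):

```python
import numpy as np

def kuiper_limit_cdf(v, terms=100):
    """Limiting CDF of sqrt(n) * V_n under a simple hypothesis:
    G(v) = 1 - sum_{m>=1} 2*(4*m^2*v^2 - 1)*exp(-2*m^2*v^2)  (sketch)."""
    if v <= 0:
        return 0.0
    m = np.arange(1, terms + 1, dtype=float)
    tail = np.sum(2.0 * (4.0 * m**2 * v**2 - 1.0) * np.exp(-2.0 * m**2 * v**2))
    return 1.0 - tail

# An approximate p-value for an observed statistic V_n on a sample of size n
# would then be 1 - kuiper_limit_cdf(sqrt(n) * V_n).
```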

To reduce the dependence of the distribution of the statistic on the sample size, one can use a modified statistic of the form [2]

    V = V_n (√n + 0.155 + 0.24/√n),

or a modification of the form [3]

    V_n^mod = √n (D_n^+ + D_n^-) + 1/(3√n).

In the first case, the difference between the distribution of the statistic and the limiting law can be neglected for n > 20; in the second, for n > 30.
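The two modifications are simple rescalings of the basic statistic; a sketch (function names are illustrative, chosen after the authors of [2] and [3]):

```python
import math

def kuiper_modified_stephens(v_n, n):
    """Modification [2]: V = V_n * (sqrt(n) + 0.155 + 0.24/sqrt(n));
    close to the limiting law already for n > 20 (a sketch)."""
    return v_n * (math.sqrt(n) + 0.155 + 0.24 / math.sqrt(n))

def kuiper_modified_lemeshko(d_plus, d_minus, n):
    """Modification [3]: V_n^mod = sqrt(n)*(D_n^+ + D_n^-) + 1/(3*sqrt(n));
    adequate for n > 30 (a sketch)."""
    return math.sqrt(n) * (d_plus + d_minus) + 1.0 / (3.0 * math.sqrt(n))
```

For example, with n = 25 and V_n = 0.3, the first modification gives 0.3 · (5 + 0.155 + 0.048) = 1.5609.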

When testing simple hypotheses, the test is distribution-free: the distribution of its statistic does not depend on the law with which agreement is checked.

The hypothesis under test is rejected for large values of the statistic.

Testing composite hypotheses

When testing composite hypotheses of the form H_0: F_n(x) ∈ {F(x, θ), θ ∈ Θ}, where the estimate θ̂ of the scalar or vector parameter of the distribution F(x, θ) is computed from the same sample, the Kuiper test (like all nonparametric goodness-of-fit tests) loses the distribution-free property [4].

When composite hypotheses are tested, the distributions of the statistics of nonparametric goodness-of-fit tests depend on a number of factors: the form of the law F(x, θ) corresponding to the true hypothesis H_0; the type and number of estimated parameters; in some cases the specific parameter values (for example, for the gamma and beta distribution families); and the method of parameter estimation. The differences between the limiting distributions of the same statistic under simple and under composite hypotheses are so significant that they can never be neglected [5].

See also

  • Kolmogorov–Smirnov test
  • Cramér–von Mises–Smirnov test
  • Anderson–Darling test
  • Pearson's chi-squared test
  • Watson's test

Notes

  1. ↑ 1 2 Kuiper N. H. Tests concerning random points on a circle // Proc. Koninkl. Nederl. Akad. van Wetenschappen. 1960. Ser. A. V. 63. P. 38–47.
  2. ↑ Stephens M. A. EDF statistics for goodness of fit and some comparisons // J. American Statistical Association. 1974. V. 69. N 347. P. 730–737.
  3. ↑ Lemeshko B. Yu., Gorbunova A. A. On the application and power of the nonparametric Kuiper, Watson and Zhang goodness-of-fit tests // Measurement Techniques. 2013. No. 5. P. 3–9.
  4. ↑ Kac M., Kiefer J., Wolfowitz J. On tests of normality and other tests of goodness of fit based on distance methods // Ann. Math. Stat. 1955. V. 26. P. 189–211.
  5. ↑ Lemeshko B. Yu., Gorbunova A. A. Application of the nonparametric Kuiper and Watson goodness-of-fit tests when testing composite hypotheses // Measurement Techniques. 2013. No. 9. P. 14–21.
Source - https://ru.wikipedia.org/w/index.php?title=Kuyper_consent_criteria&oldid=77417459

