
Minimum distance classification rules for high dimensional data


Author Info

  • Srivastava, Muni S.


    In this article, the problem of classifying a new observation vector into one of two known groups Π_i, i = 1, 2, distributed as multivariate normal with a common covariance matrix is considered. The total number of observation vectors from the two groups is, however, less than the dimension of the observation vectors. A sample squared distance between the two groups, based on the Moore-Penrose inverse, is introduced. A classification rule based on the minimum distance is proposed to classify an observation vector into two or several groups. An expression for the error of misclassification when there are only two groups is derived for large p and n = O(p^δ), 0 < δ < 1.
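    The rule described in the abstract can be illustrated with a minimal numpy sketch: when the pooled sample covariance matrix is singular (p larger than the total sample size), its Moore-Penrose pseudoinverse replaces the ordinary inverse in the sample squared distance, and a new observation is assigned to the group whose mean is nearer. The group sizes, dimension, and means below are arbitrary assumptions for illustration; this is not the paper's exact estimator or error analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setting: dimension p exceeds total sample size n1 + n2,
    # so the pooled covariance matrix is singular (sizes chosen arbitrarily).
    p, n1, n2 = 50, 10, 12
    X1 = rng.normal(0.0, 1.0, size=(n1, p))  # sample from group Pi_1
    X2 = rng.normal(1.0, 1.0, size=(n2, p))  # sample from group Pi_2 (shifted mean)

    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

    # Pooled sample covariance; with n1 + n2 - 2 < p its rank is below p,
    # so the ordinary inverse does not exist.
    S = ((X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)) / (n1 + n2 - 2)

    # Moore-Penrose inverse in place of the classical inverse.
    S_pinv = np.linalg.pinv(S)

    def classify(x):
        """Assign x to the group minimizing the sample squared distance
        d_i(x) = (x - m_i)' S^+ (x - m_i)."""
        d1 = (x - m1) @ S_pinv @ (x - m1)
        d2 = (x - m2) @ S_pinv @ (x - m2)
        return 1 if d1 <= d2 else 2

    x_new = rng.normal(1.0, 1.0, size=p)  # a fresh draw from Pi_2
    print(classify(x_new))
    ```

    The same minimum-distance comparison extends to several groups by computing d_i(x) for each group mean and choosing the smallest.
    
    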

    Download Info

    Download Restriction: Full text for ScienceDirect subscribers only

    As access to this document is restricted, you may want to look for a different version under "Related research" below, or search for a different version of it.

    Bibliographic Info

    Article provided by Elsevier in its journal Journal of Multivariate Analysis.

    Volume (Year): 97 (2006)
    Issue (Month): 9 (October)
    Pages: 2057-2070

    Handle: RePEc:eee:jmvana:v:97:y:2006:i:9:p:2057-2070


    Related research

    Keywords: Fisher discriminant rule; Misclassification error; Moore-Penrose inverse; Multivariate normal; Singular Wishart


    No references listed on IDEAS



    Cited by:
    1. Yata, Kazuyoshi & Aoshima, Makoto, 2012. "Effective PCA for high-dimension, low-sample-size data with noise reduction via geometric representations," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 193-215.





