

Appendix 2. Frequently Asked Questions



This document is divided into two parts, as follows.

1. Questions About This Package
2. Questions About The Theory Behind This Package



1. Questions About This Package

Q;  Where can I find this software package on the internet?
A; (1) World Wide Web Address (URL) for Our Research Lab
    *  http://www-ee.uta.edu/ip
   (2) ftp access
       directory pc/artificial at garbo.uwasa.fi  
       directory pub/neural/software/neucl at ftp.uni-bremen.de
       directory SimTel/msdos/neurlnet at ftp.coast.net
     * directory pub/neural/software at www-ee.uta.edu
* most up-to-date version

Q;  What capabilities does this software package have that 
    differentiate it from those developed elsewhere?
A;  This package 
 (1) Includes a network sizing program that allows one to estimate
     how many hidden units an MLP must have to achieve a user-chosen
     level of performance. This program usually, but not always, works.
 (2) Includes a fast training program unlike others that are available.
     This technique is about three times faster than full-blown conjugate
     gradient training (that is, without heuristic changes for speeding
     it up), and performs much better. Training is 10 to 1000 times
     faster than backpropagation.
 (3) Includes a network structure analysis program. Given a trained 
     MLP, this program makes a table of network performance versus 
     the number of hidden units. Using the table, and the non-demo 
     version of this program, the user can choose the size wanted 
     and prune the network, saving new weight and network structure files.
     The demo and non-demo versions can also determine the amount of 
     nonlinearity (degree) in each hidden layer, thereby informing the 
     user whether a linear network would solve the problem.

Q;  Can the programs in this package be run individually, without the menus?
A;  Yes. Exe files from the \NNMap directory, and their purposes, are
    listed below.
        Classify.exe - Classify vectors using clusters from Clustr.exe or Som.exe.
        Clustr.exe - Sequential leader and K-means clustering program
        Combine.exe - Combines columns from 1 or more files into a new file
        Conmap.exe - Read a training data file for mapping nets, count patterns 
        Exam.exe - Plot columns of a data file, or their histograms 
        Gen.exe - Generate parameter files for MLP sizing, training, pruning
        Init6.exe - Fast design of MLP mapping networks
        Jinn.exe - Code generation program
        Mod.exe - Design and application of modular networks
        Pikfeat.exe - Feature selection program
        Polymap.exe - Functional link net mapping program
        Procm.exe - Process a data file using a trained MLP 
        Series.exe - Convert time series data into training data format
        Som.exe - Self-organizing feature map clustering program
        Span2.exe - KLT program
        Split.exe - Program to randomly split a file into two files
        Tmapc2.exe - Pruning of MLP mapping nets
        Tmpcmod.exe - Polynomial modelling of MLP mapping nets
        Topolm.exe - Sizing of MLP mapping networks
        Watecon.exe - Convert an MLP weight file to ASCII form 
     
Q;  What size networks can be designed using this package?
A;  MLP networks can have up to four layers (two hidden layers) with
    40 units per hidden or output layer and up to 100 inputs. Functional
    link nets can have up to 40 inputs and 40 outputs. Under the "Data 
    Compression" option, which is under "Data Pre-Processing", you can 
    compress input vectors or desired output vectors from up to 200 
    elements down to 100 or fewer elements. If a network has been 
    trained to approximate compressed desired outputs, the "Data 
    Compression" program includes an expansion option which can be 
    used to expand the compressed outputs up to the desired size (up 
    to 200 outputs).
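Since Span2.exe is described above as a KLT program, compression of this kind can be illustrated with a Karhunen-Loeve (principal component) transform. The sketch below, in Python with NumPy, is only an assumed illustration of such transform coding, not the package's actual algorithm; the function names and array shapes are invented for the example.

```python
import numpy as np

def klt_compress(x, n_keep):
    """Project the rows of x (Npat x Nin) onto the n_keep eigenvectors
    of the covariance matrix having the largest eigenvalues."""
    mean = x.mean(axis=0)
    xc = x - mean
    cov = xc.T @ xc / len(x)
    vals, vecs = np.linalg.eigh(cov)               # ascending eigenvalues
    basis = vecs[:, np.argsort(vals)[::-1][:n_keep]]
    return xc @ basis, basis, mean

def klt_expand(y, basis, mean):
    """Approximately invert klt_compress (exact when n_keep = Nin)."""
    return y @ basis.T + mean
```

With n_keep equal to the input dimension the round trip is exact; with fewer components it gives the least-mean-square-error linear reconstruction, which is the sense in which KLT compression is optimal.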

Q;  Why does this package design only mapping or estimation networks and
    not classification networks? Don't both types of network use 
    the same format for training data and the same training algorithms?
    
A;  We have separate packages for classification and mapping or 
    estimation because:
 (1) Our training algorithms for classification and mapping networks have 
     some important differences. For example, the functional link net 
     design for a mapping net is not iterative, whereas that for 
     classification nets is iterative. The fast-trained MLP classification
      network learns even when the learning factor is set to zero, unlike
     the MLP for mapping.  
 (2) Combining the two packages would make the result unnecessarily large.
 (3) Many people need to do mapping or classification but not both.

Q;  What error function is being minimized during fast training and 
    functional link net training?

                   Nout      
A;  MSE = (1/Npat) SUM MSE(k)     where
                   k=1  

              Npat              2
    MSE(k) =  SUM [ Tpk - Opk ]
              p=1  

    where Npat is the number of training patterns, Nout is the number 
    of network output nodes, Tpk is the desired output for the pth
    training pattern and the kth output, and Opk is the actual output
    for the pth training pattern and the kth output. MSE is printed
    for each iteration.
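For concreteness, the error function above can be computed directly from arrays of desired and actual outputs. This NumPy sketch follows the definitions exactly; the array names T and O are illustrative and not part of the package.

```python
import numpy as np

def mse_per_output(T, O):
    """MSE(k): squared error summed over the Npat patterns,
    for desired T and actual O of shape (Npat, Nout)."""
    return ((T - O) ** 2).sum(axis=0)

def overall_mse(T, O):
    """MSE = (1/Npat) * sum over k of MSE(k)."""
    Npat = T.shape[0]
    return mse_per_output(T, O).sum() / Npat
```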

Q;  What are "RMS error", "Relative RMS Error", and "Error Variances"?
A;  The rms error of the kth output, RMS(k), is SQRT( MSE(k)/Npat ),
    where SQRT denotes the square root. The kth output's Relative RMS Error is

    R(k) = SQRT( MSE(k)/E(k) ) where

            Npat           2
    E(k) =  SUM [ Opk-Mk ]      and
            p=1  

                  Npat 
    Mk = (1/Npat) SUM  Opk 
                  p=1  

    The kth output's Error Variance is MSE(k)/Npat.
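These three statistics can be computed side by side from the same arrays as the MSE. A minimal NumPy sketch following the definitions above (again with illustrative names only, not the package's code):

```python
import numpy as np

def error_stats(T, O):
    """Return (RMS(k), R(k), Error Variance(k)) for each output k,
    given desired T and actual O of shape (Npat, Nout)."""
    Npat = T.shape[0]
    mse_k = ((T - O) ** 2).sum(axis=0)             # MSE(k)
    rms = np.sqrt(mse_k / Npat)                    # RMS error
    E = ((O - O.mean(axis=0)) ** 2).sum(axis=0)    # E(k)
    rel_rms = np.sqrt(mse_k / E)                   # Relative RMS Error
    err_var = mse_k / Npat                         # Error Variance
    return rms, rel_rms, err_var
```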


2. Questions About The Theory Behind This Package
   
Q;  Do you have any papers related to modular neural nets?
A;  Yes.

    K. Rohani, M.S. Chen and M.T. Manry, "Neural Subnet Design by
    Direct Polynomial Mapping," IEEE Transactions on Neural Networks,
    Vol. 3, no. 6, pp. 1024-1026, November 1992. 

    K. Rohani and M.T. Manry, "Multi-Layer Neural Network Design Based
    on a Modular Concept," Journal of Artificial Neural Networks,
    vol. 1, no. 3, 1994, pp. 349-370.
    
    
Q;  Do you have any papers related to fast training of MLPs, and
    related topics?
A;  Yes. 
    
    M.S. Dawson, A.K. Fung, M.T. Manry, "Sea Ice Classification Using
    Fast Learning Neural Networks," Proc. of IGARSS'92, Houston, Texas,
    May 1992, vol. II, pp. 1070-1071.
    
    M.S. Dawson, J. Olvera, A.K. Fung, M.T. Manry, "Inversion of
    Surface Parameters Using Fast Learning Neural Networks," Proc. of
    IGARSS'92, Houston, Texas, May 1992, vol. II, pp. 910-912.
    
    M.T. Manry, X. Guan, S.J. Apollo, L.S. Allen, W.D. Lyle, and W.
    Gong, "Output Weight Optimization for the Multi-Layer Perceptron,"
    Conference Record of the Twenty-Sixth Annual Asilomar Conference on
    Signals, Systems, and Computers, Oct. 1992, vol. 1, pp. 502-506.
    
    X. Jiang, Mu-Song Chen, and M.T. Manry, "Compact Polynomial
    Modeling of the Multi-Layer Perceptron," Conference Record of the
    Twenty-Sixth Annual Asilomar Conference on Signals, Systems, and
    Computers, Oct. 1992, vol. 2, pp. 791-795.
    
    R.R. Bailey, E.J. Pettit, R.T. Borochoff, M.T. Manry, and X. Jiang,
    "Automatic Recognition of USGS Land Use/Cover Categories Using
    Statistical and Neural Network Classifiers," Proceedings of SPIE
    OE/Aerospace and Remote Sensing, April 12-16, 1993, Orlando
    Florida.
    
    M.S. Dawson, A.K. Fung, M.T. Manry, "Classification of SSM/I Polar
    Sea Ice Data Using Neural Networks," Proc. of PIERS 93, 1993, p.
    572.
    
    F. Amar, M.S. Dawson, A.K. Fung, M.T. Manry, "Analysis of
    Scattering and Inversion From Forest," Proc. of PIERS 93, 1993, p.
    162.
    
    M.S. Dawson, A.K. Fung, and M.T. Manry, "Surface Parameter
    Retrieval Using Fast Learning Neural Networks," Remote Sensing
    Reviews, Vol. 7, pp. 1-18, 1993.

    M.T. Manry, S.J. Apollo, L.S. Allen, W.D. Lyle, W. Gong, M.S.
    Dawson, and A.K. Fung, "Fast Training of Neural Networks for Remote
    Sensing," Remote Sensing Reviews, vol. 9, pp. 77-96, 1994.

Q;  Do you have any papers related to the analysis of trained neural
    networks?
A;  Yes. 
    
    W. Gong, H.C. Yau, and M.T. Manry, "Non-Gaussian Feature Analyses
    Using a Neural Network," Progress in Neural Networks, vol. 2, 1994,
    pp. 253-269.
    
    M.S. Chen and M.T. Manry, "Back-Propagation Representation Theorem
    Using Power Series," Proceedings of IJCNN 90, San Diego, I-643 to
    I-648.
    
    M.S. Chen and M.T. Manry, "Basis Vector Analyses of Back-
    Propagation Neural Networks," Proceedings of the 34th Midwest
    Symposium on Circuits and Systems, Monterey, California, May 14-17
    1991, vol. 1, pp. 23-26. 
    
    M.S. Chen and M.T. Manry, "Power Series Analyses of Back-
    Propagation Neural Networks," Proc. of IJCNN 91, Seattle WA., pp.
    I-295 to I-300.
    
    M.S. Chen and M.T. Manry, "Nonlinear Modelling of Back-Propagation
    Neural Networks," Proc. of IJCNN 91, Seattle WA., p. A-899.
    
    M.S. Chen and M.T. Manry, "Basis Vector Representation of Multi-
    Layer Perceptron Neural Networks," submitted to IEEE Transactions
    on Neural Networks.
    
    X. Jiang, Mu-Song Chen, M.T. Manry, M.S. Dawson, A.K. Fung,
    "Analysis and Optimization of Neural Networks for Remote Sensing,"
    Remote Sensing Reviews, vol. 9, pp. 97-114, 1994.

    M.S. Chen and M.T. Manry, "Conventional Modelling of the Multi-
    Layer Perceptron Using Polynomial Basis Functions," IEEE
    Transactions on Neural Networks, Vol. 4, no. 1, pp. 164-166,
    January 1993. 

    W.E. Weideman, M.T. Manry, H.C. Yau, and W. Gong "Comparisons of a
    Neural Network and a Nearest Neighbor Classifier Via the Numeric
    Handprint Character Recognition Problem," IEEE Transactions on
    Neural Networks, Vol. 6, no. 6, pp. 1524-1530, November 1995.
    
    
Q;  Do you have any papers related to the prediction of neural net 
    performance, and pre-processing of data?
A;  Yes. 
    
    S.J. Apollo, M.T. Manry, L.S. Allen, and W.D. Lyle, "Optimality of
    Transforms for Parameter Estimation,"  Conference Record of the
    Twenty-Sixth Annual Asilomar Conference on Signals, Systems, and
    Computers, Oct. 1992, vol. 1, pp. 294-298.
    
    Q. Yu, S.J. Apollo, and M.T. Manry, "MAP Estimation and the
    Multilayer Perceptron," Proceedings of the 1993 IEEE Workshop on
    Neural Networks for Signal Processing, Linthicum Heights, Maryland,
    Sept. 6-9, 1993, pp. 30-39.
    
    M.T. Manry, S.J. Apollo, and Q. Yu, "Minimum Mean Square Estimation
    and the Multilayer Perceptron," accepted by Neurocomputing.
    
    W. Liang, M.T. Manry, Q. Yu, S.J. Apollo, M.S. Dawson, and A.K.
    Fung, "Bounding the Performance of Neural Network Estimators, Given
    Only a Set of Training Data," Conference Record of the
    Twenty-Eighth Annual Asilomar Conference on Signals, Systems, and
    Computers, vol. 2, Nov. 1994, pp. 912-916.
    
    W. Liang, M.T. Manry, S.J. Apollo, M.S. Dawson, and A.K. Fung,
    "Stochastic Cramer Rao Bounds for Non-Gaussian Signals and
    Parameters," Proceedings of ICASSP-95, vol. 5, May 1995, pp. 3367-
    3369.
        
    
Q;  Do you have any papers related to the training of functional link
    neural networks?
A;  Yes. 
        
    H.C. Yau and M.T. Manry, "Sigma-Pi Implementation of a Nearest
    Neighbor Classifier," Proceedings of IJCNN 90, San Diego, I-667 to
    I-672.
    
    H.C. Yau and M.T. Manry, "Sigma-Pi Implementation of a Gaussian
    Classifier," Proceedings of IJCNN 90, San Diego, III-825 to
    III-830.
    
    H.C. Yau and M.T. Manry, "Shape Recognition Using Sigma-Pi Neural
    Networks," Proc. of IJCNN 91, Seattle WA., p. II A-934.
    
    H.C. Yau and M.T. Manry, "Shape Recognition with Nearest Neighbor
    Isomorphic Network," Proceedings of the First IEEE-SP Workshop on
    Neural Networks for Signal Processing, Princeton, New Jersey, Sept.
    29 - Oct. 2, 1991, pp. 246-255.
    
    H.C. Yau and M.T. Manry, "Iterative Improvement of a Gaussian
    Classifier," Neural Networks, Vol. 3, pp. 437-443, July 1990.
    
    H.C. Yau and M.T. Manry, "Iterative Improvement of a Nearest
    Neighbor Classifier," Neural Networks, Vol. 4, Number 4, pp.
    517-524, 1991.

    M.T. Manry, L.M. Liu, F. Amar, M.S. Dawson and A.K. Fung, "Image
    Classification in Remote Sensing Using Functional Link Neural
    Networks," Proc. of the IEEE Southwest Symposium on Image Analysis
    and Interpretation, April 1994, pp. 54-58.

    L-M Liu, M.T. Manry, F. Amar, M.S. Dawson, and A.K. Fung,
    "Iterative Improvement of Image Classifiers Using Relaxation,"
    Conference Record of the Twenty-Eighth Annual Asilomar Conference
    on Signals, Systems, and Computers, vol. 2, Nov. 1994, pp. 902-906.

Q;  Do you have any papers on real-world applications of neural networks?
A;  Yes. 

    W. Gong, K.R. Rao, and M.T. Manry, "Progressive Image
    Transmission," IEEE Trans. on Circ. and Syst. for Video Technology,
    vol. 3, no. 6, October 1993, pp. 380-383.
    
    Y. Saifullah and M.T. Manry, "Classification-Based Segmentation of
    ZIP Codes," IEEE Trans. on Systems, Man, and Cybernetics, vol. 23,
    no. 5, September/October 1993, pp. 1437-1443.
    
    K. Liu, S. Subbarayan, R.R. Shoults, M.T. Manry, C. Kwan,
    F.L. Lewis, and J. Naccarino, "Comparison of Very Short-Term Load
    Forecasting Techniques," IEEE Transactions on Power Systems, to
    appear.
    
    M.S. Dawson, A.K. Fung, M.T. Manry, "Classification of SSM/I Polar
    Sea Ice Data Using Neural Networks," Proc. of PIERS 93, 1993, p.
    572.
    
    F. Amar, M.S. Dawson, A.K. Fung, M.T. Manry, "Analysis of
    Scattering and Inversion From Forest," Proc. of PIERS 93, 1993, p.
    162.
    
    M.S. Dawson, M.T. Manry, and A.K. Fung, "Information Retrieval from
    Remotely Sensed Data and a Method to Remove Parameter Estimator
    Ambiguity," Proc. of IGARSS'95, Firenze, Italy, July 10-14 1995. 
    
    F. Amar, M.S. Dawson, M.T. Manry, A.K. Fung, and K.S. Chen,
    "Comparison of Neural Network Algorithms Applicable to Remote
    Sensing Disciplines," Proc. of IGARSS'95, Firenze, Italy, July 10-
    14 1995. 
    

