
Bibliography 1 (for the survey chapter)

1
J. ACZEL AND Z. DAROCZY. On measures of information and their characterizations. Academic Press, New York, 1975.

2
DAVID W. AHA AND RICHARD L. BANKERT. A comparative evaluation of sequential feature selection algorithms. In AI&Statistics-95 [4], pages 1--7.

3
AI&Stats-93: Preliminary Papers of the Fourth International Workshop on Artificial Intelligence and Statistics, Ft. Lauderdale, FL, 3rd--6th, January 1993. Society for AI and Statistics.

4
AI&Stats-95: Preliminary Papers of the Fifth International Workshop on Artificial Intelligence and Statistics, Ft. Lauderdale, FL, 4--7th, January 1995. Society for AI and Statistics.

5
HUSSEIN ALMUALLIM AND THOMAS G. DIETTERICH. Learning boolean concepts in the presence of many irrelevant features. Artificial Intelligence, 69:279--305, 1994.

6
American Association for Artificial Intelligence. AAAI-92: Proceedings of the Tenth National Conference on Artificial Intelligence, San Jose, CA, 12--16th, July 1992. AAAI Press / The MIT Press.

7
American Association for Artificial Intelligence. AAAI-93: Proceedings of the Eleventh National Conference on Artificial Intelligence, Washington, DC, 11--15th, July 1993. AAAI Press / The MIT Press.

8
PETER ARGENTIERO, ROLAND CHIN, AND PAUL BEAUDET. An automated approach to the design of decision tree classifiers. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-4(1):51--57, January 1982.

9
LES ATLAS, RONALD COLE, YESHWANT MUTHUSWAMY, ALAN LIPMAN, JEROME CONNOR, DONG PARK, MUHAMMED EL-SHARKAWI, AND ROBERT J. MARKS II. A performance comparison of trained multilayer perceptrons and trained classification trees. Proceedings of the IEEE, 78(10):1614--1619, 1990.

10
PETER AUER, ROBERT C. HOLTE, AND WOLFGANG MAASS. Theory and applications of agnostic PAC-learning with small decision trees. In ML-95 [250], pages 21--29. Editor: Jeffrey Schlimmer.

11
HALDUN AYTUG, SIDDHARTHA BHATTACHARYA, GARY J. KOEHLER, AND JANE L. SNOWDON. A review of machine learning in scheduling. IEEE Transactions on Engineering Management, 41(2):165--171, May 1994.

12
A. BABIC, E. KRUSINSKA, AND J.-E. STROMBERG. Extraction of diagnostic rules using recursive partitioning systems: A comparison of two approaches. Artificial Intelligence in Medicine, 4(5):373--387, October 1992.

13
L. BAHL, P. F. BROWN, P. V. DE SOUZA, AND R. L. MERCER. A tree-based statistical language model for natural language speech recognition. IEEE Transactions on Acoustics, Speech and Signal Processing, 37(7):1001--1008, 1989.

14
E. BAKER AND A. K. JAIN. On feature ordering in practice and some finite sample effects. In Proceedings of the Third International Joint Conference on Pattern Recognition, pages 45--49, San Diego, CA, 1976.

15
F. A. BAKER, DAVID L. VERBYLA, C. S. HODGES JR., AND E. W. ROSS. Classification and regression tree analysis for assessing hazard of pine mortality caused by Heterobasidion annosum. Plant Disease, 77(2):136, February 1993.

16
W. A. BELSON. Matching and prediction on the principle of biological classification. Applied Statistics, 8:65--75, 1959.

17
MOSHE BEN-BASSAT. Myopic policies in sequential classification. IEEE Transactions on Computers, 27(2):170--174, February 1978.

18
MOSHE BEN-BASSAT. Use of distance measures, information measures and error bounds on feature evaluation. In Krishnaiah and Kanal [194], pages 773--791.

19
K.P. BENNETT AND O.L. MANGASARIAN. Robust linear programming discrimination of two linearly inseparable sets. Optimization Methods and Software, 1:23--34, 1992.

20
K.P. BENNETT AND O.L. MANGASARIAN. Multicategory discrimination via linear programming. Optimization Methods and Software, 3:29--39, 1994.

21
KRISTIN P. BENNETT. Decision tree construction via linear programming. In Proceedings of the 4th Midwest Artificial Intelligence and Cognitive Science Society Conference, pages 97--101, 1992.

22
KRISTIN P. BENNETT. Global tree optimization: A non-greedy decision tree algorithm. In Proceedings of Interface 94: The 26th Symposium on the Interface, Research Triangle, North Carolina, 1994.

23
A. BLUM AND R. RIVEST. Training a 3-node neural network is NP-complete. In Proceedings of the 1988 Workshop on Computational Learning Theory, pages 9--18, Boston, MA, 1988. Morgan Kaufmann.

24
MARKO BOHANEC AND IVAN BRATKO. Trading accuracy for simplicity in decision trees. Machine Learning, 15:223--250, 1994.

25
DAVID BOWSER-CHAO AND DEBRA L. DZIALO. Comparison of the use of binary decision trees and neural networks in top quark detection. Physical Review D: Particles and Fields, 47(5):1900, March 1993.

26
D. BOYCE, A. FARHI, AND R. WEISHEDEL. Optimal Subset Selection. Springer-Verlag, 1974.

27
ANNA BRAMANTI-GREGOR AND HENRY W. DAVIS. The statistical learning of accurate heuristics. In IJCAI-93 [160], pages 1079--1085. Editor: Ruzena Bajcsy.

28
Y. BRANDMAN, A. ORLITSKY, AND J. HENNESSY. A spectral lower bound technique for the size of decision trees and two-level AND/OR circuits. IEEE Transactions on Computers, 39(2):282--286, February 1990.

29
LEO BREIMAN, JEROME FRIEDMAN, RICHARD OLSHEN, AND CHARLES STONE. Classification and Regression Trees. Wadsworth International Group, 1984.

30
RICHARD P. BRENT. Fast training algorithms for multilayer neural nets. IEEE Transactions on Neural Networks, 2(3):346--354, May 1991.

31
CARLA E. BRODLEY. Recursive Automatic Algorithm Selection for Inductive Learning. PhD thesis, University of Massachusetts, Amherst, MA, 1994.

32
CARLA E. BRODLEY AND PAUL E. UTGOFF. Multivariate decision trees. Machine Learning, 19:45--77, 1995.

33
DONALD E. BROWN, VINCENT CORRUBLE, AND CLARENCE LOUIS PITTARD. A comparison of decision tree classifiers with backpropagation neural networks for multimodal classification problems. Pattern Recognition, 26(6):953--961, 1993.

34
DONALD E. BROWN AND CLARENCE LOUIS PITTARD. Classification trees with optimal multivariate splits. In Proceedings of the International Conference on Systems, Man and Cybernetics, volume 3, pages 475--477, Le Touquet, France, 17--20th, October 1993. IEEE, New York.

35
RANDAL E. BRYANT. Graph-based algorithms for boolean function manipulation. IEEE Transactions on Computers, C-35(8):677--691, August 1986.

36
RANDAL E. BRYANT. Symbolic boolean manipulation with ordered binary decision diagrams. Technical Report CMU-CS-92-160, Carnegie Mellon University, School of Computer Science, Pittsburgh, PA 15213, July 1992. Accepted to ACM Computing Surveys.

37
NADER H. BSHOUTY. Exact learning via the monotone theory. In Proceedings of the 34th Annual Symposium on Foundations of Computer Science, pages 302--311, New York, NY, 1993. IEEE.

38
R. S. BUCY AND R. S. DIESPOSTI. Decision tree design by simulated annealing. Mathematical Modelling and Numerical Analysis, 27(5):515--534, 1993. A RAIRO Journal.

39
M. E. BULLOCK, D. L. WANG, S. R. FAIRCHILD, AND T. J. PATTERSON. Automated training of 3-D morphology algorithm for object recognition. Proceedings of SPIE -- The International Society for Optical Engineering, 2234:238--251, 1994. Issue title: Automatic Object Recognition IV.

40
WRAY BUNTINE. A theory of learning classification rules. PhD thesis, University of Technology, Sydney, Australia, 1991.

41
WRAY BUNTINE. Learning classification trees. Statistics and Computing, 2:63--73, 1992.

42
A. BUZO, A. H. GRAY JR., ROBERT M. GRAY, AND J. D. MARKEL. Speech coding based upon vector quantization. IEEE Transactions on Acoustics, Speech and Signal Processing, 28:562--574, October 1980.

43
JANICE D. CALLAHAN AND STEPHEN W. SORENSEN. Rule induction for group decisions with statistical data - an example. Journal of the Operational Research Society, 42(3):227--234, March 1991.

44
JAN M. VAN CAMPENHOUT. Topics in measurement selection. In Krishnaiah and Kanal [194], pages 793--803.

45
RICH CARUANA AND DAYNE FREITAG. Greedy attribute selection. In ML-94 [249], pages 28--36. Editors: William W. Cohen and Haym Hirsh.

46
RICHARD G. CASEY AND GEORGE NAGY. Decision tree design using a probabilistic model. IEEE Transactions on Information Theory, IT-30(1):93--99, January 1984.

47
JASON CATLETT. Megainduction. PhD thesis, Basser Department of Computer Science, University of Sydney, Australia, 1991.

48
JASON CATLETT. Tailoring rulesets to misclassification costs. In AI&Statistics-95 [4], pages 88--94.

49
B. CHANDRASEKARAN. From numbers to symbols to knowledge structures: Pattern Recognition and Artificial Intelligence perspectives on the classification task. volume 2, pages 547--559. Elsevier Science, Amsterdam, The Netherlands, 1986.

50
B. CHANDRASEKARAN AND A. K. JAIN. Quantization complexity and independent measurements. IEEE Transactions on Computers, C-23(1):102--106, January 1974.

51
A. R. CHATURVEDI AND D. L. NAZARETH. Investigating the effectiveness of conditional classification: an application to manufacturing scheduling. IEEE Transactions on Engineering Management, 41(2):183--193, May 1994.

52
M. R. CHMIELEWSKI AND J. W. GRZYMALA-BUSSE. Global discretization of continuous attributes as preprocessing for machine learning. In T. Y. Lin, editor, RSSC-94: The Third International Workshop on Rough Sets and Soft Computing, pages 294--301, San Jose, CA, November 1994. American Association of Artificial Intelligence, San Jose State University.

53
PHILIP A. CHOU. Applications of Information Theory to Pattern Recognition and the Design of Decision Trees and Trellises. PhD thesis, Stanford University, 1988.

54
PHILIP A. CHOU. Optimal partitioning for classification and regression trees. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(4):340--354, April 1991.

55
PHILIP A. CHOU AND ROBERT M. GRAY. On decision trees for pattern recognition. In Proceedings of the IEEE Symposium on Information Theory, page 69, Ann Arbor, MI, 1986.

56
PHILIP A. CHOU, TOM LOOKABAUGH, AND ROBERT M. GRAY. Optimal pruning with applications to tree-structured source coding and modeling. IEEE Transactions on Information Theory, 35(2):299--315, March 1989.

57
KRZYSZTOF J. CIOS AND NING LIU. A machine learning method for generation of a neural network architecture: A continuous ID3 algorithm. IEEE Transactions on Neural Networks, 3(2):280--291, March 1992.

58
I. CLOETE AND H. THERON. CID3: An extension of ID3 for attributes with ordered domains. South African Computer Journal, 4:10--16, March 1991.

59
J.R.B. COCKETT AND J.A. HERRERA. Decision tree reduction. Journal of the ACM, 37(4):815--842, October 1990.

60
W.W. COHEN. Efficient pruning methods for separate-and-conquer rule learning systems. In IJCAI-93 [160], pages 988--994. Editor: Ruzena Bajcsy.

61
DOUGLAS COMER AND RAVI SETHI. The complexity of trie index construction. Journal of the ACM, 24(3):428--440, July 1977.

62
T. M. COVER AND J. M. VAN CAMPENHOUT. On the possible orderings in the measurement selection problem. IEEE Transactions on Systems, Man and Cybernetics, SMC-7(9), 1977.

63
LOUIS ANTHONY COX. Using causal knowledge to learn more useful decision rules from data. In AI&Statistics-95 [4], pages 151--160.

64
LOUIS ANTHONY COX AND YUPING QIU. Minimizing the expected costs of classifying patterns by sequential costly inspections. In AI&Statistics-93 [3].

65
LOUIS ANTHONY COX, YUPING QIU, AND WARREN KUEHNER. Heuristic least-cost computation of discrete classification functions with uncertain argument values. Annals of Operations Research, 21(1):1--30, 1989.

66
STUART L. CRAWFORD. Extensions to the CART algorithm. International Journal of Man-Machine Studies, 31(2):197--217, August 1989.

67
STEPHEN P. CURRAM AND JOHN MINGERS. Neural networks, decision tree induction and discriminant analysis: An empirical comparison. Journal of the Operational Research Society, 45(4):440--450, April 1994.

68
K.T. DAGO, R. LUTHRINGER, R. LENGELLE, G. RINAUDO, AND J. P. MATCHER. Statistical decision tree: A tool for studying pharmaco-EEG effects of CNS-active drugs. Neuropsychobiology, 29(2):91--96, 1994.

69
FLORENCE D'ALCHÉ-BUC, DIDIER ZWIERSKI, AND JEAN-PIERRE NADAL. Trio learning: A new strategy for building hybrid neural trees. International Journal of Neural Systems, 5(4):259--274, December 1994.

70
S.K. DAS AND S. BHAMBRI. A decision tree approach for selecting between demand based, reorder and JIT/kanban methods for material procurement. Production Planning and Control, 5(4):342, 1994.

71
Belur V. Dasarathy, editor. Nearest neighbor (NN) norms: NN pattern classification techniques. IEEE Computer Society Press, Los Alamitos, CA, 1991.

72
BELUR V. DASARATHY. Minimal consistent set (MCS) identification for optimal nearest neighbor systems design. IEEE Transactions on Systems, Man and Cybernetics, 24(3):511--517, 1994.

73
G. R. DATTATREYA AND LAVEEN N. KANAL. Decision trees in pattern recognition. In Kanal and Rosenfeld, editors, Progress in Pattern Recognition, volume 2, pages 189--239. Elsevier Science, 1985.

74
G. R. DATTATREYA AND V. V. S. SARMA. Bayesian and decision tree approaches to pattern recognition including feature measurement costs. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-3(3):293--298, 1981.

75
THOMAS G. DIETTERICH, HERMANN HILD, AND GHULUM BAKIRI. A comparison of ID3 and backpropagation for English text-to-speech mapping. Machine Learning, 18:51--80, 1995.

76
THOMAS G. DIETTERICH AND EUN BAE KONG. Machine learning bias, statistical bias and statistical variance of decision tree algorithms. In ML-95 [250]. to appear.

77
THOMAS G. DIETTERICH AND RYSZARD S. MICHALSKI. A comparative review of selected methods for learning from examples. In R. S. Michalski, J. G. Carbonell, and T. M. Mitchell, editors, Machine Learning: An Artificial Intelligence Approach, volume 1, pages 41--81. Morgan Kaufmann, San Mateo, CA, 1983.

78
JUSTIN DOAK. An evaluation of search algorithms for feature selection. Technical report, Graduate Group in Computer Science, University of California at Davis; and Safeguards Systems Group, Los Alamos National Laboratory, January 1994.

79
B. A. DRAPER, CARLA E. BRODLEY, AND PAUL E. UTGOFF. Goal-directed classification using linear machine decision trees. IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(9):888, 1994.

80
N. R. DRAPER AND H. SMITH. Applied Regression Analysis. Wiley, New York, 1966. 2nd edition in 1981.

81
R. DUDA AND P. HART. Pattern Classification and Scene Analysis. Wiley, New York, 1973.

82
EADES AND STAPLES. On optimal trees. Journal of Algorithms, 2(4):369--384, 1981.

83
BRADLEY EFRON. Estimating the error rate of a prediction rule: improvements on cross-validation. Journal of the American Statistical Association, 78(382):316--331, June 1983.

84
A. EHRENFEUCHT AND DAVID HAUSSLER. Learning decision trees from random examples. Information and Computation, 82:231--246, 1989.

85
JOHN F. ELDER, IV. Heuristic search for model structure. In AI&Statistics-95 [4], pages 199--210.

86
TAPIO ELOMAA. In defence of C4.5: Notes on learning one-level decision trees. In ML-94 [249], pages 62--69. Editors: William W. Cohen and Haym Hirsh.

87
A. ERCIL. Classification trees prove useful in nondestructive testing of spotweld quality. Welding Journal, 72(9):59, September 1993. Issue Title: Special emphasis: Rebuilding America's roads, railways and bridges.

88
FLORIANA ESPOSITO, DONATO MALERBA, AND GIOVANNI SEMERARO. A further study of pruning methods in decision tree induction. In AI&Statistics-95 [4], pages 211--218.

89
BOB EVANS AND DOUG FISHER. Overcoming process delays with decision tree induction. IEEE Expert, pages 60--66, February 1994.

90
BRIAN EVERITT. Cluster Analysis. Edward Arnold, London, 3rd edition, 1993.

91
JUDITH A. FALCONER, BRUCE J. NAUGHTON, DOROTHY D. DUNLOP, ELLIOT J. ROTH, AND DALE C. STRASSER. Predicting stroke inpatient rehabilitation outcome using a classification tree approach. Archives of Physical Medicine and Rehabilitation, 75(6):619, June 1994.

92
A. FAMILI. Use of decision tree induction for process optimization and knowledge refinement of an industrial process. Artificial Intelligence for Engineering Design, Analysis and Manufacturing (AI EDAM), 8(1):63--75, Winter 1994.

93
R. M. FANO. Transmission of Information. MIT Press, Cambridge, MA, 1961.

94
USAMA M. FAYYAD AND KEKI B. IRANI. What should be minimized in a decision tree? In AAAI-90: Proceedings of the National Conference on Artificial Intelligence, volume 2, pages 749--754. American Association for Artificial Intelligence, 1990.

95
USAMA M. FAYYAD AND KEKI B. IRANI. The attribute specification problem in decision tree generation. In AAAI-92 [6], pages 104--110.

96
USAMA M. FAYYAD AND KEKI B. IRANI. On the handling of continuous-valued attributes in decision tree generation. Machine Learning, 8(2):87--102, 1992.

97
USAMA M. FAYYAD AND KEKI B. IRANI. Multi-interval discretization of continuous valued attributes for classification learning. In IJCAI-93 [160], pages 1022--1027. Editor: Ruzena Bajcsy.

98
EDWARD A. FEIGENBAUM. Expert systems in the 1980s. In A. Bond, editor, State of the Art in Machine Intelligence. Pergamon-Infotech, Maidenhead, 1981.

99
C. FENG, A. SUTHERLAND, R. KING, S. MUGGLETON, AND R. HENERY. Comparison of machine learning classifiers to statistics and neural networks. In AI&Statistics-93 [3], pages 41--52.

100
A. FIELDING. Binary segmentation: the automatic interaction detector and related techniques for exploring data structure. In O'Muircheartaigh and Payne [276], pages 221--257.

101
P. E. FILE, P. I. DUGARD, AND A. S. HOUSTON. Evaluation of the use of induction in the development of a medical expert system. Computers and Biomedical Research, 27(5):383--395, October 1994.

102
DOUGLAS FISHER. Knowledge acquisition via incremental conceptual clustering. Machine Learning, 2:130--172, 1987.

103
DOUGLAS FISHER AND KATHLEEN MCKUSICK. An empirical comparison of ID3 and back propagation. In IJCAI-89 [159]. Editor: N. S. Sridharan.

104
R. FLETCHER AND M. J. D. POWELL. A rapidly convergent descent method for minimization. Computer Journal, 6(2):163--168, 1963.

105
D. H. FOLEY. Considerations of sample and feature size. IEEE Transactions on Information Theory, IT-18:618--626, 1972.

106
F. FOROURAGHI, L. W. SCHMERR, AND G. M. PRABHU. Induction of multivariate regression trees for design optimization. In AAAI-94: Proceedings of the Twelfth National Conference on Artificial Intelligence, volume 1, pages 607--612, Seattle, WA, 31st July--4th August 1994. AAAI Press / The MIT Press.

107
IMAN FOROUTAN. Feature Selection for Piecewise Linear Classifiers. PhD thesis, University of California, Irvine, CA, 1985.

108
IMAN FOROUTAN AND JACK SKLANSKY. Feature selection for automatic classification of non-Gaussian data. IEEE Transactions on Systems, Man and Cybernetics, 17(2):187--198, March/April 1987.

109
RICHARD S. FORSYTH, DAVID D. CLARKE, AND RICHARD L. WRIGHT. Overfitting revisited: an information-theoretic approach to simplifying discrimination trees. Journal of Experimental and Theoretical Artificial Intelligence, 6(3):289--302, July--September 1994.

110
JEROME H. FRIEDMAN. A recursive partitioning decision rule for nonparametric classifiers. IEEE Transactions on Computers, C-26:404--408, April 1977.

111
M. FUJITA, H. FUJISAWA, AND Y. MATSUNAGA. Variable ordering algorithms for ordered binary decision diagrams and their evaluation. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 12(1):6--12, January 1993.

112
KEINOSUKE FUKUNAGA AND R. A. HAYES. Effect of sample size in classifier design. IEEE Transactions on Pattern Analysis and Machine Intelligence, 11:873--885, 1989.

113
TRUXTON K. FULTON, SIMON KASIF, AND STEVEN SALZBERG. An efficient algorithm for finding multi-way splits for decision trees. In ML-95 [250]. To appear.

114
MICHAEL R. GAREY AND RONALD L. GRAHAM. Performance bounds on the splitting algorithm for binary testing. Acta Informatica, 3(Fasc. 4):347--355, 1974.

115
MICHAEL R. GAREY AND D. S. JOHNSON. Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman and Co., San Francisco, CA, 1979.

116
S. B. GELFAND AND C. S. RAVISHANKAR. A tree-structured piecewise-linear adaptive filter. IEEE Transactions on Information Theory, 39(6):1907--1922, November 1993.

117
SAUL B. GELFAND, C. S. RAVISHANKAR, AND EDWARD J. DELP. An iterative growing and pruning algorithm for classification tree design. IEEE Transaction on Pattern Analysis and Machine Intelligence, 13(2):163--174, February 1991.

118
Edzard S. Gelsema and Laveen N. Kanal, editors. Pattern Recognition in Practice IV: Multiple Paradigms, Comparative Studies and Hybrid Systems, volume 16 of Machine Intelligence and Pattern Recognition. Series editors: Kanal, L. N. and Rosenfeld, A. Elsevier, 1994.

119
G. H. GENNARI, PAT LANGLEY, AND DOUGLAS FISHER. Models of incremental concept formation. Artificial Intelligence, 40(1--3):11--62, September 1989.

120
ALLEN GERSHO AND ROBERT M. GRAY. Vector Quantization and Signal Compression. Kluwer Academic Publishers, 1991.

121
W. J. GIBB, D. M. AUSLANDER, AND J. C. GRIFFIN. Selection of myocardial electrogram features for use by implantable devices. IEEE Transactions on Biomedical Engineering, 40(8):727--735, August 1993.

122
M. W. GILLO. MAID: A Honeywell 600 program for an automatised survey analysis. Behavioral Science, 17:251--252, 1972.

123
ELIZABETH A. GILPIN, RICHARD A. OLSHEN, KANU CHATTERJEE, JOHN KJEKSHUS, ARTHUR J. MOSS, HARMUT HENNING, ROBERT ENGLER, A. ROBERT BLACKY, HOWARD DITTRICH, AND JOHN ROSS JR. Predicting 1-year outcome following acute myocardial infarction. Computers and Biomedical Research, 23(1):46--63, February 1990.

124
MALCOLM A. GLESER AND MORRIS F. COLLEN. Towards automated medical decisions. Computers and Biomedical Research, 5(2):180--189, April 1972.

125
M. GOLEA AND M. MARCHAND. A growth algorithm for neural network decision trees. Europhysics Letters, 12(3):205--210, June 1990.

126
RODNEY M. GOODMAN AND PADHRAIC J. SMYTH. Decision tree design from a communication theory standpoint. IEEE Transactions on Information Theory, 34(5):979--994, September 1988.

127
RODNEY M. GOODMAN AND PADHRIAC J. SMYTH. Decision tree design using information theory. Knowledge Acquisition, 2:1--19, 1990.

128
MICHAEL T. GOODRICH, VINCENT MIRELLI, MARK ORLETSKY, AND JEFFREY SALOWE. Decision tree construction in fixed dimensions: Being global is hard but local greed is good. Technical Report TR-95-1, Johns Hopkins University, Department of Computer Science, Baltimore, MD 21218, May 1995.

129
L. GORDON AND R. A. OLSHEN. Asymptotically efficient solutions to the classification problem. Annals of Statistics, 6(3):515--533, 1978.

130
N. A. B. GRAY. Capturing knowledge through top-down induction of decision trees. IEEE Expert, 5(3):41--50, June 1990.

131
R.K. GULATI, R. GUPTA, P. GOTHOSKAR, AND S. KHOBRAGADE. Ultraviolet stellar spectral classification using multilevel tree neural networks. Vistas in Astronomy, 38:293, 1993. Part 3: Neural Networks in Astronomy.

132
HENG GUO AND SAUL B. GELFAND. Classification trees with neural network feature extraction. IEEE Transactions on Neural Networks., 3(6):923--933, November 1992.

133
Y. GUO AND K.J. DOOLEY. Distinguishing between mean, variance and autocorrelation changes in statistical quality control. International Journal of Production Research, 33(2):497--510, February 1995.

134
OZDEN GUR-ALI AND WILLIAM A. WALLACE. Induction of rules subject to a quality constraint: Probabilistic inductive learning. IEEE Transactions on Knowledge and Data Engineering, 5(6):979--984, December 1993. Special Issue on Learning and Discovery in Knowledge-based Databases.

135
S.E. HAMPSON AND D.J. VOLPER. Linear function neurons: Structure and training. Biological Cybernetics, 53(4):203--217, 1986.

136
THOMAS R. HANCOCK. Learning k decision trees on the uniform distribution. In Proceedings of the Sixth Annual Workshop on Computational Learning Theory, pages 352--360, July 1993.

137
D. J. HAND. Discrimination and Classification. Wiley, Chichester, UK, 1981.

138
W. HANISCH. Design and optimization of a hierarchical classifier. Journal of New Generation Computer Systems, 3(2):159--173, 1990.

139
L. K. HANSEN AND P. SALAMON. Neural network ensembles. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(10):993--1001, 1990.

140
A. HART. Experience in the use of an inductive system in knowledge engineering. In M. Bramer, editor, Research and Development in Expert Systems. Cambridge University Press, Cambridge, UK, 1984.

141
CARLOS R. P. HARTMANN, PRAMOD K. VARSHNEY, KISHAN G. MEHROTRA, AND CARL L. GERBERICH. Application of information theory to the construction of efficient decision trees. IEEE Transactions on Information Theory, IT-28(4):565--577, July 1982.

142
R. E. HASKELL AND A. NOUI-MEHIDI. Design of hierarchical classifiers. In N. A. Sherwani, E. de Doncker, and J. A. Kapenga, editors, Computing in the 90's: The First Great Lakes Computer Science Conference Proceedings, pages 118--124, Berlin, 1991. Springer-Verlag. Conference held in Kalamazoo, MI on 18th-20th, October 1989.

143
N.D. HATZIARGYRIOU, G.C. CONTAXIS, AND N.C. SIDERIS. A decision tree method for on-line steady state security assessment. IEEE Transactions on Power Systems, 9(2):1052, 1994.

144
MARK A. HEAP AND M. R. MERCER. Least upper bounds on OBDD sizes. IEEE Transactions on Computers, 43(6):764--767, June 1994.

145
D. HEATH. A Geometric Framework for Machine Learning. PhD thesis, Johns Hopkins University, Baltimore, MD, 1992.

146
D. HEATH, S. KASIF, AND S. SALZBERG. k-DT: A multi-tree learning method. In Proceedings of the Second International Workshop on Multistrategy Learning, pages 138--149, Harpers Ferry, WV, 1993. George Mason University.

147
D. HEATH, S. KASIF, AND S. SALZBERG. Learning oblique decision trees. In IJCAI-93 [160], pages 1002--1007. Editor: Ruzena Bajcsy.

148
DAVID P. HELMHOLD AND ROBERT E. SCHAPIRE. Predicting nearly as well as the best pruning of a decision tree. In Proceedings of the 8th Annual Conference on Computational Learning Theory, pages 61--68, New York, NY, 1995. ACM Press.

149
ERNEST G. HENRICHON JR. AND KING-SUN FU. A nonparametric partitioning procedure for pattern classification. IEEE Transactions on Computers, C-18(7):614--624, July 1969.

150
GABOR T. HERMAN AND K.T. DANIEL YEUNG. On piecewise-linear classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(7):782--786, July 1992.

151
KLAUS-U. HOEFFGEN, HANS-U. SIMON, AND KEVIN S. VAN HORN. Robust trainability of single neurons. Journal of Computer and System Sciences, 50(1):114--125, 1995.

152
R. HOLTE. Very simple classification rules perform well on most commonly used datasets. Machine Learning, 11(1):63--90, 1993.

153
G. F. HUGHES. On the mean accuracy of statistical pattern recognizers. IEEE Transactions on Information Theory, IT-14(1):55--63, January 1968.

154
K. J. HUNT. Classification by induction: Applications to modelling and control of non-linear dynamic systems. Intelligent Systems Engineering, 2(4):231--245, Winter 1993.

155
LAURENT HYAFIL AND RONALD L. RIVEST. Constructing optimal binary decision trees is NP-complete. Information Processing Letters, 5(1):15--17, 1976.

156
TOSHIHIDE IBARAKI AND SABURO MUROGA. Adaptive linear classifiers by linear programming. Technical Report 284, Department of Computer Science, University of Illinois, Urbana-Champaign, 1968.

157
M. ICHINO AND JACK SKLANSKY. Optimum feature selection by zero-one integer programming. IEEE Transactions on Systems, Man and Cybernetics, SMC-14:737--746, September/October 1984.

158
Y. IIKURA AND Y. YASUOKA. Utilization of a best linear discriminant function for designing the binary decision tree. International Journal of Remote Sensing, 12(1):55--67, January 1991.

159
IJCAI-89: Proceedings of the Eleventh International Joint Conference on Artificial Intelligence. Morgan Kaufmann Publishers Inc., San Mateo, CA, 1989. Editor: N. S. Sridharan.

160
IJCAI-93: Proceedings of the Thirteenth International Joint Conference on Artificial Intelligence, volume 2, Chambery, France, 28th August--3rd September 1993. Morgan Kaufmann Publishers Inc., San Mateo, CA. Editor: Ruzena Bajcsy.

161
IJCAI-95: Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence, Montreal, Canada, 16th--21st, August 1995. Morgan Kaufmann Publishers Inc., San Mateo, CA. Editor: Chris Mellish.

162
I. F. IMAM AND RYSZARD S. MICHALSKI. Should decision trees be learned from examples or from decision rules? In Methodologies for Intelligent Systems: 7th International Symposium. ISMIS '93, volume 689 of Lecture Notes in Computer Science, pages 395--404. Springer-Verlag, Trondheim, Norway, June 1993.

163
KEKI B. IRANI, JIE CHENG, USAMA M. FAYYAD, AND ZHAOGANG QIAN. Applying machine learning to semiconductor manufacturing. IEEE Expert, 8(1):41--47, February 1993.

164
P. ISRAEL AND C. KOUTSOUGERAS. A hybrid electro-optical architecture for classification trees and associative memory mechanisms. International Journal on Artificial Intelligence Tools (Architectures, Languages, Algorithms), 2(3):373--393, September 1993.

165
A. K. JAIN AND B. CHANDRASEKARAN. Dimensionality and sample size considerations in pattern recognition. In Krishnaiah and Kanal [194], pages 835--855.

166
GEORGE H. JOHN. Robust linear discriminant trees. In AI&Statistics-95 [4], pages 285--291.

167
GEORGE H. JOHN, RON KOHAVI, AND KARL PFLEGER. Irrelevant features and the subset selection problem. In ML-94 [249], pages 121--129. Editors: William W. Cohen and Haym Hirsh.

168
MICHAEL I. JORDAN AND R. A. JACOBS. Hierarchical mixtures of experts and the EM algorithm. Neural Computation, 6:181--214, 1994.

169
J. JUDMAIER, P. MEYERSBACH, G. WEISS, H. WACHTER, AND G. REIBNEGGER. The role of Neopterin in assessing disease activity in Crohn's disease: Classification and regression trees. The American Journal of Gastroenterology, 88(5):706, May 1993.

170
G. KALKANIS. The application of confidence interval error analysis to the design of decision tree classifiers. Pattern Recognition Letters, 14(5):355--361, May 1993.

171
LAVEEN N. KANAL. Patterns in pattern recognition: 1968--1974. IEEE Transactions on Information Theory, 20:697--722, 1974.

172
LAVEEN N. KANAL. Problem solving methods and search strategies for pattern recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-1:193--201, 1979.

173
LAVEEN N. KANAL AND B. CHANDRASEKARAN. On dimensionality and sample size in statistical pattern classification. Pattern Recognition, 3:225--234, 1971.

174
G. V. KASS. An exploratory technique for investigating large quantities of categorical data. Applied Statistics, 29(2):119--127, 1980.

175
MICHAEL J. KEARNS AND UMESH VIRKUMAR VAZIRANI. An Introduction to Computational Learning Theory. MIT Press, Cambridge, MA, 1994.

176
DAVIS M. KENNEDY. Decision tree bears fruit. Products Finishing, 57(10):66, July 1993.

177
J. D. KENNEFICK, R. R. CARVALHO, S. G. DJORGOVSKI, M. M. WILBER, E. S. DICKSON, N. WEIR, U. FAYYAD, AND J. RODEN. The discovery of five quasars at z>4 using the second Palomar Sky Survey. The Astronomical Journal, 110(1):78, 1995.

178
RANDY KERBER. Chimerge: Discretization of numeric attributes. In AAAI-92 [6], pages 123--128.

179
BYUNGYONG KIM AND DAVID LANDGREBE. Hierarchical decision tree classifiers in high-dimensional and large class data. IEEE Transactions on Geoscience and Remote Sensing, 29(4):518--528, July 1991.

180
HYUNSOO KIM AND G. J. KOEHLER. An investigation on the conditions of pruning an induced decision tree. European Journal of Operational Research, 77(1):82, August 1994.

181
SUNG-HO KIM. A general property among nested, pruned subtrees of a decision support tree. Communications in Statistics---Theory and Methods, 23(4):1227--1238, April 1994.

182
KENJI KIRA AND LARRY A. RENDELL. The feature selection problem: Traditional methods and a new algorithm. In AAAI-92 [6], pages 129--134.

183
Y. KODRATOFF AND M. MANAGO. Generalization and noise. International Journal of Man-Machine Studies, 27:181--204, 1987.

184
Y. KODRATOFF AND S. MOSCATELLI. Machine learning for object recognition and scene analysis. International Journal of Pattern Recognition and AI, 8(1):259--304, 1994.

185
RON KOHAVI. A study of cross-validation and bootstrap for accuracy estimation and model selection. In IJCAI-95 [161], pages 1137--1143. Editor: Chris Mellish.

186
RON KOHAVI AND CHIA-HSIN LI. Oblivious decision trees, graphs and top-down pruning. In IJCAI-95 [161], pages 1071--1077. Editor: Chris Mellish.

187
P. KOKOL, M. MERNIK, J. ZAVRSNIK, AND K. KANCLER. Decision trees based on automatic learning and their use in cardiology. Journal of Medical Systems, 18(4):201, 1994.

188
IGOR KONONENKO. Inductive and Bayesian learning in medical diagnosis. Applied Artificial Intelligence, 7(4):317--337, October-December 1993.

189
IGOR KONONENKO. On biases in estimating multi-valued attributes. In IJCAI-95 [161], pages 1034--1040. Editor: Chris Mellish.

190
IGOR KONONENKO AND IVAN BRATKO. Information based evaluation criterion for classifier's performance. Machine Learning, 6(1):67--80, January 1991.

191
J. A. KORS AND J. H. VAN BEMMEL. Classification methods for computerized interpretation of the electrocardiogram. Methods of Information in Medicine, 29(4):330--336, September 1990.

192
V. A. KOVALEVSKY. The problem of character recognition from the point of view of mathematical statistics. In V. A. Kovalevsky, editor, Character Readers and Pattern Recognition. Spartan, New York, 1968.

193
J. R. KOZA. Concept formation and decision tree induction using the genetic programming paradigm. In H. P. Schwefel and R. Männer, editors, Parallel Problem Solving from Nature - Proceedings of 1st Workshop, PPSN 1, volume 496 of Lecture Notes in Computer Science, pages 124--128, Dortmund, Germany, October 1991. Springer-Verlag, Berlin, Germany.

194
Paruchuri Rama Krishnaiah and Laveen N. Kanal, editors. Classification, Pattern Recognition and Reduction of Dimensionality, volume 2 of Handbook of Statistics. North-Holland Publishing Company, Amsterdam, 1987.

195
SRINIVASAN KRISHNAMOORTHY AND DOUGLAS FISHER. Machine learning approaches to estimating software development effort. IEEE Transactions on Software Engineering, 21(2):126--137, February 1995.

196
M. KUBAT, G. PFURTSCHELLER, AND D. FLOTZINGER. AI-based approach to automatic sleep classification. Biological Cybernetics, 70(5):443--448, 1994.

197
ASHOK K. KULKARNI. On the mean accuracy of hierarchical classifiers. IEEE Transactions on Computers, C-27(8):771--776, August 1978.

198
MICHAEL J. KURTZ. Astronomical object classification. In E. S. Gelsema and Laveen N. Kanal, editors, Pattern Recognition and Artificial Intelligence, pages 317--328. Elsevier Science Publishers, Amsterdam, 1988.

199
M. W. KURZYNSKI. The optimal strategy of a tree classifier. Pattern Recognition, 16:81--87, 1983.

200
M. W. KURZYNSKI. On the multi-stage Bayes classifier. Pattern Recognition, 21(4):355--365, 1988.

201
M. W. KURZYNSKI. On the identity of optimal strategies for multi-stage classifiers. Pattern Recognition Letters, 10(1):39--46, July 1989.

202
EYAL KUSHILEVITZ AND YISHAY MANSOUR. Learning decision trees using the Fourier spectrum. SIAM Journal of Computing, 22(6):1331--1348, 1993. Earlier version in STOC-91.

203
S.W. KWOK AND C. CARTER. Multiple decision trees. In R.D. Schachter, T.S. Levitt, L.N. Kanal, and J.F. Lemmer, editors, Uncertainty in Artificial Intelligence, volume 4, pages 327--335. Elsevier Science, Amsterdam, 1990.

204
G. LANDEWEERD, T. TIMMERS, E. GERSEMA, M. BINS, AND M. HALIC. Binary tree versus single level tree classification of white blood cells. Pattern Recognition, 16:571--577, 1983.

205
P. LANGLEY AND S. SAGE. Scaling to domains with many irrelevant features. Unpublished manuscript. Learning Systems Department, Siemens Corporate Research, Princeton, NJ, 1993.

206
C.Y. LEE. Representation of switching circuits by binary decision programs. Bell System Technical Journal, 38:985--999, July 1959.

207
SEONG-WHAN LEE. Noisy Hangul character recognition with fuzzy tree classifier. Proceedings of SPIE, 1661:127--136, 1992. Volume title: Machine vision applications in character recognition and industrial inspection. Conference location: San Jose, CA. 10th--12th February, 1992.

208
WENDY LEHNERT, STEPHEN SODERLAND, DAVID ARONOW, FANGFANG FENG, AND AVINOAM SHMUELI. Inductive text classification for medical applications. Journal of Experimental and Theoretical Artificial Intelligence, 7(1):49--80, January-March 1995.

209
P.M. LEWIS. The characteristic selection problem in recognition systems. IRE Transactions on Information Theory, IT-18:171--178, 1962.

210
XIAOBO LI AND RICHARD C. DUBES. Tree classifier design with a permutation statistic. Pattern Recognition, 19(3):229--235, 1986.

211
JIANHUA LIN AND J.A. STORER. Design and performance of tree structured vector quantizers. Information Processing and Management, 30(6):851--862, 1994.

212
JIANHUA LIN, J. A. STORER, AND M. COHN. Optimal pruning for tree-structured vector quantizers. Information Processing and Management, 28(6):723--733, 1992.

213
JYH-HAN LIN AND J. S. VITTER. Nearly optimal vector quantization via linear programming. In J. A. Storer and M. Cohn, editors, DCC 92. Data Compression Conference, pages 22--31, Los Alamitos, CA, March 24th--27th 1992. IEEE Computer Society Press.

214
Y. K. LIN AND KING-SUN FU. Automatic classification of cervical cells using a binary tree classifier. Pattern Recognition, 16(1):69--80, 1983.

215
W. Z. LIU AND A. P. WHITE. The importance of attribute selection measures in decision tree induction. Machine Learning, 15:25--41, 1994.

216
WEI-YIN LOH AND NUNTA VANICHSETAKUL. Tree-structured classification via generalized discriminant analysis. Journal of the American Statistical Association, 83(403):715--728, September 1988.

217
WILLIAM J. LONG, JOHN L. GRIFFITH, HARRY P. SELKER, AND RALPH B. D'AGOSTINO. A comparison of logistic regression to decision tree induction in a medical domain. Computers and Biomedical Research, 26(1):74--97, February 1993.

218
D.W. LOVELAND. Performance bounds for binary testing with arbitrary weights. Acta Informatica, 22:101--114, 1985.

219
DAVID LUBINSKY. Algorithmic speedups in growing classification trees by using an additive split criterion. In AI&Statistics-93 [3], pages 435--444.

220
DAVID LUBINSKY. Bivariate splits and consistent split criteria in dichotomous classification trees. PhD thesis, Department of Computer Science, Rutgers University, New Brunswick, NJ, 1994.

221
DAVID LUBINSKY. Classification trees with bivariate splits. Applied Intelligence: The International Journal of Artificial Intelligence, Neural Networks and Complex Problem-Solving Technologies, 4(3):283--296, July 1994.

222
DAVID LUBINSKY. Tree structured interpretable regression. In AI&Statistics-95 [4], pages 331--340.

223
REN C. LUO, RALPH S. SCHERP, AND MARK LANZO. Object identification using automated decision tree construction approach for robotics applications. Journal of Robotic Systems, 4(3):423--433, June 1987.

224
J. F. LUTSKO AND B. KUIJPERS. Simulated annealing in the construction of near-optimal decision trees. In AI&Statistics-93 [3].

225
JOHN MAKHOUL, SALIM ROUCOS, AND HERBERT GISH. Vector quantization in speech coding. Proceedings of the IEEE, 73:1551--1588, November 1985. Invited paper.

226
OLVI MANGASARIAN. Mathematical programming in neural networks. ORSA Journal on Computing, 5(4):349--360, Fall 1993.

227
OLVI L. MANGASARIAN. Misclassification minimization, 1994. Unpublished manuscript.

228
OLVI L. MANGASARIAN, R. SETIONO, AND W. WOLBERG. Pattern recognition via linear programming: Theory and application to medical diagnosis. In SIAM Workshop on Optimization, 1990.

229
R. LÓPEZ DE MÁNTARAS. Technical note: A distance-based attribute selection measure for decision tree induction. Machine Learning, 6(1):81--92, 1991.

230
J. KENT MARTIN. Evaluating and comparing classifiers: complexity measures. In AI&Statistics-95 [4], pages 372--378.

231
J. KENT MARTIN. An exact probability metric for decision tree splitting and stopping. In AI&Statistics-95 [4], pages 379--385.

232
DEAN P. MCKENZIE AND LEE HUN LOW. The construction of computerized classification systems using machine learning algorithms: An overview. Computers in Human Behavior, 8(2/3):155--167, 1992.

233
DEAN P. MCKENZIE, P. D. MCGORRY, C. S. WALLACE, LEE HUN LOW, D. L. COPOLOV, AND B. S. SINGH. Constructing a minimal diagnostic decision tree. Methods of Information in Medicine, 32(2):161--166, April 1993.

234
K. L. MCMILLAN. Symbolic model checking: an approach to the state explosion problem. PhD thesis, Carnegie Mellon University, School of Computer Science, 1992.

235
R.J. MCQUEEN, S. R. GARNER, C.G. NEVILL-MANNING, AND I.H. WITTEN. Applying machine learning to agricultural data. Computers and Electronics in Agriculture, 12(4):275--293, June 1995.

236
NIMROD MEGIDDO. On the complexity of polyhedral separability. Discrete and Computational Geometry, 3:325--337, 1988.

237
WILLIAM S. MEISEL AND DEMETRIOS A. MICHALOPOULOS. A partitioning algorithm with application in pattern classification and the optimization of decision trees. IEEE Transactions on Computers, C-22(1):93--103, January 1973.

238
JOSEPH J. MEZRICH. When is a tree a hedge? Financial Analysts Journal, pages 75--81, November-December 1994.

239
DONALD MICHIE. The superarticulatory phenomenon in the context of software manufacture. Proceedings of the Royal Society of London, 405A:185--212, 1986.

240
A. J. MILLER. Subset Selection in Regression. Chapman and Hall, 1990.

241
JOHN MINGERS. Expert systems --- rule induction with statistical data. Journal of the Operational Research Society, 38(1):39--47, 1987.

242
JOHN MINGERS. An empirical comparison of pruning methods for decision tree induction. Machine Learning, 4(2):227--243, 1989.

243
JOHN MINGERS. An empirical comparison of selection measures for decision tree induction. Machine Learning, 3:319--342, 1989.

244
M. MINSKY AND S. PAPERT. Perceptrons. MIT Press, Cambridge, MA, 1969.

245
TOM MITCHELL, RICH CARUANA, DAYNE FREITAG, JOHN MCDERMOTT, AND DAVID ZABOWSKI. Experience with a learning personal assistant. Communications of the ACM, July 1994.

246
MASAHIRO MIYAKAWA. Optimum decision trees -- an optimal variable theorem and its related applications. Acta Informatica, 22(5):475--498, 1985.

247
MASAHIRO MIYAKAWA. Criteria for selecting a variable in the construction of efficient decision trees. IEEE Transactions on Computers, 38(1):130--141, January 1989.

248
Machine Learning: Proceedings of the Tenth International Conference, University of Massachusetts, Amherst, MA, 27--29th, June 1993. Morgan Kaufmann Publishers Inc. Editor: Paul E. Utgoff.

249
Machine Learning: Proceedings of the Eleventh International Conference, Rutgers University, New Brunswick, NJ, 10--13th, July 1994. Morgan Kaufmann Publishers Inc. Editors: William W. Cohen and Haym Hirsh.

250
Machine Learning: Proceedings of the Twelfth International Conference, Tahoe City, CA, 10--13th, July 1995. Morgan Kaufmann Publishers Inc., San Mateo, CA. Editor: Jeffrey Schlimmer.

251
ADVAIT MOGRE, ROBERT MCLAREN, JAMES KELLER, AND RAGHURAM KRISHNAPURAM. Uncertainty management for rule-based systems with application to image analysis. IEEE Transactions on Systems, Man and Cybernetics, 24(3):470--481, March 1994.

252
ANDREW W. MOORE AND MARY S. LEE. Efficient algorithms for minimizing cross validation error. In ML-94 [249], pages 190--198. Editors: William W. Cohen and Haym Hirsh.

253
BERNARD M. E. MORET, M. G. THOMASON, AND R. C. GONZALEZ. The activity of a variable and its relation to decision trees. ACM Transactions on Programming Language Systems, 2(4):580--595, October 1980.

254
BERNARD M.E. MORET. Decision trees and diagrams. Computing Surveys, 14(4):593--623, December 1982.

255
J. N. MORGAN AND R. C. MESSENGER. THAID: a sequential search program for the analysis of nominal scale dependent variables. Technical report, Institute for Social Research, University of Michigan, Ann Arbor, MI, 1973.

256
D. T. MORRIS AND D. KALLES. Decision trees and domain knowledge in pattern recognition. In Gelsema and Kanal [118], pages 25--36.

257
A. N. MUCCIARDI AND E. E. GOSE. A comparison of seven techniques for choosing subsets of pattern recognition properties. IEEE Transactions on Computers, C-20(9):1023--1031, September 1971.

258
W. MULLER AND F. WYSOTZKI. Automatic construction of decision trees for classification. Annals of Operations Research, 52:231, 1994.

259
O. J. MURPHY AND R. L. MCCRAW. Designing storage efficient decision trees. IEEE Transactions on Computers, 40(3):315--319, March 1991.

260
PATRICK M. MURPHY. An empirical analysis of the benefit of decision tree size biases as a function of concept distribution. Submitted to the Machine Learning journal, July 1994.

261
PATRICK M. MURPHY AND DAVID AHA. UCI repository of machine learning databases -- a machine-readable data repository. Maintained at the Department of Information and Computer Science, University of California, Irvine. Anonymous FTP from ics.uci.edu in the directory pub/machine-learning-databases, 1994.

262
PATRICK M. MURPHY AND MICHAEL J. PAZZANI. Exploring the decision forest: An empirical investigation of Occam's Razor in decision tree induction. Journal of Artificial Intelligence Research, 1:257--275, 1994.

263
SREERAMA K. MURTHY, S. KASIF, S. SALZBERG, AND R. BEIGEL. OC1: Randomized induction of oblique decision trees. In AAAI-93 [7], pages 322--327.

264
SREERAMA K. MURTHY, SIMON KASIF, AND STEVEN SALZBERG. A system for induction of oblique decision trees. Journal of Artificial Intelligence Research, 2:1--33, August 1994.

265
SREERAMA K. MURTHY AND STEVEN SALZBERG. Decision tree induction: How effective is the greedy heuristic? In Proceedings of the First International Conference on Knowledge Discovery in Databases, Montreal, Canada, August 1995.

266
SREERAMA K. MURTHY AND STEVEN SALZBERG. Lookahead and pathology in decision tree induction. In IJCAI-95 [161]. To appear.

267
P. M. NARENDRA AND K. FUKUNAGA. A branch and bound algorithm for feature subset selection. IEEE Transactions on Computers, C-26(9):917--922, 1977.

268
DANA S. NAU. Decision quality as a function of search depth on game trees. Journal of the Association for Computing Machinery, 30(4):687--708, October 1983.

269
G. E. NAUMOV. NP-completeness of problems of construction of optimal decision trees. Soviet Physics, Doklady, 36(4):270--271, April 1991.

270
T. NIBLETT. Constructing decision trees in noisy domains. In I. Bratko and N. Lavrac, editors, Progress in Machine Learning. Sigma Press, England, 1986.

271
N.J. NILSSON. Learning Machines. Morgan Kaufmann, 1990.

272
STEVEN W. NORTON. Generating better decision trees. In IJCAI-89 [159], pages 800--805. Editor: N. S. Sridharan.

273
M. NÚÑEZ. The use of background knowledge in decision tree induction. Machine Learning, 6:231--250, 1991.

274
J. OLIVER. Decision graphs---an extension of decision trees. In AI&Statistics-93 [3].

275
COLM A. O'MUIRCHEARTAIGH. Statistical analysis in the context of survey research. In O'Muircheartaigh and Payne [276], pages 1--40.

276
Colm A. O'Muircheartaigh and Clive Payne, editors. The analysis of survey data, volume I. John Wiley & Sons, Chichester, UK, 1977.

277
GIULIA M. PAGALLO AND D. HAUSSLER. Boolean feature discovery in empirical learning. Machine Learning, 5(1):71--99, March 1990.

278
SHAILENDRA C. PALVIA AND STEVEN R. GORDON. Tables, trees and formulas in decision analysis. Communications of the ACM, 35(10):104--113, October 1992.

279
YOUNGTAE PARK. A comparison of neural net classifiers and linear tree classifiers: Their similarities and differences. Pattern Recognition, 27(11):1493--1503, 1994.

280
YOUNGTAE PARK AND JACK SKLANSKY. Automated design of linear tree classifiers. Pattern Recognition, 23(12):1393--1412, 1990.

281
YOUNGTAE PARK AND JACK SKLANSKY. Automated design of multiple-class piecewise linear classifiers. Journal of Classification, 6:195--222, 1989.

282
KRISHNA R. PATTIPATI AND MARK G. ALEXANDRIDIS. Application of heuristic search and information theory to sequential fault diagnosis. IEEE Transactions on Systems, Man and Cybernetics, 20(4):872--887, July/August 1990.

283
R. W. PAYNE AND D. A. PREECE. Identification keys and diagnostic tables: A review. Journal of the Royal Statistical Society: series A, 143:253, 1980.

284
JUDEA PEARL. On the connection between the complexity and credibility of inferred models. International Journal of General Systems, 4:255--264, 1978.

285
R. A. PEARSON AND P. E. STOKES. Vector evaluation in induction algorithms. International Journal of High Speed Computing, 2(1):25--100, March 1990.

286
F. PIPITONE, K. A. DE JONG, AND W. M. SPEARS. An artificial intelligence approach to analog systems diagnosis. In Ruey-wen Liu, editor, Testing and Diagnosis of Analog Circuits and Systems. Van Nostrand-Reinhold, New York, 1991.

287
SELWYN PIRAMUTHU, NARAYAN RAMAN, AND MICHAEL J. SHAW. Learning-based scheduling in a flexible manufacturing flow line. IEEE Transactions on Engineering Management, 41(2):172--182, May 1994.

288
N. J. PIZZI AND D. JACKSON. Comparative review of knowledge engineering and inductive learning using data in a medical domain. Proceedings of the SPIE: The International Society for Optical Engineering, 1293(2):671--679, April 1990.

289
SHI QING-YUN AND KING-SUN FU. A method for the design of binary tree classifiers. Pattern Recognition, 16:593--603, 1983.

290
JOHN ROSS QUINLAN. Discovering rules by induction from large collections of examples. In Donald Michie, editor, Expert Systems in the Micro Electronic Age. Edinburgh University Press, Edinburgh, UK, 1979.

291
JOHN ROSS QUINLAN. The effect of noise on concept learning. In R. S. Michalski, J. G. Carbonell, and T. M. Mitchell, editors, Machine Learning: An Artificial Intelligence Approach, volume 2. Morgan Kaufmann, San Mateo, CA, 1986.

292
JOHN ROSS QUINLAN. Induction of decision trees. Machine Learning, 1:81--106, 1986.

293
JOHN ROSS QUINLAN. Simplifying decision trees. International Journal of Man-Machine Studies, 27:221--234, 1987.

294
JOHN ROSS QUINLAN. An empirical comparison of genetic and decision tree classifiers. In Fifth International Conference on Machine Learning, pages 135--141, Ann Arbor, Michigan, 1988. Morgan Kaufmann.

295
JOHN ROSS QUINLAN. Unknown attribute values in induction. In Proceedings of the Sixth International Workshop on Machine Learning, pages 164--168, San Mateo, CA, 1989. Morgan Kaufmann.

296
JOHN ROSS QUINLAN. Probabilistic decision trees. In R. S. Michalski and Y. Kodratoff, editors, Machine Learning: An Artificial Intelligence Approach - Volume 3. Morgan Kaufmann, San Mateo, CA, 1990.

297
JOHN ROSS QUINLAN. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, San Mateo, CA, 1993.

298
JOHN ROSS QUINLAN. Comparing connectionist and symbolic learning methods. In S. Hanson, G. Drastal, and R. Rivest, editors, Computational Learning Theory and Natural Learning Systems: Constraints and Prospects. MIT Press, 1993.

299
JOHN ROSS QUINLAN AND RONALD L. RIVEST. Inferring decision trees using the minimum description length principle. Information and Computation, 80(3):227--248, March 1989.

300
HARISH RAGAVAN AND LARRY RENDELL. Lookahead feature construction for learning hard concepts. In ML-93 [248], pages 252--259. Editor: Paul E. Utgoff.

301
LARRY RENDELL AND HARISH RAGAVAN. Improving the design of induction methods by analyzing algorithm functionality and data-based concept complexity. In IJCAI-93 [160], pages 952--958. Editor: Ruzena Bajcsy.

302
ALFRÉD RÉNYI AND LÁSZLÓ VEKERDI. Probability Theory. North-Holland Publishing Company, Amsterdam, 1970.

303
P. RIDDLE, R. SEGAL, AND O. ETZIONI. Representation design and brute-force induction in a Boeing manufacturing domain. Applied Artificial Intelligence, 8(1):125--147, January-March 1994.

304
JORMA RISSANEN. Stochastic Complexity in Statistical Inquiry. World Scientific, 1989.

305
EVE A. RISKIN AND ROBERT M. GRAY. A greedy tree growing algorithm for the design of variable rate vector quantizers. IEEE Transactions on Signal Processing, 39(11):2500--2507, November 1991.

306
EVE A. RISKIN AND ROBERT M. GRAY. Lookahead in growing tree-structured vector quantizers. In ICASSP 91: International Conference on Acoustics, Speech and Signal Processing, volume 4, pages 2289--2292, Toronto, Ontario, May 14th--17th 1991. IEEE.

307
E. ROUNDS. A combined non-parametric approach to feature selection and binary decision tree design. Pattern Recognition, 12:313--317, 1980.

308
STEVEN ROVNYAK, STEIN KRETSINGER, JAMES THORP, AND DONALD BROWN. Decision trees for real time transient stability prediction. IEEE Transactions on Power Systems, 9(3):1417--1426, August 1994.

309
RON RYMON. An SE-tree based characterization of the induction problem. In ML-93 [248], pages 268--275. Editor: Paul E. Utgoff.

310
RON RYMON AND N. M. SHORT, JR. Automatic cataloging and characterization of earth science data using set enumeration trees. Telematics and Informatics, 11(4):309--318, Fall 1994.

311
S. RASOUL SAFAVIAN AND DAVID LANDGREBE. A survey of decision tree classifier methodology. IEEE Transactions on Systems, Man and Cybernetics, 21(3):660--674, May/June 1991.

312
M. SAHAMI. Learning non-linearly separable boolean functions with linear threshold unit trees and madaline-style networks. In AAAI-93 [7], pages 335--341.

313
STEVEN SALZBERG. Locating protein coding regions in human DNA using a decision tree algorithm. Journal of Computational Biology, 1995. To appear in Fall.

314
STEVEN SALZBERG, RUPALI CHANDAR, HOLLAND FORD, SREERAMA MURTHY, AND RICK WHITE. Decision trees for automated identification of cosmic-ray hits in Hubble Space Telescope images. Publications of the Astronomical Society of the Pacific, 107:1--10, March 1995.

315
ANANT SANKAR AND RICHARD J. MAMMONE. Growing and pruning neural tree networks. IEEE Transactions on Computers, 42(3):291--299, March 1993.

316
LAWRENCE SAUL AND MICHAEL I. JORDAN. Learning in Boltzmann trees. Neural Computation, 6(6):1174--1184, November 1994.

317
CULLEN SCHAFFER. Overfitting avoidance as bias. Machine Learning, 10:153--178, 1993.

318
CULLEN SCHAFFER. A conservation law for generalization performance. In ML-94 [249], pages 259--265. Editors: William W. Cohen and Haym Hirsh.

319
CULLEN SCHAFFER. Conservation of generalization: A case study. Technical report, Department of Computer Science, CUNY/Hunter College, February 1995.

320
T. M. SCHMIDL, P. C. COSMAN, AND ROBERT M. GRAY. Unbalanced non-binary tree-structured vector quantizers. In A. Singh, editor, Conference Record of the Twenty-Seventh Asilomar Conference on Signals, Systems and Computers, volume 2, pages 1519--1523, Los Alamitos, CA, November 1st--3rd 1993. IEEE Computer Society Press. Conf. held at Pacific Grove, CA.

321
J. SCHUERMANN AND W. DOSTER. A decision-theoretic approach in hierarchical classifier design. Pattern Recognition, 17:359--369, 1984.

322
ISHWAR KRISHNAN SETHI. Entropy nets: From decision trees to neural networks. Proceedings of the IEEE, 78(10), October 1990.

323
ISHWAR KRISHNAN SETHI AND B. CHATTERJEE. Efficient decision tree design for discrete variable pattern recognition problems. Pattern Recognition, 9:197--206, 1977.

324
ISHWAR KRISHNAN SETHI AND G.P.R. SARVARAYUDU. Hierarchical classifier design using mutual information. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-4(4):441--445, July 1982.

325
ISHWAR KRISHNAN SETHI AND J. H. YOO. Design of multicategory, multifeature split decision trees using perceptron learning. Pattern Recognition, 27(7):939--947, 1994.

326
C. E. SHANNON. A mathematical theory of communication. Bell System Technical Journal, 27:379--423,623--656, 1948.

327
JUDE W. SHAVLIK, R. J. MOONEY, AND G. G. TOWELL. Symbolic and neural learning algorithms: An empirical comparison. Machine Learning, 6(2):111--144, 1991.

328
SHELDON B. AKERS. Binary decision diagrams. IEEE Transactions on Computers, C-27(6):509--516, June 1978.

329
S. SHIMOZONO, A. SHINOHARA, T. SHINOHARA, S. MIYANO, S. KUHARA, AND S. ARIKAWA. Knowledge acquisition from amino acid sequences by machine learning system BONSAI. Transactions of the Information Processing Society of Japan, 35(10):2009--2018, October 1994.

330
SEYMOUR SHLIEN. Multiple binary decision tree classifiers. Pattern Recognition, 23(7):757--763, 1990.

331
SEYMOUR SHLIEN. Nonparametric classification using matched binary decision trees. Pattern Recognition Letters, 13(2):83--88, February 1992.

332
W. SIEDLECKI AND J. SKLANSKY. On automatic feature selection. International Journal of Pattern Recognition and Artificial Intelligence, 2(2):197--220, 1988.

333
J.A. SIRAT AND J.-P. NADAL. Neural trees: A new tool for classification. Network: Computation in Neural Systems, 1(4):423--438, October 1990.

334
JACK SKLANSKY AND LEO MICHELOTTI. Locally trained piecewise linear classifiers. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-2(2):101--111, March 1980.

335
JACK SKLANSKY AND GUSTAV NICHOLAS WASSEL. Pattern classifiers and trainable machines. Springer-Verlag, New York, 1981.

336
PADHRAIC SMYTH, ALEX GRAY, AND USAMA M. FAYYAD. Retrofitting decision tree classifiers using kernel density estimation. In ML-95 [250]. To appear.

337
J. A. SONQUIST, E. L. BAKER, AND J. N. MORGAN. Searching for Structure. Institute for Social Research, University of Michigan, Ann Arbor, MI, 1971.

338
LILLY SPIRKOVSKA. Three dimensional object recognition using similar triangles and decision trees. Pattern Recognition, 26(5):727, May 1993.

339
SREEJIT CHAKRAVARTY. A characterization of binary decision diagrams. IEEE Transactions on Computers, 42(2):129--137, February 1993.

340
S. SCHWARTZ, J. WILES, I. GOUGH, AND S. PHILIPS. Connectionist, rule-based and Bayesian decision aids: An empirical comparison. pages 264--278. Chapman & Hall, London, 1993.

341
C. Y. SUEN AND QING REN WANG. ISOETRP -- an interactive clustering algorithm with new objectives. Pattern Recognition, 17:211--219, 1984.

342
XIAORONG SUN, YUPING QIU, AND LOUIS ANTHONY COX. A hill-climbing approach to construct near-optimal decision trees. In AI&Statistics-95 [4], pages 513--519.

343
P. SWAIN AND H. HAUSKA. The decision tree classifier design and potential. IEEE Transactions on Geoscience and Electronics, GE-15:142--147, 1977.

344
JAN L. TALMON. A multiclass nonparametric partitioning algorithm. Pattern Recognition Letters, 4:31--38, 1986.

345
JAN L. TALMON, WILLEM R. M. DASSEN, AND VINCENT KARTHAUS. Neural nets and classification trees: A comparison in the domain of ECG analysis. In Gelsema and Kanal [118], pages 415--423.

346
JAN L. TALMON AND P. MCNAIR. The effect of noise and biases on the performance of machine learning algorithms. International Journal of Bio-Medical Computing, 31(1):45--57, July 1992.

347
MING TAN. Cost-sensitive learning of classification knowledge and its applications in robotics. Machine Learning, 13:7--33, 1993.

348
PAUL C. TAYLOR AND BERNARD W. SILVERMAN. Block diagrams and splitting criteria for classification trees. Statistics and Computing, 3(4):147--161, December 1993.

349
SEBASTIAN THRUN ET AL. The MONK's problems: A performance comparison of different learning algorithms. Technical Report CMU-CS-91-197, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, 1991.

350
R. TODESCHINI AND E. MARENGO. Linear discriminant classification tree: a user-driven multicriteria classification method. Chemometrics and Intelligent Laboratory Systems, 16:25--35, 1992.

351
J.T. TOU AND R.C. GONZALEZ. Pattern Recognition Principles. Addison Wesley, Reading, MA, 1974.

352
CHARALAMBOS TSATSARAKIS AND D. SLEEMAN. Supporting preprocessing and postprocessing for machine learning algorithms: A workbench for ID3. Knowledge Acquisition, 5(4):367--383, December 1993.

353
PEI-LEI TU AND JEN-YAO CHUNG. A new decision-tree classification algorithm for machine learning. In Proceedings of the IEEE International Conference on Tools with AI, pages 370--377, Arlington, Virginia, November 1992.

354
I. B. TURKSEN AND H. ZHAO. An equivalence between inductive learning and pseudo-Boolean logic simplification: a rule generation and reduction scheme. IEEE Transactions on Systems, Man and Cybernetics, 23(3):907--917, May-June 1993.

355
PETER D. TURNEY. Cost-sensitive classification: Empirical evaluation of a hybrid genetic decision tree induction algorithm. Journal of Artificial Intelligence Research, 2:369--409, March 1995.

356
PAUL E. UTGOFF. Incremental induction of decision trees. Machine Learning, 4:161--186, 1989.

357
PAUL E. UTGOFF. Perceptron trees: A case study in hybrid concept representations. Connection Science, 1(4):377--391, 1989.

358
PAUL E. UTGOFF. An improved algorithm for incremental induction of decision trees. In ML-94 [249], pages 318--325. Editors: William W. Cohen and Haym Hirsh.

359
PAUL E. UTGOFF AND CARLA E. BRODLEY. An incremental method for finding multivariate splits for decision trees. In Proceedings of the Seventh International Conference on Machine Learning, pages 58--65, Los Altos, CA, 1990. Morgan Kaufmann.

360
J.M. VAN CAMPENHOUT. On the Problem of Measurement Selection. PhD thesis, Stanford University, Dept. of Electrical Engineering, 1978.

361
THIERRY VAN DE MERCKT. Decision trees in numerical attribute spaces. In IJCAI-93 [160], pages 1016--1021. Editor: Ruzena Bajcsy.

362
P.K. VARSHNEY, C.R.P. HARTMANN, AND J.M. DE FARIA JR. Applications of information theory to sequential fault diagnosis. IEEE Transactions on Computers, C-31(2):164--170, 1982.

363
WALTER VAN DE VELDE. Incremental induction of topologically minimal trees. In Bruce W. Porter and Ray J. Mooney, editors, Proceedings of the Seventh International Conference on Machine Learning, pages 66--74, Austin, Texas, 1990.

364
C. S. WALLACE AND D. M. BOULTON. An information measure for classification. Computer Journal, 11:185--194, 1968.

365
C. S. WALLACE AND J. D. PATRICK. Coding decision trees. Machine Learning, 11(1):7--22, April 1993.

366
QING REN WANG AND C. Y. SUEN. Analysis and design of a decision tree based on entropy reduction and its application to large character set recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6:406--417, 1984.

367
QING REN WANG AND CHING Y. SUEN. Large tree classifier with heuristic search and global training. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-9(1):91--102, January 1987.

368
GUSTAV NICHOLAS WASSEL AND JACK SKLANSKY. Training a one-dimensional classifier to minimize the probability of error. IEEE Transactions on Systems, Man and Cybernetics, SMC-2:533--541, September 1972.

369
LARRY WATANABE AND LARRY RENDELL. Learning structural decision trees from examples. In Proceedings of the Twelfth International Joint Conference on Artificial Intelligence, volume 2, pages 770--776, Darling Harbour, Sydney, Australia, 24--30th, August 1991. Morgan Kaufmann Publishers Inc., San Mateo, CA. Editors: John Mylopoulos and Ray Reiter.

370
S. WATANABE. Pattern recognition as a quest for minimum entropy. Pattern Recognition, 13:381--387, 1981.

371
NICHOLAS WEIR, S. DJORGOVSKI, AND USAMA M. FAYYAD. Initial galaxy counts from digitized POSS-II. The Astronomical Journal, 110(1):1, 1995.

372
NICHOLAS WEIR, USAMA M. FAYYAD, AND S. DJORGOVSKI. Automated star/galaxy classification for digitized POSS-II. The Astronomical Journal, 109(6):2401, 1995.

373
S. WEISS AND I. KAPOULEAS. An empirical comparison of pattern recognition, neural nets, and machine learning classification methods. In IJCAI-89 [159], pages 781--787. Editor: N. S. Sridharan.

374
ALLAN P. WHITE AND WEI ZHANG LIU. Technical note: Bias in information-based measures in decision tree induction. Machine Learning, 15(3):321--329, June 1994.

375
P. A. D. WILKS AND M. J. ENGLISH. Accurate segmentation of respiration waveforms from infants enabling identification and classification of irregular breathing patterns. Medical Engineering and Physics, 16(1):19--23, January 1994.

376
J. WIRTH AND J. CATLETT. Experiments on the costs and benefits of windowing in ID3. In Fifth International Conference on Machine Learning, pages 87--99, Ann Arbor, Michigan, 1988. Morgan Kaufmann.

377
D. WOLPERT. On overfitting avoidance as bias. Technical Report SFI TR 92-03-5001, The Santa Fe Institute, 1992.

378
K. S. WOODS, C. C. DOSS, K. W. BOWYER, J. L. SOLKA, C. E. PRIEBE, AND W. P. KEGELMEYER, JR. Comparative evaluation of pattern recognition techniques for detection of microcalcifications in mammography. International Journal of Pattern Recognition and Artificial Intelligence, 7(6):1417--1436, December 1993.

379
K. C. YOU AND KING-SUN FU. An approach to the design of a linear binary tree classifier. In Proceedings of the Third Symposium on Machine Processing of Remotely Sensed Data, West Lafayette, IN, 1976. Purdue University.

380
Y. YUAN AND M. J. SHAW. Induction of fuzzy decision trees. Fuzzy Sets and Systems, 69(2):125, 1995.

381
WANG ZHENGOU AND LIN YAN. A new inductive learning algorithm: separability-based inductive learning algorithm. Acta Automatica Sinica, 5(3):267--270, 1993. English translation in Chinese Journal of Automation.

382
XIAO JIA ZHOU AND THARAM S. DILLON. A statistical-heuristic feature selection criterion for decision tree induction. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-13(8):834--841, August 1991.

383
SETH ZIMMERMAN. An optimal search procedure. The American Mathematical Monthly, 66(8):690--693, October 1959.

Kolluru Venkata Sreerama Murthy was born in Bapatla, India on 2nd April 1967. He obtained a baccalaureate honours degree in computer science and engineering from Motilal Nehru Regional Engineering College, Allahabad in 1988. He then received a Master's degree in computer science and engineering from the Indian Institute of Technology, Madras in 1990. After working briefly for Westinghouse Electric Corporation (Process Control Division, Pittsburgh, PA, USA) and the National Center for Software Technology (Bombay, India), he joined the Johns Hopkins University in September 1991 in pursuit of a doctoral degree in computer science. He married Sudha in May 1995.



Sreerama Murthy
Thu Oct 19 17:40:24 EDT 1995