Abstract

In supervised learning there is usually a clear distinction between inputs and outputs: inputs are what you measure, outputs are what you predict from those measurements. This paper shows that the distinction between inputs and outputs is not this simple. Some features are more useful as extra outputs than as inputs. By using a feature as an output we get more than just the case values but can learn a mapping from the other inputs to that feature. For many features this mapping may be more useful than the feature value itself. We present two regression problems and one classification problem where performance improves if features that could have been used as inputs are used as extra outputs instead. This result is surprising since a feature used as an output is not used during testing.

1 Introduction

The goal in supervised learning is to learn functions that map inputs to outputs with high predictive accuracy. The standard practice in neural nets is to use all features t...
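The technique the abstract describes can be sketched with a tiny numpy network. This is an illustrative toy, not the paper's actual experiments: a noisy feature `f` is correlated with the target `y`, and instead of feeding `f` in as an input it is predicted as a second output head, so learning its mapping shapes the shared hidden layer. All data and architecture choices below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic problem (assumed for illustration): a noisy feature f is
# related to the target y. Rather than using f as an input, we ask the
# net to predict it as an extra output alongside y.
n = 512
X = rng.uniform(-1, 1, size=(n, 2))             # the "real" inputs
f = np.sin(X[:, 0]) + 0.3 * rng.normal(size=n)  # poor, noisy feature
y = X[:, 0] * X[:, 1] + 0.5 * np.sin(X[:, 0])   # prediction target

T = np.column_stack([y, f])                     # main target + extra output

# One shared hidden layer, two linear output heads: [y_hat, f_hat].
H, lr = 16, 0.05
W1 = rng.normal(0, 0.5, size=(2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, size=(H, 2)); b2 = np.zeros(2)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

losses = []
for _ in range(500):
    h, out = forward(X)
    err = out - T                    # squared error on both heads
    losses.append(float((err ** 2).mean()))
    # Full-batch gradient descent through the two-head squared loss.
    g_out = 2 * err / n
    gW2 = h.T @ g_out; gb2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# At test time the extra output is simply ignored: only the y head is read,
# which is why a feature used this way costs nothing at prediction time.
_, out = forward(X)
y_pred = out[:, 0]
```

The extra head acts only as a training-time signal; dropping the second column of `T` recovers a plain single-task net for comparison.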

Keywords

Feature (linguistics), Computer science, Measure (data warehouse), Artificial intelligence, Machine learning, Regression, Value (mathematics), Simple (philosophy), Pattern recognition (psychology), Data mining, Mathematics, Statistics

Publication Info

Year
1996
Type
article
Volume
9
Pages
389-395
Citations
36
Access
Closed

Cite This

Rich Caruana, Virginia R. de Sa (1996). Promoting Poor Features to Supervisors: Some Inputs Work Better as Outputs. , 9, 389-395.