FEATURE SELECTION
Method
The algorithm iteratively finds a set of optimal features.
The iterative algorithm has five major components:
• feature initialization,
• graph construction,
• neural network,
• multiple dropouts,
• gradient computation.
It aims to iteratively find the set of features that yields the greatest decrease in the optimization loss.
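The loop described above can be sketched as a greedy selection that, at each step, keeps the feature whose addition most decreases the loss. This is a minimal illustration only: it uses a least-squares surrogate and toy data, not the paper's neural-network pipeline, and all names here are hypothetical.

```python
import numpy as np

def greedy_forward_selection(X, y, k):
    """Illustrative sketch: at each step, add the single feature whose
    inclusion gives the greatest decrease in least-squares loss."""
    n, d = X.shape
    selected, remaining = [], list(range(d))
    for _ in range(k):
        best_feat, best_loss = None, np.inf
        for j in remaining:
            A = X[:, selected + [j]]
            # Fit a linear model on the candidate feature set
            w, *_ = np.linalg.lstsq(A, y, rcond=None)
            loss = np.mean((A @ w - y) ** 2)
            if loss < best_loss:
                best_feat, best_loss = j, loss
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

# Toy data: y depends only on features 0 and 2
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2]
print(greedy_forward_selection(X, y, 2))  # recovers [0, 2]
```

The actual method replaces the least-squares refit with gradient information from a neural network, but the selection criterion — greatest decrease in the optimization loss — is the same.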
Features
The average test AUROC (over 20 train/test splits) is reported with respect to the number of selected features, from 1 to 10.
Inferences
It is reported that the method might fail on data with highly nonlinear relations between features and labels. It is also computationally inefficient.
Gradient estimation
The gradient of the loss function, in simple terms, tells you how much
changing a particular input to your model will change the model's
overall error (loss). It's like a map showing you which directions to
move in to quickly decrease the error and improve your model's
performance.
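This "map" intuition can be made concrete with a finite-difference estimate of the gradient: nudge the input slightly, see how the loss changes, and step in the opposite direction. The model, data, and step size below are illustrative assumptions, not the source's setup.

```python
def loss(w):
    # Tiny model: prediction = w * x, for target pairs (x, y)
    data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def numeric_grad(f, w, eps=1e-6):
    # Central finite difference: change in loss per unit change in w
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 0.0
g = numeric_grad(loss, w)     # negative: increasing w would decrease the loss
w_new = w - 0.1 * g           # step opposite the gradient
print(loss(w) > loss(w_new))  # True: the step reduced the error
```

The sign and size of the gradient are exactly the "directions on the map": a large negative gradient says the loss falls quickly as that input grows.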
Other scope
Feature Graphs
GAT