Events and Lectures at the Henry and Marilyn Taub Faculty of Computer Science
Thursday, 22.11.2007, 14:30
Data discretization is the process of converting continuous attribute values
into a finite set of intervals with minimal loss of information. In this talk,
we prove that discretization methods based on information-theoretic complexity
and methods based on statistical measures of data dependency are asymptotically
equivalent.
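(Background intuition only, not the argument of the talk: for a contingency table of interval-versus-class counts O_ij with expected counts E_ij under independence, the likelihood-ratio statistic is a rescaled empirical mutual information, while Pearson's statistic shares its asymptotic null distribution, which is one reason information-theoretic and statistical dependency measures behave alike on large samples:

G^2 = 2 \sum_{i,j} O_{ij} \ln\frac{O_{ij}}{E_{ij}} = 2N\,\hat{I}(\text{interval};\text{class}),
\qquad
X^2 = \sum_{i,j} \frac{(O_{ij} - E_{ij})^2}{E_{ij}}.)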
Furthermore, we define a notion of generalized entropy and prove that
discretization methods based on MDLP, the Gini index, AIC, BIC, Pearson's X^2,
and Wilks' G^2 statistics are all derivable from the generalized entropy
function. We design a dynamic programming algorithm that guarantees an optimal
discretization with respect to the generalized entropy criterion. We also
conducted an extensive performance evaluation of our method on several publicly
available data sets; the results show that our method delivers, on average,
31% fewer classification errors than many previously known discretization
methods.
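The abstract does not spell the algorithm out; as a rough illustration only, here is a minimal dynamic-programming sketch in Python that finds an optimal discretization when the criterion is ordinary Shannon class entropy (not the paper's generalized entropy). The function names and the naive formulation are illustrative assumptions, not taken from the paper.

    import math
    from collections import Counter

    def interval_entropy(labels):
        # Shannon entropy (in bits) of the class labels in one interval.
        n = len(labels)
        if n == 0:
            return 0.0
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def best_discretization(values, labels, k):
        # Split the data (sorted by value) into at most k intervals so that the
        # count-weighted sum of per-interval class entropies is minimal.
        # Naive dynamic program; per-block entropies are recomputed for clarity.
        order = sorted(range(len(values)), key=lambda i: values[i])
        y = [labels[i] for i in order]
        n = len(y)

        def block_cost(i, j):  # weighted entropy of the block y[i:j]
            return (j - i) * interval_entropy(y[i:j])

        INF = float("inf")
        # dp[m][j] = minimal cost of covering y[0:j] with exactly m intervals
        dp = [[INF] * (n + 1) for _ in range(k + 1)]
        cut = [[-1] * (n + 1) for _ in range(k + 1)]
        dp[0][0] = 0.0
        for m in range(1, k + 1):
            for j in range(1, n + 1):
                for i in range(m - 1, j):
                    cost = dp[m - 1][i] + block_cost(i, j)
                    if cost < dp[m][j]:
                        dp[m][j], cut[m][j] = cost, i
        # Choose the best number of intervals <= k, then recover the cut points.
        best_m = min((m for m in range(1, k + 1) if dp[m][n] < INF),
                     key=lambda m: dp[m][n])
        cuts, j, m = [], n, best_m
        while m > 0:
            i = cut[m][j]
            if i > 0:
                cuts.append(i)
            j, m = i, m - 1
        return dp[best_m][n], sorted(cuts)

For instance, best_discretization([1.0, 1.2, 3.5, 3.7, 8.0], ['a', 'a', 'b', 'b', 'a'], 3) returns cost 0.0 with cuts after the second and fourth sorted values, i.e. three class-pure intervals. In the paper the criterion is the generalized entropy function rather than plain Shannon entropy; the sketch only illustrates how a dynamic program can reach an exact optimum over all cut placements.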
This is joint work with Ruoming Jin and Chibuike Muoh from Kent State
University. The work was presented at the ICDM 2007 conference.