Symmetric Delta (Galas et al., 2020; Galas et al., 2014) is the measure used in a MIST search. The symmetric delta is a novel measure of functional dependence, symmetric under exchange of variables, constructed from joint entropies. The joint entropy of a tuple of variables is the Shannon entropy (Shannon and Weaver, 1949) of their joint probability distribution, defined as the negative expectation of the logarithm of the joint probabilities:

$$H(X_1, \dots, X_T) = -\mathbb{E}\left[\log p(x_1, \dots, x_T)\right] = -\sum_{x_1, \dots, x_T} p(x_1, \dots, x_T)\,\log p(x_1, \dots, x_T).$$

For variable tuples of size $T$ drawn from $N$ variables there are $\binom{N}{T}$ tuples in the search space, so the problem reduces to computing joint probability distributions for a very large number of variable tuples.
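As a minimal illustration of both points (the plug-in joint entropy estimate and the size of the tuple space), consider the following numpy sketch. This is illustrative code, not the MIST implementation; the function name `joint_entropy` and the dataset sizes are our own choices.

```python
import numpy as np
from math import comb

def joint_entropy(*columns):
    """Plug-in estimate, in bits, of H(X_1, ..., X_T) from sampled columns."""
    # Count each distinct joint outcome (x_1, ..., x_T) across the samples.
    _, counts = np.unique(np.column_stack(columns), axis=0, return_counts=True)
    p = counts / counts.sum()          # empirical joint probability distribution
    return -np.sum(p * np.log2(p))     # -E[log2 p(x_1, ..., x_T)]

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)         # a fair binary variable
y = rng.integers(0, 2, 10_000)         # an independent fair binary variable
print(joint_entropy(x))                # ~1 bit
print(joint_entropy(x, y))             # ~2 bits (independent variables add)

# The search space grows combinatorially: e.g., for a hypothetical dataset
# of N = 20,000 variables and tuples of size T = 3:
print(comb(20_000, 3))                 # ~1.3e12 candidate tuples
```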
Dependence between two variables $X$ and $Y$ can be directly measured with the mutual information $I(X;Y)$, defined as

$$I(X;Y) = H(X) + H(Y) - H(X,Y),$$

where $H(X)$ and $H(Y)$ are the single entropies of variables $X$ and $Y$ and $H(X,Y)$ is their joint entropy.
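Assuming the `joint_entropy` estimator from the sketch above, mutual information follows directly from this identity (again an illustrative sketch, not the MIST API):

```python
def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return joint_entropy(x) + joint_entropy(y) - joint_entropy(x, y)

print(mutual_information(x, x))   # ~1 bit: a variable fully determines itself
print(mutual_information(x, y))   # ~0 bits: x and y are independent
```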
A general dependence among three variables, $X$, $Y$, and $Z$, can be measured with the symmetric delta. To define it, we first need the interaction information, a multivariable generalization of mutual information (McGill, 1954), defined for three variables as

$$I(X;Y;Z) = I(X;Y) - I(X;Y \mid Z) = H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z).$$

Given the interaction information, the differential interaction information is defined as the difference between successive interaction informations arising from adding a variable:

$$\Delta_X(Y;Z) \equiv I(X;Y;Z) - I(Y;Z).$$

Here $\Delta_X$ is called the asymmetric delta for the target variable $X$. To detect a fully synergistic dependence among a set of variables, we want a single measure that is symmetric. Consequently, we defined a general measure $\tilde{\Delta}$, called the symmetric delta (or simply delta), by multiplying the $\Delta$'s over all possible choices of the target variable:

$$\tilde{\Delta}(X;Y;Z) \equiv \Delta_X \cdot \Delta_Y \cdot \Delta_Z.$$
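The whole chain, from interaction information through the asymmetric deltas to the symmetric delta, can be sketched in a few lines, reusing `joint_entropy` and `mutual_information` from the sketches above (illustrative code under the sign convention written here, not the MIST implementation):

```python
def interaction_information(x, y, z):
    # I(X;Y;Z) = H(X) + H(Y) + H(Z)
    #          - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
    return (joint_entropy(x) + joint_entropy(y) + joint_entropy(z)
            - joint_entropy(x, y) - joint_entropy(x, z) - joint_entropy(y, z)
            + joint_entropy(x, y, z))

def symmetric_delta(x, y, z):
    i3 = interaction_information(x, y, z)
    delta_x = i3 - mutual_information(y, z)   # asymmetric delta, target X
    delta_y = i3 - mutual_information(x, z)   # asymmetric delta, target Y
    delta_z = i3 - mutual_information(x, y)   # asymmetric delta, target Z
    return delta_x * delta_y * delta_z
```

Note that with this sign convention each asymmetric delta equals the negative of a conditional mutual information (for example, $\Delta_X = -I(Y;Z \mid X)$), so the product carries a sign; it is the magnitude that reflects the strength of the joint dependence.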
The critical property of this delta measure is that it is zero whenever any one of the variables is statistically independent of the other two; a significantly nonzero symmetric delta therefore indicates a dependence in which every variable of the tuple participates.
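A quick synthetic check of this property, continuing with `symmetric_delta` from the sketch above: an XOR triple is fully synergistic (no pairwise dependence, yet each variable is determined by the other two), while a triple that includes an unrelated variable scores approximately zero.

```python
rng = np.random.default_rng(1)
a = rng.integers(0, 2, 100_000)
b = rng.integers(0, 2, 100_000)
c = a ^ b                            # synergistic: c depends on (a, b) jointly
w = rng.integers(0, 2, 100_000)      # independent of all the others

print(symmetric_delta(a, b, c))      # magnitude ~1: all three variables participate
print(symmetric_delta(a, b, w))      # ~0: w is independent of the other two
```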