We deal with the relation between the generalized entropy (f-entropy, a family of functions that includes several biodiversity measures) of a discrete random variable and the minimal probability of error (Bayes error) incurred when the value of this random variable is estimated; specifically, we study how tight this relation is. Morales and Vajda recently introduced a measure called the average inaccuracy that quantifies the tightness of the relation between the posterior Bayes error and the power entropies. It is defined as a standardized average difference between the upper and the lower bound on the posterior Bayes error under a given entropy. Their concept can be generalized to any strictly concave f-entropy and used to evaluate its relation to the Bayes probability of error. However, owing to the complex form of the formula for the average inaccuracy, it is difficult to compare the average inaccuracies of most f-entropies analytically. We propose a smooth approximation of the lower bound on the posterior Bayes error under a given f-entropy that simplifies the formula for the average inaccuracy. We show that under this approximation the quadratic entropy has the tightest relation to the posterior Bayes error among f-entropies: in the sense described in this paper, its relation is tighter than that of Shannon's entropy and of other members of the f-entropy family, such as Emlen's index, Ferreri's index, and Good's index.
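As a concrete illustration of the basic quantities involved (a minimal sketch, not code from the paper), the quadratic entropy of a probability mass function p is the Gini-Simpson index 1 - Σ p_i², the Shannon entropy is -Σ p_i log p_i, and the posterior Bayes error of a guess based on p is 1 - max_i p_i:

```python
import math

def quadratic_entropy(p):
    # Quadratic (Gini-Simpson) entropy: 1 - sum of squared probabilities.
    return 1.0 - sum(pi * pi for pi in p)

def shannon_entropy(p):
    # Shannon entropy in nats; terms with zero probability contribute nothing.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def posterior_bayes_error(p):
    # Minimal probability of error when estimating the value of a discrete
    # random variable with posterior pmf p: guess the mode, err otherwise.
    return 1.0 - max(p)

# Example posterior distribution over three values.
p = [0.7, 0.2, 0.1]
print(posterior_bayes_error(p), quadratic_entropy(p), shannon_entropy(p))
```

Both entropies vanish exactly when the Bayes error does (a degenerate pmf) and are maximal for the uniform pmf, which is what makes bounds relating them to the Bayes error possible.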