Conjecture (Feige). If $X_1, \dots, X_n \ge 0$ are independent random variables with $\mathbb{E}[X_i] \le 1$ for every $i$, then
$$\Pr\big(X_1 + \dots + X_n < \mathbb{E}[X_1 + \dots + X_n] + 1\big) \ge \frac{1}{e}.$$
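As a quick numerical illustration (my own sketch, not part of the problem statement), one can estimate the left-hand probability by Monte Carlo for i.i.d. Exponential(1) variables, which are nonnegative with mean 1; for this distribution the estimate lands comfortably above the conjectured bound 1/e ≈ 0.368:

```python
import random

def estimate(n, trials=100_000, seed=0):
    """Monte Carlo estimate of Pr[X_1 + ... + X_n < E[X_1 + ... + X_n] + 1]
    for i.i.d. Exponential(1) variables (nonnegative, mean 1)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = sum(
        1
        for _ in range(trials)
        if sum(rng.expovariate(1.0) for _ in range(n)) < n + 1
    )
    return hits / trials

print(estimate(10))  # comfortably above 1/e ~ 0.368 for this distribution
```

Of course, such a check says nothing about the extremal two-point distributions discussed below; it only shows the bound is loose for "nice" distributions.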
In comparison to most probabilistic inequalities (like Hoeffding's), Feige's inequality does not deteriorate as $n$ goes to infinity, which is useful for computer scientists.
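To see why variance-based tail bounds are useless here, consider (my own illustration) the two-point variables $X_i$ taking the value $n+1$ with probability $\frac{1}{n+1}$ and $0$ otherwise: each has $\mathbb{E}[X_i] = 1$ but $\mathrm{Var}(X_i) = n$, so the variance of the sum grows like $n^2$ and Chebyshev-style bounds say nothing about deviations of order $1$, while the probability in question stays bounded away from $0$:

```python
def variance_of_sum(n):
    # Two-point variables X_i in {0, n+1} with Pr[X_i = n+1] = 1/(n+1):
    # E[X_i] = 1 and Var(X_i) = E[X_i^2] - 1 = (n+1) - 1 = n.
    p = 1.0 / (n + 1)
    var_single = (n + 1) ** 2 * p - 1.0
    return n * var_single  # independence: variances add, giving ~ n^2

def exact_prob_below(n):
    # Pr[sum < E[sum] + 1] = Pr[all X_i = 0] = (1 - 1/(n+1))^n,
    # which stays bounded below; it decreases toward 1/e ~ 0.368.
    return (1.0 - 1.0 / (n + 1)) ** n

for n in (1, 10, 100, 1000):
    print(n, variance_of_sum(n), exact_prob_below(n))
```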
Let $\delta := \Pr\big(X_1 + \dots + X_n < \mathbb{E}[X_1 + \dots + X_n] + 1\big)$. Feige argued that to prove the conjecture, one only needs to prove it for the case when $\mathbb{E}[X_i] = 1$ for every $i$ and each variable $X_i$ has the entire probability mass distributed on $0$ and $a_i$ for some $a_i \ge 1$. He proved that $\delta \ge 1/13$ and conjectured that the constant $1/13$ may be replaced with $1/e$. It was further conjectured that "the worst case" would be one of the following:
- one variable has $2$ as its maximum value and the remaining $n-1$ random variables are always $1$ (hence the probability that the sum is less than $n+1$ is $1/2$),
- each variable has $n+1$ as its maximum (hence the probability that the sum is less than $n+1$ is $\left(1 - \frac{1}{n+1}\right)^n \approx 1/e$).

One way to initiate an attack on this problem is to assume $a_1 = a_2 = \dots = a_n$ and argue that the case when each variable assumes $n+1$ with probability $\frac{1}{n+1}$ and otherwise $0$ is indeed the worst.
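The two candidate worst cases can be compared by direct computation (a sketch assuming the natural parameters for the two examples, i.e. two-point variables of expectation $1$): the single heavy variable gives probability exactly $1/2$, while the identical-variables case gives $(n/(n+1))^n$, which decreases toward $1/e \approx 0.3679$, explaining the conjectured constant. Feige's proven constant $1/13 \approx 0.077$ is far below both.

```python
def case_one_heavy(n):
    # One variable takes 2 w.p. 1/2 (and 0 otherwise); the other n-1 are
    # constantly 1.  The sum is below n+1 exactly when that variable is 0.
    return 1 / 2

def case_all_identical(n):
    # Every X_i takes n+1 w.p. 1/(n+1) (and 0 otherwise); the sum is below
    # n+1 exactly when all of them are 0.
    return (n / (n + 1)) ** n

# (n/(n+1))^n decreases toward 1/e ~ 0.3679 as n grows.
for n in (2, 10, 100, 10_000):
    print(n, case_all_identical(n))
```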
Bibliography
*[F04] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph. STOC '04: Proceedings of the Thirty-Sixth Annual ACM Symposium on Theory of Computing (2004), pp. 594–603. ACM.
*[F05] Uriel Feige: On sums of independent random variables with unbounded variance, and estimating the average degree in a graph. Manuscript, 2005.
The problem was also referenced at the blog population algorithms.
* indicates original appearance(s) of problem.
          