I don't know much about information entropy E, but I do know that it's usually defined as a function of a *partition* of the unit interval into disjoint subintervals, aka the probabilities p_i of a discrete probability measure.  So E is a function of the collection (multiset) of values P = {p_1, p_2, p_3, ...} and is defined via

E(P) = sum over i of (-p_i log(p_i))
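A minimal sketch of that definition in Python (the example partition and the use of natural logs are my assumptions, not something from the post):

    import math

    def entropy(P):
        """Entropy of a partition P = [p_1, p_2, ...] with sum(P) == 1."""
        # Terms with p_i == 0 contribute 0 by the usual convention.
        return sum(-p * math.log(p) for p in P if p > 0)

    # Example: unit interval split into pieces of length 1/2, 1/4, 1/4.
    print(entropy([0.5, 0.25, 0.25]))   # about 1.0397 (nats)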

Among partitions of size n, the maximum-entropy partition P_M(n) is the one into n equal intervals, giving E(P_M(n)) = log(n).
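A quick numerical check of that claim (again assuming natural logs, and an arbitrary n of my choosing):

    import math

    n = 4
    uniform = [1.0 / n] * n                        # n equal subintervals
    E_uniform = sum(-p * math.log(p) for p in uniform)
    print(E_uniform, math.log(n))                  # both are log(4) ~ 1.3863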

--Dan