It looks like I may have a final upper bound on the integer complexity problem. Recall that we defined f(n) to be the minimum number of 1s needed to write n using only addition, multiplication, and parentheses. Thus, f(6)=5 because we can write 6=(1+1)(1+1+1). Harry Altman and I have been working on improving the lower and upper bounds on this function. Prior posts about this problem can be found under the label integer complexity. Harry also has a large number of posts on the subject, but they aren't all tagged, so I'll just direct readers to the most recent one.
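To make the definition concrete, here is a minimal Python sketch (my own illustration, not from our write-up) that computes f(n) by dynamic programming, using the fact that an optimal expression for n > 1 splits at the top level into either a sum or a product of expressions for smaller numbers:

def complexities(N):
    # f[n] will hold the integer complexity of n, for 1 <= n <= N
    f = [0] * (N + 1)
    f[1] = 1
    for n in range(2, N + 1):
        # best top-level sum: n = a + (n - a)
        best = min(f[a] + f[n - a] for a in range(1, n // 2 + 1))
        # best top-level product: n = d * (n // d)
        for d in range(2, int(n ** 0.5) + 1):
            if n % d == 0:
                best = min(best, f[d] + f[n // d])
        f[n] = best
    return f

f = complexities(10)
print(f[6])  # prints 5, matching 6 = (1+1)(1+1+1)

This brute-force table is only practical for modest N, but it is handy for checking small cases by hand.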
I'm writing this post to announce that we have an improved upper bound for f(n). Prior to our work, it was known that for n ≥ 2 we have 3 log_2 n ≥ f(n) ≥ 3 log_3 n. It is clear that the lower bound is hit infinitely often (whenever n is a power of 3). We (and by we I mean mostly Harry) have worked on understanding which n come close to the lowest possible value, and on using this to understand other conjectures about f, such as the conjecture that if n = 2^a * 3^b then the most efficient representation of n is the obvious one.

However, I've also been working on improving the upper bound, and every time I've thought I was done improving it, I've found another way to tweak my proof to reduce the bound slightly. At this point, though, I'm confident that cannot happen again using the techniques in question, barring a substantially new insight. Thus, I'm now announcing the new upper bound: for n ≥ 2, f(n) ≤ α log_2 n, where α = 70/log_2(102236160) = 2.630853... Note that this constant is just small enough for us to also state the weaker but prettier result that f(n) ≤ (19/5) ln n (with the natural logarithm).

Unfortunately, the proof in question is very ugly, involving a large amount of case checking as well as some nasty calculation of inequalities. I'm in the process of writing up the last bit. If readers are interested, I can email them a draft of the proof when it is fully written.
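For readers who want to check the arithmetic behind the constant, here is a small Python snippet (a sanity check of the quoted numbers, not part of the proof) verifying the value of α and that it is small enough to give the (19/5) ln n form:

from math import log

alpha = 70 / log(102236160, 2)
print(alpha)                   # 2.630853..., the constant quoted above
print(alpha / log(2))          # about 3.7957: the coefficient of ln n in the new bound
print(alpha / log(2) < 19 / 5) # True, so f(n) <= alpha*log_2(n) implies f(n) <= (19/5) ln n
print(3 / log(3))              # about 2.7307: the lower-bound coefficient 3 log_3 n per ln n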