Here is a striking example of the unfortunate (but probably unavoidable) dispersion of mathematics: at the end of the abstract of V. Arnold’s talk at the recent conference in honor of A. Douady, one can read:

The Cesàro mean values K̂ of the numbers K(n) tend, as n tends to ∞, to a finite limit K̂(∞) = lim (1/n) ∑_{m=1}^{n} K(m) = 15/π². This theorem, deduced from the empirical observation of the coincidence of the first 20 digits, is now proved, using the formula K̂(∞) = ζ(2)/ζ(4).

Here, K(n) is defined earlier in the abstract by an explicit expression.

Arnold’s achievements as a mathematician are about as impressive as it gets. But the statement here is a completely elementary exercise in analytic number theory, and has been for at least a century (i.e., Dirichlet, or Chebychev, could have done it in a few minutes, if not Euler). Here’s the proof in Chebychev style:

hence, exchanging the sum over *n* and the sum over *d*, we get

and replacing the integral part by *X/d*+O(1), this is clearly asymptotic to *CX* with

which is an absolutely convergent series. As an Euler product it is

as desired.
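Since the displayed formulas above did not survive, a quick numerical sanity check may help. The definition of K(n) is also missing here, but the natural candidate consistent with the limit ζ(2)/ζ(4) = ∑_d μ²(d)/d² is K(n) = #{squarefree d : d² | n}; assuming that (my assumption, not necessarily Arnold’s stated definition), a short script confirms that the Cesàro mean approaches 15/π²:

```python
from math import pi

N = 100_000

# Assumed definition (hypothetical, since the abstract's formula did not
# render above): K(n) counts the squarefree integers d with d^2 | n.
K = [0] * (N + 1)

# Sieve squarefree d up to sqrt(N): d is squarefree iff no p^2 divides it.
limit = int(N**0.5) + 2
squarefree = [True] * limit
for p in range(2, int(limit**0.5) + 1):
    for m in range(p * p, limit, p * p):
        squarefree[m] = False

# For each squarefree d, add 1 to K(m) for every multiple m of d^2.
for d in range(1, int(N**0.5) + 1):
    if squarefree[d]:
        for m in range(d * d, N + 1, d * d):
            K[m] += 1

cesaro = sum(K[1:]) / N
print(cesaro, 15 / pi**2)  # both close to 1.52
```

With this definition the Cesàro mean at N = 10⁵ already agrees with 15/π² ≈ 1.5198 to about two decimal places, consistent with the O(1/√X) error in the argument above.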

Yes, you’re right that it’s simple to prove, but I don’t think it is an example of the dispersion of mathematics.

I humbly think Arnold is naturally getting a bit old now (born in 1937). One should also point out that he had an accident and suffered from amnesia in 1999 (from which he recovered, I believe, but perhaps it affects him now).

Since then, what he has published is less technical than his earlier papers; he doesn’t seem to be aware of today’s standards in number theory, and he doesn’t take criticism from referees very well, hence his rediscovering things instead of checking the literature (see the first part of this, which is quite funny; I don’t know whether what he was studying in that particular case was indeed new or not, but it may be worth checking, as he has had many great inspirations during his career).

At least it’s great to see he’s still active; many folks have stopped thinking even about less technical problems well before turning 70…

What I meant by “dispersion of mathematics” is the fact that even someone as strong as Arnold could have gone through his career without learning, or needing to learn, such really basic techniques. (Of course, he may have forgotten, but it’s really so simple that this would seem strange.)

On the other hand, I agree very much with some of his points in the link you gave (e.g., the fact that most mathematical articles are unreadable by anyone outside a vanishingly small group of people, and that “experts” in one subfield are often too dismissive of any effort by “outsiders” to participate in their field).

But there are various ways to be “simple”, and I see no reason to accept, as a research paper, an article that spells out exercises. There are excellent journals, like the American Mathematical Monthly, which are happy to publish good papers of this type.

To give a personal example (because this seems the easiest way to explain concretely): I have written one paper on probability theory, which gives a new construction of standard Brownian motion. I never thought of submitting it to a research journal, first because I was very unsure it was really new, and second because, even if it was, I could see that it was in a sense an exercise (or rather, something for a 2–3 hour exam in a beginning rigorous probability course). So I expanded the presentation and submitted it to the Monthly, where it fit very well.

Aside comment: it seems you are having trouble with the LaTeX code: have you tried without the backslash between $ and latex? It might work that way; at least, this is how it works at wordpress.com.

Yes, that was the problem… Thanks for pointing this out; I couldn’t understand why the previous post showed the LaTeX pictures correctly, but this one didn’t…

Here is another unfortunate example:

http://cdsagenda5.ictp.trieste.it/full_display.php?ida=a0850

although one can hope that Arnold didn’t claim to be the first to prove that the density of primitive lattice points in Zⁿ is 1/ζ(n), which is a really old result; the best-known version of it is due to Schanuel, for general number fields, with an error term, but the case of the rationals was certainly known before (see his paper from 1979).
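For what it’s worth, the n = 2 case of this classical density statement is easy to check numerically: the proportion of pairs (a, b) in [1, N]² with gcd(a, b) = 1 (the primitive, or “visible”, lattice points) tends to 1/ζ(2) = 6/π². A minimal sketch:

```python
from math import gcd, pi

N = 1_000  # sample the box [1, N]^2

# Count primitive lattice points, i.e. pairs with coprime coordinates.
primitive = sum(
    1 for a in range(1, N + 1) for b in range(1, N + 1) if gcd(a, b) == 1
)
density = primitive / N**2
print(density, 6 / pi**2)  # both close to 0.608
```

Already at N = 1000 the empirical density matches 6/π² ≈ 0.6079 to two decimal places, reflecting the O(log N / N) error term in the classical count.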