19

For every prime $p_i>2$ choose some $k_i\ge p_i$, $k_i \in \mathbb{N}$, and take the arithmetic progression $A_i=k_i+np_i$, $n \ge 0$. Is there any choice of the $k_i$'s such that $|\mathbb{N} \setminus \bigcup A_i | < \infty $?

ADDED: Does it make any difference if we omit some other prime number (not 2)?

  • Perhaps I'm misparsing. Couldn't you take $k_i=0$ for all $i$? – Cam McLeman Nov 09 '11 at 15:57
  • 2
    you must take $k_i \ge p_i$ – Asterios Gkantzounis Nov 09 '11 at 16:01
  • Do you allow $n$ to be negative in your progressions? – Ramiro de la Vega Nov 09 '11 at 16:14
  • 1
    no, $n$ must be positive – Asterios Gkantzounis Nov 09 '11 at 16:15
  • Okay, take $k_i=p_i$? – Cam McLeman Nov 09 '11 at 16:36
  • @Cam McLeman: $p_i>2$ – Asterios Gkantzounis Nov 09 '11 at 16:42
  • 4
    Ah, so you miss the powers of 2. Apologies. (Though there are a lot of not-so-natural-seeming technical conditions, at least one of which you omitted, and not a lot in the way of motivation, so I think some fumbling around can be excused/expected.) – Cam McLeman Nov 09 '11 at 16:45
  • 3
    Echoing what Cam says, is there a motivation for this precise choice of condition? Why do you omit exactly 2? And a minor point: the equivalent formulation where the $k_i$ are nonnegative and $n$ is strictly positive seems a bit clearer to me. –  Nov 09 '11 at 17:13
  • @quid: yes, maybe you are right; if you think it is important, feel free to edit. As for the motivation, it is a special case, the simplest one, of a more general question – Asterios Gkantzounis Nov 09 '11 at 17:22
  • 1
    Can you get a probabilistic estimate here, using the prime number theorem? – Will Sawin Nov 09 '11 at 18:31
  • @Will Sawin:you could try – Asterios Gkantzounis Nov 09 '11 at 19:15
  • 3
    This might be worth a try: for $p=3$ take $k=4$, then go greedy; for each $p\gt3$, take $k$ to be the smallest number not less than $p$ not already covered by some smaller $p$. The sequence of $k$-values begins 4, 5, 8, 11, 14, 17, 21, 23, 32, 38, 39, 41, 47, 48, 54, 62, 63. Pencil-and-paper doodlings suggest this misses very few numbers after 27, so it may be worth putting a computer onto it. – Gerry Myerson Nov 09 '11 at 22:28
  • There is a lot of work on finite covering systems, which you may be aware of. See, e.g., http://math.nju.edu.cn/~zwsun/Cover.pdf – Frank Thorne Nov 10 '11 at 00:04
  • 4
    I don't think we need a probabilistic estimate here, because we already know that it is possible to get all positive integers, except for a set of density zero (the powers of 2) – Woett Nov 10 '11 at 00:34
  • 1
    Without restrictions on k, it is possible to get a full cover. However, this translates to a question on prime number growth versus coprime number growth: for any finite set of primes, the offsets can be merged by the Chinese Remainder Theorem into a single offset (cf Aaron Meyerowitz's remarks at http://mathoverflow.net/questions/57564/ ). The question now becomes covering the set of integers coprime to a large product of primes, while maintaining a condition on the covering sets. My gut says no: too many "small" coprimes. Gerhard "Ask Me About Jacobsthal's Function" Paseman, 2011.11.09 – Gerhard Paseman Nov 10 '11 at 01:01
  • @Gerhard - there you go again about Jacobsthal's function... – David Roberts Nov 10 '11 at 02:45
  • Yes David, there I go again. If I were to indulge in rash speculation, I would say that the answer to asterios's question follows from how tight a bound one gets on Jacobsthal's function. However, I think the answer has to do as much with how numbers coprime to a given n are distributed, and that no matter how good a cover you start with, you will still have lots of small constellations of coprimes that are missed by the cover. Having good bounds on Jacobsthal's function is a start to (at least my) understanding of such a distribution. Gerhard "Ask Me About Gaps, Then" Paseman, 2011.11.09 – Gerhard Paseman Nov 10 '11 at 03:41
  • My thoughts match @Gerry's: a greedy progression should cover all but finitely many numbers. Mertens' theorem gives a good heuristic here. – Charles Nov 10 '11 at 16:18
  • Charles, suppose I create a repeating pattern of S integers with a period of length N>>S. You are allowed to place a prime arithmetic progression, but it can't start too close to the origin, and each time we replace the pattern with one that has period N times a prime. I feel that too many small clusters arise to be covered under the restrictions. Gerhard "Ask Me About System Design" Paseman, 2011.11.10 – Gerhard Paseman Nov 10 '11 at 16:30
  • I should be more clear. In each period there are N-S integers to be covered. For each prime p , after placement you get a similar problem with N replaced by pN and less than 1/p of the job done. Gerhard "Ask Me About System Design" Paseman, 2011.11.10 – Gerhard Paseman Nov 10 '11 at 16:44
  • Darn that Cantor! I now suspect there are uncountably many such patterns k_i, and possibly one that meets the conditions of the problem, and that it does not matter which finite set of primes is dropped. I have no proof, and further suspect that the sequence k_i is not constructible within ZF. It makes sense to me that the sequence k_i is not a finite perturbation of the p_i, however. Gerhard "Wants To Talk With Kronecker" Paseman, 2011.11.10 – Gerhard Paseman Nov 10 '11 at 17:10
  • @Woett: I was planning to estimate the size of the set, not its density. The size being, of course, the integral of the density over time. – Will Sawin Nov 13 '11 at 20:35
  • @Gerhard: the truth is that the patterns of the $k_i$'s are countably many. – Asterios Gkantzounis Nov 13 '11 at 21:36
  • Wait, you'd like to use only finitely many sequences, right? Otherwise, you can just use $ k_i = i $ to cover all integers. – Zsbán Ambrus Nov 15 '11 at 10:34
  • @Zsbán Ambrus: you should read the question more carefully – Asterios Gkantzounis Nov 15 '11 at 15:52
  • Ah, you restrict the starting term as $ k_i \ge p_i $. – Zsbán Ambrus Dec 14 '11 at 13:36

4 Answers

13

I've completely changed my mind but I leave the old answer to explain the comments.

It seems quite likely that there is a choice of residues which misses only the 40 integers

$1, 2, 3, 5, 6, 8, 9, 11, 14, 15, 18, 20, 21, 23, 29, 30, 33, 36, 41, 44, 51,$

$53, 54, 56, 63, 65, 68, 69, 71, 75, 78, 81, 84, 86, 90, 93, 95, 96, 98, 99.$

It arises from the following semi-greedy procedure:


  1. Only worry about integers starting at $s=100$ ($s=90$ is not enough).

  2. At each step, take the smallest integer $t \ge s$ not yet covered and attempt to cover it with the smallest unused odd prime $p$ such that $t+p$ is also not yet covered and $p \le t.$

  3. If there is no such prime then simply use the smallest unused prime (if it is less than $t$, otherwise, STOP!).

  4. Whatever prime is chosen, take the arithmetic progression $A=r+np$ for $n \ge 1$, where $r = t \bmod p$.


So $1+3n$ knocks out $100,103,106,109,112,115,118,121,\dots$ leaving $101$ next. Since $106$ is already covered we use $3+7n$, covering $101,108,122,\dots$; now $2+5n$ works for $102,107,117,\dots$ Next is $104$, and since $104+p$ is already covered for $p=11,13,17$, we use $9+19n$.
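For reproducibility, the procedure above can be sketched in a few lines of Python. This is my own reconstruction, not the original code; `prime_bound` and `horizon` are cutoffs I introduced, and `steps` limits how many targets are processed.

```python
def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [p for p in range(2, n + 1) if sieve[p]]


def semi_greedy_cover(s=100, steps=20, prime_bound=50000, horizon=200000):
    """Run `steps` rounds of the semi-greedy procedure; return the chosen
    [p, r] pairs, or None if step 3 runs out of primes below the target."""
    odd_primes = primes_up_to(prime_bound)[1:]        # skip 2
    unused = set(odd_primes)
    covered = [False] * (horizon + 1)
    chosen = []
    t = s
    for _ in range(steps):
        while covered[t]:                             # next uncovered target
            t += 1
        # step 2: smallest unused odd prime p <= t with t + p still uncovered
        p = next((q for q in odd_primes if q in unused and q <= t
                  and t + q <= horizon and not covered[t + q]), None)
        if p is None:                                 # step 3: fallback
            p = min(unused)
            if p >= t:
                return None                           # STOP
        unused.remove(p)
        r = t % p
        chosen.append([p, r])
        for x in range(r + p, horizon + 1, p):        # step 4: cover r+np, n >= 1
            covered[x] = True
    return chosen
```

Running `semi_greedy_cover(steps=8)` reproduces the start of the residue list below: `[[3, 1], [7, 3], [5, 2], [19, 9], [11, 6], [31, 17], [17, 9], [13, 9]]`.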

The residues chosen start out

$[3, 1], [7, 3], [5, 2], [19, 9], [11, 6], [31, 17], [17, 9], [13, 9], [41, 32], [37, 8], [53, 14],$

$ [43, 39], [67, 64], [61, 12], [23, 20], [79, 61], [89, 55], [103, 43], [47, 12], [29, 14]$

Details: I followed this procedure using the $5132$ odd primes up to $49999.$ The number of unused primes less than $t$ (the first uncovered integer) starts at $24$ when $t=100.$ It gets as low as $7$ a few times, the last being when $t=1419.$ After $t=4925$ there are always at least $10$ unused primes below $t$, and from then on the count seems to grow fairly steadily. After $t=33338$ there are (as far as I went) at least $500$ unused primes, and after $4341$ steps, $t=49980$ with $789$ unused primes available. I used up the remaining primes under $50{,}000$ (without checking whether larger primes would be preferred by step 2). At step $5132$ the prime $43973$ was used for $t=60465.$ This left the next target at $t=60471$ with all $965$ primes $50000 \lt p \lt 60471$ as yet unused.

Other starting values $s$, and the $t$ at which no available prime is left, are:

$[10, 24], [20, 55], [30, 146], [40, 189], [50, 393], [60, 553], [70, 935], [80, 1969], [90, 4898].$

A pure greedy strategy of starting at, say, $s=1000$ and always using the smallest unused odd prime seems to fail fairly quickly (perhaps in about $s$ steps). The semi-greedy procedure stems from the idea that the main obstacle is the smallest uncovered integers.

It may be better not to wait too long to use the smallest unused prime. Alternatively, it might be better to look a little further ahead in hopes of having $t$ along with two of $t+p,t+2p,t+3p$ all newly covered.

Old answer (this is left only to explain the comments)

I'll mildly change the notation without changing the question.

For every prime $p_i>p_0=2$ choose a residue $0 \le r_i\lt p_i$ and take the arithmetic progression $A_i=r_i+np_i$, $n \gt 0$. Let $M=\mathbb{N} \setminus \bigcup A_i $ be the finite or infinite set of missed integers and let $m_j$ be the $j$th member of $M$ (set $m_j=\infty$ if $M$ has fewer than $j$ elements). Once we have the residues up to $r_i$ we know $M \cap \lbrace 1,2,\cdots,p_{i+1}-1 \rbrace$ and hence $m_j$ up to some point. So $m_0,\dots,m_4$ could be $1,2,4,8,16$, but only if we choose $r_i=0$ for the first five primes. Otherwise we could have $1,2,4,m_3,m_4$ with $m_4 \le 13.$ If we take $r_1=1$ then $M$ can begin $1,2,3,6,9,12$ (the next choice is for $p_5=13$).

The greedy procedure is to take $r_i=0$ and get $m_j=2^j.$ The choice $r_i=1$ gives $m_j=2^j+1.$ Gerry suggests taking $r_1=1$ and then making greedy choices. Up to $p_{30}=127$ this gives the $r$ values $1, 0, 1, 0, 1, 0, 2, 0, 3, 7, 2, 0, 4, 1, 1, 3, 2, 5, 3, 8, 4, 1, 0, 1, 0, 1, 1, 2, 1, 1$ and $m$ values $1, 2, 3, 6, 9, 12, 18, 24, 26, 42, 56, 86, 87, 93, 96, 117, 122, 126.$ This does not even look like exponential growth (even when extended to $p_{96}=509$). It seems that $r_1=2$ followed by greedy choices might be a little better, but still subexponential.
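Gerry's greedy variant is easy to reproduce. Here is a rough sketch (my own reconstruction, assuming the greedy rule is: fix $r_1$, then for each later odd prime $p$ take the residue of the smallest still-uncovered integer $m \ge p$, so that $m=r+np$ with $n\ge1$):

```python
def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [p for p in range(2, n + 1) if sieve[p]]


def greedy_residues(prime_bound, horizon, r_first=1):
    """For each odd prime p in order, pick r = m mod p where m is the
    smallest uncovered integer with m >= p; the very first residue is
    overridden by r_first (Gerry's r_1 = 1)."""
    odd_primes = primes_up_to(prime_bound)[1:]   # skip 2
    covered = [False] * (horizon + 1)
    residues = []
    for i, p in enumerate(odd_primes):
        if i == 0:
            r = r_first
        else:
            m = next(x for x in range(p, horizon + 1) if not covered[x])
            r = m % p
        residues.append(r)
        for x in range(r + p, horizon + 1, p):   # cover r + np, n >= 1
            covered[x] = True
    missed = [x for x in range(1, prime_bound + 1) if not covered[x]]
    return residues, missed
```

With `greedy_residues(prime_bound=127, horizon=2000)` the residues begin $1, 0, 1, 0, 1, 0, 2, 0, 3, 7, \dots$ and the missed values begin $1, 2, 3, 6, 9, 12, 18, 24, 26, 42, \dots$, matching the lists above.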

I made the rash

claim: no matter how the $r_i$ are chosen, $m_j \le 2^j.$

NOTE: if my newer conjecture is true then for my chosen residues, $m_{40}=99$ but $m_{41}=\infty$

I made the even rasher claim below but Noam shut it down decisively.

claim: no matter how the $r_i$ are chosen, every integer interval $[x,2x-1]$ contains an $m$ value.

  • 5
    Counterexample to the final claim: $x = 8$ and $$ \begin{array}{c|ccccc} p_i & 3 & 5 & 7 & 11 & 13 \\ \hline r_i & 8 & 10 & 9 & 12 & 13 \end{array} $$ – Noam D. Elkies Nov 14 '11 at 03:56
  • that's nice, it even knocks out [8,17] – Aaron Meyerowitz Nov 14 '11 at 04:08
  • 3
    And thus knocks out $[8,20]$, since we can use $17$ and $19$ to cover $18$ and $19$ respectively, and $20$ is covered already. [I deleted my follow-up comment that claimed the same residues also cover $1,2,5$, because that requires that some $r_i < p_i$.] – Noam D. Elkies Nov 14 '11 at 04:33
  • If one lets M be an upper bound on the missed integers, and lets r_i (without loss of generality) satisfy p_i <= r_i < 2p_i, then one has an interval of length M of some missed integers followed by (after adding the progressions using all the allowed primes below M) a repeating pattern of period O(4^M) which contains few missed numbers, and the r_i can be chosen to achieve covering numbers in (M, M+j(prod)) where prod is the product of the primes less than M. Gerhard "Ask Me About Jacobsthal's Function" Paseman, 2011.11.13 – Gerhard Paseman Nov 14 '11 at 06:44
  • I'm not sure if I understand the first claim, but I suspect that it would follow from a result of Kanold (j(m) <= 2^n) on Jacobsthal's function. The problem solution would follow if we could show that the repeating pattern above can't be covered by a.p.'s slowly enough, i.e. that r_j <= p_j for a prime p_j with M < p_j < prod. Of course, if I could convince myself of that, I would be writing this in an answer box instead of here. Gerhard "Ask Me About System Design" Paseman, 2011.11.13 – Gerhard Paseman Nov 14 '11 at 06:52
  • Westzynthius's article shows that you can have "big" intervals without using many primes. This could be a problem for claims about large numbers, but I am not sure; I am working on it.... http://mathoverflow.net/questions/37679/erik-westzynthiuss-cool-upper-bound-argument-update – Asterios Gkantzounis Nov 15 '11 at 06:33
  • @Gerhard Paseman: Gerhard, your opinion about the last comment? – Asterios Gkantzounis Nov 15 '11 at 06:34
  • Westzynthius is good if you pick one target in advance and say "I will fill that target interval with small primes." It looks to me like you are asking to pick successively larger targets to cover with small and medium primes; I don't know enough about the distribution to say that you can hit all the targets with the mild constraints you impose. I suspect that it will fail for a reason like not covering the powers of 2 "fast enough". However, I don't know what "fast enough" is. Gerhard "Not Trying To Be Crazy-making" Paseman, 2011.11.15 – Gerhard Paseman Nov 15 '11 at 19:27
  • An interesting note: starting with $s=100$ there were $114$ times that I was unable to knock out $t,t+p$ at the same time. But starting with $s=1000$ that never happened. So it seems likely that the evolving pattern of covered and uncovered above $s$ is invariant as long as $s$ is large enough. – Aaron Meyerowitz Nov 17 '11 at 15:49
  • Your cover provides a lower bound for certain j(m) where m is the product of some odd primes up to some N. If you can give me an idea as to the growth rate from your procedure, I might be able to determine its feasibility. Let a(n) be the length covered after using the nth prime ap(n). Can you provide the table for me for n up to 50? I imagine it starts (3,1),(7,2),(5,4),(11,5 or 6). To be clear, the second component is a(n) and is not always the length of the longest interval covered, but is related to the next uncovered integer. Gerhard "Ask Me About System Design" Paseman, 2011.11.17 – Gerhard Paseman Nov 17 '11 at 17:04
  • My suggested table uses 11 as ap(4), but using your example I should have 19 instead. Gerhard "Ask Me About System Design" Paseman , 2011.11.17 – Gerhard Paseman Nov 17 '11 at 17:10
  • I do not feel sure, given your suggestions, that such a cover works – Asterios Gkantzounis Nov 17 '11 at 18:13
  • Gerhard, I'm not sure it helps since I use the primes in a somewhat scrambled order $3, 7, 5, 19, 11, 31, 17, 13, 41, 37, 53, 43, 67, 61, 23, 79, 89, 103, 47, 29$.

    Define $J(p_i)$ to be the largest gap between numbers relatively prime to the first $i$ odd primes. Then $J(p_i)=p_{i-1}-1$ for the primes $5,7,11,13,17,19$. So the $16$ integers from $2394885$ to $2394900$ all have an odd divisor $19$ or smaller, giving $J(19)=17-1=16$. Going to the trusty OEIS http://oeis.org/A072752 (Maximum gap in one-stage prime-sieves) seems to say that the values continue $19, 22, 28, 32, 36, 44, 49, 52, 58$

    – Aaron Meyerowitz Nov 17 '11 at 21:46
  • Asterios, I don't have any heuristic justification, although perhaps there is one. All I can say is that it works for the first 5000 primes (up to 50,000), missing 40 integers under 100 and then working up steadily to a surplus of nearly 1000 unused primes smaller than the first uncovered integer. And it is a fairly naive procedure. I did not expect that. – Aaron Meyerowitz Nov 17 '11 at 23:25
  • While it is a nice simulation, I suspect that one will not be able to continue it because of the scarcity of small primes. Indeed, I wonder why one needs to start at 100 instead of 50 for such an experiment. Gerhard "Hopes To Post Again Soon" Paseman, 2011.11.17 – Gerhard Paseman Nov 18 '11 at 05:54
  • Gerhard, Maybe the algorithm could be tuned up but as it stands it avoids trouble starting at 100 but not at 90. What do you mean about small primes? Can you predict where the cushion of number of available primes less than the first uncovered integer (larger than 100) will start to erode? – Aaron Meyerowitz Nov 18 '11 at 06:43
  • I have just posted an answer which contains my guess. At this stage I cannot back it up with calculations, but I have started that process and perhaps others can finish. For your example, I think we won't see erosion before 10^15 is reached. The effect might be easier to see starting from 90, but it should be real telling starting from 50. I suspect even after O(10^5) trials using different starting moduli, the available primes will thin out before 10^8. Gerhard "Ask Me About System Design" Paseman, 2011.11.18 – Gerhard Paseman Nov 18 '11 at 16:57
7

Here is a graph generated by the first 7500 steps of the method described above. At each stage it finds the smallest uncovered integer $m$ greater than 100 and covers it with a progression $r_i+np_i$ for $n\ge1.$ The last few primes chosen and corresponding $m$ covered are

$[74099, 94245], [74297, 94263], [75329, 94281], [77893, 94283], [74903, 94296],$ $[77479, 94334], [77611, 94355], [77659, 94361], [74897, 94371], [77977, 94403]$

At this stage the gap $m-p_i$ appears to be around $16500$ for $m \bmod 3 = 1$ and $19500$ for $m \bmod 3 = 2.$

The graph itself shows the number of unused odd primes $p \lt m$ at each stage. Starting after step 1000 or so it seems to increase pretty reliably at an average rate of slightly over $0.23$ for each step.

*(image unavailable: graph of the number of unused odd primes $p \lt m$ at each step)*

  • +1, I am convinced (in an experimental sense). – S. Carnahan Nov 18 '11 at 09:54
  • @S. Carnahan: I am not convinced. How many numbers so far are out of the cover? Compare it with $2^k$ (where $k$ is this number). I think the question remains fully open – Asterios Gkantzounis Nov 18 '11 at 16:42
  • There are 40 integers which will for sure never get covered. The largest is 99. The largest prime used is under 100,000 and the smallest uncovered integer (after 99) is well over 110,000. The calculations carry on a bit further than shown. – Aaron Meyerowitz Nov 18 '11 at 17:05
  • Aaron, your semi-greedy strategy (greedy from a number on) is interesting, and I want to thank you very much for your efforts and your interest, but the question whether the non-sieved numbers from these greedy methods grow faster than the primes is, as it seems, too sharp to be answered analytically(?). $2^{40} \gg 110{,}000$ so... – Asterios Gkantzounis Nov 18 '11 at 17:16
  • If I have time maybe I will try to tune it up a bit. I might be able to get the missed numbers to be a smaller set if I did not care that some were relatively large. Gerhard would seem to prefer that we miss 59 integers less than 90 (if possible) – Aaron Meyerowitz Nov 18 '11 at 18:54
  • Actually, you should be able to translate your results by subtracting 10 from many of your residues. Gerhard "Ask Me About System Design" Paseman, 2011.11.18 – Gerhard Paseman Nov 19 '11 at 00:16
  • Gerhard, I'm not sure exactly what you mean. I can get it down to about 20 integers missed all under 50. – Aaron Meyerowitz Nov 19 '11 at 05:30
  • In your exposition where you say 1 + 3n knocks out 100 etc, 3 + 7n knocks out 101, etc, I am saying -9 +3n knocks out 90 etc., -7 + 7n knocks out 91, etc. Just take away 10 mod p_i from the r_i when p_i are small enough. Gerhard "Ask Me About System Design" Paseman, 2011.11.18 – Gerhard Paseman Nov 19 '11 at 06:42
  • I might be wrong about 20 under 50. What you suggest may have worked until crashing around 4098. At any rate, I now think it is best to take the largest possible prime if one is not going to kill $m$ and $m+p$. Also, maybe it is best to find the closest uncovered integer $m'$ after $m$ with $m'-m$ an unused prime less than $t$ and use that prime. – Aaron Meyerowitz Nov 19 '11 at 07:03
  • 2
    ImageShack seems to have deleted your image and replaced it with an ad banner. If you still have it (or can reproduce it), please reupload to the SE imgur account using the image upload button in the editor toolbar. – Ilmari Karonen Aug 29 '15 at 21:24
6

If you omit the condition that $k_i\ge p_i$, then here is an answer: for every integer $n$ there is some odd prime dividing $2n+1$. So choosing the $k_i$'s so that $2k_i+1\equiv0\pmod{p_i}$ provides a complete covering of the integers (by the congruence classes $\frac{p-1}2\pmod p,\;p>2$). Note that with the condition that $k_i\ge p_i$ this does not cover the numbers $(p-1)/2$.
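This covering is easy to verify numerically. A minimal sketch (my own check, not from the answer): for each $m$, the smallest odd divisor of $2m+1$ is an odd prime $p$, and $m$ lies in the class $\frac{p-1}{2} \pmod p$.

```python
def smallest_odd_prime_factor(m):
    """Return the smallest odd prime dividing 2m + 1. One always exists,
    since 2m + 1 is odd and >= 3, and the smallest divisor > 1 is prime."""
    x = 2 * m + 1
    return next(d for d in range(3, x + 1, 2) if x % d == 0)


def check_cover(bound):
    """Verify that every m in [1, bound] lies in the class (p-1)/2 mod p
    for some odd prime p, i.e. m = (p-1)/2 + n*p with n >= 0."""
    for m in range(1, bound + 1):
        p = smallest_odd_prime_factor(m)
        assert m % p == (p - 1) // 2
    return True
```

Here `check_cover(3000)` returns `True`; the inequality $2m+1 \ge p$ guarantees $n \ge 0$, which is why dropping the condition $k_i \ge p_i$ is essential.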

  • Another cover is to assign the k_i using an enumeration of whatever integers are to be covered. The challenge is to meet the desired inequalities. Gerhard "Ask Me About System Design" Paseman, 2011.11.11 – Gerhard Paseman Nov 12 '11 at 07:40
3

I like Aaron Meyerowitz's efforts and think his and similar methods deserve further study. I want to post my skepticism as a counter, and hope that something will arise from the contrast. I do not consider this post an acceptable answer, though.

The problem is essentially a shifted sieving problem. After taking the first $n$ (finitely many) primes $q_i$ with offsets $r_i$, one has an eventually periodic pattern of uncovered integers: it repeats with period $Q_n = \prod_{i \leq n} q_i$, contains $U_n = \prod_{i \leq n} (q_i - 1)$ uncovered numbers in each period, and its first period starts somewhere near $M_n = \max_{i \leq n} r_i$.

If the $q_i$ are the primes in ascending order, Mertens' theorem gives $U_n = O(Q_n/\log q_n)$, which is roughly $n$ times the number of primes in the interval $(M, M + Q_n)$ when $n$ gets large, especially when $n$ is comparable to the largest integer $M$ allowed to be uncovered.
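As a numerical illustration of this Mertens estimate (my own quick computation; skipping $p=2$ makes the asymptotic density of uncovered residues $2e^{-\gamma}/\log q_n$):

```python
import math


def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return [p for p in range(2, n + 1) if sieve[p]]


def uncovered_density(n):
    """Return U_n / Q_n = prod_{i<=n} (1 - 1/q_i) over the first n odd
    primes, together with the Mertens asymptotic 2*exp(-gamma)/log(q_n)."""
    gamma = 0.5772156649015329          # Euler-Mascheroni constant
    odd_primes = primes_up_to(10 ** 5)[1:]   # ample pool for moderate n
    density = 1.0
    for q in odd_primes[:n]:
        density *= 1 - 1 / q
    q_n = odd_primes[n - 1]
    return density, 2 * math.exp(-gamma) / math.log(q_n)
```

Already for $n=10$ (so $q_n=31$) the exact density is about $0.306$ against an asymptotic value of about $0.327$, and the agreement improves slowly as $n$ grows; meanwhile the count of primes available in an interval of length $Q_n$ is smaller by roughly a factor of $n$, which is the tension described above.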

If the distribution of coprimes to $Q_n$ were amenable to being nicely covered by arithmetic progressions of primes less than $q_n$, I might share Aaron's confidence. However, each later prime $q$ used is itself coprime to $Q_n$, and with small deviation will cover only about $1/q$ of what needs to be covered. I suspect that when $n$ gets to be about $Q_{24}/2$ using Aaron's sequence $Q_i$, he will run short on primes. It might be prudent to try more extensive simulations which leave no numbers greater than 50 uncovered.

Gerhard "Saying As I Feel It" Paseman, 2011.11.18