
Disclaimer: this is a bit off-topic, but I suspect that TeX's designers must have considered this at one point or another.

Why do all pages in a book have to have exactly the same text width? (and font size, for that matter)

I understand that the Knuth-Plass strategy for breaking paragraphs into lines is already a complex algorithmic challenge, and apparently its natural extension to page layout is even more challenging (it is typically not implemented; a more naive approach is used instead). Adding new degrees of freedom might simply make the problem intractable in practice (my guess).
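
As a point of reference, here is a minimal sketch (in LaTeX, with purely illustrative values) of the knobs TeX exposes today on the two sides of the problem: line breaking within a paragraph is globally optimized (Knuth-Plass), whereas pages are filled greedily, line by line, steered only by penalties and glue:

    % Illustrative values only, not recommendations.
    \documentclass{article}
    \usepackage{lipsum}       % dummy text for the demo

    % paragraph (Knuth-Plass) side: a global optimizer with tunable costs
    \tolerance=2000           % allow looser lines (higher badness) before complaining
    \emergencystretch=1em     % extra stretchability tried in a final pass

    % page side: no global re-optimization, only penalties on bad breakpoints
    \clubpenalty=10000        % forbid a lone first line at the bottom of a page
    \widowpenalty=10000       % forbid a lone last line at the top of a page

    \begin{document}
    \lipsum[1-10]
    \end{document}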

But perhaps not. The way I see it, the requirement that one page and the next have exactly (to a fraction of a mm) the same width and font size is not based on a typographic ideal (no human would notice, when turning the page of a novel, if the line width were half a mm wider, for example, or if the font scaled to 11.02pt). It is clearly a technical constraint inherited from previous technologies, and imposed from the start. As such, I wonder whether the design of TeX ever considered including it as a variable in the global problem. From a purely aesthetic point of view, I would rather let the page layout breathe ever so slightly than stretch the inter-word spacing so much, which even in the best of cases happens too often for my eyes. Of course there are cases (grid-design applications, etc.) where this would not be appropriate, just as ragged-right vs. justified text suits different contexts.
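
For what it's worth, the closest existing mechanism to this kind of breathing is pdfTeX's font expansion, as exposed by the microtype package: glyph widths within a line may vary by a small percentage so that the line breaker does not have to stretch the word spaces as much. A minimal sketch (the amounts are in thousandths of the natural width and are illustrative only):

    % Minimal sketch: glyphs may stretch or shrink by up to 2% (20/1000),
    % in steps of 0.5%, so the lines "breathe" instead of the word spaces.
    \documentclass{article}
    \usepackage[stretch=20,shrink=20,step=5]{microtype}
    \usepackage{lipsum}       % dummy text for the demo
    \begin{document}
    \lipsum[1-5]
    \end{document}

This still keeps the measure itself rigid, though; as far as I know there is no counterpart that lets the text width or font size of a whole page vary by the tiny amounts described above.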

Any thoughts/references along those lines?

Edit: in response to the (very reasonable) suggestion that "ideally you would typeset your content on a grid, such that the two sides of a page would line up perfectly and the content would overlap", I want to clarify a couple of things (to make the question interesting):

  • I am not suggesting drastic or even visible changes, but very minor breathing, in line with microtypographic adjustments
  • printed pages are never exactly aligned (hence page bleed, etc.)
  • ideal paper is not see-through, and partial transparency is already a problem regardless of page layout (content will never line up, except for contrived examples of Lorem Ipsum)
  • computer screens display pdfs without this see-through inconvenience
  • I believe grid typesetting is but one strategy (albeit often the best), and in some specific designs perhaps not the optimum
  • it's largely tradition, based on the requirements of metal type. however, even if not constrained by metal, it's still likely that a paragraph might end on one page and continue on the next. so the flexibility you suggest might result in varying paragraph widths on the same page (assuming division into pages), or in adjacent paragraphs in a scrolling text. that might cause a reader to become slightly seasick, depending on how delicately it is applied. – barbara beeton Apr 04 '18 at 01:32
  • @barbarabeeton I do not follow: why would such a split paragraph have to keep a constant width? Would you impose a similar constraint in (exact) line spacing when a paragraph carries over? (other than ease of implementation for the algorithms, which are probably non-interacting) – user159867 Apr 04 '18 at 01:51
  • "Philosophically", the origin of TeX starts with the idea that there exist such things as "real books" (DEK's words) and the goal/challenge is to approach their quality using computer means. There are centuries of typographic tradition saying that page widths should not vary randomly. If you're willing to throw out centuries of typographic practice a great many new things become possible, but you may want to consult the opinion of experts first, about the aesthetics. Even "micro typography" is controversial, and e.g. Thanh's thesis touches on it too. – ShreevatsaR Apr 04 '18 at 01:59
  • Thanks for the interesting perspective. BTW I do have some knowledge of broader typography standards. Computerised typography was a quantum leap, just as type was to calligraphy. I see no reason to "throw away" anything, but only explore new possibilities offered by technology. A monk/scribe hand-writing a manuscript would have written around holes in the imperfect parchment, and no two pages would have had pixel-perfect sizing (whatever that means, now that content is virtual). I agree that aesthetics implications may be good or bad, but is this a reason to dismiss the idea? – user159867 Apr 04 '18 at 02:11
  • No not dismissing; I think it's an interesting idea, along the same lines as microtypography (letter-spacing, glyph scaling, etc). (BTW I didn't give references earlier to it being controversial; I was thinking of this and this.) Was just explaining why it's unlikely that TeX's original designer would have considered this. (Apart from of course technological constraints of the time: limited memory, slow computers etc.) – ShreevatsaR Apr 04 '18 at 03:48
  • @ShreevatsaR interesting references, thanks. My personal view is that all techniques that provide additional freedom are worth considering; their good use on the other hand can be a tricky question (and different people will have different views, as with all art forms). – user159867 Apr 04 '18 at 08:27
  • Different perspective: I write fiction, not math or theses. I find that the best typesetting is produced by the simple procedure of rewrite-to-fit. There is nothing a character says that couldn't be said with more or fewer words. There is nothing the narrator says that could not have been said differently. Result: Only 20 hyphens in 232 pages, and those being for words such as "somewhere" that break naturally. No widows or orphans, with constant lines per page, no paragraph glue, and flush bottoms. Really looks typeset. –  Apr 05 '18 at 02:25
  • @RobtAll Interesting perspective, thanks. The author of "Trees, maps and theorems" did this to make all content the same rectangular shape. I personally dislike the idea (I like to think that some great pieces of literature could never be the same if so much as one letter changed), but I also see the form and content as more synergetic than often acknowledged, so I can see some attraction (within limits). As far as computerising the process, it would require a computer that understands language and its nuances; not such a crazy thought as it once was. – user159867 Apr 05 '18 at 02:38
  • I believe @barbarabeeton's point was that, in the TeX world, pages and paragraphs are 'non-interacting', as you said. So the question is either more one of design-in-general (and interesting but maybe better suited on a different site), about whether *TeX can be tortured into making paragraphs of multiple widths on the fly as the pages break (currently the answer is 'no', I believe), and/or something else altogether. Can you clarify your question? – jon Apr 05 '18 at 04:37
  • @user159867 The way things are going, at about the time that computers can accurately parse language and its nuances, humans will have lost the ability to read. So I wouldn't worry about it. Certainly, the university students in this vicinity (regardless of gender) talk the way that naughty 13 year-old boys did, in my generation. I hardly see anyone reading a book, other than required reading, and that's the public at large. –  Apr 05 '18 at 04:42
  • @jon correct, the question is more general than about TeX itself; I was mostly curious whether such interaction(s) had been considered in the early design (I'm currently reading 'Digital Typography', but not very far in). I meant this as an 'open question' to gather references and thoughts, but if it's unsuitable it can obviously be closed. – user159867 Apr 05 '18 at 04:57
  • indeed, @jon's comment about mine is accurate. as you have been reading "digital typography", you probably know by now that knuth's target was a very specific series of books, "the art of computer programming", for which he was trying to match earlier monotype copy. it was only after seeing early tex output that other potential users appeared in droves. but the basis remains the traditional printed book, and knuth has retired from the fray, except to fix bugs. a new champion would be needed to redesign the foundations. – barbara beeton Apr 05 '18 at 16:32
  • @barbarabeeton thank you for the added perspective. Regarding "a new champion would be needed to redesign the foundations": what often puzzles me in TeX is how one person designed the whole thing from scratch (concepts, implementation, and even a new font-creation paradigm), yet in the many decades that have followed there are so very few examples of such a "redesign of the foundations" (or even a reconsideration). Sure, i) it's simultaneously one of the greatest and most arcane pieces of software around; ii) very few people can afford to work on it full-time; iii) the foundations are very strong. But still... – user159867 Apr 05 '18 at 20:22
  • to understand why tex is so competent and stable, you must understand that knuth is devoted to taocp, which he considers his life's work and legacy. he has said numerous times that if the published record did not meet his standards for appearance of presentation, he felt that it was not worth continuing. so he opens his code and text to all reviewers, and rewards the first finder of any bug. keep reading his essays in "digital typography"; they present his philosophy well. – barbara beeton Apr 05 '18 at 20:59
  • @barbarabeeton I'm familiar with the Knuth legends (been a TeX user for 14 years and always intrigued by its mythology). What I understand less is the comparatively modest success of the rest of the community, and why so many rules seem carved in stone (I have noted strong resistance on these forums when a bold suggestion departs from TeX's choices). Many dedicated and talented people, but I find the scattered results (5 GB!) and four decades of observance of some unstated traditions a bit disappointing. (And I'm not the only one: some projects start from scratch to avoid this baggage, e.g. Butterick's "Pollen".) – user159867 Apr 06 '18 at 00:59
  • there have been some efforts to supersede tex, or at least to "bring it into the modern era", but it's a hard problem. some of the efforts have been successful, but they have addressed only particular facets of the project. the successes (as i see it) have been (1) e-tex (enlarging memory limits and making more internal values available), (2) pdftex (output to pdf, microtype), (3) xetex (direct utf8/unicode input and use of system fonts), and (4) luatex (embedding an independent interpretive language to provide the ability to program features not native to tex) (cont'd.) – barbara beeton Apr 06 '18 at 01:12
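
Edit 2: regarding the comments about whether a split paragraph must keep a constant width: TeX can already set a single paragraph with lines of different widths, but only if the shape is declared in advance with \parshape; it cannot renegotiate widths on the fly when the page builder later splits the paragraph, since line breaking finishes before page breaking starts. A minimal sketch (the dimensions are arbitrary):

    % Minimal sketch: \parshape fixes a per-line (indent, width) pair in
    % advance; the last pair repeats for any remaining lines.
    \documentclass{article}
    \begin{document}
    \parshape=3
      0pt \linewidth
      0pt \dimexpr\linewidth-1mm\relax
      0pt \dimexpr\linewidth-2mm\relax
    The first line of this paragraph is full width, the second is 1mm
    narrower, and all later lines are 2mm narrower, so line width is
    already a per-line quantity inside the paragraph builder, just not a
    free variable of the optimization.
    \end{document}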

1 Answer


The page dimensions are actually very important because ideally you would typeset your content on a grid, such that the two sides of a page line up perfectly and the content overlaps.

Here is an example picture of a double page where the grid is broken (taken from Ralf de Jong and Friedrich Forssman, “Detailtypografie”, Verlag Hermann Schmidt, 4th edition (2002)):

[image: photograph of a double-page spread in which the text on the reverse side shows through and does not align with the grid]

As you can see, the lines from the back shine through to the front and distract the reader. Therefore it is very important to keep the page dimensions consistent.
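
For completeness, a minimal sketch of one common way to keep LaTeX output on such a baseline grid, assuming the geometry package's lines option (the number of lines is arbitrary). The essential points are that the text height is a whole number of baselines and that no stretchable vertical glue pushes lines off the grid:

    % Minimal sketch of a baseline grid: the text block is exactly 40
    % baselines tall, so the lines on recto and verso can coincide, and
    % \flushbottom keeps the last baseline of every page on the grid.
    \documentclass{book}
    \usepackage[lines=40]{geometry}
    \usepackage{lipsum}       % dummy text for the demo
    \flushbottom
    \begin{document}
    \lipsum[1-20]
    \end{document}

Anything that introduces off-grid vertical space (headings, displayed equations, floats) then has to be padded to whole multiples of \baselineskip, which is where grid typesetting becomes laborious in practice.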

Henri Menke