So we really do come back to Zeno: at each instant of its flight, an arrow occupies exactly as much space as the measure of its length, and has an exact position (to simplify without compromising the situation, a point (x, y) in the 2-D plane defined by its path, denoting the exact position of its tip). At each succeeding instant it will conserve its length, yet the (x, y) will be different, though fixed. So at each instant the arrow is at rest, yet it appears to move; when and where, then, does the movement take place? Not at any instant of time, nor at any identifiable location.

Zeno, being Greek, was trying to show that "becoming" was not real: that, to put it crudely enough to expose the word-game he was trapped in, "becoming," which means change, the not-yet-here-and-now, was incompatible with "being," which to him meant permanence, the "always already completely here-now." Only the eternal could really BE; the phenomenal, changeable world was not only impermanent but irrational. Of course he knew that arrows could move (and potentially kill). To him, that just proved the irrationality of the world.

The question is, what are we trying to show now? You seem to want to equivocate: to concede that there is NO difference that can be stated between the two representations of unity (1 and the unwritable 0.999…) while insisting that there must be SOME difference. So we seem to reduce the issue to one for which no mathematical argument is appropriate: in what case(s) can two objects of discussion be both the same and different in regard to a single property (here, the quantity represented)? Both the classic Aristotelian logic of quantifiers and the conventional two-valued modern logic of predication have a "law of the excluded middle" as an axiomatic way of ruling out such a situation, but these could be replaced with other, perhaps multi-valued, logics that do not include this axiom.
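For the record, the standard argument behind "no stateable difference" fits in one line: read 0.999… as a geometric series with first term 9/10 and ratio 1/10, and sum it (set in LaTeX for clarity):

```latex
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
\;=\; \frac{9}{10}\cdot\frac{1}{1-\tfrac{1}{10}}
\;=\; \frac{9}{10}\cdot\frac{10}{9} \;=\; 1.
```

Any proposed difference 1 − 0.999… would have to be smaller than every 1/10^n, and the only non-negative quantity with that property is zero.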

So why would we do that in order to accommodate this case of two representations of unity whose visual-conceptual difference makes you suspect a difference of reference? If it led to some better "fit" of the representational system to our measurements of the world, that would be a gain, but you concede that, from a strictly arithmetic point of view, we get no advantage (and I hope you'll concede we also suffer some real inconveniences) from the assumption. And philosophically I don't see any advantage either: both "1" and "0.999…", looked at as they are "in themselves", are just socially agreed-upon scratchmarks for representing what we mean when we say "unity" or "one" or (in other languages) "une", "eine", "uno". HOW we represent does not alter WHAT we represent. So why worry about the apparent difference of form (the unwritably indefinite string of 9's, suggesting the sum of the endless series 9/10 + 9/100 + 9/1000 + …, i.e. 9/(10^n) summed over all n ≥ 1, versus the nice, clean single stroke of a 1) if we use them interchangeably?

The point is to recognize the referent of both as the same, and to understand why sometimes one, sometimes the other is used. For example, scientific calculators doing anything more complicated than the four elementary arithmetic operations will often produce answers of the form "44.999…", to be understood as "45", because of the way they internally represent numbers and operations. So we teach students to RECOGNIZE that "0.999…" is a numerical synonym for "1" so they can interface with their tools more successfully. On the other hand, I cannot enter the string "0.999…" into my calculator, so instead I input "1", confident that the result will be exactly the same.
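Both halves of this paragraph can be illustrated in a few lines of Python (a sketch of my own; the function name `gap_after` is mine, not anything standard): exact rational arithmetic shows the partial sums 0.9, 0.99, 0.999, … closing in on 1 with a gap of exactly 1/10^n, while ordinary binary floating point shows the calculator-style "near-miss" answers that students must learn to read.

```python
from fractions import Fraction

# Exact arithmetic: the n-digit partial sum 0.99...9 differs from 1
# by exactly 1/10^n, a gap that shrinks toward zero as n grows.
def gap_after(n):
    partial = sum(Fraction(9, 10**k) for k in range(1, n + 1))
    return Fraction(1) - partial

print(gap_after(3))   # 1/1000
print(gap_after(10))  # 1/10000000000

# Binary floating point is the calculator-style culprit: 0.1 and 0.2
# have no exact binary representation, so the machine answers with a
# near-miss that a display-rounding layer (or a student) reads as 0.3.
print(0.1 + 0.2)      # 0.30000000000000004
```

The `Fraction` computation is exact, so it never produces a "44.999…" artifact; it is the finite binary approximation inside calculators and floats that does.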