I vote that we require 𒐖 for 2 of anything, because it makes the parsing monumentally simpler.
This also raises the question of how we print numeric results: when we print the value of 𒐖, should it default to the smallest possible integer value, or should it print a series of possible readings (2, 120, 0.0333, ...)? Similarly for the results of calculations: is 𒐕 + 𒐕 always 𒐖, or do we want to include the possibility that 𒐕 + 𒐕 could mean 60 + 1 = 𒐕𒐕?
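To make the printing question concrete, here is a minimal sketch (not the project's API; `digit_readings` is a hypothetical helper) of why one sign has many readings: sexagesimal notation floats its place value, so a lone digit with value 2 can be read as 2 * 60^k for any integer k, which is where 0.0333 (2/60), 2, and 120 all come from.

```python
from fractions import Fraction

def digit_readings(value, places=range(-1, 2)):
    """Possible interpretations of a single digit, assuming it may sit in
    any power-of-60 place; Fraction keeps 2/60 exact instead of 0.0333."""
    return [Fraction(value) * Fraction(60) ** k for k in places]

# For 𒐖 (value 2): 2/60, 2, and 120 -- the readings listed above.
print(digit_readings(2))
```

A printer that "defaults to the smallest possible integer value" would pick the k = 0 reading; listing several readings means enumerating a window of k values like this.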
emeszida/test/test.py, line 73 (commit 913a629)
How do we differentiate between the writings of 120, 61, and 2? All of them can be written 𒐕𒐕.
Or do we reserve 𒐕 for "1" of anything, and 2 has to be 𒐖?
That would make our lives easier: 𒐖 could then only be 120 (2,2) or 2 (2,1), whereas 𒐕𒐕 could only be 61 (1,2)(1,1).
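The ambiguity being debated can be sketched as follows (a hypothetical enumerator, not the repo's parser): every gap between adjacent signs may or may not start a new sexagesimal place, so one sign string yields several integer readings. Sign values assumed: 𒐕 = 1, 𒐖 = 2.

```python
from itertools import product

SIGN_VALUES = {"𒐕": 1, "𒐖": 2}

def sequence_readings(text):
    """All integer readings of a sign string, trying every way of
    grouping adjacent signs into sexagesimal places."""
    vals = [SIGN_VALUES[c] for c in text]
    out = set()
    # Each gap between signs is either inside a place or a place boundary.
    for cuts in product([False, True], repeat=len(vals) - 1):
        places = [[vals[0]]]
        for v, cut in zip(vals[1:], cuts):
            if cut:
                places.append([v])
            else:
                places[-1].append(v)
        # Evaluate the place groups as a base-60 integer.
        n = 0
        for p in places:
            n = n * 60 + sum(p)
        out.add(n)
    return sorted(out)

# 𒐕𒐕 reads as 2 (one place) or 61 (two places); scaling the
# one-place reading by 60 gives the 120 reading as well.
print(sequence_readings("𒐕𒐕"))
```

Under the proposed rule, 𒐖 is the only writing of 2 within a place, so the one-place grouping of 𒐕𒐕 is forbidden and only the 61 reading survives.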