My numbers vary in length, but your example is exactly what I get. However, I expected an integer to be represented by a 64-bit string of 0s and 1s (with or without two's complement). So I am wondering: is "letter" converting the integer to a string?
All this came from my surprise at finding integers greater than 2 ** 64 showing up in my calculations (e.g., computing factorials that should have either produced an error or switched to floating point). For example, 24! looks like an integer and is correct, but when I divide it by two, I get a truncated floating-point number. However, when I add 100 to 24!, the result is still an integer and is correct.
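For comparison, that is exactly the behavior you get from a language with arbitrary-precision integers: integer operations stay exact at any size, and precision is only lost once a value is converted to a float. Here is a minimal C# sketch of that pattern using .NET's System.Numerics.BigInteger (my own illustration, not anything from Scratch's source):

using System;
using System.Numerics;

class FactorialDemo {
    static void Main() {
        // compute 24! exactly with arbitrary-precision integers
        BigInteger fact = 1;
        for (int i = 2; i <= 24; i++) fact *= i;

        Console.WriteLine(fact);             // 620448401733239439360000 (exact)
        Console.WriteLine(fact + 100);       // integer addition stays exact
        Console.WriteLine((double)fact / 2); // cast to double: precision is lost
    }
}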
Just confused.
Lee
That is strange, but my assumption is that 'letter' is converting the integer into a string, and then parsing the string back into a 32- or 64-bit integer.
For instance, here is what that code might look like in a written (text-based) language:
C#:
string strTest = "156276"; // see, strTest is still a string
int nTest = Int32.Parse(strTest); // converting the string to an integer
But wait, that only converts a string to an integer.
Bah, I'm sure there is a way to get a character from it.
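Going the other direction is straightforward: convert the integer to a string, then index into it. A minimal sketch of what the 'letter' block might do internally (my guess, not Scratch's actual code):

using System;

class LetterDemo {
    static void Main() {
        // hypothetical sketch of what 'letter 3 of 156276' might do internally
        int nTest = 156276;
        string strTest = nTest.ToString(); // integer -> string
        char c = strTest[2];               // zero-based index 2 = third character: '6'
        Console.WriteLine(c);
    }
}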
I guess it's because Scratch variables count as both strings and integers to ease programming (they're treated as strings in some blocks, such as letter [] of [], and as integers in others, like [] > []).
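In other words, the same stored value gets coerced differently depending on which block reads it. A rough C# sketch of how a Scratch-like interpreter could behave (the internal string storage is an assumption for illustration):

using System;

class CoercionDemo {
    static void Main() {
        // hypothetical: the interpreter stores every variable as text (assumption)
        string value = "620448401733239439360100";

        // letter [2] of [value] -> reads the string form directly
        Console.WriteLine(value[1]); // '2'

        // [value] > [100] -> coerces the string to a number first
        Console.WriteLine(double.Parse(value) > 100); // True
    }
}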