Not going to lie, when I read him say that converting spells from a(n unknown) character set into binary made the spells "less wordy," I was mostly unable to focus on anything else the chapter said.
Assume, for a moment, a character set of 26 characters. To convert a single letter into binary (which consists solely of "0" and "1") would take no fewer than five consecutive binary digits ("00001", for example, as "A" - and that assumes only upper case letters). To convert a character set such as Japanese, with more than fifty thousand distinct characters? Sixteen consecutive binary digits ("0000000000000001", for example, as "あ"). (Please note, every f'ing time I try to do math and post it here I somehow screw up. Someone please check my math. The quantity of "50,000+" characters for Japanese should be correct - see Unicode, or ask Google for 50k in binary and count the digits... Anyway).
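Since the post asks someone to check the math: the number of binary digits needed to give every symbol in a set its own code is the base-2 logarithm of the set's size, rounded up. A quick sketch (the function name is mine, just for illustration):

```python
import math

def bits_needed(symbols: int) -> int:
    """Minimum number of binary digits to assign each symbol a unique code."""
    return math.ceil(math.log2(symbols))

print(bits_needed(26))     # 26 uppercase letters -> 5 bits (so "A" as 00001 works)
print(bits_needed(50000))  # 50,000+ characters -> 16 bits
```

So both figures check out: 2^5 = 32 covers 26 letters, and 2^16 = 65,536 covers 50,000+ characters, while 2^15 = 32,768 does not.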
How is this less "wordy" than a written language? In Japanese there exist single characters for entire words and/or entire concepts, and to convert one of those to binary you have to expand it to sixteen digits minimum... "一" becomes "0000000000000001" minimum...
BINARY IS NOT THE ANSWER. BINARY IS NOT BETTER. Binary exists solely because it is far easier to represent everything as "off" and "on" for computers. (Or a voltage below some threshold as "0" and above it as "1" - the exact levels depend on decade, technology, and standard, and, yes, I know more standards exist).
Am I wrong? Am I crazy (with regards to this specifically)?