[Tex/LaTex] How to convert a character to a numeric value

Tags: macros, symbols, unicode

Now that I've gotten Will Robertson's excellent unicode-math package working to change the colour and style of letters in my mathematics (trust me, there is a reason) thanks to his answer to my previous question, I want to figure out how to implement this properly.

The key step seems to be that the command to change how a letter is displayed needs the unicode numerical value of the letter. The example that Will gave was:

\setmathfont[range="65,math-style=upright,Colour=FF0000]{xits-math.otf}

(font name changed to work with xelatex as per Will's comment on that question) which changes the letter e. But I want a command that I can invoke as \type{constR}{e} (which should declare e to be a constant real number). So I need to convert e to its unicode value.

Now it seems possible that there isn't a general "character to unicode" command, so I'd be content with converting a standard letter to some number, since the characters I'm most likely to use this on are the alphanumerics. Thus:

What's the best way to convert an alphanumerical character to a position in the alphabet?

Best Answer

There is a general way of converting a character to its character number (which, in a Unicode engine, is its Unicode code point), but unfortunately it doesn't work well in the options to fontspec/unicode-math (an issue I'm aware of). Usually you can use something like this:

\newcount\foo
\foo=`\e

to assign the character number of "e" to \foo; this "backtick escape" syntax works wherever TeX expects a number. However, it doesn't work if the argument is expanded first, which is what happens in fontspec and unicode-math.
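As a minimal sketch of how this behaves (the \showthe lines are only there to print the values to the console, and the alphabet-position step is one possible answer to the question as asked):

```latex
\newcount\foo
\foo=`\e              % \foo is now 101, the code point of "e"
\showthe\foo
\advance\foo by -`\a  % subtract the code point of "a" ...
\advance\foo by 1     % ... and add 1: \foo is now 5, i.e. "e" is the 5th letter
\showthe\foo
```

The same arithmetic with `\A works for upper-case letters, since A–Z are contiguous in Unicode just as a–z are.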

Now, you don't actually need the backslash in the code above; this will work too:

\foo=`e

The difference is that this cannot be used for "difficult" characters such as % and #. If you're just using letters, this won't be a problem.
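To illustrate why the backslash matters for such characters (a small sketch; the commented-out line is the failing case):

```latex
\newcount\foo
\foo=`\%      % works: \% is a control sequence, so the comment character is inert
\showthe\foo  % prints 37
% \foo=`%    would fail: the bare % starts a comment, so TeX never sees a number
```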

So in unicode-math you should be able to write

range={`e,`j,...}

or with whatever letters you need.
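Putting this together, a sketch of what the full setup might look like (assuming a XeLaTeX or LuaLaTeX document with xits-math.otf available; as noted above, whether the backtick syntax survives the expansion inside the option list depends on the fontspec/unicode-math versions in use):

```latex
\documentclass{article}
\usepackage{unicode-math}
\setmathfont{xits-math.otf}
% restyle just the letter "e", using the backtick syntax instead of "65
\setmathfont[range={`e},math-style=upright,Colour=FF0000]{xits-math.otf}
\begin{document}
$e^{i\pi} + 1 = 0$ % the "e" should come out upright and red
\end{document}
```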