I'm currently writing a .tex report with my wife. She's on a Windows machine and I'm running Linux; we've decided to encode our report in ISO-8859-1 (i.e., we use \usepackage[latin1]{inputenc} in the preamble).
We're also generating lots of tables in R using xtable() and some other customized functions that output LaTeX. Whenever I sink an R output, I use this command: sink(file("filename.tex", encoding = "ISO-8856-9")). There is, however, one long table that contains lots of special characters. Whenever I try to sink this table as ISO I get this message:
Warning message:
invalid char string in output conversion
Even though it's just a warning, it prevents our LaTeX compiler from correctly parsing the command \input{filename.tex}.
The workaround we have is using encoding = "UTF-8" in R, then opening the file in a text editor and saving it with ISO encoding. LaTeX then compiles the file correctly, apparently with all special characters intact.
Is there a command that tells LaTeX that a file I'm inputting has a different encoding from the main LaTeX file? In my mind, it would look something like \input[utf8]{filename.tex} (but unfortunately neither \input nor its fellow \include takes options).
Best Answer
Are you certain that the encoding should be ISO-8856-9? I've never heard of that one. Maybe you mean ISO-8859-6 (Arabic) or ISO-8859-9 (Turkish)?
Let's assume that you meant ISO-8859-9 (which inputenc calls latin5). Then you just load both encodings in the preamble, write \inputencoding{latin5} just before you include the file, and \inputencoding{latin1} afterwards to restore the original encoding.
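A minimal sketch of how that fits together (the file name filename.tex is taken from the question; the assumption here is that the externally generated table really is ISO-8859-9):

```latex
\documentclass{article}
% latin1 is listed last, so it is the default encoding of this file
\usepackage[latin5,latin1]{inputenc}

\begin{document}
% ... main text, saved as ISO-8859-1 ...

\inputencoding{latin5}  % the next file is ISO-8859-9
\input{filename.tex}
\inputencoding{latin1}  % switch back to the main file's encoding

% ... more ISO-8859-1 text ...
\end{document}
```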
EDIT:
UTF-8 might cause some problems. I'd solve it by declaring that the whole document should be in UTF-8 and then switching back where needed. Here's an example:
The following file is "main-file.tex", and should be saved with Latin 1.
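A sketch of what that main file could look like (the original listing wasn't preserved here, so the body text is a placeholder; the file names come from the answer):

```latex
% main-file.tex -- save this file as Latin 1 (ISO-8859-1)
\documentclass{article}
% latin1 comes last, so it is the default for this file
\usepackage[utf8,latin1]{inputenc}

\begin{document}
Text in the main file, encoded as ISO-8859-1.

\inputencoding{utf8}   % the external file is UTF-8
\input{external-file}
\inputencoding{latin1} % back to this file's encoding
\end{document}
```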
Also create the file "external-file.tex", and save it with UTF-8:
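And a matching sketch of the included file (again, placeholder content):

```latex
% external-file.tex -- save this file as UTF-8
Text with special characters typed directly in UTF-8: ä ö ü ß é.
```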
This works for me without any warnings at all. Let me know the specific warnings and errors if you still can't get it to work.