Here's my code:
let padded = "03";
ascii = `\u00${padded}`;
However, I receive "Bad character escape sequence" from Babel. I'm trying to end up with \u0003 in the ascii variable. What am I doing wrong?
EDIT:
Ended up with ascii = (eval('"\\u00' + padded + '"'));
Uh? The eval solution yields the same result as String.fromCodePoint... – Felix Kling Commented Nov 23, 2015 at 15:58
Don't provide your own answer in the question. If you think you have the answer, then post it as an answer. – user663031 Commented Nov 23, 2015 at 16:03
1 Answer
What am I doing wrong?
A Unicode escape sequence is basically atomic; you cannot build one dynamically. Template literals basically perform string concatenation, so your code is equivalent to
'\u00' + padded
It should be obvious now why you get that error. If you want to get the corresponding Unicode character, you can instead use String.fromCodePoint or String.fromCharCode:
String.fromCodePoint(3)
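For example, since padded holds the code point's digits as a string, a minimal sketch (assuming the value is hexadecimal, as in the question) would parse it before passing it along:

let padded = "03";
// parse the hex digits, then let the runtime build the character
let ascii = String.fromCodePoint(parseInt(padded, 16));
console.log(ascii.codePointAt(0)); // 3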
If you want a string that literally contains the character sequence \u0003, then you just need to escape the escape character to produce a literal backslash:
`\\u00${padded}`
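A quick sketch contrasting the two outcomes, again assuming padded is the hex string from the question:

let padded = "03";
let literal = `\\u00${padded}`;                          // six characters: backslash, u, 0, 0, 0, 3
let control = String.fromCodePoint(parseInt(padded, 16)); // one character: the control character U+0003
console.log(literal.length); // 6
console.log(control.length); // 1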