Searched defs:badchar (Results 1 – 2 of 2) sorted by relevance
1652 int badchar = (int)PyUnicode_AS_UNICODE(uself->object)[uself->start]; in UnicodeEncodeError_str() local
1839 int badchar = (int)PyUnicode_AS_UNICODE(uself->object)[uself->start]; in UnicodeTranslateError_str() local
510 int badchar = 0; in decoding_fgets() local
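All three hits declare `badchar` as a plain int holding a single code point. In the two error-string builders (UnicodeEncodeError_str and UnicodeTranslateError_str), it is read out of the error object's string at the offending index and then rendered into the message as a \xXX, \uXXXX, or \UXXXXXXXX escape depending on its magnitude. A minimal standalone sketch of that formatting step follows; this is standard C for illustration, not CPython's actual helper, and the name format_badchar is invented here:

    #include <stdio.h>

    /* Sketch: render one offending code point the way the
     * error-string builders above do -- \xXX for values that fit
     * in a byte, \uXXXX for the BMP, \UXXXXXXXX beyond it. */
    static void format_badchar(int badchar, char *buf, size_t size)
    {
        if (badchar <= 0xff)
            snprintf(buf, size, "\\x%02x", badchar);
        else if (badchar <= 0xffff)
            snprintf(buf, size, "\\u%04x", badchar);
        else
            snprintf(buf, size, "\\U%08x", badchar);
    }

    int main(void)
    {
        char buf[16];
        format_badchar(0x20ac, buf, sizeof(buf)); /* U+20AC EURO SIGN */
        printf("can't encode character '%s'\n", buf);
        return 0;
    }

The third hit, in decoding_fgets(), initializes `badchar` to 0 and uses it as a flag/value for reporting a non-UTF-8 byte encountered while reading source lines.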