Searched refs:utf_16_decode (Results 1 – 3 of 3) sorted by relevance
/external/python/cpython2/Lib/encodings/utf_16.py
    16    return codecs.utf_16_decode(input, errors, True)

/external/python/cpython2/Modules/_codecsmodule.c
    279   utf_16_decode(PyObject *self,  [in utf_16_decode()]
    1074  {"utf_16_decode", utf_16_decode, METH_VARARGS},

/external/python/cpython2/Lib/test/test_codecs.py
    523   codecs.utf_16_decode('\x01', 'replace', True))
    525   codecs.utf_16_decode('\x01', 'ignore', True))
    528   self.assertRaises(UnicodeDecodeError, codecs.utf_16_decode, "\xff", "strict", True)
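The matches above trace the codec through three layers: the pure-Python wrapper in `Lib/encodings/utf_16.py`, the C implementation registered in `Modules/_codecsmodule.c`, and the error-handling tests in `Lib/test/test_codecs.py`. A minimal sketch of the behavior those tests exercise is below, written in Python 3 syntax (the tree searched is Python 2, where the string literals are already byte strings); it assumes only the documented `codecs` API:

```python
import codecs

# codecs.utf_16_decode(data, errors, final) returns a tuple
# (decoded_text, bytes_consumed). With final=True, a lone odd
# byte is truncated data and triggers the error handler.

# 'replace' substitutes U+FFFD for the truncated unit:
decoded, consumed = codecs.utf_16_decode(b"\x01", "replace", True)
print(decoded, consumed)  # '\ufffd' 1

# 'ignore' silently drops it:
decoded, consumed = codecs.utf_16_decode(b"\x01", "ignore", True)
print(repr(decoded), consumed)  # '' 1

# 'strict' (the default) raises UnicodeDecodeError, as the
# assertRaises test at line 528 checks:
try:
    codecs.utf_16_decode(b"\xff", "strict", True)
except UnicodeDecodeError as exc:
    print("strict raised:", exc.reason)
```

With `final=False` instead, a trailing odd byte is not an error: the decoder reports it as unconsumed (`bytes_consumed` excludes it) so a stream decoder can retry once more input arrives, which is why the wrapper in `utf_16.py` passes `True` only for the one-shot `decode()` path.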