
Searched refs:utf_16_decode (Results 1 – 5 of 5) sorted by relevance

/external/python/cpython2/Lib/encodings/
utf_16.py:16  return codecs.utf_16_decode(input, errors, True)
/external/python/cpython3/Lib/encodings/
utf_16.py:16  return codecs.utf_16_decode(input, errors, True)
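
Both encodings hits are the stateless decode() helper in Lib/encodings/utf_16.py, which forwards to the C-level codec with final=True. A minimal sketch of that call shape on Python 3, assuming the standard (text, bytes_consumed) return tuple:

    import codecs

    # Stateless helper as in Lib/encodings/utf_16.py: final=True tells the
    # decoder the input is complete, so a trailing half code unit is an
    # error instead of pending decoder state.
    def decode(input, errors='strict'):
        return codecs.utf_16_decode(input, errors, True)

    # BOM-prefixed little-endian input: the BOM is detected and stripped,
    # and all 6 bytes are reported as consumed.
    text, consumed = decode(b'\xff\xfeA\x00B\x00')
    assert (text, consumed) == ('AB', 6)
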
/external/python/cpython2/Modules/
_codecsmodule.c:279   utf_16_decode(PyObject *self,  [in utf_16_decode() function]
_codecsmodule.c:1074  {"utf_16_decode", utf_16_decode, METH_VARARGS},
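
The Modules hits are the C implementation itself and its entry in the _codecs method table. Since Lib/codecs.py re-exports the _codecs builtins, the module-level function should be that same C function, which a quick check can confirm (a sketch, assuming a stock CPython build):

    import _codecs
    import codecs

    # Lib/codecs.py does `from _codecs import *`, so codecs.utf_16_decode
    # is the C function registered under METH_VARARGS in the table above.
    assert codecs.utf_16_decode is _codecs.utf_16_decode
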
/external/python/cpython2/Lib/test/
test_codecs.py:537  codecs.utf_16_decode('\x01', 'replace', True))
test_codecs.py:539  codecs.utf_16_decode('\x01', 'ignore', True))
test_codecs.py:542  self.assertRaises(UnicodeDecodeError, codecs.utf_16_decode, "\xff", "strict", True)
/external/python/cpython3/Lib/test/
test_codecs.py:650  codecs.utf_16_decode(b'\x01', 'replace', True))
test_codecs.py:652  codecs.utf_16_decode(b'\x01', 'ignore', True))
test_codecs.py:655  self.assertRaises(UnicodeDecodeError, codecs.utf_16_decode,
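
The test hits exercise the error handlers on truncated input, a bare odd byte with final=True; the cpython2 version passes a str, the cpython3 version bytes. A sketch of the behaviour these tests assert, on Python 3:

    import codecs

    # An odd-length input cannot be complete UTF-16. With final=True:
    # 'replace' yields U+FFFD, 'ignore' drops the byte, 'strict' raises.
    assert codecs.utf_16_decode(b'\x01', 'replace', True) == ('\ufffd', 1)
    assert codecs.utf_16_decode(b'\x01', 'ignore', True) == ('', 1)
    try:
        codecs.utf_16_decode(b'\xff', 'strict', True)
    except UnicodeDecodeError:
        pass  # truncated data is an error under 'strict'
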