Searched defs:enable16BitTypes (Results 1 – 4 of 4) sorted by relevance
112   void enable16BitTypes (bool enabled) { m_16BitTypesEnabled = enabled; }   (member function in vkt::MemoryModel::ShaderInterface)
395   std::vector<const wchar_t*> GetDXCArguments(uint32_t compileFlags, bool enable16BitTypes) {   (in GetDXCArguments())
1477  bool enable16BitTypes = parseContext.hlslEnable16BitTypes();   (local in acceptType())