Searched defs:integer_count_error (Results 1 – 1 of 1) sorted by relevance
788   float integer_count_error = best_combined_error[quant_level][integer_count - 1];   in one_partition_find_best_combination_for_bitcount()   local
904   float integer_count_error = best_combined_error[quant_level][integer_count - 2];   in two_partitions_find_best_combination_for_bitcount()   local
1032  float integer_count_error = best_combined_error[quant_level][integer_count - 3];   in three_partitions_find_best_combination_for_bitcount()   local
1171  float integer_count_error = best_combined_error[quant_level][integer_count - 4];   in four_partitions_find_best_combination_for_bitcount()   local
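The four hits share one pattern: each *_find_best_combination_for_bitcount() variant reads a per-quant-level error out of best_combined_error, with the integer_count column shifted down by the partition count (1 through 4). The following is a minimal standalone sketch of that lookup pattern, not the astcenc implementation; the constants, table type, and helper name are assumptions for illustration only.

```cpp
#include <array>
#include <cstdio>

constexpr int QUANT_LEVELS = 21;   // assumption: number of quantization levels
constexpr int MAX_INT_COUNTS = 8;  // assumption: max integer counts tracked

using error_table = std::array<std::array<float, MAX_INT_COUNTS>, QUANT_LEVELS>;

// Hypothetical helper mirroring the indexing seen in the search hits:
// best_combined_error[quant_level][integer_count - partition_count].
static float lookup_integer_count_error(
    const error_table& best_combined_error,
    int quant_level,
    int integer_count,
    int partition_count)
{
    return best_combined_error[quant_level][integer_count - partition_count];
}

int main()
{
    error_table best_combined_error {};
    best_combined_error[5][2] = 0.25f;

    // With one partition, integer_count 3 maps to column index 2.
    float err = lookup_integer_count_error(best_combined_error, 5, 3, 1);
    std::printf("integer_count_error = %f\n", err);
}
```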