
Searched refs:inference_time_us (Results 1 – 6 of 6) sorted by relevance

/external/tensorflow/tensorflow/lite/tools/benchmark/
  benchmark_model.h
     42  tensorflow::Stat<int64_t> inference_time_us) in BenchmarkResults() argument
     46  inference_time_us_(inference_time_us) {} in BenchmarkResults()
     48  tensorflow::Stat<int64_t> inference_time_us() const { in inference_time_us() function
  benchmark_model.cc
     58  auto inference_us = results.inference_time_us(); in OnBenchmarkEnd()
    169  Stat<int64_t> inference_time_us = in Run() local
    173  {startup_latency_us, input_bytes, warmup_time_us, inference_time_us}); in Run()
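The matches above show `BenchmarkResults` taking a `tensorflow::Stat<int64_t>` of per-run inference times in its constructor and exposing it through an `inference_time_us()` accessor. A minimal sketch of that shape, with a stand-in `Stat` accumulator in place of the real `tensorflow::Stat` class (field and method names here are assumptions for illustration):

```cpp
#include <algorithm>
#include <cstdint>
#include <limits>

// Stand-in running-statistics accumulator, loosely modeled on the
// tensorflow::Stat<int64_t> used by the benchmark tool (a sketch,
// not the real class).
struct Stat {
  int64_t count = 0;
  int64_t sum = 0;
  int64_t min = std::numeric_limits<int64_t>::max();
  int64_t max = std::numeric_limits<int64_t>::min();

  // Fold one per-run inference time (in microseconds) into the stats.
  void UpdateStat(int64_t v) {
    ++count;
    sum += v;
    min = std::min(min, v);
    max = std::max(max, v);
  }
  double avg() const { return count ? static_cast<double>(sum) / count : 0.0; }
};

// Results container mirroring the constructor and accessor seen in the
// benchmark_model.h matches: it simply stores the inference-time stat.
class BenchmarkResults {
 public:
  explicit BenchmarkResults(Stat inference_time_us)
      : inference_time_us_(inference_time_us) {}
  Stat inference_time_us() const { return inference_time_us_; }

 private:
  Stat inference_time_us_;
};
```

Consumers such as `OnBenchmarkEnd()` then read `results.inference_time_us()` to report min/max/average latency.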
/external/tensorflow/tensorflow/tools/benchmark/
  benchmark_model.h
     42  StatSummarizer* stats, int64* inference_time_us);
  benchmark_model.cc
    286  StatSummarizer* stats, int64* inference_time_us) { in RunBenchmark() argument
    304  *inference_time_us = end_time - start_time; in RunBenchmark()
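In this second tool, `RunBenchmark` instead writes the elapsed time into an `int64*` out-parameter as `end_time - start_time`. A self-contained sketch of that out-parameter pattern, with a sleep standing in for the real model inference (the helper name mirrors the match above; the body is an assumption):

```cpp
#include <chrono>
#include <cstdint>
#include <thread>

// Hypothetical helper mirroring the RunBenchmark pattern above: the caller
// passes an out-parameter that receives end_time - start_time in
// microseconds. The workload is a stand-in sleep, not a real inference.
void RunBenchmark(int64_t* inference_time_us) {
  using Clock = std::chrono::steady_clock;
  const auto start_time = Clock::now();
  std::this_thread::sleep_for(std::chrono::milliseconds(5));  // fake workload
  const auto end_time = Clock::now();
  *inference_time_us =
      std::chrono::duration_cast<std::chrono::microseconds>(end_time -
                                                            start_time)
          .count();
}
```

A monotonic clock (`steady_clock`) is the natural choice here, since wall-clock adjustments must not distort the measured latency.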
/external/tensorflow/tensorflow/lite/tools/benchmark/android/jni/
  benchmark_model_jni.cc
     34  auto inference_us = results.inference_time_us(); in OnBenchmarkEnd()
/external/tensorflow/tensorflow/lite/tools/benchmark/ios/TFLiteBenchmark/TFLiteBenchmark/
  BenchmarkViewController.mm
     92  OutputMicrosecondsStatToStream(results.inference_time_us(), prefix, &stream);