
Lines Matching +full:go +full:- +full:version

CPP proto version of Python protobuf, which is also a prerequisite for the
Python protobuf C++ implementation. You need to install the correct version of
the Python C++ extension package before running the generated CPP proto version
of Python protobuf, e.g. under Ubuntu:

```
$ sudo apt-get install python-dev
$ sudo apt-get install python3-dev
```

You also need to make sure `pkg-config` is installed.
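As a quick sanity check (not part of the original instructions), you can ask the installed Python protobuf runtime which backend it is actually using. `api_implementation.Type()` lives in protobuf's internal package, so treat this as a best-effort probe rather than a supported interface:

```
# Prints "cpp" when the C++ extension backend is active,
# "python" when only the pure-Python implementation is available.
$ python -c "from google.protobuf.internal import api_implementation; print(api_implementation.Type())"
```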
### Go subsection
Go protobufs are maintained at [github.com/golang/protobuf](https://github.com/golang/protobuf).
Running the Go benchmarks requires the Go toolchain and the Go protoc-gen-go plugin for protoc.

To install protoc-gen-go, run:

```
$ go get -u github.com/golang/protobuf/protoc-gen-go
$ export PATH=$PATH:$(go env GOPATH)/bin
```

The first command installs `protoc-gen-go` into the `bin` directory in your local `GOPATH`.
The second command adds that `bin` directory to your `PATH` so that `protoc` can find the plugin.
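Once `protoc-gen-go` is on your `PATH`, `protoc` discovers the plugin automatically whenever Go output is requested. A minimal sketch (the `.proto` file name below is only a placeholder, not a file from this build):

```
# protoc looks up the protoc-gen-go binary on PATH when --go_out is given;
# "example.proto" is a placeholder for whichever .proto file you are generating.
$ protoc --go_out=. example.proto
```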
We have three versions of the Python protobuf implementation: pure Python, cpp
reflection, and cpp generated code. To run the benchmark for each version, you need to:

```
$ make python-pure-python
$ make python-cpp-reflection
$ make python-cpp-generated-code
```
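At run time, the protobuf Python package also honours the `PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION` environment variable, which switches between the pure-Python and C++ backends without rebuilding anything. This is a property of the protobuf runtime itself, not of the Makefile targets above, and the script name below is a placeholder:

```
# Force the C++ backend ("cpp") or the pure-Python backend ("python") for one run;
# "your_script.py" is a placeholder for any program that imports google.protobuf.
$ PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp python your_script.py
```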
### Go subsection
```
$ make go
```
We have two versions of the PHP protobuf implementation: pure PHP, and PHP with a C extension. To run these versions' benchmarks, see the PHP commands below.
### Java:
```
$ make java-benchmark
$ ./java-benchmark $(specific generated dataset file name) [$(caliper options)]
```
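A concrete invocation could look like the following; the dataset file name is hypothetical and depends on which datasets your build generated, so substitute one that actually exists in your checkout:

```
# Example only: replace the .pb file with a dataset produced by your build.
$ ./java-benchmark dataset.google_message1_proto3.pb
```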
### CPP:
```
$ make cpp-benchmark
$ ./cpp-benchmark $(specific generated dataset file name) [$(benchmark options)]
```
### Python:
For the Python benchmarks we have a `--json` flag for outputting the result as JSON.

```
$ make python-pure-python-benchmark
$ ./python-pure-python-benchmark [--json] $(specific generated dataset file name)
```

```
$ make python-cpp-reflection-benchmark
$ ./python-cpp-reflection-benchmark [--json] $(specific generated dataset file name)
```

```
$ make python-cpp-generated-code-benchmark
$ ./python-cpp-generated-code-benchmark [--json] $(specific generated dataset file name)
```
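If you want to keep the JSON output for later comparison, you can redirect it to a file, assuming the result is written to stdout; the dataset name is again a placeholder:

```
# Example only: store the JSON result of one pure-Python run in a file.
$ ./python-pure-python-benchmark --json dataset.google_message1_proto3.pb > result.json
```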
### Go:
```
$ make go-benchmark
$ ./go-benchmark $(specific generated dataset file name) [go testing options]
```
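If `go-benchmark` is a compiled `go test` binary (an assumption about how the `make go-benchmark` target builds it), the standard Go testing flags are passed with the `-test.` prefix; the dataset name is once more a placeholder:

```
# Example only: run verbosely; "-test.v" is the compiled-test-binary spelling of "go test -v".
$ ./go-benchmark dataset.google_message1_proto3.pb -test.v
```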
### PHP:
```
$ make php-benchmark
$ ./php-benchmark $(specific generated dataset file name)
```

```
$ make php-c-benchmark
$ ./php-c-benchmark $(specific generated dataset file name)
```
### JS:
```
$ make js-benchmark
$ ./js-benchmark $(specific generated dataset file name)
```
### C#:
```
$ dotnet run -c Release
```