
Searched refs:our (Results 1 – 25 of 1491) sorted by relevance


/external/llvm/docs/tutorial/
BuildingAJIT1.rst
35 - `Chapter #4 <BuildingAJIT4.html>`_: Improve the laziness of our JIT by
43 To provide input for our JIT we will use the Kaleidoscope REPL from
46 code for that chapter and replace it with optimization support in our JIT class
61 compiler does. To support that aim our initial, bare-bones JIT API will be:
92 In the previous section we described our API, now we examine a simple
96 input for our JIT: Each time the user enters an expression the REPL will add a
99 use the findSymbol method of our JIT class to find and execute the code for the
102 of this tutorial we'll modify the REPL to enable new interactions with our JIT
103 class, but for now we will take this setup for granted and focus our attention on
104 the implementation of our JIT itself.
[all …]
LangImpl09.rst
28 our program down to something small and standalone. As part of this
73 First we make our anonymous function that contains our top level
74 statement be our "main":
147 our piece of Kaleidoscope language down to an executable program via this
162 construct one for our fib.ks file.
176 of our IR level descriptions. Construction for it takes a module so we
177 need to construct it shortly after we construct our module. We've left it
180 Next we're going to create a small container to cache some of our frequent
181 data. The first will be our compile unit, but we'll also write a bit of
182 code for our one type since we won't have to worry about multiple typed
[all …]
BuildingAJIT2.rst
36 added to it. In this Chapter we will make optimization a phase of our JIT
38 layers, but in the long term making optimization part of our JIT will yield an
41 optimization managed by our JIT will allow us to optimize lazily too, rather
42 than having to do all our optimization up-front.
44 To add optimization support to our JIT we will take the KaleidoscopeJIT from
79 but after the CompileLayer we introduce a typedef for our optimization function.
82 our optimization function typedef in place we can declare our OptimizeLayer,
83 which sits on top of our CompileLayer.
85 To initialize our OptimizeLayer we pass it a reference to the CompileLayer
122 OptimizeLayer in our key methods: addModule, findSymbol, and removeModule. In
[all …]
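The layering pattern these hits describe — an OptimizeLayer that sits on top of a CompileLayer, takes an optimize function at construction, and forwards the key methods — can be sketched in a language-neutral way. This is an illustrative model only, not the actual LLVM ORC C++ API; the class and method names mirror the tutorial's vocabulary but the "modules" here are plain dictionaries:

```python
class CompileLayer:
    """Toy bottom layer: 'compiles' a module by storing its symbols."""
    def __init__(self):
        self.modules = {}

    def add_module(self, key, module):
        self.modules[key] = module

    def find_symbol(self, name):
        for module in self.modules.values():
            if name in module:
                return module[name]
        return None

    def remove_module(self, key):
        del self.modules[key]


class OptimizeLayer:
    """Sits on top of the next layer down; runs the optimize function
    over each module before handing it on, and delegates the rest."""
    def __init__(self, base_layer, optimize_fn):
        self.base = base_layer
        self.optimize = optimize_fn

    def add_module(self, key, module):
        # Intercept add_module: optimize first, then pass down the stack.
        self.base.add_module(key, self.optimize(module))

    def find_symbol(self, name):
        return self.base.find_symbol(name)

    def remove_module(self, key):
        self.base.remove_module(key)
```

Because optimization happens inside `add_module`, deferring the call to `add_module` (as later chapters do with lazy compilation) automatically defers optimization too.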
LangImpl08.rst
12 <index.html>`_" tutorial. This chapter describes how to compile our
28 As an example, we can see what clang thinks is our current target
64 We can now use our target triple to get a ``Target``:
108 For our example, we'll use the generic CPU without any additional
124 We're now ready to configure our module, to specify the target and
139 our file to:
171 Does it work? Let's give it a try. We need to compile our code, but
189 link it with our output. Here's the source code:
203 We link our program to output.o and check the result is what we
/external/grpc-grpc/examples/cpp/
cpptutorial.md
25 With gRPC we can define our service once in a `.proto` file and implement clients
35 The example code for our tutorial is in [examples/cpp/route_guide](route_guide).
70 the returned stream until there are no more messages. As you can see in our
110 the request and response types used in our service methods - for example, here's
126 Next we need to generate the gRPC client and server interfaces from our `.proto`
155 - All the protocol buffer code to populate, serialize, and retrieve our request
172 There are two parts to making our `RouteGuide` service do its job:
173 - Implementing the service interface generated from our service definition:
174 doing the actual "work" of our service.
178 You can find our example `RouteGuide` server in
[all …]
/external/swiftshader/third_party/llvm-7.0/llvm/docs/tutorial/
BuildingAJIT1.rst
40 - `Chapter #4 <BuildingAJIT4.html>`_: Improve the laziness of our JIT by
48 To provide input for our JIT we will use the Kaleidoscope REPL from
51 code for that chapter and replace it with optimization support in our JIT class
66 compiler does. To support that aim our initial, bare-bones JIT API will be:
96 In the previous section we described our API, now we examine a simple
100 input for our JIT: Each time the user enters an expression the REPL will add a
103 use the findSymbol method of our JIT class to find and execute the code for the
106 of this tutorial we'll modify the REPL to enable new interactions with our JIT
107 class, but for now we will take this setup for granted and focus our attention on
108 the implementation of our JIT itself.
[all …]
LangImpl09.rst
28 our program down to something small and standalone. As part of this
73 First we make our anonymous function that contains our top level
74 statement be our "main":
147 our piece of Kaleidoscope language down to an executable program via this
162 construct one for our fib.ks file.
176 of our IR level descriptions. Construction for it takes a module so we
177 need to construct it shortly after we construct our module. We've left it
180 Next we're going to create a small container to cache some of our frequent
181 data. The first will be our compile unit, but we'll also write a bit of
182 code for our one type since we won't have to worry about multiple typed
[all …]
BuildingAJIT2.rst
41 added to it. In this Chapter we will make optimization a phase of our JIT
43 layers, but in the long term making optimization part of our JIT will yield an
46 optimization managed by our JIT will allow us to optimize lazily too, rather
47 than having to do all our optimization up-front.
49 To add optimization support to our JIT we will take the KaleidoscopeJIT from
85 but after the CompileLayer we introduce a typedef for our optimization function.
88 our optimization function typedef in place we can declare our OptimizeLayer,
89 which sits on top of our CompileLayer.
91 To initialize our OptimizeLayer we pass it a reference to the CompileLayer
127 OptimizeLayer in our key methods: addModule, findSymbol, and removeModule. In
[all …]
LangImpl08.rst
12 <index.html>`_" tutorial. This chapter describes how to compile our
28 As an example, we can see what clang thinks is our current target
64 We can now use our target triple to get a ``Target``:
108 For our example, we'll use the generic CPU without any additional
124 We're now ready to configure our module, to specify the target and
139 our file to:
171 Does it work? Let's give it a try. We need to compile our code, but
189 link it with our output. Here's the source code:
203 We link our program to output.o and check the result is what we
BuildingAJIT3.rst
65 CompileOnDemandLayer to the top of our stack and we'll get the benefits of
97 to our class. The CompileCallbackManager member is used by the CompileOnDemandLayer
120 Next we have to update our constructor to initialize the new members. To create
124 function. In our simple JIT this situation is unlikely to come up, so we'll
129 Now we can construct our CompileOnDemandLayer. Following the pattern from
130 previous layers we start by passing a reference to the next layer down in our
139 our CompileCallbackManager. Finally, we need to supply an "indirect stubs
166 Finally, we need to replace the references to OptimizeLayer in our addModule,
176 Here is the complete code listing for our running example with a CompileOnDemand
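The compile-callback mechanism these hits refer to — an indirect stub whose first invocation fires a compile callback and then repoints the stub at the compiled body — can be modeled compactly. This is an illustrative sketch of the idea only, not the real CompileCallbackManager or indirect-stubs-manager API:

```python
class LazyStub:
    """Models an indirect stub: the first call triggers compilation,
    after which the stub points directly at the compiled body."""
    def __init__(self, compile_fn):
        self._compile = compile_fn   # deferred "compiler" for this function
        self._body = None            # filled in by the first call

    def __call__(self, *args):
        if self._body is None:
            # Compile callback fires exactly once, on first execution.
            self._body = self._compile()
        return self._body(*args)
```

Calling the stub before the function is ever needed costs nothing; only the first real call pays the compilation price, which is what makes the per-function laziness of the CompileOnDemandLayer worthwhile.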
/external/curl/tests/data/
test1080
33 http://%HOSTIP:%HTTPPORT/we/want/our/1080 http://%HOSTIP:%HTTPPORT/we/want/our/1080 -w '%{redirect_…
43 GET /we/want/our/1080 HTTP/1.1
47 GET /we/want/our/1080 HTTP/1.1
59 http://%HOSTIP:%HTTPPORT/we/want/our/data/10800002.txt?coolsite=yes
66 http://%HOSTIP:%HTTPPORT/we/want/our/data/10800002.txt?coolsite=yes
test1081
41 http://%HOSTIP:%HTTPPORT/we/want/our/1081 http://%HOSTIP:%HTTPPORT/we/want/our/10810002 -w '%{redir…
51 GET /we/want/our/1081 HTTP/1.1
55 GET /we/want/our/10810002 HTTP/1.1
67 http://%HOSTIP:%HTTPPORT/we/want/our/data/10810099.txt?coolsite=yes
test1261
33 http://%HOSTIP:%HTTPPORT/we/want/our/1261 -w '%{redirect_url}\n' --location --max-redir 0
43 GET /we/want/our/1261 HTTP/1.1
58 http://%HOSTIP:%HTTPPORT/we/want/our/data/10290002.txt?coolsite=yes
test1029
33 http://%HOSTIP:%HTTPPORT/we/want/our/1029 -w '%{redirect_url}\n'
43 GET /we/want/our/1029 HTTP/1.1
55 http://%HOSTIP:%HTTPPORT/we/want/our/data/10290002.txt?coolsite=yes
/external/skqp/site/dev/testing/
index.md
4 Skia relies heavily on our suite of unit and Golden Master \(GM\) tests, which
5 are served by our Diamond Master \(DM\) test tool, for correctness testing.
6 Tests are executed by our trybots, for every commit, across most of our
16 See the individual subpages for more details on our various test tools.
/external/skia/site/dev/testing/
index.md
4 Skia relies heavily on our suite of unit and Golden Master \(GM\) tests, which
5 are served by our Diamond Master \(DM\) test tool, for correctness testing.
6 Tests are executed by our trybots, for every commit, across most of our
16 See the individual subpages for more details on our various test tools.
/external/mesa3d/src/gallium/docs/source/drivers/openswr/
faq.rst
10 * Architecture - given our focus on scientific visualization, our
40 them through our driver yet. The fetch shader, streamout, and blend is
55 While our current performance is quite good, we know there is more
66 Visualization Toolkit (VTK), and as such our development efforts have
76 the piglit failures are errors in our driver layer interfacing Mesa
77 and SWR. Fixing these issues is one of our major future development
84 download the Mesa source and enable our driver makes life much
87 * The internal gallium APIs are not stable, so we'd like our driver
101 expose through our driver, such as MSAA, geometry shaders, compute
122 intrinsics in our code and the in-tree JIT creation. It is not the
/external/toolchain-utils/binary_search_tool/ndk/
README
37 flavor for arm7, our compiler wrapper won't try to bisect object files meant
48 specific build flavor in our app/build.gradle file:
54 We want to add this under the same "productFlavors" section that our arm7
56 task in our build system. We can use this to build and install an x86-64
57 version of our app.
59 Now we want to change our test_setup.sh script to run our new gradle task:
/external/flatbuffers/
CONTRIBUTING.md
11 copyright to your changes, even after your contribution becomes part of our
16 approved it, but you must do it before we can put your code into our codebase.
27 * Use our code
37 HEAD. This makes reviewing the code so much easier, and our history more
/external/flatbuffers/docs/source/
CONTRIBUTING.md
11 copyright to your changes, even after your contribution becomes part of our
16 approved it, but you must do it before we can put your code into our codebase.
27 * Use our code
37 HEAD. This makes reviewing the code so much easier, and our history more
/external/tensorflow/tensorflow/core/api_def/base_api/
api_def_Dequantize.pbtxt
60 We first find the range of values in our tensor. The
68 Next, we choose our fixed-point quantization buckets, `[min_fixed, max_fixed]`.
81 From this we compute our scaling factor, s:
86 Now we can dequantize the elements of our tensor:
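The steps those hits outline — find the tensor's value range, choose fixed-point buckets `[min_fixed, max_fixed]`, derive a scaling factor `s`, then map each quantized element back — amount to an affine dequantization. A simplified sketch follows, assuming signed 8-bit buckets; the exact TensorFlow formula applies additional range adjustments not shown here:

```python
def dequantize(quantized, min_range, max_range, min_fixed=-128, max_fixed=127):
    """Map fixed-point values back into the real interval [min_range, max_range]."""
    # Scaling factor s: the real-valued width of one fixed-point step.
    s = (max_range - min_range) / (max_fixed - min_fixed)
    # Shift each bucket index to zero-based, scale, and re-anchor at min_range.
    return [(q - min_fixed) * s + min_range for q in quantized]
```

With `min_range = -1.0` and `max_range = 1.0`, the bucket endpoints `-128` and `127` dequantize back to exactly `-1.0` and `1.0`.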
/external/clang/cmake/modules/
ClangConfig.cmake.in
1 # This file allows users to call find_package(Clang) and pick up our targets.
10 # Provide all our library targets to users.
/external/tensorflow/tensorflow/lite/g3doc/guide/
faq.md
3 If you don't find an answer to your question here, please look through our
34 not related to missing operations, search our
58 script in our repository.
87 The best way to test the behavior of a TensorFlow Lite model is to use our API
89 look at our [Python Interpreter example](../convert/python_api.md) that generates
116 classification, check out our [list of hosted models](hosted_models.md).
126 on the interpreter. Or take a look at our
roadmap.md
5 The following represents a high level overview of our 2019 plan. You should be
8 principle, we typically prioritize issues that the majority of our users are
11 We break our roadmap into four key segments: usability, performance,
12 optimization and portability. We strongly encourage you to comment on our
/external/swiftshader/third_party/LLVM/docs/HistoricalNotes/
2001-06-20-.NET-Differences.txt
4 Subject: .NET vs. our VM
6 One significant difference between .NET CLR and our VM is that the CLR
23 compiled by the same compiler, whereas our approach allows us to link and
