page.title=SurfaceTexture
@jd:body

<!--
    Copyright 2014 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<p>The SurfaceTexture class was introduced in Android 3.0. Just as SurfaceView
is the combination of a Surface and a View, SurfaceTexture is a rough
combination of a Surface and a GLES texture (with a few caveats).</p>

<p>When you create a SurfaceTexture, you are creating a BufferQueue for which
your app is the consumer. When a new buffer is queued by the producer, your app
is notified via callback (<code>onFrameAvailable()</code>). Your app calls
<code>updateTexImage()</code>, which releases the previously-held buffer,
acquires the new buffer from the queue, and makes some EGL calls to make the
buffer available to GLES as an external texture.</p>
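
<p>A minimal sketch of that flow, assuming an EGL context is already current on
the consumer thread (the classes are <code>android.graphics.SurfaceTexture</code>
and <code>android.opengl.GLES20</code>/<code>GLES11Ext</code>; the listener body
is left as a stub):</p>

<pre class="prettyprint">
// Create a GLES texture name and bind it to the external texture target.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);

// Wrap it in a SurfaceTexture; the app is now the BufferQueue consumer.
SurfaceTexture surfaceTexture = new SurfaceTexture(textures[0]);
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // The producer queued a new buffer. Signal the rendering thread;
        // only call updateTexImage() from the thread that owns the EGL context.
    }
});

// On the rendering thread, for each new frame: release the old buffer,
// acquire the new one, and expose it to GLES as an external texture.
surfaceTexture.updateTexImage();
</pre>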


<h2 id=ext_texture>External textures</h2>
<p>External textures (<code>GL_TEXTURE_EXTERNAL_OES</code>) are not quite the
same as textures created by GLES (<code>GL_TEXTURE_2D</code>): you have to
configure your renderer a bit differently, and there are things you can't do
with them. The key point is that you can render textured polygons directly
from the data received by your BufferQueue. gralloc supports a wide variety of
formats, so we need to guarantee that the format of the data in the buffer is
something GLES can recognize. To do so, when SurfaceTexture creates the
BufferQueue, it sets the consumer usage flags to
<code>GRALLOC_USAGE_HW_TEXTURE</code>, ensuring that any buffer created by
gralloc will be usable by GLES.</p>
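
<p>"Configure your renderer a bit differently" means, among other things, that
your fragment shader must declare the OES extension and sample through a
<code>samplerExternalOES</code> rather than a <code>sampler2D</code>. A typical
shader for this (shown here as a Java string constant, as is common in GLES
code; the varying and uniform names are illustrative) looks like:</p>

<pre class="prettyprint">
// Fragment shader for GL_TEXTURE_EXTERNAL_OES; a plain sampler2D will not
// accept buffers bound to the external texture target.
private static final String EXT_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
        "}\n";
</pre>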

<p>Because SurfaceTexture interacts with an EGL context, you must be careful to
call its methods from the correct thread (as detailed in the class
documentation).</p>

<h2 id=time_transforms>Timestamps and transformations</h2>
<p>If you look deeper into the class documentation, you will see a couple of odd
calls. One call retrieves a timestamp, the other a transformation matrix, the
value of each having been set by the previous call to
<code>updateTexImage()</code>. It turns out that BufferQueue passes more than
just a buffer handle to the consumer. Each buffer is accompanied by a timestamp
and transformation parameters.</p>
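
<p>In code, fetching both values after an <code>updateTexImage()</code> call
looks like this (the matrix is a standard 4x4 array in column-major order,
ready to pass to <code>glUniformMatrix4fv()</code>):</p>

<pre class="prettyprint">
float[] transform = new float[16];
surfaceTexture.updateTexImage();
surfaceTexture.getTransformMatrix(transform);   // texture-coordinate transform
long timestampNs = surfaceTexture.getTimestamp();  // nanoseconds
</pre>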

<p>The transformation is provided for efficiency. In some cases, the source data
might be in the incorrect orientation for the consumer; but instead of rotating
the data before sending it, we can send the data in its current orientation with
a transform that corrects it. The transformation matrix can be merged with other
transformations at the point the data is used, minimizing overhead.</p>

<p>The timestamp is useful for certain buffer sources. For example, suppose you
connect the producer interface to the output of the camera (with
<code>setPreviewTexture()</code>). To create a video, you need to set the
presentation timestamp for each frame; but you want to base that on the time
when the frame was captured, not the time when the buffer was received by your
app. The timestamp provided with the buffer is set by the camera code, resulting
in a more consistent series of timestamps.</p>

<h2 id=surfacet>SurfaceTexture and Surface</h2>

<p>If you look closely at the API you'll see the only way for an application
to create a plain Surface is through a constructor that takes a SurfaceTexture
as the sole argument. (Prior to API 11, there was no public constructor for
Surface at all.) This might seem a bit backward if you view SurfaceTexture as a
combination of a Surface and a texture.</p>

<p>Under the hood, SurfaceTexture is called GLConsumer, which more accurately
reflects its role as the owner and consumer of a BufferQueue. When you create a
Surface from a SurfaceTexture, what you're doing is creating an object that
represents the producer side of the SurfaceTexture's BufferQueue.</p>
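
<p>For example:</p>

<pre class="prettyprint">
// The app remains the consumer. This Surface is the producer endpoint of
// the SurfaceTexture's BufferQueue; hand it to a camera, a MediaPlayer, or
// anything else that renders into a Surface.
Surface producerSurface = new Surface(surfaceTexture);
</pre>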

<h2 id=continuous_capture>Case Study: Grafika's continuous capture</h2>

<p>The camera can provide a stream of frames suitable for recording as a movie.
To display it on screen, you create a SurfaceView, pass the Surface to
<code>setPreviewDisplay()</code>, and let the producer (camera) and consumer
(SurfaceFlinger) do all the work. To record the video, you create a Surface with
MediaCodec's <code>createInputSurface()</code>, pass that to the camera, and
again sit back and relax. To show and record the video at the same time, you
have to get more involved.</p>
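
<p>Sketched in code, the two simple cases look roughly like this (assuming
<code>camera</code>, <code>surfaceView</code>, and the encoder's
<code>MediaFormat</code> are set up elsewhere; error handling omitted):</p>

<pre class="prettyprint">
// Display only: SurfaceFlinger consumes the preview frames.
camera.setPreviewDisplay(surfaceView.getHolder());

// Record only: the MediaCodec encoder consumes the frames.
// createInputSurface() must be called after configure() and before start().
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface encoderInputSurface = encoder.createInputSurface();
</pre>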

<p>The <em>continuous capture</em> activity displays video from the camera as
the video is being recorded. In this case, encoded video is written to a
circular buffer in memory that can be saved to disk at any time. It's
straightforward to implement so long as you keep track of where everything is.
</p>

<p>This flow involves three BufferQueues: one created by the app, one created by
SurfaceFlinger, and one created by mediaserver:</p>
<ul>
<li><strong>Application</strong>. The app uses a SurfaceTexture to receive
frames from the camera, converting them to an external GLES texture.</li>
<li><strong>SurfaceFlinger</strong>. The app declares a SurfaceView, which
SurfaceFlinger uses to display the frames.</li>
<li><strong>MediaServer</strong>. The app configures a MediaCodec encoder with
an input Surface to create the video.</li>
</ul>

<img src="images/continuous_capture_activity.png" alt="Grafika continuous
capture activity" />

<p class="img-caption"><strong>Figure 1.</strong> Grafika's continuous capture
activity. Arrows indicate data propagation from the camera, and BufferQueues are
shown in color (producers are teal, consumers are green).</p>

<p>Encoded H.264 video goes to a circular buffer in RAM in the app process, and
is written to an MP4 file on disk using the MediaMuxer class when the capture
button is hit.</p>
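
<p>The save step is a straight drain of that buffer through MediaMuxer. A
hedged sketch, where <code>circBuffer</code> and its accessors stand in for the
app's circular-buffer bookkeeping (they are illustrative, not platform APIs):</p>

<pre class="prettyprint">
MediaMuxer muxer = new MediaMuxer(outputFile.getPath(),
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int track = muxer.addTrack(encodedFormat);  // MediaFormat from the encoder
muxer.start();

// Walk the circular buffer from the oldest frame to the newest.
int index = circBuffer.getFirstIndex();
while (index >= 0) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    ByteBuffer data = circBuffer.getChunk(index, info);
    muxer.writeSampleData(track, data, info);
    index = circBuffer.getNextIndex(index);
}

muxer.stop();
muxer.release();
</pre>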

<p>All three of the BufferQueues are handled with a single EGL context in the
app, and the GLES operations are performed on the UI thread. Doing the
SurfaceView rendering on the UI thread is generally discouraged, but since we're
doing simple operations that are handled asynchronously by the GLES driver we
should be fine. (If the video encoder locks up and we block trying to dequeue a
buffer, the app will become unresponsive. But at that point, we're probably
failing anyway.) The handling of the encoded data (managing the circular
buffer and writing it to disk) is performed on a separate thread.</p>

<p>The bulk of the configuration happens in the SurfaceView's <code>surfaceCreated()</code>
callback. The EGLContext is created, and EGLSurfaces are created for the
display and for the video encoder. When a new frame arrives, we tell
SurfaceTexture to acquire it and make it available as a GLES texture, then
render it with GLES commands on each EGLSurface (forwarding the transform and
timestamp from SurfaceTexture). The encoder thread pulls the encoded output
from MediaCodec and stashes it in memory.</p>
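
<p>The per-frame work then reduces to rendering the same external texture twice,
once per EGLSurface. A hedged sketch (<code>drawFrame()</code> stands in for the
app's GLES draw calls; the EGL objects come from the setup described above):</p>

<pre class="prettyprint">
// On the thread that owns the EGL context:
surfaceTexture.updateTexImage();
surfaceTexture.getTransformMatrix(transform);
long timestampNs = surfaceTexture.getTimestamp();

// Pass 1: render to the SurfaceView's EGLSurface and display it.
EGL14.eglMakeCurrent(eglDisplay, displayEglSurface, displayEglSurface, eglContext);
drawFrame(transform);
EGL14.eglSwapBuffers(eglDisplay, displayEglSurface);

// Pass 2: render to the encoder's input EGLSurface, forwarding the
// camera's capture timestamp so the encoded frames are timed correctly.
EGL14.eglMakeCurrent(eglDisplay, encoderEglSurface, encoderEglSurface, eglContext);
drawFrame(transform);
EGLExt.eglPresentationTimeANDROID(eglDisplay, encoderEglSurface, timestampNs);
EGL14.eglSwapBuffers(eglDisplay, encoderEglSurface);
</pre>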

<h2 id=st_vid_play>Secure texture video playback</h2>
<p>Android 7.0 supports GPU post-processing of protected video content. This
allows using the GPU for complex non-linear video effects (such as warps),
mapping protected video content onto textures for use in general graphics scenes
(e.g., using OpenGL ES), and virtual reality (VR).</p>

<img src="images/graphics_secure_texture_playback.png" alt="Secure Texture Video Playback" />
<p class="img-caption"><strong>Figure 2.</strong> Secure texture video playback</p>

<p>Support is enabled using the following two extensions (a usage sketch follows
the list):</p>
<ul>
<li><strong>EGL extension</strong>
(<a href="https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt"><code>EGL_EXT_protected_content</code></a>).
Allows the creation of protected GL contexts and surfaces, which can both
operate on protected content.</li>
<li><strong>GLES extension</strong>
(<a href="https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt"><code>GL_EXT_protected_textures</code></a>).
Allows tagging textures as protected so they can be used as framebuffer texture
attachments.</li>
</ul>
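
<p>A hedged sketch of using the EGL extension from Java: the attribute value is
taken from the extension specification (the SDK defines no constant for it), and
availability must first be checked in the EGL extensions string:</p>

<pre class="prettyprint">
// From the EGL_EXT_protected_content spec; not defined in EGL14/EGLExt.
final int EGL_PROTECTED_CONTENT_EXT = 0x32C0;

// Create a protected context...
int[] ctxAttribs = {
        EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
        EGL_PROTECTED_CONTENT_EXT, EGL14.EGL_TRUE,
        EGL14.EGL_NONE
};
EGLContext context = EGL14.eglCreateContext(
        eglDisplay, eglConfig, EGL14.EGL_NO_CONTEXT, ctxAttribs, 0);

// ...and a protected window surface to go with it.
int[] surfAttribs = {
        EGL_PROTECTED_CONTENT_EXT, EGL14.EGL_TRUE,
        EGL14.EGL_NONE
};
EGLSurface surface = EGL14.eglCreateWindowSurface(
        eglDisplay, eglConfig, nativeWindow, surfAttribs, 0);
</pre>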

<p>Android 7.0 also updates SurfaceTexture and ACodec
(<code>libstagefright.so</code>) to allow protected content to be sent even if
the window surface does not queue to the window composer (i.e., SurfaceFlinger),
and to provide a protected video surface for use within a protected context.
This is done by setting the correct protected consumer bits
(<code>GRALLOC_USAGE_PROTECTED</code>) on surfaces created in a protected
context (verified by ACodec).</p>

<p>These changes benefit app developers, who can create apps that perform
enhanced video effects or apply video textures using protected content in GL
(for example, in VR); end users, who can view high-value video content (such as
movies and TV shows) in a GL environment (for example, in VR); and OEMs, who can
achieve higher sales due to added device functionality (for example, watching HD
movies in VR). The new EGL and GLES extensions can be used by system-on-chip
(SoC) providers and other vendors, and are currently implemented on the
Qualcomm MSM8994 SoC chipset used in the Nexus 6P.</p>

<p>Secure texture video playback sets the foundation for strong DRM
implementation in the OpenGL ES environment. Without a strong DRM implementation
such as Widevine Level 1, many content providers would not allow rendering of
their high-value content in the OpenGL ES environment, preventing important VR
use cases such as watching DRM-protected content in VR.</p>

<p>AOSP includes framework code for secure texture video playback; driver
support is up to the vendor. Partners must implement the
<code>EGL_EXT_protected_content</code> and
<code>GL_EXT_protected_textures</code> extensions. When using your own codec
library (to replace libstagefright), note the changes in
<code>/frameworks/av/media/libstagefright/SurfaceUtils.cpp</code> that allow
buffers marked with <code>GRALLOC_USAGE_PROTECTED</code> to be sent to
ANativeWindows (even if the ANativeWindow does not queue directly to the window
composer) as long as the consumer usage bits contain
<code>GRALLOC_USAGE_PROTECTED</code>. For detailed documentation on implementing
the extensions, refer to the Khronos Registry
(<a href="https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt">EGL_EXT_protected_content</a>,
<a href="https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt">GL_EXT_protected_textures</a>).</p>

<p>Partners may also need to make hardware changes to ensure that protected
memory mapped onto the GPU remains protected and unreadable by unprotected
code.</p>