Name

    NV_video_capture

Name Strings

    GL_NV_video_capture
    GLX_NV_video_capture
    WGL_NV_video_capture

Contributors

    James Jones
    Robert Morell
    Andy Ritger
    Antonio Tejada
    Thomas True

Contact

    James Jones, NVIDIA Corporation (jajones 'at' nvidia.com)

Status

    Complete. Shipping in NVIDIA 190.XX drivers.

Version

    Last Modified Date: Jul 8, 2011
    Author Revision: 24

Number

    374

Dependencies

    OpenGL 2.0 is required.

    ARB_vertex_buffer_object is required.

    EXT_framebuffer_object is required.

    EXT_timer_query is required for 64-bit integer type definitions
    only.

    NV_present_video is required for the definition of the FRAME_NV
    token and the wglQueryCurrentContextNV function only.

    Written based on the wording of the OpenGL 3.0 specification.

Overview

    This extension provides a mechanism for streaming video data
    directly into texture objects and buffer objects.  Applications can
    then display video streams in interactive 3D scenes and/or
    manipulate the video data using the GL's image processing
    capabilities.
New Procedures and Functions

    void BeginVideoCaptureNV(uint video_capture_slot);

    void BindVideoCaptureStreamBufferNV(uint video_capture_slot,
                                        uint stream, enum frame_region,
                                        intptrARB offset);

    void BindVideoCaptureStreamTextureNV(uint video_capture_slot,
                                         uint stream, enum frame_region,
                                         enum target, uint texture);

    void EndVideoCaptureNV(uint video_capture_slot);

    void GetVideoCaptureivNV(uint video_capture_slot, enum pname,
                             int *params);

    void GetVideoCaptureStream{i,f,d}vNV(uint video_capture_slot,
                                         uint stream, enum pname,
                                         T *params);

    enum VideoCaptureNV(uint video_capture_slot, uint *sequence_num,
                        uint64EXT *capture_time);

    void VideoCaptureStreamParameter{i,f,d}vNV(uint video_capture_slot,
                                               uint stream,
                                               enum pname,
                                               const T *params);


    int glXBindVideoCaptureDeviceNV(Display *dpy,
                                    unsigned int video_capture_slot,
                                    GLXVideoCaptureDeviceNV device);

    GLXVideoCaptureDeviceNV *
    glXEnumerateVideoCaptureDevicesNV(Display *dpy, int screen,
                                      int *nelements);

    void glXLockVideoCaptureDeviceNV(Display *dpy,
                                     GLXVideoCaptureDeviceNV device);

    int glXQueryVideoCaptureDeviceNV(Display *dpy,
                                     GLXVideoCaptureDeviceNV device,
                                     int attribute, int *value);

    void glXReleaseVideoCaptureDeviceNV(Display *dpy,
                                        GLXVideoCaptureDeviceNV device);


    BOOL wglBindVideoCaptureDeviceNV(UINT uVideoSlot,
                                     HVIDEOINPUTDEVICENV hDevice);

    UINT wglEnumerateVideoCaptureDevicesNV(HDC hDc,
                                           HVIDEOINPUTDEVICENV *phDeviceList);

    BOOL wglLockVideoCaptureDeviceNV(HDC hDc,
                                     HVIDEOINPUTDEVICENV hDevice);

    BOOL wglQueryVideoCaptureDeviceNV(HDC hDc,
                                      HVIDEOINPUTDEVICENV hDevice,
                                      int iAttribute, int *piValue);

    BOOL wglReleaseVideoCaptureDeviceNV(HDC hDc,
                                        HVIDEOINPUTDEVICENV hDevice);

New Types

    typedef XID GLXVideoCaptureDeviceNV;

    DECLARE_HANDLE(HVIDEOINPUTDEVICENV);

New Tokens

    Accepted by the <target> parameters of BindBufferARB, BufferDataARB,
    BufferSubDataARB, MapBufferARB, UnmapBufferARB, GetBufferSubDataARB,
    GetBufferParameterivARB, and GetBufferPointervARB:

        VIDEO_BUFFER_NV                                 0x9020

    Accepted by the <pname> parameter of GetBooleanv, GetIntegerv,
    GetFloatv, and GetDoublev:

        VIDEO_BUFFER_BINDING_NV                         0x9021

    Accepted by the <frame_region> parameter of
    BindVideoCaptureStreamBufferNV and BindVideoCaptureStreamTextureNV:

        FIELD_UPPER_NV                                  0x9022
        FIELD_LOWER_NV                                  0x9023

    Accepted by the <pname> parameter of GetVideoCaptureivNV:

        NUM_VIDEO_CAPTURE_STREAMS_NV                    0x9024
        NEXT_VIDEO_CAPTURE_BUFFER_STATUS_NV             0x9025

    Accepted by the <pname> parameter of
    GetVideoCaptureStream{i,f,d}vNV:

        LAST_VIDEO_CAPTURE_STATUS_NV                    0x9027
        VIDEO_BUFFER_PITCH_NV                           0x9028
        VIDEO_CAPTURE_FRAME_WIDTH_NV                    0x9038
        VIDEO_CAPTURE_FRAME_HEIGHT_NV                   0x9039
        VIDEO_CAPTURE_FIELD_UPPER_HEIGHT_NV             0x903A
        VIDEO_CAPTURE_FIELD_LOWER_HEIGHT_NV             0x903B
        VIDEO_CAPTURE_TO_422_SUPPORTED_NV               0x9026

    Accepted by the <pname> parameter of
    GetVideoCaptureStream{i,f,d}vNV and as the <pname> parameter of
    VideoCaptureStreamParameter{i,f,d}vNV:

        VIDEO_COLOR_CONVERSION_MATRIX_NV                0x9029
        VIDEO_COLOR_CONVERSION_MAX_NV                   0x902A
        VIDEO_COLOR_CONVERSION_MIN_NV                   0x902B
        VIDEO_COLOR_CONVERSION_OFFSET_NV                0x902C
        VIDEO_BUFFER_INTERNAL_FORMAT_NV                 0x902D
        VIDEO_CAPTURE_SURFACE_ORIGIN_NV                 0x903C

    Returned by VideoCaptureNV:

        PARTIAL_SUCCESS_NV                              0x902E

    Returned by VideoCaptureNV and GetVideoCaptureStream{i,f,d}vNV
    when <pname> is LAST_VIDEO_CAPTURE_STATUS_NV:

        SUCCESS_NV                                      0x902F
        FAILURE_NV                                      0x9030

    Accepted in the <params> parameter of
    VideoCaptureStreamParameter{i,f,d}vNV and returned by
    GetVideoCaptureStream{i,f,d}vNV when <pname> is
    VIDEO_BUFFER_INTERNAL_FORMAT_NV:

        YCBYCR8_422_NV                                  0x9031
        YCBAYCR8A_4224_NV                               0x9032
        Z6Y10Z6CB10Z6Y10Z6CR10_422_NV                   0x9033
        Z6Y10Z6CB10Z6A10Z6Y10Z6CR10Z6A10_4224_NV        0x9034
        Z4Y12Z4CB12Z4Y12Z4CR12_422_NV                   0x9035
        Z4Y12Z4CB12Z4A12Z4Y12Z4CR12Z4A12_4224_NV        0x9036
        Z4Y12Z4CB12Z4CR12_444_NV                        0x9037

    Accepted in the attribute list of the GLX reply to the
    glXEnumerateVideoCaptureDevicesNV command:

        GLX_DEVICE_ID_NV                                0x20CD

    Accepted by the <attribute> parameter of glXQueryContext:

        GLX_NUM_VIDEO_CAPTURE_SLOTS_NV                  0x20CF

    Accepted by the <attribute> parameter of
    glXQueryVideoCaptureDeviceNV:

        GLX_UNIQUE_ID_NV                                0x20CE

    Accepted by the <iAttribute> parameter of wglQueryCurrentContextNV:

        WGL_NUM_VIDEO_CAPTURE_SLOTS_NV                  0x20CF

    Accepted by the <iAttribute> parameter of
    wglQueryVideoCaptureDeviceNV:

        WGL_UNIQUE_ID_NV                                0x20CE


Additions to Chapter 2 of the 1.1 Specification (OpenGL Operation)

    None

Additions to Chapter 3 of the 1.1 Specification (Rasterization)

    None

Additions to Chapter 4 of the 1.1 Specification (Per-Fragment
Operations and the Frame Buffer)

    Add a new section after Section 4.4 and, if NV_present_video is
    present, before Section 4.5 "Displaying Buffers":

    "Section 4.5, Video Capture

    "Video capture can be used to transfer pixels from a video input
    device to textures or buffer objects.  Video input devices are
    accessed by binding them to a valid video capture slot in a context
    using window-system specific functions.  Valid video capture slots
    are unsigned integers in the range 1 to the implementation-dependent
    maximum number of slots, inclusive.  Attempting to perform video
    capture operations on an invalid video capture slot, or on a video
    capture slot with no device bound to it, generates
    INVALID_OPERATION.

    "The values captured can be transformed by a fixed-function color
    conversion pipeline before they are written to the destination.
    Each video input device can have an implementation-dependent number
    of input streams associated with it.  Pixels are transferred from
    all streams on a device simultaneously.

    "Video capture can be started and stopped on a specified video
    capture slot with the commands

        void BeginVideoCaptureNV(uint video_capture_slot)

    and

        void EndVideoCaptureNV(uint video_capture_slot)

    respectively.  After BeginVideoCaptureNV is called, the capture
    device bound to <video_capture_slot> will begin filling a queue of
    raw buffers with incoming video data.  If capture is already in the
    requested state, INVALID_OPERATION is generated.

    "To move data from the raw buffers into the GL, buffer objects or
    textures must be bound to the individual video capture streams.  A
    video capture stream refers to a single video source.  Each video
    capture slot must provide one or more video capture streams.
    Streams are referred to by their index, starting from zero.  If an
    invalid stream index is specified, INVALID_VALUE is generated.

    "Buffer objects or textures can be bound to streams using the
    commands

        void BindVideoCaptureStreamBufferNV(uint video_capture_slot,
                                            uint stream,
                                            enum frame_region,
                                            intptrARB offset);

    or

        void BindVideoCaptureStreamTextureNV(uint video_capture_slot,
                                             uint stream,
                                             enum frame_region,
                                             enum target, uint texture);

    where <stream> is the index of the stream to bind the object to and
    <frame_region> is the spatial region of the frame, specified by one
    of FRAME_NV, FIELD_UPPER_NV, or FIELD_LOWER_NV.  If FIELD_UPPER_NV
    and FIELD_LOWER_NV are used, two objects must be bound to the
    stream: one for the upper field and one for the lower field.  If
    only one field is bound at capture time, INVALID_OPERATION is
    generated.

    "For buffer object capture, the buffer bound to the VIDEO_BUFFER_NV
    target is used.
    An offset into the buffer object can be specified using the
    <offset> parameter.  The offset provided must be a multiple of the
    size, in bytes, of a pixel in the internal format specified for
    this stream, or INVALID_VALUE will be generated at frame capture
    time.  To unbind a buffer object from a video capture stream
    region, bind buffer object 0 to the region.  The internal format of
    the pixel data stored in the buffer object can be specified using
    the VideoCaptureStreamParameter functions described below, with
    <pname> set to VIDEO_BUFFER_INTERNAL_FORMAT_NV and <params> set to
    a color-renderable internal format (as defined in section 4.4.4) or
    one of the Y'CbCr/Y'CbCrA formats defined in table 4.13.
    Specifying other internal formats will generate INVALID_ENUM.

                                                  Element Meaning   Format
        Format Name                               and Order         Layout
        ----------------------------------------  ----------------  -------
        YCBYCR8_422_NV                            Y'0, Cb, Y'1, Cr  4:2:2

        YCBAYCR8A_4224_NV                         Y'0, Cb, A0,      4:2:2:4
                                                  Y'1, Cr, A1

        Z6Y10Z6CB10Z6Y10Z6CR10_422_NV             6 zero bits, Y'0, 4:2:2
                                                  6 zero bits, Cb,
                                                  6 zero bits, Y'1,
                                                  6 zero bits, Cr

        Z6Y10Z6CB10Z6A10Z6Y10Z6CR10Z6A10_4224_NV  6 zero bits, Y'0, 4:2:2:4
                                                  6 zero bits, Cb,
                                                  6 zero bits, A0,
                                                  6 zero bits, Y'1,
                                                  6 zero bits, Cr,
                                                  6 zero bits, A1

        Z4Y12Z4CB12Z4Y12Z4CR12_422_NV             4 zero bits, Y'0, 4:2:2
                                                  4 zero bits, Cb,
                                                  4 zero bits, Y'1,
                                                  4 zero bits, Cr

        Z4Y12Z4CB12Z4A12Z4Y12Z4CR12Z4A12_4224_NV  4 zero bits, Y'0, 4:2:2:4
                                                  4 zero bits, Cb,
                                                  4 zero bits, A0,
                                                  4 zero bits, Y'1,
                                                  4 zero bits, Cr,
                                                  4 zero bits, A1

        Z4Y12Z4CB12Z4CR12_444_NV                  4 zero bits, Y',  4:4:4
                                                  4 zero bits, Cb,
                                                  4 zero bits, Cr

        Table 4.13 - Video capture buffer internal formats

    "For texture object capture, the texture named <texture> on
    <target> is used.
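    The offset rule above can be modeled with a small helper.  This is
    an illustrative sketch, not part of the extension: the names and
    the per-pixel byte sizes are inferred from the element layouts in
    table 4.13 (8-bit components for the YCBYCR8 formats, a 16-bit
    container holding 6+10 or 4+12 bits per component for the
    Z-prefixed formats, with the 4:2:2 layouts averaging two components
    per pixel and the 4:2:2:4 layouts three, since Cb/Cr are shared by
    a two-pixel group).

        ```c
        #include <stdint.h>

        /* Hypothetical names; sizes inferred from table 4.13. */
        enum vc_internal_format {
            FMT_YCBYCR8_422,    /* YCBYCR8_422_NV                          */
            FMT_YCBAYCR8A_4224, /* YCBAYCR8A_4224_NV                       */
            FMT_Z6Y10_422,      /* Z6Y10Z6CB10Z6Y10Z6CR10_422_NV           */
            FMT_Z6Y10_4224,     /* Z6Y10Z6CB10Z6A10Z6Y10Z6CR10Z6A10_4224_NV */
            FMT_Z4Y12_422,      /* Z4Y12Z4CB12Z4Y12Z4CR12_422_NV           */
            FMT_Z4Y12_4224,     /* Z4Y12Z4CB12Z4A12Z4Y12Z4CR12Z4A12_4224_NV */
            FMT_Z4Y12_444       /* Z4Y12Z4CB12Z4CR12_444_NV                */
        };

        static unsigned bytes_per_pixel(enum vc_internal_format fmt)
        {
            switch (fmt) {
            case FMT_YCBYCR8_422:    return 2; /* 2 x 8-bit components  */
            case FMT_YCBAYCR8A_4224: return 3; /* 3 x 8-bit components  */
            case FMT_Z6Y10_422:
            case FMT_Z4Y12_422:      return 4; /* 2 x 16-bit components */
            case FMT_Z6Y10_4224:
            case FMT_Z4Y12_4224:
            case FMT_Z4Y12_444:      return 6; /* 3 x 16-bit components */
            }
            return 0;
        }

        /* Nonzero when <offset> would not provoke INVALID_VALUE at
         * frame capture time under the multiple-of-pixel-size rule. */
        static int offset_is_valid(intptr_t offset, enum vc_internal_format fmt)
        {
            return offset >= 0 && offset % bytes_per_pixel(fmt) == 0;
        }
        ```

    For example, an offset of 7 bytes is rejected for YCBYCR8_422_NV,
    whose pixels are 2 bytes each, while any multiple of 6 is accepted
    for the 16-bit 4:4:4 format.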
    The internal format of the texture must be color-renderable, as
    defined in section 4.4.4, at capture time, or INVALID_OPERATION is
    generated.  Only 2D textures can be used as video capture
    destinations.  If <target> is not TEXTURE_2D or TEXTURE_RECTANGLE,
    INVALID_OPERATION is generated.  If <target> does not refer to a
    texture target supported by the current context, INVALID_ENUM is
    generated.  To unbind a texture from a video capture stream region
    without binding a new one, bind texture 0 to the region.  If
    <texture> is non-zero and does not name an existing texture object,
    INVALID_VALUE is generated.

    "Captured video data will have 2, 3, or 4 components per pixel.
    The number of components and their layout is determined by the
    format of the data output by the video capture device.  This may
    differ from the format of the data received by the video capture
    device if the device has internal data format conversion hardware.
    For example, if the device is configured to resample data with a
    4:2:2 layout up to a 4:4:4:4 layout, the effective format is
    4:4:4:4.  If the formats in table 4.13 are used, the format layout
    must be compatible with the format of the captured data, as defined
    in table 4.14, or INVALID_ENUM is generated.  Compatibility with
    4:2:2 and 4:2:2:4 capture format layouts can be queried using the
    GetVideoCaptureStream{i,f,d}vNV commands with <pname> set to
    VIDEO_CAPTURE_TO_422_SUPPORTED_NV, as described below.

        Effective      Compatible
        Format Layout  Capture Format Layouts
        -------------  ----------------------
        4:2:2          4:2:2, 4:2:2:4
        4:2:2:4        4:2:2, 4:2:2:4
        4:4:4          4:4:4, 4:4:4:4
        4:4:4:4        4:4:4, 4:4:4:4

        Table 4.14 - Compatible format layouts

    "If the effective capture data format is 4:2:2, there will be 2
    components per pixel.
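    Table 4.14 reduces to a simple rule: a pairing is compatible
    exactly when both layouts are chroma-subsampled (the 4:2:2 family)
    or both are full-resolution (the 4:4:4 family); the presence of an
    alpha channel never affects compatibility.  A minimal sketch, with
    hypothetical names not taken from the extension:

        ```c
        #include <stdbool.h>

        /* Hypothetical encoding of the four layouts in table 4.14. */
        enum vc_layout { LAYOUT_422, LAYOUT_4224, LAYOUT_444, LAYOUT_4444 };

        /* True when <effective> may be used with data captured in
         * <capture> per table 4.14: both subsampled, or both not. */
        static bool layouts_compatible(enum vc_layout effective,
                                       enum vc_layout capture)
        {
            bool effective_sub = effective == LAYOUT_422 ||
                                 effective == LAYOUT_4224;
            bool capture_sub = capture == LAYOUT_422 ||
                               capture == LAYOUT_4224;
            return effective_sub == capture_sub;
        }
        ```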
    If capturing to a format from table 4.13, two incoming pixels make
    up one pixel group referred to by the destination layout.  The
    first pixel's components 1 and 2 will be written to the destination
    pixel group's Y'0 and Cb components.  The second pixel's components
    1 and 2 will be written to the destination pixel group's Y'1 and Cr
    components.  Otherwise, the captured pixel's components 1 and 2
    will be written to the destination R and G components,
    respectively, and there is no concept of pixel groups.  If the
    effective capture data format is 4:4:4, there will be 3 components
    per pixel.  If capturing to a format from table 4.13, the captured
    components 1, 2, and 3 will be written to the Y', Cb, and Cr
    components, respectively.  Otherwise, components 1, 2, and 3 will
    be written to the destination R, G, and B components, respectively.
    If the effective capture data format is 4:2:2:4 or 4:4:4:4, the
    mapping will be the same as that of 4:2:2 or 4:4:4, respectively,
    but the final component will always be stored in the destination A
    or A' component.  If the destination format does not contain a
    component used by the mapping above, the source's corresponding
    component will be ignored.  If the destination has components not
    mentioned in the mapping above for the current effective capture
    data format, the values in those components will be undefined after
    a capture operation.

    "After objects have been bound to the video capture streams,

        enum VideoCaptureNV(uint video_capture_slot, uint *sequence_num,
                            uint64EXT *capture_time);

    can be called to capture one frame of video.  If no frames are
    available, this call will block until frames are ready for capture
    or an error occurs.  VideoCaptureNV will return one of SUCCESS_NV,
    PARTIAL_SUCCESS_NV, or FAILURE_NV.  If the capture operation
    completed successfully on all streams with objects bound,
    SUCCESS_NV is returned.  If only some streams succeeded,
    PARTIAL_SUCCESS_NV is returned.  If the capture failed on all
    streams, or if the capture state on the specified slot is invalid,
    FAILURE_NV is returned.  In addition, the following GL errors are
    generated if FAILURE_NV was returned because of invalid capture
    state:

        * INVALID_OPERATION if any stream has both texture and buffer
          objects bound.

        * INVALID_VALUE if any buffer object bound is not large enough
          to contain the data from the region it is bound to at the
          specified offset.

        * INVALID_VALUE if the dimensions of any texture bound do not
          match the dimensions of the region it is bound to.

        * INVALID_OPERATION if the base level of any texture bound has
          not been defined.

        * INVALID_OPERATION if the internal formats of the textures
          bound to the same stream do not match.

        * INVALID_OPERATION if automatic mipmap generation is enabled
          for any texture bound.

    "If PARTIAL_SUCCESS_NV is returned, the command

        void GetVideoCaptureStream{i,f,d}vNV(uint video_capture_slot,
                                             uint stream, enum pname,
                                             T *params);

    can be used with <pname> set to LAST_VIDEO_CAPTURE_STATUS_NV to
    determine which streams the capture succeeded on.

    "After a successful or partially successful VideoCaptureNV call,
    <sequence_num> will be set to the sequence number of the frame
    captured, beginning at 0 for the first frame after
    BeginVideoCaptureNV was called, and <capture_time> is set to the
    GPU time, in nanoseconds, at which the video capture device began
    capturing the frame.  Note that the time VideoCaptureNV was called
    does not affect the value returned in <capture_time>.
    The time returned is relative to when the video frame first reached
    the capture hardware, not when the GL requested delivery of the
    next captured frame.  After a failed VideoCaptureNV call, the
    values in <sequence_num> and <capture_time> are undefined.

    "When capturing data with a 4:4:4 or 4:4:4:4 layout without using
    one of the destination formats from table 4.13, the captured pixels
    are run through the color conversion process illustrated in figure
    4.4 as they are transferred from the capture device's raw buffers
    to the bound capture objects.

                              |a|
        Output = clamp( M * |b| + Offset )
                              |c|
                              |d|

        Figure 4.4 - Video capture color conversion pipeline.  When the
        stream is in YUVA color space: a = Yi, b = Ui, c = Vi, and
        d = Ai.  When in RGBA color space: a = Gi, b = Bi, c = Ri, and
        d = Ai.

    "<M> and <Offset> are the color conversion matrix and color
    conversion offset for the video capture stream, respectively, and
    <clamp> is an operation that clamps each component of the result to
    the range specified by the corresponding components of <Cmin> and
    <Cmax> for the video capture stream.  Each component of <Cmin> is
    calculated by taking the maximum of the corresponding component in
    the vector specified by VIDEO_COLOR_CONVERSION_MIN_NV, as described
    below, and the minimum value representable by the format of the
    destination surface.  Similarly, each component of <Cmax> is
    calculated by taking the minimum of the corresponding component in
    the vector specified by VIDEO_COLOR_CONVERSION_MAX_NV and the
    maximum value representable by the format.

    "When the destination format uses fixed-point or floating-point
    internal storage, the captured video data will be converted to a
    floating-point representation internally before the color
    conversion step.
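    The figure 4.4 pipeline can be sketched in a few lines of C.  The
    function name is hypothetical; the matrix uses the column-major
    storage order that VIDEO_COLOR_CONVERSION_MATRIX_NV specifies, and
    cmin/cmax stand in for the per-stream <Cmin> and <Cmax> vectors.

        ```c
        /* out = clamp(M * in + offset), clamped per component to
         * [cmin, cmax].  m is a 4x4 matrix stored column-major, so
         * element (row, col) lives at m[col * 4 + row]. */
        static void color_convert(const float m[16], const float offset[4],
                                  const float cmin[4], const float cmax[4],
                                  const float in[4], float out[4])
        {
            for (int row = 0; row < 4; row++) {
                float v = offset[row];
                for (int col = 0; col < 4; col++)
                    v += m[col * 4 + row] * in[col];
                if (v < cmin[row]) v = cmin[row];
                if (v > cmax[row]) v = cmax[row];
                out[row] = v;
            }
        }
        ```

    With the initial state described below (identity matrix, zero
    offset, clamp range [0, 1]) the pipeline simply clamps each input
    component into [0, 1].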
    The following equation describes the conversion:

        f = ( c - Dmin ) / ( Dmax - Dmin )

    "where <c> is the value of the incoming component, <Dmin> and
    <Dmax> are the minimum and maximum values, respectively, that the
    video capture device can generate in its current configuration, and
    <f> is the resulting floating-point value.  Note that <Dmin> and
    <Dmax> refer to the numerical range of the incoming data format.
    They are not affected by any clamping requirements of the captured
    data format.

    "The commands

        void VideoCaptureStreamParameter{i,f,d}vNV(uint video_capture_slot,
                                                   uint stream,
                                                   enum pname,
                                                   const T *params);

    can be used to specify video capture stream parameters.  The value
    or values in <params> are assigned to the video capture stream
    parameter specified by <pname>.  To specify a stream's color
    conversion matrix, set <pname> to VIDEO_COLOR_CONVERSION_MATRIX_NV
    and set <params> to an array of 16 consecutive values, which are
    used as the elements of a 4 x 4 column-major matrix.  If the video
    capture stream's data format does not include an alpha component,
    the fourth column of the matrix is ignored.  The color conversion
    matrix is initialized to a 4 x 4 identity matrix when a video
    capture device is bound.

    "To specify the video capture stream color conversion offset
    vector, set <pname> to VIDEO_COLOR_CONVERSION_OFFSET_NV and
    <params> to an array of 4 consecutive values.  If the video capture
    stream's data format does not include an alpha component, the
    fourth component of the vector is ignored.  Initially the offset
    vector is the zero vector.

    "To specify the video capture stream color conversion clamp values,
    set <pname> to one of VIDEO_COLOR_CONVERSION_MIN_NV or
    VIDEO_COLOR_CONVERSION_MAX_NV and <params> to an array of 4
    consecutive values.
    If the video capture stream's data format does not include an alpha
    component, the fourth component of the vectors is ignored.
    Initially the minimum and maximum values are set to the zero vector
    and <1, 1, 1, 1>, respectively.  Note that care should be taken to
    set the maximum vector correctly when using destination capture
    formats that do not store normalized values, such as integer
    texture formats.

    "To set the orientation of the captured video data, set <pname> to
    VIDEO_CAPTURE_SURFACE_ORIGIN_NV and <params> to LOWER_LEFT or
    UPPER_LEFT.  The default value is LOWER_LEFT, which means the
    bottom left of the captured region of the video image will be at
    texture coordinate <0,0> in any textures bound as capture
    destinations, and will be the first pixel in any buffer objects
    bound as capture destinations.  If UPPER_LEFT is used as the
    origin, the image will be mirrored vertically.  If <params>
    contains any value other than LOWER_LEFT or UPPER_LEFT,
    INVALID_ENUM is generated."

    If NV_present_video is present, section 4.5 "Displaying Buffers"
    becomes section 4.6.

Additions to Chapter 5 of the 1.1 Specification (Special Functions)

    In section 5.4, "Display Lists", add the following to the list of
    commands that are not compiled into display lists:

    "Video capture commands: BeginVideoCaptureNV,
    BindVideoCaptureStreamBufferNV, BindVideoCaptureStreamTextureNV,
    EndVideoCaptureNV, VideoCaptureNV, and
    VideoCaptureStreamParameter{i,f,d}vNV."

Additions to Chapter 6 of the 1.1 Specification (State and State
Requests)

    Add a new section after Section 6.1.14, "Shader and Program
    Queries":

    "Section 6.1.15, Video Capture State Queries

    "The command

        void GetVideoCaptureivNV(uint video_capture_slot, enum pname,
                                 int *params);

    returns properties of the video capture device bound to
    <video_capture_slot> in <params>.
    The parameter value to return is specified in <pname>.

    "If <pname> is NEXT_VIDEO_CAPTURE_BUFFER_STATUS_NV, TRUE is
    returned if VideoCaptureNV will not block and FALSE is returned
    otherwise.  If <pname> is NUM_VIDEO_CAPTURE_STREAMS_NV, the number
    of available video capture streams on the device bound to
    <video_capture_slot> is returned.

    "The command

        void GetVideoCaptureStream{i,f,d}vNV(uint video_capture_slot,
                                             uint stream, enum pname,
                                             T *params);

    returns properties of an individual video stream on the video
    capture device bound to <video_capture_slot> in <params>.  The
    parameter value to return is specified in <pname>.

    "If <pname> is LAST_VIDEO_CAPTURE_STATUS_NV, SUCCESS_NV will be
    returned if the last call to VideoCaptureNV captured valid pixel
    data for the entire frame on this stream.  Otherwise, FAILURE_NV
    will be returned.  If <pname> is VIDEO_BUFFER_INTERNAL_FORMAT_NV,
    the internal format used when capturing to a buffer object is
    returned.  Initially the internal format is RGBA8.  If <pname> is
    VIDEO_BUFFER_PITCH_NV, the pitch of the image data captured when a
    buffer object is bound to this stream is returned.  The pitch
    depends on the internal format, so it should be queried whenever
    the internal format is changed.  If <pname> is
    VIDEO_COLOR_CONVERSION_MATRIX_NV, an array of 16 values
    representing the column-major color conversion matrix is returned.
    Initially this matrix is the identity matrix.  If <pname> is
    VIDEO_COLOR_CONVERSION_OFFSET_NV, 4 values representing the color
    conversion offset vector are returned.  Initially the offset vector
    is [ 0 0 0 0 ].  If <pname> is VIDEO_COLOR_CONVERSION_MIN_NV or
    VIDEO_COLOR_CONVERSION_MAX_NV, 4 values representing the color
    conversion minimum or maximum vectors are returned, respectively.
    Initially the minimum is [ 0 0 0 0 ] and the maximum is
    [ 1 1 1 1 ].
    If <pname> is VIDEO_CAPTURE_FRAME_WIDTH_NV,
    VIDEO_CAPTURE_FRAME_HEIGHT_NV, VIDEO_CAPTURE_FIELD_UPPER_HEIGHT_NV,
    or VIDEO_CAPTURE_FIELD_LOWER_HEIGHT_NV, the frame width, frame
    height, upper field height, or lower field height, respectively, of
    the data the bound video capture device is configured to capture is
    returned.  If <pname> is VIDEO_CAPTURE_SURFACE_ORIGIN_NV,
    LOWER_LEFT or UPPER_LEFT is returned.  If <pname> is
    VIDEO_CAPTURE_TO_422_SUPPORTED_NV, TRUE is returned if using one of
    the 4:2:2 formats from table 4.13 when capturing to buffer objects
    on this stream is supported.  Otherwise, FALSE is returned.


Additions to Chapter 2 of the GLX 1.4 Specification (GLX Operation)

    None

Additions to Chapter 3 of the GLX 1.4 Specification (Functions and
Errors)

    Modify table 3.5:

        Attribute                       Type  Description
        ------------------------------  ----  ------------------------------
        GLX_FBCONFIG_ID                 XID   XID of GLXFBConfig associated
                                              with context
        GLX_RENDER_TYPE                 int   type of rendering supported
        GLX_SCREEN                      int   screen number
        GLX_NUM_VIDEO_CAPTURE_SLOTS_NV  int   number of video capture slots
                                              this context supports

    Add a section between Sections 3.3.10 and 3.3.11:

    "3.3.11a Video Capture Devices

    "GLX video capture devices can be used to stream video data from an
    external source directly into GL objects for use in rendering or
    readback.  Use

        GLXVideoCaptureDeviceNV *
        glXEnumerateVideoCaptureDevicesNV(Display *dpy,
                                          int screen,
                                          int *nElements);

    to generate an array of video capture devices.  The number of
    elements in the array is returned in <nElements>.  Each element of
    the array is a video capture device on <screen>.  Use XFree to free
    the memory returned by glXEnumerateVideoCaptureDevicesNV.

    "GLX video capture devices are abstract objects that refer to a
    physical capture device.
    Each physical capture device has a unique ID that can be used to
    identify it when coordinating device usage and setup with other
    APIs.  To query the unique ID of the physical device backing a GLX
    video capture device, use

        int glXQueryVideoCaptureDeviceNV(Display *dpy,
                                         GLXVideoCaptureDeviceNV device,
                                         int attribute, int *value);

    where <attribute> must be GLX_UNIQUE_ID_NV.  On success, the unique
    ID will be returned in <value> and the function will return
    Success.  If <device> does not refer to a video capture device,
    GLX_BAD_VALUE will be returned.  If <attribute> does not name a
    video capture device attribute, GLX_BAD_ATTRIBUTE will be returned.

    "Before using a video capture device, it must be locked.  Once a
    video capture device is locked by a client, no other client can
    lock a video capture device with the same unique ID until the lock
    is released or the connection between the client holding the lock
    and the X server is broken.  To lock a video capture device to a
    display connection, use

        void glXLockVideoCaptureDeviceNV(Display *dpy,
                                         GLXVideoCaptureDeviceNV device);

    "If <device> does not name a video capture device, BadValue is
    generated.  If <device> is already locked, BadMatch is generated.

    "After successfully locking a video capture device, use

        int glXBindVideoCaptureDeviceNV(Display *dpy,
                                        unsigned int video_capture_slot,
                                        GLXVideoCaptureDeviceNV device);

    to bind it to the capture slot <video_capture_slot> in the current
    context.  If the slot is already bound, the device it is bound to
    will be unbound first.  To unbind a video capture device, bind
    device None to the video capture slot the device is bound to.  If
    the bind is successful, Success is returned.  If there is no
    current context, GLX_BAD_CONTEXT is returned or GLXBadContext is
    generated.
    If <video_capture_slot> is not a valid capture slot on the current
    context, BadMatch is generated.  If <device> does not name a video
    capture device, BadValue is generated.  If <device> is already
    bound to a video capture slot, GLX_BAD_VALUE is returned.  If
    <device> is not locked by <dpy>, BadMatch is generated.

    "GLX does not provide a mechanism to configure the video capture
    process.  It is expected that device vendors provide a vendor-
    specific mechanism for configuring or detecting properties such as
    the incoming video signal and data format.  However, GLX does
    expect that devices are fully configured before
    glXBindVideoCaptureDeviceNV is called.  Changing device properties
    that affect the format of the captured data will cause the results
    of video capture to be undefined.

    "When finished capturing data on a locked video capture device, use

        void glXReleaseVideoCaptureDeviceNV(Display *dpy,
                                            GLXVideoCaptureDeviceNV device);

    to unlock it.  The application must unbind the device before
    releasing it, or BadMatch will be generated.  If <device> does not
    name a video capture device, BadValue is generated.  If <device> is
    not locked by <dpy>, BadMatch is generated."
Additions to Chapter 4 of the GLX 1.4 Specification (Encoding on the X
Byte Stream)

    None

Additions to Chapter 5 of the GLX 1.4 Specification (Extending OpenGL)

    None

Additions to Chapter 6 of the GLX 1.4 Specification (GLX Versions)

    None

GLX Protocol

    BindVideoCaptureDeviceNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       5               request length
        4       1412            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          device
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       0               reply length
        4       CARD32          status
        20                      unused

    EnumerateVideoCaptureDevicesNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       4               request length
        4       1413            vendor specific opcode
        4                       unused
        4       CARD32          screen
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       n               reply length, n = 2 * d * p
        4       CARD32          num_devices (d)
        4       CARD32          num_properties (p)
        16                      unused
        4*n     LISTofATTRIBUTE_PAIR  attribute, value pairs

    LockVideoCaptureDeviceNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       4               request length
        4       1414            vendor specific opcode
        4                       unused
        4       CARD32          device

    ReleaseVideoCaptureDeviceNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       4               request length
        4       1415            vendor specific opcode
        4                       unused
        4       CARD32          device

    BeginVideoCaptureNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       4               request length
        4       1400            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot

    BindVideoCaptureStreamBufferNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       8               request length
        4       1401            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        8       CARD64          offset
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            frame_region

    BindVideoCaptureStreamTextureNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       8               request length
        4       1402            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            frame_region
        4       ENUM            target
        4       CARD32          texture

    EndVideoCaptureNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       4               request length
        4       1403            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot

    GetVideoCaptureivNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       5               request length
        4       1404            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       ENUM            pname
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       m               reply length, m = (n == 1 ? 0 : n)
        4                       unused
        4       CARD32          n

        if (n == 1) this follows:

        4       INT32           params
        12                      unused

        otherwise this follows:

        16                      unused
        n*4     LISTofINT32     params

    GetVideoCaptureStreamivNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       6               request length
        4       1405            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            pname
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       m               reply length, m = (n == 1 ? 0 : n)
        4                       unused
        4       CARD32          n

        if (n == 1) this follows:

        4       INT32           params
        12                      unused

        otherwise this follows:

        16                      unused
        n*4     LISTofINT32     params

    GetVideoCaptureStreamfvNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       6               request length
        4       1406            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            pname
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       m               reply length, m = (n == 1 ? 0 : n)
        4                       unused
        4       CARD32          n

        if (n == 1) this follows:

        4       FLOAT32         params
        12                      unused

        otherwise this follows:

        16                      unused
        n*4     LISTofFLOAT32   params

    GetVideoCaptureStreamdvNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       6               request length
        4       1407            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            pname
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       m               reply length, m = (n == 1 ? 0 : n*2)
        4                       unused
        4       CARD32          n

        if (n == 1) this follows:

        8       FLOAT64         params
        8                       unused

        otherwise this follows:

        16                      unused
        n*8     LISTofFLOAT64   params

    VideoCaptureNV
        1       CARD8           opcode (X assigned)
        1       17              GLX opcode (glXVendorPrivateWithReply)
        2       4               request length
        4       1408            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
      =>
        1       CARD8           reply
        1                       unused
        2       CARD16          sequence number
        4       0               reply length
        4                       unused
        4                       unused
        8       CARD64          capture_time
        4       CARD32          sequence_num
        4                       unused

    VideoCaptureStreamParameterivNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       6+n             request length
        4       1409            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            pname
                0x9029  n=16    GL_VIDEO_COLOR_CONVERSION_MATRIX_NV
                0x902A  n=4     GL_VIDEO_COLOR_CONVERSION_MAX_NV
                0x902B  n=4     GL_VIDEO_COLOR_CONVERSION_MIN_NV
                0x902C  n=4     GL_VIDEO_COLOR_CONVERSION_OFFSET_NV
                0x902D  n=1     GL_VIDEO_BUFFER_INTERNAL_FORMAT_NV
                0x903C  n=1     GL_VIDEO_CAPTURE_SURFACE_ORIGIN_NV
                else    n=0     command is erroneous
        4*n     LISTofINT32     params

    VideoCaptureStreamParameterfvNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       6+n             request length
        4       1410            vendor specific opcode
        4       GLX_CONTEXT_TAG context tag
        4       CARD32          video_capture_slot
        4       CARD32          stream
        4       ENUM            pname
                0x9029  n=16    GL_VIDEO_COLOR_CONVERSION_MATRIX_NV
                0x902A  n=4     GL_VIDEO_COLOR_CONVERSION_MAX_NV
                0x902B  n=4     GL_VIDEO_COLOR_CONVERSION_MIN_NV
                0x902C  n=4     GL_VIDEO_COLOR_CONVERSION_OFFSET_NV
                0x902D  n=1     GL_VIDEO_BUFFER_INTERNAL_FORMAT_NV
                0x903C  n=1     GL_VIDEO_CAPTURE_SURFACE_ORIGIN_NV
                else    n=0     command is erroneous
        4*n     LISTofFLOAT32   params

    VideoCaptureStreamParameterdvNV
        1       CARD8           opcode (X assigned)
        1       16              GLX opcode (glXVendorPrivate)
        2       6+n*2           request length
4 1411 vendor specific opcode 1002 4 GLX_CONTEXT_TAG context tag 1003 4 CARD32 video_cpture_slot 1004 4 CARD32 stream 1005 4 ENUM pname 1006 0x9029 n=16 GL_VIDEO_COLOR_CONVERSION_MATRIX_NV 1007 0x902A n=4 GL_VIDEO_COLOR_CONVERSION_MAX_NV 1008 0x902B n=4 GL_VIDEO_COLOR_CONVERSION_MIN_NV 1009 0x902C n=4 GL_VIDEO_COLOR_CONVERSION_OFFSET_NV 1010 0x902D n=1 GL_VIDEO_BUFFER_INTERNAL_FORMAT_NV 1011 0x902D n=1 GL_VIDEO_CAPTURE_SURFACE_ORIGIN_NV 1012 else n=0 command is erroneous 1013 8*n LISTofFLOAT64 params 1014 1015Additions to the WGL Specification 1016 1017 Modify section "Querying WGL context attributes" from NV_present_video 1018 1019 Replace the last two sentences of the last paragraph in the section 1020 with: 1021 1022 "If <iAttribute> is WGL_NUM_VIDEO_SLOTS_NV, the number of valid video 1023 output slots in the current context is returned. If <iAttribute> is 1024 WGL_NUM_VIDEO_CAPTURE_SLOTS_NV, the number of valid video capture 1025 slots in the current context is returned." 1026 1027 Add a new section "Video Capture Devices" 1028 1029 "WGL video capture devices can be used to stream video data from an 1030 external source directly into GL objects for use in rendering or 1031 readback. Use 1032 1033 UINT wglEnumerateVideoCaptureDevicesNV(HDC hDc, 1034 HVIDEOINPUTDEVICENV *phDeviceList); 1035 1036 "to query the available video capture devices on <hDc>. The number 1037 of devices is returned, and if phDeviceList is not NULL, an array of 1038 valid device handles is returned in it. The command will assume 1039 <phDeviceList> is large enough to hold all available handles so the 1040 application should take care to first query the number of devices 1041 and allocate an appropriately sized array. 1042 1043 "WGL video capture device handles refer to a physical capture 1044 device. Each physical capture device will have a unique ID that can 1045 be used to identify it when coordinating device usage and setup with 1046 other APIs. 
To query the unique ID of the physical device backing a
    WGL video capture device handle, use

        BOOL wglQueryVideoCaptureDeviceNV(HDC hDc,
                                          HVIDEOINPUTDEVICENV hDevice,
                                          int iAttribute, int *piValue);

    "where <iAttribute> must be WGL_UNIQUE_ID_NV.  On success, the
    unique ID is returned in <piValue>.

    "Before using a video capture device, it must be locked.  Once a
    video capture device is locked by a process, no other process can
    lock a video capture device with the same unique ID until the lock
    is released or the process ends.  To lock a video capture device,
    use

        BOOL wglLockVideoCaptureDeviceNV(HDC hDc,
                                         HVIDEOINPUTDEVICENV hDevice);

    "After successfully locking a video capture device, use

        BOOL wglBindVideoCaptureDeviceNV(UINT uVideoSlot,
                                         HVIDEOINPUTDEVICENV hDevice);

    "to bind it to the capture slot <uVideoSlot> in the current
    context.  If the slot is already bound, the device it is bound to
    will be unbound first.  It is an error to bind an already-bound
    device to a different slot.  To unbind a video capture device, bind
    device NULL to the video capture slot the device is bound to.

    "WGL does not provide a mechanism to configure the video capture
    process.  It is expected that device vendors provide a vendor-
    specific mechanism for configuring or detecting properties such as
    the incoming video signal and data format.  However, WGL does expect
    that devices are fully configured before wglBindVideoCaptureDeviceNV
    is called.  Changing device properties that affect the format of the
    captured data will cause the results of video capture to be
    undefined.

    "When finished capturing data on a locked video capture device, use

        BOOL wglReleaseVideoCaptureDeviceNV(HDC hDc,
                                            HVIDEOINPUTDEVICENV hDevice);

    "to unlock it.  The application must unbind the device before
    releasing it; it is an error to release a device that is still
    bound.

Errors

    INVALID_VALUE is generated if <video_capture_slot> is less than 1 or
    greater than the number of video capture slots supported by the
    current context when calling BeginVideoCaptureNV,
    BindVideoCaptureStreamBufferNV, BindVideoCaptureStreamTextureNV,
    EndVideoCaptureNV, GetVideoCaptureivNV,
    GetVideoCaptureStream{i,f,d}vNV, VideoCaptureNV, or
    VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_OPERATION is generated if there is no video capture device
    bound to the slot specified by <video_capture_slot> when calling
    BeginVideoCaptureNV, BindVideoCaptureStreamBufferNV,
    BindVideoCaptureStreamTextureNV, EndVideoCaptureNV,
    GetVideoCaptureivNV, GetVideoCaptureStream{i,f,d}vNV,
    VideoCaptureNV, or VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_OPERATION is generated if BeginVideoCaptureNV is called on a
    video capture slot that is already capturing or if EndVideoCaptureNV
    is called on a video capture slot that is not capturing.

    INVALID_VALUE is generated if <stream> is greater than the number of
    streams provided by the currently bound video capture device when
    calling BindVideoCaptureStreamBufferNV,
    BindVideoCaptureStreamTextureNV, GetVideoCaptureStream{i,f,d}vNV,
    or VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_ENUM is generated if <frame_region> is not one of FRAME_NV,
    FIELD_UPPER_NV, or FIELD_LOWER_NV when calling
    BindVideoCaptureStreamBufferNV or BindVideoCaptureStreamTextureNV.
    INVALID_OPERATION is generated if <target> is a valid texture target
    but not TEXTURE_2D or TEXTURE_RECTANGLE when calling
    BindVideoCaptureStreamTextureNV.

    INVALID_ENUM is generated if <target> does not refer to a texture
    target supported by the GL when calling
    BindVideoCaptureStreamTextureNV.

    INVALID_VALUE is generated if <texture> is not 0 and does not name
    an existing texture object when calling
    BindVideoCaptureStreamTextureNV.

    INVALID_ENUM is generated if <pname> does not name a valid video
    capture slot parameter when calling GetVideoCaptureivNV, a valid
    video capture stream parameter when calling
    GetVideoCaptureStream{i,f,d}vNV, or a settable video capture stream
    parameter when calling VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_ENUM is generated if the buffer internal format is not a
    supported texture internal format or one of the values in table
    4.13 when calling VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_ENUM is generated if the buffer internal format is not
    a value in table 4.13 and is not color renderable when calling
    VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_ENUM is generated if the buffer internal format is a value
    in table 4.13 and the format layout is not compatible with the
    effective capture data format as described in table 4.14 when
    calling VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_ENUM is generated if <params> does not contain one of
    LOWER_LEFT or UPPER_LEFT when <pname> is
    VIDEO_CAPTURE_SURFACE_ORIGIN_NV when calling
    VideoCaptureStreamParameter{i,f,d}vNV.

    INVALID_OPERATION is generated if any stream has a mixture of
    buffer objects and texture objects bound when VideoCaptureNV is
    called.
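For illustration only, the <target> and <texture> checks above for
BindVideoCaptureStreamTextureNV can be expressed as a small validator.
This is a hypothetical sketch, not part of the extension; the function
name and the string-valued enums are inventions for readability.

```python
# Hypothetical sketch of the error checks the spec lists for
# BindVideoCaptureStreamTextureNV.  Returns the GL error an
# implementation would generate, or None on success.

VALID_TEXTURE_TARGETS = {"TEXTURE_1D", "TEXTURE_2D", "TEXTURE_3D",
                         "TEXTURE_RECTANGLE", "TEXTURE_CUBE_MAP"}
CAPTURE_TARGETS = {"TEXTURE_2D", "TEXTURE_RECTANGLE"}

def bind_stream_texture_error(target, texture, existing_textures):
    """Mirror the <target>/<texture> error conditions listed above."""
    if target not in VALID_TEXTURE_TARGETS:
        return "INVALID_ENUM"        # not a texture target the GL supports
    if target not in CAPTURE_TARGETS:
        return "INVALID_OPERATION"   # valid target, but not 2D/RECTANGLE
    if texture != 0 and texture not in existing_textures:
        return "INVALID_VALUE"       # nonzero name with no texture object
    return None                      # binding would succeed
```

Note the ordering: an unrecognized target is INVALID_ENUM, while a
recognized but unsupported target is INVALID_OPERATION, matching the
two separate error paragraphs above.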
    INVALID_VALUE is generated if any buffer objects bound are not large
    enough to contain the data that would be captured from the region
    they are bound to at the offset specified when VideoCaptureNV is
    called.

    INVALID_VALUE is generated if the dimensions of any textures bound
    to the video capture slot do not match the dimensions of the region
    they are bound to when VideoCaptureNV is called.

    INVALID_OPERATION is generated if the base level of any textures
    bound to the video capture slot has not been defined when
    VideoCaptureNV is called.

    INVALID_OPERATION is generated if the internal formats of all
    textures bound to a given video capture stream do not match when
    VideoCaptureNV is called.

    INVALID_OPERATION is generated if the format of any textures bound
    to the video capture slot is not color renderable when
    VideoCaptureNV is called.

    INVALID_OPERATION is generated if automatic mipmap generation is
    enabled on any of the textures bound to the video capture slot when
    VideoCaptureNV is called.

    INVALID_OPERATION is generated if one field of a stream has an
    object bound to it but the other field does not when VideoCaptureNV
    is called.

    INVALID_VALUE is generated when VideoCaptureNV is called if the
    <offset> provided when calling BindVideoCaptureStreamBufferNV is not
    a multiple of the size, in bytes, of a pixel in the internal format
    of the capture buffer.

New State

    Add a new table, between tables 6.44 and 6.45:

                                             Get         Initial
    Get Value               Type Command     Value   Description  Sec. Attribute
    ----------------------- ---- ----------- ------- ------------ ---- ---------
    VIDEO_BUFFER_BINDING_NV Z+   GetIntegerv 0       Video buffer 4.5  -
                                                     binding

    Table 6.45.  Video Capture State

    Add a new table, after the above:

                                                           Initial
    Get Value                           Type Get Command   Value   Description Sec. Attribute
    ----------------------------------- ---- ------------- ------- ----------- ---- ---------
    NEXT_VIDEO_CAPTURE_BUFFER_STATUS_NV B    GetVideo-     FALSE   Status of   4.5  -
                                             Captureiv             next video
                                                                   capture
                                                                   buffer.

    Table 6.46.  Video Capture Slot State

    Add a new table, after the above:

                                                                 Initial
    Get Value                           Type Get Command         Value        Description  Sec. Attribute
    ----------------------------------- ---- ------------------- ------------ ------------ ---- ---------
    LAST_VIDEO_CAPTURE_STATUS_NV        Z3   GetVideoCapture-    SUCCESS_NV   Status of    4.5  -
                                             Streamiv                         last video
                                                                              operation

    VIDEO_BUFFER_INTERNAL_FORMAT_NV     Z+   GetVideoCapture-    See sec. 4.5 Format of    4.5  -
                                             Streamiv                         video
                                                                              capture
                                                                              buffers
                                                                              bound to
                                                                              this stream

    VIDEO_BUFFER_PITCH_NV               Z+   GetVideoCapture-    See sec. 4.5 Pitch of     4.5  -
                                             Streamiv                         video
                                                                              capture
                                                                              buffers
                                                                              bound to
                                                                              this stream

    VIDEO_CAPTURE_FRAME_WIDTH_NV        Z+   GetVideoCapture-    See sec. 4.5 width of     4.5  -
                                             Streamiv                         a frame or
                                                                              field on
                                                                              currently
                                                                              bound video
                                                                              capture
                                                                              device.

    VIDEO_CAPTURE_FRAME_HEIGHT_NV       Z+   GetVideoCapture-    See sec. 4.5 height of    4.5  -
                                             Streamiv                         a full frame
                                                                              on currently
                                                                              bound video
                                                                              capture
                                                                              device.

    VIDEO_CAPTURE_FIELD_UPPER_HEIGHT_NV Z+   GetVideoCapture-    See sec. 4.5 height of    4.5  -
                                             Streamiv                         upper field
                                                                              on currently
                                                                              bound video
                                                                              capture
                                                                              device.

    VIDEO_CAPTURE_FIELD_LOWER_HEIGHT_NV Z+   GetVideoCapture-    See sec. 4.5 height of    4.5  -
                                             Streamiv                         lower field
                                                                              on currently
                                                                              bound video
                                                                              capture
                                                                              device.

    VIDEO_CAPTURE_SURFACE_ORIGIN_NV     Z2   GetVideoCapture-    LOWER_LEFT   orientation  4.5  -
                                             Streamiv                         of captured
                                                                              video image

    VIDEO_CAPTURE_TO_422_SUPPORTED_NV   B    GetVideoCapture-    See sec. 4.5 support for  4.5  -
                                             Streamiv                         4:2:2 or
                                                                              4:2:2:4
                                                                              capture.

    VIDEO_COLOR_CONVERSION_MATRIX_NV    M4   GetVideoCapture-    Identity     Color        4.5  -
                                             Streamfv            Matrix       Conversion
                                                                              Matrix

    VIDEO_COLOR_CONVERSION_MAX_NV       R4   GetVideoCapture-    <1,1,1,1>    Color        4.5  -
                                             Streamfv                         Conversion
                                                                              Clamp Max

    VIDEO_COLOR_CONVERSION_MIN_NV       R4   GetVideoCapture-    <0,0,0,0>    Color        4.5  -
                                             Streamfv                         Conversion
                                                                              Clamp Min

    VIDEO_COLOR_CONVERSION_OFFSET_NV    R4   GetVideoCapture-    <0,0,0,0>    Color        4.5  -
                                             Streamfv                         Conversion
                                                                              Offset

    -                                   Z+   -                   0            name of      4.5  -
                                                                              object bound
                                                                              to frame
                                                                              or upper
                                                                              field

    -                                   Z+   -                   0            name of      4.5  -
                                                                              object bound
                                                                              to lower
                                                                              field

    -                                   B    -                   -            Is a frame   4.5  -
                                                                              or fields
                                                                              bound.

    Table 6.47.  Video Capture Stream State

New Implementation Dependent State

    (Table 6.50, p. 388)

                                                        Initial
    Get Value                    Type Get Command       Value        Description   Sec. Attribute
    ---------------------------- ---- ----------------- ------------ ------------- ---- ---------
    NUM_VIDEO_CAPTURE_STREAMS_NV Z+   GetVideoCaptureiv See Sec. 4.5 Number of     4.5  -
                                                                     video capture
                                                                     streams on
                                                                     this video
                                                                     capture slot

Usage Examples:

    This example demonstrates binding a video capture device to a
    GLX context.

        GLXVideoCaptureDeviceNV *devices;
        GLXVideoCaptureDeviceNV device;
        int numDevices;

        devices = glXEnumerateVideoCaptureDevicesNV(dpy, 0,
                                                    &numDevices);

        // Assumes at least 1 device is available and is not locked.
        device = devices[0];
        XFree(devices);

        glXLockVideoCaptureDeviceNV(dpy, device);

        glXBindVideoCaptureDeviceNV(dpy, 1, device);

        BeginVideoCaptureNV(1);

        while (use_device) {
            // Do main capture loop here.
        }

        EndVideoCaptureNV(1);

        // Unbind and release the capture device.
        glXBindVideoCaptureDeviceNV(dpy, 1, None);

        glXReleaseVideoCaptureDeviceNV(dpy, device);


    This example demonstrates capturing 1080p video data from two
    sources, streaming the first to system memory, and displaying the
    second with the NV_present_video extension.  It assumes video
    capture and video output devices are already bound to the current
    context.

        uint video_out_buffer;
        uint video_out_texture;
        int buffer_pitch;
        int video_buffer_format = RGB8;

        // Create a video output buffer object.
        GenBuffersARB(1, &video_out_buffer);

        // Create and init a video output texture object.
        GenTextures(1, &video_out_texture);
        BindTexture(TEXTURE_2D, video_out_texture);
        TexImage2D(TEXTURE_2D, 0, RGB8, 1920, 1080, 0, RGB,
                   UNSIGNED_BYTE, NULL);

        // Set up the outputs for stream 0.
        // Set the buffer object data format.
        VideoCaptureStreamParameterivNV(1, 0,
                                        VIDEO_BUFFER_INTERNAL_FORMAT_NV,
                                        &video_buffer_format);

        // Get the video buffer pitch.
        GetVideoCaptureStreamivNV(1, 0, VIDEO_BUFFER_PITCH_NV,
                                  &buffer_pitch);

        // Allocate space in the buffer object.
        BindBufferARB(VIDEO_BUFFER_NV, video_out_buffer);
        BufferDataARB(VIDEO_BUFFER_NV, buffer_pitch * 1080, NULL,
                      STREAM_READ_ARB);

        // Bind the buffer object to the video capture stream.
        BindVideoCaptureStreamBufferNV(1, 0, FRAME_NV, 0);

        // Bind the outputs for stream 1.
        BindVideoCaptureStreamTextureNV(1, 1, FRAME_NV, TEXTURE_2D,
                                        video_out_texture);

        // Start the capture process.
        BeginVideoCaptureNV(1);

        // Loop capturing data.
        while (...) {
            uint64EXT timestamp;
            uint sequence_num;

            // Capture the video to a buffer object.
            VideoCaptureNV(1, &sequence_num, &timestamp);

            // Pull stream 0's video data back to local memory.
            BindBufferARB(VIDEO_BUFFER_NV, video_out_buffer);
            GetBufferSubDataARB(VIDEO_BUFFER_NV, 0, buffer_pitch * 1080,
                                someMallocMem1);

            // Present stream 1's video data using NV_present_video.
            PresentFrameKeyedNV(1, 0, 0, 0, FRAME_NV,
                                TEXTURE_2D, video_out_texture, 0,
                                NONE, 0, 0);

            // Do something with the data in someMallocMem1 here,
            // such as save it to disk.
        }

        // Pause/Stop capturing.
        EndVideoCaptureNV(1);

Issues

    Should there be separate bind points for each input stream rather
    than having BindVideoCaptureStreamBufferNV?

        [RESOLVED] No.  BindVideoCaptureStreamBufferNV makes it simpler
        to use an implementation-dependent number of streams and
        reduces the number of tokens introduced.  The downside is one
        extra step for the application at setup time, and possibly one
        extra step in the loop.

    Should VideoCaptureNV return values, making it synchronize the
    client and server, or generate asynchronous query results?

        [RESOLVED] VideoCaptureNV will return a status code and other
        capture statistics immediately.  The application will likely
        need these values to decide how to use the captured data.

    How should video capture devices be presented to the application?

        [RESOLVED] In GLX, video capture devices are X resources with
        their own XID.
Device enumeration returns a list of XIDs to
        the application.  The application can query the unique ID of
        the underlying physical device associated with the XID.

        In WGL, handles to the physical devices are returned.

        There may be many X resources or Windows handles referring to
        the same video device, but only one X client or handle at a
        time can own the physical device.  This is accomplished with
        the lock and release entry points.

    How does the application determine whether a given capture
    operation returned valid data?

        [RESOLVED] VideoCaptureNV will have an enum return value that
        specifies the overall status of the capture.  It will be able
        to indicate success, partial success (some streams captured
        valid data), or failure (no streams captured valid data).  The
        user can then query the individual streams to determine if
        they captured valid data on the last capture call.

    The capture process involves a colorspace transformation for which
    the user can specify a conversion matrix.  Should this matrix be
    configurable per stream, or is one matrix per video capture device
    sufficient?

        [RESOLVED] Per-stream matrices will be used.  This could be
        useful if the devices connected to each stream have different
        color characteristics and therefore each need different
        conversion matrices.

    Should there be a way to specify color clamp values for each stream
    and each color component?

        [RESOLVED] Yes.  Some video specifications require color data
        to be in a certain range, so clamping is needed.

    How do the color conversion parameters affect captured data when
    using a 4:2:2 capture format?

        [RESOLVED] The color conversion step is skipped when the
        destination format is listed in table 4.13 or the effective
        capture data format layout isn't 4:4:4 or 4:4:4:4.
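The normative definition of the color conversion lives in section 4.5,
which is not reproduced in this excerpt.  Based on the parameter names
(VIDEO_COLOR_CONVERSION_MATRIX/OFFSET/MIN/MAX_NV) and their initial
values in table 6.47, the per-pixel operation appears to be a matrix
multiply plus an offset, followed by a per-component clamp.  A sketch
under that assumption:

```python
# Hypothetical sketch of the per-stream color conversion suggested by
# the VIDEO_COLOR_CONVERSION_* parameter names; the normative
# definition is in section 4.5 of the specification.

def convert_pixel(m, offset, cmin, cmax, rgba):
    """Apply a 4x4 matrix, add an offset, then clamp per component."""
    out = []
    for row in range(4):
        v = sum(m[row][col] * rgba[col] for col in range(4)) + offset[row]
        out.append(min(max(v, cmin[row]), cmax[row]))
    return out

# Initial state from table 6.47: identity matrix, zero offset,
# clamp range [0, 1] per component.
IDENTITY = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
```

With the initial state, conversion leaves in-range data unchanged and
clamps out-of-range components, which is consistent with the clamp
rationale given in the issue above.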
    Does video capture slot state belong to the context or the video
    capture device?

        [RESOLVED] The video capture state lives in the context.
        Setting video capture slot state does not affect the video
        capture device itself.  Any video capture slot state that
        affects the video capture hardware will be applied to the
        hardware when the device is bound to the slot.

    What happens to video capture slot state when a device is unbound?
    That is, does video capture slot state persist across device
    bindings?

        [RESOLVED] Since much of the video capture slot state depends
        on the currently bound device, the state should be reset to
        default values whenever a device is bound.

    Is video capture slot state defined when no device is bound to the
    slot?  Should querying video capture slot state when no device is
    bound generate an error?

        [RESOLVED] Much of the state only has meaning when a device is
        bound.  For example, the number of streams depends on how many
        streams the bound device exposes.  Because of this, querying
        video capture state on a slot with no bound device should
        generate an INVALID_OPERATION error.  This operation would
        essentially be the video capture equivalent of making GL calls
        without a current context.

    What should the default values for all the video capture per-slot
    and per-stream state be?

        [RESOLVED] Initial values have been specified in the spec and
        the state tables.


Revision History

    Fifth external draft: 2011/7/8
        - Fixed video slots used in second usage example

    Fourth external draft: 2009/9/28
        - Added "New Types" section

    Third external draft: 2009/9/8

    Second external draft: 2009/7/31

    First external draft: 2009/2/23