Searched full:frame (Results 1 – 25 of 388) sorted by relevance
| /Documentation/userspace-api/media/v4l/ |
| D | vidioc-enum-frameintervals.rst | 13 VIDIOC_ENUM_FRAMEINTERVALS - Enumerate frame intervals 30 that contains a pixel format and size and receives a frame interval. 35 This ioctl allows applications to enumerate all frame intervals that the 36 device supports for the given pixel format and frame size. 38 The supported pixel formats and frame sizes can be obtained by using the 43 depend on the type of frame intervals the device supports. Here are the 67 the ``type`` field to determine the type of frame interval enumeration 69 does it make sense to increase the index value to receive more frame 74 The order in which the frame intervals are returned has no 76 default frame intervals. [all …]
|
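The VIDIOC_ENUM_FRAMEINTERVALS entry above describes the usual enumeration loop: fill in a pixel format and frame size, then raise the index until the driver returns EINVAL. A minimal sketch of that loop, assuming ``fd`` is an already-opened capture device and handling only the discrete case::

    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    static void enum_frame_intervals(int fd, __u32 pixfmt, __u32 width, __u32 height)
    {
            struct v4l2_frmivalenum fival;

            memset(&fival, 0, sizeof(fival));
            fival.pixel_format = pixfmt;
            fival.width = width;
            fival.height = height;

            /* Raise the index until the driver reports EINVAL. */
            for (fival.index = 0;
                 ioctl(fd, VIDIOC_ENUM_FRAMEINTERVALS, &fival) == 0;
                 fival.index++) {
                    if (fival.type == V4L2_FRMIVAL_TYPE_DISCRETE)
                            printf("%u/%u s per frame\n",
                                   fival.discrete.numerator,
                                   fival.discrete.denominator);
                    else    /* STEPWISE or CONTINUOUS: only index 0 is valid */
                            break;
            }
    }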
| D | vidioc-enum-framesizes.rst | 13 VIDIOC_ENUM_FRAMESIZES - Enumerate frame sizes 30 that contains an index and pixel format and receives a frame width 36 This ioctl allows applications to enumerate all frame sizes (i. e. width 44 depend on the type of frame sizes the device supports. Here are the 68 the ``type`` field to determine the type of frame size enumeration the 70 it make sense to increase the index value to receive more frame sizes. 74 The order in which the frame sizes are returned has no special 81 other ioctl calls while it runs the frame size enumeration. 99 - Width of the frame [pixel]. 102 - Height of the frame [pixel]. [all …]
|
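The frame size enumeration from vidioc-enum-framesizes.rst follows the same pattern. A short sketch using the same headers as the loop above, with ``fd`` an open capture device and V4L2_PIX_FMT_YUYV chosen purely as an example format::

    struct v4l2_frmsizeenum fsize;

    memset(&fsize, 0, sizeof(fsize));
    fsize.pixel_format = V4L2_PIX_FMT_YUYV;        /* example format only */

    for (fsize.index = 0;
         ioctl(fd, VIDIOC_ENUM_FRAMESIZES, &fsize) == 0;
         fsize.index++) {
            if (fsize.type == V4L2_FRMSIZE_TYPE_DISCRETE)
                    printf("%ux%u\n",
                           fsize.discrete.width, fsize.discrete.height);
            else    /* STEPWISE/CONTINUOUS: ranges are in fsize.stepwise */
                    break;
    }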
| D | vidioc-subdev-enum-frame-interval.rst | 13 VIDIOC_SUBDEV_ENUM_FRAME_INTERVAL - Enumerate frame intervals 34 This ioctl lets applications enumerate available frame intervals on a 35 given sub-device pad. Frame intervals only makes sense for sub-devices 36 that can control the frame period on their own. This includes, for 39 For the common use case of image sensors, the frame intervals available 40 on the sub-device output pad depend on the frame format and size on the 42 when enumerating frame intervals. 44 To enumerate frame intervals applications initialize the ``index``, 49 EINVAL error code if one of the input fields is invalid. All frame 53 Available frame intervals may depend on the current 'try' formats at [all …]
|
| D | pixfmt-compressed.rst | 70 In addition, metadata associated with the frame to decode are 80 corresponding frame to the matching capture buffer. 121 Metadata associated with the frame to decode is required to be passed 129 of macroblocks to decode a full corresponding frame to the matching 155 - VP8 compressed video frame. The encoder generates one 156 compressed frame per buffer, and the decoder requires one 157 compressed frame per buffer. 158 * .. _V4L2-PIX-FMT-VP8-FRAME: 162 - VP8 parsed frame, including the frame header, as extracted from the container. 165 Metadata associated with the frame to decode is required to be passed [all …]
|
| D | vidioc-subdev-enum-frame-size.rst | 13 VIDIOC_SUBDEV_ENUM_FRAME_SIZE - Enumerate media bus frame sizes 34 This ioctl allows applications to access the enumeration of frame sizes 47 Therefore, to enumerate frame sizes allowed on the specified pad 54 A successful call will return with minimum and maximum frame sizes filled in. 59 Sub-devices that only support discrete frame sizes (such as most 60 sensors) will return one or more frame sizes with identical minimum and 65 might not be able to produce every frame size between the minimum and 68 sub-device for an exact supported frame size. 70 Available frame sizes may depend on the current 'try' formats at other 87 - Index of the frame size in the enumeration belonging to the given pad [all …]
|
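For the sub-device variant just above, enumeration keys on a pad and a media bus code rather than a pixel format. A hedged sketch, assuming ``fd`` refers to an open /dev/v4l-subdev* node and using MEDIA_BUS_FMT_SRGGB10_1X10 only as a stand-in code::

    #include <linux/v4l2-subdev.h>
    #include <linux/media-bus-format.h>

    struct v4l2_subdev_frame_size_enum fse;

    memset(&fse, 0, sizeof(fse));
    fse.pad = 0;
    fse.code = MEDIA_BUS_FMT_SRGGB10_1X10;      /* stand-in media bus code */
    fse.which = V4L2_SUBDEV_FORMAT_ACTIVE;

    for (fse.index = 0;
         ioctl(fd, VIDIOC_SUBDEV_ENUM_FRAME_SIZE, &fse) == 0;
         fse.index++)
            printf("%ux%u .. %ux%u\n",
                   fse.min_width, fse.min_height,
                   fse.max_width, fse.max_height);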
| D | vidioc-subdev-g-frame-interval.rst | 13 VIDIOC_SUBDEV_G_FRAME_INTERVAL - VIDIOC_SUBDEV_S_FRAME_INTERVAL - Get or set the frame interval on … 38 These ioctls are used to get and set the frame interval at specific 39 subdev pads in the image pipeline. The frame interval only makes sense 40 for sub-devices that can control the frame period on their own. This 42 don't support frame intervals must not implement these ioctls. 44 To retrieve the current frame interval applications set the ``pad`` 51 To change the current frame interval applications set both the ``pad`` 58 contains the current frame interval as would be returned by a 71 Changing the frame interval shall never change the format. Changing the 72 format, on the other hand, may change the frame interval. [all …]
|
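Reading and changing the interval on a sub-device pad, as described in vidioc-subdev-g-frame-interval.rst, only needs the pad index and a struct v4l2_fract. A minimal sketch, again assuming an open sub-device node; the driver is free to adjust the requested value::

    struct v4l2_subdev_frame_interval fi = { .pad = 0 };

    if (ioctl(fd, VIDIOC_SUBDEV_G_FRAME_INTERVAL, &fi) == 0)
            printf("current: %u/%u s\n",
                   fi.interval.numerator, fi.interval.denominator);

    fi.interval.numerator = 1;          /* ask for 30 frames per second */
    fi.interval.denominator = 30;
    ioctl(fd, VIDIOC_SUBDEV_S_FRAME_INTERVAL, &fi);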
| D | ext-ctrls-codec.rst | 596 bitrate to produce requested frame quality. 603 encoding a frame would cause the encoded stream to be larger then a 604 chosen data limit then the frame will be skipped. Possible values 619 - Frame skip mode is disabled. 621 - Frame skip mode enabled and buffer limit is set by the chosen 624 - Frame skip mode enabled and buffer limit is set by the 633 For every captured frame, skip this many subsequent frames (default 667 currently displayed frame. This is the same PTS as is used in 670 .. _v4l2-mpeg-video-dec-frame: 673 This read-only control returns the frame counter of the frame that [all …]
|
| D | field-order.rst | 15 original frame. This curious technique was invented because at refresh 17 fields reduces the flicker without the necessity of doubling the frame 20 It is important to understand a video camera does not expose one frame 25 which field of a frame is older, the *temporal order*. 31 the first line of an interlaced frame, the first line of the bottom 32 field is the second line of that frame. 35 whether a frame commences with the top or bottom field is pointless. Any 37 frame. Only when the source was progressive to begin with, e. g. when 38 transferring film to video, two fields may come from the same frame, 88 - Images are in progressive (frame-based) format, not interlaced [all …]
|
| D | vidioc-g-parm.rst | 38 Applications can request a different frame interval. The capture or 39 output device will be reconfigured to support the requested frame 41 repeat frames to achieve the requested frame interval. 44 frame interval that is typically embedded in the encoded video stream. 46 Changing the frame interval shall never change the format. Changing the 47 format, on the other hand, may change the frame interval. 114 frame rate. 117 frame interval that is typically embedded in the encoded video stream. 119 Applications store here the desired frame period, drivers return 120 the actual frame period. [all …]
|
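The VIDIOC_G_PARM / VIDIOC_S_PARM excerpt above boils down to filling in ``timeperframe`` of a struct v4l2_streamparm; the driver writes back the interval it actually programmed. A sketch, assuming a video capture device already open as ``fd``::

    struct v4l2_streamparm parm;

    memset(&parm, 0, sizeof(parm));
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    parm.parm.capture.timeperframe.numerator = 1;      /* request 25 fps */
    parm.parm.capture.timeperframe.denominator = 25;

    if (ioctl(fd, VIDIOC_S_PARM, &parm) == 0)
            printf("actual: %u/%u s per frame\n",
                   parm.parm.capture.timeperframe.numerator,
                   parm.parm.capture.timeperframe.denominator);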
| D | dev-stateless-decoder.rst | 10 between processed frames. This means that each frame is decoded independently 30 frame may be the result of several decode requests (for instance, H.264 streams 31 with multiple slices per frame). Decoders that support such formats must also 110 frame buffer resolution for the decoded frames. 119 as per standard semantics; matching frame buffer format. 162 frame buffer resolution of the decoded stream; typically unchanged from 231 For each frame, the client is responsible for submitting at least one request to 236 corresponds to one frame worth of encoded data, but some formats may allow (or 246 If there is a possibility that the decoded frame will require one or more 253 A typical frame would thus be decoded using the following sequence: [all …]
|
| D | metafmt-d4xx.rst | 15 Intel D4xx (D435, D455 and others) cameras include per-frame metadata in their UVC 27 per frame, therefore their headers cannot be larger than 255 bytes. 57 capture the frame 59 - Exposure time (in microseconds) used to capture the frame 65 - Exposure priority value: 0 - constant frame rate 94 * - __u32 Frame counter 97 - Time in microseconds from the beginning of a frame till its middle 99 - Time, used to read out a frame in microseconds 101 - Frame exposure time in microseconds 102 * - __u32 Frame interval [all …]
|
| D | ext-ctrls-codec-stateless.rst | 514 - The frame (or the top/bottom fields, if it's a field pair) 716 frame-based decoding but new modes might be added later on. 742 control shall be set. When multiple slices compose a frame, 747 - Decoding is done at the frame granularity, 749 frame. The OUTPUT buffer must also contain both fields. 830 - The width of the frame. 833 - The height of the frame. 836 - The flags of the frame, see :ref:`fwht-flags`. 839 - The colorspace of the frame, from enum :c:type:`v4l2_colorspace`. 878 - Set if each 'frame' contains just one field. [all …]
|
| D | func-read.rst | 45 :c:func:`read()` call will provide at most one frame (two fields) 65 reading, or the capture rate must fall below the nominal frame rate of 72 previously, not read frame, and returns the frame being received at the 76 :c:func:`read()` call. The frame being received at :c:func:`read()` 77 time is discarded, returning the following frame instead. Again this 79 nominal frame rate. An example of this model is the video read mode of 100 of data required for one frame. This may happen for example because 103 the next read will start at the beginning of a new frame. Possible error
|
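The read() model described in func-read.rst delivers at most one frame per call. A rough sketch of a single capture, assuming the driver supports the read() method, ``sizeimage`` was obtained from VIDIOC_G_FMT, and consume_frame() is a hypothetical consumer::

    #include <errno.h>
    #include <stdlib.h>
    #include <unistd.h>

    unsigned char *buf = malloc(sizeimage);
    ssize_t n = read(fd, buf, sizeimage);

    if (n > 0)
            consume_frame(buf, n);          /* hypothetical consumer */
    else if (n < 0 && errno == EAGAIN)
            ;                               /* non-blocking open, no frame yet */
    free(buf);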
| D | dev-encoder.rst | 112 will include all possible frame buffer resolutions supported by the 117 frame intervals for a given format and resolution, passing the desired pixel 123 format and coded resolution will include all possible frame intervals 127 format and resolution will include all possible frame intervals supported 129 coded format, coded resolution and coded frame interval currently set on 232 4. Set the raw frame interval on the ``OUTPUT`` queue via 233 :c:func:`VIDIOC_S_PARM`. This also sets the coded frame interval on the 245 the desired frame interval; the encoder may adjust it to 251 the adjusted frame interval. 255 Changing the ``OUTPUT`` frame interval *also* sets the framerate that [all …]
|
| /Documentation/driver-api/surface_aggregator/ |
| D | ssh.rst | 50 The fundamental communication unit of the SSH protocol is a frame 51 (:c:type:`struct ssh_frame <ssh_frame>`). A frame consists of the following 54 .. flat-table:: SSH Frame 64 - Type identifier of the frame. 68 - Length of the payload associated with the frame. 74 Each frame structure is followed by a CRC over this structure. The CRC over 75 the frame structure (|TYPE|, |LEN|, and |SEQ| fields) is placed directly 76 after the frame structure and before the payload. The payload is followed by 78 the frame has ``LEN=0``), the CRC of the payload is still present and will 80 equals the number of bytes between the CRC of the frame and the CRC of the [all …]
|
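The ssh.rst excerpt describes a fixed header (TYPE, LEN, SEQ) followed by a CRC over that header, the payload, and a payload CRC. An illustrative C rendering of that wire layout, inferred from the description above rather than copied from the kernel's own struct ssh_frame::

    #include <stdint.h>

    struct ssh_frame_wire {
            uint8_t  type;          /* TYPE: frame type identifier        */
            uint16_t len;           /* LEN: payload length, little-endian */
            uint8_t  seq;           /* SEQ: sequence number               */
    } __attribute__((packed));

    /* On the wire: struct ssh_frame_wire, CRC over that struct,
     * len bytes of payload, CRC over the payload (present even
     * when len == 0). */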
| /Documentation/userspace-api/media/drivers/ |
| D | imx-uapi.rst | 21 subdev node. This event is generated by the Frame Interval Monitor 29 Frame Interval Monitor in ipuX_csiY 35 sync by adding 1 dummy line every frame, which causes a rolling effect 38 permanent split image (one frame contains lines from two consecutive 41 From experiment it was found that during image rolling, the frame 43 value for the current standard, by about one frame time (60 usec), 48 time every frame, not a fixed value), we can use it to detect the 49 corrupt fields using a frame interval monitor. If the FIM detects a 50 bad frame interval, the ipuX_csiY subdev will send the event 66 How many frame interval measurements to average before comparing against [all …]
|
| D | camera-sensor.rst | 13 Frame size 16 There are two distinct ways to configure the frame size produced by camera 40 Frame interval configuration 43 There are two different methods for obtaining possibilities for different frame 44 intervals as well as configuring the frame interval. Which one to implement 50 Instead of a high level parameter such as frame interval, the frame interval is 55 The frame interval is calculated using the following equation:: 57 frame interval = (analogue crop width + horizontal blanking) * 83 level interface natively, generally use the concept of frame interval (or frame 86 frame interval on these devices.
|
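The camera-sensor.rst excerpt truncates the equation; in the referenced document the frame interval works out as the total line length (analogue crop width plus horizontal blanking) times the total frame height (analogue crop height plus vertical blanking) divided by the pixel rate. A worked example with invented sensor numbers::

    /* Invented numbers, purely for illustration. */
    double line_length    = 3840 + 160;     /* analogue crop width + hblank  */
    double frame_height   = 2160 + 80;      /* analogue crop height + vblank */
    double pixel_rate     = 450e6;          /* pixels per second             */
    double frame_interval = line_length * frame_height / pixel_rate;
    /* 4000 * 2240 / 450e6 ≈ 0.0199 s, i.e. roughly 50 frames per second */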
| D | npcm-video.rst | 10 capture a frame from digital video input and compare two frames in memory, and 11 the ECE can compress the frame data into HEXTILE format. 23 Capture the next complete frame into memory. 27 Compare the incoming frame with the frame stored in memory, and updates the 28 differentiated frame in memory. 39 If using V4L2_PIX_FMT_HEXTILE format, VCD will capture frame data and then ECE
|
| /Documentation/driver-api/ |
| D | frame-buffer.rst | 1 Frame Buffer Library 4 The frame buffer drivers depend heavily on four data structures. These 20 otherwise. A good example of this is the start of the frame buffer 21 memory. This "locks" the address of the frame buffer memory, so that it 31 Frame Buffer Memory 37 Frame Buffer Colormap 43 Frame Buffer Video Mode Database 52 Frame Buffer Macintosh Video Mode Database 58 Frame Buffer Fonts
|
| /Documentation/devicetree/bindings/timer/ |
| D | arm,arch_timer_mmio.yaml | 15 frames with a physical and optional virtual timer per frame. 27 description: The control frame base address 63 '^frame@[0-9a-f]+$': 66 description: A timer node has up to 8 frame sub-nodes, each with the following properties. 68 frame-number: 86 - frame-number 108 frame@0 { 109 frame-number = <0>; 116 frame@2000 { 117 frame-number = <1>;
|
| /Documentation/fb/ |
| D | framebuffer.rst | 2 The Frame Buffer Device 11 The frame buffer device provides an abstraction for the graphics hardware. It 12 represents the frame buffer of some video hardware and allows application 24 From the user's point of view, the frame buffer device looks just like any 26 specifies the frame buffer number. 31 0 = /dev/fb0 First frame buffer 32 1 = /dev/fb1 Second frame buffer 34 31 = /dev/fb31 32nd frame buffer 44 The frame buffer devices are also `normal` memory devices, this means, you can 49 There also can be more than one frame buffer at a time, e.g. if you have a [all …]
|
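Since /dev/fb0 behaves like a normal memory device, the usual userspace pattern is to query the screen geometry and mmap() the buffer. A small sketch, assuming a registered fbdev driver and omitting error handling::

    #include <fcntl.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/fb.h>

    int fb = open("/dev/fb0", O_RDWR);
    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;

    ioctl(fb, FBIOGET_VSCREENINFO, &vinfo);
    ioctl(fb, FBIOGET_FSCREENINFO, &finfo);

    size_t size = finfo.line_length * vinfo.yres;
    void *fbmem = mmap(NULL, size, PROT_READ | PROT_WRITE, MAP_SHARED, fb, 0);
    if (fbmem != MAP_FAILED)
            memset(fbmem, 0, size);         /* blank the visible screen */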
| D | internals.rst | 2 Frame Buffer device internals 5 This is a first start for some documentation about frame buffer device 15 Structures used by the frame buffer device API 18 The following structures play a role in the game of frame buffer devices. They 25 Device independent unchangeable information about a frame buffer device and 31 Device independent changeable information about a frame buffer device and a 46 Generic information, API and low level information about a specific frame 55 Visuals used by the frame buffer device API
|
| D | modedb.rst | 6 Currently all frame buffer device drivers have their own video mode databases, 9 - one routine to probe for video modes, which can be used by all frame buffer 14 needs non-standard modes, like amifb and Mac frame buffer drivers (which 17 When a frame buffer device receives a video= option it doesn't know, it should 18 consider that to be a video mode option. If no frame buffer device is specified 171 amifb - Amiga chipset frame buffer 172 aty128fb - ATI Rage128 / Pro frame buffer 173 atyfb - ATI Mach64 frame buffer 174 pm2fb - Permedia 2/2V frame buffer 175 pm3fb - Permedia 3 frame buffer [all …]
|
| /Documentation/devicetree/bindings/hsi/ |
| D | client-devices.txt | 9 - hsi-rx-mode: Receiver Bit transmission mode ("stream" or "frame") 10 - hsi-tx-mode: Transmitter Bit transmission mode ("stream" or "frame") 16 - hsi-arb-mode: Arbitration mode for TX frame ("round-robin", "priority") 37 hsi-mode = "frame";
|
| /Documentation/networking/ |
| D | oa-tc6-framework.rst | 46 for Ethernet frame transfers and control transactions for register 60 or may not contain valid frame data independent from each other, allowing 66 frame data is present and provides the information to determine which 67 bytes of the payload contain valid frame data. 71 The data footer indicates if there is receive frame data present within 73 of the payload contain valid frame data. 153 - Forwards the received Ethernet frame from 10Base-T1x MAC-PHY to n/w 163 transmit frame data within the 64 bytes data chunk payload. 189 any receive frame data within the current chunk. 198 whether the current chunk contains valid transmit frame data [all …]
|