Re: [PATCH] vimc: debayer: Add support for ARGB format

From: Laurent Pinchart
Date: Tue Jun 02 2020 - 11:14:33 EST


Hi Dafna,

On Tue, Jun 02, 2020 at 03:16:03PM +0200, Dafna Hirschfeld wrote:
> On 02.06.20 14:45, Laurent Pinchart wrote:
> > On Tue, Jun 02, 2020 at 08:31:26AM -0300, Helen Koike wrote:
> >> On 6/2/20 8:24 AM, Kieran Bingham wrote:
> >>> On 02/06/2020 11:55, Helen Koike wrote:
> >>>> On 6/2/20 7:52 AM, Dafna Hirschfeld wrote:
> >>>>> On 01.06.20 14:16, Kaaira Gupta wrote:
> >>>>>> On Fri, May 29, 2020 at 05:43:57PM +0200, Dafna Hirschfeld wrote:
> >>>>>>> Hi,
> >>>>>>> Thanks for the patch
> >>>>>>>
> >>>>>>> I don't know how real devices handle ARGB formats,
> >>>>>>> I wonder whether it should be part of the debayer.
> >>>>>>
> >>>>>> Hi! qcam tries to support BA24 as it is one of the formats that vimc
> >>>>>> lists as its supported formats with --list-formats. Shouldn't BA24 be
> >>>>>> possible to capture with vimc?
> >>>>>
> >>>>> Hi,
> >>>>> Just to clarify, when listing the supported formats of a video node, the node lists
> >>>>> the formats that the video node, as an independent media entity, supports.
> >>>>> It does not mean that the 'camera' as a whole (that is, the media topology graph) supports
> >>>>> all the formats that the video node lists. When interacting with a video node or
> >>>>> a subdevice node, one interacts only with that specific entity.
> >>>>> In the case of vimc, the RGB video node as an independent entity supports BA24, so the format
> >>>>> appears in the list of its supported formats. But since the debayer does not
> >>>>> support it, the format cannot be generated by the entire vimc topology.
> >>>>> This is not a bug.
> >
> > Is there a valid configuration for the vimc pipeline that produces BA24?
>
> I think there isn't
>
> > I agree that not all pipeline configurations need to support every
> > format, but we shouldn't report a format that can't be produced at all.
> >
> > This being said, and as discussed before, the de-bayering subdev should
> > just produce MEDIA_BUS_FMT_RGB888_1X24, and the video node should then
> > implement the RGB pixel formats. BA24 should likely be one of the
> > supported formats (or maybe BX24 ?).
>
> So you mean that the video node should support it, so that when it receives an RGB
> format on the source pad it converts it to BA24 or BX24?

Yes. If you think about an equivalent hardware pipeline, the device
would carry 24-bit RGB between the processing blocks, and only when the
data reaches the output formatter (usually bundled with the DMA engine)
would it be converted to one of the multiple RGB pixel formats. I think
vimc should mimic that behaviour when it comes to pipeline
configuration.
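
To illustrate what I mean by "output formatter", here is a rough sketch
(the function name and signature are made up for this example) of the
packing step the video node would perform on each 24-bit RGB sample,
based on the pixel format selected with VIDIOC_S_FMT:

static void cap_pack_pixel(u8 *dst, const u8 rgb[3], u32 pixelformat,
			   u8 alpha)
{
	switch (pixelformat) {
	case V4L2_PIX_FMT_RGB24:	/* R G B */
		dst[0] = rgb[0]; dst[1] = rgb[1]; dst[2] = rgb[2];
		break;
	case V4L2_PIX_FMT_BGR24:	/* B G R */
		dst[0] = rgb[2]; dst[1] = rgb[1]; dst[2] = rgb[0];
		break;
	case V4L2_PIX_FMT_ARGB32:	/* BA24: A R G B */
		dst[0] = alpha;
		dst[1] = rgb[0]; dst[2] = rgb[1]; dst[3] = rgb[2];
		break;
	case V4L2_PIX_FMT_XRGB32:	/* BX24: X R G B, alpha ignored */
		dst[0] = 0;
		dst[1] = rgb[0]; dst[2] = rgb[1]; dst[3] = rgb[2];
		break;
	}
}

The debayer and scaler would then only ever deal with
MEDIA_BUS_FMT_RGB888_1X24 samples, and the pixel format conversion would
live in a single place.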

When it comes to generating data (or processing it, I believe vimc
supports, or aims to support, memory-to-memory processing), we have two
options:

- In the general case, data generated by the TPG would be processed by
the individual blocks in the pipeline until it reaches the capture
video node. To support processing steps that may generate more, less
or the same amount of data as they consume, we would need to allocate
buffers between all the processing blocks. Those buffers don't need to
be full frame buffers though; a few lines are enough, with the test
pattern generated and processed a few lines at a time.

This architecture would also support memory-to-memory processing.

- When using the TPG, we could optimize the implementation by generating
data in the capture buffer directly, in the format configured on the
video node, without processing the data in every block. The drawback
is that processing artifacts (such as the artifacts due to colour
interpolation or scaling) wouldn't be generated, so the captured image
would be cleaner than in a real hardware implementation. Whether that
is an issue or not is for us to decide.

If vimc doesn't need to support memory-to-memory processing, we could
implement the optimized behaviour only.
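
Given that the vimc sensor already uses the TPG library, the optimized
path could boil down to something like the sketch below (the function
and variable names are illustrative; only the TPG and vb2 calls are the
real APIs):

static void cap_fill_buffer_direct(struct tpg_data *tpg,
				   struct vb2_buffer *vb, u32 pixelformat)
{
	u8 *vbuf = vb2_plane_vaddr(vb, 0);

	/* Draw the test pattern directly in the capture pixel format,
	 * skipping the per-entity processing entirely. */
	tpg_s_fourcc(tpg, pixelformat);
	tpg_fill_plane_buffer(tpg, V4L2_STD_UNKNOWN, 0, vbuf);
}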

> It makes sense. I guess both BA24 and BX24 can be added. I see in the
> pixfmt-rgb.html doc that the control V4L2_CID_ALPHA_COMPONENT should
> then probably be added.

Yes, V4L2_CID_ALPHA_COMPONENT would need to be added (on the video node)
to produce BA24, otherwise the alpha component would have a fixed value.
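
For reference, a minimal sketch of what that could look like on the
capture side (the handler field and ops names are made up; only the
v4l2_ctrl_* calls are the real API):

	/* In the capture entity initialisation (illustrative names): */
	v4l2_ctrl_handler_init(&vcap->ctrl_handler, 1);
	v4l2_ctrl_new_std(&vcap->ctrl_handler, &vimc_cap_ctrl_ops,
			  V4L2_CID_ALPHA_COMPONENT, 0, 255, 1, 255);
	vcap->vdev.ctrl_handler = &vcap->ctrl_handler;

	/* When filling a BA24 buffer, the current control value would
	 * then be used as the alpha byte instead of a hardcoded 255. */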

> >>>> This is also my understanding.
> >>>>
> >>>> You should get an -EPIPE error when starting the stream though; it
> >>>> shouldn't fail silently.
> >>>
> >>> Yes, we had -EPIPE, and that is what I think we were trying to resolve.
> >>>
> >>> How would userspace be expected to detect what formats to use ? Should
> >>> the available formats on the capture node depend on the current linking
> >>> of the media graph?
> >>
> >> This is a good question; I don't recall the V4L2 API defining this.
> >
> > A recent extension to VIDIOC_ENUM_FMT allows enumerating pixel formats
> > for a given media bus code; I think that's the way forward.
> >
> >> It would be a bit hard to implement in Vimc, especially when we have configfs
> >> for custom topologies, since the capture would need to query the whole pipeline.
> >> But it could be implemented.
> >>
> >>> Otherwise, to know what formats are supported, userspace must first
> >>> 'get a list of formats' and then try to 'set' the formats to see what is
> >>> possible?
>
> Yes, there is a doc file that explains that it should be done in a "bottom-up" way,
> that is, starting with configuring the sensor, then adjusting the debayer
> to the sensor output, then adjusting the scaler to the debayer output, and then
> adjusting the video node output to the scaler output. One should also use the
> 'try' versions of the ioctls while working out the final configuration.
> The detailed explanation is in Documentation/output/userspace-api/media/v4l/dev-subdev.html

That won't help though. The video node will happily accept a
VIDIOC_S_FMT call that sets a pixel format not compatible with the media
bus format at the input of the video node. The error will only be raised
at stream-on time. The VIDIOC_ENUM_FMT extension that allows enumerating
pixel formats supported for a given media bus code is the only working
option.
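
From the userspace side, that extension (VIDIOC_ENUM_FMT with the
mbus_code field, advertised through V4L2_CAP_IO_MC) would be used
roughly as in the sketch below, assuming the vimc capture node
implements it:

#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Enumerate the pixel formats a video node can produce from a given
 * media bus code (e.g. MEDIA_BUS_FMT_RGB888_1X24). Requires a driver
 * that reports the V4L2_CAP_IO_MC capability. */
static void enum_formats_for_code(int fd, unsigned int code)
{
	struct v4l2_fmtdesc fmt;
	unsigned int i;

	for (i = 0; ; i++) {
		memset(&fmt, 0, sizeof(fmt));
		fmt.index = i;
		fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		fmt.mbus_code = code;

		if (ioctl(fd, VIDIOC_ENUM_FMT, &fmt) < 0)
			break;

		printf("%u: %.4s (%s)\n", i, (const char *)&fmt.pixelformat,
		       (const char *)fmt.description);
	}
}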

> >> At the moment yes.
> >>
> >>> Or should (given VIMC is quite specialist anyway) userspace 'just know'
> >>> what it is capable of all the same?
> >>>
> >>> That's possibly fine, as we can simply remove support for the ARGB
> >>> formats from the libcamera pipeline handler if it is never expected to
> >>> be supported.
> >>
> >> With the configfs feature, you could build a topology with sensor->capture,
> >> and ARGB would be supported.
> >>
> >>> But then as a further question - what formats will we expect VIMC to
> >>> support? VIVID has a (very) wide range of formats.
> >>>
> >>> Would we ever expect VIMC to be as configurable?
> >>> Or is the scope limited to what we have today?
> >>
> >> I know it is very limited atm, but I would like to increase the range;
> >> I just have limited bandwidth to work on it.
> >>
> >>>>>>
> >>>>>> If yes, which entity should support it, if not debayer? Should there be
> >>>>>> a separate conversion entity, or should we keep the support in debayer
> >>>>>> itself for efficiency reasons?
> >>>>>>
> >>>>>>> On 28.05.20 20:57, Kaaira Gupta wrote:
> >>>>>>>> Running qcam for pixelformat 0x34324142 showed that vimc debayer does
> >>>>>>>> not support it. Hence, add the support for Alpha (255).
> >>>>>>>
> >>>>>>> I would change the commit log to:
> >>>>>>>
> >>>>>>> Add support for the V4L2_PIX_FMT_ARGB32 format in the debayer
> >>>>>>> and set the alpha channel to a constant 255.
> >>>>>>>
> >>>>>>>> Signed-off-by: Kaaira Gupta <kgupta@xxxxxxxxxxxxx>
> >>>>>>>> ---
> >>>>>>>>  .../media/test-drivers/vimc/vimc-debayer.c   | 27 ++++++++++++-------
> >>>>>>>>  1 file changed, 18 insertions(+), 9 deletions(-)
> >>>>>>>>
> >>>>>>>> diff --git a/drivers/media/test-drivers/vimc/vimc-debayer.c b/drivers/media/test-drivers/vimc/vimc-debayer.c
> >>>>>>>> index c3f6fef34f68..f34148717a40 100644
> >>>>>>>> --- a/drivers/media/test-drivers/vimc/vimc-debayer.c
> >>>>>>>> +++ b/drivers/media/test-drivers/vimc/vimc-debayer.c
> >>>>>>>> @@ -62,6 +62,7 @@ static const u32 vimc_deb_src_mbus_codes[] = {
> >>>>>>>>  	MEDIA_BUS_FMT_RGB888_1X7X4_SPWG,
> >>>>>>>>  	MEDIA_BUS_FMT_RGB888_1X7X4_JEIDA,
> >>>>>>>>  	MEDIA_BUS_FMT_RGB888_1X32_PADHI,
> >>>>>>>> +	MEDIA_BUS_FMT_ARGB8888_1X32
> >>>>>>>>  };
> >>>>>>>>
> >>>>>>>>  static const struct vimc_deb_pix_map vimc_deb_pix_map_list[] = {
> >>>>>>>> @@ -322,15 +323,23 @@ static void vimc_deb_process_rgb_frame(struct vimc_deb_device *vdeb,
> >>>>>>>>  	unsigned int i, index;
> >>>>>>>>
> >>>>>>>>  	vpix = vimc_pix_map_by_code(vdeb->src_code);
> >>>>>>>> -	index = VIMC_FRAME_INDEX(lin, col, vdeb->sink_fmt.width, 3);
> >>>>>>>> -	for (i = 0; i < 3; i++) {
> >>>>>>>> -		switch (vpix->pixelformat) {
> >>>>>>>> -		case V4L2_PIX_FMT_RGB24:
> >>>>>>>> -			vdeb->src_frame[index + i] = rgb[i];
> >>>>>>>> -			break;
> >>>>>>>> -		case V4L2_PIX_FMT_BGR24:
> >>>>>>>> -			vdeb->src_frame[index + i] = rgb[2 - i];
> >>>>>>>> -			break;
> >>>>>>>> +
> >>>>>>>> +	if (vpix->pixelformat == V4L2_PIX_FMT_ARGB32) {
> >>>>>>>> +		index = VIMC_FRAME_INDEX(lin, col, vdeb->sink_fmt.width, 4);
> >>>>>>>> +		vdeb->src_frame[index] = 255;
> >>>>>>>> +		for (i = 0; i < 3; i++)
> >>>>>>>> +			vdeb->src_frame[index + i + 1] = rgb[i];
> >>>>>>>> +	} else {
> >>>>>>>> +		index = VIMC_FRAME_INDEX(lin, col, vdeb->sink_fmt.width, 3);
> >>>>>>>> +		for (i = 0; i < 3; i++) {
> >>>>>>>> +			switch (vpix->pixelformat) {
> >>>>>>>> +			case V4L2_PIX_FMT_RGB24:
> >>>>>>>> +				vdeb->src_frame[index + i] = rgb[i];
> >>>>>>>> +				break;
> >>>>>>>> +			case V4L2_PIX_FMT_BGR24:
> >>>>>>>> +				vdeb->src_frame[index + i] = rgb[2 - i];
> >>>>>>>> +				break;
> >>>>>>>> +			}
> >>>>>>>>  		}
> >>>>>>>>  	}
> >>>>>>>>  }

--
Regards,

Laurent Pinchart