@PeterMcAuley Here’s the log output (attached) from running with Nvidia first and Intel second with the BCC+Vig enabled.
My actual request would be more along the lines of “all the BCC(+) fx should act consistently.” If users had to keep in mind that the BCC fx act one way and the BCC+ fx act another, but each group were self-consistent, that would be workable. Part of the problem now is that there’s a bug where even that kind of rule doesn’t hold; otherwise we’d get the same results no matter which GPU we’re using. It seems like this is a bug in VP that’s been there since at least v15.
My ultimate preference would be for all of them to act as the BCC filters do today, because more often than not I want the effect to act within the visible bounds of the clip it’s applied to. An option giving the user more control would be handy at times, though I know it’s a double-edged sword for support.
It looks to me like VP is setting a render state and leaving it dirty. When I went to capture the attached log I found that the image was going full alpha again when it should have been opaque, same as in my video from two days ago. There’s also this error in the log:
2021-09-14 16:49:03 BCC VGS NOTICE: Licensed render 32BitFltLib: BCC+Vignette
2021-09-14 16:49:03 BCC VGS NOTICE: BCC_DFT: ERROR: ocl render - std::exception: clCreateImage2D - Falling back to CPU
BCC - Copy (2).zip (3.3 KB)
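For what it’s worth, the “dirty render state” pattern I’m describing looks something like this. This is a hypothetical Python sketch, not VP’s actual code; the class and state names are made up purely to illustrate the save/restore discipline I suspect is missing:

```python
# Hypothetical illustration of a "dirty render state" bug: an effect flips a
# shared state flag and forgets to restore it, so the NEXT draw inherits the
# wrong alpha behavior. None of these names come from VP or BCC.

class RenderState:
    """Minimal stand-in for a shared GPU render-state block."""
    def __init__(self):
        self.full_alpha = False  # the "clean" default: composite opaque


class BuggyEffect:
    """Flips the alpha state for its own draw but never restores it."""
    def render(self, state):
        state.full_alpha = True
        # ... draw the effect ...
        # BUG: no restore here; the state is left dirty for later passes


class WellBehavedEffect:
    """Saves and restores the state around its own draw call."""
    def render(self, state):
        saved = state.full_alpha
        state.full_alpha = True
        # ... draw the effect ...
        state.full_alpha = saved  # leave the pipeline as we found it


def draw_next_clip(state):
    """Reports how a clip drawn after the effect would be composited."""
    return "full alpha" if state.full_alpha else "opaque"


state = RenderState()
BuggyEffect().render(state)
print(draw_next_clip(state))   # prints "full alpha" - the dirty-state symptom

state = RenderState()
WellBehavedEffect().render(state)
print(draw_next_clip(state))   # prints "opaque" - expected behavior
```

That would also fit the GPU-order dependence: whichever pass happens to run first leaves the state in whatever shape the next pass inherits.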