ClipToPicture is too slow
Reported by: pulkomandy | Owned by: stippi
Description
We now have a working implementation of ClipToPicture. It is the only way to perform clipping in the transformed view space (ConstrainClippingRegion does not work with transforms). However, ClipToPicture performs its hit tests by alpha blending pixels, which is slow.
WebKit makes heavy use of this feature, which makes it painfully slow there. We have to make it faster.
One idea is to compute the un-transformed bounds of the area covered by the clipping picture, and use rectangle clipping to exclude everything outside that area from the drawing. When a big view uses a small clipping picture, this could cut drawing time considerably. This extra clipping rectangle needs to be recomputed whenever the AlphaMask is generated, since changes in the view state could lead to different results.
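The bounding-rectangle idea could be sketched roughly like this. The Rect and AffineTransform types below are simplified stand-ins for the Interface Kit's BRect and BAffineTransform, and TransformedBounds is a hypothetical helper, not existing app_server code:

```cpp
#include <algorithm>

// Simplified stand-ins for BRect / BAffineTransform (hypothetical).
struct Rect { double left, top, right, bottom; };

struct AffineTransform {
	// 2x3 affine matrix: | sx shx tx |
	//                    | shy sy ty |
	double sx, shy, shx, sy, tx, ty;

	void Apply(double& x, double& y) const {
		double nx = sx * x + shx * y + tx;
		double ny = shy * x + sy * y + ty;
		x = nx;
		y = ny;
	}
};

// Compute the axis-aligned bounding box of the clipping picture's bounds
// after applying the view transform. Everything outside this rectangle
// can then be excluded with cheap rectangle clipping, so the expensive
// per-pixel alpha test only runs inside it.
Rect TransformedBounds(const Rect& pictureBounds, const AffineTransform& t)
{
	double xs[4] = { pictureBounds.left, pictureBounds.right,
		pictureBounds.right, pictureBounds.left };
	double ys[4] = { pictureBounds.top, pictureBounds.top,
		pictureBounds.bottom, pictureBounds.bottom };

	for (int i = 0; i < 4; i++)
		t.Apply(xs[i], ys[i]);

	Rect result = { xs[0], ys[0], xs[0], ys[0] };
	for (int i = 1; i < 4; i++) {
		result.left = std::min(result.left, xs[i]);
		result.top = std::min(result.top, ys[i]);
		result.right = std::max(result.right, xs[i]);
		result.bottom = std::max(result.bottom, ys[i]);
	}
	return result;
}
```

Transforming all four corners (rather than just two) matters because a rotation or shear can move any corner to the extreme of the bounding box.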
Maybe there are other ways to improve this on the drawing side: for example, avoiding the alpha blending when the clipping picture consists only of fully opaque or fully transparent pixels. This may need changes to the AGG rasterizer code.
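Detecting that fast path could be as simple as a one-time scan when the mask is built; if every coverage value is 0 or 255, the rasterizer could switch from blending to plain inclusion tests or span copies. MaskIsBinary is a hypothetical sketch of such a check, not existing code:

```cpp
#include <cstddef>
#include <cstdint>

// Returns true if every pixel in the alpha mask is either fully
// transparent (0) or fully opaque (255). In that case the drawing code
// could skip alpha blending entirely and treat the mask as a simple
// include/exclude test per pixel. (Hypothetical helper for illustration.)
bool MaskIsBinary(const uint8_t* mask, size_t pixelCount)
{
	for (size_t i = 0; i < pixelCount; i++) {
		uint8_t alpha = mask[i];
		if (alpha != 0 && alpha != 255)
			return false;
	}
	return true;
}
```

The scan is O(n) but only needs to run once per mask generation, while the blending it can eliminate runs on every draw.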