For now I can exclude certain websites with a negative lookaround. Sure, an exception URL field would be nice.
But I think a "Don't match against keyword" option would be useful for more complex situations.
For example, we want to match this, but not:
<div class="ads-block postlist">
We can use a negative lookaround, but constructing the regex gets complicated if there is already a complex regex in front of it.
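To show what I mean, here is a rough sketch in Python of the lookaround approach for the example above (the "ads" class names are just illustrative, not from any real rule list):

```python
import re

# Match class attributes that start with "ads", but skip the specific
# "ads-block postlist" combination via a negative lookahead.
pattern = re.compile(r'class="(?!ads-block postlist")ads[^"]*"')

print(bool(pattern.search('<div class="ads-banner top">')))      # matched
print(bool(pattern.search('<div class="ads-block postlist">')))  # excluded
```

This works for one exception, but every new exception has to be spliced into the lookahead, which is exactly what makes it hard to maintain.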
Or, in your "Remove and block by URL" rules:
/banner/.*?\.(?>gif|jpg|png|swf)
I don't want it to match:
http://somewebsite/banner/1x1.gif
http://somewebsite/banner/threads-banner/threads.jpg
How can I make such rules?
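For comparison, the lookahead workaround in plain regex would look something like this (a sketch tested with Python's `re`; I replaced the atomic group `(?>…)` with a normal group `(?:…)` since Python only supports atomic groups in 3.11+):

```python
import re

# The original rule, with a negative lookahead added to skip the two
# exceptions: "1x1." files and anything in the "threads-banner/" folder.
rule = re.compile(r'/banner/(?!1x1\.|threads-banner/).*?\.(?:gif|jpg|png|swf)')

print(bool(rule.search('http://somewebsite/banner/728x90.gif')))                  # True
print(bool(rule.search('http://somewebsite/banner/1x1.gif')))                     # False
print(bool(rule.search('http://somewebsite/banner/threads-banner/threads.jpg')))  # False
```

It does the job, but the exceptions are buried inside the pattern instead of living in a separate, readable field.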
It is much easier with "Don't match against keyword in" in Ad Muncher, but unfortunately it only works with Ad Muncher rules.
I know it may affect performance a lot; we just need to be wise about when to use it. There is a reason Ad Muncher had this rule.
If such a feature existed, we could just use simple rules like these to match common ad classes/IDs, and add negative rules later:
\Aad[sv]?[-_]
[-_]ad[sv]?\Z