I've had performance issues with uMatrix. I tried blocking all JS by default and explicitly enabling scripts per site. In theory that should make the browser perform better. I suspect there are a bunch of sites that can't run a function or reach a JS resource and then just go into spin loops, eating through resources .. either that, or the blocking itself is resource intensive.
blocking has a non-zero cost, yes. especially if complex whitelists need to be evaluated for every request. but at least for me it's still a net win most of the time.
But you can also operate uMatrix in a blacklist-based mode, i.e. simply let most requests through and block only the categories you deem problematic (e.g. cross-site WebSockets in this case).
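For illustration, a blacklist-style ruleset in uMatrix's "My rules" pane could look roughly like this. Treat it as a sketch: which column WebSockets land in (xhr vs. other) has varied between versions, so the `xhr` type here is an assumption you'd want to verify against your install:

```
* * * allow              # default-allow everything
* * xhr block            # block xhr (where WebSockets may be classified -- assumption)
* 1st-party xhr allow    # re-allow it for same-site requests; more specific rules win
```

The last rule relies on uMatrix's precedence of more specific rules over broader ones, so first-party traffic keeps working while cross-site requests of that type stay blocked.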