Hey!
I’m using mitmproxy 0.17 the way the flowbasic.py example does.
My handle_request and handle_response functions are rather slow,
and I can’t use the @concurrent decorator here. Is there any way to run my code in parallel?
The @concurrent decorator is just the stock concurrency tool mitmproxy ships with; there’s no reason you can’t construct your own with the exact semantics you need. Take a look at the implementation here:
The key to understanding what happens is the .take() operation, which shifts responsibility for the message reply into the script. Once this is done, you can use any concurrency technique you like, as long as you keep track of the message and reply once you’ve completed your processing.
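For example, in the flowbasic.py style you’re already using, your FlowMaster subclass owns the reply anyway (the handler has to call f.reply() itself), so you can simply defer that reply into a worker. Here’s a minimal sketch using a bounded thread pool, which is exactly the kind of semantics @concurrent doesn’t give you. The pool size, slow_work, and MyMaster names are just illustrative, and on Python 2 concurrent.futures needs the futures backport:

```python
from concurrent.futures import ThreadPoolExecutor  # "futures" backport on Python 2

from mitmproxy import flow

# A bounded worker pool: unlike @concurrent, which starts one thread per
# message, this caps how much slow work can run at once.
pool = ThreadPoolExecutor(max_workers=8)


def slow_work(f):
    # Stand-in for your expensive handle_request/handle_response logic.
    pass


class MyMaster(flow.FlowMaster):
    def handle_request(self, f):
        f = flow.FlowMaster.handle_request(self, f)
        if f:
            # Don't reply inline: mitmproxy holds the flow until f.reply()
            # is called, so the event loop stays free while we work.
            pool.submit(self.process, f)
        return f

    def process(self, f):
        try:
            slow_work(f)
        finally:
            f.reply()  # always hand the flow back, even if slow_work raises
```

The one invariant you must preserve is that every flow gets replied to exactly once, even if your processing raises, hence the try/finally.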
I’d be interested to hear why the built-in @concurrent decorator isn’t an option for you. This is an evolving part of our interface, and I’d like it to cover as many use cases as possible.