I've never used PyTorch or Chainer, so maybe I'm wildly wrong, but here goes...
Dynamic batching requires you to predefine a set of "batch-wise" operations (node types for computation graphs). This works fine with eager execution as well, because the eager code is inside a function definition. Of course, if you want to do dynamic batching you need to introduce a layer of abstraction: e.g. instead of writing a+b directly, you create a dynamic batching operation for addition and add edges to an execution graph. This is no different than in our TF implementation.
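To make the abstraction concrete, here is a minimal sketch of that idea: instead of computing a+b eagerly, we record an "add" node in an execution graph, then batch together all nodes with the same op at the same depth and run each group as one call. All names here (`Node`, `leaf`, `add`, `batch_execute`) are illustrative, not from any real library.

```python
from collections import defaultdict

class Node:
    """One vertex in the recorded execution graph."""
    def __init__(self, op, inputs=(), value=None):
        self.op = op            # operation name, e.g. "add"
        self.inputs = inputs    # upstream Node objects (the graph edges)
        self.value = value      # set for leaves, filled in after execution
        # depth decides which batched "wave" the node runs in
        self.depth = 1 + max((i.depth for i in inputs), default=-1)

def leaf(v):
    return Node("leaf", value=v)

def add(a, b):
    # symbolic: records an edge instead of computing a + b now
    return Node("add", (a, b))

def batch_execute(outputs):
    # group nodes by (depth, op) so each group becomes one batched call
    waves = defaultdict(list)
    seen = set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for i in n.inputs:
            visit(i)
        waves[(n.depth, n.op)].append(n)
    for o in outputs:
        visit(o)
    for (depth, op), nodes in sorted(waves.items()):
        if op == "leaf":
            continue
        if op == "add":
            # one "batch-wise" call for the whole group; a real system
            # would stack tensors and issue a single vectorized op here
            lhs = [n.inputs[0].value for n in nodes]
            rhs = [n.inputs[1].value for n in nodes]
            for n, r in zip(nodes, [x + y for x, y in zip(lhs, rhs)]):
                n.value = r
    return [o.value for o in outputs]

# Two independent additions land in the same batched "add" wave:
x = add(leaf(1), leaf(2))
y = add(leaf(10), leaf(20))
print(batch_execute([x, y]))  # [3, 30]
```

The point is only the structure: user code calls `add(...)` instead of `+`, which is exactly the extra abstraction layer the TF implementation needs too.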
Thanks. Yeah, my understanding is that you cannot have eager execution all the way down; some eager execution inside the function is fine, but you shouldn't have it outside the function blocks.
Without this function-block abstraction, though, it doesn't seem like you can take dynamic batching very far (or implement it at all).