> What is the point of importing a dataset of this complexity if you can't also work with the data inside the DCC?
My understanding is that you need to reproduce the rendering bug that crashes Blender in order to fix it, and reproducing it needs to be fast. Even with a smaller scene that triggers the bug, the import would, without these optimisations, eat a lot of valuable time in the feedback loop. Now there is a workflow that reproduces the rendering crash in under two minutes.
I really don't understand your point at all. You might be right that the use case doesn't exist yet, and maybe it never will. But that was never the point of the blog post.
Making the application hang for an extremely long time, or even crash, by importing _something_ is, in my understanding, a bug. Why shouldn't it be fixed? Why shouldn't developers improve performance and blog about it so other devs can learn from it?
It's not about the Moana scene; that's just the test case, so OP has a valid benchmark with human-comprehensible durations. Any smaller scene will now import faster too.
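To make the "human-comprehensible durations" point concrete, here is a minimal sketch of how one might time a long-running step like an import and get a readable wall-clock figure. The `time_step` helper is hypothetical, not part of Blender or the blog post; inside Blender you would wrap the actual import operator call with it.

```python
import time

def time_step(label, fn, *args, **kwargs):
    """Run fn once and print its wall-clock duration in seconds.

    Hypothetical helper for ad-hoc benchmarking; perf_counter is
    monotonic, so it is suitable for measuring elapsed time.
    """
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f} s")
    return result, elapsed

# Example with a stand-in workload instead of a real scene import:
_, seconds = time_step("dummy workload", lambda: sum(range(1_000_000)))
```

In a Blender context the wrapped call would be the import operator itself, so the feedback loop ("change code, re-import, check crash") comes with a number attached each time.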
>At some point, the little pieces need to be assembled, reviewed and finally rendered. Why wouldn't you do that with the DCC application rather than with specialized, limited tools?
You would, eventually. But before assembly and review can happen in the DCC at all, the import has to finish in a reasonable time and the rendering crash has to be fixed. The faster feedback loop is exactly what makes fixing that crash practical.