I once worked with a system where all local function calls had their parameters serialized to XML, sent over, then deserialized by the called function.
The framework was meant to be network transparent: remote calls looked the same as local calls, everything was async, and since everything used service discovery, you could easily refactor so that a locally provided service was spun off to a remote machine and none of the code that called it had to change.
So anyway 50% of the CPU was being used on XML serialization/deserialization...
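A minimal sketch of the kind of "network-transparent" dispatch described above, with all names invented for illustration: even a purely local call pays for a full XML round-trip, which is where CPU time like that goes.

```python
# Hypothetical sketch: a dispatcher that serializes every call's arguments
# to XML and deserializes them on the callee side, even in-process.
import xml.etree.ElementTree as ET

def serialize_args(args: dict) -> bytes:
    root = ET.Element("call")
    for name, value in args.items():
        param = ET.SubElement(root, "param", name=name, type=type(value).__name__)
        param.text = str(value)
    return ET.tostring(root)

def deserialize_args(payload: bytes) -> dict:
    casts = {"int": int, "float": float, "str": str}
    root = ET.fromstring(payload)
    return {p.get("name"): casts[p.get("type")](p.text) for p in root}

def transparent_call(fn, **kwargs):
    # The callee only ever sees deserialized arguments, whether the
    # "transport" was a socket or, as here, the same process.
    wire = serialize_args(kwargs)        # every call pays this...
    return fn(**deserialize_args(wire))  # ...and this, even locally

def add(a, b):
    return a + b

print(transparent_call(add, a=2, b=3))  # 5
```

The upside is exactly the transparency the comment describes: the call site is identical whether `fn` lives next door or across the network. The downside is that the serialization cost is paid unconditionally.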
To some degree this is actually nice! I mean, one major reason local API calls (in the same programming language) are nicer than network calls, besides avoiding latency, is that network APIs rarely come with strong type safety guarantees, or at least you have to bolt them on top (think OpenAPI/Swagger). So I actually wish we were in a world where network calls were more similar to local API calls in that regard.
But of course, in the concrete situation you're describing, the fact that

> all local functions calls had parameters serialized to XML

meant paying the serialization tax on every call without getting any of the benefits of a network boundary in return.
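A small sketch of what "bolting type safety on top" of an untyped payload can look like, in the spirit of OpenAPI-generated clients. The `User` type and the raw dict are invented for illustration.

```python
# Hypothetical example: validate an untyped JSON-ish dict against a
# dataclass's declared field types before constructing it.
from dataclasses import dataclass, fields

@dataclass
class User:
    id: int
    name: str

def decode(cls, raw: dict):
    """Runtime-check a raw dict against the dataclass's annotations."""
    for f in fields(cls):
        if not isinstance(raw.get(f.name), f.type):
            raise TypeError(f"{f.name!r} must be {f.type.__name__}")
    return cls(**raw)

user = decode(User, {"id": 1, "name": "ada"})   # ok
# decode(User, {"id": "1", "name": "ada"})      # raises TypeError
```

With a local call you get this checking for free from the compiler or type checker; over the network you have to re-verify at runtime, which is the asymmetry the comment is pointing at.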
There was also a version that would automatically turn JSON into XML and vice versa. When REST started pushing out SOAPy stuff, we actually had a meeting where the sales guy (at the behest of our XML-loving CTO) showed us a diagram that was roughly this:
Web ------------- Premises
JSON ---> || ---> XML
JSON <--- || <--- XML
The price was $9,600 for a 2U appliance that turned all your non-enterprisey JSON into something befitting the modern enterprise, circa 2001. To this day, I laugh when I think about it.
I would assume it works by emitting a strict, predictable format when encoding, and, when decoding, by accelerating the stricter, more predictable parts in hardware while keeping some regular CPU cores for the difficult bits.
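A rough software sketch of that split, with all details invented: a fast, predictable first pass just locates structural characters (the part an accelerator could do with simple, branch-light logic), and a slower second pass would handle the fiddly bits like escape sequences and number parsing on a regular CPU.

```python
# Pass 1 of a hypothetical two-stage decoder: scan for structural
# characters ({}[]:," ) while tracking string boundaries and escapes.
def structural_index(doc: str) -> list:
    positions, in_string, escaped = [], False, False
    for i, ch in enumerate(doc):
        if escaped:
            escaped = False
        elif ch == "\\":
            escaped = True
        elif ch == '"':
            in_string = not in_string
            positions.append(i)
        elif not in_string and ch in "{}[]:,":
            positions.append(i)
    return positions

doc = '{"a": [1, "x,y"], "b": 2}'
idx = structural_index(doc)
# Pass 2 (not shown) would walk these indices to build values, doing the
# expensive work -- unescaping, number conversion -- only where needed.
print([doc[i] for i in idx])
```

Note how the comma inside the string `"x,y"` is correctly skipped; separating this cheap structural pass from the expensive value parsing is the same idea behind accelerated parsers.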
In 2013 I was working on this stack: the frontend server talked XML to the backend server, so you needed 6 massive physical servers to handle the load. The load being XML serialization.
People laugh at my SQL-in-REST-classes approach, aka no layering at all, but it grew with the company and we now have 4 layers, 4 USEFUL layers. And with this ultra-lean architecture we were profitable from day 90, so we're still here 7 years later.