Unfortunately, I am not aware of any document like that. There are some papers comparing the energy efficiency of different types of RAM, and there is of course indirect evidence from premium notebooks seeing significantly improved battery life after moving to LPDDR.
LPDDR does many, many things to reduce power consumption. The current version even does DVFS, which I think is a first for main memory; graphics cards have been doing interface DFS for a long time, but a high-end graphics card will burn around 30-70 W in the memory chips, and uses extremely high-speed PHYs running at around 20 Gbit/s per pin in the current generation.
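As a rough sanity check on that 30-70 W figure, here is a minimal sketch in Python. The energy-per-bit number is my assumption (published GDDR-class estimates are on the order of several pJ/bit for core plus interface), not something from this thread or a datasheet:

    pins = 384               # bus width of a typical high-end card (assumed)
    rate = 20e9              # bits per second per pin, as mentioned above
    energy_per_bit = 7e-12   # joules per bit, assumed GDDR-class figure
    power = pins * rate * energy_per_bit
    print(f"{power:.0f} W")  # ~54 W, inside the 30-70 W range quoted

The point is just that at these data rates, even a few picojoules per bit multiplied across a wide, fast bus lands you in the tens of watts.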
The only thing that really changes between a DIMM and soldered-on memory is a somewhat increased trace length. That accounts for only a very small increase in power.
The PHY is only part of the DRAM chip, and trace length is only one of the factors that determine energy per bit, so increasing trace length by x% wouldn't increase overall power by x%, but by some fraction of that. Thinking about it, there is a bigger problem than just energy/bit scaling: since LPDDR switches dynamically between high-speed and lower-speed modes, the high-speed mode is likely not used much. The low-speed modes, however, rely on being able to turn the termination off, which saves significant amounts of power. A socket may make that difficult to achieve, and if you have to run with ODT in what I assume is the mostly-used low-power mode, you might end up with a significant increase in power.
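To illustrate the termination point with a hedged sketch: rail-terminated signaling of the kind LPDDR uses draws static current through the termination resistor whenever a line sits at one of the two logic levels, so forced-on ODT adds a fixed power floor that the unterminated low-speed mode avoids. Every number below is an illustrative assumption, not a measurement:

    vddq = 0.5     # LPDDR5-class I/O rail, volts (assumed)
    r_term = 60.0  # on-die termination value, ohms (assumed)
    duty = 0.5     # fraction of time a line sits at the current-drawing level (assumed)
    pins = 64      # data pins in a narrow laptop configuration (assumed)

    p_pin = duty * vddq**2 / r_term  # static power burned in the termination
    print(f"per pin: {p_pin*1e3:.1f} mW, bus: {pins*p_pin*1e3:.0f} mW")  # ~2.1 mW, ~133 mW

A tenth of a watt sounds negligible next to an active transfer, but in a mostly-idle low-power state where the whole memory subsystem may be budgeted well below a watt, a fixed floor like that is exactly the kind of significant increase described above.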
Of course there are multiple factors at play, but that observation doesn’t justify the conclusion that any particular factor makes only a ‘very small’ contribution. Do you have even a back-of-an-envelope calculation to support this claim? The knock-on effects of increased trace length can be quite complex.
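For what it's worth, here is a minimal sketch of what such a back-of-an-envelope might look like, treating the extra length purely as added capacitance on unterminated lines. Every number is an assumption for illustration, and it deliberately ignores the termination question raised above, which could easily dominate:

    extra_trace_cm = 5.0   # added routing to reach a DIMM slot (assumed)
    c_per_cm = 1.3e-12     # ~50-ohm microstrip capacitance, F/cm (assumed)
    c_connector = 1.0e-12  # DIMM connector pin capacitance, F (assumed)
    v_swing = 0.5          # LPDDR5-class I/O swing, volts (assumed)
    data_rate = 6.4e9      # transfers per second per pin (assumed)
    toggle_ratio = 0.5     # random data: a transition every other bit
    utilization = 0.1      # fraction of time the bus actually transfers (assumed)
    pins = 64              # data pins (assumed configuration)

    c_extra = extra_trace_cm * c_per_cm + c_connector
    # dynamic power per pin: transitions/s * C * V^2
    p_pin = utilization * toggle_ratio * data_rate * c_extra * v_swing**2
    print(f"extra power, whole bus: {pins * p_pin * 1e3:.0f} mW")  # ~38 mW

Under those assumptions the pure trace-capacitance penalty is tens of milliwatts against a memory subsystem drawing watts, which would support the ‘very small’ claim; but whether a socket forces ODT on in the low-power modes is a separate and potentially much larger question.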