I think you are talking about Volumetric Efficiency, which is a known and accepted term in the automotive industry. The idea of "effective compression," while I understand what you mean, is skewed and misleading. A 13.8:1 "effective compression" under 6 psi of boost does not behave the same as a real 13.8:1 compression ratio. A true compression ratio that high would destroy most engines very quickly on pump gas, but the same "effective compression ratio" achieved by running 6 psi of boost should be perfectly safe in a properly built engine on that same cheap pump gas.
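For reference, the usual back-of-the-envelope "effective compression" formula people quote just scales the static compression ratio by absolute manifold pressure over atmospheric. A quick sketch (the 9.8:1 static ratio is an assumed example that happens to land on 13.8:1 at 6 psi; sea-level atmosphere taken as 14.7 psi):

```python
# Naive "effective compression ratio" formula often quoted on forums:
#   eff_cr = static_cr * (atmospheric + boost) / atmospheric
# Example values (9.8:1 static, 6 psi boost) are assumptions for illustration.

ATM_PSI = 14.7  # sea-level atmospheric pressure, psi

def naive_effective_cr(static_cr: float, boost_psi: float) -> float:
    """Scale the static compression ratio by absolute intake pressure."""
    return static_cr * (ATM_PSI + boost_psi) / ATM_PSI

print(round(naive_effective_cr(9.8, 6.0), 1))  # -> 13.8 "effective" compression
```

This is the arithmetic people are doing when they quote numbers like 13.8:1; the rest of the post is about why that number doesn't mean what they think it means.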
The difference is that the forced-induction motor forces a larger volume of air/fuel mixture into the cylinder, as opposed to an NA motor squeezing a smaller volume to get a high compression ratio. The theoretical maximum Volumetric Efficiency for an NA motor is 1, i.e. 100% (although with proper tuning of the intake pulses a slightly higher VE can be achieved), while a forced-induction motor can have a VE well above 1 (a VE of 0.9 would be 90%). VE depends not only on boost but also on cam profiles, head geometry, valve sizes, intake configuration, etc., so even a boosted motor that is poorly designed can have a VE below 1.
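To make the VE definition concrete: it's the mass of air the engine actually traps per intake stroke, divided by the mass the swept volume would hold at ambient density. A minimal sketch (the mass-flow and rpm figures below are made-up illustrative values, not measurements):

```python
# Volumetric efficiency: air mass actually inducted per cycle divided by the
# mass the displacement would hold at ambient density.
# All example numbers are assumptions for illustration only.

AIR_DENSITY_KG_M3 = 1.204  # ambient air density at roughly 20 C, sea level

def volumetric_efficiency(air_mass_flow_kg_s: float,
                          displacement_l: float,
                          rpm: float) -> float:
    intake_events_per_s = rpm / 60 / 2                      # 4-stroke: one intake per 2 revs
    ideal_mass_per_cycle = displacement_l / 1000 * AIR_DENSITY_KG_M3
    actual_mass_per_cycle = air_mass_flow_kg_s / intake_events_per_s
    return actual_mass_per_cycle / ideal_mass_per_cycle

# Hypothetical 2.0 L engine at 6000 rpm:
print(round(volumetric_efficiency(0.11, 2.0, 6000), 2))  # NA-ish flow -> ~0.91
print(round(volumetric_efficiency(0.15, 2.0, 6000), 2))  # boosted flow -> ~1.25
```

The same formula covers both cases: the NA motor tops out near 1 no matter how well it breathes, while the boosted one sails past it because the charge going in is denser than ambient.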
Hopefully this explains why the concept of "effective compression" is flawed; sure, it's an easy calculation that sounds good, but it doesn't actually tell you anything useful unless you also account for the Volumetric Efficiency of the engine without boost.
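One way to see what the naive number hides: under a simple polytropic compression model (an idealization; the exponent of 1.3 and the intake charge temperature are assumptions, and real engines add heat transfer, valve timing, and intercooling on top), the same "13.8:1 effective" figure gives noticeably different end-of-compression conditions depending on whether it comes from static ratio or from boost:

```python
# Idealized polytropic compression: P_end = P_start * CR**gamma and
# T_end = T_start * CR**(gamma - 1). A rough sketch, not an engine model.
GAMMA = 1.3        # assumed polytropic exponent for an air/fuel charge
ATM_PSI = 14.7     # sea-level atmospheric pressure, psi
T_INTAKE_K = 320   # assumed charge temperature at intake valve closing

def end_of_compression(static_cr: float, boost_psi: float):
    p_start = ATM_PSI + boost_psi
    p_end = p_start * static_cr ** GAMMA
    t_end = T_INTAKE_K * static_cr ** (GAMMA - 1)
    return p_end, t_end

# True 13.8:1 NA motor vs a 9.8:1 motor at 6 psi ("13.8:1 effective"):
na_p, na_t = end_of_compression(13.8, 0.0)
boost_p, boost_t = end_of_compression(9.8, 6.0)
print(f"NA 13.8:1     -> {na_p:.0f} psi, {na_t:.0f} K")
print(f"9.8:1 + 6 psi -> {boost_p:.0f} psi, {boost_t:.0f} K")
```

In this sketch the genuinely high-compression motor ends compression hotter than the boosted one, even though the naive formula says they're "the same" 13.8:1, and charge temperature is what drives knock on pump gas. A turbo setup also gets a chance to shed heat in an intercooler before the cylinder, which the static-ratio motor never does.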
Sorry, I didn't mean to sidetrack this thread; it just bugs me when people throw around artificial compression ratios that are completely bogus because they think it sounds cool, like "yeah, I'm running 15 psi of boost so I've got an 18:1 compression ratio." A motor with that kind of compression ratio could run on diesel fuel, not gasoline.