We are having a hard time reducing supply fan energy in our energy simulations. Could anybody shed some light on the following situation:

1.) The project at hand is a plant that strictly regulates pressure differentials among interior spaces.
2.) The proposed design includes a VSD on the air handler fan to adjust fan speed as airflow drops due to loaded filters, fouled cooling coils, etc. Unlike a typical VAV system, the VSD on the air handler fan is not intended to vary supply airflow in response to room sensible load.
3.) Baseline System 6 requires VAV fan control. In my view, the System 6 VAV baseline does not reflect the actual operation, since modulating supply airflow to track room sensible load may not simultaneously maintain the required pressure differentials among interior spaces.
4.) As a consequence, the fan energy of the proposed design comes out considerably higher than that of the baseline case (see the quick sketch after this list).
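
To illustrate where the gap comes from, here is a rough back-of-envelope sketch in Python. The cubic coefficients are the part-load fan power curve commonly cited for the Appendix G VAV-with-VSD baseline (please verify them against your edition of 90.1); the design fan power and operating hours are placeholders for illustration, not our actual project numbers:

    def baseline_vav_fan_fraction(plr):
        # Part-load fan power curve commonly cited for the Appendix G VAV + VSD baseline
        # (verify the coefficients against your edition of 90.1).
        return 0.0013 + 0.1470 * plr + 0.9506 * plr ** 2 - 0.0998 * plr ** 3

    design_kw = 75.0   # placeholder design fan power, kW (not our actual value)
    hours = 8760       # assumed continuous operation for the plant

    for plr in (1.0, 0.9, 0.8, 0.6):
        vav_kwh = baseline_vav_fan_fraction(plr) * design_kw * hours
        cv_kwh = 1.0 * design_kw * hours   # proposed fan held near design flow year-round
        print(f"PLR {plr:.1f}: baseline {vav_kwh:,.0f} kWh vs. near-constant {cv_kwh:,.0f} kWh")

Even a modest turndown to 90% flow puts the baseline fan at roughly 83% of design power, while our proposed fan stays near 100%, which is why the proposed fan energy ends up higher.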

Is Table G3.1.1B in Appendix G an ironclad rule to follow?
Can it be relaxed when the baseline is not comparable with the actual operation?
Is there a minimum VAV flow fraction considered sufficient to maintain the relative pressure differentials among interior spaces?

Even if VAV were applicable to the factory in question, I suspect that neither the proposed nor the baseline fan would save significant energy, not to mention the barrage of pressure sensors and complex controls needed to make the system run as intended.
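
As a rough check, if pressure control kept the system at, say, 90% of design flow most of the time (a fraction I am only assuming for illustration), the part-load curve above would put the fan at roughly 83% of design power, which seems a fairly modest saving for the added sensors and controls.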

Thank you very much in advance for all the responses.