Multiphase buck converters are the industry standard for powering high-current, low-voltage processors. The conventional interleaving strategy, in which phases are switched sequentially with uniform phase shifts, is widely adopted to reduce output ripple, but its effectiveness relies on the assumption of an ideal power delivery network (PDN). This paper demonstrates that parasitic impedance between phase outputs can induce severe current stress on the output capacitors and increase the output voltage ripple. To address this issue, this work introduces two PDN-aware phase sequencing strategies: Odd-Even and Symmetric-Pairing. Circuit simulations using a PDN model extracted with ANSYS Q3D show that the proposed sequences, Odd-Even in particular, reduce total capacitor ESR losses by up to 65% and simultaneously reduce the output voltage ripple by 28% compared with the conventional approach. These findings provide a practical co-design guideline linking converter control strategy to the physical layout of high-density power solutions.