Switching fluctuation, i.e., jitter, in a DC/DC converter's clock signal makes the conducted noise time variant. Averaging mode on an oscilloscope partially removes high-frequency noise during measurement; thus, switching fluctuation affects the accuracy of noise prediction by model identification based on a black-box equivalent-circuit model. In this paper, we first focus on removing the effect of switching fluctuation to improve the accuracy of noise prediction by using a noise-signal decomposition method. We propose a method that decomposes a measured time-domain noise signal into ripple noise and turn-on and turn-off spike noises, preventing accuracy degradation in parameter identification. The proposed peak-detection-based waveform decomposition predicts the noise spectra without switching fluctuation to within a 3-dB prediction error up to 200 MHz. In reality, switching fluctuation spreads the power density of the measured signal. Thus, our second focus is predicting the reduced noise spectrum when switching fluctuation occurs. The noise level of the predicted spectrum agrees well with that of the measured one.
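As a minimal illustration of the peak-detection idea behind such a decomposition, the sketch below splits a toy waveform into spike and ripple components by detecting local maxima of the signal magnitude above a threshold and attributing a short window around each peak to spike noise. The function `decompose`, its `spike_threshold` and `window` parameters, and the toy waveform are hypothetical; the paper's actual decomposition procedure may differ.

```python
import numpy as np

def decompose(signal, spike_threshold, window):
    """Split a noise waveform into ripple and spike parts (illustrative).

    Samples within +/- window of any |signal| local maximum above
    spike_threshold are attributed to turn-on/turn-off spike noise;
    the remainder is treated as ripple noise.
    """
    mag = np.abs(signal)
    # Local maxima of |signal| above the threshold = candidate spikes.
    peaks = [i for i in range(1, len(mag) - 1)
             if mag[i] > spike_threshold
             and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]]
    spike = np.zeros_like(signal)
    for p in peaks:
        lo, hi = max(0, p - window), min(len(signal), p + window + 1)
        spike[lo:hi] = signal[lo:hi]
    ripple = signal - spike   # decomposition is exact: ripple + spike == signal
    return ripple, spike, peaks

# Toy waveform: 50-mV ripple plus one positive and one negative 1-V spike.
t = np.arange(1000)
x = 0.05 * np.sin(2 * np.pi * t / 200)
x[300] += 1.0   # turn-on spike
x[700] -= 1.0   # turn-off spike
ripple, spike, peaks = decompose(x, spike_threshold=0.5, window=5)
```

Because the two components sum exactly to the original signal, their spectra can be analyzed separately and recombined, which is the property the decomposition-based prediction relies on.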