I am building a BPSK recovery system in Simulink, with the end goal of using HDL Coder to implement it on an FPGA.
I designed the algorithm in MATLAB first, and it works fine. However, translating it to Simulink is giving me trouble.
One of my first steps is to filter my data, which I implemented in MATLAB as
data = filter(filfir2,1,data);
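For context, here is a minimal sketch of how I understand this call, assuming `filfir2` holds the FIR coefficient vector (the sample rate and cutoff below are placeholders, not my actual parameters):

```matlab
% Sketch: a 100-tap Hamming-window lowpass FIR, as filter() sees it.
% fir1 uses a Hamming window by default; order 99 gives 100 taps.
Fs = 1e6;                         % placeholder sample rate
Fc = 100e3;                       % placeholder cutoff
filfir2 = fir1(99, Fc/(Fs/2));    % 100-tap FIR coefficient vector
data = filter(filfir2, 1, data);  % denominator = 1, i.e. a pure FIR
```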
Implementing the same thing in Simulink, however, using the Digital Filter Design block (a 100-tap Hamming-window direct-form FIR with the same cutoff parameters, resulting in the same frequency response), gives very different results with the same data:
The orange trace is the data filtered through the Simulink block, and the blue trace is the data filtered through MATLAB and brought into Simulink with a From Workspace block.
I also tried:
1) Exporting my Simulink filter to MATLAB and using it there before sending the filtered data to Simulink
2) Importing my MATLAB filter (as a dffir block) into Simulink
Both performed identically to the original MATLAB filter (i.e., the blue trace).
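To quantify the mismatch between the two traces, I compared the logged outputs offline; this sketch (with `y_ml` and `y_sl` as hypothetical names for the logged MATLAB-filtered and Simulink-filtered signals) checks whether the difference is just a latency offset or something deeper:

```matlab
% Sketch: compare the two logged traces.
% y_ml = MATLAB-filtered data (blue trace)
% y_sl = output logged from the Simulink filter block (orange trace)
[c, lags] = xcorr(y_sl, y_ml);
[~, k] = max(abs(c));
lag = lags(k);                    % best-aligning sample offset
fprintf('Best-aligning lag: %d samples\n', lag);
% If the residual is still large after shifting by this lag, the
% difference is not just filter latency -- compare coefficients,
% data types, and sample times next.
```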
Could this be a sample-time issue, where the discrepancy only appears once the model is actually simulated?
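One thing I considered while checking this: giving the From Workspace block an explicit time vector, so its sample time is pinned to the same rate as the rest of the model rather than inherited. A sketch, with `Fs` as a placeholder for my actual sample rate:

```matlab
% Sketch: build a time/value structure for the From Workspace block
% so the imported signal carries an explicit, matching sample time.
Fs = 1e6;                          % placeholder sample rate
t = (0:numel(data)-1).' / Fs;      % uniform time vector
simin.time = t;
simin.signals.values = data(:);    % column vector of samples
% Then set the From Workspace block's "Data" parameter to 'simin'.
```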