I'm currently simulating the BER performance of the CCSDS convolutional code, and I'm having some trouble specifying EbNo and SNR in GNU Radio blocks. From MATLAB web pages on BER simulation for RRC pulse shaping and convolutional codes, the SNR is defined by the equation
$SNR|_{dB} = EbNo|_{dB} + 10\log_{10}(k \times rate) - 10\log_{10}(sps)$
For uncoded BPSK, rate = 1, k = 1 bit/symbol, and sps is the number of samples per symbol. The noise variance was calculated as
$\sigma^{2} = 10^{-SNR|_{dB}/10}$
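To keep the numbers straight, I compute the noise voltage in the flowgraph's Python variables roughly like this (plain NumPy; the function name and example values are my own, not from the MATLAB pages):

```python
import numpy as np

def noise_voltage(ebno_db, rate=1.0, k=1, sps=4):
    """Noise amplitude (sigma) for the Channel Model block,
    from the EbNo -> SNR relation quoted above."""
    snr_db = ebno_db + 10 * np.log10(k * rate) - 10 * np.log10(sps)
    sigma2 = 10 ** (-snr_db / 10)   # noise variance
    return np.sqrt(sigma2)          # Channel Model expects an amplitude

# Example: uncoded BPSK, 4 samples/symbol, Eb/N0 = 6 dB
print(noise_voltage(6.0, rate=1.0, k=1, sps=4))
```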
I'm trying to carry these equations over into GNU Radio, starting with the uncoded BPSK case shown in the first flowgraph. The bpskPulseShapeRRC block is a hierarchical block:
map [-1, 1] -> Char To Float -> interp_fir_filter(int(sps), taps).
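In Python terms the chain inside that hierarchical block is roughly the following (a sketch; the RRC parameters shown are illustrative defaults, not necessarily the exact ones in my flowgraph):

```python
from gnuradio import gr, blocks, digital, filter
from gnuradio.filter import firdes

class bpskPulseShapeRRC(gr.hier_block2):
    """Unpacked bits in -> {-1, +1} mapping -> float -> RRC interpolation."""
    def __init__(self, sps=4, alpha=0.35, ntaps=45, gain=1.0):
        gr.hier_block2.__init__(
            self, "bpskPulseShapeRRC",
            gr.io_signature(1, 1, gr.sizeof_char),
            gr.io_signature(1, 1, gr.sizeof_float))
        # RRC taps; this 'gain' is the filter gain mentioned below --
        # setting it to sps instead of 1.0 is the workaround I stumbled on
        taps = firdes.root_raised_cosine(gain, sps, 1.0, alpha, ntaps)
        self.mapper = digital.map_bb([-1, 1])       # bit 0 -> -1, bit 1 -> +1
        self.c2f = blocks.char_to_float(1, 1.0)
        self.rrc = filter.interp_fir_filter_fff(int(sps), taps)
        self.connect(self, self.mapper, self.c2f, self.rrc, self)
```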
$\sigma$ was used as the noise voltage of the Channel Model block. Running the flowgraph as-is produced a BER curve that was quite far off. Surprisingly, using a noise voltage of
$\frac{\sigma}{sps}$
(or, equivalently, setting the bpskPulseShapeRRC filter gain to sps) produced the correct BER curve. I don't understand why; any explanation is welcome.
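To make the scaling question concrete, this is the kind of NumPy sanity check I ran. The RRC taps here come from the textbook formula and are normalised so they sum to 1, which is my assumption about what firdes.root_raised_cosine with gain = 1 produces; with those taps the average power of the pulse-shaped signal comes out well below 1, while the noise-variance formula above implicitly assumes unit signal power:

```python
import numpy as np

sps, ntaps, alpha = 4, 45, 0.35          # illustrative values, not my exact flowgraph

def rrc_taps(sps, ntaps, alpha):
    """Textbook root-raised-cosine taps, normalised to unit DC gain (sum = 1),
    which is what I assume a unit-gain firdes.root_raised_cosine call gives."""
    t = (np.arange(ntaps) - (ntaps - 1) / 2) / sps   # time in symbol periods
    h = np.empty(ntaps)
    for i, ti in enumerate(t):
        if np.isclose(ti, 0.0):
            h[i] = 1 - alpha + 4 * alpha / np.pi
        elif np.isclose(abs(4 * alpha * ti), 1.0):
            h[i] = (alpha / np.sqrt(2)) * (
                (1 + 2 / np.pi) * np.sin(np.pi / (4 * alpha))
                + (1 - 2 / np.pi) * np.cos(np.pi / (4 * alpha)))
        else:
            h[i] = (np.sin(np.pi * ti * (1 - alpha))
                    + 4 * alpha * ti * np.cos(np.pi * ti * (1 + alpha))) \
                   / (np.pi * ti * (1 - (4 * alpha * ti) ** 2))
    return h / h.sum()

rng = np.random.default_rng(0)
symbols = 2.0 * rng.integers(0, 2, 100_000) - 1.0    # +/-1 BPSK symbols
upsampled = np.zeros(len(symbols) * sps)
upsampled[::sps] = symbols                           # zero-stuffing, as interp_fir_filter does
tx = np.convolve(upsampled, rrc_taps(sps, ntaps, alpha))
print("average tx power:", np.mean(tx ** 2))         # well below 1 with these unit-sum taps
```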
Coming back to the convolutional FEC, the second flowgraph was executed after setting the coding rate to 0.5. Unfortunately, the BER values obtained were far off compared to the BER curves in the CCSDS Green Books. My guess is that this has something to do with an incorrect noise voltage calculation. Any suggestions?
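For completeness, the noise voltage I feed the Channel Model in the coded run is computed the same way, just with rate = 0.5; at a given Eb/N0 that only lowers the SNR by about 3 dB relative to the uncoded case (the Eb/N0 = 4 dB and sps = 4 below are just example numbers):

```python
import numpy as np

ebno_db, k, sps = 4.0, 1, 4                     # example values only
for rate in (1.0, 0.5):                         # uncoded BPSK vs. rate-1/2 CC
    snr_db = ebno_db + 10 * np.log10(k * rate) - 10 * np.log10(sps)
    sigma = np.sqrt(10 ** (-snr_db / 10))       # channel-model noise voltage
    print(f"rate={rate}: SNR = {snr_db:.2f} dB, sigma = {sigma:.3f}")
```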

