
Verifying Millimeter Wave RF Electronics on a Zynq RFSoC Based Digital Baseband

By Matthew Weiner, RF Pixels


Emerging 5G networks operate in the millimeter wave spectrum, which means that they can carry more data at higher speeds and with lower latency than 4G networks. While millimeter wave spectrum technology has great potential, it also presents design challenges for device manufacturers. For example, signals in the millimeter wave spectrum are more attenuated by the atmosphere and other objects than lower-frequency signals.

My colleagues and I are developing radio front ends with specialized RF electronics hardware that overcomes this attenuation by focusing millimeter wave signal power with beamforming. Our designs incorporate multi-user, multiple-input and multiple-output (MU-MIMO) technology.

To test and demonstrate these designs, we implemented our own digital baseband in MATLAB® and Simulink® (Figure 1). We accelerated implementation by adapting the LTE golden reference model from Wireless HDL Toolbox™ and deploying it to a Zynq® UltraScale+™ RFSoC board using HDL Coder™. This approach saved us at least a year of engineering effort and enabled me to complete the implementation myself without having to hire an additional digital engineer.

Figure 1. LTE digital baseband receive chain modeled in Simulink.

Modeling and Simulating the Digital Baseband

Out of the box, the golden reference LTE model from Wireless HDL Toolbox provided a number of key capabilities, such as Master Information Block (MIB) decoding. I used these capabilities to build a custom 4G-like OFDM transceiver chain, adding enhancements to the existing timing recovery, carrier recovery, and equalization.
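The core OFDM mechanics of such a transceiver chain can be illustrated with a minimal sketch. The author's implementation is in MATLAB and Simulink; the snippet below is an illustrative Python/NumPy stand-in, and the parameters (a 64-point FFT and a 16-sample cyclic prefix) are hypothetical examples, not the actual design values:

```python
import numpy as np

# Hypothetical, LTE-like parameters chosen for illustration only.
N_FFT = 64    # subcarriers per OFDM symbol
CP_LEN = 16   # cyclic prefix length

def ofdm_modulate(symbols):
    """Map frequency-domain symbols to one time-domain OFDM symbol with CP."""
    time_domain = np.fft.ifft(symbols, N_FFT)
    # Prepend the last CP_LEN samples as the cyclic prefix.
    return np.concatenate([time_domain[-CP_LEN:], time_domain])

def ofdm_demodulate(samples):
    """Strip the cyclic prefix and return the frequency-domain symbols."""
    return np.fft.fft(samples[CP_LEN:], N_FFT)

# Round-trip check with random QPSK symbols.
rng = np.random.default_rng(0)
qpsk = ((2 * rng.integers(0, 2, N_FFT) - 1) +
        1j * (2 * rng.integers(0, 2, N_FFT) - 1)) / np.sqrt(2)
recovered = ofdm_demodulate(ofdm_modulate(qpsk))
print(np.allclose(qpsk, recovered))  # True
```

The cyclic prefix is what makes the later sampling-rate-offset issue observable: as long as the receive window stays inside the prefix, the FFT output is only rotated, not corrupted.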

I simulated this transceiver chain with a simple channel model from Wireless HDL Toolbox. The simulations enabled me to validate the baseband model by evaluating and visualizing metrics such as symbol error rate (SER) and error vector magnitude (EVM) for various levels of noise (Figure 2).
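The kind of SNR sweep behind Figure 2 can be sketched as follows. This is an illustrative Python/NumPy stand-in for the MATLAB analysis; the hard-decision QPSK demodulator, AWGN channel, and symbol count are assumptions for the example, not the article's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def evm_ser_for_snr(snr_db, n_sym=10_000):
    """Return (EVM in percent, SER) for QPSK over a simple AWGN channel."""
    bits = rng.integers(0, 2, (n_sym, 2))
    tx = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)  # per-dimension std dev
    rx = tx + noise_std * (rng.standard_normal(n_sym) +
                           1j * rng.standard_normal(n_sym))
    # Hard-decision QPSK demodulation: sign of I and Q.
    rx_bits = np.stack([(rx.real > 0), (rx.imag > 0)], axis=1).astype(int)
    ser = np.mean(np.any(rx_bits != bits, axis=1))
    evm = 100 * np.sqrt(np.mean(np.abs(rx - tx) ** 2) /
                        np.mean(np.abs(tx) ** 2))
    return evm, ser

for snr in (0, 10, 20):
    evm, ser = evm_ser_for_snr(snr)
    print(f"SNR {snr:2d} dB: EVM {evm:5.1f}%, SER {ser:.4f}")
```

For AWGN, EVM tracks the noise directly (roughly 10^(−SNR/20) of the signal amplitude), while SER falls off much more steeply, which is the shape visible in plots like Figure 2.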

Figure 2. Plots of EVM (left) and SER (right) as a function of signal-to-noise ratio (SNR).

Implementing the Baseband on Zynq RFSoC Hardware

After verifying the digital model through Simulink simulations, I generated RTL code from the model with HDL Coder and deployed it to the Zynq UltraScale+ RFSoC ZCU111 board. The generated code was both efficient and readable. I verified the implementation by performing digital loop-back tests on the Zynq board’s FPGA, passing the transmit output directly back into the receive chain. I followed those tests with analog loop-back tests, which incorporated the analog-to-digital (ADC) and digital-to-analog (DAC) converters on the board (Figure 3).

Figure 3. Full system diagram showing the digital baseband implemented in HDL with the RF Pixels radio front end.

At that point, I could run full board-to-board tests and explore the effects of RF impairments, using MATLAB to analyze the data captured from the board, generate constellation plots, and evaluate algorithm enhancements to address the impairments.

Rapid Design Iterations

In the past, I’ve worked in more traditional workflows in which an RTL team implements the design produced by the systems team. The iterations in this workflow tend to take a long time; it can take weeks to implement and retest changes to an algorithm. My iterations with MATLAB and Simulink were much faster, and I could usually implement and retest an enhancement in a matter of days, if not on the same day.

In one instance, I noticed that, while the system performed well shortly after system startup, the bit error rate (BER) steadily increased over time. To diagnose the problem, I captured data from the ADC at various time intervals after startup and analyzed it in MATLAB. The constellation plots clearly showed how the performance degraded over time.

I determined that the issue was caused by a sampling rate offset, which led the receive window to drift gradually outside the cyclic prefix region of the LTE frame. I implemented an algorithm change to track the primary synchronization signal and correct this drift. I verified the fix via simulation and then implemented it on the board, where the BER stayed low no matter how long the system operated (Figure 4).
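The timing-tracking idea can be sketched with a cross-correlation against the known synchronization sequence. The sequence below is a pseudo-random stand-in for the actual LTE primary synchronization signal, and the drift values are invented for illustration; the real fix ran in the HDL receive chain, not offline Python:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the LTE primary synchronization signal (PSS):
# a known unit-magnitude complex sequence the receiver correlates against.
pss = np.exp(1j * 2 * np.pi * rng.random(64))

def estimate_timing(rx, ref):
    """Locate the sync sequence in the received stream by cross-correlation."""
    corr = np.abs(np.correlate(rx, ref, mode="valid"))
    return int(np.argmax(corr))

# Simulate a slow sampling-rate-offset drift: the frame start moves by a few
# samples between captures. Tracking the PSS position re-centers the window.
offsets = []
for drift in (0, 3, 7):
    rx = np.concatenate([
        0.1 * rng.standard_normal(100 + drift) + 0j,  # noise before the frame
        pss,
        0.1 * rng.standard_normal(50) + 0j,           # noise after the frame
    ])
    offsets.append(estimate_timing(rx, pss))
print(offsets)  # [100, 103, 107]
```

Re-estimating the peak position on every frame turns the slow drift into a per-frame correction, so the FFT window never leaves the cyclic prefix.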

Figure 4. Constellation plots showing degraded performance (top left), board-to-board testing (top right), and over-the-air testing (bottom).

Later, I found an issue with IQ gain and phase imbalance. Although we thought we had calibrated our system to handle IQ imbalance, I discovered that the calibration parameter values were not correct. Once again, I analyzed the captured data in MATLAB and then performed a quick brute-force search to find calibration values that corrected the problem. I updated the Simulink model to implement the change and generated code to verify the fix on live hardware within minutes.
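A brute-force calibration search of this kind can be sketched as follows. This is an illustrative Python/NumPy version under assumptions: a simple one-sided IQ-imbalance model (one of several common conventions), known transmit symbols, and invented gain/phase values; the author's actual search ran in MATLAB against captured hardware data:

```python
import numpy as np

rng = np.random.default_rng(2)

def apply_iq_imbalance(x, gain, phase):
    """Apply a simple IQ gain/phase imbalance model to the Q branch."""
    i, q = x.real, x.imag
    return i + 1j * gain * (q * np.cos(phase) + i * np.sin(phase))

def correct_iq_imbalance(r, gain, phase):
    """Invert the imbalance model for a candidate (gain, phase) pair."""
    i, q_meas = r.real, r.imag
    q = (q_meas / gain - i * np.sin(phase)) / np.cos(phase)
    return i + 1j * q

# Known QPSK reference symbols and an impaired capture (values invented).
tx = (rng.choice([-1, 1], 500) + 1j * rng.choice([-1, 1], 500)) / np.sqrt(2)
rx = apply_iq_imbalance(tx, gain=1.1, phase=np.deg2rad(5))

# Brute-force grid search: pick the (gain, phase) pair whose correction
# minimizes the residual error against the reference constellation.
best = min(
    ((g, p) for g in np.linspace(0.9, 1.2, 31)
            for p in np.deg2rad(np.linspace(-10, 10, 41))),
    key=lambda gp: np.mean(np.abs(correct_iq_imbalance(rx, *gp) - tx) ** 2),
)
print(round(best[0], 2), round(np.rad2deg(best[1]), 1))  # 1.1 5.0
```

A coarse grid like this is cheap for a two-parameter problem, and the winning pair can then be refined or simply loaded as the new calibration constants.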

Planned Enhancements

We are planning a 5G version of our digital baseband and working on extending our RF technology to meet the specification from the O-RAN Alliance for open radio access networks. Providing an O-RAN interface to our designs will make it easier to integrate our IP with other systems even as we continue to improve performance and add new capabilities.

Published 2020