Applying Neural Networks in Optical Communication Systems: Possible Pitfalls

01 January 2017

We investigate the risk of overestimating the performance gain when applying neural-network-based receivers in systems that use pseudo-random bit sequences (PRBS) or, due to limited memory depths, repeated short patterns.

We show that with such sequences a large artificial gain can be obtained, and that it comes from the network predicting the pattern itself rather than predicting or compensating the channel or phenomena under study.
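The mechanism behind this artificial gain can be illustrated without any neural network at all. A minimal sketch (assumed details: a PRBS-7 generator with polynomial x^7 + x^6 + 1 and an arbitrarily chosen seed): because every PRBS bit is a deterministic linear function of earlier bits, any receiver with enough memory can "predict" the transmitted bit perfectly from past bits alone, with zero knowledge of the channel.

```python
def prbs7(seed=0x5A, n=400):
    """Generate n bits of a PRBS-7 sequence (polynomial x^7 + x^6 + 1)."""
    state = seed & 0x7F                          # 7-bit LFSR state, must be nonzero
    out = []
    for _ in range(n):
        fb = ((state >> 6) ^ (state >> 5)) & 1   # feedback taps at stages 7 and 6
        out.append((state >> 6) & 1)             # output the MSB each cycle
        state = ((state << 1) | fb) & 0x7F       # shift left, insert feedback bit
    return out

bits = prbs7()

# A "receiver" that has merely memorised the generator recurrence
# b[n] = b[n-7] XOR b[n-6] predicts every bit without touching the channel.
pred = [bits[i - 7] ^ bits[i - 6] for i in range(7, len(bits))]
accuracy = sum(p == b for p, b in zip(pred, bits[7:])) / len(pred)

print(accuracy)          # perfect prediction from the pattern alone
print(len(set(map(tuple, [bits[i:i + 127] for i in (0, 127)]))))  # period 127
```

A learned model trained and tested on such a sequence can reach the same result, so the apparent "gain" over a conventional receiver reflects pattern memorisation rather than equalisation of the studied impairment.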