A promising application of Process Analytical Technology in the downstream processing of monoclonal antibodies (mAbs) is the monitoring of the Protein A load phase, as its control promises economic benefits. Different spectroscopic techniques have been evaluated in the literature with regard to their ability to quantify the mAb concentration in the column effluent. Raman and ultraviolet (UV) spectroscopy are among the most promising techniques. In this study, both were investigated in an in-line setup and compared directly. The data of each sensor were analyzed independently with Partial Least Squares (PLS) models and Convolutional Neural Networks (CNNs) for regression. Furthermore, data fusion strategies were investigated by combining both sensors in hierarchical PLS models or in CNNs. Among the tested options, UV spectroscopy alone allowed for the most precise and accurate prediction of the mAb concentration. A Root Mean Square Error of Prediction (RMSEP) of 0.013 g L⁻¹ was reached with the UV-based PLS model; the Raman-based PLS model reached an RMSEP of 0.232 g L⁻¹. None of the data fusion techniques improved the prediction accuracy beyond that of the UV-based PLS model; data fusion by PLS models appears to offer little benefit when a highly accurate sensor is combined with a much less accurate one. Likewise, applying CNNs to the UV and Raman spectra did not yield significant improvements in prediction quality. For the presented application, linear regression techniques seem better suited than advanced nonlinear regression techniques such as CNNs. In summary, the results support the application of UV spectroscopy and PLS modeling for future research and development activities aiming to implement spectroscopic real-time monitoring of the Protein A load phase.
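
For reference, the RMSEP values quoted above follow the standard definition (implied but not spelled out in the abstract), computed over the $n$ samples of the prediction set:

$$\mathrm{RMSEP} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2}$$

where $\hat{y}_i$ is the predicted and $y_i$ the reference (offline) mAb concentration of sample $i$, both in g L⁻¹.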
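The abstract gives no implementation details, so the following is only a minimal, hypothetical sketch of how a PLS calibration on spectra and its RMSEP evaluation could look in Python with scikit-learn. The arrays `X` and `y` are random placeholders standing in for preprocessed UV spectra and offline concentration references; the number of latent variables and the calibration/prediction split are illustrative assumptions, not the authors' actual settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder data: 100 spectra with 256 wavelength channels and
# matching reference concentrations in g/L (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 256))
y = rng.uniform(0.0, 2.0, size=100)

# Split into calibration and prediction sets.
X_cal, X_pred, y_cal, y_pred_ref = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Fit the PLS model; in practice the number of latent variables
# would be selected by cross-validation on the calibration set.
pls = PLSRegression(n_components=5)
pls.fit(X_cal, y_cal)

# Predict on the held-out set and compute the RMSEP.
y_hat = pls.predict(X_pred).ravel()
rmsep = np.sqrt(mean_squared_error(y_pred_ref, y_hat))
print(f"RMSEP: {rmsep:.3f} g/L")
```

A hierarchical (multiblock) fusion variant, as mentioned in the abstract, would concatenate or weight the score spaces of sensor-specific PLS models before a second regression step; the sketch above covers only the single-sensor case.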