Study on Online Sequential Extreme Learning Machine and Its Application
Extreme learning machine (ELM) is a single-hidden-layer feedforward neural network. The classical ELM is a batch learning algorithm, in which all training data must be available before training starts. In many real-world applications, however, data arrive continuously and may never end. For such online applications, the Online Sequential Extreme Learning Machine (OS-ELM) was proposed: it processes data arriving one by one or chunk by chunk, and offers good generalization performance and fast learning speed.
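As a point of reference, the sequential phase of OS-ELM is essentially a recursive least-squares update on the output weights. The following minimal sketch is our own illustration of that standard formulation, not code from the thesis; the class name `OSELM`, the sigmoid activation, and the small ridge term are our choices:

```python
import numpy as np

def hidden_output(X, W, b):
    # Random-feature hidden layer with sigmoid activation
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

class OSELM:
    def __init__(self, n_in, n_hidden, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.standard_normal((n_in, n_hidden))  # fixed random input weights
        self.b = rng.standard_normal(n_hidden)          # fixed random biases
        self.beta = None                                # output weights (learned)
        self.P = None                                   # inverse correlation matrix

    def init_fit(self, X0, T0):
        # Initial batch phase: beta0 = (H0^T H0)^-1 H0^T T0
        H0 = hidden_output(X0, self.W, self.b)
        self.P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(H0.shape[1]))
        self.beta = self.P @ H0.T @ T0

    def partial_fit(self, X, T):
        # Sequential phase: RLS update for a newly arrived chunk
        H = hidden_output(X, self.W, self.b)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P -= self.P @ H.T @ K @ H @ self.P
        self.beta += self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return hidden_output(X, self.W, self.b) @ self.beta
```

The initial batch must contain at least as many samples as hidden nodes so that the initial Gram matrix is invertible; after that, each chunk only triggers a rank-limited update of `P` and `beta`.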
This thesis makes improvements in two respects: robustness to non-Gaussian noise and automatic selection of the optimal hidden-layer structure. The proposed algorithms are then applied to channel equalization in an OFDM system. The main contributions are as follows:
(1) OS-ELM constructs its cost function with the Mean Squared Error (MSE) criterion. Because this criterion considers only the second-order statistics of the data, it performs poorly on nonlinear and non-Gaussian distributed data. This thesis develops a new online sequential extreme learning machine based on robust recursive least squares, named RR-OSELM. The cost function is built from the Maximum Correntropy Criterion (MCC), and the half-quadratic optimization method transforms it into a quadratic problem, from which a recursive formula for the output weights is derived. A theoretical proof of convergence is given, and experiments show that the proposed algorithm is robust when the desired data are contaminated by non-Gaussian noise.
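To illustrate the half-quadratic idea behind this kind of scheme (a batch sketch of our own, not the thesis's recursive RR-OSELM derivation; the function name `mcc_weighted_ls` and kernel width `sigma` are our choices): fixing the half-quadratic auxiliary variables turns the MCC problem into a weighted least-squares problem, where each sample receives a Gaussian-kernel weight that vanishes for large residuals, so outliers are effectively ignored:

```python
import numpy as np

def mcc_weighted_ls(H, t, sigma=1.0, iters=20):
    """Half-quadratic optimization of the Maximum Correntropy Criterion.
    Each pass re-solves a weighted least-squares problem whose sample
    weights exp(-e^2 / (2 sigma^2)) down-weight large residuals."""
    beta = np.linalg.lstsq(H, t, rcond=None)[0]    # MSE solution as start
    for _ in range(iters):
        e = t - H @ beta
        w = np.exp(-e**2 / (2 * sigma**2))         # Gaussian kernel weights
        Hw = H * w[:, None]
        # Weighted normal equations: (H^T W H) beta = H^T W t
        beta = np.linalg.solve(H.T @ Hw + 1e-8 * np.eye(H.shape[1]), Hw.T @ t)
    return beta
```

On data with impulsive outliers, this weighted solution stays close to the true parameters while the plain MSE solution is pulled away, which is the robustness property the abstract refers to.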
(2) In OS-ELM the number of hidden nodes must be chosen manually and at random: an oversized hidden layer causes over-fitting, while an undersized one reduces estimation accuracy. To obtain the optimal hidden-layer structure automatically, this thesis develops a new online sequential extreme learning machine based on sparse recursive least squares (S-OSELM). Adding l0-norm and l1-norm regularization penalty terms to the cost function yields a sparse solution for the output weights, and the sub-gradient method gives their online update formula. To prevent an improperly chosen regularization parameter from degrading performance, an adaptive method for tuning the regularization parameter is also proposed. The convergence of S-OSELM is proved theoretically, and simulation results show that the proposed S-OSELM-l0 and S-OSELM-l1 algorithms achieve higher accuracy with a smaller hidden layer.
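The l1 sub-gradient mechanism can be illustrated in miniature with plain stochastic gradient steps on a linear model (our own sketch; the thesis's S-OSELM instead builds the sub-gradient into a recursive least-squares update, and the name `sparse_online_ls`, step size, and penalty weight here are assumptions):

```python
import numpy as np

def sparse_online_ls(X, t, lam=0.01, lr=0.05, epochs=30):
    """Online least squares with an l1 sub-gradient penalty term.
    Each sample triggers a step on 0.5*e^2 + lam*||beta||_1, whose
    sub-gradient is -e*x + lam*sign(beta); the penalty drives the
    weights of irrelevant features toward zero (a sparse solution)."""
    beta = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, y in zip(X, t):
            e = y - x @ beta
            beta += lr * (e * x - lam * np.sign(beta))
    return beta
```

When the data are generated by a sparse model, the recovered weight vector has near-zero entries on the irrelevant features; in S-OSELM the same effect is what allows pruning hidden nodes whose output weights are driven to zero.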
(3) In an Orthogonal Frequency Division Multiplexing (OFDM) system, signals passing through a high-power amplifier (HPA) and a fading channel suffer nonlinear distortion and multipath effects that severely distort the received signal and degrade communication performance. Most existing neural-network-based OFDM channel equalization methods require offline training on a channel model in advance, and their performance deteriorates when the real channel differs markedly from that model. To overcome these limitations, this thesis performs channel equalization with the online sequential extreme learning machine. Simulation results show that the proposed method counteracts the nonlinear distortion introduced by the HPA and the multipath fading channel, and achieves a lower bit error rate than traditional channel equalization methods.
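The online-equalization idea can be sketched on a deliberately simplified channel (our own toy model, not the thesis's OFDM/HPA setup): BPSK symbols pass through a two-tap multipath filter and a tanh saturation standing in for the HPA, and a linear tapped-delay equalizer is trained online, sample by sample, with recursive least squares as a stand-in for the online sequential learner:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy channel: 2-tap multipath, tanh saturation (HPA-like), additive noise
s = rng.choice([-1.0, 1.0], size=3000)                 # BPSK symbols
faded = s + 0.9 * np.concatenate(([0.0], s[:-1]))      # multipath ISI
r = np.tanh(0.8 * faded) + 0.05 * rng.standard_normal(len(s))

# Tapped-delay linear equalizer trained online with RLS
order = 6
w = np.zeros(order)
P = np.eye(order) * 100.0
decisions = np.zeros(len(s))
for n in range(order - 1, len(s)):
    x = r[n - order + 1:n + 1][::-1]                   # most recent sample first
    y = x @ w
    decisions[n] = 1.0 if y >= 0 else -1.0             # hard decision before update
    k = P @ x / (1.0 + x @ P @ x)
    w += k * (s[n] - y)
    P -= np.outer(k, x @ P)

# Bit error rate on the tail, after the equalizer has converged
ber = np.mean(decisions[1000:] != s[1000:])
```

Even this linear stand-in recovers the symbols far more reliably than slicing the received samples directly; the thesis's OS-ELM equalizer plays the same role with a nonlinear hidden layer and no offline channel model.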
First Author Affiliation
School of Information Science and Engineering
Cheng Li. Study on Online Sequential Extreme Learning Machine and Its Application [D]. Lanzhou: Lanzhou University, 2021.