Accurately evaluating the working conditions of personnel in the college revolution and reformation system is currently a hot issue in this research field. Based on a neural network architecture, this paper constructs an evaluation model for the system and derives quality evaluation indicators through simulation experiments. Following the quality evaluation index framework of the revolution and reformation system, the experimental platform covers 7 universities in the region: 6 serve as samples and 1 (S University) is the research target, and the score of each index is computed with MATLAB. The simulation starts from the characteristics of the actual neural network model; selects 17 evaluation indicators, comprising 3 basic indicators and 14 technical indicators, based on historical data; and applies a factor-recursive method to improve the neural network and establish the evaluation model. The collected data are then used to train a BP neural network with a 21 × 11 × 1 structure, continuously adjusting the network's weights and thresholds until the standard error requirement is met; finally, the scientific validity of the evaluation model is verified by comparing the actual output values with the expected output values. The experimental results show that the model reduces the input-data redundancy rate to 0.136 and the network training delay to 413 ms, improving the computational performance of the network model of the revolution and reformation system in colleges and universities. Dimensionality reduction of the neural network's input data thus effectively supports the reformation and revolution system in universities.
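The training procedure described above (a 21 × 11 × 1 BP network whose weights and thresholds are adjusted until a standard error requirement is met) can be sketched as follows. This is a minimal illustration only: the layer sizes and the stopping-on-error rule come from the abstract, while the synthetic indicator data, learning rate, and error threshold are assumptions, since the paper's actual university data and MATLAB configuration are not given here.

```python
import numpy as np

# Sketch of a 21 x 11 x 1 BP (backpropagation) network: 21 input
# indicators, one hidden layer of 11 sigmoid units, one output score.
# The training data below is synthetic (hypothetical), standing in for
# the paper's 6 sample universities.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = rng.random((6, 21))   # 6 sample universities x 21 indicator values
y = rng.random((6, 1))    # expected (target) evaluation scores

W1 = rng.normal(0.0, 0.5, (21, 11)); b1 = np.zeros(11)  # input -> hidden
W2 = rng.normal(0.0, 0.5, (11, 1));  b2 = np.zeros(1)   # hidden -> output

lr, target_mse = 0.5, 1e-3  # assumed learning rate and error requirement
for epoch in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    mse = float(np.mean(err ** 2))
    if mse < target_mse:        # stop once the error requirement is met
        break
    # backward pass: gradients of the MSE through the sigmoid layers
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    # adjust weights and thresholds (biases)
    W2 -= lr * (h.T @ d_out) / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / len(X);  b1 -= lr * d_h.mean(axis=0)

print(f"stopped at epoch {epoch} with MSE {mse:.5f}")
```

In the paper's setting, the trained network would then be applied to the target institution's indicator vector to produce its evaluation score, with the comparison of actual versus expected outputs serving as the verification step.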
               