Higher order ANN parameter optimization using hybrid opposition-elitism based metaheuristic

For two decades, nature-inspired optimization algorithms have dominated the solution of many complex problems, and most of them are integrated with other intelligent techniques to demonstrate their effectiveness. Among these algorithms, the last decade has seen substantial research on Teaching–Learning Based Optimization (TLBO) and its variants, with advances across many engineering domains. In this research, a variant of the TLBO technique is integrated with a Functional Link Artificial Neural Network (FLANN) to classify nonlinear data. With suitable parameter adjustments, the proposed model classifies the data efficiently. For learning, the Gradient Descent method is adopted to obtain the optimal weight units of the neural network. Simulation results show that the proposed hybrid approach outperforms other competitive methods on the considered performance measures.
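The classical TLBO procedure that the paper builds on alternates a teacher phase (each learner moves toward the best solution, away from the population mean) and a learner phase (pairwise interaction between randomly chosen learners). A minimal sketch, assuming basic TLBO without the opposition-based and elitism enhancements the title refers to; the function and parameter names are illustrative, not taken from the paper:

```python
import numpy as np

def tlbo(f, dim, bounds, pop=20, iters=100, seed=0):
    """Minimize f over a box using basic Teaching-Learning Based Optimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))          # population of learners
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        # --- teacher phase: move toward the best learner, away from the mean ---
        teacher = X[fit.argmin()]
        mean = X.mean(axis=0)
        TF = rng.integers(1, 3)                  # teaching factor, 1 or 2
        Xn = np.clip(X + rng.random((pop, dim)) * (teacher - TF * mean), lo, hi)
        fn = np.apply_along_axis(f, 1, Xn)
        better = fn < fit                        # greedy selection
        X[better], fit[better] = Xn[better], fn[better]
        # --- learner phase: pairwise interaction with a random peer ---
        for i in range(pop):
            j = rng.integers(pop)
            while j == i:
                j = rng.integers(pop)
            step = X[i] - X[j] if fit[i] < fit[j] else X[j] - X[i]
            xi = np.clip(X[i] + rng.random(dim) * step, lo, hi)
            fi = f(xi)
            if fi < fit[i]:
                X[i], fit[i] = xi, fi
    return X[fit.argmin()], fit.min()

# Sanity check on the sphere function, a standard benchmark
best, val = tlbo(lambda x: np.sum(x**2), dim=5, bounds=(-5.0, 5.0))
```

In a hybrid of the kind the abstract describes, such a metaheuristic would search over the FLANN's parameters while Gradient Descent refines the weight units; the exact division of labor is specific to the paper and not reproduced here.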

Keywords: higher-order ANN; parameter optimization; metaheuristic

Journal Title: Evolutionary Intelligence
Year Published: 2021
