
Improving robustness of complex networks by a new capacity allocation strategy

The robustness of infrastructure networks has attracted great attention in recent years. Scholars have studied the robustness of complex networks against cascading failures from different perspectives. In this paper, a new capacity allocation strategy is proposed to reduce cascading failures and improve network robustness without changing the network structure. Compared with the typical strategy proposed in the Motter–Lai (ML) model, the new strategy reduces the scale of cascading failures, and it is especially effective when applied to scale-free networks. In addition, to evaluate the two strategies fairly, we introduce the contribution rate of unit capacity to network robustness as an evaluation index. The results show that the new strategy performs well and makes more rational use of capacity in scale-free networks. Furthermore, we were surprised to find that the efficiency of capacity utilization declines once capacity costs rise above a certain threshold, which indicates that it is not wise to restrain cascading failures by increasing capacity costs indefinitely.
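For context on the ML baseline mentioned in the abstract, the sketch below is a minimal illustration in Python (networkx), not the paper's method: the abstract does not specify the new allocation strategy, so only the classic ML rule C_i = (1 + alpha) * L_i (capacity proportional to a node's initial load, taken here as betweenness centrality) and a simple cascading-failure simulation on a scale-free network are shown, together with a robustness-per-cost ratio in the spirit of the "contribution rate of unit capacity" index. All function names and parameters are illustrative assumptions.

```python
# Hedged sketch of the Motter-Lai (ML) capacity baseline and a cascading-failure
# simulation. This is NOT the paper's new strategy, which the abstract does not
# specify; names and parameters here are illustrative.
import networkx as nx

def initial_loads(G):
    # ML model convention: a node's initial load is its betweenness centrality.
    return nx.betweenness_centrality(G)

def ml_capacities(loads, alpha=0.2):
    # Baseline ML allocation: capacity proportional to initial load, C_i = (1 + alpha) * L_i.
    return {v: (1 + alpha) * load for v, load in loads.items()}

def cascade(G, capacities, removed_node):
    # Remove one node, then iteratively remove nodes whose recomputed load exceeds
    # capacity; return the surviving fraction of nodes as a simple robustness proxy.
    H = G.copy()
    H.remove_node(removed_node)
    changed = True
    while changed and H.number_of_nodes() > 0:
        changed = False
        loads = nx.betweenness_centrality(H)
        overloaded = [v for v, load in loads.items() if load > capacities[v]]
        if overloaded:
            H.remove_nodes_from(overloaded)
            changed = True
    return H.number_of_nodes() / G.number_of_nodes()

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(200, 2, seed=1)   # scale-free test network
    loads = initial_loads(G)
    caps = ml_capacities(loads, alpha=0.2)
    trigger = max(loads, key=loads.get)            # attack the highest-load node
    surviving = cascade(G, caps, trigger)
    total_cost = sum(caps.values())                # total capacity cost
    # Robustness per unit of capacity cost, in the spirit of the paper's
    # "contribution rate of unit capacity" evaluation index.
    print(f"surviving fraction: {surviving:.3f}, cost: {total_cost:.3f}, "
          f"robustness per unit cost: {surviving / total_cost:.3f}")
```

Comparing this ratio for the ML rule against an alternative allocation of the same total capacity is one way to reproduce the kind of per-cost comparison the abstract describes.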

Keywords: capacity allocation; complex networks; robustness; cascading failures; strategy

Journal Title: Chinese Physics B
Year Published: 2020
