Stability Analysis of Reputation and Trust Computation under Fluctuating and Manipulated Feedback
Abstract
Reputation and trust mechanisms are widely adopted in open service networks to regulate cooperation among autonomous participants. However, many existing schemes implicitly assume stable and honest feedback patterns, an assumption that rarely holds in realistic environments. This work investigates the long-term stability of trust computation under fluctuating interaction quality and coordinated manipulation. Interaction histories are modeled as evolving weighted graphs, and trust scores are derived using bounded temporal aggregation functions. Special attention is paid to the sensitivity of reputation values to sparse and bursty feedback. Simulation results indicate that overly responsive aggregation rules amplify noise and accelerate the onset of instability. By contrast, constrained update mechanisms produce smoother trust evolution and reduce vulnerability to short-term attacks. These findings highlight the importance of stability-oriented design in reputation systems.
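The contrast drawn in the abstract between responsive and constrained aggregation can be illustrated with a minimal sketch. The update rules below are hypothetical (the paper's actual aggregation functions are not given here): a plain exponential-style update versus a bounded variant whose per-interaction change is clipped, which damps noise and short-lived manipulation bursts.

```python
# Illustrative sketch only; these are NOT the paper's actual formulas.
# Trust scores and feedback values are assumed to lie in [0, 1].

def responsive_update(trust: float, feedback: float, alpha: float = 0.8) -> float:
    # Highly responsive rule: fresh feedback dominates the score.
    return (1 - alpha) * trust + alpha * feedback

def bounded_update(trust: float, feedback: float,
                   alpha: float = 0.8, max_step: float = 0.05) -> float:
    # Constrained rule: the same update, but the change per
    # interaction is clipped to at most max_step in either direction.
    raw = (1 - alpha) * trust + alpha * feedback
    step = max(-max_step, min(max_step, raw - trust))
    return trust + step

# A short burst of manipulated positive feedback about a low-trust peer:
burst = [1.0] * 5
t_resp = t_bound = 0.2
for f in burst:
    t_resp = responsive_update(t_resp, f)
    t_bound = bounded_update(t_bound, f)
# After the burst, the responsive score has jumped close to 1.0,
# while the bounded score has moved by at most 5 * max_step = 0.25.
```

The bounded rule trades responsiveness for stability: an attacker must sustain manipulation over many interactions to shift the score, which matches the abstract's observation that constrained updates reduce vulnerability to short-term attacks.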