The performance of satellite laser ranging (SLR) station operations relies to a large extent on the quality of the required satellite orbit predictions. Poor predictions with large along-track offsets, so-called time biases, increase the target acquisition time and thus reduce the performance of stations and the International Laser Ranging Service (ILRS) network as a whole. There is currently no established process to evaluate or monitor the quality of predictions. This paper presents a method for such a process that uses normal point data uploaded to data centers by ILRS stations worldwide. The first analysis results show systematic trends over time for most targets and prediction providers. These trends were used to predict the development of time bias values. We also present a service that provides these predicted values in real time for the latest satellite orbit predictions of selected targets and providers. Using these values during tracking allows for faster target acquisition and thus better tracking performance at ILRS SLR stations. Through monitoring, the service further enables stations to select the best available predictions during tracking and to notify prediction providers if issues are encountered. This tool benefits not only the stations, by improving their tracking performance, but also the prediction providers, by enabling improved predictions and greater support of missions.
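To illustrate the idea of extrapolating a time bias from observed trends, the following is a minimal sketch in Python. The abstract does not specify the estimation procedure, so the sample values, variable names, and the assumption of a purely linear along-track drift are illustrative only and not taken from the paper.

```python
# Minimal sketch: extrapolating a time bias from a linear trend fitted to
# past estimates. Sample values and the linear-drift assumption are
# illustrative and not taken from the paper.
import numpy as np

# Hypothetical time bias estimates (ms), e.g. derived from normal point
# residuals, indexed by epoch in days since the prediction file was issued.
epochs_days = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
time_bias_ms = np.array([0.1, 0.7, 1.4, 2.0, 2.6])

# Fit a first-order polynomial (linear drift) to the observed biases.
drift_ms_per_day, offset_ms = np.polyfit(epochs_days, time_bias_ms, deg=1)

# Extrapolate the expected time bias at the next tracking epoch, which a
# station could apply along track to speed up target acquisition.
next_epoch_days = 3.0
predicted_bias_ms = drift_ms_per_day * next_epoch_days + offset_ms
print(f"Predicted time bias at day {next_epoch_days}: {predicted_bias_ms:.2f} ms")
```

A real-time service as described in the abstract would presumably update such trend estimates continuously as new normal point data arrive and publish the extrapolated values per target and prediction provider.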