Disaster recovery efforts are enhanced through the collection and dissemination of satellite data, which is downloaded from satellites to ground stations. The optimal ground station locations vary with the location of the disaster, but ground stations must be built before any disaster is realized. Thus, a stochastic optimization problem arises: select ground station locations before disasters occur at uncertain locations. We use a stochastic programming approach to select ground station locations given a set of potential disaster scenarios, with the objective of maximizing the expected amount of data downloaded. The formulation is a two-stage stochastic program in which the first stage determines the locations of the ground stations and the second stage schedules the uploading and downloading of data. We solve the problem using the L-shaped method, which significantly outperforms solving the deterministic equivalent problem directly, and we find that an alternative second-stage formulation further improves solution time. The optimized set of ground stations found by our algorithm is compared to the set of ground stations operated by the National Oceanic and Atmospheric Administration (NOAA); the results confirm that the current placement is effective and demonstrate the benefit of adding additional ground stations.
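The two-stage structure described above can be sketched on a toy instance. All site names, contact volumes, scenario probabilities, and the number of stations below are hypothetical, and brute-force enumeration of the first-stage decisions stands in for the L-shaped method, which is only practical on an instance this small:

```python
from itertools import combinations

# Toy two-stage instance (all numbers hypothetical).
# First stage: choose k ground station sites before the disaster is known.
# Second stage: given the realized scenario, download data, capped by the
# amount of data the disaster actually generates.

# contact[site][s] = GB downloadable at `site` under scenario s
contact = {
    "fairbanks": [8, 2, 5],
    "wallops":   [3, 7, 4],
    "svalbard":  [6, 6, 6],
    "hawaii":    [2, 8, 3],
}
data_generated = [10, 12, 9]   # GB produced by the disaster in each scenario
probs = [0.5, 0.3, 0.2]        # scenario probabilities
k = 2                          # number of stations to build

def second_stage(sites, s):
    # Recourse value: cannot download more than the scenario generates.
    return min(sum(contact[i][s] for i in sites), data_generated[s])

def expected_download(sites):
    # First-stage objective: expectation of the recourse over scenarios.
    return sum(p * second_stage(sites, s) for s, p in enumerate(probs))

# Enumerate first-stage decisions; the L-shaped method would instead
# iteratively add outer-approximation cuts on the recourse function.
best_sites = max(combinations(contact, k), key=expected_download)
print(best_sites, expected_download(best_sites))
```

On this instance the chosen pair maximizes the probability-weighted downloaded volume; in the paper's setting the second stage is itself an optimization (a scheduling problem), which is what makes decomposition worthwhile.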
There is significant value in the data collected by satellites during and after a natural disaster. The current operating paradigm in practice is for satellites to passively collect data when they happen to fly over a disaster location. This article considers the alternative approach of actively maneuvering satellites to fly directly over the disaster site on a routine basis. Toward this end, we seek a satellite constellation design that minimizes the expected maneuver cost of monitoring a forest fire at an unknown location. We present a two-stage stochastic programming model for this problem, together with an accelerated L-shaped decomposition approach. A comparison between our approach and the current operating paradigm indicates that our solution provides longer-duration and more frequent data collections. Analysis also shows that the proposed solution is robust across a wide array of scenarios.