CUSF Landing Predictor: http://habhub.org/predict
We grab GFS (Global Forecast System) files from the NOAA NOMADS servers via DAP (Data Access Protocol). A Python script runs to grab the required data given an input latitude and longitude, plus the size of the area around that point for which to fetch data (known as latdelta and londelta).
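The latdelta/londelta window has to be snapped to the GFS grid so that every requested point is covered. A minimal sketch of that step, assuming a simple snap-outwards approach (the function name and snapping logic are illustrative, not the actual predict.py code):

```python
import math

def gfs_window(lat, lon, latdelta, londelta, resolution=1.0):
    """Return (lat_min, lat_max, lon_min, lon_max) snapped outwards to
    the GFS grid spacing, so the whole requested area falls inside the
    downloaded tile. Illustrative sketch, not the real predict.py."""
    lat_min = math.floor((lat - latdelta) / resolution) * resolution
    lat_max = math.ceil((lat + latdelta) / resolution) * resolution
    lon_min = math.floor((lon - londelta) / resolution) * resolution
    lon_max = math.ceil((lon + londelta) / resolution) * resolution
    return (lat_min, lat_max, lon_min, lon_max)

# A Cambridge-ish launch point with a 3-degree delta in each direction:
print(gfs_window(52.2, 0.1, 3, 3))  # (49.0, 56.0, -3.0, 4.0)
```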
The data is available either as 1.0×1.0 degree tiles with 26 pressure levels through the atmosphere, or in “HD”, which is 0.5×0.5 degree tiles with 47 vertical pressure levels. In general we find that the extra time and bandwidth required for HD predictions are not worth it: the additional accuracy is below the noise floor created by all the other uncertainties anyway.
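To put a rough number on that trade-off, a back-of-the-envelope comparison using only the figures quoted above (grid spacing and pressure-level counts; file sizes will not scale exactly linearly, so treat this as an estimate):

```python
# Data points per square degree for each GFS product:
std_points = (1 / 1.0) ** 2 * 26  # 1.0x1.0 grid, 26 pressure levels
hd_points = (1 / 0.5) ** 2 * 47   # 0.5x0.5 grid, 47 pressure levels

ratio = hd_points / std_points
print(ratio)  # roughly 7.2
```

So an HD run pulls roughly seven times the data for accuracy gains that, as noted, disappear into the noise.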
The predictor itself (written in C) parses this data to form a flight path, which is then sent to the client using AJAX.
Rich Wareham (rjw57) wrote the current refactored version of the landing predictor binary that is used on the website. It is available on its own on his github (https://github.com/rjw57/cusf-landing-prediction), but it is also bundled with the package that includes the new user interface (https://github.com/jonsowman/cusf-standalone-predictor).
There are several Python dependencies that need to be met before the predictor will run; the important packages are pydap, numpy, json and simplejson.
To build this under Linux or OS X, clone the repo and 'cd' into it, then run “cmake . && make”. This should produce a binary called 'pred' in the pred_src/ directory; this file must be executable by the user under which the predictor will run.
You will also need to ensure the following files are executable by the user under which the predictor runs:
Additionally, you will need to ensure the following directories have rwx access for the PHP interpreter. For security reasons, it is wise to ensure that actual execution of PHP code inside these directories is disabled:
Lastly, you will need to set up the above two cronjobs to clear old predictions and old GFS data. For a server getting a lot of hits, we recommend running each one daily.
There are a set of configuration options in 'predict/includes/config.inc.php'. You will need to include your Google Maps API key here, and define some paths. The config options are well commented in the file.
All communication between the user interface (client) and the server is done via AJAX requests; we never force the user to load a new page. This minimises data transfer and gives the application more of a desktop-like feel.
We make the distinction here between scenarios and predictions: a scenario is a set of parameters for a launch, including the site, time, ascent and descent rates, and predicted burst altitude. A prediction is a generated flight path for a given scenario. The important concept is that scenarios can (and will) have multiple predictions.
Let's suppose a user goes to the website and fills out the launch card form. When they hit “Run Prediction”, the first thing that happens is that the data is packaged up into an AJAX request and sent to the server. The server sanity-checks the data (e.g. that we're not trying to run a prediction in the past or more than 7 days into the future, and that the latitude and longitude are in range). If the sanity checks pass, the server creates a scenario in the 'predict/preds/' directory. The scenario name is an SHA1 hash of the launch parameters, which we call a UUID (universally unique identifier) – the reasoning for this will become clear. The server then returns the UUID to the client and creates 'scenario.ini' in the scenario directory, before starting the predictor.
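The scenario-naming step might look something like this. It is a sketch only: the exact parameter names and the serialisation the real server hashes are assumptions.

```python
import hashlib

def scenario_uuid(params):
    """Derive a deterministic scenario name from the launch parameters.
    Hashing a sorted serialisation means identical launch cards always
    map to the same directory, so repeat runs share one scenario.
    (Parameter names and serialisation here are assumptions.)"""
    canonical = ";".join("%s=%s" % (k, params[k]) for k in sorted(params))
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()

card = {"lat": 52.2135, "lon": 0.0964, "time": "2012-06-01 10:00",
        "ascent": 5.0, "descent": 5.0, "burst": 30000}
print(scenario_uuid(card))  # a 40-character hex digest
```

The key property is determinism: the same launch card always hashes to the same UUID, which is why multiple predictions can accumulate under one scenario.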
'predict.py' is called with the relevant launch parameters and the UUID for the scenario. This script downloads and parses the GFS data required for the prediction to run. It also updates the progress.json file during its work, with information on which GFS model was used, how far through the data-grabbing process it has got, whether any errors have occurred, and the estimated time remaining.
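Those progress.json updates might be written along these lines. The field names here are illustrative assumptions, not the real predict.py schema:

```python
import json
import time

def update_progress(path, **fields):
    """Merge new status fields into progress.json so the polling client
    always sees a complete snapshot rather than a partial one.
    (Field names are illustrative assumptions.)"""
    try:
        with open(path) as f:
            progress = json.load(f)
    except (IOError, OSError, ValueError):
        progress = {}
    progress.update(fields)
    progress["updated"] = int(time.time())
    with open(path, "w") as f:
        json.dump(progress, f)

update_progress("progress.json", gfs_model="GFS 2012060106",
                gfs_percent=40, error=None)
```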
When predict.py has finished and all the required GFS data exists, the script calls the actual predictor binary (pred_src/pred) with the UUID. The binary reads in the scenario from the 'scenario.ini' file the server created earlier, and outputs the predicted flight path to the scenario directory as 'flight_path.csv'. When it's done, predict.py updates progress.json to state that the prediction is complete and the server's work is finished.
During the prediction, the client knows the UUID of the prediction it's waiting for. As such, it polls for progress.json to keep an eye on what's going on. A progress bar is displayed which is updated from progress.json as required, and any error messages from the predictor are passed to the client in this way.
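The client is JavaScript, but the polling logic amounts to something like the following Python sketch (field names and the fetch callback are assumptions standing in for the client's AJAX GET of progress.json):

```python
import time

def poll_progress(fetch, interval=1.0, timeout=120):
    """Poll a scenario's progress.json until it reports completion or
    an error. `fetch` stands in for the client's AJAX GET; the
    'pred_complete' and 'error' field names are assumptions."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        progress = fetch()
        if progress.get("error"):
            raise RuntimeError(progress["error"])
        if progress.get("pred_complete"):
            return progress
        time.sleep(interval)
    raise RuntimeError("timed out waiting for prediction")

# Stub fetch that "completes" on the third poll:
states = iter([{"gfs_percent": 30}, {"gfs_percent": 100},
               {"pred_complete": True}])
print(poll_progress(lambda: next(states), interval=0.01))
```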
When progress.json indicates the prediction is complete, an AJAX request is made to the server to request the flight path. The server parses the CSV flight path into a JSON object and returns it to the client, which then processes each point to produce a trace on the map canvas. It also plots launch, burst and landing markers. The client then displays some additional information in the Scenario Information UI window, including the range of flight, the time at which the last prediction for this scenario was completed, and which model was used. The flight time is also calculated and displayed here.
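Server-side, the CSV-to-JSON step is done in PHP, but it amounts to something like this Python sketch. The column order (timestamp, latitude, longitude, altitude) is an assumption about flight_path.csv, not a documented format:

```python
import csv
import json

def flight_path_to_json(csv_path):
    """Parse the predictor's CSV output into a JSON array of points.
    The (timestamp, lat, lon, alt) column order is an assumption."""
    points = []
    with open(csv_path) as f:
        for row in csv.reader(f):
            if len(row) < 4:
                continue  # skip blank or malformed lines
            ts, lat, lon, alt = row[:4]
            points.append({"time": int(ts), "lat": float(lat),
                           "lon": float(lon), "alt": float(alt)})
    return json.dumps(points)

# Demonstrate with a single made-up flight-path row:
with open("flight_path.csv", "w") as f:
    f.write("1338544800,52.2135,0.0964,500.0\n")
print(flight_path_to_json("flight_path.csv"))
```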
There is also an option for the user to download the CSV output from the predictor, or to generate a KML file for Google Earth, which is put together on the fly by a PHP script when the button is pressed.
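The on-the-fly KML generation is a PHP script in the real package, but the document it builds is essentially the following (a minimal sketch; note that KML coordinates go lon,lat,alt, the reverse of the usual lat/lon order):

```python
def flight_path_to_kml(points, name="Predicted flight path"):
    """Build a minimal KML LineString from (lat, lon, alt) tuples.
    The real generator is a PHP script; this only illustrates the
    output format."""
    coords = " ".join("%f,%f,%f" % (lon, lat, alt)  # KML: lon,lat,alt
                      for lat, lon, alt in points)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            '<Placemark><name>%s</name><LineString>'
            '<altitudeMode>absolute</altitudeMode>'
            '<coordinates>%s</coordinates>'
            '</LineString></Placemark></Document></kml>' % (name, coords))

kml = flight_path_to_kml([(52.2135, 0.0964, 500.0),
                          (52.2300, 0.1200, 1500.0)])
print(kml[:80])
```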