Agricultural Prediction Dashboard: Commodity Loss Gradient Boosted Regression - Palouse Wheat Claims due to Drought


Erich Seamon
PhD Student | University of Idaho
University page | server | NOAA grant team - CIRC | github

This is a beta data-mining application for gradient boosting, developed in Shiny.

Gradient boosting is a machine learning technique for regression and classification problems, which produces a prediction model in the form of an ensemble of weak prediction models, typically decision trees. It builds the model in a stage-wise fashion like other boosting methods do, and it generalizes them by allowing optimization of an arbitrary differentiable loss function.
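To make the idea concrete, here is a small hypothetical sketch (not the app's actual model or data) of gradient boosted regression using scikit-learn, with the kinds of meta-parameters a dashboard like this one typically exposes. The synthetic features and target are stand-ins for drought-related predictors and commodity loss values.

```python
# Hypothetical illustration only: gradient boosted regression on synthetic
# data standing in for drought predictors and wheat-claim losses.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.uniform(size=(200, 3))  # e.g. temperature, precipitation, soil moisture
y = 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=0.5, size=200)  # synthetic losses

# Typical GBM meta-parameters a user might tune:
model = GradientBoostingRegressor(
    n_estimators=200,    # number of boosting stages (weak trees)
    learning_rate=0.05,  # shrinkage applied to each tree's contribution
    max_depth=3,         # depth of each weak learner
    subsample=0.8,       # stochastic gradient boosting: sample rows per stage
    random_state=0,
)
model.fit(X, y)
preds = model.predict(X)
```

The same meta-parameters (number of trees, shrinkage, tree depth, subsampling rate) are the usual levers for trading bias against variance in a boosted ensemble.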

The idea of gradient boosting originated in the observation by Leo Breiman that boosting can be interpreted as an optimization algorithm on a suitable cost function. Explicit regression gradient boosting algorithms were subsequently developed by Jerome H. Friedman, simultaneously with the more general functional gradient boosting perspective of Llew Mason, Jonathan Baxter, Peter Bartlett, and Marcus Frean. The latter two papers introduced the abstract view of boosting algorithms as iterative functional gradient descent algorithms: algorithms that optimize a cost function over function space by iteratively choosing a function (a weak hypothesis) that points in the negative gradient direction. This functional gradient view of boosting has led to the development of boosting algorithms in many areas of machine learning and statistics beyond regression and classification.
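The functional gradient descent view can be sketched in a few lines. For squared error loss, the negative gradient at each stage is simply the residual, so each boosting stage fits a small tree to the current residuals and takes a shrunken step in that direction. This is an illustrative sketch on synthetic data, not the application's implementation.

```python
# Minimal sketch of boosting as functional gradient descent for squared
# error: each stage fits a weak learner to the negative gradient of the
# loss (the residuals y - F(x)) and takes a small step.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(1)
X = rng.uniform(size=(300, 1))
y = np.sin(4 * X[:, 0]) + rng.normal(scale=0.1, size=300)

learning_rate = 0.1
F = np.full_like(y, y.mean())  # F_0: constant initial model
trees = []
for _ in range(100):
    residuals = y - F  # negative gradient of (1/2) * (y - F)^2
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    F += learning_rate * tree.predict(X)  # step in the negative-gradient direction
    trees.append(tree)

mse = np.mean((y - F) ** 2)
```

Swapping in a different differentiable loss only changes the residual computation, which is exactly the generality the functional gradient view provides.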

This project is supported by NOAA through the Climate Impacts Research Consortium (CIRC).

Thanks also to Matthew Leonawicz and the SNAP team in Alaska for the code and structure behind these applications.