There are several ways to implement cost-sensitive learning. One common approach is to modify the loss function to incorporate the costs of different error types: in a binary classification problem, for example, you might assign a higher penalty to false negatives than to false positives. Another approach is to re-sample the training data so that it reflects the cost distribution, e.g. by oversampling the cases that are most costly to misclassify.
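Both approaches can be sketched with scikit-learn. This is a minimal illustration, not a prescribed recipe: the 5:1 cost ratio, the synthetic dataset, and the variable names are all assumptions chosen for the example. The first model scales the loss contribution of each class via `class_weight`; the second duplicates the costly class in the training data instead.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical imbalanced binary problem; class 1 is the costly-to-miss class.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

# Approach 1: cost-aware loss. Errors on class 1 contribute 5x to the loss,
# which penalizes false negatives more heavily than false positives.
clf_weighted = LogisticRegression(class_weight={0: 1, 1: 5}, max_iter=1000)
clf_weighted.fit(X, y)

# Approach 2: re-sampling. Duplicate each costly (class 1) example 5x total,
# so the cost ratio is reflected in the data rather than the loss.
pos = np.flatnonzero(y == 1)
X_over = np.vstack([X, np.repeat(X[pos], 4, axis=0)])
y_over = np.concatenate([y, np.repeat(y[pos], 4)])
clf_over = LogisticRegression(max_iter=1000).fit(X_over, y_over)

# Unweighted baseline for comparison.
baseline = LogisticRegression(max_iter=1000).fit(X, y)

def recall(model):
    # Fraction of true positives the model catches on the training data.
    return (model.predict(X)[y == 1] == 1).mean()
```

Under either scheme the decision boundary shifts toward the costly class, so both models should recover at least as many positives as the unweighted baseline, at the price of more false positives.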