
K Fold Cross-Validation | Machine Learning


K-fold cross-validation is a resampling technique used to evaluate a machine learning model. The parameter K refers to the number of subsets (folds) that the given dataset is split into. In each round, K-1 folds are used to train the model and the remaining held-out fold is used as the validation set.
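As a rough illustration, the sketch below uses scikit-learn's KFold together with the Iris dataset and a logistic-regression model (these specific choices are examples for this sketch, not part of the original article) to train on K-1 folds and evaluate on the held-out fold:

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold
from sklearn.metrics import accuracy_score

# Example dataset and model, chosen only for illustration
X, y = load_iris(return_X_y=True)

kf = KFold(n_splits=5, shuffle=True, random_state=42)

scores = []
for train_idx, val_idx in kf.split(X):
    # Train on the K-1 training folds, evaluate on the held-out fold
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])
    preds = model.predict(X[val_idx])
    scores.append(accuracy_score(y[val_idx], preds))

print("Per-fold accuracy:", scores)
print("Mean accuracy:", sum(scores) / len(scores))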


To perform K-fold cross-validation, we split the dataset into three sets, Training, Testing, and Validation, which becomes challenging when the volume of data is limited.


1) Training - The machine learning model is trained on this data and then evaluated on the validation set or test set.

2) Testing - The trained model is tested on this data to calculate the prediction error.

3) Validation - The overall prediction error is obtained by averaging the prediction errors from every fold, as shown in the sketch after this list.
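A minimal sketch of this averaging step, assuming scikit-learn's cross_val_score with a ridge-regression model on a synthetic dataset (all of these are illustrative choices, not prescribed by the article):

from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic regression data, for illustration only
X, y = make_regression(n_samples=200, n_features=10, noise=0.5, random_state=0)

# cross_val_score runs K-fold internally: each fold's score is the
# negative mean squared error on that fold's held-out data
fold_scores = cross_val_score(Ridge(), X, y, cv=5, scoring="neg_mean_squared_error")

fold_errors = -fold_scores            # convert back to positive MSE per fold
overall_error = fold_errors.mean()    # overall prediction error = average over folds

print("Per-fold MSE:", fold_errors)
print("Average prediction error:", overall_error)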


Cross Validation in Machine Learning: https://www.geeksforgeeks.org/cross-validation-machine-learning/