I am the main developer of MGET.
The first step in your problem is to obtain values of the covariates that you will use to fit the model to your 90 GPS points. It sounds like you want to use the 8 bands as your covariates. You need to add 8 fields to your shapefile (one for each band) and populate them using a tool such as Extract Multi Values to Points, available in recent versions of ArcGIS, or Interpolate Raster Values at Points from MGET (equivalent to the Arc tool, but developed before it existed).
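To illustrate what those tools do under the hood, here is a minimal sketch of sampling an 8-band raster at point locations. The data, coordinates, and `sample_bands` helper are all hypothetical stand-ins; in practice the ArcGIS or MGET tool reads your imagery and writes the values into the shapefile's fields for you.

```python
import numpy as np

# Hypothetical stand-in data: an 8-band raster as a (bands, rows, cols) array.
# In practice the bands would come from your imagery file.
rng = np.random.default_rng(0)
raster = rng.random((8, 100, 100))

# Assumed affine georeferencing: top-left corner coordinates and cell size.
x_origin, y_origin, cell = 500000.0, 4200000.0, 30.0

def sample_bands(raster, points):
    """Return one row of band values per (x, y) point (nearest cell)."""
    values = []
    for x, y in points:
        col = int((x - x_origin) / cell)
        row = int((y_origin - y) / cell)   # y decreases as row index increases
        values.append(raster[:, row, col])
    return np.array(values)

gps_points = [(500045.0, 4199985.0), (501500.0, 4198500.0)]
covariates = sample_bands(raster, gps_points)
print(covariates.shape)  # one row per point, one column per band
```

Each row of `covariates` corresponds to the 8 new fields you would add for one GPS point.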
Next, fit a classification model to the GPS points, using the field containing the known cover type as the response variable and the 8 band fields as the covariates (a.k.a. predictor variables). You can then obtain performance statistics for your model and predict it on rasters representing the covariates.
You can see a basic overview of MGET's modeling workflow for this here. The example is somewhat dated--not all of the tool parameters will look exactly like what you see there--but the basic workflow is the same: fit the model to a table of data, predict it against the table to get some performance statistics, and predict it on a stack of rasters. In MGET, the procedure is the same regardless of which modeling framework you use--MGET currently provides GLM, GAM, trees (a.k.a. CARTs), and random forest--so you can try different kinds of models with very similar workflows.
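The three-step workflow above can be sketched as follows. This is only an illustration: a trivial nearest-centroid classifier stands in for the GLM, GAM, tree, or random forest models that MGET actually fits through R, and the table and raster stack are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training table: 90 points, 8 band covariates, 3 cover classes.
X = rng.random((90, 8))
y = rng.integers(0, 3, size=90)

# Step 1: fit the model. A nearest-centroid classifier is used here purely
# as a stand-in for the modeling frameworks MGET provides.
centroids = np.array([X[y == c].mean(axis=0) for c in range(3)])

def predict(table):
    # Assign each row to the class with the nearest centroid.
    d = ((table[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Step 2: predict against the training table for a performance statistic.
accuracy = (predict(X) == y).mean()

# Step 3: predict on a stack of covariate rasters by flattening the bands
# into a table, predicting, and reshaping back to the raster grid.
stack = rng.random((8, 50, 50))                # (bands, rows, cols)
table = stack.reshape(8, -1).T                 # (cells, bands)
classified = predict(table).reshape(50, 50)    # predicted class raster
print(accuracy, classified.shape)
```

Whichever modeling framework you choose, the shape of the workflow stays the same: only the fit and predict steps change.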
I'm sorry I don't have detailed instructions about this workflow written up. So far, we have not had funding to develop a complete manual. All MGET tools have documentation within ArcGIS, so be sure you click the Show Help >> button on the tool dialogs if you have not done so already.
Regarding Jeffrey Evans' speculation that MGET does not utilize the R raster package: that is correct. The code in MGET that performs raster predictions was developed before the raster package was released to CRAN (R's distribution system for R packages), so it does not rely on that package. But it is not correct that MGET will crash due to memory limitations. MGET's raster prediction code was written specifically to handle the situation you're facing, by performing predictions in blocks, similar to how the raster package does it. Before the raster package was developed, MGET was one of the only readily available tools that could handle prediction of large rasters. MGET users have done this, for example, using large bathymetry rasters with 5 m resolution.
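The block-wise idea is simple: never materialize the whole raster as one prediction table, only one strip of rows at a time. This sketch is not MGET's actual implementation, just an illustration of the technique with hypothetical data and a toy predictor.

```python
import numpy as np

def predict_in_blocks(stack, predict, block_rows=256):
    """Predict a model over a (bands, rows, cols) stack one strip of rows
    at a time, so only a small slice is ever held in memory as a table."""
    bands, rows, cols = stack.shape
    out = np.empty((rows, cols), dtype=np.int64)
    for r0 in range(0, rows, block_rows):
        r1 = min(r0 + block_rows, rows)
        block = stack[:, r0:r1, :].reshape(bands, -1).T   # (cells, bands)
        out[r0:r1, :] = predict(block).reshape(r1 - r0, cols)
    return out

# Toy predictor: threshold on band 0 (stands in for a fitted model).
stack = np.random.default_rng(2).random((8, 1000, 500))
result = predict_in_blocks(stack, lambda t: (t[:, 0] > 0.5).astype(np.int64))
print(result.shape)  # (1000, 500)
```

With real imagery each strip would also be read from disk on demand, so peak memory stays bounded by the block size rather than the raster size.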
All of that said, if you believe you will be performing a lot of modeling as your career progresses, I encourage you to learn how to do it in R directly, and about modeling and statistics more generally, independent of software. In a sense, MGET's modeling tools are a "gateway drug" to R. MGET's tools are just as robust as R--they utilize R to perform the actual model fitting and prediction--but they expose only a limited subset of what is possible in R itself. As you continue to do more modeling projects, eventually you may face a situation in which MGET is not enough and you need the full flexibility of R.
Best Answer
It will be very difficult to perform an automatic land cover classification based on the 44-class Corine Land Cover nomenclature, but as you say it could be a starting point.
You can use the GRASS GIS plugin for QGIS (check this), namely for the spectral classification. Don't forget to integrate the NDVI.
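For reference, NDVI is computed per pixel as (NIR - Red) / (NIR + Red). A minimal sketch, with hypothetical band arrays and a guard against division by zero:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), with zero-sum pixels set to 0."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)   # avoid dividing by zero
    return np.where(denom == 0, 0.0, (nir - red) / safe)

# Hypothetical reflectance values for a 2x2 patch.
nir = np.array([[0.6, 0.8], [0.5, 0.0]])
red = np.array([[0.2, 0.1], [0.5, 0.0]])
print(ndvi(nir, red))
```

The resulting NDVI layer can simply be stacked alongside the spectral bands as an extra input to the classification.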
Also, you can try to segment the image first, so that it is divided into polygons of at least 25 ha (the minimum mapping unit, MMU, for CLC).
Then integrate the spectral classification into the polygons, i.e. classify each polygon according to the majority class of its pixels.
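That majority-vote step can be sketched as follows, assuming you already have a per-pixel class raster and a segment-label raster of the same shape (both hypothetical toy arrays here):

```python
import numpy as np

# Hypothetical inputs: a per-pixel class raster from spectral classification
# and a segment-label raster from the segmentation step.
classes = np.array([[1, 1, 2],
                    [1, 2, 2],
                    [3, 3, 2]])
segments = np.array([[0, 0, 1],
                     [0, 1, 1],
                     [0, 0, 1]])

def majority_per_segment(classes, segments):
    """Assign each polygon (segment) the class of the majority of its pixels."""
    out = np.empty_like(classes)
    for seg in np.unique(segments):
        mask = segments == seg
        counts = np.bincount(classes[mask])
        out[mask] = counts.argmax()
    return out

print(majority_per_segment(classes, segments))
```

In a GIS this is usually done with a zonal-statistics (majority) operation over the segmentation polygons.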
Regards, Vasco Nunes