14 posts with this tag
With a full coronavirus shutdown in Germany, Christmas this year in Berlin will be very quiet. I have been keeping myself entertained for the last couple of weeks by watching old seasons of The Great British Bake Off, so between that and the mountain …
In this post, we continue our discussion of how to use AWS SageMaker's BlazingText to train a word2vec model. In the last post we learned how to set up, train and evaluate a single model. However, we essentially selected our hyperparameters at …
AWS SageMaker has a number of built-in algorithms, which are not only easier to use within the SageMaker setup but are also optimised for AWS architecture. At my previous job, we used word embeddings extensively to help solve NLP problems. We …
One of the biggest issues when building an effective machine learning algorithm is overfitting. Overfitting occurs when a model built on your training data not only picks up the true relationship between the outcome and the predictors, but …