There’s so much that Deep Learning models can do for you that will help you get your work done a LOT faster.
Do you still remember having to write summaries of articles for your homework?
Glad we don’t have to do that by hand anymore, because we’re going to learn how to automate it 😉
Special thanks to Hugging Face for providing all the models for us 😄
We’re going to use a pretrained BART model. I initially tested T5 models (small, base, and large) as well, but the pretrained base and large ones were unusable: they had training data leaks and produced unreadable summaries, so I dropped them.
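As a minimal sketch of what this looks like with Hugging Face’s `transformers` library: the `pipeline` API wraps a pretrained BART checkpoint for summarization. The checkpoint name `facebook/bart-large-cnn` is one public option I’m assuming here; the article doesn’t specify which checkpoint it used.

```python
# Hedged sketch: abstractive summarization with a pretrained BART checkpoint.
# "facebook/bart-large-cnn" is an assumed public checkpoint, not necessarily
# the one used in this article.
def summarize(text, model_name="facebook/bart-large-cnn"):
    # Import inside the function so the sketch only needs transformers
    # (and a model download) when actually called.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=model_name)
    result = summarizer(text, max_length=60, min_length=10, do_sample=False)
    return result[0]["summary_text"]

if __name__ == "__main__":
    article = (
        "Deep learning models pretrained on large corpora can be fine-tuned "
        "for tasks like summarization with relatively little labeled data. "
        "Libraries such as Hugging Face transformers make them easy to use."
    )
    print(summarize(article))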
While each method’s improvement was small (roughly 1% in accuracy), using them all together could make your model about 5% more accurate.
However, you should only use them after picking the right model, algorithm, batch size, and learning rate first because they affect your model’s accuracy a lot more.
I’m using fastai to create this model. The final accuracy for the model is ~95.74%.
Self-supervised learning: Training a model using labels that are naturally part of the input data, rather than requiring separate external labels.
Self-supervised learning is often used to pre-train models. By fine-tuning a pre-trained model, we can very quickly get state-of-the-art results with very little data.
Supervised learning requires human-annotated data, which is expensive, labor-intensive, and time-consuming. Transfer learning lowers the barrier to Machine Learning, making it accessible to everyone.
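To make the fine-tuning idea concrete, here’s a hedged sketch using fastai’s high-level vision API. The dataset (Oxford-IIIT Pets) and architecture (ResNet-34) are illustrative choices of mine, not from the article; the point is that starting from pretrained weights means a short `fine_tune` call is often enough.

```python
# Hedged sketch: transfer learning with fastai (assumed installed).
# Dataset and architecture are illustrative assumptions.
def fine_tune_pets(epochs=1):
    # Imports live inside the function so this sketch only needs fastai
    # when actually run.
    from fastai.vision.all import (
        untar_data, URLs, ImageDataLoaders, get_image_files,
        Resize, vision_learner, resnet34, accuracy,
    )

    path = untar_data(URLs.PETS) / "images"
    dls = ImageDataLoaders.from_name_func(
        path, get_image_files(path), valid_pct=0.2,
        label_func=lambda f: f.name[0].isupper(),  # cat filenames are capitalized
        item_tfms=Resize(224),
    )
    # vision_learner starts from ImageNet-pretrained weights, so we only
    # need a brief fine-tuning pass rather than training from scratch.
    learn = vision_learner(dls, resnet34, metrics=accuracy)
    learn.fine_tune(epochs)
    return learn

if __name__ == "__main__":
    fine_tune_pets()
```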
Let’s take a look at some examples of self-supervised learning!
Context Encoders: a convolutional neural network trained to generate the contents of an arbitrary image region conditioned on its surroundings.
Ensemble learning is the process by which multiple models, such as classifiers or experts, are strategically generated and combined to solve a particular computational intelligence problem.
Ensembling works because it averages out the errors of many models. The kinds of errors each model makes tend to differ from the others’, so the average of their predictions should be better than any single model’s predictions.
Ensembling works better when the models are diverse, so many ensemble methods promote diverse models to combine.
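A quick way to see the error-averaging effect is with simulated “models”: below, each model predicts the truth plus its own independent noise, and averaging their predictions cuts the mean squared error. This is a toy demonstration of the principle, not a real ensemble method.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(size=1000)

# Three "models": each predicts the truth plus its own independent noise.
preds = [truth + rng.normal(scale=0.5, size=1000) for _ in range(3)]

# Each individual model's mean squared error (~0.25 here).
individual_errors = [np.mean((p - truth) ** 2) for p in preds]

# Averaging the predictions averages out the independent noise,
# so the ensemble's error is lower than any single model's.
ensemble = np.mean(preds, axis=0)
ensemble_error = np.mean((ensemble - truth) ** 2)

print(min(individual_errors), ensemble_error)
```

Because the noise terms are independent, averaging n such models divides the noise variance by roughly n, which is exactly why diverse (uncorrelated) models ensemble better than near-identical ones.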
For example, in Chapter 9 of Fastai’s course, we learned to create a random forest model to predict tabular…
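A random forest like the one in that chapter can be sketched with scikit-learn, which is what fastai’s tabular material builds on. The synthetic dataset here is my own stand-in for illustration, not the course’s data.

```python
# Hedged sketch: a random forest on tabular data with scikit-learn.
# The synthetic dataset is an illustrative stand-in.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

# A random forest is itself an ensemble: many decision trees trained on
# bootstrapped samples of the data, with predictions averaged across trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print(forest.score(X_valid, y_valid))
```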