Jesper Dramsch

Originally published at dramsch.net

Follow these 5 great ML creators on Twitter

How is the Twitter website free?!

If you're getting into machine learning and deep learning, you can get a whole education on Twitter.

Check out these creators:

  • Jean de Nyandwi (@Jeande_d)
  • Abhishek Thakur (@abhi1thakur)
  • Santiago Valdarrama (@svpino)
  • Luiz Gustavo (@gusthema)
  • Philip Vollet (@philipvollet)

Let's look at their work!

🎨 In this thread Jean de Nyandwi goes into depth on ConvNets!

He traces the development from dense networks to convolutional layers, and on to the secret enabler of deep learning: pooling layers.

Sharing great courses and resources along the way!

The image you see below is a typical architecture of Convolutional Neural Networks, a.k.a ConvNets.

ConvNets are a type of neural network architecture mostly used in image recognition tasks.

More about ConvNets 🧡🧡

Image credit: CS 231n pic.twitter.com/fWTMiOUP4r

— Jean de Nyandwi (@Jeande_d) December 2, 2021
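
If you want to see those building blocks together, here's a minimal PyTorch sketch of a ConvNet in that spirit (the layer sizes are made up for illustration): convolutions extract features, pooling shrinks the feature maps, and a dense layer does the final classification.

```python
import torch
import torch.nn as nn

# Minimal ConvNet sketch: convolution -> pooling -> dense head.
# Layer sizes are illustrative, not tuned for any particular dataset.
class TinyConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # convolutional layer
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling layer
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)  # dense layer

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, start_dim=1)
        return self.classifier(x)

# A batch of four 32x32 RGB images -> class logits.
logits = TinyConvNet()(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```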

🔎 But ConvNets have recently been dethroned by Transformers!

In this thread, Abhishek Thakur (4x @Kaggle Grandmaster & Hugging Face AutoNLP)

details an implementation of Transformers in @PyTorch!

"Attention is all you need" implementation from scratch in PyTorch. A Twitter thread:

1/

— abhishek (@abhi1thakur) December 13, 2021
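
The heart of that paper is scaled dot-product attention. Here's a minimal PyTorch sketch of just that one building block, not Abhishek's full implementation (shapes and names are illustrative):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention from "Attention Is All You Need".

    q, k, v: tensors of shape (batch, seq_len, d_model).
    """
    d_k = q.size(-1)
    # Similarity of every query with every key, scaled by sqrt(d_k).
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    # Softmax turns the scores into attention weights that sum to 1.
    weights = torch.softmax(scores, dim=-1)
    # Weighted sum of the values.
    return weights @ v

x = torch.randn(2, 5, 64)                    # 2 sequences, 5 tokens, 64 dims
out = scaled_dot_product_attention(x, x, x)  # self-attention
print(out.shape)                             # torch.Size([2, 5, 64])
```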

🤝 Transformers and ConvNets are united by one player!

The Softmax function (in most cases).

Santiago Valdarrama writes some epic threads (which you've likely seen).

More recently, he started sharing these neat 30-second visuals on core ML concepts:

Softmax is one of the most popular activation functions.

Here is a 30-second introduction to it. pic.twitter.com/cgwWR1vQS9

— Santiago (@svpino) January 15, 2022
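
In code, softmax is just exponentiation followed by normalisation, so every output is positive and they all sum to one. A quick sketch with made-up logits:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.1])   # raw scores from a model (made up)

# Softmax: exponentiate, then normalise so the values sum to 1.
probs = torch.exp(logits) / torch.exp(logits).sum()
print(probs)                              # tensor([0.6590, 0.2424, 0.0986])

# The built-in does the same thing (with better numerical stability).
print(torch.softmax(logits, dim=0))
```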

βœ’οΈ How do you get a text into a Transformer or ConvNet though?!

Images are easy, right? They're just matrices.

For text, you need something called a text embedding, which converts your words into numbers.

So Luiz Gustavo made a thread about Embeddings:

Models like #AlphaCode, #LaMDA, #Copilot, #GPT, #CLIP, #DALL-e depend on a very important concept:

➑️Text Embeddings

But do you know what a Text Embedding is and how to create one?

Do you even need to create one?

Let's take a look...

[1.32 min]

[This is a good one!👀]

1/9🧡

— Luiz GUStavo 💉💉💉🎉 (@gusthema) February 3, 2022
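
The simplest form of the idea is a learnable lookup table: every token id maps to a dense vector that the model trains along with everything else. A tiny PyTorch sketch with a made-up vocabulary:

```python
import torch
import torch.nn as nn

# Toy vocabulary: every known word gets an integer id (made up for illustration).
vocab = {"transformers": 0, "need": 1, "embeddings": 2, "too": 3}

# An embedding layer is a learnable lookup table: one 8-dimensional vector per word.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

sentence = ["transformers", "need", "embeddings", "too"]
token_ids = torch.tensor([vocab[word] for word in sentence])

vectors = embedding(token_ids)
print(vectors.shape)  # torch.Size([4, 8]) -- one vector per word
```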

πŸ› οΈ Finally you need the right tools to train your models, right?

Philip Vollet finds the libraries and apps before they're cool.

He's accidentally DDoS'd a few websites just by sharing them.

Maybe the latest tuner mixed with automatic feature selection using SHAP?

shap-hypetune: a Python package for simultaneous hyperparameter tuning and feature selection for gradient boosting models!

pip install shap-hypetune

Don't forget to star the repository! https://t.co/UYV4GA432t pic.twitter.com/REybOPVX6f

— Philip Vollet (@philipvollet) January 31, 2022
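
shap-hypetune wraps that workflow into one package. If you just want a feel for the underlying idea without relying on its specific API, here's a rough sketch using plain scikit-learn and the shap library on made-up data: tune the hyperparameters first, then rank the features by their mean absolute SHAP value.

```python
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Toy data standing in for your own features and labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Step 1: hyperparameter tuning for a gradient boosting model.
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3]},
    cv=3,
)
search.fit(X, y)

# Step 2: rank features by the mean absolute SHAP value of the tuned model.
explainer = shap.TreeExplainer(search.best_estimator_)
shap_values = explainer.shap_values(X)
importance = np.abs(shap_values).mean(axis=0)

print("best params:", search.best_params_)
print("feature ranking:", np.argsort(importance)[::-1])
```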

TL;DR

In this thread I shared 5 top ML creators on Twitter with content about:

  • ConvNets
  • Transformers
  • Softmax
  • Text Embedding
  • Great Tools (Hyperparameter tuning with shap-based feature selection!)
