James Briggs

Evolution of Natural Language Processing


"Attention Is All You Need" — that is the name of the 2017 paper that introduced attention as a stand-alone learning mechanism, the herald of our now transformer-dominant world in natural language processing (NLP).

Transformers are the current cutting edge in NLP, and they may seem somewhat abstract — but when we look at the past decade of developments in NLP, they begin to make sense.

We will cover these developments and look at how they led to the transformers in use today. This article assumes no prior understanding of these concepts — we will build an intuitive understanding without getting overly technical.

We will cover:

  • Natural Language Neural Nets
    • Recurrence
    • Vanishing Gradients
    • Long-Short Term Memory
    • Attention
  • Attention is All You Need
    • Self-Attention
    • Multi-Head Attention
    • Positional Encoding
    • Transformers
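Since the article builds toward self-attention, here is a minimal sketch of the scaled dot-product attention operation at the heart of the 2017 paper, using only NumPy. The sequence length, embedding size, and random weights are illustrative assumptions, not values from the paper.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one sequence.

    x:             (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    q = x @ w_q  # queries: what each token is looking for
    k = x @ w_k  # keys: what each token offers
    v = x @ w_v  # values: the information to mix together
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # pairwise token similarities, scaled
    # softmax over keys so each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # each output is a weighted sum of all values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # 4 tokens, 8-dimensional embeddings (assumed)
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Notice that no recurrence is involved: every token attends to every other token in a single matrix operation, which is the key departure from the RNN and LSTM models discussed below.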

