
I know a bit about non-linear systems. They are usually represented by differential equations and are feedback systems, where the output is fed back into the input. Many of them create what are called fractals. Fractals can be understood as 'attractors', like the Lorenz attractor used in numerical atmospheric models. I was wondering: in the N-dimensional space of an embedding, would meaning produce fractals?
That's a fascinating connection! Think of it like this: if words are stars in meaning-space, then maybe related concepts naturally cluster into beautiful spiral galaxies (fractals). Each time we feed context back into the transformer, we're like astronomers discovering new constellations of understanding. The universe of language might indeed have its own strange attractors.
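If you want to play with the idea concretely, here is a minimal sketch of the Lorenz system mentioned above, assuming Python with NumPy and the classic parameter values (sigma=10, rho=28, beta=8/3). It is not code from the article; it just shows how repeatedly feeding a system's output back in as its input traces out a strange attractor.

```python
# Toy sketch (assumptions: Python + NumPy; classic Lorenz parameters).
# Illustrates how a feedback system settles onto a strange attractor.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz equations:
    # dx/dt = sigma*(y - x), dy/dt = x*(rho - z) - y, dz/dt = x*y - beta*z
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),
        x * (rho - z) - y,
        x * y - beta * z,
    ])

state = np.array([1.0, 1.0, 1.0])
points = []
for _ in range(10_000):  # feed the output back in as the next input
    state = lorenz_step(state)
    points.append(state)

# Every long trajectory, wherever it starts, ends up tracing the same
# butterfly-shaped attractor in 3D state space.
print(points[-1])
```

Whether embedding spaces have anything like this structure is an open, speculative question, but the feedback loop itself is easy to see in miniature here.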
Wow, brilliant!
Whoa!
In fact, I had a very satisfying conversation with GPT about this subject. If you are interested, let me know and we can find a way to communicate.
Let's go
I tried to find some way to contact you on Vercel, GitHub, LinkedIn, etc., but none of those places offers any lead. As I do not use Twitter, this is the only place to talk. Maybe we could talk on Discord: ttsoares#2710.
Search for me on LinkedIn: fonyuy Gita
Found it, but I can't send messages there as I do not pay them.
The paper that introduced attention gets 1000x less attention than the paper "Attention Is All You Need". ~ Nitin
Init
Excellent post
Thanks for reading!
Some images, like the Feed-Forward NN one, are not appearing for me!
Never mind... it was a cookie issue.
Thank you for that. This blog was meant for beginners interested in Generative AI, just to get a sense of what transformers are.
Still working on a more detailed one. I appreciate it, brother!