Dev.to · 1 min read

Understanding Attention Mechanisms – Part 5: How...

In the previous article, we stopped at using the softmax function to turn the similarity scores into weights. We now scale the values for the first encoded word, “Let’s”, by its weight of 0.4, and the values for the second encoded word, “go”, by its weight of 0.6. Finally, we add the scaled values together. These sums combine the separate encodings of both input words, “Let’s” and “go”, weighted by their similarity to EOS: they are the attention values for EOS. Now, to determine our first output word, we need to feed the attention values i…
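The weighted sum described above can be sketched in a few lines of NumPy. The softmax weights 0.4 and 0.6 come from the article; the value vectors here are placeholders, since the excerpt does not show the actual numbers:

```python
import numpy as np

# Illustrative value vectors for the two encoded input words
# (placeholders; the article's actual values are not shown in this excerpt).
values_lets = np.array([1.0, 2.0])   # values for "Let's"
values_go   = np.array([3.0, 0.5])   # values for "go"

# Softmax weights from each word's similarity to EOS.
weights = np.array([0.4, 0.6])       # "Let's" -> 0.4, "go" -> 0.6

# Scale each word's values by its weight, then add the scaled values:
# this weighted sum is the attention value for EOS.
attention_eos = weights[0] * values_lets + weights[1] * values_go
print(attention_eos)  # -> [2.2 1.1]
```

The same computation generalizes to any number of input words as a matrix product, `weights @ values`, which is how attention is implemented in practice.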
