# attention-models

Here are 6 public repositories matching this topic...


Participants in this Specialization build and train several neural network architectures, including Convolutional Neural Networks, Recurrent Neural Networks, LSTMs, and Transformers, and learn to improve them with techniques such as Dropout, BatchNorm, and Xavier/He initialization.

  • Updated Jan 12, 2024
  • Jupyter Notebook
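
Since the topic is attention models, here is a minimal sketch of scaled dot-product attention, the core operation inside Transformer layers. It is an illustrative NumPy example, not code from any listed repository; the function name, shapes, and random inputs are assumptions for demonstration only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention sketch (illustrative, not from a listed repo).

    Q, K: (seq_len, d_k) query and key matrices; V: (seq_len, d_v) value matrix.
    Returns the attended output and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k) for stable gradients
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mixture of the value vectors
    return weights @ V, weights

# Hypothetical usage: 4 tokens with 8-dimensional keys and values
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape, attn.shape)  # (4, 8) (4, 4)
```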

Add this topic to your repo

To associate your repository with the attention-models topic, visit your repo's landing page and select "manage topics."
