DLNLP-5 QA & Multilinguality Pre-Quiz
In an LSTM network, the two inputs to the forget gate represent:
Addition of useful information to the cell state is done by:
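For reference, a standard formulation of the LSTM cell update (one common presentation; notation may differ across textbooks) shows both points above: the forget gate receives the previous hidden state and the current input, and new information is added to the cell state through the input gate:

```latex
f_t = \sigma\!\left(W_f\,[h_{t-1},\, x_t] + b_f\right) \quad \text{(forget gate)} \\
i_t = \sigma\!\left(W_i\,[h_{t-1},\, x_t] + b_i\right) \quad \text{(input gate)} \\
\tilde{C}_t = \tanh\!\left(W_C\,[h_{t-1},\, x_t] + b_C\right) \quad \text{(candidate cell state)} \\
C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t \quad \text{(cell-state update)}
```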
In the Question Answering (QA) problem, the LSTM/GRU-based layer acts as:
Context-to-Question (C2Q) Attention can be computed via:
Question-to-Context (Q2C) Attention can be computed via:
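For reference, in BiDAF-style models (one common formulation; symbols here follow the usual presentation, where $c_t$ are context vectors and $q_j$ are question vectors) both attention directions are derived from a shared similarity matrix $S$:

```latex
S_{tj} = w_{\text{sim}}^{\top}\,[c_t;\, q_j;\, c_t \circ q_j] \quad \text{(similarity matrix)} \\
a_t = \operatorname{softmax}(S_{t:}), \qquad \tilde{q}_t = \sum_j a_{tj}\, q_j \quad \text{(C2Q attention)} \\
b = \operatorname{softmax}\!\left(\max_j S_{tj}\right), \qquad \tilde{c} = \sum_t b_t\, c_t \quad \text{(Q2C attention)}
```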