Attention-based Neural Networks for Handwriting Recognition

Documentation

Recording of working system: https://smith.zoom.us/rec/share/841_Ne3snhwP3mSduKZu63ctTFzYvdDdCrwsdPvCQWOAFDxka9tsdDTwGGZM3fWw.n5T9sxD24vdlCBzQ Passcode: iZQc4=5s

Fall 2020

This honors thesis aims to improve handwriting recognition by refining the use of attention mechanisms in the sequence-to-sequence and Transformer models used in HTR systems.
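For context, the core mechanism being refined is content-based attention: at each decoding step, the decoder scores every encoder timestep and takes a weighted sum of the encoder outputs as its context vector. Below is a minimal, illustrative PyTorch sketch of additive (Bahdanau-style) attention; the module and dimension names (AdditiveAttention, enc_dim, dec_dim, attn_dim) are placeholders and are not taken from the thesis code.

 # Minimal sketch of additive (Bahdanau-style) attention in PyTorch.
 # Names and dimensions are illustrative only.
 import torch
 import torch.nn as nn
 
 class AdditiveAttention(nn.Module):
     """Scores each encoder timestep against the current decoder state."""
     def __init__(self, enc_dim, dec_dim, attn_dim):
         super().__init__()
         self.enc_proj = nn.Linear(enc_dim, attn_dim)
         self.dec_proj = nn.Linear(dec_dim, attn_dim)
         self.score = nn.Linear(attn_dim, 1)
 
     def forward(self, decoder_state, encoder_outputs):
         # decoder_state: (batch, dec_dim); encoder_outputs: (batch, src_len, enc_dim)
         energy = torch.tanh(
             self.enc_proj(encoder_outputs) + self.dec_proj(decoder_state).unsqueeze(1)
         )                                                # (batch, src_len, attn_dim)
         weights = torch.softmax(self.score(energy).squeeze(-1), dim=1)  # (batch, src_len)
         context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)  # (batch, enc_dim)
         return context, weights
 
 # Quick shape check with random tensors.
 attn = AdditiveAttention(enc_dim=256, dec_dim=256, attn_dim=128)
 ctx, w = attn(torch.randn(2, 256), torch.randn(2, 40, 256))
 print(ctx.shape, w.shape)  # torch.Size([2, 256]) torch.Size([2, 40])

In a Transformer, the same idea appears as scaled dot-product attention over projected queries, keys, and values rather than the additive scoring shown here.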

Week 1: 09/04 - 09/10

Goals:

- Install PyTorch and get something running (see the sanity-check sketch after this list)
- Find a good starting point
  - Review more literature (particularly for sequence-to-sequence models)
  - Distill knowledge from the papers currently cited in the proposal
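
As a first "get something running" step, a short sanity check after installing PyTorch (e.g. with pip install torch) might look like the sketch below; the install command and environment are assumptions, not part of the original notes.

 # Quick sanity check that the PyTorch install works and whether a GPU is visible.
 import torch
 
 x = torch.randn(3, 3)
 y = x @ x.T  # simple tensor op to confirm the runtime works
 print("PyTorch version:", torch.__version__)
 print("CUDA available:", torch.cuda.is_available())
 print("Result shape:", y.shape)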

Bibliography