The course introduces students to methods in deep sequential learning, including recurrent neural networks (RNNs), a class of modern artificial neural networks designed to process sequential data. It then presents the problems of vanishing and exploding gradients, as well as long-distance dependencies in sequential processing, together with their corresponding solutions. Students focus on the theoretical and practical aspects of various sequential architectures (e.g., RNN variants and Transformer architectures) and study their applications, including natural language processing tasks such as language translation, automatic captioning, and handwriting and speech recognition, as well as log/sensor analysis and time-series anomaly detection.
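The vanishing and exploding gradient problems mentioned above can be illustrated with a minimal sketch (not part of the course materials; all names and parameter values are illustrative). During backpropagation through time, the gradient is repeatedly multiplied by the recurrent weight matrix, so its norm shrinks or grows roughly geometrically depending on that matrix's spectral radius:

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_norms(scale, steps=50, dim=8):
    """Track the norm of a gradient vector as it is repeatedly
    multiplied by a random recurrent weight matrix W, mimicking
    backpropagation through time (nonlinearity omitted for clarity)."""
    # Scale controls the typical spectral radius of W.
    W = scale * rng.standard_normal((dim, dim)) / np.sqrt(dim)
    g = np.ones(dim)
    norms = []
    for _ in range(steps):
        g = W.T @ g  # one backward step through time
        norms.append(np.linalg.norm(g))
    return norms

small = gradient_norms(scale=0.5)  # radius < 1: gradients vanish
large = gradient_norms(scale=2.0)  # radius > 1: gradients explode

print(f"after 50 steps: vanishing case norm = {small[-1]:.2e}, "
      f"exploding case norm = {large[-1]:.2e}")
```

With a small weight scale the gradient norm collapses toward zero over long sequences, while a large scale makes it blow up; architectures covered in the course, such as gated RNNs and Transformers, are designed in part to mitigate exactly this behavior.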