Continual learning in deep learning: Yeu-Shin (Guyver) Fu Thesis Proposal Presentation

Deep learning has many applications, but good performance usually comes at the cost of a large dataset and a long training time. Transfer learning tackles the large-dataset requirement by reusing previously trained architectures for new tasks; however, retraining a network on a different task substantially degrades its inference performance on the previous task. This phenomenon is known as catastrophic forgetting. Continual learning is the ability of a machine learning system to use previously learned tasks to guide the learning of new tasks without catastrophic forgetting. We propose to develop continual learning methods that can identify tasks and increase the number of layers in the architecture to handle more diverse sets of tasks. This would allow deep learning methods to learn and recall many tasks, an important property of artificial general intelligence.
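To make catastrophic forgetting concrete, here is a minimal, illustrative sketch (not the thesis method): a single-parameter linear model is fit to task A by gradient descent, then retrained on task B, and its error on task A deteriorates. The tasks and hyperparameters are invented for illustration.

```python
def train(w, xs, ys, lr=0.1, steps=200):
    # plain gradient descent on mean squared error for the model y = w * x
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

def mse(w, xs, ys):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = [1.0, 2.0, 3.0]
task_a = [2.0 * x for x in xs]   # task A: y = 2x
task_b = [-2.0 * x for x in xs]  # task B: y = -2x

w = train(0.0, xs, task_a)
loss_a_before = mse(w, xs, task_a)   # near zero after training on task A

w = train(w, xs, task_b)             # retrain the same weight on task B
loss_a_after = mse(w, xs, task_a)    # task A performance collapses

print(loss_a_before, loss_a_after)
```

A continual learning method would instead preserve low error on task A while acquiring task B, for example by adding capacity (new layers) rather than overwriting the shared weight, as the proposal suggests.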

Tags: continual learning, deep learning, precision agriculture
