We extend the theory of PAC learning in a way which allows us to model a rich variety of learning tasks where the data satisfy special properties that ease the learning process, for example tasks where the distance of the data from the decision boundary is lower bounded by a margin parameter. It is not at all clear that such assumptions can be expressed within the traditional PAC theory. In stark contrast with the classical setting, we show that the ERM principle fails to explain the learnability of partial concept classes: we demonstrate classes that are incredibly easy to learn, yet any algorithm that learns them must use a hypothesis space with unbounded VC dimension. We also find that the sample compression conjecture fails in this setting. Thus, this theory features problems that can be neither represented nor solved within the traditional framework. We view this as evidence that it may provide insight into the nature of learnability in realistic scenarios which the classical theory fails to explain.
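For intuition, here is a brief sketch, in our own notation rather than quoted from the paper, of how partial concepts are typically formalized and why the margin example yields a class of bounded VC dimension:

A partial concept over a domain $\mathcal{X}$ is a function $h : \mathcal{X} \to \{0,1,\star\}$, where $h(x) = \star$ means that $h$ is undefined on $x$. A set $\{x_1,\dots,x_d\} \subseteq \mathcal{X}$ is shattered by a class $\mathcal{H}$ if for every pattern $y \in \{0,1\}^d$ there is some $h \in \mathcal{H}$ with $h(x_i) = y_i$ for all $i$, and $\mathrm{VC}(\mathcal{H})$ is the largest such $d$. As an illustration, $\gamma$-margin halfspaces over the unit ball,
\[
  h_w(x) =
  \begin{cases}
    \operatorname{sign}(\langle w, x \rangle) & \text{if } |\langle w, x \rangle| \ge \gamma,\\
    \star & \text{otherwise,}
  \end{cases}
  \qquad \|w\| \le 1,\ \|x\| \le 1,
\]
form a partial concept class whose VC dimension is $O(1/\gamma^2)$ in every ambient dimension (a standard consequence of margin bounds), whereas total halfspaces in $\mathbb{R}^d$ have VC dimension $d+1$. This is the sense in which a margin assumption is expressible as a partial concept class but not as a bounded-VC class of total functions.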

Author(s) : Noga Alon, Steve Hanneke, Ron Holzman, Shay Moran

Links : PDF - Abstract

Code :

Keywords : PAC learning - partial concept classes - ERM - VC dimension - sample compression
