feathersescapism:
prokopetz:
k9jocks:
prokopetz:
I think the real lesson to take away from most failures of machine learning isn’t “machine learning doesn’t work”, but “humans are bad at explaining things”.
I would love to see a study on whether the mechanical skills of animal training would transfer to teaching machines. Timing and reward placement would be irrelevant, but it would be interesting to see if splitting and shaping could have an effect.
(I mean yes, I am a programmer with an interest in machine learning and a dog trainer, so I could do the study myself. But that would take time, and I am meh about the AI prof at the local uni.)
You know, the idea of an interdisciplinary approach that applies animal training techniques to goal definition for machine learning is both absolutely fascinating and head-slappingly obvious, in retrospect. It’d be kind of hilarious if it turned out that this was yet another one of those cases where the solution was staring us in the face the whole time, but we totally missed it because the STEM majors refused to talk to anyone outside their field!
Also, those of us who spent/spend a lot of time training very small humans etc. might have some ideas. Maybe.
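For what it's worth, the "shaping" idea from the thread does have a rough machine-learning analog: reward shaping in reinforcement learning, where you reward successive approximations of the behaviour instead of only the finished behaviour. A minimal toy sketch (everything here is made up for illustration — the hill-climbing "agent", the hidden-target task, and the two reward functions are just assumptions to make the contrast visible):

```python
import random

def train(steps, reward_fn):
    """Toy hill-climbing 'agent': nudges a guess toward a hidden target,
    keeping a change only when the reward signal doesn't get worse."""
    target, guess = 50, 0
    for _ in range(steps):
        candidate = guess + random.choice([-3, -1, 1, 3])
        if reward_fn(candidate, target) >= reward_fn(guess, target):
            guess = candidate
    return abs(guess - target)  # final error: 0 means it learned the task

# Sparse reward: only the exact target behaviour pays off -- the agent
# gets no useful feedback until it stumbles on the answer by accident.
def sparse(guess, target):
    return 1 if guess == target else 0

# Shaped reward: successive approximations pay off, like rewarding a dog
# for each small step toward the full behaviour.
def shaped(guess, target):
    return -abs(guess - target)

random.seed(0)
print("sparse error:", train(200, sparse))
print("shaped error:", train(200, shaped))
```

With the shaped reward the agent reliably climbs straight to the target; with the sparse reward it just random-walks until it happens to hit the answer, which is more or less the "humans are bad at explaining things" failure mode in miniature.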