
Our current economic model around AI is going to teach us more about psychology than fundamental physics. I expect we'll become more manipulative but otherwise not a lot smarter.

Funny thing is, AI also provides good models for where this is going. Years ago I saw a CNN + RL agent that explored an old-school 2D maze rendered in 3D. It got stuck in fewer loops once the researchers gave it a novelty-seeking loss function. But then they put a "TV" showing random images into the maze, and the agent just plunked down and watched TV, forever.
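For anyone who hasn't seen it, the failure is easy to reproduce in a toy. A common way to implement novelty-seeking is to reward the agent with its forward model's prediction error; a deterministic corridor becomes predictable and boring, while random static stays "surprising" forever. This is just a sketch under that assumption (the running-mean "model" and the numbers are made up, not from the actual paper):

```python
import random

random.seed(0)

class ForwardModel:
    """Stand-in for a learned forward model: predicts the next
    observation as the running mean of everything seen so far."""
    def __init__(self):
        self.mean = 0.0
        self.n = 0
    def predict(self):
        return self.mean
    def update(self, obs):
        self.n += 1
        self.mean += (obs - self.mean) / self.n

def total_curiosity(observations):
    """Curiosity reward = accumulated prediction error."""
    model = ForwardModel()
    reward = 0.0
    for obs in observations:
        reward += abs(obs - model.predict())
        model.update(obs)
    return reward

# A boring deterministic corridor: the model nails it after one step.
corridor = [1.0] * 1000
# The "TV": random static the model can never predict.
tv = [random.random() for _ in range(1000)]

print(total_curiosity(corridor))  # small, decays to zero
print(total_curiosity(tv))        # large, never stops paying out
```

The TV dominates because its prediction error is irreducible, so a pure novelty bonus funnels the agent straight to it. (The real fixes, like random network distillation, reward only the *learnable* part of the surprise.)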

Healthy humans have countermeasures around these things, but breaking them down is now a billion-dollar industry. With where this money is going, there's good reason to think the first unarguably transcendent AGI (if it ever emerges) will mostly transcend us in its ability to manipulate.




