Or how the AI alignment community is forgetting that novel engineering is very, very difficult
So, warning about complacency regarding AGI's interactions with the physical world seems to be the best solution. At least that was my takeaway from your conclusion. More optimistic than Zvi today. So thanks!