I’m listening to the “Philosophize This!” podcast episode on consciousness, and (doggone it!) there’s a section on whether free will is an illusion. The host mentioned the alleged connection between free will and moral agency — i.e., if we don’t have free will, we can’t be responsible for our actions.
What came to mind was a society (perhaps in the near future) in which we have humanoid robots.
I can easily imagine a scenario where (1) we do not believe the robots have free will, yet (2) we still hold them accountable for their actions. E.g., we would dismantle bad robots, or limit their sphere of action or influence, even if they experienced that as punishment.
IOW, it doesn’t seem obvious to me that a lack of free will means we are not responsible for our actions.