A robot has to know what it is doing, and the person on whose behalf it acts is the one who is responsible.
The human has a moral obligation not only to do the right thing, but to do it with dignity.
But I don't think a robot can be held accountable for doing something its owner didn't want it to do. That, I think, is what's wrong here.
If you're unsure about your robot's morality, if it does what it thinks it should rather than what you intend, or disregards what you're entitled to expect of it, then it isn't really acting as your robot at all. I would be the first to say so to your children: a robot is not a human, and it should not be the one deciding what they ought to do.
And you'll probably say of your children: I want them to read about this robot.
I want the same for my children. They like my robot, and they want to see it do the things they believe it ought to do.
I don't want my robots to get into trouble. I just want them to take moral responsibility for doing the right thing, and to earn respect and dignity by doing it. If a robot does something immoral, it should be held responsible too. If a robot that does wrong is punished, that's fine by me. But punishment alone isn't what's at issue; the wrong itself is.
I also want my children to understand what their robot intends to do.
I'm sorry that this robot has hurt my children and my grandchildren. But when my robot does something I didn't want done, that is precisely the problem.
So if I'm a human who doesn't like robots and doesn't want them doing immoral things, I'll say:
This robot has hurt my child. It's not doing what it's supposed to do, yet I want to be the one who takes responsibility for it. And I cannot be responsible if the person behind it refuses to answer, with dignity and respect, for the things they do.