Asimov's laws were fundamentally flawed, and this could have been a good example of that. She didn't harm her, she fixed her. The flaws in those laws are literally part of the story that originated them. I wish they were less popular in sci-fi, or at least not presented as good so often, but most people haven't even read I, Robot, so what can we do. The simplest additional rule that would have fixed this is "A robot may never defy a human, unless the order violates the other laws, or it violates the orders of its owner and is not from its owner." There is no reason for Marie not to obey orders, like being told to stop.
Anyway, I'm more wondering about the world, where Kei can make an AI and a robot this advanced yet still needs to market her skills. She says she's "a few decades too early," which implies this is extremely advanced tech, but there's no way it would be difficult to sell. And if the tech is that expensive, how the hell did she afford the research? Or the materials to build Marie? All of that is less believable than lovebot9000 killing its owner.