A major problem with longtermism is that it presumes to speak for future people who are entirely theoretical, whose needs are impossible to accurately predict. It also deprioritizes immediate problems.
So Elon Musk is associated with longtermism (self-proclaimed). He might consider that interplanetary travel is in the best interest of mankind in the future (reasonable). As a longtermist he would then feel a moral responsibility to advance interplanetary travel technology. So far, so good.
But the catch is that he might feel his moral responsibility to advance space travel by funding his rocket company is far more important than his moral responsibility to safeguard the well-being of his employees by not overworking them.
I mean, after all, yeah, it might ruin the personal lives of a hundred, two hundred, even a thousand people, but what's that compared to the benefit advancing this technology will bring to all mankind? There are going to be billions of people benefiting from this in the future!
But that's not really true, because we can't be certain those billions of people will even exist, let alone benefit. The people suffering at his rocket company, on the other hand, absolutely do exist, and their suffering is not theoretical.
The greatest criticism of this line of thought is that it gives people, or at the moment billionaires, permission to do whatever the fuck they want.
Sure, flying on a private jet is ruinous to the environment, but I need to do it so I can manage my company, which will create an AI that will make everything better...
They fr read Foundation and missed the fact that Hari Seldon's predictions fell apart very early on, and he had the benefit of magical foresight.