mirror of
https://github.com/RPCS3/llvm-mirror.git
synced 2024-11-23 19:23:23 +01:00
AArch64: always clear kill flags up to last eliminated copy
After r261154, we were only clearing flags if the known-zero register was originally live-in to the basic block, but we have to do it even if not when more than one COPY has been eliminated; otherwise the user of the first COPY may still have <kill> marked. E.g.:

    BB#N:
        %X0 = COPY %XZR
        STRXui %X0<kill>, <fi#0>
        %X0 = COPY %XZR
        STRXui %X0<kill>, <fi#1>

We can eliminate both copies, X0 is not live-in, but we must clear the kill on the first store.

Unfortunately, I've been unable to come up with a non-fragile test for this. I've only seen it in the wild with regalloc-created spills, and attempts to reproduce that in a reasonable way run afoul of COPY coalescing. Even volatile asm clobbers were moved around.

Should fix the aarch64 bot though.

llvm-svn: 261175
This commit is contained in:
parent
71fa6fafdf
commit
2fedd64241
@ -149,15 +149,15 @@ bool AArch64RedundantCopyElimination::optimizeCopy(MachineBasicBlock *MBB) {
       // CBZ/CBNZ. Conservatively mark as much as we can live.
       CompBr->clearRegisterKills(SmallestDef, TRI);

-      // Clear any kills of TargetReg between CompBr and MI.
-      if (std::any_of(TargetRegs.begin(), TargetRegs.end(),
-                      [&](unsigned Reg) { return MBB->isLiveIn(Reg); })) {
-        for (MachineInstr &MMI :
-             make_range(MBB->begin()->getIterator(), LastChange->getIterator()))
-          MMI.clearRegisterKills(SmallestDef, TRI);
-      } else
+      if (std::none_of(TargetRegs.begin(), TargetRegs.end(),
+                       [&](unsigned Reg) { return MBB->isLiveIn(Reg); }))
         MBB->addLiveIn(TargetReg);

+      // Clear any kills of TargetReg between CompBr and the last removed COPY.
+      for (MachineInstr &MMI :
+           make_range(MBB->begin()->getIterator(), LastChange->getIterator()))
+        MMI.clearRegisterKills(SmallestDef, TRI);

       return true;
     }

@ -91,4 +91,4 @@ false:
 true:
   store volatile i64 %in, i64* %dest
   ret i32 0
 }