Re: [PATCH v2] x86/asm: Replace __force_order with memory clobber

From: Arvind Sankar
Date: Wed Sep 02 2020 - 13:37:07 EST


On Wed, Sep 02, 2020 at 12:16:24PM -0500, Segher Boessenkool wrote:
> On Wed, Sep 02, 2020 at 11:33:46AM -0400, Arvind Sankar wrote:
> > The CRn accessor functions use __force_order as a dummy operand to
> > prevent the compiler from reordering the inline asm.
> >
> > The fact that the asm is volatile should be enough to prevent this
> > already, however older versions of GCC had a bug that could sometimes
> > result in reordering. This was fixed in 8.1, 7.3 and 6.5. Versions prior
> > to these, including 5.x and 4.9.x, may reorder volatile asm.
>
> Reordering them amongst themselves. Yes, that is bad. Reordering them
> with "random" code is Just Fine.

Right, that's what I meant, but the text doesn't make it clear. I'll reword it.
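
For context, the existing accessors use __force_order as a dummy operand to
tie the asms together; from memory the current pattern looks roughly like
this (a sketch, not a verbatim copy of special_insns.h):

extern unsigned long __force_order;

static inline unsigned long native_read_cr0(void)
{
	unsigned long val;
	/* fake "=m" output on __force_order orders this read vs. the writes */
	asm volatile("mov %%cr0,%0\n\t" : "=r" (val), "=m" (__force_order));
	return val;
}

static inline void native_write_cr0(unsigned long val)
{
	/* fake "m" input on __force_order orders this write vs. the reads */
	asm volatile("mov %0,%%cr0" : : "r" (val), "m" (__force_order));
}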

>
> Volatile asm should be executed on the real machine exactly as often as
> on the C abstract machine, and in the same order. That is all.
>
> > + * The compiler should not reorder volatile asm,
>
> So, this comment needs work. And perhaps the rest of the patch as well?
>
>
> Segher

I think the patch itself is ok: we only need to prevent volatile asm from
being reordered with respect to other volatile asm. But the comment needs
clarification.
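
To make that concrete, the direction of the change is roughly the following
(a sketch only, assuming the writes pick up a "memory" clobber and the reads
rely on asm volatile alone; the actual patch may differ in detail):

static inline unsigned long native_read_cr0(void)
{
	unsigned long val;
	/* volatile keeps this read ordered against other volatile asm */
	asm volatile("mov %%cr0,%0\n\t" : "=r" (val));
	return val;
}

static inline void native_write_cr0(unsigned long val)
{
	/*
	 * "memory" clobber: the CR write may change state the compiler
	 * cannot see, so memory accesses must not be cached across it.
	 */
	asm volatile("mov %0,%%cr0" : : "r" (val) : "memory");
}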

Thanks.