Re: [PATCH] arm64: fix clang integrated assembler build

From: Nick Desaulniers
Date: Wed May 27 2020 - 14:03:39 EST


On Wed, May 27, 2020 at 7:14 AM Arnd Bergmann <arnd@xxxxxxxx> wrote:
>
> clang's integrated assembler and gas seem to interpret the symbol
> directives in memmove.S and memset.S differently, such that clang
> does not mark the functions 'weak' as expected, which leads to a
> duplicate-symbol linker error with both ld.bfd and ld.lld:
>
> ld.lld: error: duplicate symbol: memmove
> >>> defined at common.c
> >>> kasan/common.o:(memmove) in archive mm/built-in.a
> >>> defined at memmove.o:(__memmove) in archive arch/arm64/lib/lib.a
>
> ld.lld: error: duplicate symbol: memset
> >>> defined at common.c
> >>> kasan/common.o:(memset) in archive mm/built-in.a
> >>> defined at memset.o:(__memset) in archive arch/arm64/lib/lib.a
>
> Copy the exact way these symbols are declared in the x86 version,
> arch/x86/lib/memcpy_64.S, which does not have the same problem.
>
> I don't know why this makes a difference, and it would be good
> to have someone with a better understanding of assembler internals
> review it.
>
> It might be either a bug in the kernel or a bug in the assembler;
> I have no idea which. My patch makes the build work with all
> versions of clang and gcc, which is probably helpful even if it
> only works around a clang bug.

+ Bill, Fangrui, Jian
I think we saw this bug, or a very similar one, internally; it came
down to the ordering of .weak relative to .global.
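
If it is the same issue, a standalone reproducer should be about this
small (hypothetical sketch, not taken from our report; the directive
order mirrors what the old `.weak memset` + SYM_FUNC_START_PI(memset)
combination would have produced):

	/* weak-order.S: hypothetical reproducer for the suspected
	 * divergence.  GNU as appears to keep the symbol STB_WEAK when
	 * both directives are given, while clang's integrated assembler
	 * seems to let the later .globl win, leaving it STB_GLOBAL.
	 */
	.weak	memset		/* early request for a weak binding   */
	.globl	memset		/* later .globl, as the macro emits it */
	.type	memset, %function
memset:
	ret

Assembling that with gas and with clang's integrated assembler and
comparing the `readelf -sW` output of the two objects should confirm
whether the binding really differs; if it does, the duplicate-symbol
error against KASAN's strong memset follows directly.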

>
> Cc: stable@xxxxxxxxxxxxxxx
> Signed-off-by: Arnd Bergmann <arnd@xxxxxxxx>
> ---
> arch/arm64/lib/memcpy.S | 3 +--
> arch/arm64/lib/memmove.S | 3 +--
> arch/arm64/lib/memset.S | 3 +--
> 3 files changed, 3 insertions(+), 6 deletions(-)
>
> diff --git a/arch/arm64/lib/memcpy.S b/arch/arm64/lib/memcpy.S
> index e0bf83d556f2..dc8d2a216a6e 100644
> --- a/arch/arm64/lib/memcpy.S
> +++ b/arch/arm64/lib/memcpy.S
> @@ -56,9 +56,8 @@
> stp \reg1, \reg2, [\ptr], \val
> .endm
>
> - .weak memcpy
> SYM_FUNC_START_ALIAS(__memcpy)
> -SYM_FUNC_START_PI(memcpy)
> +SYM_FUNC_START_WEAK_PI(memcpy)
> #include "copy_template.S"
> ret
> SYM_FUNC_END_PI(memcpy)
> diff --git a/arch/arm64/lib/memmove.S b/arch/arm64/lib/memmove.S
> index 02cda2e33bde..1035dce4bdaf 100644
> --- a/arch/arm64/lib/memmove.S
> +++ b/arch/arm64/lib/memmove.S
> @@ -45,9 +45,8 @@ C_h .req x12
> D_l .req x13
> D_h .req x14
>
> - .weak memmove
> SYM_FUNC_START_ALIAS(__memmove)
> -SYM_FUNC_START_PI(memmove)
> +SYM_FUNC_START_WEAK_PI(memmove)
> cmp dstin, src
> b.lo __memcpy
> add tmp1, src, count
> diff --git a/arch/arm64/lib/memset.S b/arch/arm64/lib/memset.S
> index 77c3c7ba0084..a9c1c9a01ea9 100644
> --- a/arch/arm64/lib/memset.S
> +++ b/arch/arm64/lib/memset.S
> @@ -42,9 +42,8 @@ dst .req x8
> tmp3w .req w9
> tmp3 .req x9
>
> - .weak memset
> SYM_FUNC_START_ALIAS(__memset)
> -SYM_FUNC_START_PI(memset)
> +SYM_FUNC_START_WEAK_PI(memset)
> mov dst, dstin /* Preserve return value. */
> and A_lw, val, #255
> orr A_lw, A_lw, A_lw, lsl #8
> --
> 2.26.2
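
FWIW, my (unverified) reading of the fix is that with
SYM_FUNC_START_WEAK_PI() the only binding directive ever emitted for
the C-visible symbol is the weak one, so there is no later .globl for
the integrated assembler to reinterpret. Roughly, and this is a sketch
of my assumption rather than the actual macro expansion from
linkage.h:

	/* What I expect the new code to boil down to for memset; the
	 * real directives come from the SYM_* macros, which I have not
	 * traced in detail.
	 */
	.weak	memset		/* weak binding only, no later .globl */
	.type	memset, %function
memset:
	/* ... body ... */
	ret
	/* KASAN's strong memset in mm/kasan/common.o then overrides
	 * this weak definition at link time, as intended. */

With that, `readelf -sW` on the built memset.o should report memset as
WEAK with both assemblers.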

--
Thanks,
~Nick Desaulniers