[tip: x86/cpu] x86/lib: Add fast-short-rep-movs check to copy_user_enhanced_fast_string()

From: tip-bot2 for Tony Luck
Date: Wed Dec 29 2021 - 07:57:09 EST


The following commit has been merged into the x86/cpu branch of tip:

Commit-ID: 244122b4d2e5221e6abd6e21d6a58170104db781
Gitweb: https://git.kernel.org/tip/244122b4d2e5221e6abd6e21d6a58170104db781
Author: Tony Luck <tony.luck@xxxxxxxxx>
AuthorDate: Thu, 16 Dec 2021 09:24:31 -08:00
Committer: Borislav Petkov <bp@xxxxxxx>
CommitterDate: Wed, 29 Dec 2021 13:46:02 +01:00

x86/lib: Add fast-short-rep-movs check to copy_user_enhanced_fast_string()

Commit

f444a5ff95dc ("x86/cpufeatures: Add support for fast short REP; MOVSB")

fixed memmove() with an ALTERNATIVE that will use REP MOVSB for all
string lengths.

copy_user_enhanced_fast_string() has a similar runtime check to avoid
using REP MOVSB for copies shorter than 64 bytes.

Add an ALTERNATIVE to patch out the short length check and always use
REP MOVSB on X86_FEATURE_FSRM CPUs.

Signed-off-by: Tony Luck <tony.luck@xxxxxxxxx>
Signed-off-by: Borislav Petkov <bp@xxxxxxx>
Link: https://lore.kernel.org/r/20211216172431.1396371-1-tony.luck@xxxxxxxxx
---
arch/x86/lib/copy_user_64.S | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/arch/x86/lib/copy_user_64.S b/arch/x86/lib/copy_user_64.S
index 2797e63..1c429f0 100644
--- a/arch/x86/lib/copy_user_64.S
+++ b/arch/x86/lib/copy_user_64.S
@@ -200,8 +200,8 @@ EXPORT_SYMBOL(copy_user_generic_string)
*/
SYM_FUNC_START(copy_user_enhanced_fast_string)
ASM_STAC
- cmpl $64,%edx
- jb .L_copy_short_string /* less then 64 bytes, avoid the costly 'rep' */
+ /* CPUs without FSRM should avoid rep movsb for short copies */
+ ALTERNATIVE "cmpl $64, %edx; jb .L_copy_short_string", "", X86_FEATURE_FSRM
movl %edx,%ecx
1: rep
movsb