[PATCH 5.5 056/150] powerpc/hugetlb: Fix 8M hugepages on 8xx

From: Greg Kroah-Hartman
Date: Thu Feb 27 2020 - 09:23:08 EST


From: Christophe Leroy <christophe.leroy@xxxxxx>

commit 50a175dd18de7a647e72aca7daf4744e3a5a81e3 upstream.

With HW assistance, all page tables must be 4k aligned: the 8xx drops
the last 12 bits of the page table address during the walk.

Redefine HUGEPD_SHIFT_MASK to mask the last 12 bits out.
HUGEPD_SHIFT_MASK is used for alignment of the page table cache.
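
The alignment requirement is enforced when the page table caches are
created: pgtable_cache_add() in arch/powerpc/mm/init-common.c derives
the minimum alignment of each cache from HUGEPD_SHIFT_MASK, roughly as
in this simplified sketch (not a verbatim copy of that file):

	unsigned long table_size = sizeof(void *) << shift;
	/* Hugepage table pointers carry a shift value in their low
	 * bits, so every table must be aligned to leave those bits
	 * free.  With HUGEPD_SHIFT_MASK = 0xfff on 8xx, minalign is
	 * 0x1000 and the cache objects become 4k aligned, as the HW
	 * tablewalk assistance expects.
	 */
	unsigned long minalign = max(MAX_PGTABLE_INDEX_SIZE + 1,
				     HUGEPD_SHIFT_MASK + 1);
	unsigned long align = max_t(unsigned long, table_size, minalign);

	new = kmem_cache_create(name, table_size, align, 0, ctor(shift));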

Fixes: 22569b881d37 ("powerpc/8xx: Enable 8M hugepage support with HW assistance")
Cc: stable@xxxxxxxxxxxxxxx # v5.0+
Signed-off-by: Christophe Leroy <christophe.leroy@xxxxxx>
Signed-off-by: Michael Ellerman <mpe@xxxxxxxxxxxxxx>
Link: https://lore.kernel.org/r/778b1a248c4c7ca79640eeff7740044da6a220a0.1581264115.git.christophe.leroy@xxxxxx
Signed-off-by: Greg Kroah-Hartman <gregkh@xxxxxxxxxxxxxxxxxxx>

---
arch/powerpc/include/asm/page.h | 5 +++++
1 file changed, 5 insertions(+)

--- a/arch/powerpc/include/asm/page.h
+++ b/arch/powerpc/include/asm/page.h
@@ -295,8 +295,13 @@ static inline bool pfn_valid(unsigned lo
 /*
  * Some number of bits at the level of the page table that points to
  * a hugepte are used to encode the size. This masks those bits.
+ * On 8xx, HW assistance requires 4k alignment for the hugepte.
  */
+#ifdef CONFIG_PPC_8xx
+#define HUGEPD_SHIFT_MASK	0xfff
+#else
 #define HUGEPD_SHIFT_MASK	0x3f
+#endif
 
 #ifndef __ASSEMBLY__