2021-05-12 09:36:43

by Borislav Petkov

Subject: [PATCH] x86/asm: Simplify __smp_mb() definition

From: Borislav Petkov <[email protected]>

Drop the bitness ifdeffery in favor of using _ASM_SP, the helper
macro for the rSP register specification on 32-bit and 64-bit builds.

No functional changes.

Signed-off-by: Borislav Petkov <[email protected]>
---
arch/x86/include/asm/barrier.h | 7 ++-----
1 file changed, 2 insertions(+), 5 deletions(-)

diff --git a/arch/x86/include/asm/barrier.h b/arch/x86/include/asm/barrier.h
index 4819d5e5a335..3ba772a69cc8 100644
--- a/arch/x86/include/asm/barrier.h
+++ b/arch/x86/include/asm/barrier.h
@@ -54,11 +54,8 @@ static inline unsigned long array_index_mask_nospec(unsigned long index,
#define dma_rmb() barrier()
#define dma_wmb() barrier()

-#ifdef CONFIG_X86_32
-#define __smp_mb() asm volatile("lock; addl $0,-4(%%esp)" ::: "memory", "cc")
-#else
-#define __smp_mb() asm volatile("lock; addl $0,-4(%%rsp)" ::: "memory", "cc")
-#endif
+#define __smp_mb() asm volatile("lock; addl $0,-4(%%" _ASM_SP ")" ::: "memory", "cc")
+
#define __smp_rmb() dma_rmb()
#define __smp_wmb() barrier()
#define __smp_store_mb(var, value) do { (void)xchg(&var, value); } while (0)
--
2.29.2
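
For reference, _ASM_SP is defined in arch/x86/include/asm/asm.h. A
minimal sketch of the selection it provides (simplified here; the
kernel derives it through its __ASM_REG()/__ASM_SEL() helpers rather
than defining it directly like this):

#ifdef CONFIG_X86_32
# define _ASM_SP "esp"	/* 32-bit stack pointer name */
#else
# define _ASM_SP "rsp"	/* 64-bit stack pointer name */
#endif

/*
 * C string-literal concatenation splices the register name into the
 * asm template at preprocessing time, so
 *
 *	"lock; addl $0,-4(%%" _ASM_SP ")"
 *
 * becomes "lock; addl $0,-4(%%esp)" on 32-bit and
 * "lock; addl $0,-4(%%rsp)" on 64-bit: one definition for both.
 */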


2021-05-12 09:50:58

by Peter Zijlstra

Subject: Re: [PATCH] x86/asm: Simplify __smp_mb() definition

On Wed, May 12, 2021 at 11:33:10AM +0200, Borislav Petkov wrote:
> From: Borislav Petkov <[email protected]>
>
> Drop the bitness ifdeffery in favor of using _ASM_SP, the helper
> macro for the rSP register specification on 32-bit and 64-bit builds.
>
> No functional changes.
>
> Signed-off-by: Borislav Petkov <[email protected]>
> ---
> arch/x86/include/asm/barrier.h | 7 ++-----
> 1 file changed, 2 insertions(+), 5 deletions(-)
>
> diff --git a/arch/x86/include/asm/barrier.h b/arch/x86/include/asm/barrier.h
> index 4819d5e5a335..3ba772a69cc8 100644
> --- a/arch/x86/include/asm/barrier.h
> +++ b/arch/x86/include/asm/barrier.h
> @@ -54,11 +54,8 @@ static inline unsigned long array_index_mask_nospec(unsigned long index,
> #define dma_rmb() barrier()
> #define dma_wmb() barrier()
>
> -#ifdef CONFIG_X86_32
> -#define __smp_mb() asm volatile("lock; addl $0,-4(%%esp)" ::: "memory", "cc")
> -#else
> -#define __smp_mb() asm volatile("lock; addl $0,-4(%%rsp)" ::: "memory", "cc")
> -#endif
> +#define __smp_mb() asm volatile("lock; addl $0,-4(%%" _ASM_SP ")" ::: "memory", "cc")
> +
> #define __smp_rmb() dma_rmb()
> #define __smp_wmb() barrier()
> #define __smp_store_mb(var, value) do { (void)xchg(&var, value); } while (0)

Acked-by: Peter Zijlstra (Intel) <[email protected]>

by tip-bot2 for Borislav Petkov

Subject: [tip: x86/cleanups] x86/asm: Simplify __smp_mb() definition

The following commit has been merged into the x86/cleanups branch of tip:

Commit-ID: 1bc67873d401e6c2e6e30be7fef21337db07a042
Gitweb: https://git.kernel.org/tip/1bc67873d401e6c2e6e30be7fef21337db07a042
Author: Borislav Petkov <[email protected]>
AuthorDate: Wed, 12 May 2021 11:33:10 +02:00
Committer: Ingo Molnar <[email protected]>
CommitterDate: Wed, 12 May 2021 12:22:57 +02:00

x86/asm: Simplify __smp_mb() definition

Drop the bitness ifdeffery in favor of using _ASM_SP,
which is the helper macro for the rSP register specification
on 32-bit and 64-bit builds.

No functional changes.

Signed-off-by: Borislav Petkov <[email protected]>
Signed-off-by: Ingo Molnar <[email protected]>
Acked-by: Peter Zijlstra (Intel) <[email protected]>
Link: https://lore.kernel.org/r/[email protected]
---
arch/x86/include/asm/barrier.h | 7 ++-----
1 file changed, 2 insertions(+), 5 deletions(-)

diff --git a/arch/x86/include/asm/barrier.h b/arch/x86/include/asm/barrier.h
index 4819d5e..3ba772a 100644
--- a/arch/x86/include/asm/barrier.h
+++ b/arch/x86/include/asm/barrier.h
@@ -54,11 +54,8 @@ static inline unsigned long array_index_mask_nospec(unsigned long index,
#define dma_rmb() barrier()
#define dma_wmb() barrier()

-#ifdef CONFIG_X86_32
-#define __smp_mb() asm volatile("lock; addl $0,-4(%%esp)" ::: "memory", "cc")
-#else
-#define __smp_mb() asm volatile("lock; addl $0,-4(%%rsp)" ::: "memory", "cc")
-#endif
+#define __smp_mb() asm volatile("lock; addl $0,-4(%%" _ASM_SP ")" ::: "memory", "cc")
+
#define __smp_rmb() dma_rmb()
#define __smp_wmb() barrier()
#define __smp_store_mb(var, value) do { (void)xchg(&var, value); } while (0)
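
Why this works as a full barrier: on x86, any LOCK-prefixed
read-modify-write instruction orders all earlier loads and stores
before all later ones, and a locked add of zero to a dummy word just
below the stack pointer is typically cheaper than MFENCE. A
self-contained userspace illustration of the same idiom (hardcoded to
64-bit here; the names are illustrative only, not kernel API):

#include <stdio.h>

/* Same construct as __smp_mb() above, with %rsp spelled out. */
#define full_mb() \
	asm volatile("lock; addl $0,-4(%%rsp)" ::: "memory", "cc")

static int data, flag;

int main(void)
{
	data = 42;
	full_mb();	/* store to data is ordered before the store to flag */
	flag = 1;
	printf("%d %d\n", data, flag);
	return 0;
}

The add targets a don't-care location (the word below the stack
pointer), so the value written is irrelevant; the LOCK prefix alone
provides the ordering, and the "memory" clobber keeps the compiler
from reordering accesses across it.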