-stable review patch. If anyone has any objections, please let us know.

---------------------

From: Thomas Gleixner <[email protected]>

The exception fixup for the futex macros __futex_atomic_op1/2 and
futex_atomic_cmpxchg_inatomic() is missing an entry when the lock
prefix is replaced by a NOP via SMP alternatives.
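
For reference (not part of the patch itself), this is roughly the fault
handling in the 32-bit __futex_atomic_op2, operands trimmed, so read it
as a sketch of the mechanism rather than an exact quote of the header:

  "1:	movl	%2, %0\n"
  "	movl	%0, %3\n"
  	insn "\n"
  "2:	" LOCK_PREFIX "cmpxchgl %3, %2\n"
  "	jnz	1b\n"
  "3:	.section .fixup,\"ax\"\n"
  "4:	mov	%5, %1\n"
  "	jmp	3b\n"
  "	.previous\n"
  "	.section __ex_table,\"a\"\n"
  "	.align	8\n"
  "	.long	1b,4b,2b,4b\n"		/* fault at 1b or 2b: resume at 4b */
  "	.previous"

The exception table entry for label 2 is taken at the lock prefix byte.
Once alternatives turn that byte into a NOP on a UP boot, the cmpxchgl
starts one byte later, a fault in it no longer matches any entry, and
the kernel oopses instead of returning -EFAULT to the futex code.
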
Chuck Ebbert tracked this down from the information provided in:
https://bugzilla.redhat.com/show_bug.cgi?id=429412

A possible solution would be to add another fixup after the
LOCK_PREFIX, so both the LOCK and NOP case have their own entry in the
exception table, but it's not really worth the trouble.

Simply replace LOCK_PREFIX with lock and keep those untouched by SMP
alternatives.
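
For comparison, this is roughly how LOCK_PREFIX was defined for
CONFIG_SMP kernels at the time (include/asm-x86/alternative_32.h);
again a sketch of the mechanism, not an exact quote:

  #ifdef CONFIG_SMP
  #define LOCK_PREFIX \
  		".section .smp_locks,\"a\"\n"	\
  		"  .align 4\n"			\
  		"  .long 661f\n" /* address of the prefix */ \
  		"661:\n\tlock; "
  #else
  #define LOCK_PREFIX ""
  #endif

The .smp_locks entry is what allows the prefix byte to be patched to a
NOP when booting on a single CPU.  A literal "lock ;" is never recorded
there, so the instruction layout, and with it the exception table,
stays correct on both UP and SMP; the only cost is a redundant lock
prefix on UP kernels.
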
Signed-off-by: Thomas Gleixner <[email protected]>
Signed-off-by: Ingo Molnar <[email protected]>
[[email protected]: backport to 2.6.24]
Signed-off-by: Chris Wright <[email protected]>
Signed-off-by: Greg Kroah-Hartman <[email protected]>
---
include/asm-x86/futex_32.h | 6 +++---
include/asm-x86/futex_64.h | 6 +++---
2 files changed, 6 insertions(+), 6 deletions(-)
--- a/include/asm-x86/futex_32.h
+++ b/include/asm-x86/futex_32.h
@@ -28,7 +28,7 @@
"1: movl %2, %0\n\
movl %0, %3\n" \
insn "\n" \
-"2: " LOCK_PREFIX "cmpxchgl %3, %2\n\
+"2: lock ; cmpxchgl %3, %2\n\
jnz 1b\n\
3: .section .fixup,\"ax\"\n\
4: mov %5, %1\n\
@@ -68,7 +68,7 @@ futex_atomic_op_inuser (int encoded_op,
#endif
switch (op) {
case FUTEX_OP_ADD:
- __futex_atomic_op1(LOCK_PREFIX "xaddl %0, %2", ret,
+ __futex_atomic_op1("lock ; xaddl %0, %2", ret,
oldval, uaddr, oparg);
break;
case FUTEX_OP_OR:
@@ -111,7 +111,7 @@ futex_atomic_cmpxchg_inatomic(int __user
return -EFAULT;

__asm__ __volatile__(
- "1: " LOCK_PREFIX "cmpxchgl %3, %1 \n"
+ "1: lock ; cmpxchgl %3, %1 \n"

"2: .section .fixup, \"ax\" \n"
"3: mov %2, %0 \n"
--- a/include/asm-x86/futex_64.h
+++ b/include/asm-x86/futex_64.h
@@ -27,7 +27,7 @@
"1: movl %2, %0\n\
movl %0, %3\n" \
insn "\n" \
-"2: " LOCK_PREFIX "cmpxchgl %3, %2\n\
+"2: lock ; cmpxchgl %3, %2\n\
jnz 1b\n\
3: .section .fixup,\"ax\"\n\
4: mov %5, %1\n\
@@ -62,7 +62,7 @@ futex_atomic_op_inuser (int encoded_op,
__futex_atomic_op1("xchgl %0, %2", ret, oldval, uaddr, oparg);
break;
case FUTEX_OP_ADD:
- __futex_atomic_op1(LOCK_PREFIX "xaddl %0, %2", ret, oldval,
+ __futex_atomic_op1("lock ; xaddl %0, %2", ret, oldval,
uaddr, oparg);
break;
case FUTEX_OP_OR:
@@ -101,7 +101,7 @@ futex_atomic_cmpxchg_inatomic(int __user
return -EFAULT;

__asm__ __volatile__(
- "1: " LOCK_PREFIX "cmpxchgl %3, %1 \n"
+ "1: lock ; cmpxchgl %3, %1 \n"

"2: .section .fixup, \"ax\" \n"
"3: mov %2, %0 \n"
--