Date: Sat, 5 May 2018 10:36:35 +0200
From: Ingo Molnar
To: Mark Rutland
Cc: Peter Zijlstra, linux-arm-kernel@lists.infradead.org, linux-kernel@vger.kernel.org, aryabinin@virtuozzo.com, boqun.feng@gmail.com, catalin.marinas@arm.com, dvyukov@google.com, will.deacon@arm.com, Linus Torvalds, Andrew Morton, "Paul E. McKenney", Thomas Gleixner
Subject: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some more
Message-ID: <20180505083635.622xmcvb42dw5xxh@gmail.com>
References: <20180504173937.25300-1-mark.rutland@arm.com> <20180504173937.25300-2-mark.rutland@arm.com> <20180504180105.GS12217@hirez.programming.kicks-ass.net> <20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com> <20180505081100.nsyrqrpzq2vd27bk@gmail.com>
In-Reply-To: <20180505081100.nsyrqrpzq2vd27bk@gmail.com>
User-Agent: NeoMutt/20170609 (1.8.3)

* Ingo Molnar wrote:

> Before:
>
> #ifndef atomic_fetch_dec_relaxed
>
> #ifndef atomic_fetch_dec
> #define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
> #define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
> #define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
> #define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
> #else /* atomic_fetch_dec */
> #define atomic_fetch_dec_relaxed	atomic_fetch_dec
> #define atomic_fetch_dec_acquire	atomic_fetch_dec
> #define atomic_fetch_dec_release	atomic_fetch_dec
> #endif /* atomic_fetch_dec */
>
> #else /* atomic_fetch_dec_relaxed */
>
> #ifndef atomic_fetch_dec_acquire
> #define atomic_fetch_dec_acquire(...)					\
> 	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> #endif
>
> #ifndef atomic_fetch_dec_release
> #define atomic_fetch_dec_release(...)					\
> 	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> #endif
>
> #ifndef atomic_fetch_dec
> #define atomic_fetch_dec(...)						\
> 	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> #endif
> #endif /* atomic_fetch_dec_relaxed */
>
> After:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
> #  define atomic_fetch_dec_acquire		atomic_fetch_dec
> #  define atomic_fetch_dec_release		atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec_acquire
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec_release
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> The new variant is readable at a glance, and the hierarchy of defines is very
> obvious as well.
>
> And I think we could do even better - there's absolutely no reason why _every_
> operation has to be made conditional on a fine-grained level - they are
> overridden in API groups. In fact, allowing individual overrides is arguably a
> fragility.
>
> So we could do the following simplification on top of that:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
> #  define atomic_fetch_dec_acquire		atomic_fetch_dec
> #  define atomic_fetch_dec_release		atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif

The attached patch implements this, which gives us another healthy simplification:

 include/linux/atomic.h | 312 ++++++++++---------------------------------------
 1 file changed, 62 insertions(+), 250 deletions(-)

Note that the simplest definition block is now:

  #ifndef atomic_cmpxchg_relaxed
  # define atomic_cmpxchg_relaxed		atomic_cmpxchg
  # define atomic_cmpxchg_acquire		atomic_cmpxchg
  # define atomic_cmpxchg_release		atomic_cmpxchg
  #else
  # ifndef atomic_cmpxchg
  #  define atomic_cmpxchg(...)			__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
  #  define atomic_cmpxchg_acquire(...)		__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
  #  define atomic_cmpxchg_release(...)		__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
  # endif
  #endif

... which is very readable!
The total linecount reduction of the two patches is pretty significant as well:

 include/linux/atomic.h | 1063 ++++++++++++++++--------------------------------
 1 file changed, 343 insertions(+), 720 deletions(-)

Note that I kept the second patch separate, because technically it changes the
way we use the defines - it should not break anything, unless I missed some
detail.

Please keep this kind of clarity and simplicity in new instrumentation patches!

Thanks,

	Ingo

==================>
From 5affbf7e91901143f84f1b2ca64f4afe70e210fd Mon Sep 17 00:00:00 2001
From: Ingo Molnar
Date: Sat, 5 May 2018 10:23:23 +0200
Subject: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some
 more

Before:

  #ifndef atomic_fetch_dec_relaxed
  # ifndef atomic_fetch_dec
  #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
  #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
  #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
  #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
  # else
  #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
  #  define atomic_fetch_dec_acquire		atomic_fetch_dec
  #  define atomic_fetch_dec_release		atomic_fetch_dec
  # endif
  #else
  # ifndef atomic_fetch_dec_acquire
  #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
  # endif
  # ifndef atomic_fetch_dec_release
  #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
  # endif
  # ifndef atomic_fetch_dec
  #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
  # endif
  #endif

After:

  #ifndef atomic_fetch_dec_relaxed
  # ifndef atomic_fetch_dec
  #  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
  #  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
  #  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
  #  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
  # else
  #  define atomic_fetch_dec_relaxed		atomic_fetch_dec
  #  define atomic_fetch_dec_acquire		atomic_fetch_dec
  #  define atomic_fetch_dec_release		atomic_fetch_dec
  # endif
  #else
  # ifndef atomic_fetch_dec
  #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
  #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
  #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
  # endif
  #endif

The idea is that because we already group these APIs by certain defines
such as atomic_fetch_dec_relaxed and atomic_fetch_dec in the primary
branches - we can do the same in the secondary branch as well.

( Also remove some unnecessarily duplicate comments, as the API
  group defines are now pretty much self-documenting. )

No change in functionality.

Cc: Peter Zijlstra
Cc: Linus Torvalds
Cc: Andrew Morton
Cc: Thomas Gleixner
Cc: Paul E. McKenney
Cc: Will Deacon
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Ingo Molnar
---
 include/linux/atomic.h | 312 ++++++++++---------------------------------------
 1 file changed, 62 insertions(+), 250 deletions(-)

diff --git a/include/linux/atomic.h b/include/linux/atomic.h
index 67aaafba256b..352ecc72d7f5 100644
--- a/include/linux/atomic.h
+++ b/include/linux/atomic.h
@@ -71,98 +71,66 @@
 })
 #endif
 
-/* atomic_add_return_relaxed() et al: */
-
 #ifndef atomic_add_return_relaxed
 # define atomic_add_return_relaxed	atomic_add_return
 # define atomic_add_return_acquire	atomic_add_return
 # define atomic_add_return_release	atomic_add_return
 #else
-# ifndef atomic_add_return_acquire
-#  define atomic_add_return_acquire(...)	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
-# endif
-# ifndef atomic_add_return_release
-#  define atomic_add_return_release(...)	__atomic_op_release(atomic_add_return, __VA_ARGS__)
-# endif
 # ifndef atomic_add_return
 #  define atomic_add_return(...)		__atomic_op_fence(atomic_add_return, __VA_ARGS__)
+#  define atomic_add_return_acquire(...)	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
+#  define atomic_add_return_release(...)	__atomic_op_release(atomic_add_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_inc_return_relaxed() et al: */
-
 #ifndef atomic_inc_return_relaxed
 # define atomic_inc_return_relaxed	atomic_inc_return
 # define atomic_inc_return_acquire	atomic_inc_return
 # define atomic_inc_return_release	atomic_inc_return
 #else
-# ifndef atomic_inc_return_acquire
-#  define atomic_inc_return_acquire(...)	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
-# endif
-# ifndef atomic_inc_return_release
-#  define atomic_inc_return_release(...)	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
-# endif
 # ifndef atomic_inc_return
 #  define atomic_inc_return(...)		__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
+#  define atomic_inc_return_acquire(...)	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
+#  define atomic_inc_return_release(...)	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_sub_return_relaxed() et al: */
-
 #ifndef atomic_sub_return_relaxed
 # define atomic_sub_return_relaxed	atomic_sub_return
 # define atomic_sub_return_acquire	atomic_sub_return
 # define atomic_sub_return_release	atomic_sub_return
 #else
-# ifndef atomic_sub_return_acquire
-#  define atomic_sub_return_acquire(...)	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
-# endif
-# ifndef atomic_sub_return_release
-#  define atomic_sub_return_release(...)	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
-# endif
 # ifndef atomic_sub_return
 #  define atomic_sub_return(...)		__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
+#  define atomic_sub_return_acquire(...)	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
+#  define atomic_sub_return_release(...)	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_dec_return_relaxed() et al: */
-
 #ifndef atomic_dec_return_relaxed
 # define atomic_dec_return_relaxed	atomic_dec_return
 # define atomic_dec_return_acquire	atomic_dec_return
 # define atomic_dec_return_release	atomic_dec_return
 #else
-# ifndef atomic_dec_return_acquire
-#  define atomic_dec_return_acquire(...)	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
-# endif
-# ifndef atomic_dec_return_release
-#  define atomic_dec_return_release(...)	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
-# endif
 # ifndef atomic_dec_return
 #  define atomic_dec_return(...)		__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
+#  define atomic_dec_return_acquire(...)	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
+#  define atomic_dec_return_release(...)	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_add_relaxed() et al: */
-
 #ifndef atomic_fetch_add_relaxed
 # define atomic_fetch_add_relaxed	atomic_fetch_add
 # define atomic_fetch_add_acquire	atomic_fetch_add
 # define atomic_fetch_add_release	atomic_fetch_add
 #else
-# ifndef atomic_fetch_add_acquire
-#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_add_release
-#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_add
 #  define atomic_fetch_add(...)		__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
+#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
+#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_inc_relaxed() et al: */
-
 #ifndef atomic_fetch_inc_relaxed
 # ifndef atomic_fetch_inc
 #  define atomic_fetch_inc(v)	atomic_fetch_add(1, (v))
@@ -175,37 +143,25 @@
 #  define atomic_fetch_inc_release	atomic_fetch_inc
 # endif
 #else
-# ifndef atomic_fetch_inc_acquire
-#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_inc_release
-#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_inc
 #  define atomic_fetch_inc(...)		__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
+#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
+#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_sub_relaxed() et al: */
-
 #ifndef atomic_fetch_sub_relaxed
 # define atomic_fetch_sub_relaxed	atomic_fetch_sub
 # define atomic_fetch_sub_acquire	atomic_fetch_sub
 # define atomic_fetch_sub_release	atomic_fetch_sub
 #else
-# ifndef atomic_fetch_sub_acquire
-#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_sub_release
-#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_sub
 #  define atomic_fetch_sub(...)		__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
+#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
+#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_dec_relaxed() et al: */
-
 #ifndef atomic_fetch_dec_relaxed
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(v)	atomic_fetch_sub(1, (v))
@@ -218,127 +174,86 @@
 #  define atomic_fetch_dec_release	atomic_fetch_dec
 # endif
 #else
-# ifndef atomic_fetch_dec_acquire
-#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_dec_release
-#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_dec
 #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
+#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
+#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_or_relaxed() et al: */
-
 #ifndef atomic_fetch_or_relaxed
 # define atomic_fetch_or_relaxed	atomic_fetch_or
 # define atomic_fetch_or_acquire	atomic_fetch_or
 # define atomic_fetch_or_release	atomic_fetch_or
 #else
-# ifndef atomic_fetch_or_acquire
-#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_or_release
-#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_or
 #  define atomic_fetch_or(...)		__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
+#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
+#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_fetch_and_relaxed() et al: */
-
 #ifndef atomic_fetch_and_relaxed
 # define atomic_fetch_and_relaxed	atomic_fetch_and
 # define atomic_fetch_and_acquire	atomic_fetch_and
 # define atomic_fetch_and_release	atomic_fetch_and
 #else
-# ifndef atomic_fetch_and_acquire
-#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_and_release
-#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_and
 #  define atomic_fetch_and(...)		__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
+#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
 # endif
 #endif
 
 #ifdef atomic_andnot
 
-/* atomic_fetch_andnot_relaxed() et al: */
-
 #ifndef atomic_fetch_andnot_relaxed
 # define atomic_fetch_andnot_relaxed	atomic_fetch_andnot
 # define atomic_fetch_andnot_acquire	atomic_fetch_andnot
 # define atomic_fetch_andnot_release	atomic_fetch_andnot
 #else
-# ifndef atomic_fetch_andnot_acquire
-#  define atomic_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_andnot_release
-#  define atomic_fetch_andnot_release(...)	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_andnot
 #  define atomic_fetch_andnot(...)		__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
+#  define atomic_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
+#  define atomic_fetch_andnot_release(...)	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
 # endif
 #endif
 
 #endif /* atomic_andnot */
 
-/* atomic_fetch_xor_relaxed() et al: */
-
 #ifndef atomic_fetch_xor_relaxed
 # define atomic_fetch_xor_relaxed	atomic_fetch_xor
 # define atomic_fetch_xor_acquire	atomic_fetch_xor
 # define atomic_fetch_xor_release	atomic_fetch_xor
 #else
-# ifndef atomic_fetch_xor_acquire
-#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
-# endif
-# ifndef atomic_fetch_xor_release
-#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
-# endif
 # ifndef atomic_fetch_xor
 #  define atomic_fetch_xor(...)		__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
+#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
+#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
 # endif
 #endif
 
-
-/* atomic_xchg_relaxed() et al: */
-
 #ifndef atomic_xchg_relaxed
 #define atomic_xchg_relaxed		atomic_xchg
 #define atomic_xchg_acquire		atomic_xchg
 #define atomic_xchg_release		atomic_xchg
 #else
-# ifndef atomic_xchg_acquire
-#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
-# endif
-# ifndef atomic_xchg_release
-#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
-# endif
 # ifndef atomic_xchg
 #  define atomic_xchg(...)		__atomic_op_fence(atomic_xchg, __VA_ARGS__)
+#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
+#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic_cmpxchg_relaxed() et al: */
-
 #ifndef atomic_cmpxchg_relaxed
 # define atomic_cmpxchg_relaxed		atomic_cmpxchg
 # define atomic_cmpxchg_acquire		atomic_cmpxchg
 # define atomic_cmpxchg_release		atomic_cmpxchg
 #else
-# ifndef atomic_cmpxchg_acquire
-#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
-# endif
-# ifndef atomic_cmpxchg_release
-#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
-# endif
 # ifndef atomic_cmpxchg
 #  define atomic_cmpxchg(...)		__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
+#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
+#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
 # endif
 #endif
 
@@ -362,57 +277,39 @@
 # define atomic_try_cmpxchg_release	atomic_try_cmpxchg
 #endif
 
-/* cmpxchg_relaxed() et al: */
-
 #ifndef cmpxchg_relaxed
 # define cmpxchg_relaxed		cmpxchg
 # define cmpxchg_acquire		cmpxchg
 # define cmpxchg_release		cmpxchg
 #else
-# ifndef cmpxchg_acquire
-#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
-# endif
-# ifndef cmpxchg_release
-#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
-# endif
 # ifndef cmpxchg
 #  define cmpxchg(...)			__atomic_op_fence(cmpxchg, __VA_ARGS__)
+#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
+#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
 # endif
 #endif
 
-/* cmpxchg64_relaxed() et al: */
-
 #ifndef cmpxchg64_relaxed
 # define cmpxchg64_relaxed		cmpxchg64
 # define cmpxchg64_acquire		cmpxchg64
 # define cmpxchg64_release		cmpxchg64
 #else
-# ifndef cmpxchg64_acquire
-#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
-# endif
-# ifndef cmpxchg64_release
-#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
-# endif
 # ifndef cmpxchg64
 #  define cmpxchg64(...)		__atomic_op_fence(cmpxchg64, __VA_ARGS__)
+#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
+#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
 # endif
 #endif
 
-/* xchg_relaxed() et al: */
-
 #ifndef xchg_relaxed
 # define xchg_relaxed			xchg
 # define xchg_acquire			xchg
 # define xchg_release			xchg
 #else
-# ifndef xchg_acquire
-#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
-# endif
-# ifndef xchg_release
-#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
-# endif
 # ifndef xchg
 #  define xchg(...)			__atomic_op_fence(xchg, __VA_ARGS__)
+#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
+#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
 # endif
 #endif
 
@@ -569,98 +466,66 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 # define atomic64_set_release(v, i)	smp_store_release(&(v)->counter, (i))
 #endif
 
-/* atomic64_add_return_relaxed() et al: */
-
 #ifndef atomic64_add_return_relaxed
 # define atomic64_add_return_relaxed	atomic64_add_return
 # define atomic64_add_return_acquire	atomic64_add_return
 # define atomic64_add_return_release	atomic64_add_return
 #else
-# ifndef atomic64_add_return_acquire
-#  define atomic64_add_return_acquire(...)	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_add_return_release
-#  define atomic64_add_return_release(...)	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_add_return
 #  define atomic64_add_return(...)		__atomic_op_fence(atomic64_add_return, __VA_ARGS__)
+#  define atomic64_add_return_acquire(...)	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+#  define atomic64_add_return_release(...)	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_inc_return_relaxed() et al: */
-
 #ifndef atomic64_inc_return_relaxed
 # define atomic64_inc_return_relaxed	atomic64_inc_return
 # define atomic64_inc_return_acquire	atomic64_inc_return
 # define atomic64_inc_return_release	atomic64_inc_return
 #else
-# ifndef atomic64_inc_return_acquire
-#  define atomic64_inc_return_acquire(...)	__atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_inc_return_release
-#  define atomic64_inc_return_release(...)	__atomic_op_release(atomic64_inc_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_inc_return
 #  define atomic64_inc_return(...)		__atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
+#  define atomic64_inc_return_acquire(...)	__atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
+#  define atomic64_inc_return_release(...)	__atomic_op_release(atomic64_inc_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_sub_return_relaxed() et al: */
-
 #ifndef atomic64_sub_return_relaxed
 # define atomic64_sub_return_relaxed	atomic64_sub_return
 # define atomic64_sub_return_acquire	atomic64_sub_return
 # define atomic64_sub_return_release	atomic64_sub_return
 #else
-# ifndef atomic64_sub_return_acquire
-#  define atomic64_sub_return_acquire(...)	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_sub_return_release
-#  define atomic64_sub_return_release(...)	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_sub_return
 #  define atomic64_sub_return(...)		__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
+#  define atomic64_sub_return_acquire(...)	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
+#  define atomic64_sub_return_release(...)	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_dec_return_relaxed() et al: */
-
 #ifndef atomic64_dec_return_relaxed
 # define atomic64_dec_return_relaxed	atomic64_dec_return
 # define atomic64_dec_return_acquire	atomic64_dec_return
 # define atomic64_dec_return_release	atomic64_dec_return
 #else
-# ifndef atomic64_dec_return_acquire
-#  define atomic64_dec_return_acquire(...)	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
-# endif
-# ifndef atomic64_dec_return_release
-#  define atomic64_dec_return_release(...)	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
-# endif
 # ifndef atomic64_dec_return
 #  define atomic64_dec_return(...)		__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
+#  define atomic64_dec_return_acquire(...)	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
+#  define atomic64_dec_return_release(...)	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_add_relaxed() et al: */
-
 #ifndef atomic64_fetch_add_relaxed
 # define atomic64_fetch_add_relaxed	atomic64_fetch_add
 # define atomic64_fetch_add_acquire	atomic64_fetch_add
 # define atomic64_fetch_add_release	atomic64_fetch_add
 #else
-# ifndef atomic64_fetch_add_acquire
-#  define atomic64_fetch_add_acquire(...)	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_add_release
-#  define atomic64_fetch_add_release(...)	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_add
 #  define atomic64_fetch_add(...)		__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
+#  define atomic64_fetch_add_acquire(...)	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
+#  define atomic64_fetch_add_release(...)	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_inc_relaxed() et al: */
-
 #ifndef atomic64_fetch_inc_relaxed
 # ifndef atomic64_fetch_inc
 #  define atomic64_fetch_inc(v)	atomic64_fetch_add(1, (v))
@@ -673,37 +538,25 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 #  define atomic64_fetch_inc_release	atomic64_fetch_inc
 # endif
 #else
-# ifndef atomic64_fetch_inc_acquire
-#  define atomic64_fetch_inc_acquire(...)	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_inc_release
-#  define atomic64_fetch_inc_release(...)	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_inc
 #  define atomic64_fetch_inc(...)		__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
+#  define atomic64_fetch_inc_acquire(...)	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
+#  define atomic64_fetch_inc_release(...)	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_sub_relaxed() et al: */
-
 #ifndef atomic64_fetch_sub_relaxed
 # define atomic64_fetch_sub_relaxed	atomic64_fetch_sub
 # define atomic64_fetch_sub_acquire	atomic64_fetch_sub
 # define atomic64_fetch_sub_release	atomic64_fetch_sub
 #else
-# ifndef atomic64_fetch_sub_acquire
-#  define atomic64_fetch_sub_acquire(...)	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_sub_release
-#  define atomic64_fetch_sub_release(...)	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_sub
 #  define atomic64_fetch_sub(...)		__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
+#  define atomic64_fetch_sub_acquire(...)	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
+#  define atomic64_fetch_sub_release(...)	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_dec_relaxed() et al: */
-
 #ifndef atomic64_fetch_dec_relaxed
 # ifndef atomic64_fetch_dec
 #  define atomic64_fetch_dec(v)	atomic64_fetch_sub(1, (v))
@@ -716,127 +569,86 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 #  define atomic64_fetch_dec_release	atomic64_fetch_dec
 # endif
 #else
-# ifndef atomic64_fetch_dec_acquire
-#  define atomic64_fetch_dec_acquire(...)	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_dec_release
-#  define atomic64_fetch_dec_release(...)	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_dec
 #  define atomic64_fetch_dec(...)		__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
+#  define atomic64_fetch_dec_acquire(...)	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
+#  define atomic64_fetch_dec_release(...)	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
 # endif
 #endif
 
-/* atomic64_fetch_or_relaxed() et al: */
-
 #ifndef atomic64_fetch_or_relaxed
 # define atomic64_fetch_or_relaxed	atomic64_fetch_or
 # define atomic64_fetch_or_acquire	atomic64_fetch_or
 # define atomic64_fetch_or_release	atomic64_fetch_or
 #else
-# ifndef atomic64_fetch_or_acquire
-#  define atomic64_fetch_or_acquire(...)	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_or_release
-#  define atomic64_fetch_or_release(...)	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_or
 #  define atomic64_fetch_or(...)		__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
+#  define atomic64_fetch_or_acquire(...)	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+#  define atomic64_fetch_or_release(...)	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
 # endif
 #endif
 
-
-/* atomic64_fetch_and_relaxed() et al: */
-
 #ifndef atomic64_fetch_and_relaxed
 # define atomic64_fetch_and_relaxed	atomic64_fetch_and
 # define atomic64_fetch_and_acquire	atomic64_fetch_and
 # define atomic64_fetch_and_release	atomic64_fetch_and
 #else
-# ifndef atomic64_fetch_and_acquire
-#  define atomic64_fetch_and_acquire(...)	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_and_release
-#  define atomic64_fetch_and_release(...)	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_and
 #  define atomic64_fetch_and(...)		__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
+#  define atomic64_fetch_and_acquire(...)	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+#  define atomic64_fetch_and_release(...)	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
 # endif
 #endif
 
 #ifdef atomic64_andnot
 
-/* atomic64_fetch_andnot_relaxed() et al: */
-
 #ifndef atomic64_fetch_andnot_relaxed
 # define atomic64_fetch_andnot_relaxed	atomic64_fetch_andnot
 # define atomic64_fetch_andnot_acquire	atomic64_fetch_andnot
 # define atomic64_fetch_andnot_release	atomic64_fetch_andnot
 #else
-# ifndef atomic64_fetch_andnot_acquire
-#  define atomic64_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
-# endif
-# ifndef atomic64_fetch_andnot_release
-#  define atomic64_fetch_andnot_release(...)	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
-# endif
 # ifndef atomic64_fetch_andnot
 #  define atomic64_fetch_andnot(...)		__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
+#  define atomic64_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
+#  define atomic64_fetch_andnot_release(...)
__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__) # endif #endif #endif /* atomic64_andnot */ -/* atomic64_fetch_xor_relaxed() et al: */ - #ifndef atomic64_fetch_xor_relaxed # define atomic64_fetch_xor_relaxed atomic64_fetch_xor # define atomic64_fetch_xor_acquire atomic64_fetch_xor # define atomic64_fetch_xor_release atomic64_fetch_xor #else -# ifndef atomic64_fetch_xor_acquire -# define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__) -# endif -# ifndef atomic64_fetch_xor_release -# define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__) -# endif # ifndef atomic64_fetch_xor # define atomic64_fetch_xor(...) __atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__) +# define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__) +# define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__) # endif #endif -/* atomic64_xchg_relaxed() et al: */ - #ifndef atomic64_xchg_relaxed # define atomic64_xchg_relaxed atomic64_xchg # define atomic64_xchg_acquire atomic64_xchg # define atomic64_xchg_release atomic64_xchg #else -# ifndef atomic64_xchg_acquire -# define atomic64_xchg_acquire(...) __atomic_op_acquire(atomic64_xchg, __VA_ARGS__) -# endif -# ifndef atomic64_xchg_release -# define atomic64_xchg_release(...) __atomic_op_release(atomic64_xchg, __VA_ARGS__) -# endif # ifndef atomic64_xchg # define atomic64_xchg(...) __atomic_op_fence(atomic64_xchg, __VA_ARGS__) +# define atomic64_xchg_acquire(...) __atomic_op_acquire(atomic64_xchg, __VA_ARGS__) +# define atomic64_xchg_release(...) 
__atomic_op_release(atomic64_xchg, __VA_ARGS__) # endif #endif -/* atomic64_cmpxchg_relaxed() et al: */ - #ifndef atomic64_cmpxchg_relaxed # define atomic64_cmpxchg_relaxed atomic64_cmpxchg # define atomic64_cmpxchg_acquire atomic64_cmpxchg # define atomic64_cmpxchg_release atomic64_cmpxchg #else -# ifndef atomic64_cmpxchg_acquire -# define atomic64_cmpxchg_acquire(...) __atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__) -# endif -# ifndef atomic64_cmpxchg_release -# define atomic64_cmpxchg_release(...) __atomic_op_release(atomic64_cmpxchg, __VA_ARGS__) -# endif # ifndef atomic64_cmpxchg # define atomic64_cmpxchg(...) __atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__) +# define atomic64_cmpxchg_acquire(...) __atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__) +# define atomic64_cmpxchg_release(...) __atomic_op_release(atomic64_cmpxchg, __VA_ARGS__) # endif #endif