Date: Sun, 6 May 2018 16:12:50 +0200
From: Andrea Parri
To: Ingo Molnar
Cc: Mark Rutland, Peter Zijlstra, linux-arm-kernel@lists.infradead.org,
	linux-kernel@vger.kernel.org, aryabinin@virtuozzo.com,
	boqun.feng@gmail.com, catalin.marinas@arm.com, dvyukov@google.com,
	will.deacon@arm.com, Linus Torvalds, Andrew Morton,
	"Paul E. McKenney", Peter Zijlstra, Thomas Gleixner
Subject: Re: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some more
Message-ID: <20180506141249.GA28723@andrea>
References: <20180504173937.25300-1-mark.rutland@arm.com>
 <20180504173937.25300-2-mark.rutland@arm.com>
 <20180504180105.GS12217@hirez.programming.kicks-ass.net>
 <20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com>
 <20180505081100.nsyrqrpzq2vd27bk@gmail.com>
 <20180505083635.622xmcvb42dw5xxh@gmail.com>
In-Reply-To: <20180505083635.622xmcvb42dw5xxh@gmail.com>

Hi Ingo,

> From 5affbf7e91901143f84f1b2ca64f4afe70e210fd Mon Sep 17 00:00:00 2001
> From: Ingo Molnar
> Date: Sat, 5 May 2018 10:23:23 +0200
> Subject: [PATCH] locking/atomics: Simplify the op definitions in atomic.h some more
>
> Before:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed	atomic_fetch_dec
> #  define atomic_fetch_dec_acquire	atomic_fetch_dec
> #  define atomic_fetch_dec_release	atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec_acquire
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec_release
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> After:
>
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
> #  define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
> #  define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
> #  define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
> # else
> #  define atomic_fetch_dec_relaxed	atomic_fetch_dec
> #  define atomic_fetch_dec_acquire	atomic_fetch_dec
> #  define atomic_fetch_dec_release	atomic_fetch_dec
> # endif
> #else
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> #  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> The idea is that because we already group these APIs by certain defines
> such as atomic_fetch_dec_relaxed and atomic_fetch_dec in the primary
> branches - we can do the same in the secondary branch as well.
>
> ( Also remove some unnecessarily duplicate comments, as the API
> group defines are now pretty much self-documenting. )
>
> No change in functionality.
>
> Cc: Peter Zijlstra
> Cc: Linus Torvalds
> Cc: Andrew Morton
> Cc: Thomas Gleixner
> Cc: Paul E. McKenney
> Cc: Will Deacon
> Cc: linux-kernel@vger.kernel.org
> Signed-off-by: Ingo Molnar

This breaks compilation on RISC-V. (For some of its atomics, the arch
currently defines the _relaxed and the full variants and it relies on
the generic definitions for the _acquire and the _release variants.)

  Andrea

> ---
> include/linux/atomic.h | 312 ++++++++++---------------------------------
> 1 file changed, 62 insertions(+), 250 deletions(-)
>
> diff --git a/include/linux/atomic.h b/include/linux/atomic.h
> index 67aaafba256b..352ecc72d7f5 100644
> --- a/include/linux/atomic.h
> +++ b/include/linux/atomic.h
> @@ -71,98 +71,66 @@
> })
> #endif
>
> -/* atomic_add_return_relaxed() et al: */
> -
> #ifndef atomic_add_return_relaxed
> # define atomic_add_return_relaxed	atomic_add_return
> # define atomic_add_return_acquire	atomic_add_return
> # define atomic_add_return_release	atomic_add_return
> #else
> -# ifndef atomic_add_return_acquire
> -#  define atomic_add_return_acquire(...) __atomic_op_acquire(atomic_add_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic_add_return_release
> -#  define atomic_add_return_release(...) __atomic_op_release(atomic_add_return, __VA_ARGS__)
> -# endif
> # ifndef atomic_add_return
> #  define atomic_add_return(...)	__atomic_op_fence(atomic_add_return, __VA_ARGS__)
> +#  define atomic_add_return_acquire(...) __atomic_op_acquire(atomic_add_return, __VA_ARGS__)
> +#  define atomic_add_return_release(...) __atomic_op_release(atomic_add_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_inc_return_relaxed() et al: */
> -
> #ifndef atomic_inc_return_relaxed
> # define atomic_inc_return_relaxed	atomic_inc_return
> # define atomic_inc_return_acquire	atomic_inc_return
> # define atomic_inc_return_release	atomic_inc_return
> #else
> -# ifndef atomic_inc_return_acquire
> -#  define atomic_inc_return_acquire(...) __atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic_inc_return_release
> -#  define atomic_inc_return_release(...) __atomic_op_release(atomic_inc_return, __VA_ARGS__)
> -# endif
> # ifndef atomic_inc_return
> #  define atomic_inc_return(...)	__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
> +#  define atomic_inc_return_acquire(...) __atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
> +#  define atomic_inc_return_release(...) __atomic_op_release(atomic_inc_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_sub_return_relaxed() et al: */
> -
> #ifndef atomic_sub_return_relaxed
> # define atomic_sub_return_relaxed	atomic_sub_return
> # define atomic_sub_return_acquire	atomic_sub_return
> # define atomic_sub_return_release	atomic_sub_return
> #else
> -# ifndef atomic_sub_return_acquire
> -#  define atomic_sub_return_acquire(...) __atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic_sub_return_release
> -#  define atomic_sub_return_release(...) __atomic_op_release(atomic_sub_return, __VA_ARGS__)
> -# endif
> # ifndef atomic_sub_return
> #  define atomic_sub_return(...)	__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
> +#  define atomic_sub_return_acquire(...) __atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
> +#  define atomic_sub_return_release(...) __atomic_op_release(atomic_sub_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_dec_return_relaxed() et al: */
> -
> #ifndef atomic_dec_return_relaxed
> # define atomic_dec_return_relaxed	atomic_dec_return
> # define atomic_dec_return_acquire	atomic_dec_return
> # define atomic_dec_return_release	atomic_dec_return
> #else
> -# ifndef atomic_dec_return_acquire
> -#  define atomic_dec_return_acquire(...) __atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic_dec_return_release
> -#  define atomic_dec_return_release(...) __atomic_op_release(atomic_dec_return, __VA_ARGS__)
> -# endif
> # ifndef atomic_dec_return
> #  define atomic_dec_return(...)	__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
> +#  define atomic_dec_return_acquire(...) __atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
> +#  define atomic_dec_return_release(...) __atomic_op_release(atomic_dec_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_add_relaxed() et al: */
> -
> #ifndef atomic_fetch_add_relaxed
> # define atomic_fetch_add_relaxed	atomic_fetch_add
> # define atomic_fetch_add_acquire	atomic_fetch_add
> # define atomic_fetch_add_release	atomic_fetch_add
> #else
> -# ifndef atomic_fetch_add_acquire
> -#  define atomic_fetch_add_acquire(...) __atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_add_release
> -#  define atomic_fetch_add_release(...) __atomic_op_release(atomic_fetch_add, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_add
> #  define atomic_fetch_add(...)		__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
> +#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
> +#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_inc_relaxed() et al: */
> -
> #ifndef atomic_fetch_inc_relaxed
> # ifndef atomic_fetch_inc
> #  define atomic_fetch_inc(v)		atomic_fetch_add(1, (v))
> @@ -175,37 +143,25 @@
> #  define atomic_fetch_inc_release	atomic_fetch_inc
> # endif
> #else
> -# ifndef atomic_fetch_inc_acquire
> -#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_inc_release
> -#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_inc
> #  define atomic_fetch_inc(...)		__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
> +#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
> +#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_sub_relaxed() et al: */
> -
> #ifndef atomic_fetch_sub_relaxed
> # define atomic_fetch_sub_relaxed	atomic_fetch_sub
> # define atomic_fetch_sub_acquire	atomic_fetch_sub
> # define atomic_fetch_sub_release	atomic_fetch_sub
> #else
> -# ifndef atomic_fetch_sub_acquire
> -#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_sub_release
> -#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_sub
> #  define atomic_fetch_sub(...)		__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
> +#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
> +#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_dec_relaxed() et al: */
> -
> #ifndef atomic_fetch_dec_relaxed
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
> @@ -218,127 +174,86 @@
> #  define atomic_fetch_dec_release	atomic_fetch_dec
> # endif
> #else
> -# ifndef atomic_fetch_dec_acquire
> -#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_dec_release
> -#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_dec
> #  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
> +#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
> +#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_or_relaxed() et al: */
> -
> #ifndef atomic_fetch_or_relaxed
> # define atomic_fetch_or_relaxed	atomic_fetch_or
> # define atomic_fetch_or_acquire	atomic_fetch_or
> # define atomic_fetch_or_release	atomic_fetch_or
> #else
> -# ifndef atomic_fetch_or_acquire
> -#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_or_release
> -#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_or
> #  define atomic_fetch_or(...)		__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
> +#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
> +#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_fetch_and_relaxed() et al: */
> -
> #ifndef atomic_fetch_and_relaxed
> # define atomic_fetch_and_relaxed	atomic_fetch_and
> # define atomic_fetch_and_acquire	atomic_fetch_and
> # define atomic_fetch_and_release	atomic_fetch_and
> #else
> -# ifndef atomic_fetch_and_acquire
> -#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_and_release
> -#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_and
> #  define atomic_fetch_and(...)		__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
> +#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
> +#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
> # endif
> #endif
>
> #ifdef atomic_andnot
>
> -/* atomic_fetch_andnot_relaxed() et al: */
> -
> #ifndef atomic_fetch_andnot_relaxed
> # define atomic_fetch_andnot_relaxed	atomic_fetch_andnot
> # define atomic_fetch_andnot_acquire	atomic_fetch_andnot
> # define atomic_fetch_andnot_release	atomic_fetch_andnot
> #else
> -# ifndef atomic_fetch_andnot_acquire
> -#  define atomic_fetch_andnot_acquire(...) __atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_andnot_release
> -#  define atomic_fetch_andnot_release(...) __atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_andnot
> #  define atomic_fetch_andnot(...)	__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
> +#  define atomic_fetch_andnot_acquire(...) __atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
> +#  define atomic_fetch_andnot_release(...) __atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
> # endif
> #endif
>
> #endif /* atomic_andnot */
>
> -/* atomic_fetch_xor_relaxed() et al: */
> -
> #ifndef atomic_fetch_xor_relaxed
> # define atomic_fetch_xor_relaxed	atomic_fetch_xor
> # define atomic_fetch_xor_acquire	atomic_fetch_xor
> # define atomic_fetch_xor_release	atomic_fetch_xor
> #else
> -# ifndef atomic_fetch_xor_acquire
> -#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
> -# endif
> -# ifndef atomic_fetch_xor_release
> -#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
> -# endif
> # ifndef atomic_fetch_xor
> #  define atomic_fetch_xor(...)		__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
> +#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
> +#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
> # endif
> #endif
>
> -
> -/* atomic_xchg_relaxed() et al: */
> -
> #ifndef atomic_xchg_relaxed
> #define atomic_xchg_relaxed		atomic_xchg
> #define atomic_xchg_acquire		atomic_xchg
> #define atomic_xchg_release		atomic_xchg
> #else
> -# ifndef atomic_xchg_acquire
> -#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
> -# endif
> -# ifndef atomic_xchg_release
> -#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
> -# endif
> # ifndef atomic_xchg
> #  define atomic_xchg(...)		__atomic_op_fence(atomic_xchg, __VA_ARGS__)
> +#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
> +#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic_cmpxchg_relaxed() et al: */
> -
> #ifndef atomic_cmpxchg_relaxed
> # define atomic_cmpxchg_relaxed		atomic_cmpxchg
> # define atomic_cmpxchg_acquire		atomic_cmpxchg
> # define atomic_cmpxchg_release		atomic_cmpxchg
> #else
> -# ifndef atomic_cmpxchg_acquire
> -#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
> -# endif
> -# ifndef atomic_cmpxchg_release
> -#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
> -# endif
> # ifndef atomic_cmpxchg
> #  define atomic_cmpxchg(...)		__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
> +#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
> +#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
> # endif
> #endif
>
> @@ -362,57 +277,39 @@
> # define atomic_try_cmpxchg_release	atomic_try_cmpxchg
> #endif
>
> -/* cmpxchg_relaxed() et al: */
> -
> #ifndef cmpxchg_relaxed
> # define cmpxchg_relaxed		cmpxchg
> # define cmpxchg_acquire		cmpxchg
> # define cmpxchg_release		cmpxchg
> #else
> -# ifndef cmpxchg_acquire
> -#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
> -# endif
> -# ifndef cmpxchg_release
> -#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
> -# endif
> # ifndef cmpxchg
> #  define cmpxchg(...)			__atomic_op_fence(cmpxchg, __VA_ARGS__)
> +#  define cmpxchg_acquire(...)		__atomic_op_acquire(cmpxchg, __VA_ARGS__)
> +#  define cmpxchg_release(...)		__atomic_op_release(cmpxchg, __VA_ARGS__)
> # endif
> #endif
>
> -/* cmpxchg64_relaxed() et al: */
> -
> #ifndef cmpxchg64_relaxed
> # define cmpxchg64_relaxed		cmpxchg64
> # define cmpxchg64_acquire		cmpxchg64
> # define cmpxchg64_release		cmpxchg64
> #else
> -# ifndef cmpxchg64_acquire
> -#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
> -# endif
> -# ifndef cmpxchg64_release
> -#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
> -# endif
> # ifndef cmpxchg64
> #  define cmpxchg64(...)		__atomic_op_fence(cmpxchg64, __VA_ARGS__)
> +#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
> +#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
> # endif
> #endif
>
> -/* xchg_relaxed() et al: */
> -
> #ifndef xchg_relaxed
> # define xchg_relaxed			xchg
> # define xchg_acquire			xchg
> # define xchg_release			xchg
> #else
> -# ifndef xchg_acquire
> -#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
> -# endif
> -# ifndef xchg_release
> -#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
> -# endif
> # ifndef xchg
> #  define xchg(...)			__atomic_op_fence(xchg, __VA_ARGS__)
> +#  define xchg_acquire(...)		__atomic_op_acquire(xchg, __VA_ARGS__)
> +#  define xchg_release(...)		__atomic_op_release(xchg, __VA_ARGS__)
> # endif
> #endif
>
> @@ -569,98 +466,66 @@ static inline int atomic_dec_if_positive(atomic_t *v)
> # define atomic64_set_release(v, i)	smp_store_release(&(v)->counter, (i))
> #endif
>
> -/* atomic64_add_return_relaxed() et al: */
> -
> #ifndef atomic64_add_return_relaxed
> # define atomic64_add_return_relaxed	atomic64_add_return
> # define atomic64_add_return_acquire	atomic64_add_return
> # define atomic64_add_return_release	atomic64_add_return
> #else
> -# ifndef atomic64_add_return_acquire
> -#  define atomic64_add_return_acquire(...) __atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_add_return_release
> -#  define atomic64_add_return_release(...) __atomic_op_release(atomic64_add_return, __VA_ARGS__)
> -# endif
> # ifndef atomic64_add_return
> #  define atomic64_add_return(...)	__atomic_op_fence(atomic64_add_return, __VA_ARGS__)
> +#  define atomic64_add_return_acquire(...) __atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
> +#  define atomic64_add_return_release(...) __atomic_op_release(atomic64_add_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_inc_return_relaxed() et al: */
> -
> #ifndef atomic64_inc_return_relaxed
> # define atomic64_inc_return_relaxed	atomic64_inc_return
> # define atomic64_inc_return_acquire	atomic64_inc_return
> # define atomic64_inc_return_release	atomic64_inc_return
> #else
> -# ifndef atomic64_inc_return_acquire
> -#  define atomic64_inc_return_acquire(...) __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_inc_return_release
> -#  define atomic64_inc_return_release(...) __atomic_op_release(atomic64_inc_return, __VA_ARGS__)
> -# endif
> # ifndef atomic64_inc_return
> #  define atomic64_inc_return(...)	__atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
> +#  define atomic64_inc_return_acquire(...) __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__)
> +#  define atomic64_inc_return_release(...) __atomic_op_release(atomic64_inc_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_sub_return_relaxed() et al: */
> -
> #ifndef atomic64_sub_return_relaxed
> # define atomic64_sub_return_relaxed	atomic64_sub_return
> # define atomic64_sub_return_acquire	atomic64_sub_return
> # define atomic64_sub_return_release	atomic64_sub_return
> #else
> -# ifndef atomic64_sub_return_acquire
> -#  define atomic64_sub_return_acquire(...) __atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_sub_return_release
> -#  define atomic64_sub_return_release(...) __atomic_op_release(atomic64_sub_return, __VA_ARGS__)
> -# endif
> # ifndef atomic64_sub_return
> #  define atomic64_sub_return(...)	__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
> +#  define atomic64_sub_return_acquire(...) __atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
> +#  define atomic64_sub_return_release(...) __atomic_op_release(atomic64_sub_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_dec_return_relaxed() et al: */
> -
> #ifndef atomic64_dec_return_relaxed
> # define atomic64_dec_return_relaxed	atomic64_dec_return
> # define atomic64_dec_return_acquire	atomic64_dec_return
> # define atomic64_dec_return_release	atomic64_dec_return
> #else
> -# ifndef atomic64_dec_return_acquire
> -#  define atomic64_dec_return_acquire(...) __atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_dec_return_release
> -#  define atomic64_dec_return_release(...) __atomic_op_release(atomic64_dec_return, __VA_ARGS__)
> -# endif
> # ifndef atomic64_dec_return
> #  define atomic64_dec_return(...)	__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
> +#  define atomic64_dec_return_acquire(...) __atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
> +#  define atomic64_dec_return_release(...) __atomic_op_release(atomic64_dec_return, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_fetch_add_relaxed() et al: */
> -
> #ifndef atomic64_fetch_add_relaxed
> # define atomic64_fetch_add_relaxed	atomic64_fetch_add
> # define atomic64_fetch_add_acquire	atomic64_fetch_add
> # define atomic64_fetch_add_release	atomic64_fetch_add
> #else
> -# ifndef atomic64_fetch_add_acquire
> -#  define atomic64_fetch_add_acquire(...) __atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_add_release
> -#  define atomic64_fetch_add_release(...) __atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_add
> #  define atomic64_fetch_add(...)	__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
> +#  define atomic64_fetch_add_acquire(...) __atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
> +#  define atomic64_fetch_add_release(...) __atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_fetch_inc_relaxed() et al: */
> -
> #ifndef atomic64_fetch_inc_relaxed
> # ifndef atomic64_fetch_inc
> #  define atomic64_fetch_inc(v)		atomic64_fetch_add(1, (v))
> @@ -673,37 +538,25 @@ static inline int atomic_dec_if_positive(atomic_t *v)
> #  define atomic64_fetch_inc_release	atomic64_fetch_inc
> # endif
> #else
> -# ifndef atomic64_fetch_inc_acquire
> -#  define atomic64_fetch_inc_acquire(...) __atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_inc_release
> -#  define atomic64_fetch_inc_release(...) __atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_inc
> #  define atomic64_fetch_inc(...)	__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
> +#  define atomic64_fetch_inc_acquire(...) __atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
> +#  define atomic64_fetch_inc_release(...) __atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_fetch_sub_relaxed() et al: */
> -
> #ifndef atomic64_fetch_sub_relaxed
> # define atomic64_fetch_sub_relaxed	atomic64_fetch_sub
> # define atomic64_fetch_sub_acquire	atomic64_fetch_sub
> # define atomic64_fetch_sub_release	atomic64_fetch_sub
> #else
> -# ifndef atomic64_fetch_sub_acquire
> -#  define atomic64_fetch_sub_acquire(...) __atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_sub_release
> -#  define atomic64_fetch_sub_release(...) __atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_sub
> #  define atomic64_fetch_sub(...)	__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
> +#  define atomic64_fetch_sub_acquire(...) __atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
> +#  define atomic64_fetch_sub_release(...) __atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_fetch_dec_relaxed() et al: */
> -
> #ifndef atomic64_fetch_dec_relaxed
> # ifndef atomic64_fetch_dec
> #  define atomic64_fetch_dec(v)		atomic64_fetch_sub(1, (v))
> @@ -716,127 +569,86 @@ static inline int atomic_dec_if_positive(atomic_t *v)
> #  define atomic64_fetch_dec_release	atomic64_fetch_dec
> # endif
> #else
> -# ifndef atomic64_fetch_dec_acquire
> -#  define atomic64_fetch_dec_acquire(...) __atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_dec_release
> -#  define atomic64_fetch_dec_release(...) __atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_dec
> #  define atomic64_fetch_dec(...)	__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
> +#  define atomic64_fetch_dec_acquire(...) __atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
> +#  define atomic64_fetch_dec_release(...) __atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_fetch_or_relaxed() et al: */
> -
> #ifndef atomic64_fetch_or_relaxed
> # define atomic64_fetch_or_relaxed	atomic64_fetch_or
> # define atomic64_fetch_or_acquire	atomic64_fetch_or
> # define atomic64_fetch_or_release	atomic64_fetch_or
> #else
> -# ifndef atomic64_fetch_or_acquire
> -#  define atomic64_fetch_or_acquire(...) __atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_or_release
> -#  define atomic64_fetch_or_release(...) __atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_or
> #  define atomic64_fetch_or(...)	__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
> +#  define atomic64_fetch_or_acquire(...) __atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
> +#  define atomic64_fetch_or_release(...) __atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
> # endif
> #endif
>
> -
> -/* atomic64_fetch_and_relaxed() et al: */
> -
> #ifndef atomic64_fetch_and_relaxed
> # define atomic64_fetch_and_relaxed	atomic64_fetch_and
> # define atomic64_fetch_and_acquire	atomic64_fetch_and
> # define atomic64_fetch_and_release	atomic64_fetch_and
> #else
> -# ifndef atomic64_fetch_and_acquire
> -#  define atomic64_fetch_and_acquire(...) __atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_and_release
> -#  define atomic64_fetch_and_release(...) __atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_and
> #  define atomic64_fetch_and(...)	__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
> +#  define atomic64_fetch_and_acquire(...) __atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
> +#  define atomic64_fetch_and_release(...) __atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
> # endif
> #endif
>
> #ifdef atomic64_andnot
>
> -/* atomic64_fetch_andnot_relaxed() et al: */
> -
> #ifndef atomic64_fetch_andnot_relaxed
> # define atomic64_fetch_andnot_relaxed	atomic64_fetch_andnot
> # define atomic64_fetch_andnot_acquire	atomic64_fetch_andnot
> # define atomic64_fetch_andnot_release	atomic64_fetch_andnot
> #else
> -# ifndef atomic64_fetch_andnot_acquire
> -#  define atomic64_fetch_andnot_acquire(...) __atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_andnot_release
> -#  define atomic64_fetch_andnot_release(...) __atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_andnot
> #  define atomic64_fetch_andnot(...)	__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
> +#  define atomic64_fetch_andnot_acquire(...) __atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
> +#  define atomic64_fetch_andnot_release(...) __atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
> # endif
> #endif
>
> #endif /* atomic64_andnot */
>
> -/* atomic64_fetch_xor_relaxed() et al: */
> -
> #ifndef atomic64_fetch_xor_relaxed
> # define atomic64_fetch_xor_relaxed	atomic64_fetch_xor
> # define atomic64_fetch_xor_acquire	atomic64_fetch_xor
> # define atomic64_fetch_xor_release	atomic64_fetch_xor
> #else
> -# ifndef atomic64_fetch_xor_acquire
> -#  define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_fetch_xor_release
> -#  define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
> -# endif
> # ifndef atomic64_fetch_xor
> #  define atomic64_fetch_xor(...)	__atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
> +#  define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
> +#  define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_xchg_relaxed() et al: */
> -
> #ifndef atomic64_xchg_relaxed
> # define atomic64_xchg_relaxed		atomic64_xchg
> # define atomic64_xchg_acquire		atomic64_xchg
> # define atomic64_xchg_release		atomic64_xchg
> #else
> -# ifndef atomic64_xchg_acquire
> -#  define atomic64_xchg_acquire(...)	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_xchg_release
> -#  define atomic64_xchg_release(...)	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
> -# endif
> # ifndef atomic64_xchg
> #  define atomic64_xchg(...)		__atomic_op_fence(atomic64_xchg, __VA_ARGS__)
> +#  define atomic64_xchg_acquire(...)	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
> +#  define atomic64_xchg_release(...)	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
> # endif
> #endif
>
> -/* atomic64_cmpxchg_relaxed() et al: */
> -
> #ifndef atomic64_cmpxchg_relaxed
> # define atomic64_cmpxchg_relaxed	atomic64_cmpxchg
> # define atomic64_cmpxchg_acquire	atomic64_cmpxchg
> # define atomic64_cmpxchg_release	atomic64_cmpxchg
> #else
> -# ifndef atomic64_cmpxchg_acquire
> -#  define atomic64_cmpxchg_acquire(...)	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
> -# endif
> -# ifndef atomic64_cmpxchg_release
> -#  define atomic64_cmpxchg_release(...)	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
> -# endif
> # ifndef atomic64_cmpxchg
> #  define atomic64_cmpxchg(...)		__atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
> +#  define atomic64_cmpxchg_acquire(...)	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
> +#  define atomic64_cmpxchg_release(...)	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
> # endif
> #endif
>