On 12/31/2013 11:31 AM, Jean-Francois Moine wrote:
> Some audio cards are built from different hardware components.
> When such compound cards don't need specific code, this driver creates
> them with the required DAI links and routes from a DT.
>
> Signed-off-by: Jean-Francois Moine <[email protected]>
> ---
> This code was first developed on the generic simple card, but its
> recent DT extension cannot be easily extended again to support compound
> cards as the one in the Cubox.
> Note also that the example relies on a proposed patch of mine aiming to
> render the codec name / OF node optional in DAI links
> (http://mailman.alsa-project.org/pipermail/alsa-devel/2013-December/070082.html).
> ---
> .../devicetree/bindings/sound/compound-card.txt | 95 ++++++++++++
> sound/soc/generic/Kconfig | 6 +
> sound/soc/generic/Makefile | 2 +
> sound/soc/generic/compound-card.c | 247 +++++++++++++++++++++++++
> 4 files changed, 350 insertions(+)
>
> diff --git a/Documentation/devicetree/bindings/sound/compound-card.txt b/Documentation/devicetree/bindings/sound/compound-card.txt
> new file mode 100644
> index 0000000..554a796
> --- /dev/null
> +++ b/Documentation/devicetree/bindings/sound/compound-card.txt
> @@ -0,0 +1,95 @@
> +Device-Tree bindings for compound audio card
> +
> +A compound audio card describes the links between the different parts
> +of an audio card built from different hardware components.
> +
> +Required properties:
> + - compatible: should be "compound-audio-card"
> + - audio-controller: phandle of the audio controller
> +
> +Optional properties:
> + - routes: list of string pairs (sink, source)
> +
> +Required subnodes:
> + - link: DAI link subnode
> + At least one link must be specified.
> +
> +Required link subnode properties:
> + - link-name: names of the DAI link and of the stream
> + - cpu-dai-name: name of the CPU or CODEC DAI
> + An empty string indicates that the CPU DAI is
> + the same as the audio controller.
> + - codec-dai-name: name of the CODEC DAI
> +
> +Optional link subnode properties:
> + - audio-codec or codec-name: phandle or name of the CODEC
> + in case the codec-dai-name is not unique
> + - format: DAI format. One of:
> + "i2s", "right_j", "left_j" , "dsp_a"
> + "dsp_b", "ac97", "pdm", "msb", "lsb"
> + - front-end or back-end: present if the DAI link describes,
> + respectively, a front-end CPU DAI or a back-end CODEC DAI
> + - playback or capture: present if the DAI link is used for
> + playback or capture only
As Mark also said, this binding definitely leaks way too much of the internals
of the current ASoC implementation. In my opinion the way forward for ASoC is
to stop distinguishing between different types of components. This means, on
one hand, CODECs and CPU DAIs and, on the other hand, front-end and back-end
DAIs. The first steps in this direction have already been taken with the start
of the componentisation, but it's still a long way to go. Exposing those
concepts via the devicetree will only make it harder to get rid of them
later. The bindings for a compound card should essentially describe which
components are involved and what the fabric between and around them looks
like. If the type of a component is needed in the ASoC implementation it
should be possible to auto-discover it. Also, I think we want to align the
devicetree bindings with what the media people have been doing[1]. Audio and
video are not that different in this regard, and there will also be boards
where the audio and video fabric are intermingled (e.g. like on your
board with HDMI).
- Lars
On Wed, 01 Jan 2014 20:05:05 +0100
Lars-Peter Clausen <[email protected]> wrote:
> As Mark also said, this binding definitely leaks way too much of the internals
> of the current ASoC implementation. In my opinion the way forward for ASoC is
> to stop distinguishing between different types of components. This means, on
> one hand, CODECs and CPU DAIs and, on the other hand, front-end and back-end
> DAIs. The first steps in this direction have already been taken with the start
> of the componentisation, but it's still a long way to go. Exposing those
> concepts via the devicetree will only make it harder to get rid of them
> later. The bindings for a compound card should essentially describe which
> components are involved and what the fabric between and around them looks
> like. If the type of a component is needed in the ASoC implementation it
> should be possible to auto-discover it. Also, I think we want to align the
> devicetree bindings with what the media people have been doing[1].
(you forgot the [1] reference)
> Audio and
> video are not that different in this regard, and there will also be boards
> where the audio and video fabric are intermingled (e.g. like on your
> board with HDMI).
I found a way to discover the DAI link types for my system: when using
simple DAPM, the kirkwood CPU DAIs have an ID != 0. For the Cubox, the
CPU DAI of the first link (system playback) has the ID 0, so I can move
to DPCM by setting the 'dynamic' and 'no_pcm' flags in the DAI links at
snd platform probe time.
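For illustration only, here is a minimal sketch of what that probe-time
switch could look like; the function name and the "first link is the
front end" rule are assumptions, not code from the patch:

    #include <sound/soc.h>

    /* Sketch: mark the first DAI link (CPU DAI id 0, system playback)
     * as a DPCM front end and the remaining links as back ends. */
    static void cubox_enable_dpcm(struct snd_soc_card *card)
    {
        int i;

        for (i = 0; i < card->num_links; i++) {
            if (i == 0)
                card->dai_link[i].dynamic = 1;  /* front end */
            else
                card->dai_link[i].no_pcm = 1;   /* back end */
        }
    }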
Then, after some extensions of the simple-card, the DT would look like
(not tested yet):
sound {
    compatible = "simple-audio-card";

    simple-audio-routing =
        "HDMI I2S Playback", "System Playback",
        "HDMI SPDIF Playback", "System Playback",
        "SPDIF Playback", "System Playback",
        "hdmi-out-i2s", "HDMI I2S Playback",
        "hdmi-out-spdif", "HDMI SPDIF Playback",
        "spdif-out", "SPDIF Playback";

    simple-audio-card,cpu@0 {
        link-name = "System audio";     /* extension */
        sound-dai = <&audio1 0>;
        playback;                       /* extension */
        format = "i2s";
    };
    simple-audio-card,codec@0 {
        sound-dai-name = "snd-soc-dummy-dai";
    };

    /* multi-links extension */
    simple-audio-card,cpu@1 {
        link-name = "hdmi-i2s";
        platform-name = "snd-soc-dummy";        /* extension */
        sound-dai = <&audio1 1>;
        playback;
    };
    simple-audio-card,codec@1 {
        sound-dai = <&hdmi_codec 0>;
    };

    simple-audio-card,cpu@2 {
        link-name = "hdmi-spdif";
        platform-name = "snd-soc-dummy";
        sound-dai = <&audio1 2>;
        playback;
    };
    simple-audio-card,codec@2 {
        sound-dai = <&hdmi_codec 1>;
    };

    simple-audio-card,cpu@3 {
        link-name = "spdif";
        platform-name = "snd-soc-dummy";
        sound-dai = <&audio1 1>;
        playback;
    };
    simple-audio-card,codec@3 {
        sound-dai = <&spdif_codec>;
    };
};
May I go in this direction?
--
Ken ar c'hentañ | ** Breizh ha Linux atav! **
Jef | http://moinejf.free.fr/
On 01/01/2014 09:08 PM, Jean-Francois Moine wrote:
> On Wed, 01 Jan 2014 20:05:05 +0100
> Lars-Peter Clausen <[email protected]> wrote:
>
>> As Mark also said, this binding definitely leaks way too much of the internals
>> of the current ASoC implementation. In my opinion the way forward for ASoC is
>> to stop distinguishing between different types of components. This means, on
>> one hand, CODECs and CPU DAIs and, on the other hand, front-end and back-end
>> DAIs. The first steps in this direction have already been taken with the start
>> of the componentisation, but it's still a long way to go. Exposing those
>> concepts via the devicetree will only make it harder to get rid of them
>> later. The bindings for a compound card should essentially describe which
>> components are involved and what the fabric between and around them looks
>> like. If the type of a component is needed in the ASoC implementation it
>> should be possible to auto-discover it. Also, I think we want to align the
>> devicetree bindings with what the media people have been doing[1].
>
> (you forgot the [1] reference)
http://lxr.free-electrons.com/source/Documentation/devicetree/bindings/media/video-interfaces.txt
On Wed, 01 Jan 2014 21:11:23 +0100
Lars-Peter Clausen <[email protected]> wrote:
> On 01/01/2014 09:08 PM, Jean-Francois Moine wrote:
> > On Wed, 01 Jan 2014 20:05:05 +0100
> > Lars-Peter Clausen <[email protected]> wrote:
> >
> >> As Mark also said, this binding definitely leaks way too much of the internals
> >> of the current ASoC implementation. In my opinion the way forward for ASoC is
> >> to stop distinguishing between different types of components. This means, on
> >> one hand, CODECs and CPU DAIs and, on the other hand, front-end and back-end
> >> DAIs. The first steps in this direction have already been taken with the start
> >> of the componentisation, but it's still a long way to go. Exposing those
> >> concepts via the devicetree will only make it harder to get rid of them
> >> later. The bindings for a compound card should essentially describe which
> >> components are involved and what the fabric between and around them looks
> >> like. If the type of a component is needed in the ASoC implementation it
> >> should be possible to auto-discover it. Also, I think we want to align the
> >> devicetree bindings with what the media people have been doing[1].
> >
> > (you forgot the [1] reference)
>
> http://lxr.free-electrons.com/source/Documentation/devicetree/bindings/media/video-interfaces.txt
I see the idea. So here is a DT proposal for the Cubox (audio + video)
which should work without too much difficulty.
/* video / audio transmitter */
tda998x: hdmi-encoder {
    compatible = "nxp,tda998x";
    ...

    /* video */
    port@0 {
        tda998x_0: endpoint {
            reg = <0x230145>;
            remote-endpoint = <&lcd0_0>;
        };
    };

    /* audio */
    port@1 {
        hdmi_i2s_audio: endpoint@0 {
            reg = <0x03>;
            remote-endpoint = <&audio_hdmi_i2s>;
        };
        hdmi_spdif_audio: endpoint@1 {
            reg = <0x04>;
            remote-endpoint = <&audio_hdmi_spdif>;
        };
    };
};

toslink: spdif {
    compatible = "linux,spdif-dit";

    port {
        spdif_audio: endpoint {
            remote-endpoint = <&audio_spdif>;
        };
    };
};

/* video */
lcd0: video-controller {
    compatible = "marvell,dove-lcd";
    ...

    port {
        lcd0_0: endpoint {
            remote-endpoint = <&tda998x_0>;
        };
    };
};

video {
    compatible = "media-video";
    video-controller = <&lcd0>;
};

/* audio */
audio1: audio-controller {
    compatible = "marvell,dove-audio";
    ...

    port@0 {
        audio_hdmi_i2s: endpoint {
            remote-endpoint = <&hdmi_i2s_audio>;
        };
    };
    port@1 {
        audio_hdmi_spdif: endpoint@0 {
            remote-endpoint = <&hdmi_spdif_audio>;
        };
        audio_spdif: endpoint@1 {
            remote-endpoint = <&spdif_audio>;
        };
    };
};

sound {
    compatible = "media-audio";
    audio-controller = <&audio1>;
};
--
Ken ar c'hentañ | ** Breizh ha Linux atav! **
Jef | http://moinejf.free.fr/
On Thu, Jan 02, 2014 at 10:26:47AM +0100, Jean-Francois Moine wrote:
>     /* audio */
>     port@1 {
>         hdmi_i2s_audio: endpoint@0 {
>             reg = <0x03>;
>             remote-endpoint = <&audio_hdmi_i2s>;
>         };
I think we want an explicit object in the card representing the DAIs.
This will both be useful for making it easy to find the configuration
for the link and will be more extensible for the cases where multiple
devices are connected; you can't just assume there are just two.
On Thu, 2 Jan 2014 11:10:56 +0000
Mark Brown <[email protected]> wrote:
> On Thu, Jan 02, 2014 at 10:26:47AM +0100, Jean-Francois Moine wrote:
>
> >     /* audio */
> >     port@1 {
> >         hdmi_i2s_audio: endpoint@0 {
> >             reg = <0x03>;
> >             remote-endpoint = <&audio_hdmi_i2s>;
> >         };
>
> I think we want an explicit object in the card representing the DAIs.
> This will both be useful for making it easy to find the configuration
> for the link and will be more extensible for the cases where multiple
> devices are connected; you can't just assume there are just two.
I don't see the problem: the 'port' is the DAI. The associated
endpoints give the DAI links and the routing information.
Since the DT graph definition was designed for video, some properties
may be added as needed for audio.
What kind of object were you thinking of?
--
Ken ar c'hentañ | ** Breizh ha Linux atav! **
Jef | http://moinejf.free.fr/
On Thu, Jan 02, 2014 at 12:43:31PM +0100, Jean-Francois Moine wrote:
> Mark Brown <[email protected]> wrote:
> > I think we want an explicit object in the card representing the DAIs.
> > This will both be useful for making it easy to find the configuration
> > for the link and will be more extensible for the cases where multiple
> > devices are connected; you can't just assume there are just two.
> I don't see the problem: the 'port' is the DAI. The associated
> endpoints give the DAI links and the routing information.
> Since the DT graph definition was designed for video, some properties
> may be added as needed for audio.
> What kind of object were you thinking of?
Like I say, multiple devices on the same link - if you're just listing a
single remote device there can't be more than one.
On Thu, 2 Jan 2014 11:56:18 +0000
Mark Brown <[email protected]> wrote:
> On Thu, Jan 02, 2014 at 12:43:31PM +0100, Jean-Francois Moine wrote:
> > Mark Brown <[email protected]> wrote:
>
> > > I think we want an explicit object in the card representing the DAIs.
> > > This will both be useful for making it easy to find the configuration
> > > for the link and will be more extensible for the cases where multiple
> > > devices are connected; you can't just assume there are just two.
>
> > I don't see the problem: the 'port' is the DAI. The associated
> > endpoints give the DAI links and the routing information.
>
> > Since the DT graph definition was designed for video, some properties
> > may be added as needed for audio.
>
> > What kind of object were you thinking of?
>
> Like I say, multiple devices on the same link - if you're just listing a
> single remote device there can't be more than one.
I still don't understand. There are already such cases in the Cubox:
the S/PDIF output from the kirkwood audio controller is connected to
both the HDMI transmitter and the S/PDIF TOSLINK. So, in the audio
controller, the port @1 defines the S/PDIF DAI and the endpoints @0 and
@1 point to the remote DAIs, creating 2 snd DAI links:
port@1 {
    audio_hdmi_spdif: endpoint@0 {
        remote-endpoint = <&hdmi_spdif_audio>;
    };
    audio_spdif: endpoint@1 {
        remote-endpoint = <&spdif_audio>;
    };
};
in the snd card:
- DAI link 1 = 'audio controller spdif out' <=> 'hdmi spdif'
- DAI link 2 = 'audio controller spdif out' <=> 'spdif'
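To make this concrete, here is a hedged sketch of the two resulting links
as ASoC would see them; all DAI and link names below are illustrative
placeholders, not taken from the actual drivers:

    #include <sound/soc.h>

    /* Sketch: the two endpoints of port@1 become two DAI links, both
     * using the controller's S/PDIF CPU DAI.  Names are placeholders. */
    static struct snd_soc_dai_link cubox_spdif_links[] = {
        {   /* endpoint@0: controller S/PDIF -> HDMI transmitter */
            .name = "audio controller spdif out - hdmi spdif",
            .cpu_dai_name = "kirkwood-spdif",
            .codec_dai_name = "hdmi-spdif",
        },
        {   /* endpoint@1: controller S/PDIF -> TOSLINK */
            .name = "audio controller spdif out - spdif",
            .cpu_dai_name = "kirkwood-spdif",
            .codec_dai_name = "dit-hifi",
        },
    };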
If I am wrong, could you give us an example for which such a DT would not
work?
--
Ken ar c'hentañ | ** Breizh ha Linux atav! **
Jef | http://moinejf.free.fr/
On Thu, Jan 02, 2014 at 01:44:37PM +0100, Jean-Francois Moine wrote:
> I still don't understand. There are already such cases in the Cubox:
> the S/PDIF output from the kirkwood audio controller is connected to
> both the HDMI transmitter and the S/PDIF TOSLINK. So, in the audio
> controller, the port @1 defines the S/PDIF DAI and the endpoints @0 and
> @1 point to the remote DAIs, creating 2 snd DAI links:
> port@1 {
>     audio_hdmi_spdif: endpoint@0 {
>         remote-endpoint = <&hdmi_spdif_audio>;
>     };
>     audio_spdif: endpoint@1 {
>         remote-endpoint = <&spdif_audio>;
>     };
> };
Oh, so the endpoints are virtual and that's supposed to be three things
wired together rather than a single device with multiple links? That's
really not very clear from reading the above and seems cumbersome -
every device will want to explicitly identify every other device on the
link and any configuration is going to either need to be replicated on
every device or we'll need to check lots of places for the configuration.
It seems like this will be hard to work with.
On Thu, 2 Jan 2014 13:10:45 +0000
Mark Brown <[email protected]> wrote:
> On Thu, Jan 02, 2014 at 01:44:37PM +0100, Jean-Francois Moine wrote:
>
> > I still don't understand. There are already such cases in the Cubox:
> > the S/PDIF output from the kirkwood audio controller is connected to
> > both the HDMI transmitter and the S/PDIF TOSLINK. So, in the audio
> > controller, the port @1 defines the S/PDIF DAI and the endpoints @0 and
> > @1 point to the remote DAIs, creating 2 snd DAI links:
>
> > port@1 {
> >     audio_hdmi_spdif: endpoint@0 {
> >         remote-endpoint = <&hdmi_spdif_audio>;
> >     };
> >     audio_spdif: endpoint@1 {
> >         remote-endpoint = <&spdif_audio>;
> >     };
> > };
>
> Oh, so the endpoints are virtual and that's supposed to be three things
> wired together rather than a single device with multiple links? That's
> really not very clear from reading the above and seems cumbersome -
> every device will want to explicitly identify every other device on the
> link and any configuration is going to either need to be replicated on
> every device or we'll need to check lots of places for the configuration.
> It seems like this will be hard to work with.
No, the 'endpoint' <=> 'remote-endpoint' is a point-to-point relation.
Even if the sources and sinks are not explicitly defined, the way
the stream flows is easy to find: the main source is always in the
'audio-controller' node
sound {
compatible = "media-audio";
audio-controller = <&audio1>;
};
and, then, the controller ports are sources (CPU DAIs) and the
associated remote ports are sinks (CODEC DAIs).
With more levels, once the remote (sink) ports of a device are
identified, its remaining ports are sources.
Usually, the devices don't have to know which other device they are
connected to, and, yes, the reverse pointer from sink to source is not useful.
But the link the audio stream comes from may be important to
know. This is the case for the HDMI CODEC, which must tell the HDMI
transmitter from which hardware port(s) ('reg') it may get the audio
stream. That's why the HDMI encoder has two endpoints in its audio
port, each endpoint being a different CODEC DAI.
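As an aside, here is a sketch of how a card driver could perform this
walk using the OF graph helpers that grew out of the video-interfaces
work; the function name is my own and error handling is minimal:

    #include <linux/errno.h>
    #include <linux/of.h>
    #include <linux/of_graph.h>

    /* Sketch: resolve the 'audio-controller' phandle, then treat each
     * endpoint of the controller as one DAI link whose CODEC side is
     * the remote port's parent node. */
    static int sketch_enum_links(struct device_node *card_np)
    {
        struct device_node *ctrl, *codec;
        struct device_node *ep = NULL;

        ctrl = of_parse_phandle(card_np, "audio-controller", 0);
        if (!ctrl)
            return -ENODEV;

        while ((ep = of_graph_get_next_endpoint(ctrl, ep)) != NULL) {
            codec = of_graph_get_remote_port_parent(ep);
            if (!codec)
                continue;
            /* here: build one DAI link, CPU = ctrl, CODEC = codec */
            of_node_put(codec);
        }
        of_node_put(ctrl);
        return 0;
    }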
--
Ken ar c'hentañ | ** Breizh ha Linux atav! **
Jef | http://moinejf.free.fr/
On Thu, Jan 02, 2014 at 06:50:55PM +0100, Jean-Francois Moine wrote:
> No, the 'endpoint' <=> 'remote-endpoint' is a point-to-point relation.
> Even if the sources and sinks are not explicitly defined, the way
> the stream flows is easy to find: the main source is always in the
> 'audio-controller' node
But the links in question aren't point-to-point links and may be
bidirectional...
> Usually, the devices don't have to know which other device they are
> connected to, and, yes, the reverse pointer from sink to source is not useful.
This may be the case for the systems you've looked at but other designs
are quite common.
> But the link the audio stream comes from may be important to
> know. This is the case for the HDMI CODEC, which must tell the HDMI
> transmitter from which hardware port(s) ('reg') it may get the audio
> stream. That's why the HDMI encoder has two endpoints in its audio
> port, each endpoint being a different CODEC DAI.
Obviously if there are multiple DAIs on a device then it needs to be
possible to represent them separately but that seems orthogonal to the
rest of the discussion (and resolved already)?