Okay, so: supporting audio on your embedded board. Basically, how to actually get sound out of your custom design. So, who am I? I'm Alexandre Belloni, an embedded Linux engineer at Free Electrons. We mainly focus on embedded devices, especially ARM platforms; ARM development is our main specialty. We do development and consulting, and we also run training sessions, which maybe you have already seen. We are really focused on open source, and specifically on the Linux kernel. I'm a kernel contributor myself: I maintain the RTC subsystem in the kernel (RTC as in real-time clock, nothing to do with audio), and I also co-maintain the kernel support for the Atmel ARM processors, so anything called AT91-something, or SAMA5D2, D3, D4, for example.

So, ASoC. Maybe you attended the previous presentation this morning. On your board, you have something like this. Obviously I will not cover every possible case, but they are all fairly similar, so we will talk about this one. You have your SoC, and that SoC has several interfaces connected to a particular codec. The first one is the digital audio interface: usually a serial port, or sometimes a synchronous serial interface. Then you also have a way to configure your codec, usually over I2C or SPI, which is a simple serial bus as well. And finally, your codec will either output analog audio to an amplifier, or maybe directly to a connector, depending on what you are doing. So, as I said, codec configuration happens over that I2C or SPI bus. You can find other setups, and I will mention a few: for example, some SoCs have the codec inside the SoC itself, so obviously it is not connected over I2C or SPI.
It is connected directly on the SoC's internal bus, and you access it with plain MMIO register accesses. So, the digital audio interface, as I said, is sometimes called a synchronous serial interface. The formats you will see are AC'97, I2S, PCM or TDM (time-division multiplexing, also called network mode), and you also have modes called DSP A and DSP B. I think those names are TI-only; I'm not sure, but I think it's TI-only. Some examples of these interfaces: the synchronous serial controllers, like the Atmel SSC, the NXP SSI, or the TI McASP, that kind of interface. It's quite simple, it's just an audio interface. You also have SoCs with an S/PDIF controller, for the Sony/Philips digital interface; depending on your SoC, it may or may not be a separate block. The amplifier I showed is actually optional, but as you can see, it is quite common.

ASoC stands for ALSA System on Chip. It is designed to give a better way to integrate the drivers: instead of writing one driver per codec for each SoC you want to support, you write one driver for your codec and one driver for each SoC, and then you can connect any codec to any SoC. So you have two different APIs: an API for the codec drivers, and an API for the interface drivers. You have codec drivers for your codec, and platform drivers for your SoC audio interface. And then you need one last driver, called the machine driver, the fabric driver, the glue driver. That driver connects your SoC driver with your codec driver.
Because with it you say: I have this particular SoC, I have this particular codec, and together they will be my audio sound card. So you describe how they are connected, and usually, if you chose the right components, that glue driver is the only thing you have to write; you should not have to write the codec driver or the SoC driver, just that particular glue driver. Note also that the codec may not be a standalone codec: it can be part of another IC. You see that with Bluetooth chips; when you want to put audio over Bluetooth, sometimes you have a codec on the Bluetooth or modem chip, that kind of thing.

So, the first thing the machine driver does is register a structure, a struct snd_soc_card, and that will create an audio sound card. Obviously the structure is really big; it is defined in include/sound/soc.h. To register it you use snd_soc_register_card, and there is a devm-managed variant, devm_snd_soc_register_card, so you register and never have to unregister. That structure looks like this: it contains a name, it contains a struct device because a sound card is a device, and, most importantly, it contains two different lists. In particular it contains the DAI links, and those are actually a list of links that connect your SoC with your codec. So let's have a look at that: struct snd_soc_dai_link also contains a name and a stream name, and I kept the comments on the slide because those comments are really important. You have several ways of defining which SoC and which codec you want to use, and you cannot use several ways of specifying the SoC at the same time. So those comments are really important.
So, you specify your CPU side, the SoC side, either by device name or by device tree node, but you cannot do both: you use either cpu_name or cpu_of_node. You also specify the name of your CPU DAI, cpu_dai_name, because your interface may have several ports, for example, so you have to select the right port, the one your codec is connected to. So, yeah, basically: read the comments, it's always good. It's the same thing a bit further down in the structure: you will find codec_name or codec_of_node, and you can also specify codec_dai_name, which is the full name of the DAI on your codec. Again, you choose one or the other, but not both. Finally, you also have the platform, the PCM or DMA driver. Basically, there is another part of your SoC that copies the audio data from memory to the DAI interface, and your DAI interface puts the audio data on the bus; you select that with the platform name.

So, I have an example. I picked this one because it is actually a driver that is quite simple to read; it doesn't do anything too fancy. Let's have a look at what it's doing. Basically, what we want here is a WM8904 connected on an Atmel board, and it will work on almost any Atmel board, because the Atmel SSC basically didn't change between SoCs. So it will work on any Atmel board, if you use a Wolfson WM8904, obviously. How do you select that? It creates that snd_soc_dai_link: it gives it a name and a stream name, and then it uses a codec_dai_name, that one, and that one comes from the codec. The codec driver registers itself, and you will actually see that it's using that name, that particular name. So it's quite simple to say: OK, I want to use the WM8904 codec, and it actually has that DAI name.
Then you can select a particular format. I will talk about that a bit more later, but basically you just say: OK, I want to use that codec, that codec is on that interface, and I will use I2S. So that's the I2S part. NB_NF is basically just how your signal polarities are set up; usually you will use NB_NF, but if you have some particular case, you may change that. And then finally you have CBM_CFM, which says which side is the slave and which side is the master; I will talk about that a bit later. You also register some operations, and again, I will talk about those later.

Right, so finally you really want to register one snd_soc_card, and that snd_soc_card is created like this: you give it a name, so basically we just say it's Atmel ASoC using that particular codec, and we use that dai_link, which is just the structure I defined before. We can also register some widgets, and I will talk about those a bit later, but basically they define which connectors are available on your sound card.

Right, so a bit later, we actually want to register that sound card, and how do we get there? We use a function called dt_init, and it's actually called from probe. I guess I have it somewhere; so that's the probe function of that driver, and it's calling the dt_init function. What dt_init does is basically parse the device tree, and while parsing the device tree, it tries to find which SSC controller you are actually using, on which SSC controller your codec is connected. So we parse that property, atmel,ssc-controller, and from that we actually get a pointer to the device tree node of our SoC DAI. So that's the SoC part there. Same thing for the codec part: we parse the atmel,audio-codec property, and we get a node pointer to the codec device tree node. Once we have both node pointers, we can continue and fill in the link, and that's what is happening there. And because both of those are now filled, we can register that DAI link in the sound card. (If you have any question at some point, just shout.) Finally, we get the SSC id, which is something completely Atmel-specific, so I will not talk more about that, and the final step is to actually register the card.

So we filled in those structures, and those structures describe how things are connected on the board, or at least how we want the SoC to talk to the codec. After linking your codec driver with your SoC driver, you still have to define which codec outputs or inputs you actually want to use, because your codec may have, I don't know, three or four analog outputs, and maybe you only connected one of those on your board. So you just have to say: on my board, that particular codec output is actually connected to something. You use routes for that, and you have two ways of doing it. You can use the DAPM routes inside the struct snd_soc_card, but usually what you do is just use snd_soc_of_parse_audio_routing and give it your property name, and that will define the routes for you, from standard device tree properties.

So let's have a look at how it's done. There, I have a static example: basically you have an array of struct snd_soc_dapm_route, and that array defines: OK, I have this input on my codec, and I want to connect it to this connector on my board. For example, you may have those outputs from your codec, and they are connected to those connectors on your board. And it's simple, you just register them like that: you have a pointer to your array, and then you can use ARRAY_SIZE. If you want, you can just put a number, but ARRAY_SIZE is a bit cleaner, just to say: OK, I have that number of different routes on my board.

With the device tree, it's kind of simpler. At some point in dt_init, you just call snd_soc_of_parse_audio_routing, like I was saying, and you give it the property name that you are using for those routes. And what does that look like in your device tree? It's actually well documented: if you have a look at the device tree binding for the Atmel WM8904 machine driver, you will get all the different pins from your codec that you can use, and also all the pins from your board that you can use. Then it's a simple matter of giving a list of pairs of strings in that property. So, basically a bit the same as before: the codec output, HPOUT, actually goes to the headphone jack, and the line-in jack on your board goes to the IN2L and IN2R inputs of your codec.

Right, so how do you find your codec pins? If it's not documented like I just showed, you will have to actually find them in your codec driver: look for SND_SOC_DAPM_INPUT and SND_SOC_DAPM_OUTPUT, and you will find the names that you can actually use. Unfortunately, I don't think there is any way to know which actual inputs and outputs of your codec are used until you read the code and check in the datasheet what the driver is doing. So have a look for those two kinds of definitions and you will find all your inputs and all your outputs, and hopefully the writer of your driver used names that you can relate to your datasheet.

The board connectors, now: how do you define them? They are part of your board, so they actually live in your machine driver, and you will have to define them yourself. You just define an array of struct snd_soc_dapm_widget, and it's actually part of the registered snd_soc_card; I guess I had it just before, so there, it's part of your snd_soc_card. Just the same as before, you create an array of structures and you register it in your snd_soc_card, and you just give names: so you have outputs, a headphone output, and a microphone input and a line input.

So, clocking. Now we have defined all the routes, we have defined how your codec and your SoC are connected, and one huge part of audio is actually getting the correct clocks to, or from, your codec. Depending on the characteristics of the audio data, of the sound you are trying to play or trying to record, you will have to adjust the clocks. For example, you cannot reuse the clocks of something running at 44.1 kHz for something at 48 kHz; they are not the same clocks. So the characteristics of your audio drive the clocking, and the sample rate and the bit depth are the most important factors there. It's one of the most difficult parts of getting your driver right, so let's try to explain it.

So, an example with I2S. When you use I2S, you usually have something like this: you have your codec on the right, and that codec takes a master clock. That master clock may come from your SoC, or it may come from a crystal; just know that your codec will need one to be able to work at all. The codec may then derive, from MCLK (it's also sometimes called CCLK), the BCLK. BCLK is part of the audio interface, and it's the bit clock: on each BCLK cycle, one bit is transferred on the data pins. You have two data pins, data in and data out; that's how they are often called on the codec side, but on the SoC side they are often called transmit and receive, so transmit goes to data in, and receive comes from data out. And then you have one last clock, WCLK; it's also called LRCLK, for left/right clock, and you may also find FSCLK. It is actually the frame rate of your audio samples. And what I want to show is that you can generate all of those clocks from either side, the SoC or the codec. It's important to know, because not every SoC is able to generate precise audio clocks, and so generally you want the codec to
be the clock master. It's not always the case, but generally that's what you want.

One last note: if you look at this diagram, you will see (it's an example; it's not always like that, but it's generally the case) that you have the left/right clock: when it's low, you transmit the left channel, and when it's high, you transmit the right channel. Again, that's one example, and sometimes WCLK is just a single pulse marking the start of the frame for all the channels. And, as I said, I haven't forgotten the final relation between BCLK and WCLK: BCLK is WCLK times your number of channels (if you are in mono that's 1, if you are in stereo that's 2) times the sample size, which may be 16 or 32 bits, or even bigger if you have the necessary hardware. As I said, WCLK is also sometimes referred to as Fs, and BCLK or MCLK is generally a multiple of Fs; it's not always the case, but generally it is.

So, how do you define which one is the master and which one is the slave? As I said, it's part of your DAI link; it's what I showed just before, and it's the part that says: OK, this is codec bit master and codec frame master. There are four different ways of doing it, so you can say that your codec is master for the bit clock, but you want your SoC to be master for the frame clock, for example. I guess it's a really unusual configuration, but it's something that you can do.

Finally, just before, I said that you could also register some operations. It's basically a structure with a few function pointers, and among those function pointers there is one that is really important, called hw_params. That's, I guess, the most important one, and that's the one that allows you to change your clocks dynamically. hw_params is actually called each time you are trying to play a new stream, and in that callback you are able to get the parameters of the stream. That's important: you can use params_rate(), params_channels() and params_format() to get the sample rate, the number of channels and the format of the stream, and inside the format you actually have the bit depth. So with those parameters you can compute what you need. Sometimes you really want BCLK, but sometimes your codec will actually want a multiple of the frame rate, so you don't have to compute BCLK at all: you just need the number of channels and the sample rate itself. If you do want to compute BCLK, you can use snd_soc_params_to_bclk(), which will just give you the BCLK rate for that particular stream.

So, I have an example just after; there, I guess. Basically, that one just says: I want to set the PLL of my codec at 256 times the rate of my stream, because the codec wants that. So each time a new stream is played, you get the new params, and you say: OK, set the PLL to 256 times the rate of the stream. Then you have two functions... three functions, I think, to actually set the clock rates for both your codec and your SoC, depending on which one is the master; and sometimes you actually need to set both, we'll see that. The most common one is snd_soc_dai_set_sysclk(); that's the one that should be used, and it should give you the proper result. But sometimes it's not easy to know which actual dividers or which actual PLL you want to use from the driver, so maybe the driver is not able to figure that out; in that case you have two other functions, snd_soc_dai_set_clkdiv() and snd_soc_dai_set_pll(), that allow you to specify a specific divider and some specific PLL settings for the codec or the SoC.

Those take a few parameters. The DAI is simply the DAI from either your codec or your SoC. Then you have the clock ID; clk_id is a bit complicated to use, because it is actually implementation-specific, so it will depend on your SoC or on your codec. Usually you will see that people are using 0, but sometimes the codec is able to derive its own clocks from several inputs, and then you will have to use something that is dependent on your codec, and you will actually have to read the codec driver to know which one you have to use. Then you have the frequency itself, and you have the direction. The direction basically says: OK, I want my clock to come from the codec, or to come from the SoC. So there, we set the PLL for the codec, and then we actually want to set the CCLK for the codec afterwards, and we say that one is SND_SOC_CLOCK_IN, so the clock is coming in from the SoC. Finally, you see that the hw_params callback is just hooked up that way, and that structure was actually part of my DAI link, the structure that I filled just before; I guess I had it somewhere.

Finally, you may not need to know all of that, you may not need to do all of that, because if your design is, let's say, common enough, or simple enough, there is a driver called simple-card. It's also called simple-audio-card: depending on where you are looking, in the code it's referred to as simple-card, and in the device tree it's simple-audio-card. That driver will actually allow you to specify directly the connection between your SoC and your codec, and to do that configuration, so instead of writing code you will just have to write a device tree, which is good. So it looks like this, and it's fairly well documented, because it's a binding, so it is correctly documented. You will be able to pass the format, so I2S or left-justified, right-justified, or DSP; you will specify which side of your sound card is actually the clock master, so the bit clock master and the frame master; on that slide it's the codec, as we can see there. Then you have the sound card widgets, then you can do your routing just like we saw before, and then you define two subnodes: one subnode is the CPU side, so the SoC side, and the other one is the codec side. And basically, with that, it's doing everything I've been showing you, so you may not need a single line of code, just by writing that device tree. It may not fit your particular use case, and that's why we still have some machine drivers, depending on what you are doing.

So, like I was saying, there was an amplifier on my first diagram; how do you support that? It actually depends on what you want to do, but if you want to really integrate it well with ALSA, you are able to register struct snd_soc_aux_dev entries; it's actually an array of structures, and you will be able to pass multiple different extra codecs there. All those codecs will expose their controls as controls of your sound card, so your amplifier will be part of your sound card using that, and you can do it with the simple-card binding too, which is good.

So, obviously, you wrote a bunch of code and it's not working; how do you troubleshoot that? If you don't have any sound, you have multiple cases. Maybe you have some sound, your audio seems to be playing, and it's actually playing for the correct duration; that's important, because it means your clocks are somewhat right. The usual mistake there is just that you have to unmute Master; usually that one is fine, it's unmuted, but what you will often see is that your codec may have some other controls, for example a control for the HP out, and that one may be muted. So you will want to unmute it, and obviously you will also want to turn up the volume on that control. You will also want to check the mixing capabilities of your codec, and actually control those from alsamixer, for example, because some codecs are able to say: I can take input 1 and put it on output 3, and you could also take input 1 and output it on output 2, and you actually want that set to the correct output. So that's something to check. You also want to check the amplifier configuration, because basically, at some point, I had everything working, but I didn't add the amplifier to my sound card.
The amplifier was off, so basically I had no audio. You also want to check the routing; that's also something: maybe you thought OUT1 was your headphone jack, but it's actually your amplifier output, something like that, so maybe you mixed something up there.

Then you have the other case where you don't have any sound and playback actually seems stuck: you wait for ten minutes and you still don't get anything. What do you do there? It's probably an issue with the clocks, meaning that either your codec is not sending a clock to your SoC, or your SoC is not figuring out that it actually has to send some data on the bus, something like that. So usually, the first thing you want to do is check your pin muxing, and then you will also want to check how you configured the clock directions, because if you set both clock directions as inputs, you will not have any clocks at all. You will want to check the master/slave configuration too, because if, for example, you say "OK, my codec will be the master" and then you don't configure it to actually output any clock, then your SoC DAI will not know when to send any data on the bus. That's also a usual issue. So the best thing to do is to check that you actually have clocks, using an oscilloscope for example, and then check your pin muxing again. It's always the pin muxing, so check again. Also know that some SoCs will actually have even more muxing than that: some SoCs are able to route one serial port to one of several audio buses, that kind of thing. It's specifically the case for NXP. So that's something you have to configure; it's quite SoC-specific, so I will not talk about it too much, but it's also something you want to check.

Right, so you may also find this kind of error. It doesn't say much, but basically, when you get a write error like that, it is coming from one of your drivers. It may come from your machine driver, but I really doubt that; it will come from your SoC driver or from your codec driver. Usually it is caused by an issue with routing, because at some point your codec will not find a route, or something like that: maybe you are trying to use a route, an input or an output, that doesn't exist on your codec. That's usually what's happening there. Also check that the codec driver actually exposes a stream named "Playback", because when you are trying to play audio, the first stream that is used is the playback one; if you don't have it, you will basically not get any audio out. And I actually had that issue with some mainline codec drivers: they didn't have any playback stream.

One good tool to actually see what's happening with the routes, and whether everything is connected correctly, is the DAPM graph script. It's a simple script; it's actually meant to be run on a PC, but you can get it to run on your target and generate the graphics on your PC. It's quite simple: it uses graphviz to show that, while you were playing your audio stream, all these routes were connected, and then it will stop somewhere, and that's where you will have to check. It's quite nice: it shows everything that is activated in green and everything that is not activated in red, so it's quite easy to see what's wrong there.

Finally, you may find something like this: some overruns and underruns. It's usually caused by a BCLK that is not precise enough, and that usually happens when you are using your SoC as the clock master, as we said. You may not care about it, depending on what your application is, but it may generate unpleasant sounds, so you may want to switch to using your codec as the master. You may not be able to do that, depending on your hardware, but again, it's usually better to use the codec as the master.

So, if you want to go further and it's still not working, you probably want to have a look at the CPU DAI driver, the SoC driver, and also the codec driver. In particular, you will want to have a look at the set_clkdiv and set_sysclk callbacks, because that's where the clocks are actually set up on your SoC. It's a bit more complicated, but it's still doable: basically you take the datasheet of your SoC, and those are the most important functions to look at. You can also have a look at hw_params and set_dai_fmt, because they will do some muxing if your SoC is capable of that, so you may also have an issue there. For the codec callbacks, you want to have a look at set_sysclk, and in particular, like I was saying, the clk_id parameter is actually codec-specific, so you will want to know what the driver actually does with it. Maybe it selects a particular clock input, or maybe it does something else, but that's where you will find out. Also remember that using your codec as a slave (like I was saying, it's usually best to use it as the master) is even worse, because sometimes it's not even tested: maybe the driver was never meant to be used with the codec as a slave, and sometimes it leads to crashes, that kind of thing. So remember that using the codec as a slave is quite uncommon, and you may have trouble with it if you want to do that.

When in doubt, when you don't know what the final configuration of your codec or your amplifier is, for example, use devmem or i2cget and i2cset. Basically, what those tools allow you to do is read the registers directly. So you start playing a stream, for example, and while it's playing, you just have a look: OK, how does my driver configure my codec? That's usually really useful. You may also be able to write: with devmem it's possible, and with i2cset too. That's usually what I do when developing a codec driver: you start your stream with aplay, and then you use i2cset until you find the correct configuration. At least that way you have direct feedback; you don't have to recompile anything, so it's quite convenient. Let's not try random configurations, but it's something you can do.

Oh, I have the same slide twice. So, references: the kernel documentation, Documentation/sound/soc. Most of it will just explain what I just explained, and it also has good documentation on how to avoid those click or plop sounds when starting a stream, that kind of thing. If you want to know more about the common interfaces that you will find, there is good documentation on the Analog Devices website. And finally, the I2S specification that I used for this particular talk: you will not be able to find it on the Philips website, and it's not on the NXP website either, so archive.org is where you will be able to find it. It's quite simple. So, if you have any questions... I think I have ten minutes or something.

[Audience] I've recently been trying to get a board working where we had two codecs connected to the same DAI link, so an ADC and a DAC simultaneously. I tried to use simple-card, which doesn't support that, and there were some patches on the mailing list which didn't get merged, and nobody commented on them. So I wonder: is this out of scope for simple-card, and would I be better off writing my own driver, or are there plans to extend it someday?
[Alexandre] In that case, I would just write my own driver, but I guess Mark may be able to comment on why it was not taken, if there were no comments.
[Mark] Did somebody actually send the patches to me? Because that's usually the reason why things get no comments. If it wasn't in my inbox, that's probably why it didn't get comments. Was it two DAIs going to one codec? Yeah, it's probably out of scope; write your own driver.
[Audience] Another question. You were talking earlier about the routing between the codecs, basically your card and the widgets defined in your card, and setting up the routing. Is there any incentive to do that, other than being shiny? Because it will work even if you don't do any kind of routing.
[Alexandre] Actually, yeah, it works. Like Lars said in his talk, it's part of trying to standardize the output names, so that your application will be able to find its way without being dependent on a particular codec or a particular SoC.
[Mark] I think that if you set the card to fully_routed, then if you have no routes, the card doesn't come up.
[Alexandre] Yeah, if you set it to fully_routed and you don't have any routes, you will not expose anything. So don't set it to fully_routed, or rather, always set your routes.
[Audience] It's about troubleshooting; I have a weird problem, if you have any idea. My left and right channels are overlapping. It happens only over a short period, which produces a crackly noise. Which is the best place to look for that? It happens only at higher sample rates, meaning 44.1 kHz and 48 kHz.
[Alexandre] A crackly noise... I'm usually not investigating those much, but it's usually caused by DAPM. Maybe Mark, you have some comments there?
[Mark] Your clocking is probably wrong: the codec master clock is not synchronized with the audio bus clocks, I guess. I would want to see what's used in this case, but there is probably an MCLK going into the codec, and there is usually a requirement that all the clocks in the codec, including the audio bus clocks, come from a single root clock source. If there is a slight drift, then that sounds like what happens when that goes wrong.
[Audience] Just a comment: you used i2cget and i2cset to debug things; you may want to have a look at regmap, which exposes the registers in debugfs. It's quite useful.
[Alexandre] Yeah, indeed. So, thank you.