So, last talk for today before the wrap-up: same speaker, AES 70, a control protocol this time, for audio. Yeah. OK. So in case anyone wasn't here for the last one, hello, I'm Conrad Bebenton. I'm here from Focusrite. We do audio interfaces and various audio devices. So this is a talk on the AES 70 protocol. It's a standard for controlling audio devices on an audio network. So in case anyone wasn't here just now, pro audio is dealing with areas like live sound, like studio, and like broadcast. We have various devices for inputs, so microphones and preamps, outputs to amplifiers and speakers, and in the middle there's processing: mixers, effects, those sorts of devices. Yeah, I know this is repeated, but in case anyone wasn't here. So a typical setup has some equipment on the stage, some equipment at the back of the room in the audio area for mixing, for recording. And so signals come from the stage, get mixed, recorded, and then can be sent back out for front-of-house sound and for stage monitoring. That's a fairly typical setup. So for AES 70, the motivations were slightly different from the AES 67 standard. This is about controlling the devices. So with an audio network, because we're taking audio in as close to the source as possible, we're not running long analog cables. It means things like the actual equipment, your microphone preamplifiers, for instance, can be placed in quite inconvenient locations. They can be in the recording room or on the side of the stage. So that was one of the main motivations for the control protocol. There are existing protocols. They're generally proprietary, and again, they interoperate poorly with other manufacturers' equipment. So there was a need to improve that. And there was also a need to allow integration of actual physical hardware controllers. So maybe your console can communicate with your microphone preamps and your outputs and all be integrated into a single network.
And of course, there's a need: AES 67 didn't specify control. And so having a complementary standard that does that kind of fills in the gap. So there's two terms, and they're generally used interchangeably, but might as well get them out of the way. AES 70 is a standards document specifying the control protocol. And OCA stands for Open Control Architecture. That's the actual protocol specified by the standard. In most cases, the terminology seems to be quite interchangeable. So the standard is split into three areas. There's a framework. There's a class structure specifying a kind of list, a tree, really, of classes of things that can be controlled. And there's communication protocols, so how we actually get the messages for doing the control. So the framework itself, it's object-oriented. An AES 70 device is specified as being composed of a number of objects. And the framework allows only for single inheritance. So there are no things like interface classes or anything like that. There's just a single inheritance tree. Classes, so in the standard, classes are allowed to have methods, properties, and they're allowed to generate events, which is your feedback mechanism for sending updates back to a controller. There is a single root class called OcaRoot. And in order to define what class an object is, there is a class ID, which is composed of a class number for each level of the inheritance hierarchy. And the standard does make some allowances for proprietary subclasses. So if you want to implement a non-standard control, you can put it in a proprietary subclass. And standard controllers will still be able to access your superclass methods up to wherever you leave the standard tree. And a custom proprietary controller can access the extended methods. So this is just an example of how a class ID works. We have inheritance. And at each level, we add another ID onto our class ID. They're just numbered at each level.
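The class-ID scheme just described can be sketched in a few lines. The numbering below (OcaRoot as class 1, with an illustrative worker and actuator beneath it) mirrors the pattern from the talk; the exact indices are assumptions for illustration, not the standard's normative tables:

```python
# Sketch of AES 70 class IDs: each level of the inheritance tree
# appends one number to the parent's class ID. The concrete numbers
# below (worker = 1.1, actuator = 1.1.1) are illustrative.

def derive_class_id(parent_id, index_at_level):
    """A subclass's ID is its parent's ID plus its own index."""
    return parent_id + [index_at_level]

OCA_ROOT = [1]                                 # the single root class
OCA_WORKER = derive_class_id(OCA_ROOT, 1)      # level 2
OCA_ACTUATOR = derive_class_id(OCA_WORKER, 1)  # level 3

def format_class_id(class_id):
    return ".".join(str(n) for n in class_id)

print(format_class_id(OCA_ACTUATOR))  # → 1.1.1
```

A proprietary subclass would just hang one more number off a standard class's ID, which is what lets a standard controller keep using the superclass's methods.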
So the methods themselves, they do typical object-oriented method things. They retrieve properties. They perform actions. And as we have single inheritance, it's very well defined that each method has a level, as in how far down the tree it is. And then on each level of the tree, the methods are just specified with method numbers. So again, we have examples of methods. And if you want to call this method, you go down to level four. I don't know why it's one-indexed and not zero-indexed. And you go across on this level to method number two. And that's how you specify a method. And it's all very well defined like that. So we've got this object tree. And every class inherits from OcaRoot. So this is actually quite a rich class. It allows quite a lot of functionality in the base class of everything. So it allows you to specify a role for the object. And that's just a string that's intended for human-readable use. So for example, it might be channel one gain or channel five input source. That's the typical sort of thing that would go into that. There's notifications on when any property of the object changes. So receiving updates from your controlled device is quite easy. You just subscribe to that one notification. And you get all the property updates back on your controller. And there's also a locking mechanism. So a particular controller can just take ownership of a particular object in the device and say, I control this exclusively, and no other controllers are allowed it. So that was the base class. And now we have a rough outline of our class hierarchy. It's quite dense and rich. So don't worry about the details there. But the built-in classes are split into three main groups. And then workers has a subgrouping. The actual signal processing happens in workers. Managers and agents kind of deal with more global state of the device in various capacities. So typical workers are the sensors. These can detect inputs. These can be level sensors, so actually measuring the signal.
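The level-plus-index addressing can be modeled like this. The method names and the level-4 example are hypothetical placeholders standing in for the standard's actual method tables:

```python
# Sketch: a method is addressed by (definition level, method index),
# both 1-indexed as the talk notes. A controller resolves a call by
# going to the stated level of the inheritance tree, then across to
# the stated index. All names here are illustrative, not normative.

class MethodID:
    def __init__(self, def_level, method_index):
        self.def_level = def_level        # depth in the inheritance tree
        self.method_index = method_index  # position within that level

# Hypothetical per-(level, index) method table for some class:
methods = {
    (1, 1): "GetClassIdentification",  # defined on the root, level 1
    (4, 2): "SetGain",                 # illustrative level-4 method
}

mid = MethodID(4, 2)
print(methods[(mid.def_level, mid.method_index)])  # → SetGain
```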
Or they can be other things, like sensors reporting states of various types on the input. Then the actual work of signal processing is done by actuators. So these range from the simple stuff, like just applying gain, all the way through to complicated things like filtering, parametric EQ. So there's quite a rich range of classes there to work with, which covers most typical audio signal processing use cases. So once we've got these classes and we've made objects, we can group them into blocks. These blocks don't really have a signal processing function. They're more for the actual logical layout of the device. So for instance, you would typically group a single channel of signal processing into a single block, just to make it clear what the functionality is relating to. They can be nested. They can contain other workers. And they have methods for describing the flow of the signal within the block. So in a typical case, each worker within a block could have some ports for input and output. And the block that contains them all, this outer rectangle, you could have a method on it to describe the signal flow. And it tells you this output is connected to this switch, and just enumerates all of the signal path connections like that. So this is a typical example from a microphone preamplifier. We have some inputs. We choose a particular input using the switch. We can apply gain to it and optionally filter off the low frequencies. So outside of the workers for doing signal processing, there are managers and agents. Managers are typically quite global controls. So the mandatory ones all deal with the device on some level. So there's describing what the device actually is and what firmware is on it. There's a subscription manager, which is mandatory, which is part of the event handling mechanism for allowing controllers to actually receive events from the device. And there's some network information available.
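The block's signal-flow enumeration might look like this in miniature, modeled on the mic-preamp example (input switch, then gain, then low-cut filter); all port and worker names here are made up for illustration:

```python
# Sketch: a block contains workers with input/output ports, and a
# method on the block enumerates the internal signal-path connections.
# Names follow the mic-preamp example from the talk; none are normative.

connections = [
    ("input_switch.out", "gain.in"),
    ("gain.out", "low_cut_filter.in"),
    ("low_cut_filter.out", "block.out"),
]

def get_signal_paths(conns):
    """Analogue of the block's 'describe the signal flow' method."""
    return [f"{src} -> {dst}" for src, dst in conns]

for path in get_signal_paths(connections):
    print(path)
```

A controller that knows nothing about the device can call this one method and reconstruct the channel's processing chain.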
There's also quite a rich range of optional managers dealing with various areas around security, clocking, and just general settings. And these aren't quite so important, and so they're made optional. The agents, there's actually only one mandatory agent, and that's called the stream network. This provides the mechanism for setting up connections between your audio device and other audio devices. It's the mechanism for connecting them together. That's mandatory, and it can be any sort of class of stream network. So for example, it could be an AES 67 stream network. It could be a Dante stream network if you have a proprietary device. It could be any of those. The optional agents, these are generally related to handling controls. So for example, you can group controls together with a grouper, or you can effect a control change over time with a ramper. A typical example of that would be doing a fade-in. So you just send one message to the ramper that the gain must ramp up over a certain period of time, and that would be handled. Observers are, again, just a slightly different report-back mechanism. And there's also handling of media clocks. And finally, the other half of the event handling mechanism is the event handler itself, which is an agent which can be implemented on a controller and not the controlled device. So for the event handling, both the controller and the controlled device are OCA devices. That's a critical part of this process. The controller implements the event handler. The controlled device implements the subscription manager. And the controller then uses the subscription manager on the controlled device to register for what events it will need. And when notifications come back over the network, those are just treated like method calls on the event handler. So our method call mechanism works in both directions. So that's the structure, that's the classes, and the agents and managers and what's in there.
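A rough sketch of that two-sided event mechanism, with invented class and method names standing in for the standard's subscription manager and event handler:

```python
# Sketch of the event mechanism: the controlled device runs a
# subscription manager; the controller exposes an event handler whose
# method is "called" when a property-change notification arrives.
# All names and signatures here are illustrative, not the standard's.

class SubscriptionManager:           # lives on the controlled device
    def __init__(self):
        self.subscribers = []

    def add_subscription(self, handler):
        self.subscribers.append(handler)

    def property_changed(self, obj_role, prop, value):
        # Notifications go out as method calls on each event handler,
        # so the method-call mechanism works in both directions.
        for handler in self.subscribers:
            handler.on_event(obj_role, prop, value)

class EventHandler:                  # lives on the controller
    def __init__(self):
        self.received = []

    def on_event(self, obj_role, prop, value):
        self.received.append((obj_role, prop, value))

mgr = SubscriptionManager()
ctrl = EventHandler()
mgr.add_subscription(ctrl)
mgr.property_changed("channel 1 gain", "gain", -6.0)
print(ctrl.received)  # → [('channel 1 gain', 'gain', -6.0)]
```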
Now we need to look at how to communicate with these things. And there's scope for multiple protocols in AES 70. Currently, there's one protocol defined, which is a TCP/IP protocol. In the future, there's planning in place for doing control over UDP for smaller embedded devices, which can't necessarily handle full TCP. There's also the possibility of doing control over USB. So that would kind of standardize the control mechanism, so that no matter how your device happens to be connected at the time, it uses the same control mechanism. So OCP.1 is the TCP/IP control protocol. It defines a discovery mechanism, the message format, and optionally some TLS and heartbeats for monitoring device presence. Discovery is done by DNS service discovery with multicast DNS. There's a couple of TXT records specified within the standard, just so that devices can know what version of the protocol they are connecting to. And the actual service types are also specified. The message format used on the protocol is a binary format. It has various message types for the different situations and also fully specifies the data marshalling, so that you can provide parameters on your message. So for example, you can provide method parameters or responses. It's, I haven't actually written it out here, but it's just a fairly standard binary format type thing. So that's the controls and the protocols. So now we have the organizations. Again, obviously the AES is involved. Again, standardization, technical discussion. There's also an alliance of companies called the OCA Alliance. They handle more of the promotion of the standard, encouraging adoption, and actual discussions of the practical implementation tend to happen under the OCA Alliance. So the current OCA Alliance members, these are the full members. Again, there's also associate membership. They tend to be audio equipment manufacturers: Focusrite's in there, Yamaha's in there, d&b, et cetera.
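To give a feel for that "fairly standard binary format", here is a sketch of packing an OCP.1-style message header. The field layout and widths (sync byte, protocol version, message size, message type, message count) are assumptions for illustration; the normative format is defined in the AES70-3 part of the standard:

```python
import struct

# Sketch of packing an OCP.1-style binary header. Field names, widths,
# and values below are illustrative assumptions, not the normative
# AES70-3 layout. Network byte order, as is typical for wire formats.

SYNC = 0x3B            # assumed sync/start byte
MSG_TYPE_COMMAND = 0   # assumed message-type code

def pack_header(protocol_version, message_size, message_type, count):
    # u8 sync, u16 version, u32 size, u8 type, u16 message count
    return struct.pack("!BHIBH", SYNC, protocol_version,
                       message_size, message_type, count)

header = pack_header(1, 26, MSG_TYPE_COMMAND, 1)
print(header.hex())
```

The marshalled method parameters or responses would follow the header as further length-prefixed binary fields.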
So it's quite wide across the audio industry. And as this is more of a control protocol, there are actual implementations available. So there's a sample implementation called OCA Micro. This is designed for embedded devices, and that's intended more as a kind of dev kit. So if you download that, you actually get some source code demonstrating the implementation of AES 70, and there's some electronic schematics for a dev board that will run this code. The OCA JS is more of a client-side implementation, so for implementing controller applications. That can run in your browser and generate OCA commands to control various things on your network. So that's quite easy to get up and running and get started with. So we've got implementations. And finally, just summing up the benefits. Having a standard protocol allows better interoperability, and it allows the possibility of things like custom controllers, or hardware controllers, interacting with the audio network. It makes a good fit with AES 67, and the structure used, the object-oriented structure, allows for easy extension where needed. So yes, for more information, there's the OCA Alliance, there's the AES, and there's the downloads for the implementations. Thank you. [Audience question, inaudible.] OK, the question is, what does AES 70 bring that OSC, open sound control, couldn't handle? I'd say it's more in terms of the specialization. So OSC was more focused towards being kind of a MIDI replacement for instrument products, for that market. And this is more focused towards the high-end pro audio market. So we just see a different focus in kind of what objects are implemented in the protocols and what controls they allow. [Inaudible.] OK, time for another question? No? OK, thank you. Thank you, thank you. Sorry, there's one more. [Inaudible.]