Okay, good morning. Thanks for joining us. We're going to go without slides, at least until we get the AV fixed, so please bear with us and stay with us. We're going to talk about the ETSI NFV Plugtest and OPNFV Plugfest co-location. My name is Silvia Almagia. I work for ETSI in the Centre for Testing and Interoperability. I am a technical expert there, and I am in charge of the NFV Plugtest programme.

Hi there, my name is Pierre Lynch. I work for Ixia, which is part of Keysight now. More importantly for this, though, I'm the chair of ETSI NFV TST, the testing working group, which is responsible for testing, experimentation (meaning proofs of concept) and also open source collaboration. So the big deal we're going to go through is the fact that we co-located a Plugtest, the ETSI NFV Plugtest, with a Plugfest from OPNFV. We'll go through as much as we can without visuals: give you an overview and the results, the OPNFV activities during their Plugfest, and also, more importantly, the joint activities that happened because they were co-located. Then we'll explore what's coming up next. So, the big deal: this was the third ETSI NFV Plugtest, and it was the fourth or fifth OPNFV Plugfest, their Fraser release Plugfest.
It happened in early June of this year. OPNFV has been a participant and a supporting organization for the ETSI NFV Plugtests since the beginning, but this was the first time that we actually brought both communities together at the same spot, which was a beautiful spot, by the way, in southern France, at ETSI headquarters. Jealous of that. So that was a very cool part, and it allowed the communities to get together, get to know each other, and also collaborate on a bunch of things that we'll go through. Apart from OPNFV, there are a bunch of open source communities that were already supporting this Plugtest: OpenStack being one of them, as well as OSM (Open Source MANO), OpenAirInterface, SONATA and Open Baton. So we've got a collection, a pretty wide collaboration with open source going on with ETSI NFV.

Okay, so a bit of background on the NFV Plugtest programme. We started putting these together with ETSI NFV in 2016. The main goal here is to validate and make sure that the standards this group is developing are fit for purpose, and also to validate the implementations of these standards. So we started putting this together in 2016, and we had our first Plugtest in Madrid in January 2017. It was a kind of experimental Plugtest; at the very beginning, Release 2 was, I think, still under development, and we started testing interoperability of different components when it comes to single-VNF network services. After this event, in January 2018, one year after, we had the second NFV Plugtest. We were also hosting an Open Source MANO hackfest at the same time, same location. In this case we expanded the interoperability testing to start covering multi-VNF and multi-vendor network services, orchestrated by different MANO solutions on different platforms, and we also started working on experimental API testing. During this year the ETSI NFV group started developing APIs, mainly for the MANO stacks, and
developing OpenAPI definitions for these APIs, so we started playing with that. Right after that, at the end of May, beginning of June last year, we had this third Plugtest co-located with the OPNFV Plugfest. There we started playing with automating the test sessions. The main focus was multi-VNF network services. There was a lot more API testing, so the number of APIs we were able to test was bigger, and we started seeing a nice bunch of cross-community activities that will be explained later in this presentation.

Actually, the NFV Plugtest programme is not just that we meet once or twice a year, get together, plug things together and test them. We have a worldwide network, the ETSI HIVE, a VPN hub interconnecting all the participants' labs, so that at any point in time, from anywhere, participants can access other participants' implementations and test together. They can use this to build a proof of concept, they can use this to prepare for a Plugtest, or they can use this to do more testing after a Plugtest event. This is up and running in a continuous way. We have right now over 45 remote sites connected and over 60 organizations (some of them sharing data centers and deployment points), and we have around 250 people involved in these remote activities. And then they send a delegation to the Plugtest events for the face-to-face testing. As for the test plan development,
this is an activity that started with the preparation of the first Plugtest. It's an open, continuous process; it's never completed, let's say. What we do is look at what ETSI NFV is defining in terms of specs, but we also look at what Plugtest participants are implementing, and we work very closely as well with the different open source communities: what they are doing, how they are interpreting these specs, and how they are solving some of the problems. With all this input we try to set up some system-under-test configurations, combining different components in order to enable the testing, and then we write a test plan, which is open and discussed continuously. It's implementation agnostic: it's meant to be run against, and be meaningful for, whatever implementation of MANO, whatever implementation of platform, whatever VNF, independently of the functionality that this VNF is providing, and the testing is run at the functional level. So this is continuously discussed and continuously fed back to ETSI NFV, who take the feedback on the base specifications into account, and they are also publishing these test plans as group specifications. Okay, so this is a continuous process.

When it comes to the third Plugtest that we had at the end of May and into June, the main focus, as I was saying, was multi-vendor network services. The idea there is to have all the participants busy. We had around, I think, 80 to 100 people present, from 40-something organizations. So the idea is to have them all busy all the time. We schedule test sessions, one session at a time for each MANO solution.
So we had 10 MANO solutions and 10 parallel tracks, and on each of them, each MANO solution was testing with one platform providing NFVI and VIM capabilities, and with at least two different VNFs from different providers, trying to build a network service out of these different VNFs. The scope for these kinds of sessions, which were the most tested, was network service onboarding, instantiation and termination. Those are the things that usually work. Then on top of that, we tested network service updates, to stop and restart VNFs in the network service; network service scaling on request of the operator; and some auto-scaling as well, taking into account different triggers. These triggers could come from metrics from the VIM and infrastructure, from metrics from the VNF, or be on request of the VNF or element manager. And finally, we tested a lot of fault and performance management of this overall system. So the goal during this testing is to test the maximum number of combinations of platforms, MANOs and VNFs.

In addition to this, we had some optional additional testing. The idea here is for those that completed the test sessions quickly, or for those that wanted to test additional capabilities before or after the Plugtest. We put together some test plans to test single-VNF network services (this was for pre-testing, mainly). But also, on top of the multi-VNF network services, they could test EPA (Enhanced Platform Awareness) aspects; they could test multi-site, deploying these network services across different sites; and we had some testing defined for specific VNF management.
This is when the VNF comes with its own VNF manager and this needs to interact with the MANO solution. We also had some new test cases covering scale-to-level, which arrived a little bit late in the preparation of the Plugtest but were included as well. We had some descriptions for the VNF forwarding graph and NSH-based service function chaining, which Pierre will talk about a bit later. And also, as I said before, we were automating some of this testing, including more and more test cases in the automation.

As I said as well, in parallel with the interoperability testing we were running API testing. As ETSI NFV develops these SOL specifications, the SOL002, SOL003 and SOL005 specifications and APIs, we're trying to see how we are going to test them. This is aside from the interop testing for the time being. Here we put an implementation in front of a test system: the implementation is offering the API server, the test system is acting as a client, and we are testing not only the requests and responses, but also that the intended behavior after these requests is met. For the time being, this was just a subset of the available APIs. We tried to be very pragmatic here: we checked what's implemented, and we only developed test cases for what would be present at the Plugtest. But this base of testing is growing.
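The API testing pattern described here, a test system acting as a REST client that checks both the immediate response and the resulting state on the server, can be sketched in a few lines. This is a minimal illustration, not the actual Plugtest test system: the `/vnf_instances` path loosely follows the SOL003 resource naming, but the server below is a self-contained stand-in, and the field names are simplified.

```python
# Sketch of client-side API conformance checking: create a resource,
# verify the response code, then fetch it back to verify the behaviour.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class FakeNfvoHandler(BaseHTTPRequestHandler):
    """Stand-in API server holding VNF instance resources in memory."""
    instances = {}

    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        inst = {"id": str(len(self.instances) + 1),
                "vnfdId": body["vnfdId"],
                "instantiationState": "NOT_INSTANTIATED"}
        self.instances[inst["id"]] = inst
        self._reply(201, inst)

    def do_GET(self):
        inst_id = self.path.rsplit("/", 1)[-1]
        self._reply(200, self.instances[inst_id])

    def _reply(self, code, payload):
        data = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # silence per-request logging
        pass

def run_conformance_check(base_url):
    # 1) request creation of a VNF instance resource...
    req = Request(base_url + "/vnf_instances",
                  data=json.dumps({"vnfdId": "demo-vnfd"}).encode(),
                  headers={"Content-Type": "application/json"})
    resp = urlopen(req)
    created = json.loads(resp.read())
    assert resp.status == 201  # ...and check the response itself
    # 2) then verify the intended behaviour: the resource now exists
    fetched = json.loads(
        urlopen(base_url + "/vnf_instances/" + created["id"]).read())
    assert fetched["instantiationState"] == "NOT_INSTANTIATED"
    return created["id"]

server = HTTPServer(("127.0.0.1", 0), FakeNfvoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
instance_id = run_conformance_check("http://127.0.0.1:%d" % server.server_port)
print("created instance", instance_id)
server.shutdown()
```

The two-step check is the point: a 201 alone is not enough, the client also confirms the server state changed the way the spec intends.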
We'll talk about that as well. In terms of participation, we had nine platforms providing NFV infrastructure and, importantly, virtualised infrastructure management capabilities. I think most of them, if not all, were OpenStack based or OPNFV based, many of them through commercial distributions. In terms of MANO stacks, we had ten different MANOs providing NFV orchestration and generic VNFM capabilities. Most of them were commercial products; some of them were based on open source solutions. We had SONATA, which came out of some European research projects and is now a standalone open source project, and we had two distributions of Open Source MANO participating. You will get the full list in the slides, which hopefully will be available for download. In terms of virtual network functions, we had 19 different VNFs coming from different providers, and here, because the goal was to test more complex network services with different VNFs inside, we tried to combine them, and we came up with 27 different network services combining these VNFs. There were a lot of test VNFs and simulators involved in this network service building, either to simulate parts of the network or to generate traffic to test the behavior of the network services. Among the 19 we only had one open source VNF; this came from the OpenAirInterface Software Alliance, who are currently working on 5G network services, but for the Plugtest they brought an EPC, a 4G functionality.

As for the results: imagine, if you can, a series of beautiful graphs behind me, pie charts, lines, green and red, showing a lot of results.
I'll try to describe them. The main takeaways were: we had fewer test sessions, but the individual test sessions were longer than in previous Plugtests, and we still managed to run more test cases than in previous Plugtests. The success rate of the interop tests did go up slightly. We had an increase in automated testing of 175 percent, so that's really taken off. There were a couple of companies that came in that could automate things like that, so it got really popular really quickly. That's something we're going to try to keep the momentum going on as well. The API testing, which is not interop, so it's a little bit on the side, went up by 125 percent. So people are starting to participate more and more in the API testing, relative to the second Plugtest, which is natural, since the APIs weren't fully defined at the time of the last Plugtest in the first place.

I'm going to skip quickly through this, but basically, of the test cases that were tried, focusing on the entire test plan, the success rate was 89 percent of the attempted test cases. Having said that, 43 percent were not attempted, and when they're not attempted it's typically because they ran out of time, or one of the participants didn't support the necessary functionality to be able to try the test case. But it's still markedly better. The takeaway: things are improving, things are ramping up. At the API level, same thing: what is it, 25 percent of the test cases were not attempted; however, of the ones that were attempted, there was a 70 percent success rate, again a marked improvement since the first one.
I want to give the slides another shot on this thing now. Anyway, we keep going. Then there was another pretty table showing results per subgroup, especially for network service testing, and all I wanted to highlight there is that things like onboarding and instantiation, all the basics, are at almost a hundred percent success rate, or almost there. So we've got the basics; they're working. We've improved on scaling: in the previous Plugtest, from what I remember, manual scaling was working okay; now manual scaling is at a hundred percent success rate. Where I like the fact that it's improving is that some of the auto-scaling possibilities are now starting to improve as well. For instance, on a VNF indicator, if a VNF sends an indication to the MANO stack to say "please scale," that went up to a 72 percent success rate among those who attempted it. Good stuff. Having said that, scaling based on a KPI coming in from the VIM, meaning OpenStack, is at zero percent; nobody supports it right now. But improvement still. Other areas that need improvement are things like performance management and fault management, for both network services and VNFs, where the success rates vary between 57 and 72 percent for those types of tests. So again, improvement on the last one, but not quite there yet; we still have a bit of a way to go.

The API track, same thing, a little bit. There's a focus on the reference point between the orchestrator and the VNF manager; that's the SOL003 specification. There were a lot more test cases run and attempted there than for the other APIs around the MANO, and that's for a few reasons.
There's a lot more to test on that reference point, and secondly, it was the first spec from ETSI that was ready, so a lot of implementations had a head start there. But there are other SOL003 parts, like package management and granting, that aren't there yet. So it's an incremental process; that's where we expect to see a lot of progress at the next Plugtest.

Okay, so aside from these detailed results that we compile during these kinds of events, we are also trying to gather a lot of feedback. When implementers get together and talk, sometimes they realize they are seeing things in a different way. This is probably not because of anything they've done, but because the spec is not very clear. So we try to capture as much of all that as possible. For this Plugtest we did get a lot of feedback, mostly on the SOL specifications. I think it's because we've been running on top of the other specifications for a longer time, and the things that needed to be solved or clarified there have been solved; but it was the first time we were putting that much focus on the SOL specifications. We got a lot of general feedback, recommendations, identified gaps and inconsistencies, and a number of requests for clarification. What was also nice during this event is that we had many of the officials from ETSI NFV present there in the room, so a number of items could be solved on the fly, and they were also getting a feeling for the points where they need to start clarifying or adding details to these specifications. For the OpenAPI definitions, there was also a number of bugs (I don't remember the exact number) that were identified, some of them fixed on the spot, some of them fed back to ETSI NFV, and some recommendations on how to improve the reliability and the drafting of these OpenAPI definitions.
So this is probably the main reason why ETSI is running these kinds of events and this kind of continuous testing: to gather all this feedback and make sure that the quality of the standards improves and that they are fit for purpose. Concerning the test plans, there was also a lot of feedback. The interoperability test plan is something we're continuously feeding back to the ETSI NFV testing working group; there are now three new or updated test descriptions that have been sent back to NFV TST. These cover additional testing in performance and fault management, NSH-based service function chaining, and the famous scale-to-level I mentioned before. For the API testing, all the learnings, all the experience we gathered by testing with all these different implementations, has been compiled as well, and this has been input to a new work item called TST 010. This is the NFV API conformance testing specification, and it's under development; the fact that we've been playing with those APIs and testing them beforehand is going to help focus it and develop a better spec.

On the OPNFV side: OPNFV had their own Plugfest, so they had their own activities going on as well. It being an OPNFV Plugfest, the focus was entirely on testing, so they did a lot. One of the big attention points they're focusing on these days is long-duration testing and performance testing, and a lot of the activities surrounded that. So, first thing: OPNFV has a series of different testing projects, and Bottlenecks and Yardstick are two of them. They were focusing on doing two-day tests.
They'd let things roll for two days just to see what the results are, and this leads to another project that I'll talk about later on. One of the issues that comes out of this is the repeatability of performance tests, and getting no errors in long-duration tests, which led to a project that came to ETSI; I'll talk about it later. Another tool is called NFVbench. NFVbench is a performance test tool from OPNFV that looks at the system as a black box, and it uses TRex, an open source traffic generator, to do that. It was testing against one OPNFV scenario and a commercial platform from Wind River, and they were trying to compare acceleration, or different networking technologies, specifically virtio versus AVP from Titanium Cloud. And then they were also comparing different test tools. VSPERF is one major test tool at OPNFV; that's the virtual switch performance testing project, which has been dedicated to that since the very beginning of OPNFV. It uses four different traffic generators. They were comparing OVS-DPDK layer 2 forwarding capability to see if there are any differences in performance between VSPERF and NFVbench, which has a more black-box view of it. What they found is that packet format, number of flows and packet path do impact results. So the configuration will obviously impact the results, but when those configurations were matched between the two test tools, the results were pretty similar.

Here is a great picture of all 150 participants that were there, and they're all waving at you. Next: some of the collaboration items that happened between ETSI and OPNFV. Being the guy trying to make that happen,
this got me really excited. One of the ways we work at ETSI NFV is through what we call work items; basically you can look at them as projects. Each has a PTL, which we call a rapporteur, and each work item has a name: a three-letter acronym for the working group it comes from, and a number. So, TST 009: if you're interested in benchmarking, this is, in my mind, the new standard. It covers NFVI network benchmarks and measurement methods. If you're familiar with RFC 2544, this is basically an update to RFC 2544 for virtualized platforms, taking into account that this is not a standalone box. It has benchmark definitions, test setups, even requirements on the test tools you use for this, and methods of measurement. And what they did is come up with new search algorithms to help mitigate the problem with repeatable results that I was mentioning earlier. This became the highlight of all the collaboration items between ETSI NFV and OPNFV, because it literally was an agile, iterative development process: us at ETSI handing it over to OPNFV, and they would review and come back to us with comments. But more importantly, they would prototype the algorithms and try things out, which is what open source communities are so great at. Then they'd come back and say, hmm, try this, or tweak that, and we'd try it. It was a 15-iteration type of thing, and it finally came out as a full test spec that was published in January. If benchmarking is your thing, I'm talking exclusively about that at 2:30 later this afternoon.

Then TST 010, which is another work item, is the one that's going to be tightly linked to the API testing: we're coming up with an automated, complete conformance test package for the MANO stack. That's three different reference points and a ton of APIs; these are all REST-based interfaces, and we're going to use Robot Framework as the test framework to automate a full sweep of conformance tests for that. The collaboration part
is that OPNFV (it's all in the initial stages so far, let's say) will help us automate it as well. So if you run it using an OPNFV platform to lay your MANO stack on, then you'll be able to automate it fully using the OPNFV Functest test tool. And then a further, future potential thing, which is really cool, is to help us instrument the platform in order to help the automation for things like fault management. It's kind of hard to automate a CPU going down, whereas if OPNFV instruments their platform to communicate with the test system and say, okay, bring this one down, and then you check the results, that makes it a full package. So that's Pierre's pie in the sky, but hopefully that'll go forward as well.

It also gave an opportunity to OVP. OPNFV has what they call the OVP, the OPNFV Verification Program, which is basically a conformance test package to verify an OPNFV-based system against test criteria. The tool that executes that is called Dovetail, which is also the name of the project. Dovetail is an umbrella tool that uses existing OPNFV test frameworks, mainly Functest and Yardstick, but there are a few others as well, and it selects and runs test cases that those tools support against a hardware or virtualized platform. It gave Dovetail the opportunity to run, for the first time I think, against commercial hardware platforms from Nokia and Whitestack, a Red Hat platform and a Wind River platform, all of them, by the way, OpenStack.
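The TST 009 search algorithms mentioned a moment ago address the repeatability of RFC 2544-style throughput searches. As a rough illustration of the general idea (a plain RFC 2544-style binary search, not the TST 009 algorithms themselves), here is a sketch where `measure_loss` is a stand-in for a real traffic-generator trial:

```python
def binary_search_throughput(measure_loss, line_rate_mbps,
                             precision_mbps=10.0, loss_tolerance=0.0):
    """Find the highest offered load whose measured frame loss stays
    within tolerance. measure_loss(rate) -> fraction of frames lost."""
    lo, hi = 0.0, line_rate_mbps
    best = 0.0
    while hi - lo > precision_mbps:
        mid = (lo + hi) / 2.0
        if measure_loss(mid) <= loss_tolerance:
            best = mid   # trial passed: search higher
            lo = mid
        else:
            hi = mid     # trial failed: search lower
    return best

# Toy device model: starts dropping frames above 7500 Mbps.
throughput = binary_search_throughput(
    lambda rate: 0.0 if rate <= 7500 else 0.1, 10000.0)
print(round(throughput))
```

With noisy, virtualized systems, each trial of `measure_loss` can give a different answer from run to run, which is exactly why the plain search produces unrepeatable results and why TST 009 had to define more robust search procedures.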
I think against commercial Hardware platforms from Nokia and white stack a red hat platform and a wind river platform all of them by the way open stack A very exciting and new thing that happened the opn avi has a project called xci Cross-community integration Which brings in from master branch multiple open source projects for them right and well it was for them It brings an open stack open daylight Fido fd.io fast datapath and also own app and brings it into a large CI pipe to immediately test master branches from all those four communities and it's been very valuable and apparently really valuable to open stack as well because the The results are fed straight back into open stack and there's been a few bugs here and there which is Exactly what we're all about right fail fast What what what xci because they the project team lead for xci was present at the plug fest and OSM was also part of the plug test plug fest as well. Well, they got together and say hey Why don't you get on board to this and they finally integrated OSM open source manual into the opn avi xci So now there's five communities five packages being integrated into that that led to The opn avi sfc project going hmm They sfc service function chaining is a project upstream well opn avi project that upstreams its Development into odl to be able to support the nsh network service header Based service function chaining nsh is an ietf protocol from their sfc group That project's been in existence for two three years But now they saw they took a look and said hey wait. There's a pretty good OSM A manno stack there open source that supports sfc. Why don't we use that? 
So they started working together as well, and now the SFC project uses OSM to orchestrate the service function chain tests. They have a certain number of tests present right now, and now that that's in place, it gives them a bit more flexibility, and they're going to extend the test base for that as well.

Okay, and yet another nice cross-community activity that happened during this event. Typically, companies and participants come to Plugtests to check what's failing in their implementation, to check what they are not doing like the others. They learn a lot, but this time we wanted to give them a chance to show what they were doing well. So we had a challenge, which was to build, during the week of the Plugtest, some multi-vendor, multi-project demonstrations of real use cases, of real network services. We had four or five that were set up on the spot like that and demonstrated on stage with full functionality. One of the nicest ones was one grouping together, I think it was, four open source projects and seven vendors, and they were actually orchestrating a 4G mobile network. In terms of open source, we had of course OpenStack and OPNFV at the platform level (there were a couple of sites there); on the orchestration side there was Open Source MANO; and one of the VNFs that was part of this multi-VNF network service was the 4G EPC from the OpenAirInterface Software Alliance. So this was a really nice case of additional collaboration. You must remember that all these activities, all these cross-community things, happened on top of all the testing that was running and being compiled, so it was quite a nice thing to see.

So all these activities, the testing, the results, the feedback, the cross-community activities: all this has been compiled in a couple of reports. You have the link on the screen, but you can also Google them. One is the Plugtest report.
This covers all the Plugtest activities, all the testing results and the feedback related to the ETSI specs. The other one is a joint report between the ETSI Plugtest and the OPNFV Plugfest, and this one is an overview: it's a shorter document, easy to read, covering the highlights and the main cross-community activities that started there. So, a really nice read.

And then, what's next? Now that we've done this, what are we planning for the future? From the ETSI side, we're really going to put the focus on API testing in the coming months. Most of the specs are being finalized, the OpenAPI definitions are being finalized and stable, and we now feel comfortable running exhaustive testing of those APIs. We're planning a fully remote event; we don't need participants to come to ETSI to do that testing. We're going to leverage this HIVE, this hub for interoperability and validation, and what we will do is have individual test sessions: each interested party implementing one of those API servers, and this implementer working with someone from the Plugtest team who will guide them through the testing, running the test system and also guiding, when things fail, on what needs to be fixed on which side. This is interesting for both parties. For the ETSI team it's good because we will make sure that the test suites we are developing are functional and have been validated against several implementations. For the implementers, for the participants, it's interesting because they are going to get a very early check on their implementation of the API, they are going to get guidance on how to use it, and later on they can always download these test suites and run them on their own, or incorporate them into their CI/CD. This is going to run between February and March 2019, remotely, as I said. Those participants that already joined the Plugtest programme in the past can simply express their interest in
testing, and we'll build a schedule for the testing. Those that haven't joined yet can join now; they can go through the registration, and we are still in time to onboard them and have them participate in this testing.

After that, we are planning a new face-to-face event. Same place; the dates will be the first week of June 2019, and this will happen on ETSI premises in Sophia Antipolis. We have a very big room where we can fit 250 people, we have a nice lab nearby, and we're very close to the HIVE core, which gives us access to all the remote sites. You don't need to send your hardware if you don't want to; you can just connect to a remote lab, keep the connection you already have, and use that for the testing. Here again, we will be focusing on multi-party interoperability sessions. The actual testing scope is going to be expanded. We'll have to see and discuss with ETSI NFV whether this will be to incorporate Release 3 functionality, whether it will be to expand the scope of the interop to make some API checks during the interop sessions, or whether it will be both of them, so we'll see. For the detailed scope of the API Plugtest, as we were saying, we are going to be exhaustive in terms of APIs and reference points. So participation will be open to anyone providing a VNF or element manager offering SOL002 capabilities, any VNFM offering SOL002 or SOL003, and any orchestrator offering SOL003 or SOL005. Okay, everything is in scope, of course. In addition to all these pieces, the VNFs and VNF managers and orchestrators, we will need some platforms where we run this testing as well, so participation is also open to hardware providers and VIM providers.

OPNFV has their future plans as well, so I'll just quickly go through that. OPNFV just released their Gambia release, the G release, so their sixth release; it just went out this week.
I think on Tuesday. So that's a big deal, and now they're working on the next one, which is called Hunter, after a river in Australia, I just found out this morning. They have a joint Plugfest coming: OPNFV and ONAP will have a joint Plugfest in January, the first full week of January, just outside of Paris, hosted by Nokia, and it's the first time those two communities get together for a Plugfest. Now the potential, hopefully anyway: we're trying to work out whether it's possible to have OPNFV and ONAP join us at the Plugtest that Silvia just described in the first week of June. So two things: have OPNFV co-locate their Plugfest with us, and ONAP as well, and hopefully get ONAP to participate in the Plugtest as one of the MANO stacks. It would be half of what ONAP does, but as one of the MANO stacks in the Plugtest, which I think would be really super valuable. And that's it; that's a potential, I'm crossing my fingers, hopefully we'll get there.

Some key takeaways, a bit of a summary of the presentation. The interop results are improving; some areas are still not thoroughly tested, and we need to look at those and see why this is happening: whether we need to make the specs better or the test plans better, or whether the implementations still have things to develop. We've seen great progress on automated interop testing, and great progress as well in NFV API adoption and testing, and we're going to keep working on that. We are seeing an increase in cross-community collaborations, and we're very, very pleased with that. Communities have already started to connect to this HIVE network and make their implementations available for future testing, and we will be happy to have all the communities, and as many people as possible, joining as well. And finally, just save the dates for the Plugtest and Plugfest activities for 2019. You will be most welcome. So thanks a lot for staying with us. It's been tough,
I guess for you as well, without the slides, but thanks a lot for staying. The slides will be available. And a round of applause for the poor AV guy in the back who heroically tried to fix it.