Hello everyone. In this session of unit 4 we will look at a very useful and very prominent verification technique called equivalence checking, which is a form of formal verification. Formal verification is a general class of methodologies, a group of formal mathematical techniques used to verify one design against another. Equivalence checking is one part of formal verification; there are other formal verification techniques, such as assertion-based verification, and I think one or two more types. So equivalence checking is one type of formal verification, and we will look at it in detail. I will introduce the topic, then we will compare it with regular dynamic verification and with static timing analysis, and we will see where equivalence checking lies, how it is useful, and what is unique about it. We will look at the applications, where and in what cases it can be applied. Then we will look at Formality, a tool that performs this function: we will take Formality as the example and I will go through the Formality flow. There is one more very famous equivalence-checking tool, and both tools are very good for formal verification; when you work in your labs, whatever tool you have access to, you can work on that. If you understand the Formality flow, the commands in the other tool are different, but flow-wise they are both similar. And then we will look at the summary.
So, the definition of formal verification: here we are looking at logic equivalence checking. For the scope of this course, and even industry-wise, you can safely say that the terms formal verification and logic equivalence checking go hand in hand, although formal verification is much more than that. The idea is that it verifies logic equivalence. Formal verification is a process which always needs two designs: on both sides the design can be either in RTL form or in netlist form, and the tool compares these two designs for functional equivalence. It works at the Boolean level, in the sense that it will only check Boolean equivalence; there is no scope for checking timing. So on one side there is a design which you consider to be golden, which can be either RTL or netlist; on the other side you again have a design which is either RTL or netlist, and formal verification can tell you whether the two are functionally equivalent or not.

Now the advantages. It does not need test vectors to prove functional correctness. This is an advantage in the sense that in dynamic simulation, let us say you are verifying a simple FSM: in dynamic verification you need to write test vectors and make sure that all the states, all possible transitions, are covered, to confirm that your design is functionally correct. Formal verification does not need any kind of stimulus, it does not need any test vectors, and it is very, very fast compared to dynamic simulation; and when it finds that the two designs differ, it will even generate failing vectors for you.

One side the design is called golden, or reference; the other side is called the implementation. Let us differentiate the two terms: the golden or reference design is the design which you consider to be correct, and you are verifying the implementation design against it. So the reference is your reference point, and the implementation is what you want to verify.

Now, formal verification does not replace dynamic simulation, obviously, because in the first place the RTL has to be verified with some kind of vectors, whether random or directed or deterministic; there has to be some testing that verifies that the RTL matches the specification. Neither STA nor formal verification is trying to replace that. But once you are satisfied that, yes, my RTL is correct, then all other versions of this RTL can be verified against the reference. What could those versions be? A synthesized netlist, or a post-layout netlist. And this process of verification is comparatively much faster. So let us go forward to the ASIC design flow and see where formal verification lies.

One more example here: since formal verification is aimed at comparing Boolean, functional equivalence, it can even compare netlists in different technologies or with different hierarchical structures. For example, say you have one design synthesized in one particular technology, say Artisan 130 nanometer; on the other hand you have a design with a different clock frequency, synthesized with Artisan 180 nanometer. The base design is the same for both netlists, but the frequency and the technology are different. Since formal verification works at a functional level, it can compare these two netlists and tell you whether they are functionally equivalent or not. There are no timing checks, so frequency is of no consideration. Even if the hierarchy in one of the netlists is different from the hierarchy in the golden netlist or RTL, it can still verify it. That is the advantage: it does not depend on hierarchy, and it does not depend on what type of cells are used for implementing the netlist. The only thing it needs is the functionality of each and every library cell, which you obviously have: in this case, for the Artisan 130-nanometer netlist you read the 130-nanometer library into the tool, and for the other netlist you read the 180-nanometer library into the tool. This way the tool knows the functionality of each and every standard cell, and it can verify whether the netlist on the left-hand side is equivalent to the netlist on the right-hand side. So always remember: there is one golden or reference design, and on the other hand there is always an implementation design.

So this is the typical ASIC design flow, and we will see how to insert formal verification into it, and how formal verification has become the de facto standard for comparing RTL versus netlist and netlist versus netlist. First, you have the RTL design, and the RTL design has to be checked using some kind of stimulus. It is not that dynamic verification is gone; there will be some kind of dynamic verification here to make sure that the RTL meets the specification. Now let us say the RTL meets the specification, but your manager tells you: yes, the RTL is functionally correct, but the coding style is bad, or you have to rename some signals, follow good coding styles, or change some hierarchy. The functionality is still the same, you are not changing the functionality, but you are improving the readability of your code, maybe improving the logical partitioning of your code, and so on. You can work on the RTL, and then use a formal verification tool like Formality to make sure that the modified RTL matches the RTL that passed your stimulus. If the RTL is a medium-size design, checking whether the two are logically equivalent through Formality will take about an hour or so; for a smaller design it will be even quicker. For a big design, complete dynamic verification, running all the test cases, can take up to a few days, but running Formality between the RTL and the modified RTL will take a maximum of two or three hours, depending on the size of the design. Formality runs of more than a few hours mean that you are running a very, very big and complex design; usually Formality runs take two to four hours for a decent-size design. That is your runtime saving: you do not need to rerun all the test cases on the modified RTL; you can just run Formality and be done with it.

Then this modified RTL is taken through synthesis, which is the normal process for getting a netlist. Now in synthesis you want to verify whether the design was synthesized correctly. This is not primarily to check whether the synthesis tool has problems — although yes, in some cases the synthesis tool might have a bug and you might discover it through Formality — the idea there is to make sure that your synthesis constraints are correct.
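To give a feel for how small this RTL-versus-modified-RTL check can be, an fm_shell run might look like the following. This is only an illustrative sketch: the file and design names are invented, and exact command options may differ across Formality versions.

```tcl
# Reference: the RTL that passed dynamic verification (hypothetical file names)
read_verilog -container r top_orig.v
set_top r:/WORK/top

# Implementation: the cleaned-up / renamed RTL
read_verilog -container i top_modified.v
set_top i:/WORK/top

# Runs matching and then verification; no test vectors are needed
verify
```

Because both sides are RTL with no instantiated cells, no standard cell library needs to be read here.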
What if by mistake you set a logic 0 on an input port? DC will optimize away all the logic that input port is driving; DC will remove that logic completely, and you can discover such problems in Formality. Many a time the DC scripts are long and the constraints are long, and it is not easy to debug them, but such cases can be easily caught by Formality. Or say you have set some variable in synthesis which optimizes away some of your registers; then also Formality can point out: I cannot find in the netlist these registers which I was able to find in the RTL. So all such problems that might have crept in because of user input can be caught by formal equivalence. So this is the idea: first you can run Formality between RTL and RTL; the next step is between RTL and netlist; and third, the synthesized netlist goes through a lot of changes in the back end, during PNR — there is clock tree synthesis, there is placement, there is routing, and so many things that go on. The best tool to make sure that there is no logic change when you go from the synthesized netlist to the post-layout netlist is formal equivalence. So you are making sure that your back-end flow is good and is not introducing any design change. In the industry, at every step that modifies the netlist in any manner, Formality, or some kind of formal equivalence check, is usually run to make sure that there is no design change; this is the de facto standard nowadays. The bottom line is: make sure that no logical changes are made. So once you do dynamic verification at the first stage, after that, at all these stages you do not need any further dynamic simulation; you can verify everything using formal equivalence.
So, if you look at the post-layout netlist, the post-PNR netlist: ideally you want to check that its functionality is correct. We check it using formal equivalence, by making sure that this netlist is equivalent to the gate-level synthesized netlist; you have already made sure that the synthesized netlist is equivalent to the RTL. So A equals B, B equals C, and therefore A equals C. The timing you check using STA. So at the post-layout stage, using an STA tool like PrimeTime and a formal equivalence tool like Formality, you can make sure that your design is good for sign-off from both the functional and the timing point of view. This is why these two tools are now necessary to sign off any chip. So, let us look at the flow. Equivalence checking is a branch of static verification. What does static verification mean? It means that no vectors are applied; it is independent of any vectors. It employs formal mathematical techniques to prove that two versions of a design are, or are not, equivalent. The flow is: read in the designs, then matching has to happen. The matching process makes sure that all the compare points — there is something called a compare point, which we will look at shortly — are matched between the two designs, the reference and the implementation. Only after matching has completed does it verify. Verification is the part where it compares the logic; and then, if any register or any output shows up as functionally not equivalent, the tool also provides you ways to debug the problem. The matching and verification stages are the most important; those are the most impacted by design transformations. We will look at what design transformations are.
So, going ahead, all our discussion will be focused on one tool, Synopsys Formality. I will be using material from the Synopsys University courseware, and again, all the concepts discussed here are also applicable to any other formal equivalence tool. Formality is an equivalence-checking solution that uses static techniques; we have already seen that. It supports, out of the box, all DC Ultra optimizations. Now compile_ultra, DC Ultra, is very, very aggressive in terms of optimization; it employs a few techniques that change the design in some manner, and you might have problems when doing formal verification. We will look at one such example; first let us look at compare points, then I will give you examples of cases where some of the DC Ultra optimizations tend to cause problems. But there is a sort of communication channel between Design Compiler and Formality, through which Design Compiler can tell Formality: OK, I made this aggressive change. Formality then knows that such a thing has taken place, and it will help you make sure that the design is still functionally equivalent. It supports verification of power-up and power-down states: Formality supports the low-power, multi-voltage flow. The multi-voltage flow has a lot more complexity compared to the single-voltage flow; the low-power multi-voltage flow adds a lot of additional cells during synthesis, for example level-shifter cells or retention cells, and in that case you have to make sure that the addition of these retention cells or level-shifter cells does not modify the functionality. Formality supports that. The group of Synopsys tools that aid in design implementation is called the Galaxy platform, which includes Design Compiler for synthesis, IC Compiler for PNR, and Formality for formal equivalence.
So, Formality is part of the Galaxy platform from Synopsys. Capabilities: these are a couple of marketing slides which list the capabilities of Formality. Exhaustive verification without test vectors — we have seen that. Verification of all Design Compiler default optimizations — this is a very important point. Now, what happens is, suppose as an engineer I am using a third-party equivalence tool, a tool which is not Formality, but I am synthesizing with Design Compiler's compile_ultra command. Consider a case where I want to do re-timing. What re-timing does is move the flops around within the combinational logic cloud. In the RTL, the flops are placed at the logical boundaries you coded, but if we have violations in synthesis, DC can move these flops around for you, to borrow some timing slack from the next or the earlier stage which has positive slack. Now the problem with re-timing is that although at the port level the designs are functionally equivalent, if you consider the flops, the D-pin of a flop in the RTL is no longer equivalent to the D-pin of the re-timed flop in the netlist. And Formality, or any equivalence tool, will check the functionality at each and every flop's D-pin. So in this case, using a third-party tool which does not understand re-timing, the two points will be reported as not functionally equivalent. Why? Because in the RTL the data pin of that particular flop has some value, but in the netlist, because of the re-timing, because of the movement of combinational logic, the functionality at that pin is different. Now, what many engineers do when they cannot pass formal equivalence between the RTL and the netlist is simply avoid such optimization steps.
I have also done that in my career: we tend to avoid aggressive optimization steps which are difficult to verify using a formal equivalence tool. So it is very, very important to have a good formal equivalence tool that can verify all the aggressive optimization techniques applied by DC Ultra, like re-timing (which is very popular nowadays) or sequential output inversion; all these aggressive techniques can be verified by a good formal equivalence tool, and this is where Formality is very strong. This is what the industry-leading tools are strong at: they are able to verify even the most complex optimization techniques applied by Design Compiler. So it mentions here that Formality proves functional correctness of re-timing, complex datapath optimizations, and phase inversion. Design Compiler has a switch where, by default, it will try to optimize the design by inverting your sequential outputs: the output of a register can be inverted to enable more optimization. Low-power implementation too — a lot of advanced features Formality supports. Then, if you are short on compute resources, you can use the distributed verification technique and split the load across different CPUs using Formality. Since it is very tightly integrated with Design Compiler, Formality helps you reduce the user setup with automated guided setup, with which you can get going very quickly; you do not need to write many commands by hand. It has a good GUI; the GUI in Formality is very, very good, and it helps a lot in debugging. Then it has all the language support that you need: SystemVerilog, VHDL, and Verilog. And Formality also includes a tool called ESP.
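On the synthesis side, the usual way to keep such aggressive optimizations verifiable is to have DC record its changes in a guidance file while it optimizes. The dc_shell fragment below is a rough sketch of that idea; the file names are placeholders, and exact options depend on your DC version:

```tcl
# Hypothetical dc_shell fragment (file names are made up for illustration)
set_svf top.svf                 ;# record optimization guidance for Formality
read_verilog top.v
current_design top
link
source constraints.tcl          ;# clocks, I/O delays, etc.
compile_ultra -retime           ;# allow DC to move registers across logic
write -format verilog -hierarchy -output top_netlist.v
```

With the SVF recorded, the equivalence tool can follow the register movement instead of flagging the re-timed flops as mismatches.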
So, there was a tool called ESP, acquired by Synopsys, and this tool can work at the transistor level. We will not go into it in detail — it is outside the scope of this course — but it is a very unique tool, and it addresses a very niche segment: memories. Memories are built separately; they are not synthesized. But RTL code is written for them for simulation, which is called a behavioral model, and you want to make sure that the behavioral model in fact matches the netlist of the memory. The netlist of the memory in this case is a transistor-level netlist, and this tool is able to verify whether the behavioral model, which is written separately, is functionally equivalent to the transistor-level memory. So this is an example of ESP; it is good to know that such a tool exists, and if you go into industry and work on full-custom design, it is a good tool to try. Now, this next marketing slide tells us that Formality's algorithms are pretty good at fast verification; they are good on the performance side, they are less compute-intensive, and so on. A lot of DC's design optimization techniques, like resource sharing, re-timing, and restructuring, are easily verified. And this figure is a very good one; it shows what all can be targeted by Formality, whatever digital design you have. "Logic" here means a synthesized RTL design; "datapath" is nothing but RTL which has a lot of, let us say, adders and multipliers; and the full-custom part — the RAM, ROM, and any other full-custom logic — can be verified by Formality ESP, which, as I told you before, compares the behavioral model against the transistor level.
Now, let us look at the key concepts of Formality. On one hand you have the reference, or golden, design: this is the design against which Formality tests for equivalence, and it is assumed to have passed functional verification — the engineer has to make sure the design passed functional verification before calling it golden. On the other hand you have the implementation design: this is the modified design, usually derived from the reference, whether it is a modified RTL, a synthesized netlist, or a post-layout netlist. One example is a gate-level implementation, that is, a synthesized netlist. Then there is a term used in Formality called containers: a container is a self-contained space into which Formality reads a design. Formality reads two designs, the reference design and the implementation design; for the reference design Formality forms a reference container, and for the implementation design it forms an implementation container. So you have different containers for reference and implementation. Apart from the designs, the containers also hold the libraries. Let us say you are reading RTL: by reading RTL you might not need a standard cell library, because if your RTL does not contain any hand-instantiated cells, then in all probability you will not need standard cell libraries; so the reference container will only contain the RTL. But the implementation container, which holds a synthesized netlist, will also contain the standard cell libraries that were used to synthesize the design. So a container is just a superset: it can contain the design and it can also contain the libraries. So this is the ASIC verification flow using Formality.
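To make the container idea concrete, an fm_shell fragment for an RTL-versus-netlist run might look like this. The design, library, and file names are invented for illustration; check your tool documentation for exact options:

```tcl
# Reference side: RTL only, no standard cell library needed
read_verilog -container r top.v
set_top r:/WORK/top

# Implementation side: the library the netlist was mapped to, plus the netlist
read_db      -container i artisan130.db
read_verilog -container i top_netlist.v
set_top i:/WORK/top
```

Note how the implementation container gets both the netlist and the .db library, while the reference container holds only the RTL — exactly the asymmetry described above.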
So, again: the RTL is verified functionally using simulation; it goes through synthesis using DC Ultra and you get the netlist. The first level of Formality you do is between the RTL and the netlist. The netlist then goes to IC Compiler for the back end, and you also run Formality between the post-layout netlist and the synthesized netlist. You could also run Formality between the post-layout netlist and the RTL, but it is recommended to do it as shown here. Why? Because in terms of naming convention and structure, the synthesized gate-level netlist is much closer to the post-layout netlist than the RTL is, and if the two designs are structurally similar, Formality will take less time to verify them. Since you have already verified that the RTL matches the synthesized netlist, for the post-layout netlist use the synthesized gate-level netlist as golden; do not use the RTL. You can use the RTL, but you will probably have to put in more effort to verify the design. So this is the idea behind Formality: design A, design B, any design process that modifies the design, whether RTL or netlist, Formality can check the equivalence. And after Formality proves the equivalence of an implementation design to a known reference, that implementation design can itself be established as a reference. So for RTL to synthesized netlist, the RTL is the reference design and the netlist is the implementation design; but for netlist to netlist, the synthesized netlist now becomes the reference design and the post-layout netlist becomes the implementation design. It is a logical inference, of course: once you have R equal to I, that I can serve as the next R. Now, let us look at the formal verification components. Similar to the way PrimeTime or Design Compiler breaks your design into timing paths, Formality breaks the design into compare points. What are compare points?
Compare points are the design nodes at which the functionality is compared: primary outputs, internal registers (meaning the D-pin, the input of the register), inputs of black boxes (we will see what black boxes are next), and nets driven by multiple drivers. Why these points? When it comes to RTL-to-gate or netlist-to-netlist equivalence, what goes through the most restructuring is the combinational logic; you need to understand that. Take the RTL synthesis process: synthesis will first elaborate the design, map your registers to technology registers, and map your combinational logic to gates; and during optimization, if you leave aside re-timing and sequential output inversion, it will make sure that the functionality of the logic cone at the D-pin of a register remains the same between RTL and netlist, and it will optimize within that combinational logic — that is where most of the optimization happens. Sequential optimization, in most cases, will only remove a register if it is unused or redundant; apart from re-timing and sequential output inversion, synthesis will not change the functionality at the register input. This is why, in any formal verification technique, the compare points are the register D-pins, the primary outputs, and the inputs of black boxes. Black boxes are modules or components for which Formality does not know the functionality, or for which we tell Formality: do not worry about what is inside.
So, we set them as black boxes; by setting something as a black box we are telling Formality to verify the functionality only up to the input pins of the black box — memories, for example. Then there are nets driven by multiple drivers. A logic cone is the term used for the group of combinational logic, the combinational cloud, that drives a compare point. So two things are important here: the compare point and the logic cone; some figures will elaborate on this. Let us see this figure. During the read process, the reference and implementation designs are automatically segmented into manageable sections. Why? Formality cannot read the complete design and prove all the outputs in one shot; that is not possible from a CPU-resources point of view or from an algorithmic point of view. So it breaks the design into chunks of logic, which it calls logic cones. Logic cones are groups of logic bordered by registers, ports, or black boxes, and the output border of a logic cone is known as a compare point. Let us take this case. It starts at an input port, goes through the combinational cloud, and hits the first DFF; everything on the left side is a logic cone. So this part here is the logic cone, and the compare point is this DFF. It then starts again at the output of the DFF, traverses through combinational logic, and hits, let us say, a black box input pin; for this compare point, the black box input pin, this is the logic cone. Any logic that comes in via an input port, a black box output pin, or a flop output pin — all of this cloud forms the logic cone for this particular compare point.
So, this is the way it works, and please note, a cone can be affected by multiple inputs: for example, this black box input is being affected by the DFF, by this input port, by this other input port, by this black box, and so on. So at every compare point there is a cloud of combinational logic that is driven by a number of flip-flops, input ports, or black box output pins; this total cloud of combinational logic is known as a logic cone. Now, in both the reference design and the implementation design, Formality will break the design into logic cones and compare points, and then it will try to match the names of the compare points in the reference against the names of the compare points in the implementation. Why is this matching required? A lot of design processes change names: synthesis changes names — we have seen that register names get changed — and if you ungroup any module, the hierarchy is different and again the names change. So a lot of design processes like synthesis or PNR (PNR usually does not change instance names, but these processes often ungroup some hierarchy or form a separate group) will change the hierarchy of an instance, or change the instance name. This is why Formality must first match the compare points.
So, you have compare points in the RTL and compare points in the netlist. Let us say there are a thousand compare points in the RTL, and again some number of compare points in the netlist. Each compare point in the RTL must first be matched to a compare point in the netlist before the tool can check whether the logic cones are equivalent. This process is called matching. Both non-function (name-based) and function-based matching methods are deployed to map compare points; even if you change the names completely, say by using the change_names command in DC, Formality will still try to match the points using functional algorithms. Name-based matching is the non-function matching; function-based matching uses algorithms based on the functionality of the compare point. So the matching cycle is very important before the comparison takes place.

Then comes the verification cycle. After the compare points have been matched, the next step is to make sure that the logic cones are equivalent. There are many different algorithms used by Formality: BDDs (binary decision diagrams), ATPG-based algorithms (automatic test pattern generation), and SAT (Boolean satisfiability). There are a lot of papers published on these; you can read up on that material. Formality uses a combination of these algorithms to prove that the cones are functionally equivalent.

Then comes the debug cycle. Once the verification step has completed, for failing points the tool will generate vectors that demonstrate that the two cones are not equivalent. For example, say you have a problem in the design and two cones are not equivalent: it will generate a vector listing all the input conditions — input ports or data values — apply those values, and show you that at the compare point the resulting values differ, say 0 in one design and 1 in the other. So it gives you a vector with which you can trace where the problem is. The logic cones can be huge; for a datapath the logic cones are usually very, very big, and it is not easy to pinpoint the problem unless you have a vector, so the vectors help a lot.

So this is the flow. Environment setup: this is where you set the variables; there are variables which control a lot of things in Formality. You create containers by reading the designs: first create the container, then load the libraries and the design into that particular container. There are two containers, one reference and one implementation; you read the reference design into the reference container and the implementation design into the implementation container. Then you run verify; verify will first do matching and then verification. You view the results, and if you find any non-equivalent points, you debug. As for interfaces: Formality can read Verilog, it can read .lib synthesis libraries, and it can read SystemVerilog and VHDL as well.
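The setup → read → match → verify → debug loop just described can be sketched as one fm_shell script. Treat this as an illustrative outline — the file names are made up, and exact command options vary by Formality version:

```tcl
# 1. Environment setup (guidance file from synthesis, if available)
set_svf top.svf

# 2. Read designs into their containers
read_verilog -container r top.v            ;# reference: RTL
set_top r:/WORK/top
read_db      -container i artisan130.db    ;# implementation: library + netlist
read_verilog -container i top_netlist.v
set_top i:/WORK/top

# 3. Match compare points, then verify the logic cones
match
report_unmatched_points

verify
report_failing_points

# 4. Debug aids: diagnose failing points, save the session for later
analyze_points -failing
save_session top_fm_session
```

In practice you can also call verify directly, since it runs matching first if matching has not been done; splitting the steps as above makes it easier to inspect unmatched points before spending time on verification.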
So for standard cells, let us say you are using a .lib for synthesis; you can use the same .lib for Formality, because Formality again has to resolve the cells. For any tool, even if you are doing simulation at the gate level, you at least need to read the standard-cell library. You can read the Verilog models of the cells, or you can read the .lib; either one works, because both contain the functionality and Formality understands both. For guided setup it needs a guidance file, the SVF (I have not used VSDC, but I use SVF, so I will explain that). Design Compiler, compile_ultra in particular, employs a lot of optimization techniques and will many times change names of registers and nets. DC can write out an SVF file, the Formality guide file, which lists all this information, and this can be read by Formality to understand the design better. The outputs are failing patterns: for failing cases it will write out the vectors. Formality reports list all the compare points that are non-equivalent and those that are equivalent. You can save a session, just like in PrimeTime and Design Compiler, and then restore the session later and debug further; you can save the containers for future use as well. And these are the supported platforms. The Formality window is a tab-based interface: for example, there are tabs for reference, implementation, debug, verify and so on, and it also supports a Tcl command interface, the same as what is supported by Design Compiler and PrimeTime.
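On the Design Compiler side, the SVF guidance file is requested before compilation. Here is a sketch of the relevant dc_shell commands; the file names are placeholders, and the surrounding constraints are omitted:

```tcl
# dc_shell: ask DC to record its optimizations (renaming, register
# merging, retiming moves) in an SVF file for Formality to read later
set_svf top.svf

read_verilog top.v
current_design top
link

compile_ultra    ;# optimizations performed here get logged into top.svf

# Write the netlist that Formality will later read as the implementation
write -format verilog -hierarchy -output top_netlist.v
```

The same `set_svf top.svf` command is then issued in Formality before reading the designs, so the tool can replay the recorded transformations during matching.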
So guided setup is a very interesting feature, and very useful if you are working with compile_ultra; it is usually not needed with plain compile, because compile does not do that many aggressive optimizations. Formality can account for synthesis optimizations through the use of a guided setup file automatically generated by Design Compiler: there are commands, you set the SVF file name and so on, and when compile_ultra starts working on an optimization it writes the details of that optimization into this file. It is a binary file. It includes information about name changes and register merging. The name changes alone might not be critical, because Formality can usually match compare points even across a name change, but register merging is very important. Why? Because register merging reduces the register count: if two or more registers are merged together, the number of registers in the implementation becomes fewer than the number present in the RTL. So register merging is very, very important. The same goes for multiplier architecture. If you go through the DesignWare documentation you will find that adders and multipliers come in two kinds: the first is non-pipelined, meaning 100% combinational; the second kind has pipeline registers inside. Now this pipeline is not fixed at a particular place. Let us say you have a pipelined multiplier in your RTL: DC will use this multiplier and then shift the pipeline register according to the timing constraints. That means, depending on whether the timing violation is on the left-hand side or the right-hand side of the pipeline stage, and depending on the slack available on the other side, it will move the pipeline register. This is very similar to retiming. Such cases are very difficult to verify if you do not give any guidance to Formality, because some compare points now have different logic cones: the RTL has a different
logic cone, or at least a differently shaped logic cone. compile_ultra will record these transformations in the SVF file. This way, based on the guided setup, Formality can use the most efficient algorithms during matching and verification; it improves performance, and over and above that it reduces the headache of debugging such complex failures. Now, in the case of retiming: if retiming is enabled and you do not read the SVF file, there will almost certainly be an equivalence failure. Think about when retiming happens. Retiming is mainly there to solve timing problems, and the timing problem mostly occurs when there is a big block of combinational logic. When there is a big block of combinational logic, the logic cone will be very big, and debugging such problems, where the logic cones are huge, can take you days if you do not use guided setup. This is where guided setup is very useful; that is what the documentation says too: it is required when verifying a netlist containing retiming or register optimizations, and the SVF is most important when you have retiming enabled. So how do you use the automated setup file? In Design Compiler you use the set_svf command with a file name, and DC will automatically write out the SVF file for you. To use it in Formality, you again point the tool at the path of this file. So file.svf is an output from Design Compiler; the guided setup is an output from Design Compiler and an input to Formality. Now, when you load the designs, Formality supports many formats; for all practical purposes you will probably be using Verilog, so you would use read_verilog or read_sverilog. During the read command you
have to give -r or -i to tell Formality whether this is the reference design or the implementation design. Now, this is the read-design process flow. read_verilog -r means read as reference; read_db reads the standard-cell library; and then you set the top-level design. Whether you are reading multiple RTL files or a single netlist file that has multiple modules, either you set the top-level design name yourself with set_top, or you use set_top -auto and Formality will automatically figure out the top-level design name. This step is very important before switching from the reference to the implementation: set_top. Then you read the implementation; the only thing that changes is the -i option. You read the same library if you are using the same library across both designs, and again set_top. So: read technology, read reference, set_top; read technology again, read implementation, set the top level. If your RTL does not use any cells from the technology library, the library step is optional; that applies only when the reference design is RTL and has no hand-instantiated cells. You might be using a memory; in that case you will also read the memory model. So these are the commands: read_verilog, read_db, set_top. Again, pure RTL does not require any component library. Now let me show you the GUI. As soon as you read the reference, if it is read correctly and the top level is set correctly, a green check mark appears here, showing you that you have read the reference correctly; the same goes for the implementation. These buttons tell you at which step of the flow you are and whether that step has completed correctly. You read the implementation design the same way, giving the -i option. There is a warning here: do not read the implementation design until you have run set_top for the reference. Now, OK, the reference and
implementation designs are ready for equivalence checking, because there is a check mark in both places, so you move ahead. Then we perform the setup. Setup is used to speed up verification and to prevent unexpected failures. If you are not using auto setup, you might have to deal with a lot of variables; just like Design Compiler, Formality has many variables which control the engine. See, where Formality compares RTL against a netlist, it needs to internally synthesize the RTL itself, because it needs to compare at the gate level: it needs to understand all the synthesis semantics to make sure that the Boolean function of the RTL matches the Boolean function of the netlist. When a point is found not functionally equivalent, it will show you the logic cone, and within that logic cone it will show gates on the RTL side as well; so it is internally synthesizing, although the idea is not to get a low area the way Design Compiler does. The Formality synthesis engine is aimed only at providing a gate-level representation, optimized for neither timing nor area; it is just for showing you the functionality. So the synthesis engine inside Formality is entirely different from the synthesis engine inside Design Compiler. But there are variables which control certain behaviors; for example, there is one variable which controls the assumed value of the registers at initialization. These variables you might have to tweak before your setup is ready, so you will have to spend some time in setup to make sure that the verification goes through, and this is an iterative process: it might happen that you did not set the correct value of some variable, you run and find some failures, then you realize, OK, I should be setting this value, so you go back to the setup part, set that particular variable, and run again. For example, clock gating: the clock-gating step in synthesis will add a latch and an AND gate, or an
integrated clock-gating (ICG) cell, and this cell sits in the clock path. In the RTL the enable gates the data pin of the register, not the clock, but in the netlist, after clock-gating insertion, the gating moves from the data pin to the clock pin. So now, if you compare the clock connection or the data connection for this register, they do not look equivalent to the RTL, because of the clock gating. You can take care of this in the setup: you tell Formality that you have clock gating in your design, so please make sure that all such cases of clock gating are verified properly, and there is one variable which controls that. This is part of the setup. The best thing you can do is use the Synopsys auto setup mode: it will read the SVF, read the synthesis options, and configure the setup from there; this is the quickest thing you could do. Now, black boxes. A black box represents logic whose function is unknown; it can also contain logic that you simply do not want to verify. One very famous example is memory. For a memory you might have a Verilog behavioral model or a .lib; you do not want Formality to go inside the memory. Let us say the memory is a full-custom design with transistors inside: you do not want to go inside this memory, but you do want to make sure that the logic driving its inputs is verified. For any black box, whether implicit or explicit, Formality will create compare points at the inputs of the black box and verify those. So black-boxing is commonly used for blocks that are not synthesizable, but you need some kind of model for them: either a .lib entry without the function attribute, or an empty Verilog module where you only define the ports. Formality needs to know the inputs and outputs of the black box in order to make the compare points.
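The setup step discussed above boils down to a few fm_shell settings. This is an illustrative sketch: the right values depend on how clock gating was inserted in your flow, so treat them as assumptions to check against your tool version.

```tcl
# Quickest option: let Formality configure itself from the SVF and
# the recorded synthesis options
set synopsys_auto_setup true

# Alternatively, configure clock-gating handling explicitly: tell
# Formality that gating logic on the clock path of the implementation
# is expected and should be verified as such ("low" is one of the
# documented modes; pick the one matching your gating style)
set verification_clock_gate_hold_mode low
```

With auto setup enabled, many of the individual variable tweaks the lecture mentions (initialization values, clock gating, and so on) are derived from the SVF instead of being set by hand.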
So, you need to give the port definitions, and they can come from a .lib or from a Verilog module. How do we mark a design as a black box? There is a variable for this; it says that whenever Formality encounters any unresolved reference, it makes it a black box. For example, let us say you are instantiating a cell for which you did not provide any model: it is unresolved, so Formality will make it a black box. This is not recommended, because for this kind of automatic black box nothing tells Formality whether the ports on that instance are inputs or outputs. An empty design is better, because an empty design at least tells Formality what the input and output ports are. So if you have a black box but no definition, it is good practice to make an empty module and read it into Formality; it will then be treated as a black box with known port directions. Second, you can use the hdlin_interface_only variable to mark cells that you want treated strictly as black boxes, or you can use the set_black_box command; there are multiple ways to mark something as a black box. Then there is the matter of matching the compare points. As I said before, prior to comparing the logic cones, Formality needs to match the compare points between the reference and the netlist. It applies the following algorithms in the given order. First it goes for exact name matching, which is obviously the most straightforward technique: if there is no name change between reference and implementation, exact names will match the compare points.
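The black-box options just listed can be sketched as follows. The macro name `ram_256x32` is hypothetical, and the exact variable/command spellings should be checked against your Formality version:

```tcl
# Option 1 (not recommended): silently black-box anything unresolved;
# port directions on such boxes are unknown to the tool
set hdlin_unresolved_modules black_box

# Option 2: read the named design(s) for their ports only, ignoring
# any contents, so they become black boxes with known directions
set hdlin_interface_only "ram_256x32"

# Option 3: explicitly mark an already-read design as a black box,
# in both the reference and the implementation containers
set_black_box r:/WORK/ram_256x32
set_black_box i:/WORK/ram_256x32
```

Whichever option is used, Formality then creates compare points at the black box's input pins, as described above, so the logic driving the macro still gets verified.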
Then it goes for further name-based methods. Name filtering is again name-based: not exact name matching, but tolerant of, say, a case change or an added underscore, as happens for registers. Then it goes for topological equivalence, then signature analysis (we will see what that is), then compare-point matching based on net names, which is again name-based but uses net names rather than compare-point names. When matching completes, it gives you a pop-up box saying matching is complete, telling you how many points are matched and how many are not. Exact name matching maps names both case-sensitively and case-insensitively; for example, if the reference has capital A, B, capital C and the implementation has small a and b, it will still match without any problem, and you do not need to do anything extra. For name filtering, there is a variable which holds the name-match filter characters. Many times the tools will add an underscore or change a hierarchy separator; if you ungroup, for instance, DC will replace the separator with an underscore. See this example: the reference has a slash here, which means there is a separate hierarchy, but the implementation does not, which means DC has ungrouped it, whether explicitly or implicitly, and put an underscore there instead. By default, all these characters are filtered, so the slash is filtered and the name is matched to the underscore version using name filtering. Formality is capable of doing that, and again you do not have to do anything special. Then comes the topological name matching.
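The filter-character behavior is controlled by a single variable. A sketch, with the caveat that the variable name and its default set should be confirmed in your tool's documentation; the value shown is illustrative, not the actual default:

```tcl
# Characters treated as interchangeable separators during name-based
# matching, so e.g. "u1/q_reg" in the reference can match "u1_q_reg"
# in an ungrouped netlist
set name_match_filter_chars {_/}
```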
So first the exact name matching and name filtering techniques are used, and in most cases more than 90 percent of the compare points will be matched, because Design Compiler, or any other tool, does not do arbitrary renaming unless you do something particular. Then Formality attempts to match the remaining compare points by topological equivalence. What this means is that it starts comparing the logic cones: it suspects that two compare points might correspond, so it compares their cones, and if the cones are topologically equivalent, that is, they have a similar structure, it matches them. Matches can also be made through directly attached nets: if a net name is the same across implementation and reference, and Formality sees a compare point driven by a net with that same name, it will try to match on that basis. Next comes signature analysis, which is a slightly more complex thing. A signature of a compare point is built by iterative analysis: it captures the way the logic cone behaves when given input vectors. There are functional signatures, derived from random-pattern simulation: the tool drives random patterns through the cone. So the difference is that functional signatures are derived from random patterns, while topological signatures are derived from the cone topology. The topological case compares the logic cones by structure; the functional signatures compare the logic cones by driving random patterns and looking at the resulting values in the cone. Since signature analysis involves internally generating vectors, it is only practical when the number of mismatches is small.
If you have thousands of compare-point mismatches and you try to run signature analysis, it will take a large amount of time. So what people do is keep signature analysis off to start with, and later, if they find a few cases where it can help, they turn it on. So this was all about matching. You do not need to know all of this just to start working with Formality, because when you run matching, verify will do all of it for you; there is nothing like having to turn signature analysis on yourself in the basic flow. It is good to know that such things happen, but with the basic automatic method you do not need to give any special command to aid matching. Yes, there might be some points which are not matched; only in the case where matching is incomplete do you need to read more about it. But if you are using compile_ultra, Formality and the SVF, in most cases you will not need to do anything special. Now comes the verification. The command to run is verify, and after verify is done, Formality will give you a list of compare points and categorize them: succeeded means the logic cones are equivalent; failed means the logic cones are not equivalent; inconclusive means that Formality cannot conclusively say whether they are equivalent: it has neither a vector proving they are non-equivalent nor a conclusive proof that they are equivalent. This happens when the data paths are very complex or the logic cone is very big and the algorithm runs out of time.
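The verify step and the result inspection described above look roughly like this in fm_shell; the compare-point name is a placeholder, and option spellings should be checked against your version's man pages:

```tcl
verify                      ;# runs matching first, then verifies all points

# Verification is incremental: a stopped run can be resumed rather
# than started over
verify -restart

# Exclude a point with a known, explained difference from comparison
set_dont_verify_points r:/WORK/top/dbg_reg

# Summaries of the categorized results
report_failing_points
report_aborted_points
report_unmatched_points
```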
So, this is the report, a typical verification report: matched compare points, passing, failing. Here it categorizes the compare points as black-box pin, loop, black-box net, cut point (let us not go into all of these), port, DFF, latch; the important ones are black-box pin, port, DFF and latch. It tells us there are two latches which are not equivalent. What is not compared? Constant registers are not compared, which is expected. If you are doing RTL versus netlist, Design Compiler will have optimized away the constant registers, so a register that was optimized away by synthesis will still be present in the RTL, the reference design, but will be absent from the netlist; obviously it will not be compared. Don't-verify points are ones you might have marked yourself; they are not verified. Unread points are not verified by default and do not affect anything else; what is unread here? OK, some latches are unread; let us see further. Verification is incremental: it can continue after being stopped. You can stop verification at some point in time and then say verify -restart. You can even verify a single compare point by giving it as an option; there is a lot of flexibility in the verify command, and you can also mark some points as don't-verify. Then let us look at the states. Passing means the two compare points are functionally equivalent. Failing means they are not equivalent, and Formality will have generated a vector for you to confirm it. Aborted means that it cannot conclusively say whether they are equivalent or not, maybe because the compare point is too difficult to verify, the cone being huge. Unverified represents a point which has not been verified yet, or not run because there was some error
that prevented verification from running on that point, so you need to debug it further. Now let us come to debugging. Obviously, debugging only makes full sense once you have some hands-on experience with Formality, but these slides will help you in the future as well. In case the equivalence check fails, Formality has a command called diagnose. There are two main verification results that require debugging: one is failing, the other is aborted. If something failed, or if a point is aborted, meaning it is not conclusively equivalent, you can debug: you run the diagnose command and let Formality help you analyze it. This is the flow chart: run diagnosis on failing points; if the problem is identified, first choose the point to debug, display the pattern window, display the logic cone. We also have a GUI; let us see it there, OK, let us move on. When verification reports that the designs are not equivalent, the failure can also be due to an incorrect setup, like the clock-gating case I described: if you have not done anything special for clock gating, Formality will report a number of latches as extra, not matched, plus all the flops affected by clock gating will be reported as non-equivalent. This does not mean the design is non-equivalent; it means there is some problem in the setup. Or you can indeed have a genuine logical difference. You have to check the warnings and information messages in the log file; this is very important. Before you debug the failures, check the transcript window, that is, the log window in the GUI, for black boxes and for simulation/synthesis mismatch warnings. The simulation part matters because, as we have seen in RTL coding, if you do not write good RTL you can have a simulation/synthesis mismatch, and such a mismatch can also affect the Formality result. Check for unmatched compare points: any unmatched compare points can also lead to a compare failure. So you have to resolve all of these; you need to
know what the unmatched compare points are and whether they affect the failing compare points or not, and check whether the SVF guidance file was read successfully. These are the basic steps you perform before even going on to debug an individual failing point. Now, you can have these situations. If you have a large number of unmatched points, you should not actually run verification; you should resolve the matching first. When you have a very small number of unmatched points, which is usual in a well-set-up design (what will be unmatched is typically some constant registers; they will always come up as unmatched), then you look at whether there are only a small number of failing points and a small number of aborted points. If that is the case, there may well be a genuine logical difference. If there is a large number of aborted points, you might have a setup problem. If you have no failing points but some aborted points, it points to a very complex circuit or to combinational loops. Mostly, when you are doing Formality for the first time without any experience, you will face these kinds of cases, where there are setup issues. Usually, if the Design Compiler version and the Formality version are stable and are being used by different design groups in your team, then first debug the setup: first debug the variables, check that you have set things correctly for clock gating, that you have read the SVF file correctly; debug all such things first before going into the logic cone itself. And when you have verified that your setup is correct, you have a small number of failures, and you suspect a genuine logical difference, then you go to the diagnose part. The analyze_points command examines the failing points; you can say analyze_points -all, and
it will generate a list of suspected causes in the setup, or it will tell you that there is a genuine logical failure; so analyze_points helps you diagnose the failures the first time through. This is the debug window; it shows you the failing points. The left-hand side is the reference, the right-hand side is the implementation; the type here is DFF. You isolate a failing point, right-click on it and select diagnose; a separate analyze tab appears, you select the error candidate, and then you can ask it to show the logic cone. You can select one failing point, say "analyze selected", and it will open the logic cone for you. As you will see, debugging a logic cone is not an easy job, but within the cone it will tell you where the mismatch is: here, at the data pin, there is a mismatch, there is a 0 and there is a 1, and the 0 and the 1 come from a vector that Formality generates for you. The pattern viewer, the apply-pattern feature, is used to diagnose: you can right-click and say show patterns, and it will show you the reference and implementation design inputs side by side, together with the input patterns which cause the outputs to be unequal. This is the vector that Formality generates to prove that the logic cones are not equivalent. Again, you can use the features in the GUI to debug further. So this is the pattern window. Now, this is a very interesting part: it shows you the pins affecting the reference input cone and the pins affecting the implementation cone. In many cases these can be different; if this side and that side are different, you have to first work out whether the difference is benign. In case you have clock
gating, for instance, if the left-hand side and right-hand side differ and you have clock gating, you will have some extra entries there, which is OK. The point I am trying to make is that the left-hand side and right-hand side can be different, and the differences can lead to failures, or they can be harmless, in the cases where you are able to explain them: the differences can be due to clock gating, or due to scan insertion, but you have to be sure first that these differences are not causing the failure. On this side you get the pattern, the input vector that causes the failure: here it is telling you that the reference design is loading 1 while the implementation design is loading 0. It shows you the state of the synchronous data, the input data, the clock and the synchronous load; if a pin is at a constant, it is the same on both sides, so you do not have to worry about it; you have to worry about the synchronous data, which is different. And here are vectors 1 and 2, two vectors that prove that the designs are not functionally equal. Obviously all of this will make full sense once you start working with Formality, but it is good to know these things beforehand, so that when you start running Formality you will recall what we did. You then fix the design, and once you fix it you run Formality again, and it will say succeeded: failing, non-equivalent, aborted, everything zero. So this was all about Formality. We discussed the importance of formal equivalence in the ASIC design flow: we saw the ASIC design flow, and we saw that formal verification is essential to make sure that the synthesis flow, the P&R flow, or any RTL-modification flow is not causing any functional disturbance. Formality is essential, and I am underlining the word essential: every chip design team in the industry uses formal equivalence checking, and there
are many unique uses of it. These two tools are the industry-leading tools for formal verification. Many times it requires design understanding for efficient debugging; this is the challenge. As an STA engineer or as a synthesis engineer (usually synthesis engineers are the most comfortable with Formality, because the setup is the same), many times you will have to go back to the designer and get him to fix his RTL, or get him to help you understand the non-equivalences Formality reports. The tool is essential: whichever logic equivalence tool is in your flow, Cadence Conformal LEC or Synopsys Formality, it is essential. It is most effective when doing ECOs. What are ECOs? Let us say you have done synthesis: a decent-sized, big design will take about a day for synthesis, a few more days for making sure that all the synthesis reports are clean; then it goes to the back-end flow, which will take multiple days, say 4 to 7 days, to get the routing completed; and then the back-end engineer plus the front-end engineer together will start fixing the timing using some tool or flow, and after some more days you will be very close to sign-off. During all this time, functional verification keeps going on; the fact is that functional verification is never really complete. And then there might be a specification change, or some new bug that you discover, which requires an RTL change. Now what do you do? If you start from synthesis again, it takes 2 to 3 weeks to come back to the state you are in now, a state close to sign-off. Is that desirable? No. What it means is that if a change is simple enough, let us say the change is adding an AND gate, ANDing two signals, then you change the RTL on one
hand, and on the other hand you modify the post-layout netlist directly; you do not redo synthesis, you modify the post-layout netlist by hand. This is called an engineering change order, an ECO. Formal verification is the best way to verify this; there is no other good way. In fact, you run Formality between the ECO netlist and the updated RTL to make sure that the engineering change you made in the RTL and in the netlist is functionally equivalent. Formality is most effective in such cases. People add complete logic, like adders or FSM changes, in functional ECOs, and in those cases a formal equivalence tool like Formality is essential for verification. The bottom line is this: you have a golden design, and for any process that leads to a change in that design, Formality will help you confirm that the functionality is preserved, whether it is RTL versus netlist or netlist versus netlist. So please go ahead and try out Formality: it is a very interesting tool, and it is essential in ASIC design. Thank you.