Hello and welcome, my name is Shannon Kemp and I'm the executive editor for DATAVERSITY. We would like to thank you for joining this month's installment of the DATAVERSITY Webinar Series, The Heart of Data Modeling, moderated by Karen Lopez. This month's series is sponsored by Embarcadero. Today Karen will be discussing the best data modeler as a lazy data modeler, and just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we'll be collecting them via the Q&A section in the bottom right-hand corner of your screen. Or if you'd like to tweet, we encourage you to share highlights or questions on Twitter using the hashtag #HeartData. You can also access the chat section in the top right corner of your screen to engage with Karen and each other throughout this session. As always, we will be sending a follow-up email within two business days containing links to the recording of this session and additional information requested throughout the webinar. Now let me introduce our speaker for today, Karen Lopez. Karen is a senior project manager and architect at InfoAdvisors. She has 20-plus years of experience in project and data management on large multi-project programs. Karen specializes in the practical application of data management principles. She is a frequent speaker, blogger, and panelist. Karen is known for her fun and sometimes snarky observations on data and data management. Mostly she just wants everyone to love their data. You can also follow Karen at @datachick on Twitter if you don't already. And with that, I will turn it over to Karen to get us started. Hello and welcome. Hi, Shannon. Thank you for that. How can people follow DATAVERSITY on Twitter? At @DATAVERSITY, of course. Perfect. Perfect. And DATAVERSITY.net, right? Yes. On the intertubes. Perfect. Well, I wanted to thank everyone for showing up.
I hope you're having a beautiful summer or winter day, depending on where you are. I know we're having one here. You know, coming up with topics for these things is interesting. This year, we're trying to do things that are more about the core, the heart of data modeling. And I think for summer topics, it's good to have something that is full of snark yet very useful. So I'm hoping we've found the right sort of level of both of those today. And yes, I'd love for you to tweet and share today's event. You can copy me at @datachick and use #HeartData, which is our hashtag for this webinar series. Shannon went through a great bio that I gave her about me. But basically, I want you to love your data, and I want to confess that I'm very, very lazy. I'm a very lazy data modeler, and I want you to join me in being lazy. But I want to know how lazy you are, and as always we'll go through this like a great test. So, polling: I want to ask you what sort of automation features you have used in your own modeling tools. Macros, naming standards, templates, but mostly automation, scripts that you've written, any of those things. I want to hear about it. So have you used them? Have you used them and given up? Have you used a lot of them, and do you have a whole development team and project managers dedicated to doing that? You've got about 20 or 30 seconds to tell us how lazy you are as a modeler. Last chance for your votes. I can see the initial returns. So that's kind of interesting. What I'm seeing on my screen, because I don't get to see what you're seeing, is that a few of you, 28 of 135 here, said no, you're not using any. 13 of you said you tried and gave up. 41 say you use them a lot. And one of you said you have a whole development team dedicated to that, and I think we're all very jealous. 52 of you were so lazy, you didn't even vote on the poll. I want to give you a gold star for that.
So now the next poll question is: have you shared any of the scripts, macros, naming templates, or doodads that you have developed for your data models? And so: yes, you have; no, you're not allowed to; you don't know where to share them; or no, not at all. And we're going to talk about this whole sharing of scripts and models. I should have added another poll that asked, would you like to use other people's scripts and have access to them? That would have been a good one. And then I could have done the analysis that people aren't sharing them, but they all want others to share. So you get about 20 seconds to jump in. And it looks like 34 of you have shared them before outside your organization. 12 of you aren't allowed to. 11, about the same number, don't know where to share them. 40 of you haven't shared them at all. Just don't share them. And almost 60 of you just had no answer. Again, a gold star. Some of the comments people are making are that you haven't given up, but you don't use them often enough, or you share them internally only, or that you've used other people's macros. I use other people's macros all the time. So it turns out that a lot of you are lazy. That's good, especially given how we're using that word today. But not enough of you are lazy. So my goal for today is to make all of you lazy data modelers, as well as to get you to share your laziness throughout. We talked about this. So I want to talk a little bit about why I'm a lazy data modeler and what the heck I mean by lazy. I'm going to attempt to do some demos and screenshots and whatnot of some of the automation features in some tools. Of course, we only have one hour for this whole thing, so this isn't going to be a how-to. I'm going to show you some demos of the macros that I've developed, and I'm going to explain the reason I had to write a macro and not use some other feature, or not sit there and slog through an 800-entity data model and do all this stuff manually, and give you some backstory about it.
And then, of course, 10 tips for being more lazy. But why this topic? So I started out, and if you're experienced like I am, you know that with data modeling tools, for the longest time, and even still in some tools, you can't just say: okay, I've got this beautiful data model with my 100 subject areas, now just go print all the nice beautiful diagrams from my tool, all of them, all at once. Just send that job to the printer; I'll go wander over and stick another couple of reams of paper in the printer. But just go do that. No, the way you have to do it in a lot of tools is open each subject area or submodel, go to File and Print, set your settings, and then print it. Hope that you have enough video memory on your really small, under-spec work machine, and hope for the best. Well, I was a lazy person, and my first foray into writing macros was figuring out a way to automate that so that I didn't have to do it. And I'd love for you to share in the chat some examples of ways that you thought of being lazy, either out of frustration or because you got tired of messing up doing 100 of the same change over and over again. So you can quote me on this, the title of this webinar: the best data modeler is a lazy data modeler. Today's session sort of came about, it's a thing I've been using for a while, but it also came about from a blog post on my website called The Best Data Modeler Is a Lazy Data Modeler. You can go read a little bit of a rant I have there on why people should be more lazy, but most of the content in that post has made it into this webinar. So this is what most people think when I say I'm a lazy data modeler and why I want to be one. And these are just examples of things I do in my life when I'm not preparing data models. They're things I do while I'm not cutting and pasting sheets of paper and taping them all together to make one giant printout.
What I'm doing while I'm not going through and adding a create date, modified date, and modified user to the end of all 800 tables in my data model; what I do in, let's just say, life outside of modeling. Now not all of this is fun. Some of it is that by being lazy, I get time to blog, to do presentations, to do these webinars, to speak, to be an advocate for good causes, to attend EDW, to do lots of space things, to eat and enjoy beverages, and to do some advocacy stuff that I do, especially with Barbies and Legos. But it's not just about having free time, just to be lazy. It's about focusing the activities that I do as a data modeler on making better data models. So in a lot of my webinars, I've mentioned that I think most data modeling is more about forensics than creating things. I'm trying to be like Quincy. I'm talking to users, trying to get to the truth. I'm investigating data. I'm looking at source systems. I'm mapping source systems to target systems. Those are the hard things. Those are the activities that the human brain excels at, and computers are really lousy at. I want to provide better service to my customers. I want to build better quality data models, data models that are more accurate, that reflect the requirements as we need them today, as well as our near-term requirements. I want to build flexible data models. I want to build data models that perform well. I want to have time to learn about my target DBMSs, so I want better databases. I want to provide better support to my teams. I don't want to be an obstacle to them, and I want to give them all the pieces of the data model, and I want to tailor it to their needs. Not just generate one report or one diagram of the model, but tailor diagrams and reports: one for the DBAs, one for the devs, printouts for the users that are different than for the DBAs. But I can't do all that manually.
The key thing is I want to spend my time doing tasks that require my mind and not just a bunch of mouse clicks. So that's my goal. A lazy data modeler is a better data modeler, but that doesn't mean we're doing this to avoid work. It means that I want to spend my time on more important tasks and tasks that have a greater impact. But when I talk to a lot of modelers, either in sessions or on my projects, one of the things I've heard is that modelers aren't using a lot of the automation techniques. The first one I hear is, there's such a huge learning curve, I don't have time to learn it. And we're going to talk about that one in a minute. The second one is, I'm not a programmer. I'm not a programmer either. I think I was a programmer for a good 18 months at the start of my career, and I knew I didn't want to do that. A lot of people don't know that there are automation features that they can use to automate away a lot of their junk data modeling time. And by junk data modeling time, I mean literally the examples I gave: adding the same attributes to every entity; responding to a bulk change request, like renaming customer to client, or changing all of our VARCHAR data types to NVARCHAR data types. All of those tasks are something that a computer can do not only faster, but close to perfectly, whereas for humans it will take a long time, it will be error-prone, and we will miss things. The other issue that I really want to put a lot of weight on today is that no one shares their scripts or macros or whatever it is that we're talking about for automation. And as the years progress, this has become more and more of a problem. I work with a lot of other communities, DBAs, devs, even business analysts, that share their content, their non-proprietary content, like crazy.
And I think there are a lot of myths going around in the data modeling community, or people stuck in a mindset, where we haven't gotten into this sort of mindset of open source and sharing and common licensing to make everyone's jobs easier, which means lazier. So people don't know they can automate things, because even though our tools have been around for 20-some years, the automation features are either fairly recent, or they keep changing and being upgraded, or they're hidden away someplace, or they only work outside the tool. So maybe people have never even clicked on that feature, or they went there and saw a bunch of code and some very oddly named list of things and were perplexed. They had no idea what to do when they clicked on something. Or they ran a script, because a sample script came with the tool, it broke their model, and they never wanted to come back to working with the automation. Or they tried to do it, it was a huge time suck, so they gave up. This is common to anything that's new, and if we talked to accidental data modelers who were told they needed to do a data model and were handed Erwin or ER/Studio or PowerDesigner or one of those tools and told, go do a model, they'd experience this exact same thing. Maybe they've done a data model before by reverse engineering a database, and they got a nice diagram, and they think of data models as diagrams. But we know they're not diagrams. We know that it's a whole application, and that the diagrams are mostly just a printout, or just a view of the data model, or a way of interacting with the data model. So these exact statements that we data modelers make about automating our data modeling are the same ones that people who automate things make about our data modeling tools, and yet we look at them and say, what do you mean it was too hard? This stuff's easy. You just drag this. You click on this. You do this.
This is where the two skill sets can come together and work together to make each other's lives better. So yes, there's a learning curve, but one of the great things is most of the tools come with samples, and we can access shared scripts if we're sharing them. You can build the equivalent of a Hello World script, which is very easy to do, and once you do that, you just start expanding on it. I think you could spend 20 minutes a day or a week learning a bit more, or you could spend 20 minutes a day or a week making a business case to get a developer or scripter to help you with these things. You could get some training, or you could join an online community or forum to help work you through how these scripts work. So here are some of the community resources you have. Embarcadero, thank you Embarcadero for sponsoring this webinar, has just created a new community site on their website. The old one, which was called EDN, is now being transitioned over to community.embarcadero.com, and there are forums where you can ask questions, where you can ask specifically, I'm looking for a script to do this, has anyone done this before, or which of the many sample scripts that come with ER/Studio would do it. Erwin also has online communities at erwin.com, the same sort of thing, and a lot of the people posting there are people that we've all known in the Erwin community for a long time. Erwin was one of the first to have online communities. I used to host online communities for all of these tools, and that sort of died off as the vendors took up the slack and provided their own communities. But you can get macros or macro support here. SAP, which owns PowerDesigner, also has a community for PowerDesigner. This is part of their formal SAP Community Network, where you can go ask questions and get help. There's also a community in a newsgroup which you can access through some Google properties.
This has probably been around the longest and is probably where you can find some great automation tips for PowerDesigner. So some of the feedback I got from a couple of my webinars has been that I'm not pausing to take questions, and it's a lot harder since we're not doing panels anymore; I used to be able to multitask that. If you need me to catch up on the coffee and the mosquito repellent comments, we'll do that in the after show. So there is a question about how Erwin handles automation like this. I'm going to show you some examples of those things in a minute. Awk is a powerful programming language for text processing; there are all kinds of things you can do with the output of a data modeling tool. Also a great tip from Jason, from a Unix professor: if you're going to do something more than once, write a script; if you think you only have to do it once, write a script. Yes, I've heard that one before. That's awesome. Also some comments that some tools are very difficult and others make it easy. You know, the funny thing is that depending on your skill set, one person's easy is another person's awkward. So I'm definitely going to address that as I show some of the examples. Any forums or script samples for InfoSphere Data Architect? I'm sure there are. I didn't look into that, but I'll try to follow up on that afterwards. And while I was looking at your questions in the chat: if you actually have questions for me, try to put them in the Q&A section, because depending on how much chat you guys get going, I might not see all the questions. So the other objection that people have is, I'm not a programmer. And I say great, not a problem, because while some tools require, or pretty much demand, that you have real application development skills and tool sets, that you understand what multiple inheritance and threading and object isolation and all these things are, some tools just require scripting-level skills.
You can use some tools to record keystrokes and then generate a script. So in ER/Studio, which I'll be showing you, some of the functions that I put into my macros I built by using a Microsoft Office feature that allows you to record your steps as you do something in Excel, like entering text into a cell and then coloring it and then maybe underlining it and centering it. You can record a script in the Microsoft scripting language, which is Visual Basic based, of course, VBA based. That will give you the general syntax for the Office objects and the Office automation, which you can then incorporate into your tool's macros; you might have to slightly change the syntax, but generally the Office syntax that you might use in your data modeling tool will be either exactly the same or very similar. All the tools come with sample macros and scripts, and some of the sample macros and scripts, as we'll see, just do some really scary, silly things. But I think they're there to expose you to the concepts: selecting a model, opening a model, choosing things, only doing actions on the objects that are selected, and iterating through all the diagrams and then all the tables and then all the attributes and then all the relationships. They provide a framework for how you can automate some of your lazy tasks. And then some tool vendors provide places for organizations and people to share their macros and scripts. All of these together can help solve the "but I'm not a programmer" objection, because I'm not a programmer either. Then there's: I don't share the scripts. I talked about that. This community has a long and infamous and sad history of not sharing scripts. And I'm here to script-shame you all for not doing that. I've tried many times in my online communities to set up macro exchange points. I know Embarcadero has, I know CA has, and I'm pretty sure that SAP does. We build these places, but no one comes and shares their scripts.
The way the rest of the world does it is through online collaboration tools like GitHub, or even private scripting areas within your organization, whether it's SharePoint or JIRA or Team Foundation Server, places where you put all your scripts together and share them. So I understand that there are legal issues and that people are concerned about sharing their work. Certainly you wouldn't want to be mailing your data model around outside your company if that was considered proprietary information. But it's well recognized in other parts of the IT community that non-proprietary scripts that do things in a very generic way, like backing up and restoring a database, aren't proprietary. There are probably fewer than a dozen ways that one could do that. In fact, there's even some IP law that says that these scripts aren't creative works, because they're not complex enough; they're not a creative work, they're just scripts. And as for these legal obstacles I keep hearing about: of course, if your boss says you can't share them, then you can't share them. But I think people are assuming that everything they do at work can't be shared. And yet I see all these other communities that have been doing this for decades, and not just open source companies, sharing scripts widely. They blog about them. They release them under a Creative Commons license that allows other people to improve them. And we're going to talk about this incremental approach in a minute. This is where we need to be. It's time for data modelers to join this century. And I would so love it if like 100 people would go tweet that right now: it's time for us to join this century. So what kinds of lazy does Karen think you could do? Well, I think of scripting and automation really in two different buckets; there are probably more of these. First, there's internal model CRUD.
So, things we do inside a model to create, update, and delete objects and properties. Naming standards were the original automation: being able to automatically generate, say, physical names based on the logical names; creating new columns, such as what people call the audit columns, the create date, the modified date, the last updated by, reason codes; adding IDs to everything. That can be highly automated. And then there's some more I'm going to show you in a bit for applying indexes and constraints. There are hundreds of different types of things inside your model, if not thousands, and most of these automation interfaces let you work with them that way. But then there's productivity external to your model. So, things like, as I said, generating a diagram or an image file of every subject area, printing them all, generating custom reports, making backups of your model, managing all the supporting files like a naming standards template or a forward engineering template or configurations or standard data models or templates that you use. And the more you can automate these things, the more likely they'll be consistent, either across your models or across your modelers and your workstations. So there are automated naming standards that people can apply. And most of these have examples of what I call metadata stuffing. I blogged about this once: let your computer apply your crazy metadata stuffing scheme. By metadata stuffing, I mean if your shop says that every table has to be prefixed with TBL underscore, and that the first two characters of every table name have to be an abbreviation for the subject area to which it belongs. And in this blog post, I talk about a real-life example where we had to have a three-letter ID of the DBA who was responsible for doing all the work on that table. I'll let that sink in for a minute. That's metadata stuffing.
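A scheme like that is exactly what a script should apply, never a human. Here's a minimal sketch of the idea in Python; the actual macro would be written in your tool's scripting language, and the function name and the exact scheme here are invented for illustration, not anyone's real standard:

```python
def stuff_table_name(logical_name, subject_area, dba_initials):
    """Apply a (not recommended!) metadata-stuffing scheme:
    TBL_ prefix, two-letter subject-area code, three-letter DBA ID,
    then the uppercased table name."""
    base = logical_name.upper().replace(" ", "_")
    area = subject_area[:2].upper()      # first two chars abbreviate the subject area
    dba = dba_initials[:3].upper()       # the real-life three-letter DBA ID
    return f"TBL_{area}_{dba}_{base}"

# A name like "Customer" in the Sales area, owned by DBA "klm", becomes:
print(stuff_table_name("Customer", "Sales", "klm"))  # TBL_SA_KLM_CUSTOMER
```

The point isn't that this scheme is good (it isn't); it's that once the rule is written down as code, it gets applied to all 800 tables identically, which a person clicking through a model will never manage.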
So I'm not a fan of those naming schemes, but we do have naming schemes that we like to apply, like naming every view with View or V in front of it, or all dimensions this way and all facts that way. You want to automate this because, again, it's something that takes a lot of time and needs to be 100% accurate, so it's the kind of thing that humans are terrible at and that computers are great at. But you might have to deal with physical constraints of your DBMS. The common one: Oracle and DB2 only allowing 30-some characters in a name. But your logical names, you know, you've made them free and easy and open and long, so you get names like Retail Transaction Line Item Modifier Event, and now you have to wedge that into 32 characters. Or worse, even 18 or 8. You can apply these constraints either with the native features of your tools, and lots of tools allow you to do that, or through some other type of script that would allow you to, say, change all spaces to underscores, or remove all spaces, or change any special characters to spaces or underscores. So there are all kinds of things you can do with automation, and all kinds of tools have these features built in, and they're fairly similar, though not quite the same in how they do it. But sometimes the naming tools aren't enough, and I'm going to show you some examples of that. So let's look at some tools. CA Erwin Data Modeler has a full-blown API object model, where you can build a completely separate client to do automation against your models from outside the tool. It also has active scripting, based on Visual Basic, so VBA, Visual Basic for Applications. And it's documented online. If you search for Erwin API reference guide, rather than going through support.ca.com, where I have a hard time finding things for the right version of the tool, I just search for API reference guide version nine or 9.6 or whatever version of the tool you're working with.
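That logical-to-physical conversion, spaces to underscores, special characters stripped, common words abbreviated, then everything truncated to the DBMS limit, is easy to express as a script. A hedged Python sketch follows; the abbreviation list and function name are my own examples, and a real macro would use your tool's naming standards template instead:

```python
import re

# Assumed shop abbreviation list; a real one would be much longer.
ABBREVIATIONS = {"RETAIL": "RTL", "TRANSACTION": "TXN", "MODIFIER": "MOD"}

def to_physical_name(logical_name, max_len=30):
    """Convert a long logical name to a physical one: strip special
    characters, uppercase, abbreviate known words, join with
    underscores, then truncate to the DBMS limit (e.g. 30 for
    older Oracle and DB2)."""
    cleaned = re.sub(r"[^A-Za-z0-9 ]", "", logical_name).upper()
    words = [ABBREVIATIONS.get(w, w) for w in cleaned.split()]
    return "_".join(words)[:max_len]

print(to_physical_name("Retail Transaction Line Item Modifier Event"))
# RTL_TXN_LINE_ITEM_MOD_EVENT
```

Blind truncation alone can create duplicate physical names, which is one reason the built-in naming utilities sometimes aren't enough and a custom script earns its keep.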
And I find that an easier way to find this resource. There's also the Erwin knowledge base, and in it there's an Erwin API tutorial spreadsheet. I think I can show that to you. It's basically a spreadsheet, but it's not just a spreadsheet. I have these clickable things disabled right now, but I can show you what's behind them, if I can remember how to change this, so that you can see some of what happens here. Basically, there's code embedded in that spreadsheet that goes and does any of the things that are clickable on there. This is code that has been distributed as sample code, and like I said, it's embedded in this spreadsheet, and it does things like delete objects from your model, harden data types, find certain physical names, and rename some attributes. Now, this is not how I think of automating my model, but certainly you can think about how you could embed some automation features into a report that you distributed, for instance something like having a user go through the objects and potentially being allowed to update names or definitions or something like that. So this is intended as a sample resource for automating in Erwin. That's one resource for you. So what about SAP? Sorry, in PowerDesigner, they have the ability to automate things using whatever languages you have: Java, VBScript, C, other languages. Scripts can be executed inside the tool, but because these are text files, just like in Erwin, they can be edited outside the tool, and you can go to infocenter.sybase.com and search on PowerDesigner, macros, and automation. One example from their documentation, and you're going to see something that looks a lot like this: a script that gets the current model and the objects in it and does something throughout. It scans the model, and it's a Visual Basic script.
So Embarcadero has a macro language inside the tool, based on SAX Basic, which is very similar to VBScript. When I brought up before that some people find this very difficult to work with: it comes down to your experience. One of the very first programming languages I ever learned was BASIC. So when I go to use VBScript or SAX Basic, that's how I think about automation; it's how I think about working with code. See, I told you, I'm old and experienced. But that's very easy for me. For people who are used to working in less procedural, less script-like languages, it's actually harder to work in there, and they're going to be naturally drawn to something more object oriented. So, someone is saying in the chat that the link doesn't appear to be live anymore. I just captured that today; maybe something extra ended up in my URL, which was much longer. If you just search for PowerDesigner macro and automation, you'll find the documentation for PowerDesigner. I believe they're at version 16 or something. I'll try to update that link and get you the proper one before we distribute the slides. Sorry about that. With Embarcadero, the macro language is there and the documentation is inside the ER/Studio tool. That doesn't mean there aren't resources outside of it, but the documentation is there. So let me see if anyone else has any questions before I go on. Nope, doesn't look like it. I'm going to pop over and start with ER/Studio. What I have in ER/Studio is a data model, an AdventureWorks data model. You don't need to read the model right here. I have my usual objects on the left-hand side, but the macro and automation features are down here. Basically, I have macros that I have written; almost all of these have been derived from the sample ones that come when you install the tool. You just have to go find them, but I'll show you one, and I'll show you the interface.
So you can have this area where it's going to show me things. You can have multiple tabs open and the macro language is right in here. You can edit it in here. I tend to use another editor. But this one's pretty good. So I define a bunch of variables and it iterates through. It gets me the active diagram and then I get the active model. So the model I have open. And then it figures out whether I've got a logical or physical one. And then I have to fill in where I want these images. So what this macro does is it generates a PNG file which is like a JPEG image of every data model. And it names it according to the diagram which is kind of the subject area and the submodel name, the subject area name. Now one of the issues that I needed to solve as I built this iteratively. The first time I built this I just had it go generate a PNG and it was just named like one, two, three, four, five of all the things. And I thought well that's not a good thing. I want to put a proper name to the image file so that when I put these out on a shared file server that I know what it is without having to open it up. And then I found out well we have these beautiful background colors and that makes it harder to print and harder to read. So I don't want to do that especially if people were sending it to a black and white printer. So it goes through and I can set the various properties that would normally have to set up in a wizard of what the image quality was, what type it was, what size it was going to be and everything. So if I go through and in most of these tools when you automate the undo isn't going to be available to you. So that's definitely true. So what's happening is this used to be something that would literally take me an hour in a data model that had a hundred submodels to go do this. But now what I've done is created something that I had a custom name. 
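The shape of that macro is: iterate over every submodel, build a descriptive file name, set the image properties, and export. The export call itself is tool-specific, but the naming logic looks something like this Python sketch; the function and the file-name pattern are my own invention, not ER/Studio's API:

```python
def image_file_name(model_name, is_logical, submodel_name, version):
    """Build a self-describing file name for each submodel image,
    e.g. AdventureWorks_Log_HumanResources_v2.png, so a shared
    drive full of exports is browsable without opening anything.
    'Log' marks logical models, 'Phys' physical ones."""
    kind = "Log" if is_logical else "Phys"
    safe = submodel_name.replace(" ", "")   # keep file names space-free
    return f"{model_name}_{kind}_{safe}_v{version}.png"

# One call per submodel, inside the macro's iteration loop:
for sub in ["Human Resources", "Sales", "Production"]:
    print(image_file_name("AdventureWorks", True, sub, 2))
```

The descriptive name is the whole point: the first version of the macro numbered the files one, two, three, which told nobody anything once they landed on the file server.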
So KittyDemoAdventureWorks is the name of my model, Log means logical, not physical, then it's got the name of the submodel or subject area, and it pulls its version number from the version property. And then I can go see, hold on, let me go find this. So imagine this was a shared drive, and what you were doing was exporting an image every day, so that anyone who wanted to see your data models could do self service and go get an image. This used to be a real interruption time suck for me, as people would come in asking, could I get a printout of the employee data model, and I'd want to do that for them. Now of course we have portals, we have ways of generating reports that include these. But this is to supplement and complement that, for when people just want to go and grab a printout. They don't need to log in; they can just go find something. Now the other thing I can do, as I'll talk about later at the end, is have this macro run as a job, every morning or every night, and export the working copies of the data models I'd like to share. Those are all there now, and I didn't have to spend hours doing it. So there are others. I can go find columns and tables that have spaces in their names, and the reason I needed that for one project is, yes, I can use a naming standard for that, but this was solving a specific problem: other modelers who had a different naming standard, because they had a different target DBMS, were creating new things, and we kept ending up with names that didn't match our standard. Our standard was camel case, or initial-cap case, and I could quickly search for problem names; this was a problem for us. We also had some DB2-specific issues with generated expressions, which are calculated fields, where we had to use all fields that have a certain suffix on them.
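A checker like that is just a scan over every table and column name in the model, flagging anything that breaks the rule. A hedged Python sketch, where the camel-case regex is an assumed stand-in for the shop's real standard:

```python
import re

# Assumed standard: InitialCaps, letters and digits only, no spaces.
CAMEL_CASE = re.compile(r"^[A-Z][A-Za-z0-9]*$")

def find_nonstandard(names):
    """Return the names that contain spaces or otherwise fail the
    camel-case standard; these are the cleanup candidates another
    modeler's differing standard left behind."""
    return [n for n in names if " " in n or not CAMEL_CASE.match(n)]

print(find_nonstandard(["CustomerOrder", "order line", "ORDER_LINE"]))
# ['order line', 'ORDER_LINE']
```

In the real macro, the list of names comes from iterating the model's tables and columns; the scan itself is the same handful of lines.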
We wanted to automatically generate a really complex calculation for them, a calculated field and expression. So it was removing some special characters, but not all of them, only what we considered special characters, and also turning them to all uppercase; that's what the uppercase macro did.

One of the more complex things we had to do was automatically create indexes for certain types of attributes, such as foreign keys or primary keys. Now, in a lot of tools that's something you can have done automatically at creation, but if you reverse engineer a model, or if you've compared something in, it doesn't necessarily mean those indexes get created. And you can't just run a blanket script that recreates them all, because what do you do if there are multiple foreign keys on a table? You don't want them all renumbered and renamed every time just because they ended up in a different order in the object model of your data model. So it needed to be a little more complex than a naming standard template or a naming standard utility. What we've done is have it go through the entire model, through all of the columns, and for foreign key columns it creates an index using not just our naming standard but logic that preserves the names already there. If there's already a name, we don't want to touch it; if there's more than one, we want the number to increment. You can see we have a table with at least seven of these things, so we don't want to go back and rename all the indexes. That's something you just can't do with an automatic naming feature.

The other thing that I like, and this is highly borrowed from existing macros that Embarcadero did years and years ago, is exporting some metadata into Excel. What this is doing is pulling just the stuff I want to report out of my model, in the format I want, and I can choose the colors I want. And yes, there are some native features.
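The "preserve what exists, increment what's new" rule is the interesting part, and it can be sketched independently of any modeling tool. This Python sketch uses an `XIF`-style foreign-key index prefix as an assumed convention; substitute whatever your standard actually is:

```python
def next_index_name(table, existing):
    """Pick the next free foreign-key index name for `table` without
    renaming any index that already exists. Renumbering everything on
    each run would churn names already in production.

    `existing` is the set of index names currently in the model.
    The XIF prefix is an assumed naming convention, not prescribed."""
    n = 1
    while f"XIF{n}{table}" in existing:
        n += 1
    return f"XIF{n}{table}"
```

So a table that already has `XIF1Order` through `XIF7Order` keeps all seven untouched, and a newly discovered foreign key simply becomes `XIF8Order`.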
I can export metadata out of a model using some reporting features, or export to CSV, but then every time I wanted to publish it somewhere I'd have to bring in that CSV data and format it the way I wanted. Now, with a macro, I can do things like add the project or company logo and the timestamps, pull the things I really think are relevant rather than all of them, and choose how they're described. There are things I can do to customize this view in ways the modeling tool vendors maybe didn't anticipate; perhaps we have our own way of calling something mandatory or optional, versus null or not null, or required or not required. I've also used this on one project to produce reports where some of the characteristics were described in another language, so instead of saying mandatory or optional, I was able to say what they were in that language.

Now, one thing about macros that work this way in most tools is that they're very fragile. If I started working in this Excel spreadsheet while it ran, it would just start printing wherever I put the cursor. Some of these macros you have to set running and then literally take a break: go have coffee or go work on something else. So this isn't quite the same thing as a native feature, but the tradeoff of getting exactly what you want can be really good. One of the reasons a lot of us have started automating these Excel reports is that while most tools have beautiful reports you can extract, they're kind of known for having a lot of white space and a lot of pages. Being able to customize things lets me compact the data and produce exactly the format and the content that I want. So I think I can stop this now. You can see that while this is very similar to the sample macros that shipped with the tool, I was able to customize it the way I wanted.

So here's a really short example. I just want to quickly... this is a selection one, sorry.
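Describing columns the way the project wants, including in another language, is just a lookup table in front of the raw metadata. A Python sketch with made-up labels (the French terms are illustrative, not from the source):

```python
# Assumed label sets: how this project chooses to describe nullability,
# rather than the tool's native NULL / NOT NULL wording.
LABELS = {
    "en": {True: "Optional", False: "Mandatory"},
    "fr": {True: "Facultatif", False: "Obligatoire"},
}

def column_row(name, datatype, nullable, lang="en"):
    """One spreadsheet row for a column, described the way the report's
    audience wants it, in the chosen language."""
    return (name, datatype, LABELS[lang][nullable])
```

Each row then drops straight into the custom Excel report, with the wording, ordering, and language the audience expects instead of whatever the native export emits.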
I'm not even paying attention to my own documentation. This allows you to pick some entities and selectively produce just their tables; this particular macro prints the tables that are selected, with their definitions. It was very quickly something I wanted to do, and I didn't have to go through a report generation wizard and exclude a bunch of stuff, and it didn't take 15 pages to print. I'm saving trees as well as mosquitoes.

But my goal here wasn't to show you these specific macros. It was to talk about problems I had that were taking a lot of my time and impacting my ability to serve my end clients (users, project managers, business analysts, DBAs, and devs) and how I solved them by developing these macros. I've improved on some of them over time to get them to do what I want. You'll notice my macros aren't that complex. There are some over here in the sample macros with a full-blown interface where you can choose which files you want, what drive you want them on, and how you want to name things. If I had more time, I'd add more of those features.

One of the key things about working with automation is to think of it as something you build incrementally. You leverage other people's work if they'll share it with you, and you constantly make it better. If I had sat down and decided I needed to develop all these macros at once and make them perfect at once, I'd probably still be sitting here months later, which would have kept me from doing it at all.

So that was ER Studio. In Erwin, I told you they have this API interface. I showed you the sample they have with Excel. But just like in most tools, the number one way we get exposed to automation is through built-in functions and functionality. So, for instance, I can look at how I want to physically name a table based on its entity name. But if I wanted to expand on it, I could do that.
And I could say, you know, I want to add the words "kitty table" at the end because I'm metadata stuffing; I want everyone to know that Karen worked on the Store table. That's a crazy one. But maybe, heaven forbid, you work in a shop that requires the table name to be prefixed to every column name. You don't want to do that manually; you want to do it in an automated fashion. So that's where these physical naming structures, as well as the naming standard templates, come into play.

And there's also hardening of them. The other thing to think about, as you run these macros and especially as you use these naming functions inside tools, is that sometimes when you set a physical name, you don't want it reset no matter what happens afterward, especially once you've already gone into production. So in most tools you can do something called hardening, where you say: even though I created this dynamically named object, at some point I want to break that link, so that if I change my naming standard, or we just decide to allow a few more characters, these physical names don't change. You usually won't want to go change all those physical names, because the development and migration cost of doing that will be significant, and you will not be loved and valued.

So those were some examples of automation. I wish we had more time to cover it further, but the reason I want you to be lazy is to take mindless tasks, anything that computers are great at and humans are bad at and that takes a lot of time, and automate them, because you were hired for your brain, not for your good looks. You need to spend your time on tasks that involve thinking and analysis and deciding, not just click here, click here, click here, change this, go add these 100 objects.
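Derived physical names plus hardening can be sketched as a small function. This is a Python illustration of the concept only; real tools implement hardening as a property on the object, and every name and rule below is an assumption:

```python
def physical_column_name(table, column, prefix_table=True, hardened=None):
    """Derive a physical column name from logical names.

    If the name has been 'hardened' (frozen, typically after go-live),
    return it untouched so a later naming-standard change can't silently
    rename production objects. Otherwise apply the derivation rule:
    optional table-name prefix, uppercase, spaces to underscores."""
    if hardened is not None:
        return hardened
    name = f"{table}_{column}" if prefix_table else column
    return name.upper().replace(" ", "_")
```

The point of the `hardened` branch is exactly the tradeoff described above: once a name is in production, the migration cost of regenerating it outweighs the tidiness of keeping it in sync with the standard.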
This will give you more time for modeling and not for printing and reporting, and it will give you more time to be a good friend and a good advocate for your developers and DBAs, and we all want that.

Now, I wish I could also spend another hour talking about PowerShell. Very quickly: PowerShell is a feature that comes with Windows now, I think starting around 2007, and definitely in Windows 8. It's a Windows feature that allows you to automate almost anything. I have PowerShell scripts that I use for creating virtual machines, configuring them and installing software, and restoring sample databases into them, and I use these for my test databases. I use them for training as well. I, and other people I work with, have built these scripts to automate both virtual machines and things we do on Windows, locally on premises as well as work I do in the cloud in Microsoft Azure. It lets me do a whole bunch of things consistently, very quickly. I can fire up a virtual machine in the cloud; I can have SQL Server on it, because that's what I work with; install ER Studio and get the license key all set up; and restore some test databases I use. I can do my testing, and then I have another script that goes through and shuts down all my virtual machines. Setting up a VM takes anywhere from a few minutes to an hour, and I'd rather spend that hour not doing that, not patiently waiting for wizards and clicking through things. Let the computers do the work that they're good at.

Just about anything you can do in Windows or Azure, PowerShell can do. We joke in some of the communities I work in that PowerShell is a cult, because people start using it for things that go beyond what it was built for. But I could use PowerShell to run a job in a Windows agent or a SQL Server Agent; I have all these options of scheduled tasks that I can create.
For instance, I could create a scheduled task to fire up ER Studio and run these three macros every morning, even if ER Studio wasn't already running. I can't do that from inside ER Studio alone. I can do it because I have a combination of Windows automation, OS automation, together with my modeling tool automation, and I can make those all work together.

So I have these rules for being lazy. Don't spend your time doing things that a computer is faster and better at; automation is your friend. Don't try to do it all at once. Don't get crazy, and don't be part of the cult where you're trying to redo everything the modeling tool already does in a script. For instance, I've seen people hand generate all the DDL in the scripting language when they could just use the features of their tool. But I do have some macros, which I could have shown you if we'd had time, that generated a series of permission grants based on some unique rules we had. I generated all of those to a text file, so technically it was DDL, but it wasn't my tables and structures; it was just a bunch of permissions I wanted to grant for QA users, test users, and developers, plus some read-only grants, so that every time we went to build a database we had those. And yes, I probably could have done that in a SQL Server tool, or driven it from a spreadsheet. But I was able to quickly iterate through all the tables and all the users in my model and generate those scripts just the way I wanted them.

I want you to focus on mindful things, not mindless ones. Don't be doing junk data modeling tasks. And then, finally, a really important thing: if you've automated something, you must ask your vendors to make it a feature of their tool. I'm still perplexed why it takes so long to print an image, or to open up every submodel and do something to it. Some of this automation ought to be built into our tools. So let's summarize a bit.
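The grant-generation idea, tables crossed with roles and their privileges, is a tiny nested loop. A Python sketch with invented table and role names (the actual project's rules aren't shown in the source):

```python
def grant_ddl(tables, grants):
    """Generate GRANT statements for every table and every role.

    `grants` maps a role name to the list of privileges it should get,
    e.g. read-only for QA and full DML for developers. Output is plain
    text DDL, ready to run at database build time."""
    lines = []
    for table in tables:
        for role, privs in grants.items():
            lines.append(f"GRANT {', '.join(privs)} ON {table} TO {role};")
    return "\n".join(lines)
```

A call like `grant_ddl(["Customer", "Order"], {"qa_user": ["SELECT"], "dev": ["SELECT", "INSERT", "UPDATE", "DELETE"]})` yields one ready-to-run grant per table-role pair, so every database build starts with consistent permissions.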
Automating boring tasks is going to make you happier, and happier data architects are better data architects. Automating all these mindless things is going to make your boss happier. Automating tasks makes for much more accurate work; if you have to do something 100 times over and over again, I guarantee you're not going to do it the same way every time. Saving time for you and your team members is going to make everyone happier, and a happier data modeler is a better data modeler.

Learn these automation features and use them in your tools. Learn some PowerShell; there are a lot of resources I'm going to show you in a minute. Never run a script on your production models without testing it and understanding it completely. People open these sample scripts that come with their tools, and the scripts do crazy things like add a whole bunch of attributes to the Person entity or delete all your relationships. Some of them are nifty little scripts, but you shouldn't just run them. The problem with these scripts is that with a lot of them you can't undo what's been done, and a lot of them might cause layout problems with your models even if you could undo them. You need to read through a script and understand what it does, even if it came from the vendor, because those are samples. They're there for you to learn from; they're not there for you to run willy-nilly to see what happens. You should ask for developer support for your tools if you need it, just like anything else. Your models are production data for you, so they deserve professional skills.

You should look at all the activities you do every day, all day, and question whether they need to be done at all, and if they're mindless, boring, repeating tasks, you should try to automate them. You should free up junk modeling time, and you should put right in your status report that you freed up junk modeling time to do mindful things.
And think in terms of being iterative, building a little bit more each time, not trying to build a complete automated data modeling system. I think you should be lazy all the time, every day, and get lazier every day.

So, some resources there on the slide. I also wrote about, and gave, a presentation called "The Best DBA Is a Lazy DBA," which covers some of these features too, though not quite the same ones. The place to go for PowerShell help is right here in the Script Center: there are videos, tutorials, blogs, funny videos and songs. And GitHub is where I'm going to start putting the scripts I'll be sharing, under a shareable license, and I hope that you will join me there and also post yours to your communities. So find three tasks right now that are junk modeling and start thinking about how you could automate them. Go search for some macros, make them your own, and use them. That's as much as I have for the presentation; I'm going to check here for questions.

Ah, spell checking. I asked for spell checking of definitions back when it was Logic Works, and it only took them 20 years to add it. Yeah. So in a lot of dev tools that I use, I end up having to generate a report into Word and do the spell checking there, and it's really difficult, because initial-cap object names make it hard. But you can just generate the definitions and do spelling and grammar checks that way. And you like my space slippers? So do I, and I'm wearing them right now, actually. They're kind of what I do when I'm lazy. So, are there any other questions out there? I don't see any coming in, which means it's probably near time to end the recording. Shannon, do you have anything to add?
Just want to thank Embarcadero again for sponsoring today's webinar. As always, this was fantastic, Karen, and thanks to our attendees for being so engaged in the chat and everything else. Let me turn off the recording for you, and just a reminder: I will be sending out the slides and a link to the recording within two business days, so for this webinar, by end of day Monday.