It's called Scaling the Artisan, and it's about what it means to be a senior developer. By way of introduction, I'm Coraline Ada Ehmke, on Twitter at @CoralineAda. I judge how well I did in a talk by the number of people who tweet about it, so please be kind. I am a code witch. I once had someone ask me, "Coraline, why do you call yourself a code witch?" And I'm like, I write code, and I'm literally a witch, so it seems to go together nicely. I have code witch stickers with a wonderful 8-bit witch on them, so if you want code witch stickers or Greater Than Code stickers after the talk, come up and say hi. I built my first website in 1994, and I'm proud to say it was in the top 5% of all websites. My new website looks a little bit different, but it's still in the top 5% of all websites. I've been an open source contributor since 2004, and I won a Ruby Hero Award in 2016 for my work in diversifying open source. I should warn you that according to Breitbart News, I'm a notorious social justice warrior, so you should take everything I say with a grain of salt. I'm a founding panelist on the Greater Than Code podcast, which I strongly recommend to everyone; we have an incredibly diverse lineup of guests, and we talk about the human side of software development. I'm perhaps best known as the creator of the Contributor Covenant, the first open source code of conduct. It has over 40,000 adoptions, including Angular, the .NET Framework, Bootstrap, GitLab, Electron, Eclipse, RubyGems, JRuby, Alexa, Swift, Jenkins, Yarn, and Rails. I'm a principal engineer at Stitch Fix. I've been there for just under a year, and I'm really enjoying it. Stitch Fix is one of the sponsors and we have a booth, so come by and say hi. So I'm going to start with a simple question: who are we? Are we architects? An architect is someone who plans, designs, and reviews the construction of buildings.
To practice architecture means to provide services in connection with the design of buildings and the space within the site surrounding the buildings, which have, as their principal purpose, human occupation. I don't think we're architects. Are we scientists? A scientist is a person engaging in systematic activity to acquire knowledge that describes and predicts the natural world. A scientist uses the scientific method, formulating and validating hypotheses. I don't really validate hypotheses in my work. I don't think we're scientists. We like to call ourselves engineers. Engineers design materials, structures, and systems while considering the limitations imposed by practicality and regulation. The foundational education of an engineer is a four- to six-year degree followed by four to six years of practical, peer-reviewed professional practice, culminating in a project report or thesis. That's a lot more disciplined than the work that we do as software engineers. Are we artisans? An artisan is a skilled craft worker who makes or creates things by hand that may be functional or strictly decorative. Artisans practice a craft and may, through experience, achieve the expressive level of an artist. Naming things is really hard. Maybe we should call ourselves firefighting space cowboys and be done with it. Firefighting space cowboys who, of course, love cat gifs. We all know that naming things is hard, and it's not even easy to name our profession. But of all the options available to us, artisan seems to come the closest, so we're going to go with that for the purposes of this talk. So what is an artisan? We all want to preserve our high-paying jobs and work somewhere that we're respected for our skills and talents. We want to be like artisans. So what exactly does that mean? An artisan, as I mentioned, is a worker in a skilled trade that produces things by hand. Artisans were the dominant producers of consumer products prior to the Industrial Revolution.
The best artisans worked directly for merchant princes and nobles and produced unique artifacts that were practically works of art. In medieval times, artisans formed fraternal orders to exchange innovations in their craft with each other while protecting their professional secrets from potential competitors and the public at large. Over time, these orders developed their own jargons, symbolic frameworks, and elaborate rituals to represent and preserve their hard-won knowledge. Some of these orders survived to the modern era, still steeped in mystery and rumor and legend. By the 12th century, to ensure the smooth succession of artisans and the continuity of their traditions, guilds developed a system of apprenticeship. In the 16th century, seven-year apprenticeships were a legal prerequisite for becoming a craftsman or an artisan. So I'm gonna take you on a little historical journey and talk about something that we take for granted in our daily lives: the screw. The screw thread was invented around 400 BC by Archytas of Tarentum. He was a contemporary of the philosopher Plato, and he's sometimes called the father of mechanics. Archimedes further developed the screw principle and used it to build machines that raised water. Early screws were used in olive presses, machines that compressed olives to extract their oil. This is Hero of Alexandria, a mathematician and engineer who lived in the first century AD. He taught at the Musaeum of Alexandria and wrote many books on mathematics, geometry, and engineering, many of which were still in use as late as the medieval period. He also invented the first known steam engine. In the second volume of his book Mechanics, he outlines the five mechanical powers: the windlass, the lever, the pulley, the wedge, and the screw. But it wasn't until the 15th century that screws and threaded bolts started to be used as fasteners. Gutenberg famously used screws as fastenings on his printing press.
The use of screws in this capacity gained momentum, and they started appearing in mechanisms like clocks and in items like armor. For centuries, screw threads for fasteners were cut by hand. There was no standardization: a craftsman would carve and file individual mated pairs of screws and nuts. Interchangeability was not a requirement, so custom fitting was the norm. This was true until around 1800, when an inventor named Henry Maudslay invented a type of metal lathe that enabled the manufacture of standard screw thread sizes. With Maudslay's lathe, you could reliably produce screws that were interchangeable. At first, different manufacturers had their own standards for screw threads. A national standard wasn't established in the UK until the mid-1800s, and even into the 20th century there was no international standardization of screw threads, which created significant supply and repair problems for the Allies during World War II. Finally, in 1949, the Unified Thread Standard was adopted by, and I'm not making this up, the screw thread standardization committees of Canada, the United States, and the United Kingdom. In 1947, the International Organization for Standardization was founded. Fifteen years later, the metric-based International System of Units was created, and the ISO metric screw thread was adopted as a worldwide standard. So today, when you need to screw something in or tighten a bolt, you don't go to your local metalworker artisan, you go to Home Depot. Artisans produced machines and tools, but the most profitable endeavor for artisans for almost 4,000 years was the production of weapons. The sword as we know it was developed during the Bronze Age and was a natural evolution of the dagger. The earliest specimens were actually made of arsenical copper and date to the region of Turkey around 1600 BC.
Occasionally a swordsmith would mix iron with optimal amounts of carbon and accidentally produce a steel blade, but it wasn't until the early Middle Ages that this became a repeatable process. By medieval times, the process of sword-making began evolving, largely in response to the availability of more and more sophisticated materials. The availability and quality of raw materials drove improvements to the process of sword-making, not the other way around. By the 16th century, knowledge of the manufacture of guns had arrived in Europe by way of Middle Eastern trade. Throughout the 18th century, guns in Europe were made one at a time by artisans known as gunsmiths, and each gun was unique. If one single component of a weapon needed replacement, the entire weapon had to be sent to an expert gunsmith for custom repairs, or discarded and replaced by another weapon. Although the killing power of a gun was significantly higher than that of a sword, firearms did not supplant the sword until the early 19th century. American gunsmiths were few and far between; the American colonies of the early to mid-18th century have records of fewer than 20 gunsmiths. Governments tried, and failed, to entice and even mandate gunsmiths to practice no other profession. The raw materials for gunsmithing were scarce in the Americas, so most guns were imported from France and Germany. Gunpowder and firing mechanisms also had to be imported. A gun back then cost about a year's income for an ordinary farmer; for comparison's sake, a basic rifle now costs the equivalent of three days of labor at the average national wage. There were attempts at producing professional standards for gunsmithing in Europe, but not in America. This meant that every gun produced in America was unique to the artisan who produced it. At the end of the 17th century, Maryland reported an inventory of only 20 muskets, 38 carbines, 16 horse pistols, and 78 barrels of powder, accumulated over the previous 25 years.
In the years leading up to the Revolution, they had 200 muskets, 86 carbines, and six pistols in usable order. For each working musket, there were two or more that were broken but deemed capable of repair. Before the Civil War, America had only two armories: one at Harpers Ferry, Virginia, and one in Springfield, Massachusetts. In an attempt to equip the militias sufficiently to protect the newly independent country, Congress ordered the production of 7,000 muskets in 1793. A year later, it had only managed to buy 400. In the late 18th century, a French general named Gribeauval saw this resource-intensive approach as a problem and promoted the idea of standardized parts for weapons. This approach came to be known as the Système Gribeauval and became law in France by way of royal order. So by around 1778, France began producing some of the first firearms with interchangeable flintlocks, but the parts were still made carefully by craftsmen. In the U.S., Eli Whitney saw the potential benefit of developing interchangeable parts for the firearms of the United States military. In July of 1801, he went before Congress with 10 guns, all containing the exact same parts and mechanisms, and disassembled them in front of all the congressmen. He placed the parts in a giant pile and, with help, reassembled all the weapons in short order. They were sold. It was the development of interchangeable parts that led to predictable production. We no longer needed artisans to produce weapons, and artisans could shift their focus to design and innovation. Today, of course, guns are mass-produced and can be bought for under $200. 5.5 million guns were produced in the United States last year, and their ready availability means we have a whole new set of ethical dilemmas to deal with.
The Industrial Revolution upended many aspects of society, making some better and some worse, but predictable production with interchangeable and standardized parts, generally speaking, raised the quality of life for most Americans and let us realize the promise of new technologies. Today, we associate mass production with low quality and low-skill labor. We prefer to buy things with "artisanal" in their name, from sandwiches to bread to beer to clothing. To many of us, mass production smells like commoditization, and we don't want that in our profession. But that's classist and elitist, and I don't want that in our profession either. So how do the changing nature of artisanal work and the impact of the Industrial Revolution apply to our industry? I would argue that we're at the beginning of the process of standardizing on interchangeable parts, and we're gonna soon have to deal with the commoditization of code. Early computing pioneers had to invent everything. This is a picture of a Bombe, one of the earliest computers. It was developed by Alan Turing during World War II and was used to break the Nazi Enigma cipher. The hardware and software to run the Bombes were tightly coupled. Hardware standardization in the 70s and 80s led to the creation of more general-purpose programming languages that could be compiled across a small number of architectures. Standardized hardware platforms were a material innovation not unlike the consistent production of steel in medieval times, and they allowed us to focus on the things we produce on top of the material layer; in other words, software. With the advent of the web and the increasing versatility of front-end frameworks, the browser is becoming our main platform, and this is only possible because of the creation and wide adoption of standards: TCP/IP as a lower-level protocol, protocols like HTTP and HTTPS on top of it, and a standardized front-end language like JavaScript.
And just like the standardization approach that the ISO took for screws, these standard protocols are making possible a revolution in the production of software. Behind the scenes, of course, is our back-end code, written in a variety of languages but running on increasingly standardized server platforms. In the early days of the internet, servers were owned by the same companies that produced software, and those companies spent a lot of time on the reliability and maintainability and security of their data centers. These days, most servers live in the cloud, and we don't know or care what hardware they use. Data centers became a commodity. The age of DevOps arrived, and more than ever we can focus on crafting software instead of crafting tools. Commoditization happened to hardware and server infrastructure, and software is next. More and more software development is shifting to reusable components, like npm modules or RubyGems, that can be assembled to provide large swaths of functionality for our apps. We no longer have to roll our own authentication or data visualization or ORMs or even UI frameworks. And AWS provides building blocks for enterprise-class software architectures. We no longer have to invent or create or maintain our own data storage, file storage, caching, search, or messaging services. Our tools generate scaffolding on top of which we build our applications. We don't have to write boilerplate code anymore. As an aside, the term "boilerplate" was first used to describe reusable chunks of text that typesetters didn't need to set by hand for use in newspapers; it was a metaphor drawn from the 1840s innovation of rolling large flat metal plates for use in making steam boilers. We're using more and more standardized parts to build our software, shifting our attention to assembling machines from reusable components instead of smelting our own steel or machining our own custom screw threads.
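To make that concrete, here's a minimal, hypothetical sketch of what this assembly looks like in a Ruby project's Gemfile. The specific gem choices are illustrative, not an endorsement; the point is that whole categories of functionality arrive as off-the-shelf components.

```ruby
# Hypothetical Gemfile sketch: an application assembled largely from
# standardized, reusable components instead of hand-rolled ones.
source "https://rubygems.org"

gem "rails"        # scaffolding, conventions, and an ORM (Active Record)
gem "devise"       # authentication: no need to roll our own
gem "pundit"       # authorization policies
gem "chartkick"    # data visualization
gem "aws-sdk-s3"   # file storage as a commodity service
```

The artisanal work then lives in the thin layer of glue code and domain logic we write on top of these parts.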
And today, I don't know if you saw this, but NASA is even working on a machine learning system that will allow users to produce programs without writing their own code. In short, we're inventing ourselves out of traditional software jobs. Software development is increasingly about connecting pieces of existing technologies together with a small amount of glue code and a sprinkling of custom algorithms. This is just like the beginning of the Industrial Revolution. But just as the Industrial Revolution didn't eliminate the need for artisans, neither is this the end of software development. It just means that we have to adapt. The job of the future is not the job you're doing today. So let's talk about artisanship in a modern context. There was a tweet exchange between DHH and Sarah Mei a few weeks ago, in which Sarah said that we're starting to differentiate software developers from software engineers: an engineer builds complex software systems that run in the context of people, and a developer writes code. I took exception to this because it struck me as very classist. It's not us and them. It's not engineers versus developers. It's who we are and who other people can aspire to be, and it's our job to bring them up to our level. So I responded: I think it (where "it" here is the ability to create software in the context of people) is a nice-to-have for early-career developers, but it becomes part of doing your job, with increasing importance, as you progress in your career. A senior dev without empathy for end users is not a senior dev. Maybe you thought that as you progress in your career, the only thing that should be changing is the quality of your code and designs, but that's short-sighted. You're expected to deliver more complex code, code that is readable and elegant and performant, but your job now is about more than delivering story points. You're no longer responsible simply for writing code. You're moving from being a producer to an innovator.
You're now expected to focus on creative solutions to challenging problems. Some of these solutions won't even involve code, but when they do, you'll be the one designing custom algorithms with a clear focus on exactly the problem you're trying to solve. And you have the skills now to invent what needs to be invented. Questions you should ask yourself: Should we solve this problem with code? Some problems are best solved with a process improvement that doesn't require a code solution. Not every spreadsheet that someone in your company uses to track work should be turned into a new Rails application. That runs contrary to what I used to say, which was that every spreadsheet is an application waiting to happen, but sometimes a spreadsheet, or even paper, is good enough. We should ask ourselves: Can we fix this with better communication? Most software problems are people problems, and most people problems come down to a matter of communication. Many process-related features that we're asked to develop are organizational scars over a communication wound. For each feature your team is asked to build, ask yourself: are you just putting a band-aid on a bad communication channel? Is this something we should build ourselves? It's our job to distinguish between the category of just doing business and doing something that's unique to our business. If your company's in the fashion retail business like mine is, you might consider machine learning to pick out styles and make recommendations to your stylists for what customers might like. But if that's your business, you probably don't wanna design a custom data warehousing system, because that's not core to what your business is doing. There's nothing wrong with building it yourself if you really need to, but we need to evaluate third-party solutions, weighing their cost against the cost of engineering our own. Often what you'll find is a 90% solution. Is the missing 10% painful?
It may be, but it takes someone with experience to distinguish between a good-enough solution and the best solution. We can ask the question: Should we extract this into a library? A library is a good way to make sure that your innovation becomes more widely used in your organization or in the larger open source community. It's a way to demonstrate best practices to other devs, and it's a way to democratize the output of your labor by having other people use it, modify it, fix it, and extend it. Our second job is standardization, and that means picking the right components, standardizing on best practices, and sharing knowledge through communities of practice. The questions we should ask ourselves: How do we wanna do things around here? This might involve creating a style guide or establishing architecture patterns that will be repeated across all the services in your application infrastructure. How do we build consensus? Good, solid, healthy development organizations don't do things by fiat. When you're trying to build consensus, start with something that everyone agrees on; for example, that a shared database is a bad idea. Then let people share their ideas and alternatives. Focus on one alternative at a time and describe each alternative's strengths. Then do the process again, describing each alternative's weaknesses. Make sure that everyone is heard. Make room for dissent, and make room for people of all skill levels, who often have insights that we as senior devs don't have. Draw them into the conversation deliberately: "Sarah, we haven't heard from you. What is your opinion on this particular approach?" Even if you can't reach consensus, at the very least everyone will feel like their needs and ideas were heard. Finally: How do we spread best practices? At one company I worked for, we actually had community-of-practice meetings every other week.
It was a video call where people who were interested in a particular language or framework would come together to discuss ideas. At my current job, we have a principals meeting where all of the principal engineers come together to discuss ways to improve the quality of the code base and standardize. We also maintain a shadow backlog of engineering debt, technical debt, and engineering ideas that we want to implement, and we share that with the broader team to inform the decisions that they make as they're developing features. And of course, you can always use a Slack channel as a special interest group for a particular language or framework or technology. Our third job is being a force multiplier, and this involves mentoring and teaching, helping our teammates level up, and giving back to the community. A question you could ask is: Do we have a good mix of people? Diverse experiences and diverse experience levels lead to better problem solving, and we have to recognize that not every software problem requires a senior principal developer. We have to ask ourselves: Are we pairing people in an effective way? In his book Apprenticeship Patterns, Dave Hoover lays out what he calls the boxcar theory. Imagine a train with boxcars and an engine at one end. Toward the end of the train, you have early-career developers, and as you move toward the engine, you have developers with increasing levels of experience. A lot of companies will pair a very senior person with a very junior person, and this is actually a mistake, because people who are closer to one another on the train have a greater memory of what it was like to be at that previous stage. They have more empathy as a result, and they're better able to share knowledge and remember the tips and tricks that got them through when they were leveling up. And we ask ourselves: How do I help the team level up? It's through mentoring and pairing. Code show-and-tell is a great way to do this.
We do this at Stitch Fix: when a developer has created something they think is cool or innovative or a good solution to a problem, they walk through the code in 20 minutes, share their ideas and what they did with other developers, and get feedback. And of course, humane code reviews come into this process as well. As a senior developer, your code reviews should not be vicious. You should never ask "why didn't you just...", and you should never be overly critical of the code. You should always be asking questions and helping people arrive at their own conclusions. So: innovate, standardize, and multiply. But are we done? I mentioned that the mass production and ready availability of guns created an ethical dilemma. So does the mass production and ready availability of software. No one wants to be accountable for gun violence, as we've seen in the political situation in America today. And no one wants to be accountable for the injustices done by naive, biased, or altogether malevolent algorithms. This is on us to figure out before it becomes an epidemic, if indeed it has not already become one. This is our fourth job, and in some ways it is our most important job. We have to be the conscience of our companies and of our industry as a whole. We have to decide what is possible versus what is right. There are a number of questions we should be asking ourselves and our companies about the software that we're writing and what its repercussions could be. Some examples. The Facebook and Cambridge Analytica data scandal involved the collection of personally identifying information from over 87 million Facebook users, and it actually started in 2014. The data was used to influence voter opinion on behalf of politicians who hired Cambridge Analytica.
We knew about this as early as December 2015, when the Guardian reported that the politician Ted Cruz was using data from Facebook, and that the subjects of the data were unaware that the company was selling their information and that politicians were buying it. Engineers just like us built the system that was exploited, and engineers just like us exploited that system to influence a presidential election. This is a moral failing on the part of engineers. Facebook's engineers didn't do their jobs. They didn't ask the question: can the data we're providing be used for unjust political gain, or even to undermine our democracy? Another example: the gay hookup app Grindr, which has 3.6 million daily active users across the world, was just recently revealed to be providing its users' HIV status and last-tested date to other companies. Because the HIV information is sent together with GPS data, phone IDs, and emails, the data could actually be used to identify specific users. Grindr's engineers didn't do their jobs. This was an ethical failing. They didn't ask the question: is the data we collect and share putting people's lives in danger? Another example: algorithmic hiring. More and more human resources staff rely on data-driven algorithms to help with hiring decisions and to navigate the vast pool of potential job candidates. Software systems like these can be so efficient at screening resumes and evaluating personality tests that 72% of resumes are weeded out before a human even sees them. But there are drawbacks to this level of efficiency. Man-made algorithms are fallible and may inadvertently reinforce discrimination in hiring practices. Algorithms mimic human decision-making: they're trained on past successes, which may embed existing bias. There was an experiment in which recruiters reviewed identical resumes, and they were shown to select applicants with white-sounding names over applicants with black-sounding names.
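To make the mechanism concrete, here's a deliberately tiny sketch in plain Ruby. All the names and data are hypothetical, and a real screening system would use an actual learning algorithm, but the failure mode is the same: a model trained on biased historical outcomes latches onto the wrong feature.

```ruby
# Past hiring decisions: identical qualifications, biased outcomes.
PAST_HIRES = [
  { name: "Greg",    skills: 5, hired: true  },
  { name: "Emily",   skills: 5, hired: true  },
  { name: "Jamal",   skills: 5, hired: false },
  { name: "Lakisha", skills: 5, hired: false },
]

# "Training": learn which names were associated with being hired.
hired_names = PAST_HIRES.select { |r| r[:hired] }.map { |r| r[:name] }

# "Screening": score a new candidate by skills plus a name bonus
# the model latched onto -- the wrong feature.
def score(candidate, hired_names)
  candidate[:skills] + (hired_names.include?(candidate[:name]) ? 1 : 0)
end

applicants = [
  { name: "Emily", skills: 4 },
  { name: "Jamal", skills: 4 },
]

applicants.each do |a|
  puts "#{a[:name]}: #{score(a, hired_names)}"
end
# Emily: 5, Jamal: 4 -- identical skills, different scores.
```

The model never sees race; the name acts as a proxy, and the bias in the training data becomes bias in the algorithm.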
So if the algorithm is trained on data like that and learns what a good hire is based on biased data, it's gonna make biased hiring decisions. The result is that job applicants are judged on subjective criteria such as their names; by latching onto the wrong features, this approach discounts a candidate's true potential. Algorithms are not neutral. When humans build algorithmic screening software, they may unintentionally determine which applicants will be selected or rejected based on the wrong information, harking back to a time when there were fewer women and minorities in the workforce, for example, and leading to legally questionable and morally unacceptable results. Engineers who write algorithmic hiring software are not doing their jobs. This is an ethical failure. These engineers are not asking themselves the question: is the algorithm we're creating perpetuating systemic bias? Safiya Umoja Noble, a professor of communication at the University of Southern California, recently published a book called Algorithms of Oppression. She argues that while most people think of Google and other search engines as a public library, a trusted place to get accurate information about the world, these platforms are increasingly not trustworthy at all. She did searches on terms like "black girls," "Asian girls," and "Latina girls," and found that pornography was the primary way they were represented on the first page of search results. That's not a fair, credible representation of women of color; it reduces them to sexualized objects. Search engines aren't merely selecting what information we're exposed to, they're cementing assumptions about what information is worth knowing in the first place. There's a dominant white, male, Western-centric point of view that gets encoded into the structure of information, and an algorithm is just a structured decision tree.
If these keywords are present, then a variety of assumptions have to be made about what to point to among all the trillions of web pages that exist on the web. Google's engineers are not doing their jobs, and this is an ethical failing. Search engineers are failing to ask the question: is the algorithm we've developed reinforcing stereotypes? In 2013, the city of New Orleans entered into a relationship with a company called Palantir, which is largely funded by the CIA, to implement a predictive policing program. The city's agreement with Palantir was secret, never revealed to the public, and never subject to a vote or deliberation. The company provided the police force with "hit lists" of people that its algorithms indicated were sources of potential violent crime. But government-funded research casts doubt on the effectiveness of predictive policing, and studies are increasingly showing that it can have a disproportionate impact on poor communities of color. Predictive policing patterns perpetuate systemic bias against overpoliced communities of color. Palantir's engineers are not doing their jobs. This is an ethical failing. These engineers failed to ask the question: is the algorithm we're creating putting marginalized people in harm's way? Facebook's chief technology officer, in the wake of the Cambridge Analytica scandal, recently told the media that the company will start mapping out potential threats from bad actors before it launches products. Ten years after the formation of Facebook: too little, too late. If you're not thinking about how your features could be abused and put people at risk, you're not doing your job, and this is an ethical failing. Marco Rogers wrote: "Facebook is literally destroying the internet. Google is a trash fire of white supremacists. But all we blame is the execs. All the employees doing the dirty work of these places are gonna be fine."
We're putting the blame on the business people who request features like this from us, but ultimately we should be held accountable for the code that we write. We need to acknowledge that software doesn't exist in an ethical or moral vacuum. William Butler Yeats wrote, "Nobody running at full speed has either a head or a heart." We need to slow down and consider the impact of the software that we're creating. Algorithms can be weapons, and if you're willing to build a weapon, you'd better be comfortable accepting responsibility for the people it kills. So: innovate, standardize, be a force multiplier, and be ethical. Those are our jobs. And what does that mean for the future of our jobs? We need to scale our craft. Just as the Industrial Revolution didn't eliminate the need for artisans but changed the role of the artisan, moving their focus from production to design and innovation, we can foresee the same thing happening to software artisans. Artisanship in a production capacity doesn't scale, but just as raw materials drove improvements to the process of sword-making, perhaps the greater availability and quality of software developers can also drive improvements to our craft. We now have tools at our disposal that make producing code easier and lower the barrier to entry. We don't have to waste artisanal effort on low-value parts of the system. Packaged solutions are often good enough, and the output of a less seasoned developer may also be good enough. We don't need senior engineers to build every feature in our software systems. We can save the effort and cost of the artisanal approach for the parts of the system that matter the most. Artisans, after all, are a rare and expensive resource, and we need to use them wisely. Predictable, low-cost production means that the attention of the artisan can shift away from manufacturing. Artisanal effort applied to design becomes a force multiplier and a source of previously untapped value.
The fear of competition among artisans created bottlenecks in innovation and production that it took the Industrial Revolution to overcome. Competition drives innovation, but only up to a point. To truly change the world, we need to turn our attention to creating standards: technical standards as well as ethical standards. By working with our peers and finding opportunities to teach, to learn, to help, to grow, and, most importantly, to work in an ethical way, we foster innovation, we become better at our craft, and we ensure the creation of new generations of artisans. No secret handshakes required. Thank you. I want to take a moment to announce a new project that I have in the works: a book that I'm writing with Naomi Freeman called The Compassionate Coder. In the book, we describe a framework for dealing with the changing role of software developers and for practicing empathy in software development, empathy for the modern age. The book will help you learn how to better integrate your teams for efficient and profitable outcomes, learn about your own role in a changing workplace, and teach you how to be a leader, not only in your workplace but in the larger tech community. We're hoping to publish later this year. You can sign up for updates at CompassionateCoder.com. We're using that to gauge interest in the book, so please do sign up. It's a low-volume mailing list, and we would greatly appreciate your support. Thank you, for real. Thank you. Thanks.