Thank you so much for the introduction. Again, my name is Brad, and I'm a developer programs engineer at Google. What that means is that I spend a lot of time working closely with developers, attending conferences, and speaking with you to learn about the things you may like or dislike about some of the projects we have going on. I focus specifically on machine learning and big data, and one of the projects that falls into those categories is TensorFlow. So today I'm going to tell you a bit about TensorFlow 2.0, how you can get started with it, and some of the changes from the TensorFlow 1.x versions.

Before we dive in, I'd just like to get a show of hands. Who here has done machine learning before, in any capacity? Just put your hand up. Okay, cool, that's a good chunk of you. What about deep learning specifically? Okay. And TensorFlow, either version? Okay, awesome.

So let's get into it. To give you an idea of what we're going to discuss today: I'm going to introduce TensorFlow and tell you what it is generally, just so we're all on the same page. I'm then going to discuss TensorFlow at Google and how we're using it on some of the projects we're working on. We'll then discuss why Python is so important to TensorFlow and how the two go hand in hand. After that we'll cover TensorFlow 2.0 and some of its features, as well as how to upgrade if you're using TensorFlow 1.x right now. And then, if you haven't used TensorFlow at all, I'll point you to some of the resources available for you to continue your learning and to use TensorFlow 2.0.

Okay, so what is TensorFlow? It's an open-source deep learning library developed at Google. It was released in 2015, though it actually existed a little before that; we were using it for projects internally, and then we released it as an open-source project in 2015.

So what is it? TensorFlow is a Python framework that includes a lot of utilities to help you write deep neural networks, deep neural networks of course being the main component of what makes deep learning what it is. A lot of deep learning involves mathematics, statistics, linear algebra, and low-level system optimizations, and what TensorFlow does is abstract those details away from you so that you only have to worry about actually writing your model. It takes what would otherwise be complicated steps and makes them easy to use.

TensorFlow provides support for both GPUs and TPUs. These are hardware accelerators that work especially well for linear algebra and mathematical computation, and TensorFlow is able to utilize this hardware right out of the box so that you get those benefits automatically.

Today, TensorFlow has over 2,000 contributors all over the world, and the 2.0 beta version was released just last month, in June. As I mentioned, TensorFlow was released publicly in 2015, and since then we've seen massive growth, both in internal use and throughout the community, and we're continuously adding new features. Here's a brief timeline showing some of the changes that were made over time. And as I mentioned, TensorFlow is being used all over the world; I love looking at this graph to see how TensorFlow is able to help developers build their machine learning systems globally. It really blows my mind. We have over 2,000 contributors and there's a lot of activity in the repo, which is super exciting.

So TensorFlow is used all over the world, but it's also used internally. We use it to power all of the machine learning and AI that we have going on inside
Google, so I just want to tell you some examples of how we're actually using this stuff.

One of the first things I like to talk about is how our data centers are powered using AI. Given that we are Google and the scale we operate at, we have a lot of data centers doing a lot of computation and using a lot of power. What we're actually able to do is use AI and TensorFlow to help optimize the usage of these data centers: to reduce bandwidth, make sure network connections are optimized, and reduce power consumption as well. This helps the environment and is just a better way to run these data centers. So we're using TensorFlow and AI to do that.

We're also using these technologies for global localization in Google Maps. For those of you who may have used Google Maps before, you might know that we have an augmented reality feature where, if you're walking through a city such as Basel, you can use it to help you get from point A to point B, directly on your phone, and you can see an example of that here: the directions are actually showing up on your screen
so you know which street to go down. These features use TensorFlow and artificial intelligence. We're also using it heavily within the Google Pixel itself, to help optimize some of the software we have going on there. In this use case I'm talking about portrait mode on the Google Pixel, which blurs out the background of an image so that the focus lands specifically on whatever it is you want to focus on.

Then here's an example of an audio synthesizer. What we have here is effectively a Kaoss pad, for those of you who may have used one before. The idea is that you can slide the cursor around on the pad and it will actually generate music, and this is driven by a model that was trained using TensorFlow.

We're also using TensorFlow and AI for medical research. In this use case we have two images: the one on the left is a retinal image of a healthy eye, and the one on the right is an image of an eye that has what we call diabetic retinopathy. There's research going on that uses TensorFlow and computer vision to predict which one of these is a healthy eye versus an unhealthy eye.

And then this is probably my favorite example of the bunch. Here we're using AI and TensorFlow to help us predict whether or not objects in space are planets. Just a brief astronomy lesson on how this works: if you look at something like the Sun, imagine you have a large body of light and something moves in front of it. The brightness of that object will decrease ever so slightly, but enough that we can use telescopes to pick up the differences in brightness. We can then graph that, as we see here on the right, and using artificial intelligence we can predict whether those fluctuations in brightness result from an actual planet or from another object. So this is another example of the sort of research we're doing.

Okay, so I talked about some examples, and now I want to briefly discuss why Python is so important and why we're using Python with TensorFlow. Python is a great choice for scientific computing, of course. It's very easy to use, which I would hope everyone here agrees with, and it has a super rich ecosystem for doing data science: you have tools such as NumPy, scikit-learn, and pandas. If we look at the success of these, a lot of them stem from the NumPy package itself. NumPy is great because it has the performance of C but the high-level API and ease of use of Python. When TensorFlow was being built, the idea was that we wanted it to have the simplicity that NumPy has: the performance of C, but also the ease of use of Python. That's why we're able to use Python for this; we get to leverage the best of both worlds.

So let's talk about 2.0 and some of the changes that have come with it. For those of you who may have used TensorFlow 1.x before, you might have realized that it's great.
It's powerful and there's a lot it can do, but it definitely had its shortcomings, and I'll be the first to admit them, having used it personally. Some of the things I found frustrating were using session.run, which didn't necessarily feel super Pythonic, as well as having multiple different ways to do the same thing. An RNN layer, for instance, was implemented multiple different ways, and knowing which one to use could sometimes be a little frustrating.

Both of these things were addressed in TensorFlow 2.0. The redundancy in the API was cleaned up a lot, so there should now be one way to do most things, and we're focusing on removing the remaining redundancies as we continue to develop the project. Also, session.run has been removed in favor of a concept called eager execution, which effectively means that your TensorFlow code runs just like NumPy code; I'll show you an example of that in just a moment.

Another change is that we've introduced Keras as the main high-level API. Who here has used Keras before? Quick show of hands. Okay. I don't know about you, but I personally loved using Keras; it's super easy to use. So we've taken Keras and adopted it into the TensorFlow project, and more on that a little later.

We also want to make sure that TensorFlow is powerful and flexible, usable for research purposes and for production purposes. We really want to get this into the hands of as many people as possible and help as many people as possible with their projects, so it's super flexible. And given that we've tested TensorFlow at Google scale, we know that it works at that scale, so it's super scalable and you should be able to use it for your use case as well.

We're also able to deploy TensorFlow code anywhere, or at least that's what we're working toward: we're continuing to make this as flexible as it can be, and we want to make sure you have different options for where you can run your TensorFlow models. The first example is TensorFlow Extended, which is a Python library that you can run on your servers to productionize your models. We also have a package called TensorFlow Lite, which lets you run your TensorFlow models on edge devices, and you can also run your TensorFlow models in the browser using TensorFlow.js.

So why and how are we able to do this? We use something called a SavedModel, which is the format you can output your TensorFlow model to once you've trained it. For those of you who have done data science before and built a machine learning model, you know that you start off by reading and processing the data. You then apply layers to it via tf.keras, or use TensorFlow estimators, which are black-box models. You then choose how to distribute the training, either over just the CPU on your laptop or over GPUs or TPUs on a cluster. Once you've done all that and trained the model, you can export it into what we call a SavedModel, and this SavedModel is a universal format that you can load into any one of the deployment options I mentioned earlier. So in this case, you can use TensorFlow Extended and TensorFlow Serving to run it on servers, TensorFlow Lite for edge devices as I mentioned, and TensorFlow.js to run it in the browser. We also have other language bindings available; a lot of these are community driven, but some examples are C, Java, Go, C#, Rust, and R. Using the SavedModel format lets you run these models anywhere.

Some other packages exist in the TensorFlow ecosystem for more niche use cases.
I have some examples listed here: TF Probability, TF Agents, Tensor2Tensor. As I mentioned, these exist for more specific use cases. For instance, TF Agents is a package for doing reinforcement learning; it has higher-level APIs stacked on top of TensorFlow to help you build reinforcement learning systems. TF Text is used for natural language processing with TensorFlow. There's a whole long list of these, and they're definitely worth checking out if you have a specific use case you want to use TensorFlow for.

We're also introducing TensorFlow Hub, which you can loosely consider the GitHub of models, in that you can store and download pre-built models there. You can get started working with TensorFlow and machine learning using these models, modify them, and do whatever you want with them; it's just a place for you to start working with machine learning.

Earlier I mentioned that you can use TensorFlow 2.0 just like NumPy. For those of you who have used NumPy before, this code may look familiar: we're creating a two-by-two matrix, doing a multiplication on it, and then printing it out immediately. We couldn't actually do this with TensorFlow 1.x: you had to initialize the variables and then run the graph, so there was a bit more involved than just creating the matrix, doing the mathematical operation, and printing it. So this is a really nice feature that makes TensorFlow much easier to use.

So, to talk about the specifics of what's gone and what's new: I keep mentioning this, but session.run is gone.
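Concretely, the eager-style snippet described a moment ago looks something like this; the exact matrix values here are my own illustration, not necessarily the ones from the slide:

```python
import tensorflow as tf

# In TensorFlow 2.0, operations execute eagerly: no graph building,
# no variable initialization, no session.run. This behaves like NumPy.
a = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])  # a two-by-two matrix
b = tf.matmul(a, a)            # multiply the matrix by itself

# The result is a concrete value we can print immediately.
print(b.numpy())  # [[ 7. 10.] [15. 22.]]
```

In 1.x, the same computation needed a `tf.Session` and an explicit run call before any value was visible.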
We don't have to worry about that anymore. A lot of the TensorFlow-specific operators, such as the conditionals and while loops that you previously had to use TensorFlow-specific operations for, have also been removed; you can just use normal Python code. There's a reason for that, which involves a new feature I'll mention in just a second. The last thing I want to mention that's gone is tf.contrib. The reason for this is that the project just got so large, with so much involvement from the community, that we had to remove it from the base build; it was simply too much memory. It still exists, but it has been removed from the default distribution, so if you just do pip install tensorflow, you won't get it anymore.

Then some of the things that are new include eager execution enabled by default, which allows you to run TensorFlow in a NumPy-esque style; Keras as the main high-level API; and tf.function, which is a Python decorator that lets you write regular Python code, with your while loops and if statements, that gets compiled down to TensorFlow operations. I'll talk a little more about that in just a moment as well.

So the next thing I want to talk about is tf.keras. I asked earlier who here has used Keras, and I mentioned that I personally really like Keras. The TensorFlow community agrees, so what we've actually done is implement the Keras API into TensorFlow itself as the main high-level API. What does that actually mean, for those of you who have used Keras before?
You may know that Keras serves as an API specification. It isn't in and of itself an engine; it relies on something like TensorFlow or Theano as the backend. All we've done is take the API spec of Keras and move it into TensorFlow. The two projects still exist separately, but they are very closely related. If you want to use regular Keras with whatever backend you'd like, you can just do pip install keras and then import keras; but if you want to use TensorFlow specifically, you'd install TensorFlow and then from tensorflow import keras, and the experience should be more or less the same.

When you're actually using Keras with TensorFlow, there are two ways I like to describe getting started. One of them I call the method for beginners; the other I call the method for experts. They're more or less interchangeable, and I actually like the beginners' method more, but it depends on your use case.

If you're using the beginners' method, you would import a Keras Sequential model and then add the layers row by row, where each line represents a layer of your model. In this case, it's just five lines of code and you have a model built. Once you have this, you compile the model, which essentially makes sure that the layers line up and that the input and output sizes are correct, and you provide your optimizer, your loss function, and the metrics you want to optimize for. You then fit it on your training data and evaluate it on your test data.

Then there's also what we call the experts' method. This is effectively Python subclassing: you inherit from the tf.keras.Model class to create a model from scratch, which gives you more customizability. You then add a call function, and you're able to use Keras layers like you otherwise would.

So what's the difference between these two? We talked about it briefly, but to give you a general idea: if you're using the beginners' method, which we call the symbolic method, using the Keras Sequential API, your model is a graph of layers. Anything that compiles will run, and TensorFlow helps you debug by catching errors at compile time. This removes a lot of the debugging burden and makes the code a bit easier to develop. In an instance where you may want to use the imperative method, or what we call the experts' method, your model is Python bytecode: it runs just like Python code would, and you have complete flexibility and control over what you're building, but with that it becomes a bit harder to debug and a bit harder to maintain. There are pros and cons to each method; it really just depends on your specific use case.

So next I want to talk about tf.function. I mentioned earlier that this is something that lets you run Python code just as you normally would. What do I mean by that?
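As a rough sketch of the idea, using a toy countdown function of my own rather than the example from the slides:

```python
import tensorflow as tf

# Plain Python control flow: a while loop and a comparison. The
# tf.function decorator hands this to AutoGraph, which compiles the
# loop into TensorFlow graph operations behind the scenes.
@tf.function
def count_down(x):
    while x > 0:      # becomes a graph-level loop in the traced function
        x = x - 1
    return x

result = count_down(tf.constant(5))
print(result.numpy())  # 0

# To peek at AutoGraph's translation of the original Python:
# print(tf.autograph.to_code(count_down.python_function))
```

The function is still called like ordinary Python, but repeated calls run the compiled graph rather than re-executing the Python loop.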
Let's say you have a function here that just calls an LSTM cell from a deep neural network. If we benchmark it, we'll see that it takes, let's say, 0.03 seconds. What we're able to do to convert this into TensorFlow code is add the tf.function decorator, just one extra line of code, and you'll see that we get about a nine-times speedup in this example. The idea is that you can do this on any Python code that you have.

The reason this is possible is a technology called AutoGraph. What it does is take any Python function you have and, as I mentioned, convert it into the appropriate TensorFlow code. If you want to see what that looks like, you can use the tf.autograph.to_code function: it will take this function here and change it into this. You don't need to know how this works, and it isn't necessary for building the model, but it can sometimes be interesting to see what's going on underneath the hood.

Next we'll talk about distribution strategies. I mentioned how we want TensorFlow to be flexible and scalable and how you can use it across different hardware environments. Let's say you have this model here that you've built locally on your laptop; say you trained it on a couple hundred rows just to make sure that it works and that you have something reasonable before you deploy it at a larger scale. If you then want to move it over to whatever hardware cluster you have set up, all you'd have to do is add it within the scope of a distribution strategy. Distribution strategies are effectively ways for you to take the code you have and deploy it over your hardware cluster. In this case we're using the MirroredStrategy: if you have, let's say, four GPUs, it will take the same model and replicate it across all the different GPUs. There are different ways to do this; you can also take a large model and split it up over multiple GPUs, but that's a little outside the scope of this talk, and in this example we're just using the MirroredStrategy.

The next thing I want to talk about is TensorFlow Datasets. This is one of my personal favorite features; I think it helps developers get up and going with machine learning much faster. For those of you who have worked with data before, you know that it can sometimes be very difficult to get a good dataset to work with, and models are only as good as their data, as I like to say. What we have is a bunch of datasets available for you to use within the TensorFlow Datasets package. There's a list of these that I'll show on the next slide, but the idea is that you load whatever dataset you want, split it into training and test sets, and then take that data and plug it into any model you want. I'm using the cats-versus-dogs example here, but we have a whole long list of them, all available at tensorflow.org/datasets. Some examples include ones you may have seen before: MNIST, CIFAR-10, ImageNet, the Titanic dataset. If you're interested in seeing the entire library of what we have available, go to tensorflow.org/datasets.

So let's say you're using TensorFlow 1.x and you want to upgrade to 2.0. How can you do that?
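Before moving on, here is a hedged sketch tying the last few pieces together, the load-split-fit workflow with a Sequential model. To keep it self-contained, a tiny synthetic dataset stands in for a real tensorflow_datasets download such as cats_vs_dogs, and the two-layer architecture is my own invention, not the model from the slides:

```python
import numpy as np
import tensorflow as tf

# Stand-in data; in practice you might fetch a real dataset with
# tensorflow_datasets, e.g. tfds.load("cats_vs_dogs").
x_train = np.random.rand(64, 4).astype("float32")
y_train = (x_train.sum(axis=1) > 2.0).astype("float32")

# The "beginners'" Keras style: a Sequential model built layer by layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# compile() wires up the optimizer, the loss, and the metrics to track.
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=2, verbose=0)
loss, acc = model.evaluate(x_train, y_train, verbose=0)

# To run the same model across several GPUs, the model construction and
# compile calls would simply move inside
# `with tf.distribute.MirroredStrategy().scope():`.
```

Nothing about the model-building code itself changes when it moves inside a strategy scope; that is the point of distribution strategies.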
We have a bunch of migration guides available on our website, tensorflow.org, so that would definitely be the first place I recommend going if you want to learn how to do this. We also have a module available called tf.compat.v1: some of the APIs are deprecated in 2.0, and this module gives you access to the older APIs if you're not ready to fully move away from them. This is also relevant to the tf_upgrade_v2 script. You can execute this script on top of any Python file, and it will take the TensorFlow 1.x code and convert it to 2.0 code, similar to how the Python 2to3 script works, if any of you have used that before. It will tell you what was changed between the two versions, apply the renames, and show you what was actually changed inside the scripts themselves.

If you're curious about how to get started with TensorFlow generally, I keep mentioning this, but definitely go to the website at tensorflow.org. And if you want to get started today, you can install the beta right now using pip install -U --pre tensorflow, so feel free to do that now or at the conclusion of the talk. On tensorflow.org we have tons of resources available for you: Colabs, introductions, documentation, API specs, all of it is available there. We also have partnerships with Udacity and Coursera, with TensorFlow courses specifically designed to help you get started, so I definitely recommend taking a look at those if you're interested in a deeper dive with class instructors.
We also work with deeplearning.ai, which is run by Andrew Ng, who is one of the biggest names in the machine learning community today. We're also on GitHub, of course, so if you're interested in getting involved with the project, definitely take a look at the GitHub repository. We'd love to hear your feedback, and if you want to add new features or get involved in the open-source project in any way, by all means, we'd love to have you.

And lastly, I just want to mention two extra projects that we have going on in the TensorFlow community: Swift for TensorFlow, and TensorFlow.js, which I mentioned earlier. Swift for TensorFlow is a movement to use Swift to develop machine learning models. Swift in and of itself has become increasingly popular in the data science community for what people argue fixes a lot of the shortcomings that come with Python. That's definitely debatable, but I think it's super interesting, so I recommend checking it out if you're curious. And TensorFlow.js allows you to run machine learning models using JavaScript in the browser, or on servers using Node; it works with both regular JavaScript and Node. Super interesting, and again, if you're curious, definitely check that out as well.

And with that, I issue a call to action to go build: definitely go and install the project, continue to learn about it, and let us know what you think. So thank you all for listening, and I'll stick around for a few minutes if anyone has questions. Thanks.

Okay, question time. Can you raise your hand if you'd like to ask a question, please? No questions? Surely there are. Okay.

Thank you very much for that. Yeah, so, some new information about 2.0.
It's very useful to know. Something that I'm always wondering about is how people actually curate the information they get out of training, the improvement in the losses over time, and how they manage it: where do they store all of the models, locally say, and how do they evaluate which model has been performing best for a certain set of examples?

Sure. I think you asked a couple of things in there, so I'll try to answer them one by one. One thing you asked is where models get stored. If I heard you correctly, one way to do that is to store the model in something like a bucket with whatever cloud provider you're using: you store it in some central location and then access the model via an API call. That's one way to do it, since the model just gets output as a file.

In terms of evaluating whether a model is actually good, that depends on what your use case is; there are different metrics for evaluating how effective a model is. In some cases you might want to use accuracy, which is just: given a hundred samples, how many did it correctly predict? But that's not always going to be appropriate. If you're building a model for medicine that detects some very rare disease, and you just say that every case is negative, you're going to have a very high accuracy rate, but that's obviously not helpful for determining whether the model works. So then you would use metrics called precision and recall to evaluate whether it's a good model. And you can tune this using different hyperparameters: for all of these models, there are different values you can set when you're building the model. So the best way would be to train several models using these different values and see which one is best for your use case. There's definitely a bit of trial and error in this, and as you do it more you gain some intuition, but at the end of the day it's a lot of guesswork, loosely speaking.

Next question.

Yeah, hi, thanks for the talk, it was very informative. You told us that TensorFlow 2.0 moves in the direction of the Keras interface, but I think you had one sentence that said it's not a hundred percent. Is there some interesting case where there's this new tensorflow.keras thing that's not compatible for people who are using Keras now, which would prevent them from moving to tf.keras?

Thanks for the question. I think the biggest pull at this point would be if for some reason you didn't want to use TensorFlow as the underlying engine. In terms of the API, I don't know of anything specific that's significant enough to say don't move to tf.keras, but yeah, I guess that would be the one specific use case I could think of.

Any other questions?

Thanks for the talk. I think you're going in an interesting direction with TensorFlow. When do you think there will be a stable release? Right now it's in beta.

I'm honestly not sure; sometime in 2019 is what I keep hearing, but definitely keep a lookout for it. The alpha was released in March and the beta was released last month, so I would expect it sometime soon.

Okay, thank you.

Yep. Any other questions?

Hello. I'm wondering what's the relationship between TensorFlow and all these other TensorFlow libraries that you mentioned, like TF Agents and TF Probability? Is it the distribution scheme that's the same, or what's the relationship between all these entities?
Sure. So TensorFlow in and of itself has these raw variables, if you will, and the ability to build models, much like you would use something like NumPy. Effectively, something like TF Agents is just built using these TensorFlow objects: you might implement something like a Q-learning function, and that's basically using the TensorFlow objects underneath the hood. So they're just built on top of TensorFlow, similar to how other libraries might be built on top of NumPy.

Other questions?

Okay, can I ask a question, please? I teach young people, and some of them are moving through things at a rate of knots, and they are tremendously interested in machine learning and artificial intelligence. If they took your set of tutorials on this subject and worked through them independently, or with a teacher's help, how easy or difficult do you think it would be for a simple project?

So I think there are a ton of examples available for some of the simpler use cases, like computer vision or natural language processing; there are a lot of resources available. I think it would be enough to get someone started. A lot of the introductory courses, on Udacity and Coursera for example, also go through some of the more common examples, so those would definitely be another good place to go. For simple projects, I think it's pretty easy to get started with this.

Thank you.

Yeah, hi, sorry, just to quickly carry on from what you were saying about the Udacity course: I was interested to know, is that going to be on TensorFlow 2, or is it still teaching the old TensorFlow, with some detail about the new TensorFlow coming in?

It should use TensorFlow 2; it should be an introduction to using this stuff specifically.

Okay, next question.
Thank you for the talk. Seeing that Keras is being integrated, how about the estimators? I mean, does it make sense to continue using estimators with the new Keras integration? Thank you.

Sure, that's a very good question. The estimators are not being further developed, so they will long-term be deprecated in favor of the Keras APIs. They're still there, but I wouldn't expect any new changes to come to them in the near future.

Okay, next question. No more questions? Are you sure? I'll stick around for a couple of minutes if anyone has any, and we can talk offline.

Okay, can you put your hands together for a round of applause for Brad Miro? Thank you.