Alright. Good morning, everyone. Alright, where's your energy? Alright, good morning, everyone. It's a community day, energy! Thank you very much. Now we're going to do a quick recap, and then Steve will introduce what this session is about, and all the experience he has. Alright, Steve, over to you. And then we'll introduce some of the announcements. Not all of them, just a few that the two of us think would be great to share with everyone. Alright, I'll hand it over to Steve.

Thank you, Donnie. So I just want to run through the agenda we have here. Today we're going to talk about re:Invent 2022, a quick recap of the best announcements, and we'll also tell you more about re:Invent itself, for those who don't know what re:Invent is. Actually, to introduce myself again: I'm Steve, an AWS Community Hero in Singapore. Donnie over there is a principal developer advocate; my official title for today is Steve's friend. Did you change the order? I changed the order. Why? Because he follows my order. Okay, so yes, he's a very prominent guy, if you look at the blogs, if you see all his conference talks and so on. And he also likes to be known as my friend; I'm not too sure about the other way around, but let's see. It just happens on a day-to-day basis. I work at AWS.

Okay, so, re:Invent. Maybe just a show of hands: how many of you know what re:Invent is? Put up your hand. So it sounds like about half of you don't really know what re:Invent is, and that's what this section is for. re:Invent is the premier technical conference organized by Amazon Web Services, held every single year in Las Vegas. Last year it ran from November 27th to December 1st: 2,000 technical sessions (not sales sessions, technical sessions) and 50,000 attendees. Vegas is always crowded; it's crazy. And most importantly, for people who are interested in announcements.
That's where Christmas comes early: all the new services and features they're going to launch. So really it's just a big conference, many people coming together, with in-person content and also free virtual content that you can find online later on. The highlight of any re:Invent, like any Apple keynote, is the product launches. In the AWS world you have the big three keynotes: Adam Selipsky, Swami, and Dr. Werner. They each make different announcements on different days during the conference. That's where they announce new features; some get pushed to the top, some maybe not so much, but that's where everyone listens out for the keynotes, because all the interesting stuff gets announced there.

And it's not just a technical conference, a conference you come to just for the learning; that alone wouldn't be the best experience. Really, the idea here is community: a community coming together, learning together, networking together, having fun together. That's the real benefit of a conference like re:Invent, and of a conference like the one we have over here. So that's re:Invent: not just about learning, but about meeting people, a melting of minds in that sense. And of course it's a paid conference, and really the best way to go is to join a company that can sponsor you, because it's not cheap to get there from Singapore.

So now we really want to go through the top announcements. We're not going to walk through this wall of text; what we're trying to highlight is that there are a lot of announcements across various categories. Whether you're a developer, in DevOps, networking, or security, or even at CxO level, these announcements will be exciting and different for you. So if you really look, this is what all the top announcements were: three slides, and actually there are more than this.
But we're not going to go through all of it with you, because we think that would be boring. What we really want to go through, since we believe most of the crowd here are developers, are the developer highlights, and this is where I'll pass the time to Donnie, because he'll be going through the highlights of what we're presenting today, with demos for some of the announcements we'll be talking about in more detail. So, Donnie, please.

Thank you. Alright. As Steve mentioned, there was a lot of stuff announced at re:Invent, starting from new services to new features, and the wall of text you just saw, those three slides, those are the highlights. So what we're going to do here is pick some highlights of the highlights, and we break them down into three categories. The first one is builder experience: services that emphasise development on AWS for us as builders and developers. The second category is data and analytics. And lastly, what's the last one? The others: the other highlights. Because there's so much stuff, we tried to break it down.

Okay, so first question to all of you: do you want me to talk more, or do you want me to give you more demos? More talk or more demos? Demos, cool. So first off is AWS Application Composer. Remember, before the cloud era, and really up until now, whenever I want to build an application, I start by designing it in a diagram. How many of you still do that, using a diagram to design your architecture? So most of us still do. But now imagine you have that diagram canvas on steroids. What does that mean? It means that whatever you put into the diagram, it gives you the IaC: you can build infrastructure directly from that diagram. That is AWS Application Composer. AWS Application Composer is very straightforward, and this is how it looks. Right, let me put it here. So this is AWS Application Composer. It's currently in preview.
So there are going to be some changes while we gather feedback from developers like you. I'm going to open the demo. What's really cool about AWS Application Composer is not only that you can drag and drop all the services from the left pane onto the canvas, but that it also has the ability to connect, or sync, with your IDE. So if I choose here, then I create a new temporary folder and select it, right, it's going to create... So what's happening in the background now is that everything I create on this canvas is reflected in my files. Let me open my files: where is it, Documents, Composer. And there you go. This is really cool stuff, because as you make your changes here, if you want to add another DynamoDB table, or add some kind of connector, or create another S3 bucket, that's something you can do in AWS Application Composer. And then if you go to the template, you also have the SAM template, which you can just change and then deploy using the SAM CLI. So that is AWS Application Composer.

Now the next one is Amazon CodeCatalyst. Amazon CodeCatalyst is also in preview. Have you heard about Amazon CodeCatalyst? Cool. I also got this t-shirt from the Amazon CodeCatalyst team, so it's kind of cool; it's kind of a personal obligation for me to present CodeCatalyst. But it really is cool, because the idea with CodeCatalyst is that it's going to be your unified development hub. For example, whenever we do development, we have our Git repo. Then we use another tool to implement CI/CD. Then we use GitHub, or some other version-control hosting service, to maintain the issues or do the project management. And we also use yet another tool to create reports: your unit test report, your uptime report, and all of that.
So imagine all of those components in one single service: that is Amazon CodeCatalyst. You can see here, this is a sample project that I deployed, and you can see all these components, starting from repositories; CI/CD is also baked into Amazon CodeCatalyst, which is really amazing; and then you can see the open pull requests, and the issues assigned to you or created by you. So you have full visibility into what's going on in that repository. But something really cool that I like about CodeCatalyst is that it gives you different environments. You have the flexibility to configure your application environments: say you have a staging environment and a production environment. And it also extends to your dev environment. Say you want to create a pull request or work on a new feature: you can create your development environment, and it has a nice integration with Visual Studio Code, which really minimises the friction of starting up a new IDE or development environment. You just go to CodeCatalyst, click a few buttons, and there you have it. This is the repo I used for the demo, so it has a really nice integration. Right, so that is Amazon CodeCatalyst. Moving on.

So Donnie, just a quick question: with CodeCatalyst, do you need an AWS account then?

Oh, that's a really good question. This is something we introduced at re:Invent as well: it's called AWS Builder ID. AWS Builder ID gives you a federated way of accessing AWS services, and that includes CodeCatalyst. So with CodeCatalyst, you can easily create your Builder ID, and then all those repos, issues, and services are hosted in one account. That means you can invite collaborators, provide them with a development environment, and then you can start working together.
And one thing I really like about CodeCatalyst is that if you're in an organization and you want to standardize the development environment, if you want to standardize the requirements for your project, this is a really cool way to do that.

Okay, so the next one is Amazon CodeWhisperer. How many of you have heard about this? Cool. Amazon CodeWhisperer, as the name says, generates code recommendations. You can use Python, Java, JavaScript, C#, or TypeScript, and it has nice integrations with IDEs as well: you can use Visual Studio Code, IntelliJ, PyCharm, and also AWS Cloud9 and the AWS Lambda console. The way it works really depends on which IDE you want to use CodeWhisperer in. If you want to use it in AWS Cloud9 or directly in the AWS Lambda code editor, you just need to enable it through IAM. But if you want to integrate it with Visual Studio Code, for example, you need to create a Builder ID, the thing I mentioned before that you'd use with Amazon CodeCatalyst. With a Builder ID, you get access to CodeWhisperer as well.

Now, what's really cool about CodeWhisperer: I'm not sure if this is too small. Let's say I want to create a function for saving data into DynamoDB, just as an example. Zoom in? Oh, right, okay, I'll zoom in. Let's say I want to create a function to save data into DynamoDB, with the following attributes. What it's going to do, as you can see, is give us a recommendation. Say I want to name it save_data; good enough. And then, as in this example, I get the recommendation based on the comment I just typed in. So that's cool.
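The save_data function that CodeWhisperer sketches in the demo might look something like the following. This is a minimal sketch under my own assumptions, not the exact code from the demo: the table name and attributes are hypothetical, and the boto3 call is only reached when you actually invoke save_data with AWS credentials configured.

```python
def to_dynamodb_item(data: dict) -> dict:
    """Convert a plain dict into DynamoDB's typed attribute-value format."""
    item = {}
    for key, value in data.items():
        if isinstance(value, bool):           # check bool before int/float:
            item[key] = {"BOOL": value}       # bool is a subclass of int in Python
        elif isinstance(value, (int, float)):
            item[key] = {"N": str(value)}     # DynamoDB numbers travel as strings
        else:
            item[key] = {"S": str(value)}
    return item


def save_data(table_name: str, data: dict) -> None:
    """Save one record into a DynamoDB table (needs AWS credentials)."""
    import boto3  # imported lazily so the converter above works without boto3
    dynamodb = boto3.client("dynamodb")
    dynamodb.put_item(TableName=table_name, Item=to_dynamodb_item(data))
```

Usage would be something like `save_data("users", {"id": "u1", "age": 30, "active": True})`, where "users" is a hypothetical table.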
Now, another cool thing about CodeWhisperer: let's say you're working with libraries or frameworks, like pandas, or Flask in this case. Say I want to create a function to read a DataFrame from a CSV file. Just like that, right? Oops. Yeah, there you go. So it doesn't only work for AWS services; it also works with popular frameworks. You can use pandas, you can use Flask, or even if you're building machine learning workloads, you can use PyTorch with CodeWhisperer as well. But something I really want to highlight in Amazon CodeWhisperer is that it also has a built-in feature to check the security of your code. Say you're building an integration with an API, you put in some keys and secrets, and you want to make sure there are no credentials you're about to share in your GitHub repo. You can just run the security scan. This is something I really like, and I usually do it before I make a commit to my Git repo. So that is Amazon CodeWhisperer.

The next one is Amazon EventBridge Pipes. For me, this is a highlight of AWS re:Invent, because it really helps me as a developer to integrate services on AWS, apply some kind of transformation, and modify the payload on its way to the target destination. What Amazon EventBridge Pipes does is remove the need to write glue code when you want to connect one service to another. For example, how many of you use Amazon SQS? Right, and how many of you use AWS Step Functions? How do you connect these two? That's one of the problems addressed by Amazon EventBridge Pipes. Now, Amazon EventBridge was introduced a couple of years ago; it was the evolution of CloudWatch Events, and last year we also announced Amazon EventBridge Scheduler.
So, just so you know, Amazon EventBridge is a family now, because it has three services. The first one is the event bus: you can build your event-driven applications using the Amazon EventBridge event bus. The second service, announced around November, is called EventBridge Scheduler. The way I like to think about Amazon EventBridge Scheduler is as a crontab on steroids: you can use scheduling rules to initiate or trigger any kind of event with Amazon EventBridge. And the last, most recent service in the Amazon EventBridge family is called Pipes.

So let me show you how it looks. Amazon EventBridge Pipes. Okay, is it too small? Okay, so this is Pipes. With Amazon EventBridge Pipes you can define a source, and this source is going to be the producer of the events. Then you can filter those events, which helps you process only the events you need, and also reduces cost, because EventBridge Pipes only charges you for the events that actually get processed. Then you can also do enrichment. Say you got a payload and you have an existing Lambda function, or you want to call an external API to enrich your payload; you can do that as well. And then you can take everything and also transform your events before they go to the target destination. It's really that simple.

Let me show you how it looks. So this is Pipes, right? We have 25 minutes, so I'm not going to do a live demo, but essentially this is an integration between an SQS queue that I have and a Step Functions state machine. Whenever I pass an event to Amazon SQS, it gets processed by the Step Functions state machine. It's really as easy as that. But what's really amazing about Amazon EventBridge Pipes is the integration with the various services you can choose from.
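The SQS-to-Step-Functions pipe in the demo boils down to a single create_pipe API call. Here is a sketch that assembles the request; the names and ARNs are hypothetical, and the filter pattern uses EventBridge's event-pattern syntax, serialized as a JSON string as the Pipes API expects.

```python
import json


def pipe_request(name, source_arn, target_arn, role_arn, filter_pattern=None):
    """Build the kwargs for the EventBridge Pipes create_pipe call."""
    params = {
        "Name": name,
        "Source": source_arn,   # e.g. the SQS queue ARN
        "Target": target_arn,   # e.g. the Step Functions state machine ARN
        "RoleArn": role_arn,    # role Pipes assumes to read the source and invoke the target
    }
    if filter_pattern is not None:
        # Only events matching the pattern are processed (and billed)
        params["SourceParameters"] = {
            "FilterCriteria": {"Filters": [{"Pattern": json.dumps(filter_pattern)}]}
        }
    return params


# With AWS credentials configured (ARNs hypothetical), creating the pipe would be:
#   boto3.client("pipes").create_pipe(**pipe_request(
#       "orders-pipe", sqs_arn, sfn_arn, role_arn,
#       filter_pattern={"body": {"type": ["order_created"]}}))
```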
For example, you can use Kinesis, SQS, DynamoDB, Amazon MQ, and, this is really interesting, self-managed Apache Kafka. So let's say you're already running Kafka for your application, and you want to ingest all the events from Kafka into your application: you can use Amazon EventBridge Pipes for that as well. So yeah, it's really as simple as that. You send the message, everything goes through the Amazon EventBridge pipe, it's received by AWS Step Functions, and then it gets logged by CloudWatch Logs. The idea of Amazon EventBridge Pipes is just to make it easier for you to create a pipe, a channel, to integrate one service with another. I can sense Steve has a question for me.

So Donnie, with this feature, is there any remaining edge case where you'd need to write some form of glue code between all those services you just mentioned?

I would say no, as long as the source is supported by EventBridge Pipes, and you can see from the list that's quite a few; the target list actually has even more supported services. And what's really amazing about Amazon EventBridge Pipes is that you can also integrate with external third-party APIs. In the enrichment step, say you want to augment your data with data you have in Salesforce: you can do that as well. So it really consolidates the event flow into pipes. Does that make sense? Cool.

So, moving on. We're not supposed to run this, right? Okay, we weren't supposed to present this, but I think I accidentally unhid this slide. But it's a really cool service. Do you want to take some of it? No, I think you should go ahead; you can't disappoint them now. So, how many of you work in the advertising and marketing industry? We have a couple of you, so at least we have somebody who'll be really keen to hear about this: Clean Rooms.
Clean Rooms is a technology to enable collaboration between one party and another. Imagine you're working in the marketing industry and you're running a social media campaign, and you also have a client who is running a campaign. Let's say I'm an airline: I run a campaign, and I work together with a third-party company to run my social media campaign. We know we should collaborate, but we don't want to reveal our data. That's the issue: we understand the need to collaborate, but we don't want to share the underlying raw data. That's where the technology called Clean Rooms comes in. Clean Rooms helps you collaborate without sharing raw data, and more importantly, you can define your own restrictions. Say you have a table with specific columns you don't want to share with the others, or you only want to allow certain kinds of functions in queries against this data. That's something you can do with Clean Rooms.

So, I do have a demo on this, but I'm not sure... right, you want me to demo this? Cool. Okay, so this is Clean Rooms. This demo requires two accounts: the first is the initiator, who creates the collaboration, and the other one contributes their data sets. Now, I already have this collaboration as a membership. The idea is that I'm an airline company, and I have this campaign with the social media company. So I have two members in this collaboration: myself as the airline, and Steve, who is the CEO of the social media company. And Steve and the social media company have already shared with me the tables, which all reside in AWS Glue. You can see here: I have a business-travelers table, and these are all the columns: hashed email, name, status. Right. So, to show you what it means to collaborate without revealing underlying data.
What I have now all resides in AWS Clean Rooms: I have the data from the social media company, and I also share my data with the social media company. Now, let's say I want to find the emails of the customers, or potential customers, who participated in this particular campaign. This is my query: I want to select the hashed email and the date. But if I run this, I can see that I can't do that. Why? Because the hashed email is not allowed in a SELECT query. On the social media company's side, on Steve's side, he has already defined all the tables and all the data, and he has also defined what I can do as the airline company: he set it to allow aggregation queries only. So let's say I just want a count: how many people participated in this campaign, right? And it gives me the results. So that's the idea of Clean Rooms.

And what's even more exciting about Clean Rooms is that it has optional encryption, and with this optional encryption it can further protect your data from the other side. The way it works is that with this cryptographic computing capability, Steve can also define a key to protect his data, making it effectively unreadable from my end, while I can still query the data. So that is the idea of AWS Clean Rooms: to help you collaborate with other parties without sharing or revealing the underlying raw data.

So Donnie, I think, as the CEO, I'll ask the dumb question: both of us must be AWS customers, right? That was the only question I wanted to ask.

Yes, with AWS Clean Rooms you need to have your data in AWS and already be running AWS accounts. The way it works at onboarding, let me see if I can show you. This is the onboarding for collaborations: you need to define your AWS account ID, and you also need to define the AWS account IDs of your members.
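As an aside, the hashed-email columns in the demo tables reflect a common pattern around clean rooms: each party hashes identifiers the same way before contributing data, so records can be matched without exposing raw emails. This is not part of the Clean Rooms API itself; below is a minimal sketch, assuming both parties agree on the normalization and an optional shared salt.

```python
import hashlib


def hash_email(email: str, shared_salt: str = "") -> str:
    """Normalize an email and hash it so both parties produce the same token."""
    normalized = email.strip().lower()  # both sides must normalize identically
    digest = hashlib.sha256((shared_salt + normalized).encode("utf-8"))
    return digest.hexdigest()
```

Both the airline and the social media company would run their customer emails through the same function before loading the tables into AWS Glue; the collaboration can then match on the token without either side seeing the other's raw addresses.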
You can also add up to five members; since we already have two here, you can add three more. So this is really good for doing collaborations. So, moving on, what's next? I hope I didn't skip a slide.

The next section is the top data and analytics highlights. The first one is Amazon AppFlow. How many of you use Amazon AppFlow? Oh my goodness. Okay, one. I think we need to do more meetups talking about AppFlow. Amazon AppFlow is a fully managed service that helps you transfer data from external SaaS applications to and from your AWS account. It's as easy as it sounds. How many of you use external third-party SaaS applications like Salesforce, Jira, Datadog, Google Analytics? Come on, come on. So imagine having the capability to grab that data from your SaaS applications into your AWS account. The way we used to grab that data was by writing custom code, a custom application, mapping all the fields from those SaaS API data fields into our own database. Then we also had to maintain the API versioning. And we also had to manage the scheduling of this application: when it runs, when it stops, when it needs to time out, and what happens if we need to define another schema. That is what Amazon AppFlow takes care of. Amazon AppFlow really helps you create an integration, a bridge, a two-way sync for some of the connectors, from a SaaS API into your AWS account. Now imagine having the data that sits in your SaaS applications in Amazon S3 or Amazon Redshift. It gives you unified data, a data lake, in your AWS account, so you can go further with your analytics, your BI reporting, or even your machine learning workloads. So that is Amazon AppFlow. The real highlight is that Amazon AppFlow now supports over 50 applications. That means you can now integrate with Facebook.
You can integrate with Google Ads, Google Search Console, Instagram, LinkedIn, Mailchimp, SendGrid, Datadog: many connectors you can use with Amazon AppFlow. So let me show you how it looks. I always forget to... okay, so this is Amazon AppFlow. Whenever you want to create some kind of integration, it's called a flow. Here you can see you just need to connect your source and destination, map the source fields to the destination, add filters, and then activate or run the flow. You can also have it scheduled: really cool stuff, especially if you have data sitting in third-party applications. Now, let me show you what it looks like to create a flow. I'm not going to actually create a flow, but I just want to show you the sources you can choose from. Once you've put in a name for your flow, you have all of these integrations, right? For example, we have a connector from Trend Micro, which is really cool. You create the connection with this source application, and then you define the destination. You can see there's a whole list of supported services for getting data to and from third-party applications; in this case, I choose Amazon S3. Well, this is really cool.

Now, another cool thing we announced at re:Invent is that AppFlow has a nice integration with the AWS Glue Data Catalog. If you've been working with AWS services, with AWS Glue, S3, Redshift, you probably know where this is going. Having this nice integration with the AWS Glue Data Catalog means you now have an index of your data sitting in AWS Glue, so you can extend the use cases to another stage: for example, machine learning workloads.
Now, let's say you have all the data from third parties and you want to get it into Amazon S3 to build your own data lake, but at the same time you also want to feed some of it into your machine learning pipelines. You can do that, because if you use the integration with the AWS Glue Data Catalog, you can use that data in Amazon SageMaker Data Wrangler. It's currently loading; Data Wrangler, to be honest, is a feature of Amazon SageMaker that helps you do data preparation. Once you've enabled the AWS Glue Data Catalog, you can import that data, and as you can see, these are the same connectors we saw in AppFlow. So this was really two new features packaged in one announcement: with AppFlow you now have more flexibility in connecting various SaaS applications, and you have this nice integration if you want to build machine learning workloads directly on top of your SaaS integrations with AppFlow. Alright. Okay. Do you have any questions about that?

I think that really takes away a lot of the heavy lifting, because nowadays most companies just use more and more SaaS. And of course, as you go up the maturity curve, you want to bring the data together from all the sources and do data processing. I can see that I would definitely use that when I have the use case. I'm not going to write all that glue code; I don't want to do that. So this is really useful: AppFlow integration with 50-plus connectors. I know it's going to make it easy for us to build a data lake simply by using AppFlow.

Cool. So, next up is Amazon Aurora zero-ETL integration with Amazon Redshift. This is currently in preview. In short: how many of you use Amazon Aurora? Cool. How many of you use Redshift? Right, I should build a demo on this. With this new feature, you now have a really nice integration between Amazon Aurora and Amazon Redshift.
What it means is that whenever you write data into Amazon Aurora, it becomes available in Amazon Redshift as well. It removes the complexity of synchronizing from Amazon Aurora to Amazon Redshift, because with this integration you can just set it up in the console. So that's kind of straightforward.

Next up is Amazon RDS for MySQL and MariaDB, and Amazon Aurora, Blue/Green Deployments. How many of you are dealing with databases, working with databases, on a day-to-day basis? How many of you know the pain of migrating a database, migrating the version? I see. So, I'm happy to share with you that with Blue/Green Deployments, you no longer need to manually create your blue/green infrastructure whenever you want to do database operations, like upgrading major or minor versions, applying patches, or defining a new schema. This is something you can now do directly from the console. For those who aren't really familiar with blue/green deployment, it's a way to migrate from one state to a new state: blue is the current production environment, and green is the staging environment that we switch over to once the new infrastructure is ready for us to use.
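The console flow described above can also be scripted against the RDS API, via create_blue_green_deployment and, once the green environment is in sync and tested, switchover_blue_green_deployment. A sketch, with the deployment name, ARN, and engine version all hypothetical:

```python
def blue_green_request(name, source_db_arn, target_engine_version=None):
    """Build the kwargs for rds.create_blue_green_deployment()."""
    params = {
        "BlueGreenDeploymentName": name,
        "Source": source_db_arn,  # ARN of the current (blue) DB instance or cluster
    }
    if target_engine_version is not None:
        # e.g. have the green environment created on a newer engine version
        params["TargetEngineVersion"] = target_engine_version
    return params


# With AWS credentials configured (values hypothetical):
#   rds = boto3.client("rds")
#   bg = rds.create_blue_green_deployment(**blue_green_request(
#       "upgrade-demo", "arn:aws:rds:ap-southeast-1:123456789012:db:mydb", "8.0.32"))
#   # ...later, once the green environment is ready, switch traffic over:
#   rds.switchover_blue_green_deployment(
#       BlueGreenDeploymentIdentifier=bg["BlueGreenDeployment"]["BlueGreenDeploymentIdentifier"])
```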
Now, in the past, the approach we used was to create a replica and then promote the replica, but we don't need to do that anymore, because with Blue/Green Deployments we can easily switch from the current state to the new state. And what I really like about this feature is that you don't need to change any code, which is awesome, because all of that switching, redirecting the traffic from the blue to the green environment, is fully managed by RDS Blue/Green Deployments.

Alright. So, Donnie, when I saw this announcement come up I was actually quite happy, because in the past year I had to deal with two end-of-life Aurora database upgrades, and it was always a nightmare. Not as bad as Oracle upgrades on-prem, I tell you. So I will ask about the elephant in the room: where's Aurora Postgres?

So, thank you for your feedback, Steve. We got a lot of feedback from customers that you also want this integration for Postgres. My answer is: stay in touch with the AWS user group in Singapore; we're going to give you updates whenever there's any support for Postgres. Okay.

Right, next up is Amazon VPC Lattice. This is super cool. VPC Lattice really simplifies the way we connect services sitting in, let's say, different subnets. I don't have a demo on this; well, I do have a demo, but I just want to share a bit about it in this recap. Let's say you have an application running on EC2 in subnet A, and you also have a Lambda function running in a different VPC, and you want to connect from this EC2 instance to this Lambda: you want to call an HTTP API. The way it works is that it creates an extra networking layer that gives you a DNS address you can call. It supports HTTP/1, HTTP/2, HTTPS with TLS, and also gRPC. So it really helps you build that kind of connectivity across accounts, and connectivity from VPCs to services. Now, if you're interested in working with
Amazon VPC Lattice, I really suggest you go to Bumi's presentation, where he's going to talk a lot about the use cases for Amazon VPC Lattice. And also, I almost forgot to mention: if you want to learn more about EventBridge and how it helps you build asynchronous or event-driven applications, I really suggest you go to Sonu's talk. He's going to elaborate on more use cases, with architectures, for Amazon EventBridge. So, I think that's all. Steve?

So, these are all the announcements; this really just scratches the surface. If you really want to find out more, just look through the blogs; this one, for instance, is from the blogs. I think those of you who are familiar with all the announcements know this exists. So yeah, just go and search. And the next one: YouTube. All the sessions from re:Invent are there for free, so just go for it.

And also, if anyone is interested in using those announced services that are in preview and you really want that kind of access, just reach out to me, and I can probably help get you onto the waitlist. If you want to try a limited preview, blue/green, things like that, just let me know. Alright, thank you so much, everyone. For the rest of the day, I'm pretty sure you're going to have much more fun in the upcoming sessions. And yeah, I think that's all from me.