I have a friend here. Let me move over to the right. So, to start off, I want to ask one question: how many of you have bought an HDB flat recently? Nobody, OK. So, as a refresher, let me tell you what we did before this. Me and Yoxie went looking for data on data.gov.sg, which is the official portal for a lot of open data about Singapore, from the Singapore government. What we found was the HDB resale flat price data. So we looked at this and asked ourselves how we could visualise it. What we did was use a JavaScript charting library on the resale data, with filters for the different towns and so on, and we looked at the different prices. We plotted the average price per month, together with the minimum and the median price. So, with that, we tried to visualise and present the data in a more meaningful format, namely charts. As for the charting library, it's amazing how easily you can take data like this, as an array of prices, and have it rendered as a chart. It also helps with filtering, so you can drill down into the data. If you want to know, say, more about the prices in May 2005, you can check the points, and you can attach functions to the click event; with that, what we did was basically list down all the transactions that were made during that particular month. So this is one of the things that we did with the data. The other one was, we wanted to know where in Singapore the most expensive HDB flats are. You have the big Pinnacle here. The Pinnacle is close by. So if you look at where the hottest places in Singapore are, you can see it's not a fully accurate representation.
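The monthly min/median/max summary described above can be sketched roughly as follows. The field names `month` and `resale_price` are assumptions about the dataset's shape, not confirmed by the talk:

```javascript
// Sketch: summarise resale transactions into min / median / max per
// month. Assumes each record looks like
//   { month: "2015-03", resale_price: 400000 }
// (field names are an assumption, not confirmed by the talk).
function summariseByMonth(records) {
  const byMonth = {};
  for (const r of records) {
    // group the prices by month
    (byMonth[r.month] = byMonth[r.month] || []).push(Number(r.resale_price));
  }
  return Object.keys(byMonth).sort().map((month) => {
    const prices = byMonth[month].sort((a, b) => a - b);
    const mid = Math.floor(prices.length / 2);
    return {
      month,
      min: prices[0],
      max: prices[prices.length - 1],
      median: prices.length % 2 ? prices[mid] : (prices[mid - 1] + prices[mid]) / 2,
    };
  });
}
```

An array of summaries like this is exactly the shape most JS charting libraries accept for a time-series chart.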
Let's look at four-room flats. Actually no, I should look at this one. The Pinnacle is probably somewhere in the central area. Do you want to see the most expensive spot? Let's see. So this is the highest one. That sounds about right: 1.02 million. And it's this second one, which is at Duxton. And it can go up to 46 floors, if you want to go to Duxton. Sorry. So, OK, we're going to the next part. Now let's talk about how we process the data. I wrote most of the back-end code, so I'll talk about the back-end code that actually creates all these charts. You see, the original data is in table form, and we need to summarise all of it into the min, max, median and so on, by month. And the thing is, even though the data portal does provide an API, because we are looking at a time series, we need to start from the year 2000 all the way until now. And it turns out that they actually split it into three datasets. So the challenge is that you need to query these datasets one by one in the background. The user experience will not be good if we do all of that on the client side: query, then process, and all that. So we had to build a back end, maybe on a schedule, once every month, to check whether they have updated the data, then query and process it. So, OK. And on top of that, because this is a database-backed API, there's pagination: every time we query, there's a limit on how many records we can fetch. So you actually need to build a loop to query out all the results to populate your database. So there's a challenge there: you need to make multiple queries, and fetch is asynchronous, it returns a promise. So what you need is to build a promise chain. And what are some strategies for that? Actually, when I first did this, I thought, OK, I'll just write my code synchronously the way I usually do, and it didn't work. I needed to build a promise chain to make it work.
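The paginated querying can be sketched as a recursive promise chain like the following. Here `fetchPage` is a hypothetical helper, not the speaker's actual code; it stands in for one paginated API request:

```javascript
// Sketch of a recursive promise chain for a paginated API.
// `fetchPage(offset)` is a hypothetical helper that resolves to one
// page of records, and to an empty array once past the end.
function fetchAllPages(fetchPage, offset = 0, collected = []) {
  return fetchPage(offset).then((records) => {
    if (records.length === 0) return collected; // reached the end
    // Not done yet: call ourselves with the next offset. Returning
    // the recursive call's promise extends the chain.
    return fetchAllPages(fetchPage, offset + records.length, collected.concat(records));
  });
}
```

Against a real datastore API, `fetchPage` would wrap `fetch` with `limit` and `offset` query parameters.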
So, if you see this, this was my way to hack it. I created a function called fetchData that actually makes the first fetch call. What happens is, it returns a promise, and inside it, it calls itself, so it creates a chain: your promises will trigger one after another. So I collect the first batch of data, then I check whether I have reached the end of the data, because there's a limit per request. If it's not the end, I call myself again with a new offset, and repeat until I reach the end. Then I check: am I done with this dataset? If so, I go on to the second dataset and repeat the same process. So that's one strategy. Actually, I built everything else the same way. After I process my data, I feed it back into my MongoDB, also the same way: I build a promise chain. At some point I decided it was more efficient to split it up into two promise chains, and then merge them back together to resolve the whole run of my background job. Then, of course, you may ask about the recursive approach. Recursion may not be the best way, because you can reach a point where it breaks down if there are too many queries. Actually, there's also a way to build it with an iterative, loop-like method. In this case, for example, what I did is use the reduce method: what you need is to start it off with a promise that is already resolved, then just keep calling .then, and that is the iterative way to build your promise chain. So that, roughly, is what I would like to share with you all today on using promises a little bit more. OK, questions? [Audience] Can you explain the 150 in the fetch file? [Speaker] OK, I'll talk about this.
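The reduce-based iterative chain mentioned at the end might look roughly like this (a sketch under assumed names, not the speaker's actual code):

```javascript
// Sketch of the iterative alternative: seed with an already-resolved
// promise and fold the tasks onto it with Array.prototype.reduce,
// so each task starts only after the previous one finishes.
function runSequentially(tasks) {
  const results = [];
  return tasks
    .reduce(
      (chain, task) => chain.then(() => task()).then((r) => results.push(r)),
      Promise.resolve() // the seed: a promise that is already resolved
    )
    .then(() => results);
}
```

Each `tasks[i]` is a function returning a promise (for example, one paginated query); wrapping them in functions is what delays each request until its turn in the chain.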
In this file, what we're trying to do is query the Google geocoding API to get the lat/long of all our addresses, because the original data just has the block and street name. But the API sets a limit: you cannot make more than 10 queries per second, that is, only one query every 100 milliseconds. So we need to break our queries apart into one every 100-something milliseconds. What I did here is create a promise that resolves itself after 150 milliseconds. Inside new Promise((resolve, reject) => ...), a setTimeout calls resolve after 150 ms with whatever I want to resolve with, which is the fetch result from the geocoder. It turns out, as a friend told me just yesterday, that writing the code this way is actually quite efficient: the fetch is kicked off immediately, so by the time the 150 milliseconds is up, that part has usually already resolved. So each call to the Google API takes roughly 150 milliseconds, and not much more than that. Sorry for the technical difficulties.
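The 150 ms throttling trick described above might be sketched like this (hypothetical names, not the speaker's actual file):

```javascript
// Sketch of the throttling trick: start the request immediately, but
// wrap it in a promise that only resolves after `delayMs`. Resolving
// with a still-pending promise makes the outer promise wait for it,
// so the total time per call is max(delayMs, request time), not the
// sum of the two.
function throttledFetch(doFetch, delayMs = 150) {
  const pending = doFetch(); // kick off the geocoding request right away
  return new Promise((resolve) => {
    setTimeout(() => resolve(pending), delayMs);
  });
}
```

Chaining these calls one after another keeps the rate safely under 10 requests per second.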