Hi everyone, my name is Saurav and I am a third-year student at Delhi Technological University. I will be presenting my GSoC project, which was to implement storyboarding support in Krita. For those of you who are unfamiliar with storyboarding, a storyboard is a graphical representation of a story: it is meant to convey the idea of a story using pictures. Most storyboards consist of small thumbnails that describe the content of each scene. They may also carry other information, like the dialogue, the action, or the camera movements and positions. In this example, each panel has a thumbnail and a short description of what is happening in it. Storyboarding happens before the actual animation. It is a form of planning for the animation and makes sure that everyone is on the same page. Also, animating is time-consuming, and you don't want to spend time animating something only to be told that it was not what the director wanted. So a storyboard facilitates better discussion about the animation and its characters. It also allows you to showcase your idea to potential employers.

Alright, moving on to how the docker is implemented. The docker is part of Krita's plugin system and uses Qt's MVC framework. There are two models which interact with each other, and there are view and delegate classes for each of the models. Krita's animation interface is used for interaction with the timeline docker in Krita: the storyboard docker takes the list of keyframes that exist in the timeline docker and provides an interface to add extra data for those frames. Also, thumbnails for all the frames in the storyboard docker are visible at the same time.

Alright, so now let's take a look at the storyboard docker in Krita. First of all, open it: go to Settings, Dockers, and select Storyboard. Now we have the storyboard docker docked here. The docker shows a list of panels. Each panel comes with a thumbnail, the panel's name, a duration, and some editable comment fields.
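The data each panel carries (a keyframe number, a name, a duration, and per-field comments) can be sketched as a minimal model. This is an illustrative Python stand-in, not Krita's actual C++ classes; all names here (`StoryboardPanel`, `StoryboardModel`) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class StoryboardPanel:
    """One storyboard panel: a timeline keyframe plus extra planning data."""
    frame: int                # keyframe number in the timeline docker
    name: str = ""            # editable panel name
    duration: int = 1         # frames until the next panel
    comments: dict = field(default_factory=dict)  # e.g. {"Action": "...", "Dialogue": "..."}

class StoryboardModel:
    """Keeps panels sorted by their timeline frame, mirroring how the
    docker lists one panel per keyframe."""
    def __init__(self):
        self.panels = []

    def add_panel(self, panel):
        self.panels.append(panel)
        self.panels.sort(key=lambda p: p.frame)

    def panel_at(self, frame):
        """Return the panel for a given keyframe, or None."""
        for p in self.panels:
            if p.frame == frame:
                return p
        return None
```

In the real docker the model additionally feeds Qt views and delegates; this sketch only captures the panel data itself.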
The comment fields can be changed from the comments menu: we can change the visibility of each of the comment fields, rearrange them, add new comments, and delete them. To edit a comment, just double-click and type. Now let's take a look at the interaction between the storyboard docker and the timeline docker. Every panel in the storyboard docker corresponds to a frame in the timeline docker. In the top-left corner there is a frame number for each panel. The duration field corresponds to the number of frames until the next panel. You can see in the timeline docker at the bottom: if we increase the duration for a panel in the storyboard docker, all frames after that panel's frame move to the right. Similarly, if we decrease the duration, they move to the left. Also, when frames are inserted in the timeline docker, a storyboard panel is inserted in the storyboard docker. So you can see I inserted a frame in the timeline docker, and the corresponding panel was inserted. Similarly, if we remove a frame, the panel is removed, and the frame numbers change accordingly.

Let's take a look at the buttons at the top of the docker. In the top-right corner we have the arrange button, which is used to manage how the panels are arranged in the docker. You can choose between views and modes. Views decide what part of each panel is visible in the docker: you can choose to see only the thumbnail, only the comments, or both. The modes option decides how the panels are laid out: you can arrange them row-wise, column-wise, or in a grid. In the top-left corner we have the export button, which can be used to export the storyboard to either PDF or SVG. The export dialog has options to manage the layout of the exported document: you can choose the range of panels to be exported, the rows and columns of panels per page, the font size, and the page size and orientation.
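The duration behaviour described above (changing one panel's duration shifts every later panel's frame) follows directly from frame positions being cumulative sums of durations. A minimal sketch, with hypothetical function names:

```python
def frame_positions(durations):
    """Given each panel's duration in frames, return the timeline frame
    at which each panel starts. Panel 0 starts at frame 0."""
    positions = []
    frame = 0
    for d in durations:
        positions.append(frame)
        frame += d
    return positions

def change_duration(durations, index, delta):
    """Changing one panel's duration shifts every later panel's frame
    by the same amount; earlier panels stay put."""
    durations = list(durations)          # leave the input untouched
    durations[index] += delta
    return frame_positions(durations)
```

For example, panels with durations `[3, 2, 4]` start at frames `[0, 3, 5]`; lengthening the first panel by 2 frames moves the later panels to `[0, 5, 7]`.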
There is also an option to load a layout from an SVG file. Let's export the storyboard and see the results. Thank you very much for watching the presentation.

Hi everyone, my name is Leonardo Segovia. I'm from Bahía Blanca, Argentina, and my project is entitled Dynamic Fill Layers in Krita using SeExpr. This project was implemented in the Krita painting suite and was mentored by Boudewijn Rempt, Dmitry Kazakov, and Ivan Yossi. Let's begin with some context. Layers are one of the core concepts of digital painting. They let artists manage certain components independently of the rest of the artwork, for instance backgrounds, lighting, line art, and so on. Patterns and textures are also essential components. They are used to simulate the appearance of physical materials and phenomena. Usually they are saved as raster images, which means they are locked, at creation time, to their size and resolution. The latest stable version of Krita, 4.3, supports using patterns and textures through two types of layers. The first is called a file layer, and it lets you embed external images onto the canvas. The second is called a fill layer, and it can be used for flat fills, raster patterns, and random noise. As you can see, neither of them lets users create dynamically generated content in an artistically free way.

From an implementation point of view, adding a fill layer type requires many moving parts. The bare minimum is the following two. First, you must implement a generator, which paints the layer contents; generators are a subclass of Krita's KisGenerator. Next, you need to subclass KisConfigWidget to expose a configuration widget. You may also need to implement a new type of resource. This is done in four steps. First, you expose the new resource as a subclass of KoResource; then Krita must be made aware of it through a suitable extension of its resource server. Next, to make it selectable, you need a selection widget.
Krita provides a widget for this purpose that can be instantiated with your resource type. Obviously, you will still have to write the saving widget yourself, but its structure is pretty boilerplate, as long as it talks to the resource server. So, my project adds a new type of fill layer that implements scriptable dynamic content. This is powered by an expression language library created by Disney Animation, called SeExpr. This library enables fast, artist-directed control and customization of the end result. In a nutshell, it allows users to write their own scripts, but also to tweak individual variables using the widget. Since SeExpr's textures are procedurally generated, host applications like Krita can render them at any resolution desired without compromising their quality. Furthermore, SeExpr scripts are just like any other resource in Krita. This means they can be created, edited, bundled, and shared across the Internet.

My project touches many parts of Krita. To begin with, it involves vendoring SeExpr as a third-party dependency, and creating a new generator and resource type, as described earlier. I also added extensive documentation to Krita's user manual: a language reference, which was adapted from SeExpr's own documentation, an introductory tutorial, a resource management entry, and a fill layer entry explaining how it works. The project also includes an extensive preset bundle, so all Krita users can benefit from this work right away. My side changes were much more extensive: I cleaned up and refactored a large chunk of the SeExpr code base, fixed many bugs, and implemented improvements suggested by the community. Its UI toolkit was also updated and refactored. I added support for ARM devices, and native localization of the provided Qt widget through KDE's translation infrastructure. This part of the work is described in more detail in a talk at this year's Akademy, "Sustaining Hollywood-grade open source with KDE applications".
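The generator wiring described earlier (a generator subclass that paints the layer contents from a configuration, registered so the host can find it) can be sketched language-agnostically. This is a toy Python illustration; `Generator`, `CheckerGenerator`, and `GeneratorRegistry` are hypothetical stand-ins for Krita's KisGenerator machinery, not its real API:

```python
class Generator:
    """Stand-in for a KisGenerator-style base class: fills a layer
    from a configuration object."""
    def generate(self, width, height, config):
        raise NotImplementedError

class CheckerGenerator(Generator):
    """Toy generator: a checkerboard whose cell size comes from the
    config, the way a fill layer's widget feeds its generator."""
    def generate(self, width, height, config):
        size = config.get("cell_size", 1)
        return [[(x // size + y // size) % 2 for x in range(width)]
                for y in range(height)]

class GeneratorRegistry:
    """Stand-in for the registry through which the host application
    is made aware of available generators."""
    def __init__(self):
        self._generators = {}

    def register(self, name, generator):
        self._generators[name] = generator

    def get(self, name):
        return self._generators[name]
```

A SeExpr-backed generator would follow the same shape, with the script text and variable values living in the configuration.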
Well, that is not all. I had time to take on three stretch goals. First, I added multi-threaded rendering of fill layers through Krita's strokes system. Up to now, fill layers were rendered in a single background thread, which made updates slow for large canvases. So I refactored the rendering process into small tiles, and each tile is rendered in a separate thread. It must be noted that this works with supported generators only. Secondly, I enabled fill layers to be previewed before addition: my project refactored the rendering process to make it injectable into the layer-addition stroke. Finally, I cleaned up the last remains of the local documentation in the Krita repository, moving it to the user manual. The drawing you see now shows what you can do with SeExpr. It was made by David Revoy for episode 33 of his webcomic, Pepper and Carrot; the thunder-like force fields in it were created using this work. It's available now in the nightly builds at krita.org. Thank you all for watching.

Hello, everyone. My name is Kitae Kim. This summer I participated in Google Summer of Code with KDE, and I worked to improve the MAVLink protocol integration of Kirogi. Kirogi is a ground control station, which is software to control drones. It was started and is being developed by the KDE community. MAVLink is a messaging protocol between drones and ground control stations. Famous flight control units for drones, like ArduPilot and PX4, use this protocol. As the name suggests, my project is about improving the MAVLink protocol integration of Kirogi. For that, I added three major features: TCP and serial connection support, multiple vehicle management, and multiple connection management. TCP connection support allows users to control drones through TCP. Serial connection support is the basis of a firmware update feature through serial connections. Plus, Kirogi can now manage up to 255 drones at the same time.
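The tile-based refactoring described above (split the canvas into small tiles, render each in a worker thread, then stitch the results) can be sketched with a thread pool. This is a minimal illustration of the technique, not Krita's strokes system; all names are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def render_tile(x0, y0, w, h, pixel_fn):
    """Render one rectangular tile by evaluating pixel_fn per pixel."""
    return (x0, y0, [[pixel_fn(x0 + x, y0 + y) for x in range(w)]
                     for y in range(h)])

def render_tiled(width, height, tile_size, pixel_fn, workers=4):
    """Split the canvas into tiles, render each tile in a worker
    thread, and stitch the results back into one canvas."""
    canvas = [[0] * width for _ in range(height)]
    jobs = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for y0 in range(0, height, tile_size):
            for x0 in range(0, width, tile_size):
                w = min(tile_size, width - x0)
                h = min(tile_size, height - y0)
                jobs.append(pool.submit(render_tile, x0, y0, w, h, pixel_fn))
        for job in jobs:   # collect and place finished tiles
            x0, y0, data = job.result()
            for dy, row in enumerate(data):
                canvas[y0 + dy][x0:x0 + len(row)] = row
    return canvas
```

Because each tile only reads its own coordinates and writes its own region, the tiles are independent and safe to evaluate in parallel, which is also why this only works for generators that support concurrent evaluation.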
This feature will be very useful when implementing a mission planner. In addition to that, Kirogi can now manage different connections with different properties, like multiple serial connections with different ports and baud rates. The most challenging thing for me was thinking about the overall architecture. Kirogi has plugins to support various models of drones. It supports drones like the Ryze Tello and the Parrot Bebop, and the way MAVLink works is quite different from the others. As all plugins use the same GUI rather than having a separate GUI for each plugin, I had to think about the architecture of the GUI and the libraries that are shared among plugins. During this project, I learned a lot about how to design software. In addition to that, I learned about Qt, QML, KDE Frameworks, drones, embedded development, and most importantly, how to contribute to KDE projects. Thank you for watching my presentation. If you have a question, please feel free to ask me.

Hello everyone, my name is Ashwin. I'm a GSoC student for Krita, and my project is integrating the MyPaint brush engine into Krita. My mentors are Boud, Dmitry, and Wolthera. So what are brush engines? A brush engine is just a piece of code, a component, which governs the manner in which colors are painted onto the canvas. Brush engines are responsible for the look and feel of a brush. A brush engine takes some user input, like the tilt angle of the stylus, the pressure from the stylus, or the velocity at which the stroke is happening; it takes these dynamic inputs and, on the basis of them, generates a stroke. For instance, here I have compared two brush engines of Krita. On the left we have the grid brush engine, and it simply looks like a grid of pixels. On the right, it looks like pixels have been sprayed over the canvas. That is just because we are using different brush engines, so we get different effects; it is an illustration that different brush engines give different results.
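The multiple-connection management described above, where each connection carries its own transport-specific properties (port and baud rate for serial, host and port for TCP), can be sketched as follows. This is an illustrative Python stand-in, not Kirogi's actual classes; all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SerialConfig:
    """Properties of one serial link."""
    port: str
    baud_rate: int

@dataclass(frozen=True)
class TcpConfig:
    """Properties of one TCP link."""
    host: str
    tcp_port: int

class ConnectionManager:
    """Tracks several simultaneous connections, each with its own
    transport-specific configuration."""
    def __init__(self):
        self._connections = {}

    def add(self, name, config):
        if name in self._connections:
            raise ValueError(f"connection {name!r} already exists")
        self._connections[name] = config

    def get(self, name):
        return self._connections[name]

    def names(self):
        return sorted(self._connections)
```

Keeping the per-transport settings in separate config types lets the manager hold, say, two serial connections on different ports at different baud rates alongside a TCP connection, without the transports knowing about each other.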
So MyPaint is a piece of software similar to Krita, used for painting, and artists love MyPaint's brushes a lot. Because of this popularity, the MyPaint developers have separated out their brush engine in the form of a library, namely libmypaint. Till date, apart from MyPaint itself, GIMP and OpenToonz have done this integration, and now we have done it too. Coming to the implementation: to use libmypaint to render a stroke onto our canvas, all we have to do is define the drawDab and getColor methods. We just need to override these and write the code in terms of our own canvas object classes. drawDab is responsible for rendering dabs over the canvas, whereas getColor is used by libmypaint to see the color that is currently present on the canvas; that is used by MyPaint to render color-smudge effects. This is a schematic representation of the functioning of brushlib, the MyPaint library. In our case the canvas is a KisPaintDevice, and strokeTo is the method which is called to generate strokes. It accepts a MyPaint brush object along with some dynamic inputs, say the x and y tilt of the stylus, and dtime values, which give the time elapsed between the current and the previous stroke events. At the back end, it calls our drawDab and getColor methods on the surface to generate the dabs. For the project I had two milestones: the first was to integrate libmypaint in the form of a brush engine in Krita, and the other was to expose the MyPaint settings in Krita's preset editor, so as to facilitate the creation of new MyPaint brushes. This is the Krita interface, and this is the preset chooser of Krita. Let us filter on MyPaint; these are the MyPaint presets which have been loaded. We can just pick, say, acrylic, and this is the acrylic stroke. We can pick something like calligraphy, and this is the calligraphy stroke; we can choose something like modelling, and this is the modelling brush; and pick airbrush, and that is an airbrush stroke.
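The callback scheme described above can be sketched as follows: the host supplies a surface with a dab-drawing callback and a color-readback callback, and the stroke loop calls them. This is a heavily simplified Python illustration of the idea, not libmypaint's real C API (real dabs are soft ellipses driven by the brush's dynamics, and the actual entry point is `mypaint_brush_stroke_to`); all names here are hypothetical:

```python
class Surface:
    """Toy canvas exposing the two callbacks a MyPaint-style engine
    needs: drawing a dab and reading back the current color."""
    def __init__(self, width, height):
        # start with a white canvas
        self.pixels = {(x, y): (1.0, 1.0, 1.0)
                       for x in range(width) for y in range(height)}

    def draw_dab(self, cx, cy, radius, color):
        """Paint a square 'dab' of the given color around (cx, cy).
        A square stands in for the soft elliptical dab of the real engine."""
        r = int(radius)
        for x in range(cx - r, cx + r + 1):
            for y in range(cy - r, cy + r + 1):
                if (x, y) in self.pixels:
                    self.pixels[(x, y)] = color

    def get_color(self, x, y):
        """Read back the canvas color (what smudge effects rely on)."""
        return self.pixels.get((x, y))

def stroke_to(surface, points, radius, color):
    """Very rough stand-in for the stroke loop: one dab per input point.
    The real engine spaces dabs along the path using dtime and dynamics."""
    for (x, y) in points:
        surface.draw_dab(x, y, radius, color)
```

The point of the design is that the engine never touches pixels directly: Krita only has to back these two callbacks with its own KisPaintDevice, and the whole brush library works unchanged.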
So from here we can toggle eraser mode, and the brush will start working like an eraser; and we can just switch back to our airbrush. This is the preset editor, where we can toggle the settings: say we can change the radius to something smaller, and then increase it again; and here we have all the other settings which MyPaint gives us. So that's my project, and thank you for giving me this opportunity. Thank you.