On behalf of my colleagues Gilles-Antoine Nys, Roland Billen, and Pierre Hallot, who is also one of the co-authors of this presentation even though his name is not on the slide, I thank you for listening to this presentation. Considering that fifteen minutes is short, I will be rather direct. I will present the way we have chosen to use CIDOC CRM and compatible models for temporal and spatial reasoning, keeping in mind that there are several different ways to use those models. We do not pretend to have found the best way or anything like that; I am just here to explain the test we have done. And of course, if you have suggestions, and I hope you will, please do not hesitate to submit them. I should add that this test was developed by a team of three geomaticians and one archaeologist. I am the archaeologist of the team, so if your questions are too technical for my brain, be aware that my answers could miss the point. And please forgive me as well for my poor English.

As I have just said, here is the way we have chosen to use CIDOC CRM and compatible models for temporal and spatial reasoning. I will begin with the purposes of the test. In previous papers we had proposed a mapping to the CRM models and also a small extension, actually two versions of an extension. Both the mapping and the extension were aimed at covering a conceptual model called MIDM, the Formal Interpretation Data Model. You spoke about heterogeneity (what a word) in archaeology. You spoke about formats, but in archaeology there is also heterogeneity in the semantics, beyond the words: even when the formats are the same for the same data, the things in the data are not necessarily the same. We therefore developed a test to move from the theoretical to the practical and to check the sustainability and workability of our mapping and extension proposals.
The main purposes of our test were, first, to validate the integration of our extension with the CRM models, using for example the Pellet reasoner to check for inconsistencies; secondly, to validate the complete coverage of the MIDM model by instantiating the ontology; and thirdly, to reach the next step and try to reason, in order to gather new data or infer new relations, and to run queries.

To reach these goals, we used the following ontologies in the ontology editor Protégé. We downloaded the most recent available RDF files of the ontologies you can see: CIDOC CRM, CRMsci, et cetera; I let you read the list. Be aware that except for GeoSPARQL, OWL-Time and EPA, the other files do not match the most recent PDF versions of the ontologies; they often match the previous version of the ontology. All files were downloaded and stored locally, because some small arrangements were needed to attach the ontologies to each other and to make the overall ontology workable in Protégé. EPA, which is our extension proposal, was created directly in Protégé, and the extensions are stored openly on the university server; you have the address on the slide.

Why did we use GeoSPARQL and OWL-Time, and not only CIDOC CRM and its extensions, which have quite a lot of spatial and temporal possibilities? First of all, GeoSPARQL is required by CRMgeo, which provides the necessary links to connect the CRM models and GeoSPARQL. Secondly, we had a huge need for workability, and there is a lack of workability: as you know, the properties have no expressiveness in a description-logic sense. For a simple storage application this is normal, it is not mandatory; there are no workable semantics behind the concepts. But we wanted to exploit the ontological possibilities to reason and to run queries on spatial and temporal properties.
Therefore, we had to find a way to make the properties meaningful and workable, or "reasonable", or active if you prefer. I will explain in a few minutes how we managed that question, but at this stage it is important to say that, to make the properties functional, specific datatypes and data properties were necessary.

And what kind of datatypes did we need? For time, we had decided in our test to set the temporal granularity to the year, but in the CIDOC CRM RDF file you only have xsd:dateTime, which is the datatype of the four data properties. It means that you must know the full date (year, month, day, hour, minutes) of an event, or you have to agree to store unwanted but necessary information, I mean a list of zeros or nulls which means "no data". So we wanted the datatype xsd:gYear. For space, we needed the wktLiteral or gmlLiteral datatypes. They exist in the CRMgeo RDF file, but, excuse me, they are stored there as classes and not as datatypes. Lastly, we needed data properties: in CIDOC CRM you have four data properties linking to xsd:dateTime, which, as I said, is not the datatype we needed. And with the four properties of the CIDOC CRM RDF file, it is impossible to link the beginning and the end of a single interval, and it is impossible to record a relative reference; moreover, the domain of those properties is E52 Time-Span, which was unusable for the instances of our test, because we had decided to use the CRMgeo class SP14 Time Expression. For space, there is no data property in CRMgeo at all. Thus, we needed to use GeoSPARQL and OWL-Time properties and datatypes.

Now, the conditions of our building of the test ontology. By conditions we mean, first, the way we linked the T-box (the terminological box, the ontology, if you prefer) and the A-box (the assertional box, the instances).
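To make the granularity problem concrete, here is a minimal Python sketch, not the project's actual code and with illustrative names, of what a year-level value like xsd:gYear saves compared with an xsd:dateTime value padded with placeholder fields:

```python
from dataclasses import dataclass

@dataclass
class DatedEvent:
    """An event dated only to the year, as in the test described above."""
    label: str
    year: int

def as_padded_datetime(event: DatedEvent) -> str:
    """What xsd:dateTime alone would force us to store: meaningless
    month/day/time components standing in for 'no data'."""
    return f"{event.year:04d}-01-01T00:00:00"

def as_gyear(event: DatedEvent) -> str:
    """What we actually want to store: the year alone, like xsd:gYear."""
    return f"{event.year:04d}"

e = DatedEvent("hypothetical construction event", 1348)
print(as_gyear(e))            # 1348
print(as_padded_datetime(e))  # 1348-01-01T00:00:00
```

The second serialization is not wrong syntactically, but it records a precision the source data never had, which is exactly the situation the year granularity choice was meant to avoid.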
This linking is an important question. Indeed, CIDOC CRM, all the compatible models, and also our proposal were already linked together; the main question was to link and firmly merge GeoSPARQL and OWL-Time with the overall CIDOC and extension models. Concerning the merging, GeoSPARQL (on the right) was already partially introduced in CRMgeo in 2015, so we exploited the existing link: the CRMgeo class SP5 Geometric Place Expression is defined as a subclass of geo:Geometry. For OWL-Time (on your left), we simply specified that SP14 Time Expression is equivalent to the TemporalEntity class of OWL-Time, in accordance with their respective definitions. Thus, in the A-box, instances of SP14 are also instances of the OWL-Time Instant or Interval classes, and instances of SP5 are instances of points or polygons, for example. The contribution of this hybrid ontology lies in the opening of datatypes and data properties: WKT and GML data storage is now fully available for spatial data management, and the lack of datatypes and data properties is therefore filled by the use of GeoSPARQL and OWL-Time.

Let's see now how we managed workability. After gathering the ontologies and instantiating the test, we wrote rules for temporal reasoning; I will come back to them in one minute. We used the framework GraphDB: we loaded the A-box, the T-box and the temporal rules into GraphDB, which we used both as a triple store and as a SPARQL endpoint. The main objective was to undertake reasoning, and reasoning means workability. For the GeoSPARQL properties we used an existing external tool, the GeoSPARQL (geof) plugin in GraphDB, which was a sufficient solution for us. But for the OWL-Time properties, something had to be developed to process and reason on them, so we wrote some rules in SWRL to deal with Allen's relationships between instants and intervals. Because of the lack of time, I won't explain the rules we wrote; we divided them into two steps, a main rule and a secondary rule.
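What the equivalence axiom between SP14 Time Expression and time:TemporalEntity buys in the A-box can be sketched with a tiny, purely illustrative inference step in Python (the prefixed names are assumptions standing in for the real IRIs; a production reasoner such as Pellet does this and much more):

```python
# owl:equivalentClass axioms, as unordered pairs of class names
equivalences = {("crmgeo:SP14_Time_Expression", "time:TemporalEntity")}

# A-box: explicit rdf:type assertions for one hypothetical individual
types = {"ex:interval1": {"crmgeo:SP14_Time_Expression"}}

def infer_types(types, equivalences):
    """Propagate class membership across equivalentClass axioms:
    an instance of either class becomes an instance of both."""
    inferred = {ind: set(classes) for ind, classes in types.items()}
    for classes in inferred.values():
        for a, b in equivalences:
            if a in classes:
                classes.add(b)
            if b in classes:
                classes.add(a)
    return inferred

result = infer_types(types, equivalences)
print(sorted(result["ex:interval1"]))
# ['crmgeo:SP14_Time_Expression', 'time:TemporalEntity']
```

This is why, after the merge, the OWL-Time machinery (Instant, Interval, and their properties) applies directly to the SP14 instances without retyping anything by hand.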
We knew that a Greek team had already written rules on the temporal ontologies, but if you are interested, I can explain, or try to explain, why our way of writing these rules is different. These rules allowed us to undertake spatial reasoning and temporal reasoning, and of course spatio-temporal reasoning, which would have been impossible without the rules we wrote and the rules existing in the GraphDB plugin. This is very condensed; I have reached the end of my slides because of the time. So, thank you for your attention.
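Since the SWRL rules themselves are not shown in the talk, here is a hedged plain-Python transposition of the kind of Allen-style conditions such rules encode; the function names, the year-granularity tuples, and the example values are all illustrative, not the authors' actual rules or data:

```python
def before(a_end: int, b_start: int) -> bool:
    """Allen 'before': interval A ends strictly before interval B starts."""
    return a_end < b_start

def during(a: tuple, b: tuple) -> bool:
    """Allen 'during': interval A lies strictly inside interval B."""
    return b[0] < a[0] and a[1] < b[1]

def inside(instant: int, interval: tuple) -> bool:
    """OWL-Time style 'inside': an instant falls within an interval."""
    return interval[0] <= instant <= interval[1]

# Year-granularity test data (hypothetical)
occupation = (1200, 1350)   # an occupation phase
fire = 1348                 # a destruction event dated to the year

print(before(1350, 1400))                # True
print(during((1250, 1300), occupation))  # True
print(inside(fire, occupation))          # True
```

In a rule engine these conditions would fire over the stored xsd:gYear values and materialize the corresponding OWL-Time relations (intervalBefore, intervalDuring, inside) as new triples, which is what makes the spatio-temporal queries at the end of the talk possible.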