My name is Ivana Ivanova and I work at Curtin University, in the Spatial Sciences Discipline. We used to be a department; now we are a discipline. My role there is senior lecturer in spatial infrastructures, geographic information and all sorts of related things. My background is in spatial data quality, and for 19 years, or a little bit over, I have been dealing with standardization at all sorts of levels: national, international, community and so forth. So Ming asked me to introduce the fundamental standardization bodies and their roles for spatial data quality. I'm not sure; well, some of you for sure are dealing with spatial data, but I think not all of you, is that correct, Ming? In this group we have non-spatial people as well. Probably a few people came here today who deal with spatial data quality. Okay, never mind. So anyway, this talk is about the standardization of spatial data quality, and we will look at the two main bodies which drive this space: the OGC, the Open Geospatial Consortium, and the technical committee at ISO. I'd like to start by introducing both groups and then see what the synergies between the two are. ISO/TC 211 is a technical committee that deals with the standardization of all aspects of geographic information and geomatics. It was established in 1994, following the initiative of the European standardization committee (CEN), which had started driving standardization in the spatial data, or geographic information, domain. Today ISO/TC 211 has 38 participating members, among which is Australia, and 32 observing members, and it has more than 40 internal and external liaisons, including bodies such as the Open Geospatial Consortium and the W3C, the two main organizations which deal with the geospatial web.
The role of ISO/TC 211 is to produce standards within the 19100 series, and the overall philosophy is to build these standards for a long-term perspective. That is also the reason that most of these standards are quite abstract, offering the conceptual foundation for a specific aspect of geographic information rather than specific implementation instructions. Currently there are 75 published standards and 25 standards under development, so it's quite a large body of work. They are related, some closer, some not so close, but the series is not meant to be implemented as a full collection; standards can be selected individually based on the needs of the implementing organization. On this list I've only selected a few standards interesting for us in this group, those that relate to data quality: some specifically define data quality, its model and its aspects, and some define additional quality elements. Those are the standards for data quality and for metadata. In Australia, within organizations dealing with spatial data, the standards for metadata are well known, and most organizations implement them in full or as a profile of these standards. I'm not sure about the data quality standards; they are implemented indirectly most of the time, as part of the metadata implementation, but I haven't seen a full implementation of data quality so far. That may be only me, though. The model of spatial data quality as defined in the ISO series of standards is as follows: data quality is expressed through data quality elements, a long list of elements, which can be quantified, or for some of them at least qualified.
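To make that element model a bit more concrete, here is a minimal sketch in code. This is not the normative ISO 19157 UML model; the record field names and the example values are illustrative only, and only the element names themselves come from the standard.

```python
from dataclasses import dataclass
from enum import Enum

# The five classic ISO 19157 data quality element categories,
# plus the usability element added in the 2013 edition.
class DQElement(Enum):
    COMPLETENESS = "completeness"
    LOGICAL_CONSISTENCY = "logical consistency"
    TEMPORAL_QUALITY = "temporal quality"
    THEMATIC_ACCURACY = "thematic accuracy"
    POSITIONAL_ACCURACY = "positional accuracy"
    USABILITY = "usability"

@dataclass
class DQResult:
    """One evaluated quality result, tied to a scope within the dataset.

    Field names are illustrative, not the normative ISO 19157 names.
    """
    element: DQElement
    scope: str          # e.g. "whole dataset" or "feature type: roads"
    measure: str        # e.g. "rate of missing items"
    value: float
    unit: str = "%"

# A toy quality report: each result pairs an element with a scope,
# so part of a dataset can be evaluated separately from the whole.
report = [
    DQResult(DQElement.COMPLETENESS, "feature type: roads",
             "rate of missing items", 2.5),
    DQResult(DQElement.POSITIONAL_ACCURACY, "whole dataset",
             "RMSE of horizontal positions", 0.8, "m"),
]

for r in report:
    print(f"{r.element.value} [{r.scope}]: {r.measure} = {r.value} {r.unit}")
```

The point of the sketch is the pairing of element, scope and measure: the same element can be reported at dataset level in metadata and, for evaluated elements, in a more detailed evaluation report.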
These elements relate to a data quality scope, which relates to the dataset scope, so if only part of a dataset has its data quality evaluated, that is possible within the ISO standards. These data quality elements are typically reported in metadata, and some, those that are directly or indirectly evaluated, are also reported in a more detailed data quality evaluation report. The elements currently present in the standards are the following five, notoriously well known in the spatial data quality domain: completeness, logical consistency, temporal quality, thematic accuracy, and positional accuracy (or spatial accuracy), plus one additional element which is fairly new, from 2013, called the data quality usability element. This set is very likely to change shortly, or within the few coming years; I'll conclude my talk with more detail on that. The other defining organization for our standardization is the Open Geospatial Consortium. Contrary to ISO/TC 211, which is a formal standardization body producing so-called de jure standards, OGC produces de facto standards, which are driven by the needs, the immediate needs, of industry, so these are very technical and are usually implementation specifications. As you can see from the new definition of OGC, this organization strives at producing the geo-web and services in a FAIR way, a concept many of you might have encountered in recent years. It stands for findable, accessible, interoperable and reusable. FAIR wasn't a word, or wasn't a thing, a few years back, so it wasn't at OGC either, but the organization is picking up the general approach to FAIR data science, if we can say that, and they changed their mission to pay specific attention to that aspect as well. OGC
standards come in two main types: abstract specifications and implementation specifications. Because of the very close liaison between OGC and ISO, abstract specifications are typically equivalent to ISO conceptual schemas; for instance, the standard for feature geometry, features being the main real-world objects being standardized, is specified at OGC equivalently to the spatial schema at ISO. Several implementation specifications exist; here are just a few examples which might be familiar to you. There are two other interesting kinds of documents on the OGC website which are worth following. These are the engineering reports, which report results of the innovation program, where new standards are developed or existing standards are implemented, and the discussion papers, which normally bring in topics that are hot outside the spatial data domain and try to see what they mean within it. One example is the discussion paper on UncertML, which was a proposal to define a markup for uncertainty measures as an expression of spatial variation in objects. In this graph, I'm not going to go into detail; I just wanted to include it in the slides to show you the synergies between the various organizations in standardization. You can see there are different types of liaisons between them, and the Class A liaison between OGC and ISO/TC 211 means that whatever topic is standardized within ISO/TC 211 is not going to be redundantly standardized at OGC; instead OGC adopts that standard within their set, and the other way around. For instance, specifications such as Web Map Service and Web Feature Service have been defined by OGC and adopted within the ISO 19100 series. This is interesting for one other reason, because OGC
strives to produce open standards, whereas ISO produces standards which are conceptually open but cost a lot of money. So if you are after the standard for Web Map Service or another web service, it's wiser to have a look at OGC's website and download it from there, because the content is exactly the same. At OGC there are several standards working groups and domain working groups. Standards working groups are set up ad hoc when a request for a new standard comes in, and members join efforts in producing that standard; domain working groups aim at facilitating the work of these standards groups, and other groups, on certain topics. One of the topics which has a domain working group is data quality. What we mean here is spatial data quality, but also quality of service and quality a bit more generally, so not purely spatial-data driven. This group was established in 2006, and one of its first items of the program of work was to learn what data quality means to the various stakeholders in the geospatial supply chain, that is, all sorts of bodies involved in the production or consumption of spatial data. In this survey we had quite a lot of responses, more than 700, from a wide range of organizations, and interestingly only 17% of these were OGC members at the time, so this was really a wide view from outside organizations, and most of them were both suppliers and consumers of spatial data. The result, in summary, was that almost everybody stated that data quality is important, but a large share, 60%, indicated that they had no idea how to manage it. I don't want to say close to nobody knows, because it's just 60%, but 60% is a large number. Now you might think that things must have changed since 2008, but I'm not so optimistic; I think if we did the benchmark here, the percentage would be similar, maybe even slightly rise. Well, I think it's very similar today.
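To connect those web service standards to something concrete: a Web Map Service interaction is just a parameterized HTTP request. Here is a minimal sketch of a WMS 1.3.0 GetMap request; the server URL and layer name are hypothetical, but the parameter names come from the specification.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint and layer; the parameter set below follows
# the WMS 1.3.0 GetMap operation.
base_url = "https://example.org/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "roads",
    "STYLES": "",
    "CRS": "EPSG:4326",                 # WMS 1.3.0 uses CRS (1.1.1 used SRS)
    "BBOX": "-35.0,115.0,-31.0,119.0",  # lat/lon axis order for EPSG:4326 in 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
print(base_url + "?" + urlencode(params))
```

Because the request is fully described by the specification, any compliant client can build it against any compliant server, which is exactly the interoperability these implementation specifications are for.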
Now, when the data quality domain working group was established at OGC, and as a result of this survey, its main mission was to help provide guidance on implementing quality control in organizations. There were data quality standards at the time, with different numbers, also defining the fundamental set of elements such as accuracy, consistency and integrity, and this group aimed at helping implementing organizations to establish quality evaluation routines in their process workflows. One of the first deliverables was exactly related to that: a topological quality assessment guidance. Today the data quality domain working group has around 40 members, 41 perhaps, I didn't check today, but more or less. We have three co-chairs, two based in Europe and one based in Australia. What we are interested in today is understanding in more detail what quality means for producers and consumers of spatial data, so you can see that since 2008 the core interest hasn't changed, because up until today there is no agreement between these two extremes of the supply chain: it's not so sure that producers understand what users mean, and the other way around. Then we'd like to see what the limitations in the use of data quality information are, and these are activities related to constantly evaluating the applicability of existing standards for producers: the current data quality standard, is it possible, is it easy to implement, where are the problems, what needs to be changed. We'd also like to know how to deal with the quality of non-authoritative spatial data resources, those typically coming from citizens and volunteers. This type of spatial data resource is increasingly common and increasingly being adopted and combined with authoritative data, but these two kinds of resources have very different notions of spatial data quality.
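As a small illustration of what such a quality evaluation routine might look like, here is a toy completeness check in the spirit of the elements just mentioned, comparing a dataset's feature identifiers against a reference "universe of discourse". The function name, identifiers and rates are mine, not taken from any standard.

```python
def completeness_rates(dataset_ids, reference_ids):
    """Return (omission_rate, commission_rate) as percentages.

    Omission: items in the reference that are missing from the dataset.
    Commission: excess items in the dataset absent from the reference.
    """
    dataset, reference = set(dataset_ids), set(reference_ids)
    omission = len(reference - dataset)    # missing items
    commission = len(dataset - reference)  # excess items
    return (100.0 * omission / len(reference),
            100.0 * commission / len(reference))

# Hypothetical road identifiers: r4 is missing, r5 is an excess item.
roads_in_dataset = {"r1", "r2", "r3", "r5"}
roads_in_reference = {"r1", "r2", "r3", "r4"}

om, com = completeness_rates(roads_in_dataset, roads_in_reference)
print(f"omission: {om:.1f}%  commission: {com:.1f}%")  # omission: 25.0%  commission: 25.0%
```

Real evaluation routines are of course far more involved, but even this sketch shows the asymmetry between producer and consumer views: the same 25% omission may be negligible for one use and disqualifying for another.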
How do we work? We usually meet at the quarterly OGC technical meetings, in person or online, and part of the objective of those meetings are discussions: we have several presentations and then group discussions about selected aspects of data quality. We also commonly do reviews, mainly of existing standards; for instance, in 2010 the group reviewed both the metadata standard and the data quality standard, and we do internal reviews before engineering reports are published at OGC, for those reports that relate directly to data quality or its aspects. At this moment, the group is indirectly involved in the revision of ISO 19157. What does indirectly mean? In the revision process we don't have the data quality domain working group at OGC participating in the project as such, but we have project members, nominated through standards bodies, who are also members of the data quality domain working group. So let me conclude this talk with some information about the revision of ISO 19157, the data quality standard. Currently we have, well, three parts of that standard, one amendment and one implementation specification, and part of the revision is to join these and also update the content. There has been a new project at ISO since July 2019, and it aims at producing a new standard called Data quality, Part 1: General requirements, which defines the fundamental concepts of data quality for geographic information. There are two project leaders: myself, nominated by Standards Australia, and my colleague from the Swedish Standards Institute, and we have support from 227 experts nominated by the respective standards organizations, among which some are also members of the OGC.
The timeline is as outlined, and the standard should be ready in June 2020. One interesting thing to say is that the revision was proposed despite the fact that most national standards bodies, during the systematic review, confirmed the standard as is. There were only three objections, or suggestions for revision, but with arguments so strong that ISO's technical committee decided to do what they would normally not do: if 19 members approve and just three object, they would not propose a revision, but the arguments in those three comments were so strong that the standard was proposed for revision despite being confirmed by the majority. I think this is a good thing, because the standard as it stands is still not that straightforward to implement; there are many uncertain things in its definitions, and it requires another look. And I think that was all from me. Thank you for your attention, and if there are any questions, feel free.