 My name is Ivana Ivanova and I work at Curtin University, in the spatial sciences discipline, which used to be a department and is now a discipline. My role there is senior lecturer in spatial data infrastructures, geographic information and all sorts of related things. My background is in spatial data quality, and for a little over 19 years I have been dealing with standardization at all sorts of levels: national, international, community and so forth. Ming asked me to introduce the fundamental standardization bodies and their roles in spatial data quality. Some of you are certainly dealing with spatial data, but I think not all of you. Is that correct, Ming? In this group we have non-spatial people as well? Probably the people who came here today do deal with spatial data quality. Okay, never mind. Anyway, this talk is about standardization of spatial data quality, and we will look at the two main bodies which drive this space: the OGC, the Open Geospatial Consortium, and the technical committee at ISO. I'd like to start with introducing both groups and then see what the synergies between the two are. ISO/TC 211 is a technical committee that deals with standardization of all aspects of geographic information and geomatics. It was established in 1994, following the initiative of the European standardization committee (CEN), which had started driving standardization in the spatial data, or geographic information, domain. Today ISO/TC 211 has 38 participating members, among which is Australia, and 32 observing members. It has more than 40 internal and external liaisons, including bodies such as the Open Geospatial Consortium and the W3C, the two main organizations which deal with the geospatial web. 
The role of ISO/TC 211 is to produce standards within the 19100 series, and the overall philosophy is to build these standards with a long-term perspective. That is also the reason that most of these standards are quite abstract, offering the conceptual foundation for a specific aspect of geographic information rather than specific implementation instructions. Currently we have 75 published standards and 25 standards under development, so it's quite a large body of work. Some are closely related, some not so closely, but the series is not meant to be implemented as a full collection; standards can be selected individually based on the needs of the implementing organization. On this list I've only selected a few standards that are interesting for us in this group, those that relate to data quality. Some specifically define data quality, its model and its aspects, and some define additional quality elements; those are the standards for data quality and for metadata. In Australia, within organizations dealing with spatial data, the standards for metadata are well known, and most organizations implement them in full or as a profile. I'm not so sure about the data quality standards; they are implemented indirectly most of the time, as part of the metadata implementation, and I haven't seen a full implementation of the data quality standard so far, but that may be only me. The model of spatial data quality as defined in the ISO series of standards is as follows. Data quality is expressed through data quality elements, a long list of elements which can be quantified, or at least qualified, some of them. 
These elements relate to a data quality scope, which in turn relates to the dataset scope, so evaluating data quality for only part of a dataset is possible within the ISO standards. These data quality elements are typically reported in metadata, and those that are directly or indirectly evaluated are also reported in more detailed data quality evaluation reports. The elements currently present in the standards are the following. There are the five notoriously well-known ones in the spatial data quality domain: completeness, logical consistency, temporal quality, thematic accuracy and positional (or spatial) accuracy, plus one additional element which is fairly new, from 2013, called the usability element. This set is very likely to change within the coming few years, and I'll conclude my talk with more detail on that. The other defining organization for our standardization is the Open Geospatial Consortium. Contrary to ISO/TC 211, which is a formal standardization body producing so-called de jure standards, OGC produces de facto standards, those driven by the immediate needs of industry, so these are very technical and usually implementation specifications. As you can see from the new definition of OGC, this organization strives to produce the geo-web and services in a FAIR way, a concept many of you might have encountered in recent years: it stands for findable, accessible, interoperable and reusable. FAIR wasn't a thing a few years back, so it wasn't at OGC either, but the organization is picking up the general approach of FAIR data science, if we can say that, and has changed its mission to pay specific attention to that aspect as well. 
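The scope-plus-elements model above can be pictured as a small data structure: a report carries a scope, and each evaluated element carries its category, measure and result. The following is only a minimal sketch of the idea in Python; the class and measure names are my own invention, not the normative ISO 19157 encoding.

```python
from dataclasses import dataclass, field

# The five classic ISO 19157 quality element categories, plus the
# usability element added in the 2013 edition.
ELEMENT_CATEGORIES = [
    "completeness",
    "logical consistency",
    "temporal quality",
    "thematic accuracy",
    "positional accuracy",
    "usability",
]

@dataclass
class QualityElement:
    """One evaluated data quality element (illustrative names only)."""
    category: str    # one of ELEMENT_CATEGORIES
    measure: str     # e.g. "rate of missing items"
    result: float    # quantitative result of the evaluation
    unit: str = "%"  # unit of the result

    def __post_init__(self):
        if self.category not in ELEMENT_CATEGORIES:
            raise ValueError(f"unknown quality element category: {self.category}")

@dataclass
class QualityReport:
    """Quality results for a given scope (whole dataset or a subset)."""
    scope: str                                   # e.g. "dataset" or "roads layer"
    elements: list = field(default_factory=list)

    def add(self, element: QualityElement) -> None:
        self.elements.append(element)

    def summary(self) -> dict:
        return {e.category: f"{e.result}{e.unit}" for e in self.elements}

# Example: report two elements for a subset of the dataset.
report = QualityReport(scope="roads layer")
report.add(QualityElement("completeness", "rate of missing items", 2.5))
report.add(QualityElement("positional accuracy", "RMSE of horizontal position", 0.8, unit=" m"))
```

The point of the sketch is the scoping: because the report, not the dataset, carries the scope, the same element categories can be evaluated for the whole dataset or for any subset of it, which matches how the ISO model separates quality scope from dataset scope.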
For OGC standards, there are two main types, abstract specifications and implementation specifications, and because of the very close liaison between OGC and ISO, the abstract specifications are typically equivalent to the ISO conceptual schemas. So, for instance, the OGC abstract specification for feature geometry, the representation of real-world objects, is specified exactly as in the spatial schema standard by ISO. Several implementation specifications exist; here are just a few examples which might be familiar to you. There are two other interesting types of documents on OGC's website which are worth following. These are the engineering reports, which report results of the innovation program where new standards are developed or existing standards are implemented, and the discussion papers, which normally pick up topics that are hot outside the spatial data domain and try to see what they mean within it. One example is the discussion paper on UncertML, which was a proposal to define metrics to mark up uncertainty measures and the expression of spatial variation in objects. In this graph, I'm not going to go into detail; I just wanted to include it in the slides to show you the synergies between the various organizations in standardization. You can see there are different types of liaisons between them, and the Class A liaison between OGC and ISO/TC 211 means that whatever topic is standardized within ISO/TC 211 is not going to be redundantly standardized at OGC, but OGC adopts that standard within their set, and the other way around. So, for instance, implementation specifications such as the Web Map Service and the Web Feature Service have been defined by the OGC and adopted within the ISO 19100 series. 
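To give a flavour of how concrete these implementation specifications are, a Web Map Service GetMap call is just an HTTP GET with a fixed set of parameters. Here is a minimal sketch of building such a request in Python; the endpoint URL and layer name are placeholders, while the parameter names follow the WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build a WMS GetMap request URL.

    Parameter names follow WMS 1.3.0 (also published as ISO 19128);
    the endpoint itself is a placeholder, not a real service.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(c) for c in bbox),  # min/max coords per the CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical request for a "roads" layer over a Western Australian extent.
url = wms_getmap_url("https://example.org/wms", ["roads"],
                     bbox=(-35.0, 112.0, -13.0, 129.0), width=800, height=600)
```

Because every conformant server accepts the same parameters, a client written once against the specification works against any WMS endpoint, which is exactly the interoperability these de facto standards are after.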
This is interesting for one other reason: OGC strives to produce open standards, whereas ISO produces standards which are conceptually open but cost a lot of money. So if you are after the standard for the Web Map Service, or another web service, it's wiser to have a look at OGC's website and download it from there, because the content is exactly the same. At OGC there are several standards working groups and domain working groups. Standards working groups are set up ad hoc when a request for a new standard comes in, and members join efforts in producing the standard. The domain working groups aim at facilitating the work of these standards groups, or other groups, in certain topics, and one of the topics with a domain working group is data quality. What we mean here is spatial data quality, but also quality of service and quality a bit more generally, so not only spatial data driven. This group was established in 2006, and one of the first items on its program of work was to learn what data quality means to the various stakeholders in the geospatial supply chain, that is, all sorts of bodies involved in the production or consumption of spatial data. In this survey we had quite a lot of responses, more than 700, from a wide range of organizations, and interestingly only 17% of these were OGC members at the time, so this was really a wide view from outside organizations, and most of them were both suppliers and consumers of spatial data. The result, in summary, was that almost everybody stated that data quality is important, but a large share, 60%, indicated that they had no idea how to manage it. Now, you might think that since 2008 this must have changed, but I'm not so optimistic; I think if we ran the benchmark again, the percentage might even rise slightly. I think it's very similar today. 
Now, when the group was established at OGC, and as a result of this survey, its main mission was to help provide guidance in implementing quality control in organizations. There were data quality standards at the time, with different numbers, also defining the fundamental set of elements such as accuracy, consistency and integrity, and this group aimed at helping implementing organizations to establish quality evaluation routines in their process workflows. One of the first deliverables was exactly related to that: a topological quality assessment guidance. Today the data quality domain working group has around 40 members, 41 perhaps, I didn't check today, but more or less. We have three co-chairs, two based in Europe and one based in Australia, and what we are interested in today is understanding in more detail what quality means for producers and consumers of spatial data. You can see that since 2018 the core interest hasn't changed, because up until today there is no agreement between these two ends of the supply chain: it's not so sure that producers understand what users mean, and the other way around. Then we'd like to see what the limitations in the use of data quality information are. These are the activities related to constantly evaluating the applicability of existing standards for producers, for example with the current data quality standard: is it possible, is it easy to implement, where are the problems, what needs to be changed. We also would like to know how to manage the quality of non-authoritative spatial data resources, typically coming from citizen volunteers. This type of spatial data resource is increasingly common and increasingly being adopted and combined with authoritative data, but the two have very different notions of spatial data quality. 
How we work: we usually meet at the quarterly OGC technical committee meetings, in person or online, and part of the objective of those meetings is discussion, so we have several presentations followed by group discussions about selected aspects of data quality. We also commonly do reviews, mainly of existing standards. For instance, in 2010 the group reviewed both the metadata standard and the data quality standard, and we do internal reviews, before engineering reports are published at OGC, of those reports that relate directly to data quality or its aspects. At this moment the group is indirectly involved in the revision of ISO 19157. What does indirectly mean? The data quality domain working group at OGC is not itself participating in the revision project, but we have project members, nominated through standards bodies, who are also members of the group. So let me conclude this talk with some information about the revision of ISO 19157, the data quality standard. 
Currently we have three parts of that standard: the base standard, one amendment and one implementation specification, so part of the revision is to join these and also update their content. There is a new project at ISO, started in July 2019, and it aims at producing a new standard called Data quality, Part 1: General requirements, which defines the fundamental concepts for data quality for geographic information. There are two project leaders, myself, nominated by Standards Australia, and my colleague from the Swedish Standards Institute, and we have support from 227 experts nominated by their respective standards organizations, among which some are also members of the OGC. The timeline is as outlined, and the standard should be ready in June 2020. One interesting thing to say is that the revision was proposed despite the fact that most national standards bodies confirmed the standard as-is during the systematic review; there were only three objections, or suggestions for revision. Normally, if 19 members approve and just three object, ISO's technical committee would not propose a revision, but the arguments in those three comments were so strong that the standard was proposed for revision despite being confirmed by the majority. I think this is a good thing, because the standard as it stands is still not that straightforward to implement, there are many uncertain things in its definitions, and it deserves another look. And I think that was all from myself; thank you for your attention, and if there are any questions, feel free to ask.