
Business Semantics and ISO 15926

ISO 15926 describes various aspects of data structure modeling and application data exchange. These are somewhat similar to the principles used in our software, but are, in general, much more complex. IT professionals familiar with ISO 15926 may ask: "Is Business Semantics compatible with this standard?"

The short answer is: Business Semantics can be used to transfer ISO 15926 data, and can act as a "facade" - the data exchange point defined by the standard (in fact, just a SPARQL endpoint).

To give a more detailed answer, let us first state the necessary facts about ISO 15926 and then describe how its requirements can be applied to Business Semantics. We will end with a use case showing how Business Semantics can be used in an ISO 15926 environment.

1. ISO 15926 - what is it?

ISO 15926 is a very large standard with a long development history, initially intended for the petroleum industry, but now universal enough to claim coverage of all possible data exchange schemes. The standard has several parts describing information modeling principles, a standard type library, reference data, modeling templates and patterns, etc. In particular, it defines a data exchange interface (facade), which is in fact a SPARQL endpoint populated with the data of a particular application and queryable by all other applications.

2. Standard vs Implementation

First of all, ISO 15926 is a standard. Various software implementations may be compatible with this standard in particular aspects. There are no well-known, readily available implementations of software stacks covering all of its possible uses. This primarily concerns data exchange interfaces.

Business Semantics Server is software that allows you to set up and run semantic-based information integration quickly and easily. In particular, Business Semantics can be used in ISO 15926-compatible environments, as we will show below.

3. The Goal and the Means

ISO 15926 was developed primarily for large industries: production, processing, etc. It is implied that ISO 15926-based integration involves several enterprises linked by some production chain. This leads to the necessity of mapping data from sources controlled by different stakeholders onto a common "dictionary", or ontology. This, in turn, led the ISO 15926 developers to the idea of creating common high-level ontologies and concepts. Particular data is expressed using terms inherited from these ontologies.

In our opinion, this approach is neither effective nor necessary for many practical purposes, primarily in mid-sized enterprise integration. By carefully following ISO 15926, we may create an excessively complex data model that brilliantly describes the enterprise architecture, business processes and data flows, but is overcrowded and ineffective in practical use. In such cases a compact custom ontology, built for the particular case, may be much more effective.

Each task requires adequate means for its solution. We could build a nuclear plant to supply a small settlement with electricity (= ISO 15926), but it is obvious that from an economic point of view it would be much more rational to build a hydro or wind power plant (= Business Semantics). And it would still be more effective than using firewood (= most of the currently used data exchange methods).

4. Active mode vs Federated access

Another important issue is the principle of interaction. ISO 15926 implies that there are two procedures - data push, when one application sends data to another application's facade, and pull, when it retrieves information. With this approach it is very hard to implement data synchronization, which is, in general, Business Semantics' primary objective.

In ISO 15926, a querying application must perform data selection by building an appropriate SPARQL query. This is a custom and usually very complex process, which cannot always be implemented in a mode that requires no human intervention. In Business Semantics, data exchange is performed completely automatically; the administrator only has to set up the exchange schema once.

The ISO 15926 standard does not define such implementation details as conflict resolution in case of simultaneous updates of one information object in several applications, and it says nothing about data integrity recovery, security, or access rights. All these features, implemented in Business Semantics, are our know-how and an obvious advantage, very important in practical use.

5. ... but ISO 15926 is still relevant and necessary!

However, we do not neglect the main benefits of ISO 15926. First, it is an international standard that most future integration tools will comply with. It has clear advantages for inter-corporate data exchange. That is why we offer ISO 15926 compatibility in our software.

6. Use case: Business Semantics in ISO 15926 environment

Compatibility between Business Semantics and ISO 15926 must be discussed bearing in mind that Business Semantics is only a data exchange tool. It does not perform any logical data processing, except conversion to and from local application data structures using mappings, and the logic written in custom client-side handlers.
So the correct questions are: can Business Semantics transfer data built according to ISO 15926 rules? Can it interact with ISO 15926-enabled applications? The answer is "yes". After all, ISO 15926 data is expressed as RDF triples and stored or retrieved using a SPARQL server (facade). Business Semantics can do that.

Let's take a practical case from Hans Teijgeler's work "ISO 15926: an Introduction". It describes transferring temperature measurement data from a certain sensor. Let's imagine that Business Semantics is working with a relational database storing measurement data. Its task is to convert these data to semantic form and put them into the SPARQL facade.

According to ISO 15926, measurement data has to be expressed using the PropertyWithValueOfTemporalPart template. A measurement has the following OWL representation:

<owl:Thing rdf:ID="T340982">
  <rdf:type rdf:resource="&p7tpl;PropertyWithValueOfTemporalPart"/>
  <meta:annUniqueName rdf:datatype="&xsd;string">
    A temporal part of MPO349818 has a temperature of 67 Celsius since
    2013-03-17T12:00:00Z
  </meta:annUniqueName>
  <meta:annRule rdf:datatype="&xsd;string">#com1_4598292121</meta:annRule>
  <meta:annAccessCode rdf:datatype="&xsd;string">#com1_273872</meta:annAccessCode>
  <p7tpl:hasTemporalWhole rdf:resource="#MPO349818"/>
  <p7tpl:hasPropertyPossessor rdf:resource="#MPO349818_2013-03-17T12-00-00Z"/>
  <p7tpl:hasPropertyType rdf:resource="&rdl;R41192093771"/> <!-- Temperature -->
  <p7tpl:valPropertyValue rdf:datatype="&xsd;float">67</p7tpl:valPropertyValue>
  <p7tpl:hasPropertyScale rdf:resource="&rdl;R74877992703"/> <!-- degree Celsius -->
  <p7tpl:valStartTime rdf:datatype="&xsd;dateTime">
    2013-03-17T12:00:00Z
  </p7tpl:valStartTime>
</owl:Thing>

Note the main components of this piece of data:

  • T340982 - the URI of this measurement (identifies the measurement as an information object)
  • MPO349818 - the sensor URI
  • R41192093771 - the URI of the kind of measurement ("temperature")
  • R74877992703 - the measurement unit URI (degrees Celsius)
  • 67 - the measurement result (temperature, in degrees)
  • 2013-03-17 12:00:00 - the moment of the measurement

This information corresponds to the following set of RDF triples (the table below is from Hans Teijgeler's work cited above):

ISO 15926 data RDF triples

Business Semantics can form, process and transfer these triples. To do this, it needs to know the definitions of all the terms used in these triples, such as http://www.w3.org/2006/12/owl2-xml#Thing or http://www.infowebml.ws/owl/tpl#hasPropertyType. We also have to define access rights (this capability is a clear advantage of Business Semantics, as the standard says nothing about security) and the mapping. But first of all, we need to load into the server the fragment of the ontology used to express our data. We say "fragment" because loading the full, very large set of ISO 15926 definitions is unnecessary for a particular case, and at the very least would make working with the administration console inconvenient. So we save the definitions required in this particular case into an OWL file and load it into the server.
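Purely for illustration, such a fragment could be assembled and saved with a small script. Here is a minimal sketch in Python with rdflib, declaring only the template terms used in this example; the property list and the file name are our assumptions, not part of the standard or of Business Semantics:

from rdflib import Graph, Namespace
from rdflib.namespace import RDF, OWL

P7TPL = Namespace("http://www.infowebml.ws/owl/tpl#")

g = Graph()
g.bind("owl", OWL)
g.bind("p7tpl", P7TPL)

# The template class used in this example
g.add((P7TPL.PropertyWithValueOfTemporalPart, RDF.type, OWL.Class))

# Object properties pointing to other resources
for name in ("hasTemporalWhole", "hasPropertyPossessor",
             "hasPropertyType", "hasPropertyScale"):
    g.add((P7TPL[name], RDF.type, OWL.ObjectProperty))

# Datatype properties carrying literal values
for name in ("valPropertyValue", "valStartTime"):
    g.add((P7TPL[name], RDF.type, OWL.DatatypeProperty))

# Save the fragment as an OWL (RDF/XML) file to be loaded into the server
g.serialize(destination="iso15926_fragment.owl", format="xml")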

The client module's task is forming and interpreting triples, so in fact standard compatibility applies more to the client than to the server. The Web connector we currently use works with the same ontology as the server, so we have to request it from the server through the client administration panel, or load it as a file. Then we have to choose a mapping for each ontology element used in our case, or write a custom handler.

So, after loading the ontology into the client module, we have to set up the mapping. As we said, in our case we are dealing with a database containing tables that store measurement results, the list of sensors, etc. To simplify the example, let us take one table. Let it be named temp_data, with the following columns:

  • id - local record (measurement) identifier (1, 2, 3, etc.);
  • uid - measurement URI (T340982 in our example);
  • dt - date and time of the measurement;
  • value - the measured temperature.

In a real case this table would also contain a sensor reference, but we omit it here for simplicity.
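Purely for illustration, a minimal version of such a table could look as follows in Python with SQLite; the DBMS, the exact schema and the sample values are assumptions made only for this sketch:

import sqlite3

conn = sqlite3.connect("measurements.db")

# Columns follow the description above: local id, measurement URI,
# timestamp of the measurement, and the measured temperature
conn.execute("""
    CREATE TABLE IF NOT EXISTS temp_data (
        id    INTEGER PRIMARY KEY,
        uid   TEXT NOT NULL,
        dt    TEXT NOT NULL,
        value REAL NOT NULL
    )
""")

# One sample "measurement", matching the example data used in this article
conn.execute(
    "INSERT INTO temp_data (uid, dt, value) VALUES (?, ?, ?)",
    ("T340982", "2013-03-17T12:00:00Z", 67.0),
)
conn.commit()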

Below you can see the mapping setup page of the Business Semantics Client administration panel, containing all the necessary settings:

Business Semantics settings

Note that for the entity, besides the DB table reference, we define a handler method named handleTemp. It forms the elements of the PropertyWithValueOfTemporalPart template that cannot be derived from database columns: hasPropertyScale, hasPropertyPossessor, etc.
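The actual handler interface of Business Semantics Client is not shown here, so the following Python-style sketch is only a guess at what handleTemp might do: it supplies the template elements that have no counterpart among the temp_data columns. The signature, the constants and the return format are hypothetical:

# Hypothetical constants: in a real setup these would come from the sensor
# list and from the ISO 15926 reference data library (RDL)
SENSOR_URI     = "MPO349818"
PROPERTY_TYPE  = "&rdl;R41192093771"    # Temperature
PROPERTY_SCALE = "&rdl;R74877992703"    # degree Celsius

def handleTemp(row):
    """Form the template elements that cannot be derived from DB columns."""
    # Temporal part of the sensor, named after the measurement timestamp
    temporal_part = SENSOR_URI + "_" + row["dt"].replace(":", "-")
    return {
        "hasTemporalWhole":     SENSOR_URI,
        "hasPropertyPossessor": temporal_part,
        "hasPropertyType":      PROPERTY_TYPE,
        "hasPropertyScale":     PROPERTY_SCALE,
    }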

Also note that most of the ontology elements that are in fact references to other entities (e.g. the sensor, the measured value type, the scale) have the "string" type here; in a real environment they should be actual references. For example, if we had a sensor list table in the DB, we would make the hasTemporalWhole element a reference to it.

After these settings are done and the handleTemp procedure is written (it is 12 lines of code), we may run the integration. The moment the application working with the sensor puts a new record into the database, Business Semantics Client transforms this information into triples and sends it to the Business Semantics Server. The message has the following content (we use Turtle syntax for exchange):

@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix owl: <http://www.w3.org/2006/12/owl2-xml#> .
@prefix p7tpl: <http://www.infowebml.ws/owl/tpl#> .
@prefix : <http://business-semantic.ru#> .

:T340982
  rdf:type owl:Thing ;
  p7tpl:hasPropertyPossessor :MPO349818_2013-03-17T12-00-00Z ;
  p7tpl:hasPropertyScale "&rdl;R74877992703" ;
  p7tpl:hasPropertyType "&rdl;R41192093771" ;
  p7tpl:hasPropertyValue "67" ;
  p7tpl:hasTemporalWhole :MPO349818 ;
  p7tpl:valStartTime "2013-03-17T12:00:00Z" ;
  rdf:type "&p7tpl;PropertyWithValueOfTemporalPart" ;
.

Business Semantics Server, in its turn, sends this information to the SPARQL server (Apache Jena Fuseki in our experience). To test all this, we created two "measurements" and transferred them into Fuseki.
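As a point of reference, the same Turtle message could also be loaded into a local Fuseki dataset directly through its Graph Store Protocol endpoint. In the sketch below the dataset name "ds", the file name and the server URL are assumptions about a default local installation:

import requests

with open("measurement.ttl", "rb") as f:
    turtle_data = f.read()

# POST the Turtle document into the default graph of the "ds" dataset
response = requests.post(
    "http://localhost:3030/ds/data?default",
    data=turtle_data,
    headers={"Content-Type": "text/turtle"},
)
response.raise_for_status()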

Now let's imagine another application that wants to query the facade to obtain some measurement data. It would make the following request:

PREFIX : <http://business-semantic.ru#>
PREFIX p7tpl:<http://www.infowebml.ws/owl/tpl#>

SELECT *
WHERE {
?measure p7tpl:hasTemporalWhole :MPO349818
}

We ran this query manually through the Fuseki console and obtained the following result:

(The http://business-semantic.ru prefix is arbitrary; any other may be used instead.)

The result contains the URIs of all measurements made by the given sensor. Now let the external application query our facade for the time and result of the first measurement:

PREFIX : <http://business-semantic.ru#>
PREFIX p7tpl:<http://www.infowebml.ws/owl/tpl#>

SELECT *
WHERE {
:T340982 p7tpl:valStartTime ?time.
:T340982 p7tpl:hasPropertyValue ?value.
}
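Outside of the Fuseki console, an external application could issue the same query programmatically. Here is a minimal sketch with Python and SPARQLWrapper, assuming the facade is a local Fuseki dataset named "ds":

from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://localhost:3030/ds/query")
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX : <http://business-semantic.ru#>
    PREFIX p7tpl: <http://www.infowebml.ws/owl/tpl#>
    SELECT * WHERE {
        :T340982 p7tpl:valStartTime ?time .
        :T340982 p7tpl:hasPropertyValue ?value .
    }
""")

# Print the time and value of the first measurement
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["time"]["value"], row["value"]["value"])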

The query for the second measurement differs only in the measurement URI, and gives the following result:

So we may state that Business Semantics has performed its task of retrieving information from the source relational DB and transforming it into semantic form according to ISO 15926 requirements.

Using this case, we can classify our software's ISO 15926 compatibility on the scale given in the JORD ISO 15926 Compliance Specification. Our application does not perform every function listed in that document; for instance, we do not have an ontology modeling tool. So let us list only the applicable criteria and the achieved level of ISO 15926 compliance (assuming that Business Semantics works in conjunction with a SPARQL server):

  • 2.3 Representation Technology - Category (iii), RDF/OWL Schema Level: The interface format is defined according to a registered RDF/OWL compliant XML-Schema (e.g. as defined in Part 8).
  • 2.4 Interface Technology - Category (iii), SPARQL Query Level: The interface supports querying through a SPARQL-based Façade (e.g. as defined in Part 9).
  • 2.7 Change-Management Meta-Data - Category (i), Identity Only Level: It is a minimum requirement that all versionable objects subject to change management have business identifiers as meta-data content in the interoperability interface.
  • 2.8 Change-Management Functionality - Category (iii), Seeding Level: The ability of a system to persist imported data, independent of existing data, if any.