During the Cloud Interoperability Week, the face-to-face Test Sessions will focus on interoperability testing among different implementations of Cloud Standards: CAMP, CDMI, CIMI, OCCI, OVF.

Remote participation in the CloudPlugfest is possible, but experience shows that in-person attendance at one of the CloudPlugfest locations makes the testing more efficient.

The Test Descriptions for the Cloud Interoperability Week are covered by the following two documents:

1) Test Descriptions for Cloud Interoperability (OCCI, CDMI): ETSI TS 103 142

This document covers the following standards/configurations:
  • OCCI Interoperability (client/server)
  • CDMI Interoperability (client/server)
  • OCCI + CDMI interworking

2) Test Descriptions for Cloud Interoperability (OVF, CAMP, CIMI): CTI Guide

This new document covers the following standards/configurations:
  • CAMP interoperability (client/server)
  • OVF interoperability (provider/consumer)
  • CIMI interoperability (provider/consumer)
  • CAMP + OVF interworking
  • CIMI + OVF interworking

The idea behind the Interoperability Test Descriptions is to give CloudPlugfest participants a framework for running interoperability testing in a consistent and efficient way.

Some participants implement clients, others implement servers, and some implement both. To validate that clients and servers from different participants are interoperable (i.e., can talk to and understand each other), we run multiple Test Sessions covering all possible combinations of clients and servers from different participants.

During a Test Session dedicated to one particular base standard, a client tester and a server tester get together and test the level of interoperability of their implementations following the applicable Test Descriptions. 

A typical Test Session following one test description would look like this:

Client Tester: Hey Server Tester, do you support the feature "Read value from existing CDMI Data Object"? It has as pre-requisite "Existing CDMI Data Object with capability cdmi_read_value".

Server Tester: Yeah, I support that feature, and the pre-requisites are fulfilled.

Client Tester: Great, here's my request.

Server Tester: Let me check that request... Mhm, looks good, you seem to follow the spec. Here's my response.

Client Tester: Perfect. The response seems to differ a bit from the information described in the check step, so you seem to be a bit sloppy in implementing the spec. Nevertheless, I understood what you wanted to express: I can verify that the value of the CDMI Data Object is displayed in the client, so everything is fine from an interoperability point of view. Let's move on to the next test description.
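The CDMI exchange in the dialogue above can be sketched in code. This is a minimal, illustrative sketch in Python: the object path and the sample response body are invented for illustration, while the header names and media type (X-CDMI-Specification-Version, application/cdmi-object) follow the CDMI specification as far as we recall it; no network traffic is involved.

```python
import json

def build_read_request(object_path):
    """Build the client's CDMI 'read data object' request (method, path, headers)."""
    return {
        "method": "GET",
        "path": object_path,  # hypothetical path, chosen for illustration
        "headers": {
            "Accept": "application/cdmi-object",
            "X-CDMI-Specification-Version": "1.0.2",
        },
    }

def extract_value(response_body):
    """Client-side verify step: pull the Data Object's value out of the
    server's JSON response so it can be displayed in the client."""
    fields = json.loads(response_body)
    return fields["value"]

# An invented server response body, shaped like a CDMI data-object response.
sample_response = json.dumps({
    "objectType": "application/cdmi-object",
    "objectName": "hello.txt",
    "value": "Hello, CloudPlugfest!",
})

request = build_read_request("/cdmi/container/hello.txt")
print(request["headers"]["Accept"])    # application/cdmi-object
print(extract_value(sample_response))  # Hello, CloudPlugfest!
```

In a real Test Session the request would of course be sent over HTTP to the server tester's implementation; here the two sides are stubbed out so the shape of the exchange is visible.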

As you can see, the Test Descriptions provide the test pre-requisites plus a step-by-step test procedure addressing two complementary levels of validation:

Interoperability, validated with verify steps: can the client and server understand each other? Evaluating this requires some degree of user perspective.

Conformance, validated with check steps: are the messages they exchange compliant with the base standard? This can be done manually by observing the exchanged messages, which is very time-consuming and can only be applied to a limited number of Test Descriptions; or it can be automated and run offline on a recorded test session (a pcap or text file).
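An automated, offline check step might look like the following minimal sketch, assuming the test session was recorded as raw HTTP response text. The specific checks (status line, Content-Type, required JSON fields) are illustrative only; a real checker would encode every check step of the applicable Test Description.

```python
import json

def check_recorded_response(raw):
    """Run illustrative conformance checks on one recorded HTTP response
    (raw text) and return the list of violations found."""
    violations = []
    head, _, body = raw.partition("\r\n\r\n")
    status_line, *header_lines = head.split("\r\n")
    if not status_line.startswith("HTTP/1.1 200"):
        violations.append("unexpected status line: " + status_line)
    headers = dict(line.split(": ", 1) for line in header_lines)
    if headers.get("Content-Type") != "application/cdmi-object":
        violations.append("wrong Content-Type")
    try:
        fields = json.loads(body)
    except ValueError:
        violations.append("body is not valid JSON")
        return violations
    for required in ("objectType", "value"):
        if required not in fields:
            violations.append("missing field: " + required)
    return violations

# An invented recorded response; an empty violation list means it passed.
recorded = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: application/cdmi-object\r\n"
    "\r\n"
    '{"objectType": "application/cdmi-object", "value": "Hello"}'
)
print(check_recorded_response(recorded))  # [] -> no violations found
```

Running such checks over a whole recorded session lets the conformance validation scale to many Test Descriptions without tying up the testers' time.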

This methodology, defined by ETSI, has been successfully applied by its Centre for Testing and Interoperability to achieve interoperability in many different technologies.

Alan Sill,
Oct 4, 2013, 2:43 AM