Thursday, June 24, 2010

Think Link, not Sync when doing tool integrations

When working with ALM tool development teams, partners and customers, there is often a need to go through an evolutionary re-thinking of how tool interoperability should work.  When discussing how OSLC can help, the conversation often starts with how they can throw out their current tens of product-specific connectors in favor of a single OSLC API to synchronize data with their tool.  There are cases where pulling data, rather than bi-directional synchronization, has value, for example efficient data warehouse access for reporting solutions.  Still, I challenge these tool integrators and vendors to re-think their integrations to instead provide "just enough" information about the other tool being integrated with.  This "just enough" can be as simple as two things: 1) a link to a resource owned by the other tool and 2) knowing the semantics of that link (OSLC).
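
To make "just enough" concrete, here is a minimal sketch using Jena: the local tool records only a typed link to the resource the other tool owns, and nothing else is copied.  The change request and requirement URIs are made-up placeholders, and oslc_cm:implementsRequirement is just one example of an OSLC-defined link type.

    import com.hp.hpl.jena.rdf.model.Model;
    import com.hp.hpl.jena.rdf.model.ModelFactory;
    import com.hp.hpl.jena.rdf.model.Property;
    import com.hp.hpl.jena.rdf.model.Resource;

    public class LinkNotSync {
        public static void main(String[] args) {
            Model model = ModelFactory.createDefaultModel();
            model.setNsPrefix("oslc_cm", "http://open-services.net/ns/cm#");

            // The change request this tool owns (placeholder URI).
            Resource changeRequest = model.createResource(
                    "http://cm.example.com/changerequests/1234");

            // A link with OSLC-defined semantics to a requirement owned by
            // another tool; no requirement data is copied locally.
            Property implementsRequirement = model.createProperty(
                    "http://open-services.net/ns/cm#", "implementsRequirement");
            changeRequest.addProperty(implementsRequirement,
                    model.createResource("http://rm.example.com/requirements/42"));

            model.write(System.out, "TURTLE");
        }
    }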


There are many advantages to leaving the data within the tool that owns it and providing a link to it:
  • The tool that owns the data knows best the rules that govern changes to it, including auditing support
  • The tool that owns the data knows best how to control access to it; when data is copied, the access controls often need to be replicated as well (as best as possible)
  • State-models of resources across tools often don't align
  • There are ways to expose this data in other tools without replicating it
It is sometimes convenient to have a cloned/cached copy of the data in a local tool; OSLC does not prohibit this and can support it.

Update:  Also see the interesting and related IBM DeveloperWorks article "Stop copying, start linking"

Wednesday, June 16, 2010

OSLC reference implementations and test suites

There have been a number of implementation efforts underway for various OSLC specifications.
I've heard many requirements including...

for Service Providers:
  • A hosted reference implementation that can react to client consumer requests, to ensure consistent behavior across implementations
  • Make the source code available for download
  • Allow contributions to the source
  • The language it is written in is less important, though some tend towards JEE-based implementations
  • A client test suite that can give some level of validation of provider behavior
  • Samples that highlight key integration scenarios
  • A framework in which to quickly enable new implementations
for Consumers:
  • A hosted reference implementation that can react to client consumer requests
  • A reference service provider that can provide feedback on consumer implementations (testsuite)
  • Provide a variety of samples
  • Java client samples and/or SDK (see the sketch just after this list)
  • Command-line or Perl-based samples and/or SDK
  • HTML/JavaScript samples and/or SDK
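
As a starting point for the Java client samples, something as small as the following sketch would already help.  The catalog URL is a made-up placeholder, and a real consumer would also need to deal with OAuth or other authentication.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class SimpleOslcConsumer {
        public static void main(String[] args) throws Exception {
            // Placeholder service provider catalog URL; replace with a real one.
            URL catalog = new URL("http://example.com/oslc/catalog");

            HttpURLConnection conn = (HttpURLConnection) catalog.openConnection();
            conn.setRequestMethod("GET");
            // Ask for an RDF/XML representation of the catalog.
            conn.setRequestProperty("Accept", "application/rdf+xml");

            System.out.println("HTTP " + conn.getResponseCode());
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // raw RDF/XML of the catalog
            }
            in.close();
        }
    }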

There are some current efforts underway.  Some thoughts on the technology basis for a service provider reference implementation (a sketch combining a few of these pieces follows this list):
  • Apache Wink - REST framework
  • Jena
    - RDF/XML, Turtle parsers and generators
    - Apply custom rules for RDF/XML and Turtle
    - Add JSON support
    - Simple storage
    - Extended to support ResourceShapes
    - Query - mapping of query syntax - oslc.where/select
    - Resource subsets - oslc.properties
  • OAuth - Provider only
    (need good consumer example)
  • Service Discovery
    - Various models combinations of Catalogs and ServiceProviders
  • Web UI
    - Simple example/demo HTML/JS
    - Prefill
      - Via a draft resource creation
      - Via direct prefill and redirect
    - UI Preview
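
To make the Wink + Jena combination a bit more concrete, here is a rough sketch of what a single resource in such a reference implementation could look like.  It only uses standard JAX-RS annotations (which Apache Wink implements) plus Jena for serialization; the URIs and the hard-coded title are illustrative assumptions, and a real implementation would pull the data from storage and add the full set of OSLC-CM properties.

    import java.io.StringWriter;

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;

    import com.hp.hpl.jena.rdf.model.Model;
    import com.hp.hpl.jena.rdf.model.ModelFactory;
    import com.hp.hpl.jena.rdf.model.Property;
    import com.hp.hpl.jena.rdf.model.Resource;
    import com.hp.hpl.jena.vocabulary.RDF;

    // A minimal OSLC-CM-style change request resource, served via JAX-RS
    // annotations and serialized with Jena.
    @Path("/changerequests")
    public class ChangeRequestResource {

        @GET
        @Path("{id}")
        @Produces("application/rdf+xml")
        public String getChangeRequest(@PathParam("id") String id) {
            Model model = ModelFactory.createDefaultModel();
            model.setNsPrefix("oslc_cm", "http://open-services.net/ns/cm#");
            model.setNsPrefix("dcterms", "http://purl.org/dc/terms/");

            // In a real implementation this would come from the tool's storage;
            // it is hard-coded here just to show the shape of the response.
            Resource cr = model.createResource(
                    "http://provider.example.com/changerequests/" + id);
            cr.addProperty(RDF.type, model.createResource(
                    "http://open-services.net/ns/cm#ChangeRequest"));
            Property title = model.createProperty(
                    "http://purl.org/dc/terms/", "title");
            cr.addProperty(title, "Sample change request " + id);

            StringWriter out = new StringWriter();
            model.write(out, "RDF/XML-ABBREV");
            return out.toString();
        }
    }

Query support (oslc.where/oslc.select) and oslc.properties subsetting would then layer on top of the same Jena model.
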
Dave Johnson posted some thoughts as well here, which are fairly close to these.

Feedback and additional requirements are welcome.