When working with ALM tool development teams, partners, and customers, there is often a need for an evolutionary re-thinking of how tool interoperability should work. When discussing how OSLC can help, the conversation usually starts with how they can throw out their current tens of product-specific connectors in favor of a single OSLC API to synchronize data with their tool. There are cases where pulling data, rather than bi-directional synchronization, has value, such as efficient data warehouse access for reporting solutions. But I challenge these tool integrators and vendors to re-think their integrations to instead provide "just enough" information about the other tool being integrated with. This "just enough" can be as simple as two things: 1) a link to a resource owned by another tool, and 2) knowing the semantics of that link (OSLC).
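To make that concrete, here is a minimal sketch of what "just enough" can look like. The defect record and the requirement URI are hypothetical; the point is that the consuming tool stores only a typed link whose semantics come from the OSLC vocabulary (here the OSLC CM property `oslc_cm:implementsRequirement`), and follows that link when details are needed:

```python
import requests

# The only integration data the change-management tool keeps: a typed link.
# The semantics of the relationship come from the OSLC CM vocabulary, not
# from a copied-in snapshot of the requirement itself.
defect = {
    "dcterms:title": "Login button unresponsive on mobile",
    "oslc_cm:implementsRequirement": "https://rm.example.com/requirements/1234",
}

# When details are needed, follow the link and let the owning tool answer,
# using content negotiation to ask for an RDF representation.
resp = requests.get(
    defect["oslc_cm:implementsRequirement"],
    headers={"Accept": "application/rdf+xml", "OSLC-Core-Version": "2.0"},
    # auth=... would be supplied according to the owning tool's access control.
)
print(resp.status_code, resp.headers.get("Content-Type"))
```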
There are many advantages to leaving the data within the tool that owns it and providing a link to it:
- The tool that owns the data knows best the rules that govern changes to it, including auditing support
- The tool that owns the data knows best how to control access to it. When data is copied, the access controls often need to be replicated as well (as best as possible)
- State models of resources across tools often don't align
- There are ways to expose this data in other tools without replicating it, for example OSLC Resource Preview (see the sketch after this list)
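As a rough sketch of one such mechanism, assuming the same hypothetical requirement URI as above: with OSLC Resource Preview the consuming tool asks the owning tool for a compact representation and renders the small preview it gets back, rather than storing a local copy:

```python
import requests

requirement_uri = "https://rm.example.com/requirements/1234"  # hypothetical

resp = requests.get(
    requirement_uri,
    headers={
        # Ask the owning tool for its compact (preview) representation.
        "Accept": "application/x-oslc-compact+xml",
        "OSLC-Core-Version": "2.0",
    },
)

# The compact document typically carries a title, an icon, and the URI of a
# small HTML preview that the consuming tool can embed (e.g., in a hover card).
print(resp.status_code)
print(resp.text[:500])
```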
It is sometimes convenient to have a cloned or cached copy of data in a local tool; OSLC does not prohibit this and can support it.
Update: Also see the interesting and related IBM DeveloperWorks article
"Stop copying, start linking"