Sunday, December 11, 2011

W3C Workshop on Linked Enterprise Data Patterns

I attended a W3C Workshop titled "Linked Enterprise Data Patterns" on December 6-7, 2011 at MIT in Cambridge, MA. It had many interesting sessions covering a wide variety of topics. The position paper by Rational, presented by Martin Nally, covered what we learned in Rational and with OSLC. Responses made it clear that we weren't alone in seeing this need, including from Tim Berners-Lee and many other attendees with years of experience building and deploying Linked Data applications.

A key result of the workshop was consensus on the need for standardization, as logged in IRC:
(1:45:06 PM) sandro: RESOLVED: We want a Working Group to produce a W3C Recommendation which defines a Linked Data Platform -- something that solves IBM Rational's use case (presented yesterday). We expect this to be an enumeration of specs which constitute linked data, with some small additional specs to cover things like pagination, if necessary.

Next steps will involve evolving the IBM developerWorks publication "Towards a Basic Profile for Linked Data" into a Member Submission for consideration on the W3C Recommendation track.

Monday, November 14, 2011

EclipseCon Europe Trip Report and EclipseDemo Camp Raleigh

I attended EclipseCon Europe Nov 2-4 in Ludwigsburg, Germany. I was on the agenda to do a 90-minute workshop on using Eclipse Lyo to enable OSLC integrations. I had a small but interested group of attendees, and it was good to see people already had a familiarity with OSLC (and Lyo, for that matter). They were looking to see how better to apply OSLC to some specific tools and scenarios. Good to see real usage. Also good to hear that one of the attendees had plans to leverage Lyo in early 2012 as part of an offering they had.

I attended a number of interesting talks on lessons learned and best practices for managing contributions at Eclipse, something that has been of immediate interest as we get Lyo operational and efficient. These talks were often led or facilitated by Eclipse Mylyn committers, and it was good to see they referenced OSLC in their description of Mylyn.

I also was able to see the great work going on in Eclipse Orion and see that OSLC is on their roadmap as well. I think there are some great ways that Orion can benefit from OSLC-based integrations, and I'll be digging into that as well.

I enjoyed the various 10-year talks and events as well, including a keynote by former IBM exec John Swainson reflecting on the business decision to open source and create Eclipse, and the look back at Eclipse's 10 years in action by John Kellerman and Kim Moir, including a nice photo album collected over the years at the IBM booth.

In conferences like this, it was great to finally put a face with a name, make some new connections and have some good one-on-one discussions. These things are often hard to do remotely.

Immediately when I returned, I co-presented Eclipse Lyo with Michael Fiedler (fellow IBMer and Lyo committer) at Eclipse DemoCamp Raleigh. It was a packed room and a good time.

Monday, October 3, 2011

OSLC Implementations

We are working on putting together a catalog of OSLC implementations on the OSLC website and starting to jot down some ideas on how best to present this. It is quite common for specification efforts to list a number of implementers of those specs. The intent of this list is not to have implementations validated by some third party as "good OSLC" implementations; it is simply a way to pull together OSLC implementations onto a single consumable page, with pointers to details about each.

Here's my attempt at building such an implementation list for OSLC, with the full disclaimer that I make no guarantees on the OSLC-ness of these. Please comment or contact me with anything missing or not represented properly.

Take 2: I decided to "keep it simple". Listing the supported spec details will be a nice future enhancement that I don't plan to do soon.

OSLC Providers


IBM Rational Team Concert: Supports the CM 1.0 and 2.0 specifications for its Work Item component.
IBM Rational Quality Manager: Supports the QM and CM specifications.
IBM Rational Requirements Composer: Exposes requirements according to the RM specifications.
IBM Rational DOORS: Support for the RM specification, facilitating greater and simpler integration possibilities.
IBM Rational Software Architect Design Manager and IBM Rational Rhapsody Design Manager: Trace from designs to other lifecycle artifacts using Open Services for Lifecycle Collaboration (OSLC) based linking.
IBM Rational ClearQuest: Exposes all records according to the CM specification.
IBM Rational Asset Manager: Exposes resources per the AssetMgmt specification.
FusionForge Tracker: Supports the CM 2.0 specification in order to implement a REST-based API.
Rational OSLC Adapter for Atlassian® JIRA®: Supports the CM 2.0 specification; tested with the Rational CLM solution.

OSLC Consumers


IBM Rational Team Concert: Can connect to CM, RM, QM and AM providers.
IBM Rational Quality Manager: Can connect to CM, RM, QM and AM providers.
IBM Rational Requirements Composer: Can connect to CM, RM, QM and AM providers.
IBM Rational DOORS: Support for the CM and QM specifications, facilitating greater and simpler integration possibilities.
IBM Rational Software Architect Design Manager and IBM Rational Rhapsody Design Manager: Trace from designs to other lifecycle artifacts using Open Services for Lifecycle Collaboration (OSLC) based linking.
IBM Rational ClearQuest: Can connect to CM, RM and QM providers.
IBM Rational Asset Manager: Can connect to CM, RM, QM and AM providers.
Jenkins Plugin: Connects to different remote bug trackers via the OSLC protocol.
IBM Tivoli Service Request Manager: Connects to bug trackers (dev teams) via OSLC.


Monday, July 25, 2011

Using HTTP REST to handle long running jobs / requests

At OSLC we haven't dug into how best to handle long-running requests: cases where a server may take a fairly long time to generate a response. Take, for example, a submission to a build engine to execute a build; the execution may take hours. Of course, you don't want to initiate a request and have to leave the connection open while waiting to receive a response. I'd expect the Automation (Build/Deploy) working group to work on evolving a proposed solution to this, though there is precedent for solutions, as indicated in a blog post by fellow IBMer / OSLCer Bill Higgins.

I'd be interested in hearing any feedback on experience with HTTP REST-based solutions to handling long-running requests.
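One common precedent (a sketch only, not anything OSLC has specified; all names here are hypothetical) is the "202 Accepted plus status resource" pattern: the server accepts the request immediately, returns a Location for a status resource, and the client polls that resource until the job completes. The fake server below stands in for the real HTTP exchanges:

```python
import time

class FakeBuildServer:
    """Simulates a server that runs long builds asynchronously.
    In a real deployment these methods would be HTTP requests."""
    def __init__(self, polls_until_done=3):
        self.remaining = polls_until_done

    def post_build(self):
        # Real server: respond "202 Accepted" with Location: /builds/42
        return 202, "/builds/42"

    def get_status(self, url):
        # Real server: GET on the status URL returns an in-progress
        # representation until the build finishes, then the result.
        self.remaining -= 1
        if self.remaining > 0:
            return 200, {"status": "in-progress"}
        return 200, {"status": "complete", "result": "SUCCESS"}

def run_build(server, poll_interval=0.01):
    code, status_url = server.post_build()
    assert code == 202          # accepted for processing, not yet done
    while True:
        code, body = server.get_status(status_url)
        if body["status"] == "complete":
            return body["result"]
        time.sleep(poll_interval)  # back off before polling again

print(run_build(FakeBuildServer()))  # -> SUCCESS
```

A real client would also want a retry limit or an overall timeout, and servers often return a Retry-After hint so clients don't poll too aggressively.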

Tuesday, June 14, 2011

Innovate 2011: Trip report and OSLC feedback

My Innovate conference again was filled with OSLC discussions. I found it tough to get to many sessions this year, which was rather unfortunate given the large amount of high-quality content. My schedule consisted of a mix of presenting, customer meetings and exhibit hall duties (probably sounds familiar to most attendees).

Monday June 6th:
Opening Keynote - inclusion of content about Eclipse Lyo (OSLC SDK) project proposal, which was good to hear.

I was off to the new "Jazz Interoperability Center" featuring OSLC, where I was setting up the shared local network, along with CLM, JIRA (early prototype), Bugzilla (adapter from OSLC tutorial work) and ClearQuest servers. There was a strong push to get this out and working, and it was great to see it all come together with only minor interruptions. There was a lot of hard work to make this a success, and they deserve the credit.

I presented "ISM-1071 Improve Collaboration between Support and Development Using Open Services for Lifecycle Collaboration" with John Arwe from Tivoli. About 30 people were in attendance, with good questions about how the integration works. We'll have to wait and see what the survey says.

Due to some networking the night before at the reception, the Tivoli SRM and ClearQuest OSLC-based integration was working (early prototype) just in time for exhibit hall opening.

Then I was off to the Interop Center for final setup and opening. Of course, there were some problems, but we were able to recover just in the nick of time. There was a slow start to the traffic as attendees made their way to the back (and past the food and bar). Feedback from the ped staff across the board was that they were very pleased with the amount of traffic and the ongoing discussions.

Tuesday June 7th:
Early-day discussions and breakfast with a customer about their integrations. This was mostly focused on adoption of RTC and how to co-exist with a CC/CQ setup, though some discussion of Siemens integration was brought up.

Late morning I ran the workshop "TW-1161 OSLC enable your tool in a day" (based on OSLC Tutorial Part 2), which was wait-listed but only saw about 35 of its 50-person capacity actually show. Throughout the week I had many people ask if the workshop was going to be run a second time; something to consider in the future. The workshop went really well in my mind. A good number of students were able to get through all the labs in the time given. I was able to get into some great discussions with some of the attendees. I gave many copies of the material to IBMers who said it was exactly what their account teams and colleagues needed and that they were going to share it with them. Some of the attendees already had 3rd-party providers in plan, and we had good design discussions about how to approach them. Great feedback from these attendees on the value of the workshop for them and on this effort.

I was able to attend some of John Arwe's session titled "Extending OSLC", which was a very full session. John talked about how the OSLC approach (both community and technical) can be applied to integrated service management (ISM).

I presented "TJI-1072 Keys to Building and Consuming Open Services for Lifecycle Collaboration Providers in Support of Open Lifecycle Integrations" just as the exhibit hall was about to open. There was a good mix of attendees here, some familiar faces and some I had not seen before (IBM and not). There were some good questions throughout (people trying to jump ahead).

Ran back over to the Interop Center for more good interaction with attendees.

After the exhibit hall closed, we held "BOF-2319A Lifecycle Integration: Open Services for Lifecycle Collaboration", which had great participation from Rational technical leaders, AIM, Tivoli, Siemens, Tasktop and more. Discussion started with a review of progress on recent "2.0-based" specs such as Core, CM, RM, QM and AM. Other community topics included open source, alignment with other standards efforts (W3C linked data / semantic web) and the need to continue as an independent community. One observation was that OSLC is now part of Rational's "fabric" (as Martin Nally says) and part of every integration discussion.

Wednesday June 8th:
Was fortunate enough to make it back to the keynote to catch IBM Watson and IBM Fellow Grady Booch.  I thought it was great to see Grady using the Rational tools to describe the internals of Watson.  My only "criticisms" were:
  • it was near impossible to read/understand the diagrams
  • there was no connection to DeepQA and how it uses Linked Data / Semantic Web technologies, the same that OSLC and the Jazz Platform are based on.  Thought it would have been a very powerful connection
  • it would also have been insightful to get Grady's thoughts on how this technology will influence the "tools" that we at Rational are building. The last chart had something about areas IBM is exploring (healthcare, helpdesk, etc.) but not "development intelligence"

Interop Center again!  We thought the final session was going to be slow to start; it was the opposite, starting strong and finishing a little slow.  We held a retrospective after the exhibit hall closed.  One thing that was clear to me: the ped staff, even though they had just finished a tough few days, were energized by the reaction they got and had many great ideas on how to improve and enhance it in the future.

Met with a customer who has both Rational and Tivoli products. It was good to see that our strategy and solutions really resonated with them; they were pleased with our direction and looking to deploy it this year.

Thursday June 9th:
I attended "CS-1651A Leveraging Open Services for Lifecycle Collaboration for Project Management Information Using Excel to Improve Software Delivery" presented by our colleagues from IBM Japan, NEC and Fujitsu. It was very well attended, especially by Japanese attendees. This was a very interesting presentation on the differences in how projects are run in Japan and the tools used, and it provided an interesting approach using OSLC-enabled Excel to connect to and from tools like RTC. I look forward to seeing their progress.

In Summary:
I found very few people who didn't know what OSLC was at a high level, but when I presented during my sessions, some of the feedback I got was that I had cleared up some misconceptions they had.  The message was also clear that customers would like to see more participation in OSLC by key ALM/PLM tool vendors and ways for it to continue to operate as an independent entity.  Many were encouraged to hear about the Eclipse Lyo project proposal and the continued contributions to open source.  It was great to meet some colleagues for the first time in person, catch up with others and make new connections.  I look forward to the continued collaboration and reach of OSLC.

Friday, May 27, 2011

OSLC at Innovate 2011 Conference

OSLC will be all over this year's IBM Rational Innovate conference June 5-9 in Orlando, FL.  I've included some of the many OSLC-related sessions below in the embedded calendar.  The OSLC community has put together a list of sessions and known attendees as well.

I specifically wanted to mention a new addition to this year's conference around OSLC: the "Jazz Interoperability Center".  This will be available during exhibit hall hours and will highlight OSLC-based integrations from a mix of participants including Rational, Tivoli, Siemens, Tasktop, iTKO, BSD Group and more.  Look for this area when you arrive at the exhibit hall.

The other interesting thing to note is that OSLC is not just showing up at the IBM Rational conference but has been, or will be, showing up at a number of other conferences as well.  It is good to see that a larger community is seeing and getting the value out of OSLC that IBM is.
Other conferences with OSLC presence:

Friday, March 18, 2011

Determining how best to participate in OSLC service discovery

OSLC specifications are often written with the intent to solve some basic integration scenario. Sometimes the best way to apply the specifications to specific tools is not obvious. I'll take a look at a couple of bug trackers (Change Management providers) and see how one might use oslc:ServiceProvider and oslc:ServiceProviderCatalog with them to allow for service discovery.

Let's take a look at a couple of specific installations of Bugzilla: one at Eclipse and the other for Mozilla (Firefox).  The typical partitioning of bugs within Bugzilla is by product.  Eclipse has them grouped by common areas (Classification), as shown below:
Let's say from this list I want to open a bug against EMF.  I'm presented with either a guided or traditional web page for filling out this form.   From there I pick a Component and provide additional fields needed to complete the form.

So the organization of bugs within Bugzilla is:
With OSLC, we have the possibility of mapping to any number of oslc:ServiceProviders and oslc:ServiceProviderCatalogs.  So what makes the most sense?  We need to look at how our resources are grouped together and the restrictions put on that grouping.  An oslc:ServiceProvider contains the URIs to the various capabilities a provider has to offer, like resource creation by POST, query, and delegated web UIs for creation and selection of resources.  This grouping is usually driven by certain constraints put on these resources; for example, it could be all resources in a given project or product.  The access rights and rules on creation are often isolated to a given project or product area.  By these criteria, the Bugzilla "Product" seems like a good fit for oslc:ServiceProvider.  From here, it would seem natural to have one oslc:ServiceProviderCatalog per Classification.  This would allow someone to get a catalog for a specific Classification, and it would list all the Products associated with it.  So this would give us a picture like this:
The next question one might ask is: how do I know what all the Classifications are?  Since oslc:ServiceProviderCatalogs can also reference other oslc:ServiceProviderCatalogs, we can introduce another catalog as the "root" of all these pieces of information.  So the picture would now look like:

So one may be thinking now: what about Component?  Looking at its usage within Bugzilla instances, it is more of a "common required property" than what was described before as criteria for selecting an oslc:ServiceProvider.
Some other options could have worked as well, such as not having intermediate oslc:ServiceProviderCatalogs but instead listing all oslc:ServiceProviders (Products) in a single oslc:ServiceProviderCatalog.  Depending on how your server partitions its resources, an organization like this could make sense.
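To make the nesting concrete, here is a small sketch of the structure described above as in-memory data (the Classification and Product names are illustrative, and a real client would fetch each catalog over HTTP and parse RDF rather than walk dictionaries):

```python
# Hypothetical model of the structure above: a root
# oslc:ServiceProviderCatalog referencing one catalog per Bugzilla
# Classification, each listing its Products as ServiceProviders.
root_catalog = {
    "title": "Bugzilla Root Catalog",
    "catalogs": [
        {"title": "Modeling",
         "catalogs": [],
         "providers": [{"title": "EMF", "services": "/emf/services"}]},
        {"title": "Tools",
         "catalogs": [],
         "providers": [{"title": "CDT", "services": "/cdt/services"}]},
    ],
    "providers": [],
}

def all_providers(catalog):
    """Flatten a catalog tree into the list of all ServiceProviders."""
    found = list(catalog["providers"])
    for sub in catalog["catalogs"]:   # catalogs may reference catalogs
        found.extend(all_providers(sub))
    return found

print([p["title"] for p in all_providers(root_catalog)])  # -> ['EMF', 'CDT']
```

The recursion mirrors the discovery path a client takes: root catalog, then Classification catalogs, then the Products themselves.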

Next let's take a look at Google Code's Issue Tracker.  There is only one instance of this server of this code base.  Let's look at some of the qualities of the projects there and their organizational structure.  As I type this, there are 5,681 projects.  These projects can be tagged to make them easier to find.  Each project has its own issues URL, which can be used to get feeds of issues and to POST to in order to create a new issue.  As you could imagine, a common way to find projects and issues within projects is by using Google's free-text search.  So given this, it seems like a project would be represented as an oslc:ServiceProvider.  Though it makes me wonder how many of our existing clients are prepared to deal with an implementation that has an oslc:ServiceProviderCatalog with 5,681 entries.  Within OSLC there is a general-purpose resource paging solution that could help with this, which is what I'd recommend.  It may be some time before Google Code's Issue Tracker becomes OSLC-compliant, but it may be that someone can easily put an OSLC facade onto it using one of the APIs that are available.
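The paging idea can be sketched as follows (the page URLs and property names here are invented for illustration; the OSLC Core resource paging spec defines the actual representation): the server splits the catalog into pages, and the client follows next-page links until none remains.

```python
# Hypothetical paged catalog: each page holds a slice of the entries
# plus a link to the next page, or None on the last page.
def make_pages(entries, page_size):
    chunks = [entries[i:i + page_size]
              for i in range(0, len(entries), page_size)]
    pages = {}
    for i, chunk in enumerate(chunks):
        next_url = f"/catalog?page={i + 1}" if i + 1 < len(chunks) else None
        pages[f"/catalog?page={i}"] = {"members": chunk, "next": next_url}
    return pages

def fetch_all(pages, first_url="/catalog?page=0"):
    """Walk next-page links until none remains, collecting all members."""
    members, url = [], first_url
    while url is not None:
        page = pages[url]       # real client: HTTP GET on the page URL
        members.extend(page["members"])
        url = page["next"]
    return members

pages = make_pages([f"project-{n}" for n in range(10)], page_size=4)
print(len(fetch_all(pages)))  # -> 10
```

With 5,681 providers, a client would more likely show the first page and fetch the rest on demand rather than eagerly walking every page, but the link-following loop is the same.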

To be clear, this is only exploring how some tools may expose their capabilities.  It does not imply that oslc:ServiceProvider represents a product or a project.  Clients should only interpret these constructs as the way a tool has exposed a path to its oslc:ServiceProviders.  Often the client is driven by an end user who will be presented with choices and will select the appropriate one based on their understanding of the current application and the one with which they are integrating.

Wednesday, February 23, 2011

The "O" in OSLC

 is for Open; I think we all know that.  But what is really unique and special about OSLC is the many ways in which it is open.  Not just the fact that it is on a public website for people to see, or that the integration style opens up your data: it covers many dimensions.  Let's take a closer look at how OSLC is open:

Open participation model
Anyone who wants to join in can do so, and contributions are made under the Creative Commons Attribution License.

Open and free use of specifications
Each of the working groups adheres to an intellectual property covenant, allowing implementers the ability to freely implement the specifications without fear of licensing fees.

Open community
If you are motivated to make a change and have the time, you are encouraged to join the community.  Contributions to improve lifecycle tools interoperability can be accomplished in many ways: scenario development and review, specification development, researching solutions, reviewing specifications, doing implementations, providing implementation reports and on and on.

Open collaboration
Collaboration occurs in the open in the form of updates to wiki pages, mailing lists and regular conference calls.   If you can't make a meeting, minutes are posted (such as they are) and community members can review them at their leisure.  When the communications and collaborations don't work for a given working group, changes are proposed and enacted.

Open services
After all, that is what the "S" stands for.  Though there is more to it than putting the two words "open" and "services" together.  It's about defining consistent patterns, guidance and specifications for achieving the common goal of interoperability between lifecycle tools.  The architectural principles for these services are based on HTTP, REST and user interface patterns supporting loosely coupled integrations.  Since these services don't rely on any particular programming language or framework, there is greater possibility for all tool vendors to implement them.

Opening up data 
...and their relationships.  As important as the lifecycle data itself are the relationships (links) defined in that data.  This has been the foundation and basis for most scenarios that drive our efforts.  Therefore it is natural to expose this data and its relationships (links) following the approach outlined by the W3C and Tim Berners-Lee as "Linked Data".  As we continue to open up the data traditionally closed in lifecycle tools, we see that we can achieve new levels of interoperability using linked lifecycle data.

Open possibilities
We continue to find new and innovative ways to leverage open linked lifecycle data.  We see this in the evolving work done to support scenarios in PLM/ALM, DevOps and model management, to name a few.  It is clear, though, that OSLC is real, adoption continues to grow, and end users (who may not even know they are using OSLC) are getting real value.  This will continue for years to come.