May is one of the windy months in San Francisco, but it is also a key month for Java thanks to the JavaOne conference that takes place there every year.
I personally like this city and the region around it, so this is a traditional appointment for me each year: talking about Java, meeting people and introducing the latest features and news about our Workflow and BPM solutions. San Francisco is also an opportunity for me to taste some Napa Valley wines, eat some crab near Fisherman’s Wharf, relax near Golden Gate Park or have some good food in the Latino corner.
On the technical side, this year was indeed special for me: the OW2 consortium, which hosts the Bonita and Orchestra projects, had a booth in the Conference Pavilion; our RC1 versions of Nova Bonita and Nova Orchestra were released; the Process Virtual Machine technology gained good momentum; and, last but not least, a lot of friends and partners were there: eXo Platform, XWiki, EBMWebsourcing, JOnAS, Linagora, Talend, JBoss/Red Hat, Jasper…
Together with Charles, I gave some great presentations around Nova Bonita and Nova Orchestra, and we even showed an early release of our next Web 2.0 Process Console. We also took the chance to do some joint demos with Benjamin Mestrallet from eXo Platform.
That was really cool: on the one hand, Benjamin introduced the eXo 2.0 platform, including new collaborative tools such as mail, agenda, chat and video meetings. On the other hand, I demoed the Process Console based on the eXo Web 2.0 technology. We are currently finishing some skin-related work before releasing all this stuff!
On the partnership and collaboration front, we had really good meetings with the Talend and JasperSoft guys. We particularly appreciated both the technical and business discussions with Guillaume (Talend consultant) and Bertrand (Talend CEO). Those guys have set up a really nice ETL solution that we will leverage in Nova Bonita and Nova Orchestra to extract and transform BPM data.
Over the next couple of weeks we will work on getting the right technical and business BPM indicators from both execution and historical BPM data. The next step I see would be to leverage those indicators with a BI solution (e.g. Jasper) to generate dashboards, charts or reports…
The idea behind all that is to integrate those graphical elements inside the Process Console as Web 2.0 widgets.
In an open world, content and information management is strategic for companies. There are a huge number of open source solutions intended to solve content management problems. So how do you tell a good Content Management solution from a bad one?
Strategic functionalities of a Content Management System
If we take a look at a CMS comparison system like CMS Matrix, we see that CMSs share a lot of common functionalities. Which ones should be strategic for a company? All CMSs handle the content management part well (CRUD, classification, indexing, search), and you can commonly customize the look and feel of the solution as well. So how do you choose the right CMS? We assume that support, project continuity and the services around the project have already been checked.
IMO, the strategic functionalities are not the core ones but the ones that allow adaptability. We can list two main categories: security (with user management) and lifecycle workflow.
Why these two categories?
- The first (security), because a company already has an information system and a user database. An IT manager does not want to see user management duplicated: this is the SSO paradigm.
- Lifecycle workflow, because an organization is unique, and so is its way of working. Most of the time the lifecycle is specific to the organization, and most of the time there is more than one lifecycle…
Human organization is complex
It is easier to change software than a human organization; that is a fact. So adapting software to an organization is crucial for a good CMS. But isn’t that one of the paradigms of Business Process Management: being able to adapt a software solution to a human organization? Yes it is. We finally find a link between content management and business process management…
Real BPM solution or not?
IMO, a CMS should be based on a workflow solution. If the workflow solution is internal to the CMS, the result may be great but not really adapted to all companies. The workflow solution should have its own language (for example based on an open standard like XPDL) and its own tools to create new processes or update existing ones… In fact, are the only good CMSs those which integrate external workflow solutions? Why? Because workflow is a specific world, and it is not the same as content management…
Personally, I participate in some open source CMS projects, and I see the advantages of having an external BPM solution integrated in them:
- To integrate a BPM solution, the API must be present in the CMS
- You can then choose your own BPM solution to work with the chosen CMS
- A good BPM solution always provides powerful graphical tools
- However complex your organization is, a good BPM solution gives you an easy answer
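To make the API point concrete, here is a minimal sketch of the kind of contract a CMS could expose so that any external BPM engine can be plugged in. All names and signatures below are my own invention for illustration, not an actual eXo or Bonita API:

```java
import java.util.Map;

// Hypothetical bridge between a CMS and an external BPM engine.
// The CMS codes against this interface only, so the engine behind it
// (Bonita or any other) can be swapped.
interface WorkflowEngine {
    // Start a lifecycle process (e.g. "publication") for a document.
    String startProcess(String processId, Map<String, Object> variables);
    // Signal that a human task in that process is done.
    void completeTask(String taskId, Map<String, Object> outcome);
}

// A trivial in-memory engine, just enough to show the contract.
class InMemoryEngine implements WorkflowEngine {
    private int counter = 0;
    public String startProcess(String processId, Map<String, Object> variables) {
        return processId + "-" + (++counter); // return a process instance id
    }
    public void completeTask(String taskId, Map<String, Object> outcome) {
        // a real engine would advance the process instance here
    }
}

class DocumentService {
    private final WorkflowEngine engine; // injected: your engine of choice
    DocumentService(WorkflowEngine engine) { this.engine = engine; }

    String publish(String docId) {
        // The CMS never hard-codes the lifecycle: it delegates to the engine.
        return engine.startProcess("publication", Map.of("document", docId));
    }
}
```

The point of the sketch is the dependency direction: the lifecycle logic lives entirely behind the engine interface, which is exactly what lets you pick the BPM solution independently of the CMS.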
Actually, if you ask me for a powerful content management system, I will suggest, without hesitation, the ECM part of the eXo project. Why? Not only because I’m part of the development team :-) but also because it is based on a pure workflow solution, thanks to the integration of the Bonita workflow system. eXo ECM based on a workflow solution such as Bonita gives you the flexibility you need to handle ECM processes in your organization.
The integrated solution gives you the best of both worlds: presentation, personalization and integration from a portal like eXo, and automation, modeling and collaboration from a workflow solution such as Bonita.
In a separate post I will go deeper into the main features that can be handled by a workflow solution in a portal (not only document management!).
IMO, a CMS that cannot integrate an external BPM solution is not really a professional system.
Task management technologies and concepts are often associated with workflow and BPM, but many non-workflow applications leverage their capabilities as well.
In one way or another, our applications need to handle the way tasks/activities/steps are assigned to users, as well as how users interact with them (is this user allowed to perform this task? are there constraints to apply to the task execution?…).
Traditional Task Management features
Common features provided by a task management module are:
– Task assignment to single users or groups based on well-defined business logic
– Escalation operations depending on different criteria: the workload of a user, the load of a server…
– Delegation capabilities: implicit or explicit delegation of a user’s tasks (e.g. when the user responsible for a particular task is on holiday)
– Support for interaction with multiple user repositories such as LDAP, Active Directory, databases, APIs…
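As a rough sketch of how the assignment and delegation features above could fit together (class and method names are illustrative, not taken from a real product):

```java
import java.util.*;

// Illustrative sketch of group assignment, workload-based escalation
// criteria and explicit delegation. Not a real task-management API.
class TaskAssigner {
    // user -> delegate, e.g. while someone is on holiday
    private final Map<String, String> delegations = new HashMap<>();
    // user -> number of open tasks, usable as an escalation criterion
    private final Map<String, Integer> workload = new HashMap<>();

    void delegate(String from, String to) { delegations.put(from, to); }

    // Resolve a candidate group to one user: pick the least loaded member,
    // then follow any explicit delegation. A real implementation would
    // also guard against delegation cycles.
    String assign(List<String> candidates) {
        String chosen = candidates.stream()
            .min(Comparator.comparingInt(u -> workload.getOrDefault(u, 0)))
            .orElseThrow();
        while (delegations.containsKey(chosen)) {
            chosen = delegations.get(chosen);
        }
        workload.merge(chosen, 1, Integer::sum);
        return chosen;
    }
}
```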
Task Management, advanced features
Advanced task management implementations take the user-role mapping paradigm to another level by integrating task management with the organization’s information systems, management rules, structure or risks. This is known as Information Technology Governance.
Task management also plays a big role in SOA. Traditional service-oriented technologies and standards such as SCA or BPEL, which lacked human support in the past, are standardizing or proposing multiple ways to add human workflow capabilities to SOA.
The recently approved Human Task specification (at OASIS) is a good example of this movement. This specification defines interfaces that allow people tasks to be introduced as services in an SOA.
Task Management and the PVM
That said, I hope you share my vision that task management is more than human workflow, and that this component therefore deserves a dedicated focus!
Indeed, the Task Management module is one of the main components we are building on top of the Process Virtual Machine technology. Our intent is to develop a generic infrastructure for tasks that can fit all those different use cases (it sounds pretentious, but we like challenges :-)
This Task Management module will be:
– Extensible and “smart”, in that it can interact with other human-related applications around it (e.g. Web 2.0 technologies such as web mail, forums and calendars… or online task managers like “Remember the Milk”)
– Lightweight, and thus embeddable in any external application (e.g. a web application or a desktop one)
– Generic: it will support multiple implementations, such as workflow human tasks (e.g. integration with the Bonita XPDL workflow) and the Human Task specification (both the standalone and BPEL-integrated versions, e.g. with the Orchestra BPEL project)
– “Governed”, thanks to advanced governance capabilities.
On a more technical level, this module will allow tasks to be created through a defined API. Once created, the task life cycle will be handled by the module in a tailor-made way: each task life cycle will be configurable thanks to the use of an internal Process Virtual Machine process definition (we call that “life cycle as a process”).
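A rough illustration of the “life cycle as a process” idea: instead of hard-coding task states, the module executes a small, configurable transition table. The states and transitions below are only a plausible example configuration, not the actual PVM definition:

```java
import java.util.*;

// "Life cycle as a process": the task life cycle is not hard-coded,
// it is itself a small process definition that the module executes.
class TaskLifecycle {
    private final Map<String, Set<String>> transitions;
    private String state;

    TaskLifecycle(Map<String, Set<String>> transitions, String initial) {
        this.transitions = transitions;
        this.state = initial;
    }

    // Move to a target state, but only along a configured transition.
    void fire(String target) {
        if (!transitions.getOrDefault(state, Set.of()).contains(target)) {
            throw new IllegalStateException(state + " -> " + target + " not allowed");
        }
        state = target;
    }

    String state() { return state; }

    // One possible default configuration (example states only).
    static TaskLifecycle standard() {
        return new TaskLifecycle(Map.of(
            "created",   Set.of("ready"),
            "ready",     Set.of("started", "suspended"),
            "suspended", Set.of("ready"),
            "started",   Set.of("finished")
        ), "created");
    }
}
```

Swapping the map passed to the constructor is all it takes to give another client application a different task life cycle, which is the whole point of making the life cycle a process definition rather than code.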
The module will also be responsible for handling callback operations with the client (e.g. when a task is finished). Those operations allow synchronization between the two partners (a client application and the Task Management module). For instance, in a workflow engine implementation (e.g. Bonita Workflow), workflow activities and the tasks handled by the Task module could have different life cycles that require synchronization.
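The callback contract can be sketched like this (the interface and method names are invented for illustration): the Task module finishes the task on its side, then notifies the client so the two life cycles stay in sync.

```java
// Hypothetical callback contract between the Task module and its client
// (e.g. a workflow engine whose activity must advance when the task ends).
interface TaskClient {
    void onTaskFinished(String taskId);
}

class TaskModule {
    private final TaskClient client;
    TaskModule(TaskClient client) { this.client = client; }

    void finish(String taskId) {
        // 1. terminate the task's own life cycle here (omitted)
        // 2. call back the client so it can re-synchronize its state
        client.onTaskFinished(taskId);
    }
}
```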
Advanced features such as task variable injection or user-role resolution will be configurable, allowing multiple implementations and support for different strategies.
This approach is the basis of the technology we have been developing together for the last year and a half: the Process Virtual Machine.
In the same article, Tom talks about different process languages and graphical notations (BPEL, BPMN, XPDL, JPDL, BPDM…) before addressing some issues of BPM tools and products, such as the relationship between analysts and developers during BPM definition.
While I agree with him on the Process Component Model approach, I can only disagree with some of the statements discussed in the article, so I decided to post my comments and thoughts here:
– The first item on my list is the classification of process languages into definition vs. executable languages. I agree that XPDL would be in the first group and BPEL in the second, but it would be great to discuss where the border between the two groups lies or, in other words, what is required for a definition language to be considered an “executable” one.
Tom claims that JPDL is an executable language that also allows communication between analysts and developers. Can a single process language fit in both groups?
Hmm, taking a look at JPDL, it looks pretty close to the XPDL syntax but adds Java logic to the process definition. This allows developers to “include pieces of Java code hidden from the diagram”, as Tom said.
Those pieces of Java code can also be used to resolve user-group assignments at process runtime. That’s great and definitely useful, but do you think this is a good argument for building a proprietary language for process definition and execution?
The approach we propose in Bonita Workflow is to extend XPDL to provide those kinds of features on top of the basic XPDL grammar. To achieve that, we just leverage the XPDL “extended attributes” (a kind of extension point in XPDL) on both the activity and participant sides. This allows Java code calls (known as hooks and mappers in Bonita) to be added to XPDL.
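As an illustration of the mechanism, a Java hook attached to an activity through extended attributes could look like the snippet below. The attribute name and the hook class are hypothetical, not Bonita’s exact syntax; the point is that the extension stays inside standard XPDL:

```xml
<!-- Illustration only: attribute names and the hook class are invented. -->
<Activity Id="validateOrder" Name="Validate order">
  <Implementation><No/></Implementation>
  <ExtendedAttributes>
    <!-- Java logic attached to the activity without leaving XPDL -->
    <ExtendedAttribute Name="hook"
                       Value="org.example.hooks.NotifyManager"/>
  </ExtendedAttributes>
</Activity>
```

An engine that does not understand the extended attribute can simply ignore it, which is why this stays compatible with the standard while a proprietary language does not.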
IMO, this is a more natural approach than going completely proprietary.
– Another important item is related to execution semantics: languages such as BPEL are based on a standard that defines how a particular activity or statement must behave.
XPDL does not specify, for instance, how a “split” must be implemented by engines, but does JPDL? Well, as there is no formal specification around JPDL (remember, it is a proprietary language), there is only one engine implementation, so the execution semantics are the ones implemented by that engine.
– XPDL vs. BPDM: Tom pointed out that BPDM could be a serious candidate to replace XPDL in the future, as it is being defined inside the same consortium as BPMN. I mostly agree with Keith Swenson about the maturity of XPDL compared to BPDM (not yet approved as a standard), so I feel confident that XPDL will stay with us for a while!
Anyway, I would suggest using workflow and BPM technologies based on a process component model technology such as the Process Virtual Machine. This will allow vendors to easily add support for new standards (such as BPDM) if some of today’s standards are condemned to die. I can tell you that on the Bonita and Orchestra side we will be ready to support the languages to come.
– My last point, for today, is about how to help users choose the best process language (and BPM solution) for a particular need.
Indeed, a process language classification based on execution vs. definition languages could help users choose the language that best fits their needs (this is one of the premises of the Process Component Model).
However, additional parameters should also be added to the list: the degree of collaboration between analysts and developers, integration requirements with ESB middleware, integration in a web application, human interaction requirements, web services scripting, integration with your favourite programming language…
Let me give you an example: as a technical architect, I am looking for a BPM solution to add HR processes to my web application. My requirements (parameters) are: the application will be developed in Java, with user-group mapping implemented on top of an LDAP directory, process persistence on top of Oracle, and email notifications sent to users when tasks are assigned to them. In this case, I would suggest using a Java-based workflow engine compliant with the XPDL standard.