Archive for the ‘BPM’ Category

Intelligent Workload Distribution

While I’m not doing much these days in the BPM space, I did recently have lunch with a friend of mine who works for Genesys Lab. I don’t normally talk about vendor products by name, but the iWD (intelligent Workload Distribution) product had a different enough approach from things I’ve seen that I thought I’d share it. In the spirit of full disclosure, I’m doing this on my own time (and dime), simply because I thought the solution was interesting. Hopefully you will too.

In the past, the BPMSs that I dealt with (and the businesses trying to use them) were primarily focused on process automation and process management. Process automation tries to automate as many of the tasks within the process as possible. More importantly, the tools tried to put a process-centric view around collections of tasks so that they could be more effectively managed. When successfully applied, these tools have delivered an increase in productivity, but there’s still plenty of room for improvement. This is where iWD comes in.

iWD, as the name implies, focuses on the distribution of the manual tasks associated with business processes. It is not a BPM tool on its own; rather, it is solely focused on the distribution of tasks from your systems to the individuals who will execute them. Based on what I saw, you can think of iWD as a context-aware distribution engine. In some tools, a “worker” logs into the queue associated with a particular process and takes the next item off the queue. What if they’re not the most qualified for that task? What if there are other tasks associated with another process that are more important? What if customers have varying SLAs that cause one customer’s tasks to take precedence over another’s? By taking into account the customer and any associated SLAs, the skills and location of the workforce, and other factors, tasks can be distributed from across all processes to the global workforce in a more efficient manner.
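To make the distinction concrete, here’s a minimal Java sketch of that kind of context-aware distribution. All of the types and the scoring rule are my own invention for illustration, not the actual iWD API; the point is simply that the next task for a worker is chosen by scoring every pending task across all processes against the worker’s skills and each customer’s SLA, rather than popping the head of a single process queue.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Comparator;
import java.util.List;

// Hypothetical types for illustration; not the actual iWD API.
public class ContextAwareDistributor {

    record Task(String id, String requiredSkill, Instant slaDeadline) {}
    record Worker(String id, List<String> skills) {}

    // A task the worker isn't qualified for is disqualified outright;
    // otherwise, the closer the SLA deadline, the higher the score.
    static double score(Task task, Worker worker, Instant now) {
        if (!worker.skills().contains(task.requiredSkill())) {
            return Double.NEGATIVE_INFINITY;
        }
        return -Duration.between(now, task.slaDeadline()).toMinutes();
    }

    // Choose from ALL pending tasks across every process, not the head
    // of a single process queue.
    static Task nextTaskFor(Worker worker, List<Task> allPendingTasks) {
        Instant now = Instant.now();
        return allPendingTasks.stream()
                .filter(t -> score(t, worker, now) > Double.NEGATIVE_INFINITY)
                .max(Comparator.comparingDouble((Task t) -> score(t, worker, now)))
                .orElse(null);
    }
}
```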

Anyway, with context becoming increasingly important in today’s systems, most commonly in association with location-based services, I thought this was a great example of using a variety of contextual items to improve the distribution of tasks in a BPM system. You can learn more by visiting the Genesys Lab website here. I’d love to hear other examples of context-aware computing, so feel free to comment or send me a message.

Context Aware Computing and the iPad


I just posted a response to a question about the iPad in an enterprise setting over in an eBizQ forum and decided that I wanted to expand on it here in a blog post.

Much of the discussion about the iPad is still focused on a feature-by-feature comparison to a netbook or a laptop. That discussion cannot get out of the 20-year-old world of keyboards, mice, and the windows and desktop metaphors. To properly think about what the iPad can do, you need to drop all of that context and think about things in new ways. In my previous post on the iPad, I emphasized this point, stating that the iPad is really about taking a new form of interaction (touch, with a completely customizable interface) and putting it on a new form factor. In answering the eBizQ question, I realized that it goes beyond that. The key second factor is context awareness.

Back in 2007, I attended the Gartner Application Architecture, Development, and Integration Summit and the concept of “Context-Oriented Architecture” was introduced. In my blog post from the summit, I stated that:

[Gartner] estimates that sometime in the 2010’s, we will enter the “Era of Context” where important factors are presence, mobility, web 2.0 concepts, and social computing.

In that same post, I went on to state that this notion of context awareness would create a need for very lightweight, specific-purpose user interfaces. At the time, I was leaning toward the use of Dashboard widgets or Vista sidebar items, but guess what has taken over that category? iPhone and iPod Touch apps. Now we have the potential for a device with a larger form factor that can present a touch-based interface completely tailored to the task at hand. This is another reason why I don’t see multi-tasking as a big deal. The target for this device isn’t multi-tasking; it’s these efficient, single-purpose interfaces.

Imagine going into a conference room where your iPad determines which meeting room you’re in through sensors in the building, knows what meeting you’re in and who else is in the room through calendar integration, knows the subject of the meeting, and can now present you with a purpose-driven interface for that particular meeting. Our use of information can be made much more efficient. How many times have you been in a meeting only to wind up wasting time navigating through your files, email, the company portal, etc., trying to find the right information? What if you had an app that organized it all and, through context awareness, presented what you needed? The same certainly holds true for other activities in the enterprise beyond meetings. As we make more use of BPM and workflow technologies, it is certainly possible that context awareness through location, time, presence of others, and more can allow more appropriate and efficient interfaces for task display and execution, in addition to providing context back into the system to aid in continuous improvement.
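As a thought experiment, the meeting scenario reduces to combining a handful of context sources into one view. The sketch below is entirely hypothetical; every interface is a stand-in for a building sensor, calendar system, or document index that doesn’t exist yet. But it shows how little glue is needed once those sources are available.

```java
import java.util.List;

// All types and methods here are hypothetical stand-ins.
public class MeetingContextResolver {

    record MeetingContext(String room, String subject,
                          List<String> attendees, List<String> documents) {}

    interface LocationSensor { String currentRoom(); }
    interface Calendar {
        String meetingSubject(String room);
        List<String> attendees(String room);
    }
    interface DocumentIndex { List<String> findFor(String subject); }

    static MeetingContext resolve(LocationSensor sensor, Calendar calendar,
                                  DocumentIndex index) {
        String room = sensor.currentRoom();             // where am I?
        String subject = calendar.meetingSubject(room); // what meeting is this?
        return new MeetingContext(
                room,
                subject,
                calendar.attendees(room),               // who is here?
                index.findFor(subject));                // what do I need?
    }
}
```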

This isn’t going to happen overnight, but I am very excited to see whether Gartner’s prediction of the 2010’s being the “era of context” comes true. I think it will, and it will be great to look back from 2020 and see just how much things have changed.

Socially enabled BPM


A succession of tweets between Forrester’s Gene Leganza and Clay Richardson along with Brenda Michelson of Elemental Links caught my attention this morning. At Forrester’s Enterprise Architecture Forum next week, Clay will be reviewing a few case studies for Social BPM. It’s too bad that I won’t be there, because this sounds very interesting.

I haven’t seen any definitions yet of what socially-enabled BPM is, so I thought I’d throw together my own thoughts. First off, let’s take two dominant social technology platforms, Facebook and Twitter. I’ve previously posted that an internal Facebook for the enterprise could be revolutionary for inter-company communication. I’ve also previously posted on the role of Twitter as an information bus. So, now combine the human-facing communication of either platform’s news/event stream with the application platform of Facebook, toss in some process modeling, orchestration, and universal task management capabilities on top of it, and I do think you have socially-enabled BPM. What could be most compelling is if there’s a way to combine the communication features of the social technology platform for “ad hoc” processes with the more formally modeled and managed processes that are the strong suit of BPM platforms, to get a better view (and hopefully better management) of the processes in the enterprise as a whole.
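The simplest bridge between the two worlds is probably the one sketched below: the process engine posts task lifecycle events into the social activity stream, so the formally modeled process and the ad hoc conversation around it live in the same place. This is purely my own illustration; the interfaces are hypothetical, not any product’s API.

```java
// Hypothetical names throughout; a sketch of the idea, not a product API.
public class SocialBpmBridge {

    interface ActivityStream { void post(String author, String message); }

    // Called by a (hypothetical) process engine whenever a task changes state,
    // so ad hoc discussion can happen right where the formal process reports.
    static void onTaskEvent(ActivityStream stream, String processName,
                            String taskName, String assignee, String state) {
        stream.post("bpm-engine", String.format(
                "[%s] task '%s' is now %s (assigned to %s). Reply here to discuss.",
                processName, taskName, state, assignee));
    }
}
```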

I look forward to hearing what others think about the case studies Clay will be presenting. This is definitely an emerging area where there are opportunities to lead the pack and be revolutionary.

Update: Here are two posts from Clay on the subject that he forwarded to me. It looks like my thinking is consistent with what he had previously written on the subject. The first is titled “Social Technologies Will Drive The Next Wave Of BPM Suites” and the second is titled “BPM Promises ‘Simplicity’ In 2010. Is This ‘Hope We Can Believe In’ Or Still A Pipe Dream?”.

BPM and SOA Tool Linkage

I’ve been invited to participate in SearchSOA.com’s “Ask the Expert” series and will be fielding questions primarily on BPM technologies in the context of SOA, but I hope to see some EA related questions as well. My first response was posted on November 3rd, answering the question, “What is a key characteristic I should look for in BPM modeling tools, especially when looking to pair them with SOA?” You can read my response at SearchSOA.com.

Oracle OpenWorld: SOA-Enabled BPM Adoption, Reference Architecture and Methodology Aspects

Full disclosure: I’m attending Oracle OpenWorld courtesy of Oracle.

The speakers for this session were Manas Deb, Sr. Director, SOA/BPM/Governance Product Management from Oracle and Mark Wilkins, Enterprise Architect, EAP, from Oracle.

They framed the session as a discussion of best practices from their EA practice, focused on companies whose goal is to adopt BPM built on a foundation of services. They began with a quick recap of what they feel SOA and BPM are, with SOA being focused on encapsulation and loose coupling, and BPM being focused on improved efficiency. I’m not going to debate those definitions here; I’m just repeating them to establish the context of the presentation.

They, like many others, extended the three-tier model to insert processes between the presentation tier and the service tier. Notably, they did not claim the process layer was a new tier; rather, they presented it as an extension of the services tier. Obviously, the one risk with this is that it immediately puts BPM into a technology context rather than a business context. That isn’t a problem in itself, but it shouldn’t be the sole focus of your BPM and SOA conversations. It may be the cornerstone for conversations with IT developers, engineers, and solution architects, but certainly not for analysts, business architects, and other non-IT staff.

The first slide on methodology emphasized what they are calling scopes. The examples shown included enterprise (cross-project) scope, project scope, and operations scope. At the enterprise scope, the interest is assessment, strategy, and planning: performing value-benefits analysis, forming CoEs, establishing roadmaps and maturity models, planning the portfolio, establishing governance, etc. The project scope is execution- and delivery-focused; the operations scope is focused on measurements, scorecards, and keeping things running. It’s important to keep these scopes, or viewpoints, in mind and ensure that they all work together.

Mark went on to blow through a whole bunch of slides way too quickly. This should have been a 60-minute presentation. It appeared that there was some good content in there, including Oracle’s approach to the use of BPM and SOA conceptual reference architectures and how they eventually drive down to the physical view of the underlying infrastructure. He went on to show examples of the conceptual architectures for BPM and SOA, some information on a maturity model, a governance framework, and a few slides that tried to fit it all together. Once I’m able to download the slides, I’ll try to remember to come back and edit this post with the details. It’s unfortunate that a presentation that appeared to have very good content with appeal to architects got crammed into half the time frame of the other sessions.

Oracle OpenWorld: EA, BPM, and SOA

Full disclosure: I am attending Oracle OpenWorld courtesy of Oracle.

The speaker is Dirk Stähler from Opitz Consulting, and he is talking about how to bridge the information gap using Oracle BPA Suite and an integrated model.

He started by presenting the EA, BPM, and SOA problem, which includes no unified methodology, unclear semantics, and no differentiation between EA, BPM, and SOA aspects.

He presented the three domains in a Venn diagram and called out the overlap in artifacts from each, including org structure, infrastructure, business processes, IT systems, and business objects. This overlap forms the foundation for the metamodel which can be captured in Oracle’s BPA suite.

In discussing this, he presented a pyramid, where EA is at the top (providing a conceptual blueprint of the org), underneath that is business process management (as a business design tool), then comes technical business process management (for IT specifications), and finally is information technology (supporting development). SOA spans one leg of the pyramid, impacting all four layers.

In discussing the artifacts, he defined domains for process architecture, application architecture, infrastructure architecture, data architecture, organization architecture, and service architecture. All of the artifacts can be captured in BPA suite. In aligning this to EA, BPM, and SOA, he feels that EA covers application and infrastructure architecture, BPM covers organization, process, and data, and SOA covers service and some of data.

After this, he switched to a demo of the BPA suite, showing how to navigate the metamodel, associate different diagram types with different domains, etc. For someone like me with no experience with BPA suite or any other EA tooling, this was a good overview of how BPA suite could be used to manage the various models associated with an EA practice. The metamodel description covered how to separate these things within BPA suite; however, the talk did not get into any issues or concerns with having two or even three different audiences using one centralized tool and repository, and making sure they leverage each other’s work where appropriate.

For more information, they have published a book on their methodology; however, it is currently only available in German.

Oracle OpenWorld: Next Generation Business Process Platform

Full disclosure: I am attending Oracle OpenWorld courtesy of Oracle.

This was a session focused on BPM 11g. It’s a bit of a whirlwind overview, but so far they have emphasized the use of BPMN 2.0, Business Rules integration, SCA, and the new rich form designer.

Next up is the BPA suite, based on their OEM relationship with IDS Scheer ARIS. 11g introduces round-trip integration with BPM Studio, a unified repository with IDS products, etc. I haven’t heard anything about Oracle ER becoming a centralized metadata repository for all Oracle products, which I find a bit surprising. It could just be that I’m in the wrong sessions, but more and more, I believe a common repository is going to be a critical component.

The presenters went on to talk about process portals, including a collaborative modeling portal, a workspace model (a context-specific portal for one or more related processes), and an instance-specific view (for tracking events, deadlines, etc. about a specific instance of a process).

The next topic discussed was dynamic BPM. This included rule-driven processes, rule-driven data validation, dynamic service binding, and rule-driven task management. Clearly, the theme here is integration with a rules engine and having the ability to modify the rules that change how the process executes, without requiring a redeployment of the process itself.
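A minimal sketch of the idea, assuming hypothetical types rather than Oracle’s actual rules engine: the routing logic lives in a rule table that can be edited while processes are running, so changing where an order goes never touches the deployed process definition.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical types; Oracle's implementation delegates to its Business
// Rules engine, this just shows the shape of the idea.
public class RuleDrivenRouter {

    record Order(double amount, String region) {}
    record Rule(Predicate<Order> condition, String targetQueue) {}

    private final List<Rule> rules = new ArrayList<>();

    // Rules can be added, removed, or reordered at run time, with no
    // redeployment of the process definition.
    public void addRule(Predicate<Order> condition, String targetQueue) {
        rules.add(new Rule(condition, targetQueue));
    }

    // First matching rule wins; fall back to a default queue.
    public String route(Order order) {
        return rules.stream()
                .filter(r -> r.condition().test(order))
                .map(Rule::targetQueue)
                .findFirst()
                .orElse("default-queue");
    }

    public static void main(String[] args) {
        RuleDrivenRouter router = new RuleDrivenRouter();
        router.addRule(o -> o.amount() > 10_000, "manager-approval");
        System.out.println(router.route(new Order(25_000, "EMEA"))); // manager-approval
    }
}
```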

They have also made changes to the human workflow component to better support unstructured processes. The technique described makes sense, but this feels like a “show me” category. Getting users to use the tooling to dynamically add new process participants and steps sounds great, but there may be some big cultural hurdles to overcome to make this useful.

They then went over enhancements to process instrumentation and business activity monitoring, including real-time publishing of process metrics to BAM. They can also feed Oracle’s CEP engine for dynamic processing based on incoming metrics and events.

Overall, the message is that Oracle has a comprehensive and unified BPM platform. From the slides, it certainly appears comprehensive. The 11g release is all about unification onto a common platform, and as long as what’s been on the slides accurately reflects this new platform, 11g should be a good step forward for Oracle BPM.

Oracle OpenWorld: An Architect’s View of the New Features of Oracle SOA Suite 11g Release 1

Full disclosure: I’m attending Oracle OpenWorld courtesy of Oracle.

The first wave of industry standardization was around function-specific standards in areas causing headaches in the integration space. The speaker emphasized the role of SCA in the standardization of the service platform, in the same way that Java EE played a role in the evolution of the application server. I’ll be honest, I’m still not a big SCA fan. I know Oracle is, though. The one good thing being shown is that the hosting environments can be managed in a single, unified way, regardless of whether a service is hosted in BPEL PM or WebLogic. As long as there’s good tooling that hides the various SCA descriptors, this is a good thing.

Now they are talking about the event delivery network. It’s nice to see a discussion of fundamentals rather than an attempt to jump straight into a CEP discussion. They’re talking about having an event catalog, utilizing an EDL (event description language), and easily connecting publishers and subscribers. This is a good step forward, in my opinion. It may actually get people to think about events as first-class citizens in the same way as services.

Now, they’re on to Oracle Human Workflow. It is all task-based, with property-based configuration. The routing of tasks can be entirely dynamic, rather than based on static rules. It has integration with Oracle Business Rules. It publishes events on the EDN (e.g. onTaskAssigned, onTaskModified, etc.). Nice to see them eating their own dog food with the use of EDN.
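To illustrate the pattern (this is my own toy sketch, not Oracle’s EDN API): events are declared by name in a catalog, and consumers subscribe by name rather than wiring themselves to a producer, which is what makes events first-class in the way services already are.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// A toy event network; not Oracle's EDN API.
public class EventNetwork {

    private final Map<String, List<Consumer<Object>>> subscribers = new ConcurrentHashMap<>();

    // Catalog entry: the event type is declared up front, like an EDL definition.
    public void declare(String eventName) {
        subscribers.putIfAbsent(eventName, new CopyOnWriteArrayList<>());
    }

    // Consumers bind to the event name, never to a specific producer.
    public void subscribe(String eventName, Consumer<Object> handler) {
        lookup(eventName).add(handler);
    }

    public void publish(String eventName, Object payload) {
        lookup(eventName).forEach(handler -> handler.accept(payload));
    }

    private List<Consumer<Object>> lookup(String eventName) {
        List<Consumer<Object>> list = subscribers.get(eventName);
        if (list == null) throw new IllegalArgumentException("Not in catalog: " + eventName);
        return list;
    }

    public static void main(String[] args) {
        EventNetwork edn = new EventNetwork();
        edn.declare("onTaskAssigned"); // one of the workflow events mentioned above
        edn.subscribe("onTaskAssigned", task -> System.out.println("assigned: " + task));
        edn.publish("onTaskAssigned", "task-42");
    }
}
```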

They’ve now moved on to Service Data Objects. They’ve introduced entity variables into BPEL to allow working with SDOs.

Additional subjects in this session included Metadata Services (MDS) and the Dev-Test-Prod problem (changing of environment-specific parameters as code is promoted through environments). On the latter, there are a large number of parameters that can now be modified via a “c-plan,” applied at deployment time. Anything that makes this easier is a good thing in my opinion.
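The general shape of the dev-test-prod fix is worth spelling out. I haven’t dug into the exact c-plan format, so the sketch below is a generic stand-in with invented property names: environment-specific values are overlaid on the packaged defaults at deployment time, so the same artifact moves unchanged through environments.

```java
import java.util.HashMap;
import java.util.Map;

// Generic stand-in for a deploy-time configuration plan; property names
// and values are invented for illustration.
public class DeploymentPlan {

    // Defaults packaged inside the deployable artifact.
    static final Map<String, String> DEFAULTS = Map.of(
            "endpoint.crm", "http://dev-crm.example.com/service",
            "retry.count", "3");

    // Overlay an environment's overrides on the packaged defaults.
    static Map<String, String> apply(Map<String, String> overrides) {
        Map<String, String> merged = new HashMap<>(DEFAULTS);
        merged.putAll(overrides); // the environment wins over the packaged value
        return merged;
    }

    public static void main(String[] args) {
        Map<String, String> prod = apply(
                Map.of("endpoint.crm", "https://crm.example.com/service"));
        // Same artifact, different endpoint; nothing inside the package changed.
        System.out.println(prod.get("endpoint.crm"));
    }
}
```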

Factoring in Barriers to Entry

As part of understanding what business projects need to do to leverage BPM technology, I’ve been trying to eat my own dog food, so to speak. I’ve been looking at some of the EA processes and trying to model them using BPMN. These processes aren’t terribly complex, but at the same time, there is potential for technology to assist in their execution. They involve email distribution, task assignment, timer-based checks, notifications, etc., all of the same things that a process for a non-IT department may want to leverage too. The problem is that I look at these simple, lightweight processes and think about the learning curve required to leverage the typical enterprise BPM suite, and that big barrier to entry is a large inhibitor. Even using BPMN rather than the built-in flowchart template, I can generate the process model in Visio very quickly.
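To give a sense of how lightweight these processes really are, here is roughly what one of them amounts to if you just wrote it down in code (the notifier plumbing is hypothetical): assign a task, remind if nothing happens, notify on completion. Anything with a steeper learning curve than this has a barrier-to-entry problem for processes of this size.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical notifier plumbing; the process logic itself is the point.
public class ReviewProcess {

    interface Notifier { void send(String to, String message); }

    private final ScheduledExecutorService timer =
            Executors.newSingleThreadScheduledExecutor();

    // Task assignment plus a timer-based check: nag after three days of silence.
    void start(Notifier notifier, String reviewer, String artifact) {
        notifier.send(reviewer, "Please review: " + artifact);
        timer.schedule(
                () -> notifier.send(reviewer, "Reminder: " + artifact + " is still waiting"),
                3, TimeUnit.DAYS);
    }

    // Completion notification; cancel any pending reminder.
    void complete(Notifier notifier, String requester, String artifact) {
        timer.shutdownNow();
        notifier.send(requester, "Review finished: " + artifact);
    }
}
```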

A challenge that the BPM space faces right now is its barrier to entry. There are tools, most prominently SharePoint, that excel at having a low barrier to entry. When a team can quickly create a process model that can orchestrate and manage their work requests via an intranet site, that’s a big win. At the same time, does that low barrier to entry eventually become a boat anchor, either through infrastructure that scales poorly or through a lack of more advanced features such as process analytics? On the flip side, does the business need another technology that requires significant development expertise and months-long (or longer) projects to utilize? What’s the right path to take?

My opinion is that adoption is more important than sophistication, especially with the rate of technology change today, and the influence of consumer technologies on the enterprise. There is so much an individual user can do today, and human nature is to take the path of least resistance. This doesn’t necessarily mean that we should all pick the tool with the lowest barrier to entry, but it does mean that whatever tool you choose, you must get that barrier to entry to an appropriate point, especially if there are competing technologies that can be used. If your BPM technology requires that every process initiative have the equivalent of a senior developer involved, that could be a big problem if it’s something that the end users could do using Visio, Excel, or anything else. Find a way to lower the barrier.

Finding Value in BPM/Workflow Technology

Some recent conversations about the use of workflow and orchestration technologies got me thinking about how to properly look for value when trying to apply these technologies, whether associated with a BPM suite, or with any of the other multitude of tools out there that claim to have orchestration/automation/workflow/work management capabilities.

The one common term that always comes up is process. All of these tools wind up making some sort of process definition a requirement. There is one big factor, however, that has a significant impact on where you should look for value, and that’s whether or not those processes involve manual (i.e., done by a person) activities.

Let’s handle the simpler of the two cases first: where there are no manual activities whatsoever. In this case, what we’re really talking about is process automation. If there are no manual steps, then there is no reason the entire process can’t be fully automated. If we fully automate a process, what are the factors in the value equation? Clearly, if the process isn’t fully automated today, there is a one-time benefit in efficiency. The execution time should move from a variable, potentially unpredictable value to a consistent, predictable value. This is the case regardless of what tools we use to automate it. Theoretically, I could automate the process with scripts or a programming language and achieve the same value. If you agree with me, then the real value contribution in applying BPM/workflow technologies lies not in the run-time space but in the development-time space. By either reducing inefficiencies in the communication between analysts and developers through a common language (a process model), or by improving development productivity through the drag-and-drop visual environments of most tools, value can be obtained through time-to-delivery. Beyond this, there is probably not as much value to be obtained through the “management” portion of the BPM suite. Even if the process is subject to frequent change, the area of interest is the time to deliver the change, not optimization of the process itself, since by fully automating the process, we should assume it’s also fully optimized.
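That “scripts or a programming language” point deserves to be concrete. A fully automated process is just straight-line code like the sketch below (the steps are hypothetical); whether a BPM engine or a plain program executes it, the run-time value is identical, which is exactly why the differentiation has to come at development time.

```java
// Hypothetical steps; any fully automated process reduces to this shape.
public class ProvisioningProcess {

    public static void run(String accountId) {
        validate(accountId);         // step 1: no human involved
        createAccount(accountId);    // step 2
        sendWelcomeEmail(accountId); // step 3
        // Every run takes roughly the same time. The one-time efficiency win
        // comes from automating at all, not from which tool did the automating.
    }

    static void validate(String accountId) { /* call a validation service */ }
    static void createAccount(String accountId) { /* call an account service */ }
    static void sendWelcomeEmail(String accountId) { /* call a notification service */ }
}
```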

If we throw manual tasks into the equation, then we have a different story. While the development-time efficiencies certainly still apply, there’s now significant value that can be obtained through process analysis and optimization. I need to know how long those manual tasks take, why Judy accomplishes more tasks than John, what chaos ensues when Fred calls in sick, what the impact of task assignments and escalations is, etc. This information can be obtained by managing the processes through instrumentation, analytics, and reporting. By doing so, we can get into a cycle of continuous improvement and strive to optimize the manual efforts that can’t be automated.
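This run-time value is also easy to make concrete. All of the questions above fall out of two timestamps per task, as the bare-bones sketch below shows (my own illustration, not any product’s API); the analytics and reporting a BPM suite layers on top are aggregations of exactly this raw material.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// My own illustration; not any product's API.
public class TaskInstrumentation {

    record Timing(String assignee, Instant assigned, Instant completed) {
        Duration cycleTime() { return Duration.between(assigned, completed); }
    }

    private final Map<String, Instant> openTasks = new ConcurrentHashMap<>();

    public void onAssigned(String taskId) {
        openTasks.put(taskId, Instant.now());
    }

    public Timing onCompleted(String taskId, String assignee) {
        Instant start = openTasks.remove(taskId);
        // Aggregating these Timings tells you how long tasks take, who
        // completes more of them, and what happens when someone is out sick.
        return new Timing(assignee, start, Instant.now());
    }
}
```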

Now, the reason I bring this up is that there is no shortage of tools that claim to have workflow/business process capabilities. If you have a BPM suite, you’re now faced with the question of which workflow tool to use. What you need to think deeply about is where you’re going to get your value. Products with workflow capabilities may have advantages in development-time value because they come pre-populated with actions/tasks appropriate to the context of that tool, while a generalized BPM platform may not. The flip side, however, is that those same tools with workflow capabilities may only provide a piece of the BPM suite, namely business process development. If what you really need is business process management, with the ability to monitor, analyze, and optimize the manual parts of your processes, then you may need to sacrifice some development-time efficiencies to get the more important run-time value.

Finally, keep in mind that not all work can be defined by a process. As Keith Harrison-Broninski discusses in his book, Human Interactions: The Heart and Soul of Business Process Management: How People Really Work and How They Can Be Helped to Work Better, there will always be ad hoc work. You’ll still need to consider how to best utilize technology to support those ad hoc activities, rather than trying to define a rigid process for something that isn’t one.

Think Orchestration, not BPEL

I was made aware of this response from Alex Neihaus of Active Endpoints on the VOSibilities blog to a podcast and post from David Linthicum. VOS stands for Visual Orchestration System. Alex took Dave to task for some of the “core issues” that Dave had listed in his post.

I read both posts and listened to Dave’s podcast, and as is always the case, there are elements of truth on both sides. Ultimately, I feel that the wrong question was being asked. Dave’s original post has a title of “Is BPEL irrelevant?” and the second paragraph states:

OK, perhaps it’s just me but I don’t see BPEL that much these days, either around its use within SOA problem domains I’m tracking, or a part of larger SOA strategies within enterprises. Understand, however, that my data points are limited, but I think they are pretty far-reaching relative to most industry analysts’.

To me, the question is not whether BPEL is relevant or not. The question is how relevant orchestration is. When I first learned about BPEL, I thought, “I need a checkbox on my RFPs/RFIs to make sure import/export is supported,” but that was it. I knew the people working with these systems would not be hand-editing the BPEL XML; they’d be working with a graphical model. To that end, the BPMN discussion was much more relevant than BPEL.

Back to the question, though. If we start talking about orchestration, we get into two major scenarios:

  1. The orchestration tool is viewed as a highly productive development environment. The goal here is not to externalize processes, but rather to optimize the time it takes to build particular solutions. Many of the visual orchestration tools leverage a significant number of “actions” or “adapters” that provide a visual metaphor for very common operations such as data retrieval or ERP integration. The potential exists for significant productivity gains. At the same time, many of the things that fall into this category aren’t what I would call frequently changing processes. The whole value-add of being able to change the process definition more efficiently really doesn’t apply.
  2. The orchestration tool is viewed as a facility for process externalization. This is the scenario where the primary goal is flexibility in implementing process changes rather than developer productivity. I haven’t seen this scenario as often. In other words, the space of “rapidly changing business processes” is debatable. I certainly have seen changes to business rules, but not necessarily to the process itself. Of course, on the other hand, many processes are never formally defined to begin with, so the culture is merely reacting to change. We can’t say what we’re changing from or to, but we know that something in the environment is different.

So what’s my opinion? I still don’t get terribly excited about BPEL, but I definitely think orchestration tools are needed for two reasons:

  1. Developer productivity
  2. Integrated metrics and visibility

Most of the orchestration tools out there are part of a larger BPM suite, and the visibility that they provide on how long activities take is a big positive in my book (but I’ve always been passionate about instrumentation and management technologies). As for process externalization, the jury is still out. I think there are some solid domains for it, just as there are for things like complex event processing, but it hasn’t hit the mainstream yet at the business level. It will continue to grow outward from the developer productivity standpoint, but that path is heavily focused on IT system processes, not business processes (just as OO is widely used within development, but you don’t see non-IT staff designing object models very often). As for BPEL, it’s still a mandatory checkbox, and as we see separation of modeling and editing from the execution engine, its need may become more important. At the same time, how many organizations have separate Java tooling for when they’re writing standalone code versus writing Java code for SAP? We’ve been dealing with that for far longer, so I’m not holding my breath waiting for a clean separation between tools and the execution environment.

Gartner EA: Case Study

I just attended a case study at the summit. The presenter requested that their slides not be made available, so I’m being cautious about what I write. There was one thing I wanted to call out, which was that the case study described some application portfolio analysis efforts and mapping of capabilities to the portfolio. I’ve recently been giving a lot of thought to the analysis side of SOA, and how an organization can enable themselves to build the “right” services. One of the techniques I thought made sense was exactly what he just described with the mapping of capabilities. Easier said than done, though. I think most of us would agree that performing analysis outside of the context of a project could provide great benefits, but the problem is that most organizations have all their resources focused on running the business and executing projects. This is a very tactical view, and the usual objection is that as a result, they can’t afford to do a more strategic analysis. It was nice to hear from an organization that could.

Piloting within IT

Something I’ve seen at multiple organizations is problems with the initial implementation of new technology. In a perfect world, every new technology would be implemented using a carefully controlled pilot that exercised the technology appropriately, allowed repeatable processes to be identified and implemented, and added business value. Unfortunately, it’s that last item that always seems to do us in. Any project that has business value tends to operate under the same approach as any project for the business, which usually means schedule first, everything else second. As a result, sacrifices are made, and the project doesn’t have the appropriate buffers to account for the organization’s lack of experience. Even if professional services are leveraged, there’s still a knowledge gap in relating the product capabilities to the business need.

One suggestion I’ve made is to look inside IT for potential pilots. This can be a chicken-and-egg situation, because sometimes funding cannot be obtained unless the purchase is tied to a business initiative. IT is part of the business, however, and some funding should be reserved for operating-efficiency improvements within IT, just as it should be for other non-revenue-producing areas, such as HR.

BPM technology is probably the best example to discuss this. In order to fully leverage BPM technology, you have to have a deep understanding of the business process. If you don’t understand the processes, there’s no tool that you can buy that will give you that knowledge. There are packaged and SaaS solutions available that will give you their process, but odds are that your own processes are different. Who is the keeper of knowledge about business processes? While IT may have some knowledge, odds are this knowledge resides within the business itself, creating the challenge of working across departments when trying to apply the new technology. These communication gaps can pose large risks to a BPM adoption effort.

Wouldn’t it make more sense to apply BPM technology to processes that IT is familiar with? I’m sure nearly every large organization purchases servers and installs them in its data center. I’m also quite positive that many organizations complain about how long this process takes. Why not do some process modeling, orchestration, and execution using BPM technologies in our own backyard? The communication barriers are far less, the risk is less, and value can still be demonstrated through the improved operational efficiencies.

My advice if you are piloting new technology? Look for an opportunity within IT first, if at all possible. Make your mistakes on that effort, fine tune your processes, and then take it to the business with confidence that the effort will go smoothly.

Is Identity Your Enabler or Your Anchor?

I actually had to think harder than normal for a title for this entry because, as I suspected, I had a previous post titled “Importance of Identity.” That post merely talked about the need to get identity onto your service messages and some of the challenges associated with defining what that identity should be. This post, however, discusses identity in a different light.

It occurred to me recently that we’re on a path where having an accurate representation of the organization will be absolutely critical to IT success. Organizations that can’t keep Active Directory or their favorite LDAP up to date with the organizational changes that are always occurring will find themselves saddled with a boat anchor. Organizations that are able to keep their identity stores accurate and up to date will find themselves with a significant advantage. An accurate identity store is critical to the successful adoption of BPM technology. While BPM may be more of an emerging concern, think about your operations staff and the need for accurate roles associated with the support of your applications and infrastructure. One reorg of operations and the whole thing could fall apart, with escalation paths no longer in existence, incorrect reporting paths, and more.
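To see why, consider how a BPM engine typically resolves an escalation path: a directory lookup like the one sketched below (standard JNDI against LDAP; the base DN and the “manager” attribute are assumptions about your directory layout). If a reorg leaves that manager entry stale or missing, the lookup returns nothing and the escalation silently dead-ends; that is the boat anchor.

```java
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import java.util.Hashtable;

public class EscalationLookup {

    // Resolve a user's manager from the directory so a BPM escalation has
    // somewhere real to go. The host, base DN, and "manager" attribute are
    // assumptions about a particular directory layout.
    public static String managerOf(String uid) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://directory.example.com:389");

        InitialDirContext ctx = new InitialDirContext(env);
        try {
            SearchControls controls = new SearchControls();
            controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
            controls.setReturningAttributes(new String[] {"manager"});

            NamingEnumeration<SearchResult> results = ctx.search(
                    "ou=people,dc=example,dc=com", "(uid={0})",
                    new Object[] {uid}, controls);
            // A stale directory means no result: the escalation dead-ends here.
            if (!results.hasMore()) return null;
            var attr = results.next().getAttributes().get("manager");
            return attr == null ? null : attr.get().toString();
        } finally {
            ctx.close();
        }
    }
}
```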

So, before you go gung-ho with BPM adoption, take a good look at your identity stores and make sure that you’ve got good processes in place to keep them up to date. Perhaps that should be the first place you look to leverage the BPM technology itself!

ActiveVOS BUnit

While I don’t normally comment on press releases that I occasionally receive in email, one tidbit in a release from Active Endpoints about ActiveVOS™ 5.0 caught my eye:

Active Endpoints, Inc. (www.activevos.com), inventor of visual orchestration systems (VOS), today announced the general availability of ActiveVOS™ 5.0. …

Scenario testing and remote debugging. ActiveVOS 5.0 fundamentally and completely solves a major pain experienced by all developers: the question of how to adequately test loosely-coupled, message-based applications. ActiveVOS 5.0 includes a new BUnit (or “BPEL unit test”) function, which allows developers to simulate the entire orchestration offline, including the ability to insert sample data into the application. A BUnit can be created by simply recording a simulation in the ActiveVOS 5.0 Designer. Multiple BUnits can be combined into BSuites, or collections of smaller simulations, to build up entire test suites. Once deployed into a production environment, ActiveVOS 5.0 delivers precisely the same experience for testing and debugging a production orchestration as it does for an application in development. Remote debugging includes the ability to inspect and/or alter message input and output, dynamically change endpoint references and alter people assignments in the application.

Back in November, in my post titled Test Driven Model Development, I lamented the fact that when a new development paradigm comes along, like the graphical environments common in BPM tooling, we run the risk of taking one or more steps backward in our SDLC technologies, using test-driven development as an example. As a result, I’m very happy to see a vendor in this space emphasizing this capability in their product. While it may not make a big difference in the business solutions out there, things like this can go a long way toward getting some of the hard-core Java programmers to actually give these model-driven tools a shot.
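For those Java programmers, the analogy is straightforward. I haven’t used BUnit itself, but based on the description above, the idea maps onto familiar JUnit territory roughly like the sketch below, where the orchestration and its partner service are hypothetical stand-ins: run the process offline with canned inputs and stubbed endpoints, then assert on the outcome.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// A JUnit-flavored analogy to the BUnit idea; the orchestration and its
// partner service are hypothetical stand-ins, not the ActiveVOS API.
class OrderOrchestrationTest {

    interface CreditService { boolean approve(String customer, double amount); }

    // Stand-in for executing the process model with injected endpoints.
    static String runOrchestration(CreditService credit, String customer, double amount) {
        return credit.approve(customer, amount) ? "FULFILL" : "REJECT";
    }

    @Test
    void rejectsWhenCreditDeclines() {
        CreditService stub = (customer, amount) -> false; // simulated partner link
        assertEquals("REJECT", runOrchestration(stub, "acme", 5000.0));
    }
}
```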


Disclaimer
This blog represents my own personal views, and not those of my employer or any third party. Any use of the material in articles, whitepapers, blogs, etc. must be attributed to me alone, without any reference to my employer. Use of my employer’s name is NOT authorized.