Archive for the ‘IT’ Category
Think Orchestration, not BPEL
I was made aware of a response on the VOSibilities blog from Alex Neihaus of Active Endpoints to a podcast and post from David Linthicum. VOS stands for Visual Orchestration System. Alex took Dave to task for some of the “core issues” that Dave had listed in his post.
I read both posts and listened to Dave’s podcast, and as is always the case, there are elements of truth on both sides. Ultimately, I feel that the wrong question was being asked. Dave’s original post has a title of “Is BPEL irrelevant?” and the second paragraph states:
OK, perhaps it’s just me but I don’t see BPEL that much these days, either around its use within SOA problem domains I’m tracking, or a part of larger SOA strategies within enterprises. Understand, however, that my data points are limited, but I think they are pretty far-reaching relative to most industry analysts’.
To me, the question is not whether BPEL is relevant or not. The question is: how relevant is orchestration? When I first learned about BPEL, I thought, “I need a checkbox on my RFPs/RFIs to make sure import/export is supported,” but that was it. I knew the people working with these systems would not be hand-editing the BPEL XML; they’d be working with a graphical model. To that end, the BPMN discussion was much more relevant than BPEL.
Back to the question, though. If we start talking about orchestration, we get into two major scenarios:
- The orchestration tool is viewed as a highly productive development environment. The goal here is not to externalize processes, but rather to optimize the time it takes to build particular solutions. Many of the visual orchestration tools provide a significant number of “actions” or “adapters” that offer a visual metaphor for very common operations such as data retrieval or ERP integration. The potential exists for significant productivity gains. At the same time, many of the things that fall into this category aren’t what I would call frequently changing processes, so the value of being able to change the process definition more efficiently really doesn’t apply.
- The orchestration tool is viewed as a facility for process externalization. Here the primary goal is flexibility in implementing process changes rather than developer productivity. I haven’t seen this scenario as often. In other words, the space of “rapidly changing business processes” is debatable. I certainly have seen changes to business rules, but not necessarily to the process itself. On the other hand, many processes aren’t formally defined to begin with, so the culture is one of merely reacting to change: we can’t say what we’re changing from or to, but we know that something in the environment is different. (A minimal sketch of what externalizing a process definition can look like follows this list.)
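To make the externalization scenario concrete, here is a minimal sketch, assuming nothing about any particular vendor’s tooling: the sequence of steps lives in an external definition (in a real orchestration product it would be a BPEL or BPMN model in a repository), while the step implementations stay in code. The step names and the order-processing example are made up for illustration.

```java
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;

public class ExternalizedProcess {

    // Step implementations a developer builds once (the names are made up).
    static final Map<String, UnaryOperator<String>> HANDLERS = Map.of(
            "validateOrder", order -> order + " [validated]",
            "checkCredit",   order -> order + " [credit ok]",
            "shipOrder",     order -> order + " [shipped]");

    // In a real orchestration tool this sequence would come from a BPEL/BPMN
    // model or a repository, not a hard-coded list.
    static final List<String> PROCESS_DEFINITION =
            List.of("validateOrder", "checkCredit", "shipOrder");

    public static void main(String[] args) {
        String order = "order-123";
        for (String step : PROCESS_DEFINITION) {
            order = HANDLERS.get(step).apply(order); // run each externally defined step
            System.out.println("after " + step + ": " + order);
        }
    }
}
```

Changing the process then means changing the definition, not recompiling the handlers, which is the flexibility argument in a nutshell.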
So what’s my opinion? I still don’t get terribly excited about BPEL, but I definitely think orchestration tools are needed for two reasons:
- Developer productivity
- Integrated metrics and visibility
Most of the orchestration tools out there are part of a larger BPM suite, and the visibility that they provide into how long activities take is a big positive in my book (but I’ve always been passionate about instrumentation and management technologies). As for process externalization, the jury is still out. I think there are some solid domains for it, just as there are for things like complex event processing, but it hasn’t hit the mainstream yet at the business level. It will continue to grow outward from the developer productivity standpoint, but that path is heavily focused on IT system processes, not business processes (just as OO is widely used within development, but you don’t see non-IT staff designing object models very often). As for BPEL, it’s still a mandatory checkbox, and as we see the separation of modeling and editing from the execution engine, its need may become more important. At the same time, how many organizations have separate Java tooling for when they’re writing standalone code versus writing Java code for SAP? We’ve been dealing with that for far longer, so I’m not holding my breath waiting for a clean separation between tools and the execution environment.
The Real SOA Governance Dos and Don’ts
Dave Linthicum had a recent post called SOA Governance Dos and Don’ts which should have been titled, “SOA Governance Technology Selection Dos and Don’ts.” If you use that as the subject, then there’s some good advice. But once again, I have to point out that technology selection is not the first step.
My definition of governance is that it is the people, policies, and processes that ensure desired behavior. SOA governance, therefore, is the people, policies, and processes that ensure desired behavior in your SOA efforts. So what are the dos and don’ts?
- Do: Define what your desired behavior is. It must be measurable. You need to know whether you’re achieving the behavior or not. It should also be more than one statement: it should address both the behavior of your development staff and the run-time behavior of your services (e.g. we don’t want any one consumer to be able to starve out other consumers; a minimal sketch of one such run-time policy follows this list).
- Don’t: Skip that step.
- Do: Ensure that you have people involved with governance who can turn those behaviors into policies.
- Don’t: Expect that one set of people can set all policies. As you go deep in different areas, bring in appropriate domain experts to assist in policy definition.
- Do: Document your policies.
- Don’t: Rely on the people to be the policies. Your staff has to know what the policies are ahead of time. If they have to guess what some reviewer wants to see, odds are they’ll guess wrong, or the reviewer may be more concerned with flaunting authority than with achieving desired behavior.
- Do: Focus education on the desired behavior and the policies that make it possible.
- Don’t: Rely solely on a police force to ensure compliance with policies.
- Do: Make compliance the path of least resistance.
- Don’t: Expect technologies to define your desired behavior or policies that represent it.
- Do: Use technology where it can improve the efficiency of your governance practices.
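As an illustration of turning the “no consumer starves out the others” behavior into an enforceable run-time policy, here is a minimal sketch, not any particular product’s feature: a per-consumer cap on concurrent requests. The cap of 5 and the consumer identifiers are assumptions made up for illustration.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Semaphore;

// Sketch of one enforceable run-time policy: cap concurrent requests per
// consumer so no single consumer can starve the others.
public class PerConsumerThrottle {

    private static final int MAX_CONCURRENT_PER_CONSUMER = 5; // assumed limit
    private final Map<String, Semaphore> permits = new ConcurrentHashMap<>();

    public boolean tryAcquire(String consumerId) {
        return permits
                .computeIfAbsent(consumerId, id -> new Semaphore(MAX_CONCURRENT_PER_CONSUMER))
                .tryAcquire(); // reject rather than queue when the consumer is at its cap
    }

    public void release(String consumerId) {
        Semaphore s = permits.get(consumerId);
        if (s != null) {
            s.release();
        }
    }

    public static void main(String[] args) {
        PerConsumerThrottle throttle = new PerConsumerThrottle();
        for (int i = 0; i < 7; i++) {
            System.out.println("consumerA request " + i + " admitted: "
                    + throttle.tryAcquire("consumerA"));
        }
    }
}
```

In practice this kind of policy usually lives in an intermediary (gateway, ESB, or service container) rather than in each service, but the point is the same: the documented policy is something you can actually check and enforce.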
There’s my take on it.
Gartner EA: EA and SOA
This is my last post from the summits (actually, I’m already at the airport). This morning, I participated in a panel discussion on EA and SOA as part of the EA Summit with Marty Colburn, Executive VP and CTO for FINRA; Maja Tibbling, Lead Enterprise Architect for Con-way; and John Williams, Enterprise Architect from QBE Regional Insurance. The panel was jointly moderated by Dr. Richard Soley of the OMG and SOA Consortium and Bruce Robertson of Gartner. It was another excellent session in my opinion. We all brought different perspectives on how we had approached SOA and EA, yet there were some apparent commonalities. Number one was the universal answer to what the most challenging thing was with SOA adoption: culture change.
There were a large number of questions submitted, and unfortunately, we didn’t get to all of them. The conference director, Pascal Winckel (who did a great job, by the way), has said he will try to get these posted onto the conference blog, and I will do my best to answer them either here on my blog or via comments on the Gartner blog. As always, if you have questions, feel free to send them to me here. I’d be happy to address them, and will keep all of them anonymous, if so desired.
Gartner EA: Case Study
I just attended a case study at the summit. The presenter requested that their slides not be made available, so I’m being cautious about what I write. There was one thing I wanted to call out, which was that the case study described some application portfolio analysis efforts and mapping of capabilities to the portfolio. I’ve recently been giving a lot of thought to the analysis side of SOA, and how an organization can enable themselves to build the “right” services. One of the techniques I thought made sense was exactly what he just described with the mapping of capabilities. Easier said than done, though. I think most of us would agree that performing analysis outside of the context of a project could provide great benefits, but the problem is that most organizations have all their resources focused on running the business and executing projects. This is a very tactical view, and the usual objection is that as a result, they can’t afford to do a more strategic analysis. It was nice to hear from an organization that could.
Gartner EA: The Management Nexus
Presenters: Anne Lapkin and Colleen Young
One thing all of the presenters at the EA Summit are very good at doing is using consistent diagrams across all of their presentations. This is at least the third presentation where I’ve seen this flow diagram showing the linkage between business goals and strategy, and business planning and execution. Unfortunately, Anne points out that the linkage is where things typically break down.
Colleen is now discussing strategic integration, which begins with an actionable articulation of business strategy, goals and objectives. From there, she recommends a standardized, integrated, results-based management methodology. As a result, she claims that we will see exponentially greater benefits from enterprise capabilities and investments.
Anne is speaking again and emphasizing that we need a unified contextual view. This consists of a goal that goes one level deeper than “grow revenues by XY%” and includes a future end state with a timeline and measurable targets, principles that establish the desired behavior and core values, and relationships.
Colleen now has a great slide up called “The Implication of ‘Implications’.” The tag line says it all: “Unclear implications lead to inconsistent assumptions and independent response strategies that inevitably clash.” Implications that must be investigated include financial implications, business process implications, architecture implications, cultural change implications, and more. All parties involved must understand and agree on these implications.
A statement Colleen just made that resonates with my current thinking is, “Based upon these implications, what do I need to change?” All too often, we don’t stop to think about what the “change” really is. Work starts happening, but no one really has a clear idea of why we’re doing it, only an innate trust that the work is necessary and valuable. If the earlier planning activities have made these goals explicit, the execution should be smoother, and when bumps in the road are encountered, the principles are right there to guide the decision-making process, rather than relying on someone’s interpretation of an undocumented implication.
Once again, this was a good session. I know I’ve commented on a few sessions that they could have been a bit more pragmatic or actionable; this one definitely achieved that goal. I think the attendees will be able to leave with some concrete guidance that they can turn around and use in their organizations.
Gartner EA: Strategic Planning Tools and Techniques
Presenter: Richard Buchanan
The first topic Richard is covering is the need for enterprise architects to master strategic thinking. His current slide is consistent with an earlier talk today, showing that enterprise strategy is at the intersection of three disciplines: Enterprise Strategy and Planning, Enterprise Architecture, and Enterprise Portfolio Management. He states that enterprise architecture must translate business vision and strategy into effective enterprise change. He’s discussing how a budget and the organization chart are not part of the business strategy, pointing out that a budget should be a downstream deliverable derived from the business strategy. Great point. His definition of strategy includes an organization’s environment, goals, objectives, major programs of action, and the resource allocation choices to execute them.
The next topic he is covering is the categories of tools and techniques that are used in developing a business strategy. These are not software tools; the first one he’s showing is Porter’s 5 Forces Model (this is the second time Michael Porter has been referenced at the Summit). He’s challenging us to go and find the people in our organization who are looking at these things. Good advice. There’s no doubt that if you want to do strategic planning, you need to be looking at these five forces, and there’s a good chance that someone at the company (probably outside of IT) is already doing this. The same thing holds true for the other categories of tools that he has gone through.
The final point he’s covering is how to leverage these strategic tools within the EA process. To some extent this is motherhood and apple pie, but it’s very good advice, especially knowing that many EAs have grown out of the world of application development and may still be very technology focused. As a result, it’s entirely possible that the EA team has never read the company’s annual report. It’s even more likely that EA hasn’t seen things like competitive analysis documents. If an EA doesn’t understand how a company competes, how can they make appropriate decisions? Speaking very broadly and citing Michael Raynor’s earlier presentation, do you know whether your company differentiates on cost or on products? Both of those can have significant impacts on how information technology is leveraged. A company that differentiates based on product excellence and customer service must have significantly better technology for understanding its customers than a company that simply tries to be the lowest-cost provider in the marketplace.
My final thoughts: There’s not much to disagree with in this presentation. I think he paints a great picture of what many of us would like to be doing. The challenge I suspect many attendees have is that our EA organizations, as a previous presenter put it, “are mired in the world of technology architecture.” Somehow, we need to find a seat at the strategic planning table so that when we ask about some of these artifacts, everyone understands their importance rather than stopping us in our tracks and asking, “Why do you need it?”
Gartner EA: Context Delivery Architecture
Presenter: William Clark
I’m looking forward to this talk, as it’s a new area for me. I don’t remember who told me this, but the key to getting something out of a conference is to go to sessions where you have the opportunity to learn something and you’re interested in the subject. That’s why I’m avoiding sessions on establishing enterprise technology architects. I’ve been doing that for the past 5 years, so the chances are far lower that I’m going to learn something new than in a session like this one, where it’s an emerging space and I know it’s something that is going to be more and more important in my work over the next few years. The only downside is that I’m now on my fourth day in Orlando, which is starting to surpass my tolerance limit for sitting and listening to presentations.
He started out by showing that the thing missing from the digital experience today is “me.” By “me,” he means the context of why we’re doing the things that we’re doing, such as “where am I,” “what have I done,” “who am I talking to,” etc. He points out the importance of user experience in the successes and failures of projects, especially now in the mobile space.
Some challenges he calls out with incorporating context into our systems (a rough sketch of what a context record might capture follows the list):
- Blending of personal contexts and business contexts. For example, just think of how your personal calendar(s) may overlap with your business calendar.
- Managing Technical Contexts: What device are you using, what network are you connecting from, etc. and what are the associated technical capabilities available at that point?
- Context timing: The context is always in a state of flux. Do I try to predict near-term changes to the context, do I try to capture the current context, or do I leverage the near-past context (or even longer) in what is shown?
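As a rough illustration of the challenges above, here is a sketch of what a single delivered context record might carry. The field names are assumptions made up for illustration, not any vendor’s or standard’s schema; it blends personal and business identity, technical context, and a timestamp, since the context is always in flux.

```java
import java.time.Instant;

// Hypothetical "context" record blending identity, technical, and timing data.
public class DeliveredContext {
    // Who: blends personal and business identity (e.g., overlapping calendars).
    String userId;
    String activeCalendar;   // "personal" or "business"

    // Technical context: what the device and network can actually support.
    String deviceType;       // e.g., "smartphone"
    String networkType;      // e.g., "cellular", "wifi"

    // Where and when: every snapshot is timestamped because context changes constantly.
    double latitude;
    double longitude;
    Instant capturedAt;

    @Override
    public String toString() {
        return userId + "/" + activeCalendar + " on " + deviceType + " via " + networkType
                + " at (" + latitude + "," + longitude + ") captured " + capturedAt;
    }

    public static void main(String[] args) {
        DeliveredContext ctx = new DeliveredContext();
        ctx.userId = "user-42";
        ctx.activeCalendar = "business";
        ctx.deviceType = "smartphone";
        ctx.networkType = "cellular";
        ctx.latitude = 28.47;
        ctx.longitude = -81.47;
        ctx.capturedAt = Instant.now();
        System.out.println(ctx);
    }
}
```

The timestamp is what makes the timing question tractable: a consumer of the context can decide whether a snapshot is recent enough to act on or whether it needs a fresher (or predicted) one.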
It’s always a sign of a good presentation when the speaker anticipates the questions the audience might ask. I was just about to write down a question asking him whether he thinks a marketplace for context delivery will show up, and he started talking about exactly that. This is a really interesting space, because there’s historical context that can be captured and saved, and there’s an expense associated with that, so it makes sense that the information broker market that currently sells marketing lists, etc. will expand to become on-demand context providers with B2B-style integrations.
All in all, I see parallels between this space and the early days of business intelligence. The early adopters are out there, trying to figure out what the most valuable areas of “context” are. Unlike BI, though, there are so many technology changes going on that are introducing new paradigms, like location-aware context with cell phones, that there’s even more uncertainty. I asked a question wondering how long it will be before some “safe” areas have been established for companies to begin leveraging this, but his answer was that there are many dimensions contributing to that tipping point, so it’s very hard to make any predictions.
This was a good presentation. I think he gave a good sampling of the different data points that go into context, some of the challenges associated with it, and the technical dynamics driving it. It’s safe to say that we’re not at the point where we should be recommending significant investments in this, but we are at the point where we should be doing some early research to determine where we can leverage context in our solutions and subsequently make sound investment decisions.
Gartner EA: Effective IT Planning
Presenter: Robert Handler
He’s showing a slide on IT Portfolio Management Theory, and how there is a discovery phase, a project phase, and an asset management phase. Discovery explores new technology, projects implement new technology, and the asset management phase operates it once it’s in production. Next, he shows an Enterprise Architecture diagram and discusses the whole current state/future state approach, risk tolerance principles, etc. He now has a slide with a summary of three areas: IT Strategic Planning, Project & Portfolio Management, and Enterprise Architecture. He shows that all three of these have overlapping goals and efforts that could be better aligned because, at present, they tend to exist in vacuums.
He’s now talking about some of the issues with each of these disciplines. First, he used Wikipedia’s definition of technology strategy to show the challenge there (Wikipedia claims it’s a document created by the CIO; the audience chuckled at that). On to Project and Portfolio Management: he’s calling out that only 65% of organizations cover the entire enterprise in their portfolio, and most PPM efforts are focused on prioritizing projects. On the EA side, he calls out that most efforts are mired in the creation of technical standards.
His recommendations for creating a win/win situation are:
- Collectively maintain, share, and use business context.
- Use EA to validate strategic planning and improve portfolio management decisions.
- Use portfolio management to generate updates against IT strategy and EA design and plans.
He proceeded to go into detail on each of these. Overall, I think this was a good 100-level presentation for telling an audience of EAs that they can’t ignore IT strategy efforts and PPM efforts; they need to be aligned with them. It could have been a bit more pragmatic in emphasizing how one would go about doing this.
Gartner EA: Michael Raynor
Presenter: Michael Raynor, Deloitte Consulting
This session is from Michael Raynor, author of “The Strategy Paradox.” The title is “The Accidental Strategist: Why uncertainty makes EA central to strategy.” He feels that it is ironic that there is a separation between the formulation of strategy and the implementation of strategy. He doesn’t agree with this approach. He feels that formulation and implementation should be a more interactive process and less linear, minimizing the strategic risk that an organization takes.
On this slide, his observation is that strategic uncertainty has been ignored. He used an example of the search engine competition of years ago and how, at least in part, Google won the space by making the best guess with regard to their strategy. It’s not that AltaVista made poor choices; they simply guessed wrong about what would be the most important factors in that marketplace. There is uncertainty associated with strategy.
An interesting anecdote he’s showing us now is that organizations with a high commitment to strategy, which oftentimes are the companies that we try to emulate, have an extremely high chance of failure, while companies with a relatively low commitment to strategy have a very low failure rate. To me, this seems to be an example of low risk/low return and high risk/high return.
Extreme positions help customers know what to expect. Companies that are in the middle “wander around like a stumbling drunk.” His example of the continuum was Wal-Mart at one end (cost differentiation: given a choice of make it better or make it cheaper, Wal-Mart makes it cheaper), Nordstrom at the other end (product differentiation: Nordstrom makes it better), and Sears in the middle. Margins are best at the extremes and squeezed in the middle, yet most companies are in the middle. The reason is that at the extremes, it’s a winner-take-all approach. K-Mart can’t compete with Wal-Mart and Lord & Taylor can’t compete with Nordstrom, yet Sears and JCPenney can both co-exist just fine. The reason for this is that companies in the middle have chosen to minimize their strategic risks. Companies at the extremes take on more risk in their strategic choices.
He’s now discussing Microsoft. He’s explaining that Microsoft manages strategic risk through their portfolio and an understanding that things will change over time. This is different from diversification, where the profits of one division cover losses in another; here, if what’s important to revenue changes, the company is positioned to quickly leverage it. For example, if consolidation of computing in the home becomes centered on the gaming console rather than the PC, Microsoft has the Xbox.
Another good example he’s presenting is Johnson & Johnson. Their Ethicon Endo-Surgery division sells colonoscopes, an area that previously differentiated on the technical excellence of the product. For growth, however, the problem was that not enough people were getting colonoscopies. In the US and Canada, a colonoscopy is a sedated procedure, which greatly increases the cost associated with it. To manage the strategic risk that selling colonoscopes may become a pain-management issue rather than a technical-excellence one, Johnson & Johnson’s VC arm invested in a company that was advancing sedation technologies. (Hopefully, I got this recap right…)
The metaphor that he believes captures how to manage strategic risk is not evolution, but gene therapy. That is, if the environment changes in certain ways, the genes can be recombined in new ways to leverage that environment appropriately. Good talk!
Gartner EA: EA by Stealth
The presenter is Gary Doucet, the Chief Architect for the Government of Canada. His summary is that EA makes everything the business does better. EA needs to be ubiquitous, not owned by any single non-EA process. He emphasized that there are hidden architects all over the place, producing artifacts (artefacts, if you’re from Canada like him), that are the descriptions of the business today. EA’s role is to discover and align these efforts, not necessarily to own them all. Interesting talk.
Gartner AADI/EA: Nick Carr, The Big Switch
I’m now in the keynote from Nick Carr, author of IT Doesn’t Matter and his latest book, The Big Switch. I haven’t read his book yet (I did get a copy when he offered to send free ones to the first 100 bloggers who responded), but from reading the reviews of others and comments around it, I knew that it was basically advocating a cloud computing approach to corporate IT. His presentation reaffirmed this. Nothing much to say beyond that; they’ve now transitioned to a discussion with him, Darryl Plummer, and David Mitchell Smith. The first question from Darryl was on the moving of data into the cloud. Nick’s response was that companies will do it when their competitors wind up “saving millions of dollars” by doing it. Darryl drilled more into the notion of what to do with the legacy “stuff,” and Nick expects that for the foreseeable future, large companies will have a hybrid model, slowly moving things into the cloud. He feels that smaller and mid-size companies will be quicker to adopt. David then asked if, 50 years out, the Google data center that Nick used in one slide will be seen as “the dinosaur” and replaced by a peer-to-peer model. Nick hesitated a bit and said that 50 years is hard to predict in technology, but then said he doesn’t see it.

Personally, I agree with David, and I think Nick’s analogy to power grids shows it. We moved to centralized power generation, and now, with the focus on being green, we see individual homeowners contributing back to the grid through solar panels on their roofs, etc. It wouldn’t surprise me at all, as standards evolve, to see individuals contributing their compute resources back to the cloud; however, we’d need far, far better standards for doing so. I had a conversation with Mike Kavis about this at lunch, and we both agreed that the cost of moving is still too high. I can’t simply take the .ear file from my application server and stick it on a server in the cloud yet. We’ll get there, though, and that’s when it will become a more interesting discussion for the large enterprise.
Darryl just did an informal audience poll, asking the question, “How many of you think a significant part of your IT will leverage cloud computing in 2 years, 5 years, and 10 years?” Within 2 years, there were probably fewer than 20 people. At 10 years, probably about 30-40% of the room said yes. That’s probably realistic, especially not knowing the size of the companies that are represented here. If 30% of the people here represent SMBs, it’s a no-brainer in my opinion in a 10-year timeframe. Ultimately, this was just a fun exercise to stimulate the discussion, much like how Darryl pinned Nick down to make a prediction on how many of us won’t have a job due to this “big switch” in 10 years. For the record, Nick said 60%, but then followed with the safer statement, “IT headcounts will be at lower levels than they are now.”
Gartner EA: EA as Strategy
Presenter: Colleen Young
This is the opening keynote for the EA Summit. Colleen is telling us that we are the individuals who need to bring the message to the business that IT can contribute to growth, rather than simply being about back-end processes and support. Things that she says are critical to the success of EA:
- EAs must be bilingual, speaking the language of the business and the language of IT.
- Architects must be political.
- Architects must be influential. Architects need to be confident, with a firm grasp on the business’ needs.
- Architects must deliver distinctive solutions. How do we do this? We need to evolve toward integrated IT and business strategic planning. Architecture needs to bridge the gap.
Overall, a good presentation.
Gartner AADI: Measuring the Value of SOA
I just finished my panel discussion with Mel Greer and Mike Kavis on measuring the value of SOA. I think we all had hoped that there would be more attendees, but hopefully those who chose to attend got something out of it. My main message was measure, measure, measure. I think it’s difficult to put a direct value on SOA adoption, that is, one where you can say the value was a direct result of SOA efforts, but it’s not difficult to put a contributory value on SOA adoption. In other words, we need to measure the way IT is contributing to the success of the company as a whole, and as part of that, we can use before-and-after measurements to see the impact of SOA and any other changes. The two things that I brought up in answering questions that I thought I’d share here are:
- Instrument your services now. Part of the problem with measuring things today is that we haven’t instrumented things in the past. These days, value is almost always expressed in relative terms, such as “relative to what we’re doing now.” If you’re not collecting metrics, you can’t say what “now” is. Once again, we’re at one of those unique opportunities where the door is open to do things differently. Put the instrumentation in now, before you have a portfolio of 100+ services that have no instrumentation. (A minimal sketch of this kind of instrumentation follows this list.)
- Measuring puts the spotlight on you, but it will always enable you to answer questions better than before. A member of the audience asked, “What happens if your measurements show that you’re not achieving your goals?” This was a great question. Unfortunately, sometimes by the mere act of measuring things, people will immediately put the blame on you when things aren’t achieving the desired benefits, simply because you’re the one thing that can concretely demonstrate contribution (or lack thereof). My answer to this was two-fold. First, try to make sure you have the backing metrics to allow proper root cause analysis. If you just focus on one metric and nothing else, it makes root cause identification very difficult, and it puts the spotlight on the one area you’re measuring. This puts strategic initiatives like SOA at risk, because people will think the whole thing is flawed, when in fact the lack of results may have nothing at all to do with SOA adoption. Second, I talked about the appropriate spin to put on it, this being the political season in the US. When something doesn’t work out as planned, the way to spin the metrics is to show that, because of the measurements, we’re in a better spot to fix the problem than we would have been before.
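To make the first point concrete, here is a minimal sketch of the sort of instrumentation I have in mind, assuming nothing about any particular management product: per-service, per-consumer call counts and elapsed time, captured from day one so that “now” has a measurable baseline. The service and consumer names are made up, and the println report stands in for whatever your monitoring infrastructure actually collects.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;
import java.util.function.Supplier;

// Sketch: record call counts and total elapsed time per service/consumer pair.
public class ServiceMetrics {

    private final Map<String, LongAdder> callCounts = new ConcurrentHashMap<>();
    private final Map<String, LongAdder> totalMillis = new ConcurrentHashMap<>();

    public <T> T timed(String service, String consumer, Supplier<T> invocation) {
        String key = service + "|" + consumer;
        long start = System.nanoTime();
        try {
            return invocation.get();
        } finally {
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            callCounts.computeIfAbsent(key, k -> new LongAdder()).increment();
            totalMillis.computeIfAbsent(key, k -> new LongAdder()).add(elapsedMs);
        }
    }

    public void report() {
        callCounts.forEach((key, count) -> System.out.println(
                key + ": " + count.sum() + " calls, "
                + totalMillis.get(key).sum() + " ms total"));
    }

    public static void main(String[] args) {
        ServiceMetrics metrics = new ServiceMetrics();
        String result = metrics.timed("CustomerLookup", "orderPortal",
                () -> "customer-7"); // stand-in for a real service invocation
        System.out.println("result: " + result);
        metrics.report();
    }
}
```

In practice this would hang off an interceptor in your service container or intermediary rather than being called explicitly, but the baseline data is the point.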
The final thing I wanted to call out was a reference to a blog entry I posted yesterday at the request of Rob Eamon. Someone asked a question about how to get the stated goals from “the business” and the role of IT in contributing ways of measuring them. I called out that IT is part of the business, so there’s no reason that IT can’t contribute to the definition of business goals and to appropriate ways of measuring them. Rather than viewing it as an “extraction” effort, it should be a joint effort among all members of the business, which includes IT.
If you attended the session, please feel free to post any comments or questions here. I hope it was valuable.
Gartner AADI: Dr. Andrew Lippman
Presenter: Dr. Andrew Lippman from MIT’s Media Laboratory
Dr. Lippman came and talked to us about the MIT Media Lab in a keynote this morning. He was an excellent speaker. Most of the talk was focused on how technology can contribute to the increasingly social nature of our society. While we have more and more personal technology, the usefulness of that technology will largely depend on its ability to focus on the “we” rather than on the “me.” A great point that he made is that a company’s value is in its social network: the speed with which values and ideas flow through the company. Again, the degree to which technology supports that is a key element. Excellent talk.
Gartner AADI: State of SOA
Presenter: Daniel Sholler
Dan is largely presenting results from some surveys that Gartner has done. Highlights:
- Adoption is increasing, but so is the number of organizations that are choosing to delay or do nothing
- Nearly all organizations are at maturity level 1
- Of the very few organizations that are above level 1, interest/usage in WOA/REST is increasing
- More mature organizations are using services for B2B and multi-channel applications
- Only 1/3 of organizations adopting SOA are using an ESB
- Stage 2 maturity companies have nearly double the number of service consumers for the same number of services as Stage 1 maturity companies
- “BPM is the ‘killer app’ for SOA”
My thoughts: Nothing really surprising here. I’m not at all surprised that we’re at a very early stage of maturity. The statement that the more mature organizations are pursuing B2B and multi-channel opportunities is an aberration, in my opinion. I think those are simply opportunities that some organizations have and others don’t, rather than being tied to the maturity of the organization. The bullet point that more mature organizations have twice the number of consumers was interesting to me. That one seems to make sense from a maturity standpoint. The BPM comment isn’t surprising at all, because I don’t think you can do a good job with BPM without having services.