Mentoring and Followup to Clarity of Purpose

James McGovern posted his own thoughts in response to my Clarity of Purpose post. In it, he asks a couple of questions of me.

“I wonder if Todd has observed that trust as a concept is fast declining.” I don’t know that I’d say it is declining, but I would definitely say that it is a key differentiator between well-functioning organizations and poorly functioning ones. I think it’s natural that as an organization grows, you have to work harder to keep the trust in place. How many people in a small town say they trust their local government versus a big city, let alone the country? The same holds true for typical corporate IT. As James points out, trust gets eroded easily when things are over-promised and under-delivered. Specifically in the domain of enterprise architecture, we’re at particular risk because we often play the role of the salesperson, but the implementation is left to someone else. When things go bad, the customer directs their venom at the salesperson, rather than digging deep to understand the root cause. We also too frequently look to point fingers rather than fix the problem. It’s unfortunate that too many organizations have a “heads must roll” approach that doesn’t allow people to make mistakes and learn. A single mistake is a learning opportunity. Making the same mistake over and over is a problem that must be dealt with.

“Maybe Todd can talk about his ideas around the importance of mentoring in a future blog entry as this is where EA collectively is weak and declining.” Personally, I think it’s a good practice to always have some amount of your enterprise architects’ time dedicated to project mentoring. Don’t assign them as members of the project team where the project manager controls their tasks; rather, encourage them to actively work with the project team, keep up to date on what it is doing, and look for opportunities to help. The most important thing, however, is to have an attitude of contributing the help that is needed, rather than contributing your own wisdom. If you come in pontificating, going off on tangents, and expressing an “I know better” attitude, you’ll only get resentment. If, instead, you seek first to understand, as Stephen Covey suggests, you’ll have much better luck. While I was working as a consultant, I had a client who indicated that what they really needed was a mentor. For some consultants, this would have been perceived as the kiss of death, because it can result in an open-ended, warm-body engagement, without clear expectations and deliverables. There’s a lot of risk when expectations aren’t clear and can change on a moment’s notice. In reality, the engagement was simply to listen and then offer suggestions and advice, either to confirm what they already knew but lacked the confidence to go after with conviction, or to suggest things that they might not have thought about. It’s not an easy task to do, but it is absolutely critical. I think an architect who is willing to stand by his or her strategy and see it through to completion, not necessarily from a hands-on perspective, but from a mentoring and guidance perspective, can build far more trust.

Tweeting…

In case you didn’t see it at the bottom of my last post, I’m now on Twitter. Just what they needed, one more user to contribute to their capacity problems. Anyway, it only took two days and I’m hooked. Thanks Mike. You can follow me at http://www.twitter.com/toddbiske. In the meantime, thanks to a failed thumb drive (that’s what I get from relying on the vendor freebies from conferences), you’ll have to wait another day for my next post.

Clarity of Purpose

Do you have clarity of purpose in your job, your projects, your teams, your committees? I’m seeing more and more that lack of clarity in purpose is a very common problem, and one that frequently goes unnoticed until things are in a very bad state.

Why don’t we pick up on this? Human nature certainly plays a part. We go through life being told what to do without being told why. Some things need to be done on trust: trust that the person giving the direction understands the purpose. If they don’t, then the problem can begin. Another factor is that I don’t believe we’re a community of Wallys. We like to be doing something, we like to be productive, and we like to have a purpose. Therefore, if we haven’t been given one, we’ll probably make one up. Unfortunately, in a team setting, my perception of purpose may differ from my teammates’, which has the potential to create tension (there’s good tension and bad tension, but that’s a subject for another day). My teammate and I may have the same perception of purpose, but that may be different from someone outside of the team. That’s an even more dangerous situation, because now the team thinks it is doing good work, but the perception of an outsider is exactly the opposite.

Take my job: enterprise architecture. Most EA’s I know, myself included, would consider themselves big picture thinkers. Our purpose, however, is not just to establish strategic direction, but to ensure that strategic direction is followed. If all we do is create Visio diagrams and PowerPoint decks, and don’t also include planning, communication, and mentoring, are we a success? All of the outsiders may talk to an enterprise architect and walk away thinking that person is brilliant and has a great vision, but if their purpose is also to get the organization there, and the organization isn’t moving any closer, that’s a problem.

Finally, I think it’s very easy to lose sight of our purpose as we get bogged down in the day-to-day efforts. It’s important to go back and review your purpose on a regular basis and make sure you’re staying on track and completing all aspects of it, and if not, seek help. Get a mentor or coach, recognize your strengths and weaknesses, and take the necessary steps to address them. If the current path isn’t working, something needs to change. If you don’t change, neither will the outcome.

FYI: I’ve signed up for Twitter. I’m somewhat skeptical about it, but I figured the only way of understanding whether it’s worth my time or not is to try it. My id is toddbiske, follow me here.

Comments on TUCON 2008 Podcast

Dana Gardner moderated a panel discussion at Tibco’s User Conference (TUCON) on Service Performance Management and SOA. There were some great nuggets in this session, I encourage you to listen to the podcast or read the transcript. The panelists were Sandy Rogers of IDC, Joe McKendrick, Anthony Abbattista of Allstate, and Rourke McNamara of TIBCO.

First, Sandy Rogers of IDC commented that what she finds interesting “is that even if you have one service that you have deployed, you need to have as much information as possible around how it is being used and how the trending is happening regarding the up-tick in the consumption of the service across different applications, across different processes.” I couldn’t agree more on this item. I have seen firsthand the value in collecting this information and making it available. Unfortunately, all too often, the need for this is missed when people are looking for funding. Funding is focused on building the service and getting it out the door on-time and on-budget, and operational concerns are left to classic up/down monitoring that never leaves the walls of IT operations. We need to adjust the culture so that monitoring of usage is a key part of project success. How can we make any statements on the value of a service, or any IT solution for that matter, if we aren’t monitoring how that service is being used? For example, I frequently see projects that are proposed to make some manual process more efficient. If that’s the value play, are we currently measuring the cost of the manual activity, and how are we quantifying the cost of doing it the new way? Looking at the end database probably isn’t good enough, because that only shows the end results of processing, not the pace of processing. Automating a process enables you to process more, but if demand is stable, the end result will still look the same. The difference lies in the fact that people (and systems) have more time available for other activities.
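To make that concrete, here is a minimal sketch of the kind of per-consumer usage instrumentation a service could emit from day one. The names (`UsageMonitor`, `get_customer`) are hypothetical; a real deployment would push these measurements to a monitoring system rather than hold them in memory.

```python
import time
from collections import defaultdict


class UsageMonitor:
    """Records per-consumer call counts and latencies for a service."""

    def __init__(self):
        self.calls = defaultdict(int)         # consumer id -> invocation count
        self.total_time = defaultdict(float)  # consumer id -> cumulative seconds

    def record(self, consumer, elapsed):
        self.calls[consumer] += 1
        self.total_time[consumer] += elapsed

    def report(self):
        # Summarize who is consuming the service, and how heavily.
        return {
            c: {"calls": n, "avg_secs": self.total_time[c] / n}
            for c, n in self.calls.items()
        }


monitor = UsageMonitor()


def get_customer(consumer_id, customer_id):
    """A hypothetical service operation, instrumented for usage tracking."""
    start = time.perf_counter()
    try:
        return {"id": customer_id}  # stand-in for the real lookup
    finally:
        monitor.record(consumer_id, time.perf_counter() - start)
```

Feeding `report()` into a dashboard is what turns up/down monitoring into the consumption trending Sandy describes.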

Sandy went on to state:

They (organizations) need a lot more visibility and an understanding of the strains that are happening on the system, and they need to really build up a level of trust. Once they can add on to the amount of individuals that have that visibility, that trust starts to develop, more reuse starts to happen, and it starts to take off.

Joe picked up on this, stating “that the foundation of SOA is trust.” No arguments here. If the culture of the organization is one of distrust, I see them as having very slim chances of having any success with SOA. Joe correctly called out that a lot of this hinges on governance. I personally believe that governance is how an organization changes behavior and culture. Lack of trust is a behavior and culture issue. Only by clearly stating what the desired behavior is and establishing policies that create that behavior can culture change happen.

Anthony provided a great anecdote from the roll-out of their ESB, stating that they spent 18 months justifying its use and dealing with every outage starting with someone saying, “TIBCO is down.” In reality, it was usually some back-end service or component being down, but since the TIBCO ESB was the new thing, everyone blamed it. By having great measurements and monitoring, they were able to get to root cause. I had the exact same situation at a prior company, and it was fun watching the shift as people blamed the new infrastructure, and I would say, “No, it’s up, and the metrics it has collected make me think the problem is here.”

A bit later in the podcast, Joe mentioned a conversation with Rourke earlier in the day, commenting that “predictive analytics, which is a subset of business intelligence (BI), is now moving into the systems management space.” This sounds very familiar…

Rourke also made a great comment when referring to a customer who said “their biggest fear is that their SOA initiative will be a victim of its own success.” He went on to say:

That could make SOA a victim of its own success. They will have successfully sold the service, had it reused over and over and over and over again. But, then, because of that reuse, because they were successful in achieving the SOA dream, they now are going to suffer. All that business users will see from that is that “SOA is bad,” it makes my applications more fragile, it makes my applications slow down because so many people are using the same stuff.

That was a great point. SOA, if it is successful, should result in an increase in the number of dependencies associated with an IT solution. Many people shudder at that statement, but the important thing is that there should be those dependencies. What’s bad is when those dependencies aren’t effectively managed and monitored. The lack of effective management results in complicated, ad hoc processes that give the perception that the technology landscape is overly complex.

This was one of the better panel discussions I’ve heard in a while. I encourage you to give it a listen.

Integration Competency Centers and SOA

Lorraine Lawson of IT Business Edge had a post last week that linked to my previous posts on Centers of Excellence and Competency Centers entitled, “The Best Practice That Companies Ignore.” In this article, she references an eBizQ survey that revealed that only 9% of respondents had a competency center or center of excellence. While she wasn’t surprised at this, she was surprised at recent comments from Ken Vollmer of Forrester that said the same is true for Integration Competency Centers, a concept that has been around for several years. In her discussion with Ken, she states he indicated that “any organization with mid-to-high-level integration issues could benefit from an ICC.” My take on the discussion was that Ken feels that every mid to large organization should have one (my opinion, neither he nor Lorraine stated this).

The real issue I had with some of the justifications for having an ICC was an underlying assumption that integration is a specialized discipline. While this was the case 8-10 years ago, I think we’ve made significant progress. I actually think there is a specific detriment that an ICC can have to an SOA effort. When an ICC exists, integration is now someone else’s problem. I worry about my world, and I leave it up to the integration experts to make my world accessible to everyone else. It’s this type of thinking that will doom an SOA effort, because everyone’s first concern is themselves, not everyone else. To do SOA right, your service teams should be consumer-focused first.

Regarding ICCs, the reason I don’t think there is broad adoption of the concept is that the majority of companies, even large enterprises, only have one or two major systems that represent 80% of the integration effort, typically either mainframe integration or ERP integration. Companies that have grown via acquisition may have a much more difficult problem with multiple mainframes, multiple ERP systems, etc., and for them, ICCs are a good fit. I just don’t think that’s 80% of the mid-to-large businesses.

The last piece of the message, and where she linked to my posts, deals with whether or not the ICC should be temporary. Ken’s comment was that there are always new integration tools coming out, and the ICC should be responsible for them. I don’t agree with this. There are also new development tools coming out, and I don’t see companies with a development competency center. Someone does have to be responsible for integration technologies, but this could easily be part of the responsibilities of a middleware technology architect.

Applying the same argument to SOA, again, if it’s technology-focused, I don’t buy it. If we get into the space of SOA Advocacy and Adoption, then I think there’s some value. Clearly, individual projects building services does not constitute SOA. Given that, who is guiding the broader SOA effort? Perhaps what is ultimately needed is a SOA Advocacy Center or SOA Adoption Center that is responsible for seeing it forward. There’s no formula for this, though. A person dedicated to being the SOA Champion with excellent relationships in the organization could potentially do this on their own. Ultimately, this becomes just like any other strategic initiative. To achieve the strategy, the organization must put proper leadership in place. If it’s one person, great. If it’s a standing committee, great. Just as long as it is positioned for success. Putting one person in charge who lacks the relationships won’t cut it, but putting a committee together to establish those relationships will. Whether it’s permanent or not depends on whether the activities can become standard practice, or if there is a continual need for leadership, guidance, and governance.

Redefining Banking…

Here’s an idea for some entrepreneur to go and run with, or even better, for someone to read and go, “That’s already been done! Go visit -blank-.” I was thinking about banking, budgets, and money management and just how inconvenient it is to move money around between various accounts. I’ve been a Quicken user for over a decade now, and it’s frustrating that there are still financial institutions that don’t download into Quicken easily. The second thing that occurred to me is that it still seems too difficult to move money around between different accounts. The average person can have a checking account, a savings account, a retirement account, an investment account (which may also require having accounts with each company that manages a mutual fund in that account), plus accounts for their family, credit cards, etc.

I thought back to when I was growing up and remember my Mom having a collection of envelopes at her desk, each of them containing the cash for the month for a particular category. This was how she created her budget. I also found out later about some of the other tricks my parents used for handling things like Christmas and vacations. They had additional bank accounts that were reserved for these expenses that were going to be larger and thus required a longer time period of savings. They’d rather get interest on it from a bank somewhere than keep it in an envelope on the desk.

Then it clicked. Why can’t a financial institution today provide an electronic equivalent of the envelopes my Mom used years ago? All the pieces should be there. We leverage electronic fund transfers every day; there’s absolutely no reason that this can’t be made more consumer friendly so that performing a transfer from my checking account into a custodial investment account for my kids is as simple as doing a “Transfer Funds” operation in Quicken. Does anyone make this easy today? I know there’s lots of room for improvement with my financial institution. What about budgets? If you pay cash for everything, you may as well stick with the envelopes. If you pay with credit card, once again, the technology is there. I just entered an expense report at work and the system has visibility into expenses charged to my corporate card. It was able to pre-populate the category of the expense by looking at who the payee was. Take this a step further, and it would be great if I could place controls over the charges (this would be very good for debit cards) so that I would be warned (or even stopped) if I was going to blow the budget with a particular purchase. This would be great for kids, as well, where a parent could give them a limited access debit card that could only spend up to a budgeted amount and only for certain categories of expenses.

What about those long term items like vacations and saving for Christmas presents? Why on earth should I need to open up another account to do this? Can’t the bank allow me to create a “virtual” account where money can be transferred in and out, but checks and debit card transactions couldn’t go against it?
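As a sketch of how little machinery the idea actually requires, here is a hypothetical model of a single real account partitioned into virtual envelopes. All names and rules here are illustrative, not any bank’s actual product: only the spendable envelope can be drawn on by checks or debit cards, while the others hold earmarked money.

```python
class EnvelopeAccount:
    """One real bank account partitioned into named virtual envelopes.

    Only the "available" envelope can be drawn on by checks or debit
    cards; the others hold money earmarked for vacations, Christmas, etc.
    """

    def __init__(self, balance):
        self.envelopes = {"available": balance}

    def create_envelope(self, name):
        self.envelopes.setdefault(name, 0.0)

    def transfer(self, src, dst, amount):
        """Move earmarked money between envelopes, never overdrawing one."""
        if self.envelopes.get(src, 0.0) < amount:
            raise ValueError(f"insufficient funds in {src!r}")
        self.envelopes[src] -= amount
        self.envelopes[dst] = self.envelopes.get(dst, 0.0) + amount

    def debit(self, amount):
        """A check or debit-card transaction; it can only draw on the
        spendable envelope, never on earmarked money."""
        if self.envelopes["available"] < amount:
            raise ValueError("would overdraw spendable balance")
        self.envelopes["available"] -= amount
```

The interesting part is that the bank still holds one balance and earns one stream of interest; the envelopes are purely a bookkeeping layer on top.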

It’s certainly true that one can probably execute sound financial management with today’s tools, but it just seems to me that it can be made much easier, and if it’s easier, maybe more people will have better luck with it. So, what do you think? I’m of the opinion that someone out there has to be doing this already; it seems too obvious for someone not to be jumping all over it. If someone has, please comment or send me mail. Maybe one of the new Internet banks is doing this today. If not, well, maybe you can thank me for the idea and give me a free toaster or something when I open my account.

Think Orchestration, not BPEL

I was made aware of this response from Alex Neihaus of Active Endpoints on the VOSibilities blog to a podcast and post from David Linthicum. VOS stands for Visual Orchestration System. Alex took Dave to task for some of the “core issues” that Dave had listed in his post.

I read both posts and listened to Dave’s podcast, and as is always the case, there are elements of truth on both sides. Ultimately, I feel that the wrong question was being asked. Dave’s original post has a title of “Is BPEL irrelevant?” and the second paragraph states:

OK, perhaps it’s just me but I don’t see BPEL that much these days, either around its use within SOA problem domains I’m tracking, or a part of larger SOA strategies within enterprises. Understand, however, that my data points are limited, but I think they are pretty far-reaching relative to most industry analysts’.

To me, the question is not whether BPEL is relevant or not. The question is how relevant is orchestration? When I first learned about BPEL, I thought, “I need a checkbox on my RFP/RFIs for this to make sure import/export is supported,” but that was it. I knew the people working with these systems would not be hand-editing the XML for BPEL; they’d be working with a graphical model. To that end, the BPMN discussion was much more relevant than BPEL.

Back to the question, though. If we start talking about orchestration, we get into two major scenarios:

  1. The orchestration tool is viewed as a highly-productive development environment. The goal here is not to externalize processes, but rather to optimize the time it takes to build particular solutions. Many of the visual orchestration tools leverage a significant number of “actions” or “adapters” that provide a visual metaphor for very common operations such as data retrieval or ERP integration. The potential exists for significant productivity gains. At the same time, many of the things that fall into this category aren’t what I would call frequently changing processes. The whole value-add of being able to change the process definition more efficiently really doesn’t apply.
  2. The orchestration tool is viewed as a facility for process externalization. Here’s the scenario where the primary goal is flexibility in implementing process changes rather than developer productivity. I haven’t seen this scenario as often. In other words, the space of “rapidly changing business processes” is debatable. I certainly have seen changes to business rules, but not necessarily to the process itself. Of course, on the other hand, many processes aren’t well-defined to begin with, so the culture is merely reacting to change. We can’t say what we’re changing from or to, but we know that something in the environment is different.

So what’s my opinion? I still don’t get terribly excited about BPEL, but I definitely think orchestration tools are needed for two reasons:

  1. Developer productivity
  2. Integrated metrics and visibility

Most of the orchestration tools out there are part of a larger BPM suite, and the visibility that they provide on how long activities take is a big positive in my book (but I’ve always been passionate about instrumentation and management technologies). As for process externalization, the jury is still out. I think there are some solid domains for it, just as there are for things like complex event processing, but it hasn’t hit mainstream yet at the business level. It will continue to grow outward from the developer productivity standpoint, but that path is heavily focused on IT system processes, not business processes (just like OO is widely used within development, but you don’t see non-IT staff designing object models very often). As for BPEL, it’s still a mandatory checkbox, and as we see separation of modeling and editing from the execution engine, its need may become more important. At the same time, how many organizations have separate Java tooling for when they’re writing standalone code versus writing Java code for SAP? We’ve been dealing with that for far longer, so I’m not holding my breath waiting for a clean separation between tools and the execution environment.

The Real SOA Governance Dos and Don’ts

Dave Linthicum had a recent post called SOA Governance Dos and Don’ts which should have been titled, “SOA Governance Technology Selection Dos and Don’ts.” If you use that as the subject, then there’s some good advice. But once again, I have to point out that technology selection is not the first step.

My definition of governance is that it is the people, policies, and processes that ensure desired behavior. SOA governance, therefore, is the people, policies, and processes that ensure desired behavior in your SOA efforts. So what are the dos and don’ts?

  • Do: Define what your desired behavior is. It must be measurable. You need to know whether you’re achieving the behavior or not. It should also be more than one statement. It should address both the behavior of your development staff and the run-time behavior of the services (e.g. we don’t want any one consumer to be able to starve out other consumers).

  • Don’t: Skip that step.
  • Do: Ensure that you have people involved with governance who can turn those behaviors into policies.
  • Don’t: Expect that one set of people can set all policies. As you go deep in different areas, bring in appropriate domain experts to assist in policy definition.
  • Do: Document your policies.
  • Don’t: Rely on the people to be the policies. Your staff has to know what the policies are ahead of time. If they have to guess what some reviewer wants to see, odds are they’ll guess wrong, or the reviewer may be more concerned about flaunting authority rather than achieving desired behavior.
  • Do: Focus on education on the desired behavior and the policies that make it possible.
  • Don’t: Rely solely on a police force to ensure compliance with policies.
  • Do: Make compliance the path of least resistance.
  • Don’t: Expect technologies to define your desired behavior or policies that represent it.
  • Do: Use technology where it can improve the efficiency of your governance practices.
There’s my take on it.
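To illustrate how a measurable desired behavior becomes an enforceable run-time policy, the consumer-starvation example above could be expressed as a per-consumer token bucket. This is a sketch under assumed names (`ConsumerRateLimiter` is not any particular governance product’s API):

```python
import time


class ConsumerRateLimiter:
    """Token bucket per consumer: caps any single consumer's request
    rate so it can't starve a service's other consumers."""

    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec  # tokens replenished per second
        self.burst = burst        # maximum tokens a consumer can bank
        self.buckets = {}         # consumer id -> (tokens, last refill time)

    def allow(self, consumer, now=None):
        """Return True if this consumer's request is within policy."""
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(consumer, (float(self.burst), now))
        tokens = min(float(self.burst), tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self.buckets[consumer] = (tokens - 1.0, now)
            return True
        self.buckets[consumer] = (tokens, now)
        return False
```

A gateway or ESB would call `allow()` on each request and reject or queue anything over the limit; the point is that the policy is an explicit, measurable statement of the desired behavior rather than a reviewer’s judgment call.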

Gartner EA: EA and SOA

This is my last post from the summits (actually, I’m already at the airport). This morning, I participated in a panel discussion on EA and SOA as part of the EA Summit with Marty Colburn, Executive VP and CTO for FINRA; Maja Tibbling, Lead Enterprise Architect for Con-way; and John Williams, Enterprise Architect from QBE Regional Insurance. The panel was jointly moderated by Dr. Richard Soley of the OMG and SOA Consortium and Bruce Robertson of Gartner. It was another excellent session in my opinion. We all brought different perspectives on how we had approached SOA and EA, yet there were some apparent commonalities. Number one was the universal answer to what the most challenging thing was with SOA adoption: culture change.

There were a large number of questions submitted, and unfortunately, we didn’t get to all of them. The conference director, Pascal Winckel (who did a great job by the way), has said he will try to get these posted onto the conference blog, and I will do my best to either answer them here on my blog or via comments on the Gartner blog. As always, if you have questions, feel free to send them to me here. I’d be happy to address them, and will keep them all anonymous, if so desired.

Gartner EA: Case Study

I just attended a case study at the summit. The presenter requested that their slides not be made available, so I’m being cautious about what I write. There was one thing I wanted to call out, which was that the case study described some application portfolio analysis efforts and mapping of capabilities to the portfolio. I’ve recently been giving a lot of thought to the analysis side of SOA, and how an organization can enable themselves to build the “right” services. One of the techniques I thought made sense was exactly what he just described with the mapping of capabilities. Easier said than done, though. I think most of us would agree that performing analysis outside of the context of a project could provide great benefits, but the problem is that most organizations have all their resources focused on running the business and executing projects. This is a very tactical view, and the usual objection is that as a result, they can’t afford to do a more strategic analysis. It was nice to hear from an organization that could.

Gartner EA: The Management Nexus

Presenters: Anne Lapkin and Colleen Young

One thing all of the presenters in the EA Summit are very good at doing is using consistent diagrams across all of their presentations. This is at least the third presentation where I’ve seen this flow diagram showing linkage between business goals and strategy, and business planning and execution. Unfortunately, Anne points out that the linkage is where things typically break down.

Colleen is now discussing strategic integration, which begins with an actionable articulation of business strategy, goals and objectives. From there, she recommends a standardized, integrated, results-based management methodology. As a result, she claims that we will see exponentially greater benefits from enterprise capabilities and investments.

Anne is speaking again and emphasizing that we need a unified contextual view. This consists of a goal, which is one level deeper than the “grow revenues by XY%” which includes a future end state with a timeline and measurable targets, principles that establish the desired behavior and core values, and relationships.

Colleen now has a great slide up called, “The Implication of ‘Implications’.” The tag line says it all- “Unclear implications lead to inconsistent assumptions and independent response strategies that inevitably clash.” Implications that must be investigated include financial implications, business process implications, architecture implications, cultural change implications, and more. All parties involved must understand and agree on these implications.

A statement Colleen just made that resonates with my current thinking is, “Based upon these implications, what do I need to change?” All too often, we don’t stop to think about what the “change” really is. Work starts happening, but no one really has a clear idea of why we’re doing it, only an innate trust that the work is necessary and valuable. If the earlier planning activities have made these goals explicit, the execution should be smoother, and when bumps in the road are encountered, the principles are right there to guide the decision-making process, rather than relying on someone’s interpretation of an undocumented implication.

Once again, this was a good session. I know I’ve commented that a few sessions could have been a bit more pragmatic or actionable; this one definitely achieved that goal. I think the attendees will be able to leave with some concrete guidance that they can turn around and use in their organizations.

Gartner EA: Strategic Planning Tools and Techniques

Presenter: Richard Buchanan

The first topic Richard is covering is the need for enterprise architects to master strategic thinking. His current slide is consistent with an earlier talk today, showing that enterprise strategy is at the intersection of three disciplines: Enterprise Strategy and Planning, Enterprise Architecture, and Enterprise Portfolio Management. He states that enterprise architecture must translate business vision and strategy into effective enterprise change. He’s discussing how a budget and the organization chart are not part of the business strategy, pointing out that a budget should be a downstream deliverable derived from the business strategy. Great point. His definition of strategy includes an organization’s environment, goals, objectives, major programs of action, and the resource allocation choices to execute them.

The next topic he is covering is the categories of tools and techniques that are used in developing a business strategy. These are not software tools, as the first one he’s showing is Porter’s 5 Forces Model (this is the second time Michael Porter has been referenced at the Summit). He’s challenging us to go and find the people in our organization that are looking at these things. Good advice. There’s no doubt that if you want to do strategic planning, you need to be looking at these five forces, and there’s a good chance that someone at the company (probably outside of IT) is already doing this. The same thing holds true for the other categories of tools that he went through.

The final point he’s covering is how to leverage these strategic tools within the EA process. To some extent this is motherhood and apple pie, but it’s very good advice, especially knowing that many EA’s have grown out of the world of application development and may still be very technology focused. As a result, it’s entirely possible that the EA team has never read the company’s annual report. It’s even more likely that EA hasn’t seen things like competitive analysis documents. If an EA doesn’t understand how a company competes, how can they make appropriate decisions? Speaking very broadly and citing Michael Raynor’s earlier presentation, do you know whether your company differentiates on cost or on products? Both of those can have significant impacts on how information technology is leveraged. A company that differentiates based on product excellence and customer service must have significantly better technology for understanding their customers than a company that simply tries to be the lowest cost provider in the marketplace.

My final thoughts: There’s not much to disagree with in this presentation. I think he paints a great picture of what many of us would like to be doing. The challenge I suspect many attendees face is that our EA organizations, as a previous presenter put it, “are mired in the world of technology architecture.” Somehow, we need to find a seat at the strategic planning table so that when we ask about some of these artifacts, everyone knows their importance rather than stopping us in our tracks and asking, “Why do you need it?”

Gartner EA: Context Delivery Architecture

Presenter: William Clark

I’m looking forward to this talk, as it’s a new area for me. I don’t remember who told me this, but the key to getting something out of a conference is to go to sessions where you have the opportunity to learn something and you’re interested in the subject. That’s why I’m avoiding sessions on establishing enterprise technology architects. I’ve been doing that for the past five years, so the chances are far lower that I’ll learn something new there than in a session like this one, where it’s an emerging space and I know it’s something that will be more and more important in my work over the next few years. The only downside is that I’m now on my fourth day in Orlando, which is starting to surpass my tolerance for sitting and listening to presentations.

He’s started out by showing that the thing missing from the digital experience today is “me.” By me, he implies the context of why we’re doing the things that we’re doing, such as “where am I,” “what have I done,” “who are you talking to,” etc. He points out the importance of user experience in the success and failures of projects, especially now in the mobile space.

Some challenges he calls out with incorporating context into our systems:

  • Blending of personal contexts and business contexts. For example, just think of how your personal calendar(s) may overlap with your business calendar.
  • Managing Technical Contexts: What device are you using, what network are you connecting from, etc. and what are the associated technical capabilities available at that point?
  • Context timing: The context is always in a state of flux. Do I try to predict near-term changes to the context, do I try to capture the current context, or do I leverage the near-past context (or even longer) in what is shown?
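To make these three challenges concrete, here is a minimal sketch, using entirely hypothetical field names of my own invention, of what a context record blending personal, business, and technical facets might look like, with a capture time so a consumer can decide between acting on the current context or falling back to near-past context:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ContextSnapshot:
    """One observation of a user's context; all field names are illustrative."""
    captured_at: datetime
    location: Optional[str] = None        # personal context: "where am I"
    calendar_event: Optional[str] = None  # blended personal/business calendars
    device: Optional[str] = None          # technical context: device in use
    network: Optional[str] = None         # technical context: connection type

    def is_stale(self, max_age: timedelta = timedelta(minutes=5)) -> bool:
        # Context timing: decide whether this snapshot is too old to act on,
        # forcing the consumer to fall back to near-past context or prediction.
        return datetime.now() - self.captured_at > max_age

# A two-minute-old snapshot is still within the five-minute freshness window.
snap = ContextSnapshot(
    captured_at=datetime.now() - timedelta(minutes=2),
    location="Orlando",
    device="phone",
    network="cellular",
)
print(snap.is_stale())  # False
```

The interesting design decisions are all in that `is_stale` check: how long personal context stays useful is very different from how long technical context does, which is the timing challenge he describes.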

It’s always a sign of a good presentation when the speaker anticipates questions the audience might ask. I was just about to write down a question asking him whether he thinks a marketplace for context delivery will show up, and he started talking about exactly that. This is a really interesting space: there’s historical context that can be captured and saved, and there’s an expense associated with doing so, so it makes sense that the information broker market that currently sells marketing lists, etc. will expand to become on-demand context providers with B2B-style integrations.

All in all, I see parallels between this space and the early days of business intelligence. The early adopters are out there, trying to figure out what the most valuable areas of “context” are. Unlike BI, however, there are so many technology changes introducing new paradigms, such as location-aware context on cellphones, that there’s even more uncertainty. I asked how long it will be before some “safe” areas are established for companies to begin leveraging this, but his answer was that there are many dimensions contributing to that tipping point, so it’s very hard to make any predictions.

This was a good presentation. I think he gave a good sampling of the different data points that go into context, some of the challenges associated with it, and the technical dynamics driving it. It’s safe to say that we’re not at the point where we should be recommending significant investments in this, but we are at the point where we should be doing some early research to determine where we can leverage context in our solutions and subsequently make sound investment decisions.

Gartner EA: Effective IT Planning

Presenter: Robert Handler

He’s showing a slide on IT Portfolio Management Theory, and how there is a discovery phase, a project phase, and an asset management phase. Discovery explores new technology, projects implement new technology, and the asset management phase operates it once it’s in production. Next, he’s shown an Enterprise Architecture diagram and discusses the whole current state/future state approach, risk tolerance principles, etc. He now has a slide with a summary of three areas: IT Strategic Planning, Project & Portfolio Management, and Enterprise Architecture. He shows that all three of these have overlapping goals and efforts that could be better aligned because at present, they tend to exist in vacuums.

He’s now talking about some of the issues with each of these disciplines. First, he used Wikipedia’s definition of technology strategy to show the challenge there (Wikipedia claims it’s a document created by the CIO; the audience chuckled at that). On to Project and Portfolio Management: he’s calling out that only 65% of organizations cover the entire enterprise in their portfolio, and that most PPM efforts are focused on prioritizing projects. On the EA side, he calls out that most efforts are mired in the creation of technical standards.

His recommendations for creating a win/win situation are:

  1. Collectively maintain, share, and use business context.
  2. Use EA to validate strategic planning and improve portfolio management decisions.
  3. Use portfolio management to generate updates against IT strategy and EA design and plans.

He proceeded to go into detail on each of these. Overall, I think this was a good 100-level presentation to tell an audience of EAs that they can’t ignore IT strategy efforts and PPM efforts; they need to be aligned with them. It could have been a bit more pragmatic in emphasizing how one would go about doing this.

Gartner EA: Michael Raynor

Presenter: Michael Raynor, Deloitte Consulting

This session is from Michael Raynor, author of “The Strategy Paradox.” The title is “The Accidental Strategist: Why uncertainty makes EA central to strategy.” He feels that it is ironic that there is a separation between the formulation of strategy and the implementation of strategy. He doesn’t agree with this approach. He feels that formulation and implementation should be a more interactive process and less linear, minimizing the strategic risk that an organization takes.

On this slide, his observation is that strategic uncertainty has been ignored. He used an example of the search engine competition of years ago and how, at least in part, Google won the space by making the best guess with regards to their strategy. It’s not that AltaVista made poor choices, they simply guessed wrong with what would be the most important factors in that marketplace. There is uncertainty associated with strategy.

An interesting anecdote he’s showing us now is that organizations with a high commitment to strategy, which are often the companies we try to emulate, have an extremely high chance of failure, while companies with a relatively low commitment to strategy have a very low failure rate. To me, this seems to be an example of low risk/low return versus high risk/high return.

Extreme positions help customers know what to expect. Companies in the middle “wander around like a stumbling drunk.” His example of the continuum was Wal-Mart at one end (cost differentiation: given the choice of making it better or making it cheaper, Wal-Mart makes it cheaper), Nordstrom at the other (product differentiation: Nordstrom makes it better), and Sears in the middle. Margins are best at the extremes and squeezed in the middle, yet most companies are in the middle. The reason is that the extremes are winner-take-all: K-Mart can’t compete with Wal-Mart, and Lord & Taylor couldn’t compete with Nordstrom, yet Sears and JCPenney can both co-exist just fine. Companies in the middle have chosen to minimize their strategic risk; companies at the extremes take on more risk in their strategic choices.

He’s now discussing Microsoft, explaining that Microsoft manages strategic risk through its portfolio and an understanding that things will change over time. This is different from diversification, where the profits of one division cover the losses of another. Instead, if what’s important to revenue changes, the company is positioned to quickly leverage it. For example, if consolidation of computing in the home centers on the gaming console rather than the PC, Microsoft has the Xbox.

Another good example he’s presenting is Johnson & Johnson. Their Ethicon Endo-Surgery division sells colonoscopes, an area that previously differentiated on the technical excellence of the product. For growth, however, the problem was that not enough people were getting colonoscopies. In the US and Canada, a colonoscopy is a sedated procedure, which greatly increases its cost. To manage the strategic risk that selling colonoscopes might become a pain-management issue rather than a technical-excellence issue, Johnson & Johnson’s VC arm invested in a company that was advancing sedation technologies. (Hopefully, I got this recap right…)

The metaphor that he believes captures how to manage strategic risk is not evolution, but gene therapy. That is, if the environment changes in certain ways, the genes can be recombined in new ways to leverage that environment appropriately. Good talk!

Disclaimer
This blog represents my own personal views, and not those of my employer or any third party. Any use of the material in articles, whitepapers, blogs, etc. must be attributed to me alone, without any reference to my employer. Use of my employer’s name is NOT authorized.