Tuesday, November 1, 2011

Journey into Semantic Space

How context affects meaning, and how meaning shifts as it passes between situational participants....

This is a more generic view on the pattern alluded to in previous posts....

B.I. Tools don't replace human judgment

Along the earlier point of Augmented Intelligence, it should be re-emphasized that even the fanciest Business Intelligence tool and infrastructure does not replace critical thinking by humans.

Currently there is a bubble developing around visualization techniques. The mere visualization of data is hailed as "Business Intelligence", while the semantic layer ("what does this data mean?") is still neglected.

Computers' main purpose is to empower human thinking, to speed up tedious tasks and help managing vast amounts of data that would overwhelm the human mind. But the actual judgment, as to what the data reveals, is still a human responsibility.

Traditional decision support systems wait for humans to pose questions for the DSS to answer. More current approaches let the software/data provide pro-active insights human users were not even aware of. But how to use these insights is entirely up to the business user.

If business analytics solutions were powerful enough to actually make the decisions as well, then business people wouldn't be needed anymore. And those decision engines had better be implemented by business-savvy developers.

Friday, April 22, 2011

Ambient Analytics

A while back I talked about Augmented Intelligence, tools that help us humans make ever more complex decisions in our increasingly detailed and networked world. Business Intelligence is just a sub-section of that, with businesses being early adopters of tools that help efficiency, because they have the funding and a sense for investing strategically. Eventually these tools go mainstream and become available for the masses. Just think about how computers evolved out of the corporate realm into everyday use by individuals.

Microsoft has this slogan, "Business Intelligence for the Masses", and was pretty successful in taking the tools & interfaces of traditional data warehousing & reporting to a more intuitive level, so more people could understand and adopt the technology. The Microsoft BI Stack doesn't really do anything groundbreakingly new from a functional solution perspective, but it shines in being fairly integrated and comparatively intuitive to use.

Along this philosophy, I believe that the next step in augmented intelligence will be a push for "Analytics for the Masses". This thought inspired my online identity as "Analytics-To-Go".

While the nitty-gritty under the covers is based on heavy statistical algorithms, usability will become a huge success factor. And that does not just mean intuitive user interfaces. It also pertains to reliability, the confidence factor. A tool can spit out all kinds of recommendations and answers, but if the user doesn't trust it, there will be no use, no acceptance, no adoption.

The developments to address the intuitive user interface are already under way, for example car companies working with navigation solution providers on voice-controlled human/machine interaction. This in itself is an opportunity-rich field: voice recognition (tone, mode, speed, dialect/accent, etc.).

Yet, there is a whole other layer of complexity to be solved for the results of the augmented intelligence system to be useful to the user. Keyword SEMANTICS.

Semantics is about context, "what does it mean?". A businessman tells his car computer: "call my wife and tell her I am running late for dinner". The system will have to figure out who the wife is, how to reach her, and what "late" means (inferred ETA?).
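To make that concrete, here is a minimal sketch of the context-resolution step. All the data structures (contacts, calendar, the fixed ETA) are invented placeholders; a real system would pull these from the phone's contacts, the calendar, and the navigation unit.

```python
from datetime import datetime

# Assumed context stores, invented for illustration.
CONTACTS = {"wife": {"name": "Anna", "phone": "+1-555-0100"}}
CALENDAR = {"dinner": "19:00"}

def resolve_request(utterance, eta):
    """Map an ambiguous spoken request onto concrete context."""
    contact = CONTACTS["wife"] if "my wife" in utterance else None   # who is "my wife"?
    event = "dinner" if "dinner" in utterance else None              # which event is meant?
    planned = datetime.strptime(CALENDAR[event], "%H:%M")
    arrival = datetime.strptime(eta, "%H:%M")
    delay = int((arrival - planned).total_seconds() // 60)           # what does "late" mean?
    return {
        "call": contact["phone"],
        "message": f"{contact['name']}, running ~{delay} min late for {event}.",
    }

action = resolve_request("call my wife and tell her I am running late for dinner",
                         eta="19:25")
```

The hard part, of course, is not this lookup logic but populating and maintaining the context stores reliably.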

One way to go is to store all possible information in a huge database, akin to Deep Blue playing chess against Grand Master Kasparov. More advanced, and where the trend is heading, is a Watson/Jeopardy approach: interactive learning, just as humans do. After all, intelligence is not just about putting many "facts" together, but about considering newly evolving information from different sources and putting it into proper context.

That's where sensors and higher-level machine/environment interfaces come in. Just as humans rely on their biological senses to take in information about their surroundings, technology-driven data acquisition will play a pivotal role in Ambient Analytics. Just consider how modern automobiles have sub-systems constantly acquiring environmental information, like proximity to surrounding cars, lane tracking, tire pressure, speed, location, deceleration/momentum, driver sleepiness, and then take this information into account for active driver assistance in augmented vehicle control: pro-active seatbelt tightening, brake-priming, lane departure alerts, blind spot alerts, vibrating the steering wheel to keep drivers alert when they doze off, etc.
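A toy rule layer over such sensor readings might look like the sketch below. The sensor names and thresholds are invented for illustration, not taken from any real vehicle system.

```python
def assistance_actions(readings):
    """Map a snapshot of hypothetical sensor readings to driver-assistance actions."""
    actions = []
    if readings["front_gap_m"] < 10 and readings["speed_kmh"] > 60:
        actions.append("prime_brakes")            # closing fast on the car ahead
    if readings["lane_offset_m"] > 0.5:
        actions.append("lane_departure_alert")    # drifting out of lane
    if readings["driver_alertness"] < 0.3:
        actions.append("vibrate_steering_wheel")  # driver appears to doze off
    return actions

snapshot = {"front_gap_m": 8, "speed_kmh": 90,
            "lane_offset_m": 0.2, "driver_alertness": 0.2}
actions = assistance_actions(snapshot)
```

Real systems would of course learn and adapt these thresholds rather than hard-code them, which is exactly the machine-learning point made below.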

These systems need to be highly adaptable and cannot rely on heavy top-down configuration/programming, or even static update schedules. The science behind this approach is called machine learning. But the important part is that the science becomes usable in a practical scenario, not just functional, but user-acceptable. Again, confidence level drives adoption.

Also to consider: most corporate-driven analytics solutions tend to be rather top-heavy, aimed at strategic insight. This is in line with many businesses' investment horizons. The opportunity in the mass market, though, is more for tactical decision-making assistance (another angle on augmented intelligence). The "for the masses" market breakthrough will be in delivering "just enough, and just right", and not overbuilding or over-engineering, losing perspective of the actual opportunity. Integration of all the tactical solutions will be an evolution that can only emerge from the bottom up, i.e. from adoption and how users apply it. Just like kids rarely ever use their toys the way they were intended, but put them together in new creative ways the adult toy designers never dreamt of. Give end users flexible tools and they will come up with new uses.

Tactical Analytics will be the new killer app!

So the high-level approach to augmented intelligence is to solve the systematic & market challenges of:
  • Usability (e.g. natural language interface, high confidence level)
  • Context (semantics, the situational meaning of things, the user's perspective)
  • Active Machine Learning (tracking changes, reconciling different information sources, applying to contexts)
  • Shift from Strategic to Tactical Analytics (focused opportunities will eventually merge into broader strategic patterns through social adaption, exchange, and user creativity)

The concepts are becoming mature, and the technology is largely there. The opportunity moving forward will be integrating it into useful products, creating the next killer application, just as mobile device adoption exploded when the intuitive touch interfaces, coupled with built-in camera, location awareness (GPS) and network-connected multi-media hub came all together in a handy package. And mobile devices will most likely play a key role in the augmented intelligence evolution.

Thursday, April 14, 2011

Obsolete Top Down Data Modeling!

Just wanted to capture real quick a thought before it escapes me....will elaborate more later. Here goes...

So much effort has been going into designing and building relational data models, often in an information vacuum, without fully understanding (or even having seen) the source data intended to go into that model.

At best, considerable effort went into profiling the source data. Many tools to automate profiling exist. Why not have them generate the data model for us? We entrust tools to help us with predictive modeling. How about having them help us model the database for that which is already there?

If we can look into the future, shouldn't it be easy to look into the past? I say: let's automate data modeling, no more hand-crafted ERDs, no more ivory-tower modeling!
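A minimal sketch of what such bottom-up modeling could look like: profile a handful of real records and derive a candidate table definition from them. The sample rows and inference rules are invented for illustration; real profiling tools use far richer heuristics.

```python
def infer_model(rows):
    """Derive column types and candidate keys from sample rows (toy heuristics)."""
    model = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        if all(isinstance(v, int) for v in values):
            sql_type = "INTEGER"
        elif all(isinstance(v, float) for v in values):
            sql_type = "REAL"
        else:
            sql_type = "TEXT"
        unique = len(set(values)) == len(values)   # all-distinct -> candidate key
        model[col] = {"type": sql_type, "candidate_key": unique}
    return model

rows = [{"id": 1, "name": "Acme", "revenue": 12.5},
        {"id": 2, "name": "Zenith", "revenue": 7.0}]
model = infer_model(rows)
```

With a larger sample, the same idea extends to inferring foreign-key relationships and cardinalities, i.e. the bones of the ERD, directly from the data that is already there.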

Once again, the rift between intent & need. We master building machines that make things happen fast, and software that automates processes. Now let's focus on having them do the right thing.

Sunday, April 10, 2011

Testing Perspectives

Got to love the ambiguity in this headline. I don't mean testing some/the perspectives, rather: "Perspectives on Testing".

In my latest project I had the opportunity to get more involved in a formal testing strategy of our B.I. deliverable. As I tried to prepare myself for the best approach, I realized that there is not much literature on how to properly test business intelligence architectures. While there are plenty of treatises on general software development, most with emphasis on user interfaces or transactional systems, there is not much that covers the qualitative aspects of an end-to-end Business Intelligence solution.

One could argue that there is nothing new under the sun, even in the B.I. space. It is basically just composed of user interface elements, database tiers, and some sort of data processing system (ETL can be seen as the transactional component).

And in the end, the biggest challenge is not the technical/functional component, but the process, the business.
That pattern is not much different than the requirements management arena. In fact, with the popularity of test-driven development (TDD) the tendency is to more closely link requirements with testing anyway.

If we narrow it down to the important aspects of testing, we arrive at something like this:



While test requirements are at the center, as always in life, what they mean, how they fit into the bigger picture is a matter of perspective.




A good testing process, or testers as individual professionals, reconciles user needs with project plan, developed artifacts and what impact they all have on the business. 

In B.I. as well as other business solutions.

To be continued...



Friday, March 11, 2011

Augmented Intelligence

The current state of affairs in Business Intelligence is a rather top-down driven and manual process. 

Let's say a business user is interested in how a business process is performing. If they are lucky, a performance dashboard for monitoring key performance indicators (KPIs) exists. If not, it will have to be developed: a request-response situation through the whole IT/developer/analyst organizational stack. In more sophisticated businesses, possibly with sophisticated tools, the business user can configure new KPIs on the fly, assuming they have a proper understanding of KPI logic and proper tool usage.

Three issues stand out from this approach:
  1. Turn-around time (lack of agility, due to reliance on human-driven process)
  2. Business people's focus gets side-tracked by required tool & database knowledge
  3. Potential for error (as always in overloaded human responsibilities)
So what can we do? How about automating the mundane tasks and leaving humans with the work they are good at? Welcome to "Augmented Intelligence". Instead of going from one extreme (manual process) to the other (Artificial Intelligence), how about a balance: leave the tedious work to computers, and the more fluid, ambiguous information that requires human judgment to people. But how do we make sure that machines don't override human judgment, and vice versa, that humans don't second-guess proven algorithmic correlations of real data?

That's where feedback loops and an iterative approach come in. Delegate & monitor, trust-but-verify, do & review. And track the history of all decisions made. If business changes are applied, keep tracking the old reality and compare it to the new (something like A/B testing in marketing). You can only know whether something works by comparing it to the alternative(s).
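The "trust but verify" loop can be reduced to a very small comparison: keep measuring the old process alongside the new one and compute the lift. The outcome numbers below are made up purely for illustration.

```python
def compare(old_outcomes, new_outcomes):
    """Return the relative lift of the new process over the old baseline."""
    old_mean = sum(old_outcomes) / len(old_outcomes)
    new_mean = sum(new_outcomes) / len(new_outcomes)
    return (new_mean - old_mean) / old_mean   # e.g. 0.10 means 10% improvement

# Hypothetical weekly KPI readings before and after a business change.
lift = compare(old_outcomes=[100, 96, 104], new_outcomes=[110, 112, 108])
```

In practice one would add a significance test before trusting the lift, but the point stands: without tracking the old reality, there is nothing to compare against.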

Make things easy for people, use common interfaces: Web, Email, Wikis, News/Event formats.

Simulate a work-flow they are used to from their personal life: Send messages. Read how-to articles. Get News. Write letters to the editor. Publish their ideas.

Get buy-in from your organization by cultural popularity, make it another social engagement opportunity. Accountability & motivation will evolve naturally as people enjoy these more flexible and bottom-up driven opportunities to make an impact.

Don't have the computers try to do the complicated things humans can easily do, and don't have people do dull tasks that are much better performed by computers.




Tuesday, March 8, 2011

Ambiguity: Requirements vs. Solution

During the course of every Business Intelligence initiative, project teams will encounter the pervasive challenge of ambiguity.

There are various project management approaches to tackle ambiguity, like the agile movement of iterative discovery.

Since B.I. often needs to address business processes, in terms of replicating implied business logic and deriving related metrics, it is interesting to look at a common pattern in the B.I. tool/service acquisition process:






Now, consider that this pattern does not just apply to company/3rd party vendor relations, but also to internal  customer/provider relationships, such as a corporate marketing department requesting self-serve functionality from the internal B.I. team.

You see how at each connection point another layer of potential confusion gets introduced, like in the old children's game of "telephone".

Of course pragmatic requirements management could alleviate some of those layers of ambiguity.

Monday, March 7, 2011

The implications of Predictive Analytics

After the concept of "Business Intelligence" has been milked for what it's worth, now the latest hype is "Predictive Analytics".

Consider this, the majority of stock trading transactions are being conducted by algorithms, which leverage a form of predictive analytics. Some of the major fluctuations in recent years have been attributed to these algorithms all "running for the exit" at the same time, as most of them use similar logic.

Also, have you ever used your GPS navigator with real-time traffic data to obtain an alternate route, supposedly so you could bypass the jam? Did you find out that every other modern car had the same technology and got the same "hints", leading to a new jam at the bypass route? Another form of Predictive Analytics at work, with undesired outcome.
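The rerouting anecdote can be captured in a toy simulation: when every navigator hands all drivers the same "best route" hint, the jam simply moves; when drivers react to the observed load instead, traffic spreads out. Road capacities and driver counts are invented for illustration.

```python
def assign(drivers, capacities, shared_hint=None):
    """Distribute drivers over roads; a shared hint sends everyone the same way."""
    loads = {road: 0 for road in capacities}
    if shared_hint:
        loads[shared_hint] = drivers               # every navigator gives the same tip
    else:
        for _ in range(drivers):                   # each driver reacts to current load
            best = min(loads, key=lambda r: loads[r] / capacities[r])
            loads[best] += 1
    return loads

def congestion(loads, capacities):
    """Load relative to capacity; values above 1.0 mean a jam."""
    return {road: loads[road] / capacities[road] for road in loads}

caps = {"highway": 100, "bypass": 40}
herded = congestion(assign(120, caps, shared_hint="bypass"), caps)  # new jam on the bypass
spread = congestion(assign(120, caps), caps)                        # stays under capacity
```

The same mechanism drives the stock-trading example: identical predictive logic, applied by everyone, invalidates its own prediction.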

So isn't the whole intent of a Business Intelligence or Analytics initiative to get ahead of the competition?
But what if everybody leverages the same set of methodologies and tools? Won't we have similar situations as described above?

Since businesses are run by people, for people, they are a social system. And social systems are reflexive, they adapt their behavior based on what they think might happen. Any broadly spread analytical predictions following similar patterns are bound to alter the very future they attempt to predict.

Sunday, March 6, 2011

Agile Documentation

A lot of agile practitioners misunderstood the focus on delivering tangible artifacts as "documentation is bad". Consider that often a deliverable is not code, but information. Especially in the Business Intelligence space, just delivering data without context does not add any value. If you are in an organizational culture where initial ambiguity stimulates fruitful conversations, more power to you. But more often than not, stakeholders (solution developers & business customers) are not co-located geographically, or even time-wise. So we have to consider a communication medium that works asynchronously (non-real-time), which conceptually amounts to the same thing as "documentation".

My observation is that the aversion against documentation tends to be a developer/implementer-driven phenomenon. That software developers and IT pros can afford to not like creating documentation will change once IT becomes a buyers' market. Much of this "real programmers don't comment their code" macho attitude stems from a time when demand for IT/software development talent outstripped supply. But the inherent TCO (total cost of ownership) of poorly maintainable code (lack of documentation being part of that) will drive pragmatic changes of attitude in business, which may well find the business-ignorant "rockstar" developers on a dying branch of the IT tree.

Still, another impulse against documentation stems from the wisdom that outdated documentation is worse than none whatsoever. I would personally disagree with that generalized statement, by virtue of "it depends" reasoning. If there is clear life-cycle metadata on documentation, like a "last changed" timestamp, then as the user I can gauge how reliable the content could possibly be. The point is that documentation is not supposed to replace common sense and critical thinking on the document consumer's side.

It is also curious how, having worked in a couple of agile-inspired project environments with an aversion towards documentation, I kept seeing impromptu diagrams drafted on whiteboards and subsequently captured with smartphone cameras (later posted on a wiki or in emails). Now what was that, if not documentation?

The stuff we deal with in business processes, rules, data models & relationships, data flows, process flows, is far too complex not to use documentation artifacts to communicate it between people. Along the philosophy of a picture being worth a thousand words, a diagram, an info graphic, is a precious way to capture & communicate this complexity. That is documentation, not necessarily formal, but a form of documentation.

Let's not view this so black & white: "documentation is so waterfall" vs. "without documentation I can't start my work"

The balance is in treating the documentation artifacts with the same agile spirit as our project tracking & software development. Iterate, increment, track status, review, improve, abandon if irrelevant (minimize work done).

This is as easy as maintaining "last modified timestamp" or "edited by" tagging, which wikis already do for you. Then your organization can agree on what constitutes "stale content", and to what extent "stale" implies "obsolete".
Just don't simply delete what you deem obsolete. Put that content into an Archive folder (or take "obsolete" tagged content out of your operational views/filters), so you can go back when the need arises to understand how a system evolved. If you have version tracking activated in your Wiki, you can cover change control for audit purposes that way, with even less overhead.
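As a minimal sketch of that policy, the triage below flags pages whose last-modified timestamp exceeds an agreed threshold and routes them to an archive bucket instead of deleting them. The threshold and page data are invented for illustration.

```python
from datetime import date, timedelta

STALE_AFTER = timedelta(days=180)   # whatever the organization agrees counts as "stale"

def triage(pages, today):
    """Split pages into current vs. archive candidates by last-modified age."""
    current, archive = [], []
    for page in pages:
        bucket = archive if today - page["modified"] > STALE_AFTER else current
        bucket.append(page["title"])
    return current, archive

# Hypothetical wiki pages with their last-modified dates.
pages = [{"title": "ETL Overview", "modified": date(2011, 2, 1)},
         {"title": "Old Cube Spec", "modified": date(2010, 1, 15)}]
current, archive = triage(pages, today=date(2011, 3, 6))
```

Combined with the wiki's own version tracking, this keeps change history intact for audits while the operational views stay clean.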

Documentation? Be pragmatic, and it won't contradict the agile spirit!

Solution Requirements Mismatch

Impedance mismatch is an issue occurring between databases of different paradigms (e.g. object-oriented vs. relational). They may conceptually contain the same information, but their ways of representing the data (the model) do not translate well between the different database types.

A similar effect can be observed in business settings, during requirements management for business intelligence solutions.
  1. Let's say a business customer communicates some functional requirements to a B.I. solution provider. Whether that's an internal developer, or an external vendor doesn't matter in this context.
  2. And let's say the B.I. solution provider can confirm the requirements and make a proposal about the planned solution.
  3. Both parties agree to move forward with a project, purchase, or consulting engagement.

When all is implemented and ready for use, the business customer realizes that the provided solution doesn't do what it should.

What happened? Most likely...
  • the business customer expressed their wants, not the business' actual needs, or
  • the solution provider confirmed what he can or wants to deliver, not what should be delivered.


Cynics will counter this conclusion, which strangely is often very apparent in hindsight, with "hindsight is always 20/20", or "shoulda-woulda-coulda".

But wait a minute, wasn't learning one of the key aspects of intelligence? If these patterns are apparent, why not learn from them, apply them with each initiative moving forward?

So let's examine the above scenarios a bit closer:

Why do business customers often state requirements about what they want, instead of what the business needs?

Why do solution providers commit to delivering not what the customer actually needs, but rather what the provider is able to deliver, or is interested in delivering?
  • Sales-driven: eagerness to please customer (catering to broad desire for instant gratification)
  • Competitive pressures: no time to explore details of the requirements, the competition might snatch up the contract instead
  • Focus on profits: lack of selectivity about whether the provider is actually able to deliver the solution, since they don't want to miss out on the business opportunity. Unfortunately, the lack of capability/willingness to grow into a promised deliverable is often not honestly communicated to the customer (which often results in the over-promise/under-deliver effect, when providers confuse their capability with their motivation of "we can do it!")
  • Cultural factors, professional image/pride: the right thing to do would often be to advise a customer that what they require is not what they really need. But in our business culture it is rarely appreciated to push back on customers and say no to setups that have a low chance of success. The customer/provider relationship often subliminally serves social validation, more so than pragmatic business conduct. This dynamic affects the compatibility of agreed-upon requirements vs. the actual solution need.

In summary, the above phenomena can be expressed in four scenarios. The customer/provider setup...
  1. lacks willingness & lacks understanding (willingness might have potential if understanding is addressed; see scenario 3)
  2. lacks willingness, but understands (abandon all hope, a.k.a. "politics")
  3. is willing, but not understanding (needs consulting)
  4. is willing, and understanding (perfect setup for success!)

More on requirements analysis here.

Saturday, March 5, 2011

The Participation Web

So there is this huge movement of marketeers to "engage" consumers.

But that is the very crux of the problem. We've been passive consumers of information ever since TV became pervasive. The whole premise of the Internet was to encourage exchange, not just be yet another consume-only feed.

How about we talk about the Participation Web?

That requires two aspects:
  1. People actually participating, not just consuming
  2. Producers offering services and content that truly engage, and encouraging potential participants to get involved.
How do we get there?

It requires a cultural shift, probably more evolutionary (smooth transition) than revolutionary (abandon all that was before). There will be parallel paths, overlap, for quite a while.

There will be those opposed in principle, or due to vested interests in the old way.

There will be those skeptical, critical thinkers (which we need more of). Who perhaps might come up with a third path, alternatives, middle grounds, integrative approaches (very valuable, but from a demographics perspective, probably a minority).

And there will be those who are lost, unsure how to approach it. This last group is a huge opportunity for educators, trainers, coaches, consultants, and media platform & content providers.

Just make sure you keep above points 1 and 2 in mind!

Friday, March 4, 2011

Twitter Intelligence 2.0

As Twitter has become one of the world's largest event-based databases/warehouses/clouds, it would be negligent not to investigate it further in the realm of Business Intelligence and related Analytics. I've referred to this in an earlier article as Twitter Intelligence (not sure if I coined the term, but it sounds good to me :-)

My fascination with Twitter is that of an analyst of social phenomena, and not that of a Content Junkie.
Hence, my natural inclination is to dissect Twitter into its elementary pieces and understand how they work together. Now what's really fascinating to find is that Twitter is inherently simple. In fact, so simple that I run into many cynics who don't "get" what this hype is all about. On first glance it is easy to miss the results of the network effect, the hive mind, and the collective intelligence that has emerged from Twitter, which only become apparent after some exposure, if not involvement.

At a conceptual level, we have 4 basic concepts in Twitter:
  • The Social realm (human short messages with socializing intent)
  • The Content realm (referral to cool/interesting Web content, via URL links)
  • The Contributor role (people who initiate/author/share stuff)
  • The Consumer role (the listener/reader of that stuff)




These are 4 pretty straightforward concepts. Where it gets interesting is that each of those realms and roles can overlap and intersect with any of the others, in any combination. This is how, out of 4 simple elements, we derive 13 different scenarios: the network effect!

This principle was originally applied by Robert Metcalfe in a technical context, which led to the explosion of the data network now called the Internet.

There is yet another dynamic to all this; after all, 13 scenarios is not a whole lot of combinations to attribute complexity to. Each of these 13 scenarios can affect any other combination, recursively, iteratively, incrementally, infinitely. What we have here is a complex system of self-motivated, but environmentally influenced, "agents".

The intersections and overlap areas have been labeled with letters and numbers, due to space constraints in the diagram. But here is the key:

 A  Initiator of a general social conversation/discussion (typically topic/interest related)
 B  Listener to general conversation initiated by others
 C  Reader/Watcher of Web content referred to by others
 D  Producer or Referrer of Web content, referring it to generic audience (by topic/interest/subject matter)


  1. Social promotion of produced content
  2. Conversation provides promotion to contributor, and feedback from consumer
  3. Consumer finds content by monitoring social buzz
  4. Contributor provides the content the consumer needs/likes
  5. Contributed content, as well as Contributor get social exposure to Consumer (2 way: promotion & feedback/rating)

Below just another insight into the relationships between content and social exchanges:


The orange elements are the reason why Twitter is labeled as a "social network", even though it gains real momentum with its connection to other Web content.

Thursday, March 3, 2011

Requirements Analysis

A while back I gave a presentation on requirements analysis, specifically targeted at the Business Intelligence space. It might be interesting to review it here.


The above link contains the whole slide deck. Below pictures are just some key points (excerpts).






Content Market Dynamics & Relationships

As I was analyzing how the online content market works, I drafted up this nifty little diagram to understand the relationships and dynamics involved. Thought I'd share...



Twitter Intelligence

Twitter Contribution Value Chain

As an engaged Twitter observer & contributor (@AnalyticsToGo) with a background in Analytics, I am fascinated by the vastness and richness of collective intelligence evolving there.

It occurred to me that there is an inherent ranking of contributors, based on some fundamental dynamics.



The originality starts at the bottom, and as the stack rises, the focus shifts from authoring to distribution.

This graph illustrates the relationships in more detail:




And each color-coded Twitter activity type has its own purpose:
  • Red: implicitly rate content by spreading it
  • Orange: help categorize content by marking it up (not scientific, but in the spirit of crowd wisdom)
  • Yellow: raise visibility of little-noticed content (help with discovery)
  • Green: help categorize Web content by marking it up (similar to Orange, which is more refining)
  • Cyan: discover new Web content, promote it to a broader audience
  • Blue: create new Web content, and promote it

If you're looking for more detailed assessment of Twitterers' contributions, I found this free service quite useful: http://klout.com/

Supporting Decisions

Considering that the concept of Business Intelligence grew out of the earlier concept Decision Support, I would like to take a few moments to review what contributes to good decision making.

  • Goals, Purpose, Reason
  • Context, Scenario
  • Knowledge, Experience, Intuition (as a derivative of the latter)
  • Analysis

The fact that a decision is to be made implies that there is a scenario requiring some action. This action can either be a response to something, or a self-motivated initiative. Either way, it follows that there is a reason for the need to make a decision. Thus, there is a goal to be achieved.

In order to decide (between alternate paths) one must first be aware of the situation and available options. This requires knowledge, or even understanding, about what is the scenario, and what might be the options to address the scenario.

Knowledge without context is not very useful. Analysis helps us ponder how things fit together. Even decisions apparently driven by intuition have an underlying analysis component to them. Either from previous occurrences of similar problems, where we did more elaborate analysis, and upon repetition of the problem can reuse the outcome. Or by the instinctive combination of everything we ever learned, the so called "sixth sense", by confidence in "gut feelings".

The relevant aspects of analysis in support of an impending decision are around the dependencies, impact, and consequences of our planned action. Or in business terms: the cost, benefit, and risk of our decision.

Think about this in the context of your Business Intelligence efforts. Doesn't this logic seem very straightforward? Shouldn't B.I. initiatives be more manageable along those lines?

Customer Intelligence versus Market Intelligence

It has been my mission from the inception of this blog to address the business side of Business Intelligence. It was not my intention to become yet another technology-driven blog about "how-to" details, as there are plenty of them about already. There are also plenty of forums and channels that work the marketing side. Not just products, but evangelizing of methodologies, tools, architectures. My aim is more along the lines "teach a person how to fish", not cut-and-paste silver bullets. If I can stir debate, make people think, then I have realized my mission.

People can get worked up on terminology and semantics, as if the goal was to create ambiguity rather than resolve it. There is...
  • Business Intelligence, 
  • Customer Intelligence, 
  • Operational Intelligence, 
  • Market Intelligence,
  • Competitive Intelligence 
  • Social Intelligence, etc.

Let's face it, if you are interested in "Business Intelligence", chances are you look at this from a company perspective, representing a business function. The term "Business" itself can be very ambiguous.

Business, as in...
  1. a company (legally, organizationally, transactionally)
  2. conducting commerce (buying/selling/trading goods, expertise or labor/services)
  3. value (transactional activity, "thank you for your business")
  4. mission ("we are in the business of...")
  5. participation ("this will put us out of business!")

There are way more scenarios than the few I listed, but my point is to emphasize that the perspective, the context  in which you consider "business" intelligence matters.

When you are involved with the broader concept of Business Intelligence, what is it that you're really after?
  • Operational insight in your business? -> Operational Intelligence
  • Insights into customer behavior? -> Customer Intelligence
  • Understanding of the competitive environment? -> Competitive Intelligence
  • Identifying opportunities? -> Market Intelligence

It is important to distinguish between these, and to be aware of what the focus of your B.I.-inspired Analytics initiatives is.
They are obviously all business-related, given your professional role and interest. Operations, customers, markets, and competition are all business concepts, in a way subsets of Business Intelligence. Just as national intelligence services have multiple purposes, such as security, defense, economic, and political ones.

So consider Business Intelligence just the umbrella term, but be mindful of the specific focus your B.I. initiatives are after. Don't let the marketing terms confuse your core interest.


One of the hottest topics in the Business Intelligence realm is so-called "Customer Intelligence". There is a strong desire, and incentive, for a business to understand its customers. Merely "knowing" a customer is a very traditional concept, and was a natural given throughout the ages when commerce was conducted only locally and in person. The customer "knowledge" challenge has become more pertinent in the age of logistical distribution of labor and the proliferation of online marketplaces. What a successful business really needs is a good "understanding" of its customer base.

Concepts like understanding customer behavior need to be carefully communicated: from a customer perspective, the term "customer behavior" hints at being monitored, raising privacy concerns. And even the mere perception of privacy issues hurts a business' reputation, which in itself is a competitive asset, just as intelligence is.

Customer Intelligence manifests itself typically in terms of understanding patterns of
  • acquisition of new customers (e.g. advertising, referrals)
  • conversion (turning shoppers into buyers)
  • retention (loyalty, repeat purchases, maintain relationships)
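As a rough sketch of how these patterns translate into measurable quantities (the function, names, and numbers are purely illustrative, not from any real system), conversion and retention could be computed from simple event lists:

```python
def customer_metrics(visits, purchases):
    """Compute conversion and retention from simple event lists.

    visits    -- set of customer ids who visited (shoppers)
    purchases -- list of customer ids, one entry per purchase
    """
    buyers = set(purchases)
    # Conversion: share of shoppers who became buyers.
    conversion = len(buyers) / len(visits) if visits else 0.0
    # Retention (simplified here): share of buyers with repeat purchases.
    repeat = {c for c in buyers if purchases.count(c) > 1}
    retention = len(repeat) / len(buyers) if buyers else 0.0
    return conversion, retention

conv, ret = customer_metrics({"a", "b", "c", "d"}, ["a", "a", "b"])
print(conv, ret)  # 2 of 4 visitors bought (0.5); 1 of 2 buyers bought again (0.5)
```

The point is not the arithmetic, but that each pattern implies a concrete, trackable metric before any tooling discussion starts.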
So what is Market Intelligence then? It is not my goal here to impose meanings on terms. As always, I prefer to stimulate thoughts around the concepts, not so much the wording.

What does "market" mean, anyway?
In the context of our Business Intelligence scoped considerations here, it implies...
  • a potential customer base (specific interest/demand in products, services)
  • our company's specialty ("we serve the tools market")
  • opportunity (potential to produce and market something in an unproven but valuable space)
Customer Intelligence is very focused, and can be seen as a subset of Market Intelligence. The majority of the effort in figuring out customer behavior is not so much in tracking what customers are doing, but why. For a provider of services or goods it is immensely valuable to understand what customers want. But even more so, what motivates them, what their underlying interest in a product or service is. Tracking behavior can be largely automated in the age of online transactions. But the underlying dynamics of what leads people to buy is a trickier subject, and a rewarding challenge for any Business Analyst and Analytics initiative.

Customer Intelligence tends to have more tangible qualities and seems more straightforward to address, and Market Intelligence might be a natural progression after that. Also, depending on the maturity of your business, your interest in C.I. vs. M.I. can vary. For example, startup businesses might be more interested in exploring market opportunities. Established businesses would benefit more from maintaining good relations with their existing customer base, not just from a loyalty standpoint, but also as potential referrers of new business by word of mouth.

Again, the details are debatable; what matters more is the conceptual awareness, to help you focus your B.I. efforts along the proper lines.

Wednesday, March 2, 2011

Solution Patterns

Within the IT/Software space, the concept of patterns has emerged as a practical tool for designing complex systems. Patterns help categorize solutions and avoid reinventing the wheel as much as possible. They help keep the focus on the similarities between solutions, and isolate the differences that necessitate customized development. Pattern-derived solutions are deemed more efficient at implementation time as well as support time for two major reasons:
  1. Time & effort is saved in "trying out what works" (establishing proof-of-concept models)
  2. The common underlying patterns help stakeholders communicate with common terminology, decreasing the risk of miscommunication during requirements gathering and acceptance testing.

Among the various sets of patterns, one can generally distinguish the following pattern types:
  • Structure (architecture)
  • Behavioral (process)
  • Data (content)
  • Usage (presentation, interaction)

Within the Business Intelligence space, a number of architectural patterns have evolved that recur in a broad range of solutions.

Structural (data models):
  • The Operational Data Store (ODS) first proposed by Inmon
  • The Kimball-inspired Dimensional Model (Fact & Dimension Tables)
Behavioral (processing):
  • ETL, the process of extracting source data, transforming it, and loading it into a target data warehouse
  • ELT, extracting and loading first, then transforming the data locally in the data warehouse
Data (content)
  • Time grain (date vs. timestamp)
  • Measure (quantitative business metric) 
  • Criteria (qualitative, for grouping, filtering, sorting)

Usage (presentation & interaction):
  • OLAP (multi-dimensional cubes, drill-down, drill-through/across, rollup)
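To make the dimensional-model and roll-up patterns above concrete, here is a minimal sketch using an in-memory SQLite database (all table, column, and category names are illustrative, not from any particular warehouse):

```python
import sqlite3

# A minimal Kimball-style star schema: one fact table with a foreign key
# into a dimension table, then a GROUP BY roll-up along a dimension attribute.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product(product_id),
        sale_date  TEXT,     -- date grain (date, not timestamp)
        amount     REAL      -- the measure
    );
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "tools"), (2, "tools"), (3, "books")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, "2011-03-01", 10.0), (2, "2011-03-01", 5.0),
                 (3, "2011-03-02", 7.5)])

# Roll up the measure along the category criterion of the dimension.
rows = con.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 7.5), ('tools', 15.0)]
```

The fact table carries the measures at a chosen grain; the dimension tables carry the qualitative criteria used for grouping and filtering, exactly the Data-pattern split listed above.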

But these are just the means. These patterns help implement a technical infrastructure, but they do not solve the business challenge of having proper intelligence (actionable insight) at hand.

So what are the application patterns, from a business perspective? Business solutions that directly correlate to one of the key success factors, such as increasing organizational efficiency, increasing customer satisfaction, or managing risk. Some examples that come to mind:

Increase Customer Satisfaction:
  • Recommendation Engines (e.g. Amazon's suggestions)
  • Stock Market Alerts (stock price trends triggering threshold events, notifications)
Risk Management: 
  • Fraud Detection (trigger alerts at unusual credit card activity)
Organizational Efficiency:
  • Web Site Path Analysis (identify most commonly taken browsing paths, correlate to purchases)
  • Search Engine Marketing (keyword spend vs. conversion events)

The reason this understanding is useful before getting into the details of a B.I. project is that there may already be a standard pattern for what you are trying to achieve. Perhaps not a canned solution, but something you can adapt and build on, instead of having to reinvent the wheel.

When you hire seasoned B.I. specialists, you implicitly get some pattern knowledge, based on how they solved similar challenges in other environments before. That's why it is helpful to focus not only on technical or tool expertise when staffing, but also to leverage candidates' industry background as applicable.

Solution patterns are useful to communicate and collaborate on complex business intelligence aspects across functional and technical roles, as well as across companies.

The Ethics of Data Mining

For many years companies have been aware of the risks of neglecting proper data management. And data loss was just one of them. The opposite, making data available to unintended audiences, could have consequences just as devastating as a database crash without a backup in place.

In the age of global real-time communication, such acts of neglect or malfeasance can have far-reaching consequences, beyond even the original stakeholders in the affected data.

But today I want to strike on another point: the ethical imperative of Data Mining Social Activity.

Ethics always seems a welcome topic for broad and controversial debate. The question for a business is how do ethics map to tangible business objectives and results?

In this age of real-time propagation of opinions, reputation spreads fast, and damage control seems to be more expensive than prevention. The broader masses out there seem to have a general consensus as to what's ethical. All you have to do is listen and adjust.

You will want to weigh the cost versus the benefits of indiscriminately pushing for data mining social activity.
Does your immediate gain on the bottom line justify the strategic risk to your reputation?

Consider that in dealing with people you affect what you measure. You are not counting beans anymore, you are dealing with conscious humans, who can and often will alter their behavior if they are observed.

It is tempting to be driven by what's technologically possible. And in recent years, data mining tools have matured quite nicely, becoming usable by more general business audiences instead of only highly trained scientists. Let's not forget that just because something is possible, it doesn't always make good sense. Think ahead: mine not just your customers' behavior, but also the trajectory of your company's actions. Apply predictive modeling to your organizational handling of the business and the resulting outcomes.

The fact that you are interested in data mining leads me to conclude that you have a sense that what you do today has an impact on the results of tomorrow. So it should be easy for you to weigh the impact of your data mining efforts on your target audience. After all, data mining is a means to an end. And there is no point in the means jeopardizing the end, now is there?

Do not wait for privacy laws to mature, forcing you into a reactive mode of adjusting your legal practices and company culture to your market's sensitivities. Be a thought leader, a role model for the industry. It may very well become one of your competitive advantages. The trust of your customers itself becomes a vital factor in your business' success!

Tuesday, March 1, 2011

Data Relevance in the Knowledge Life Cycle

The latest article in my blog about future trends is pertinent to the B.I. space, as the information life cycle is a vital part of B.I. and, in the bigger picture, feeds into trends such as knowledge management.

To this day the B.I. industry has no established and integrated method to track data relevance or manage the information life cycle, to the extent the digital media industry has pioneered (to enforce DRM). A lot of the existing data life cycle concepts in B.I. come out of the top-down architected Data Warehousing space. Where applied, they are heavily controlled by human processes, with a variety of tools, each component and interface leaving room for mistakes, ultimately leading to questionable outcomes.

The basic understanding is here today, with a practice referred to as "data quality". One of the DQ metrics is "relevance". Today this determination is a rather manual process, always open to arbitrary decisions with little bearing on the real world. The people who have to determine data relevance do not have all the insight of the users of the data, and the overhead for a few specialists to find out the broader scope & usage of the data is often intolerable at the typical speed of business operations. Hence the need for a crowd-sourced rating model for the relevance of data. Of course, that will be subject to distortion based on perspective. But that multi-dimensional approach is nothing well-thought-out software could not manage.
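Such a crowd-sourced, multi-dimensional relevance rating could be sketched roughly like this (the dimension names and the flat averaging scheme are my own assumptions for illustration, not an established method):

```python
from collections import defaultdict

def relevance_profile(ratings):
    """Aggregate user ratings of a data asset into a relevance profile.

    ratings -- list of (dimension, score) pairs, scores in [0, 1],
               e.g. dimensions like "timeliness" or "accuracy".
    """
    by_dim = defaultdict(list)
    for dim, score in ratings:
        by_dim[dim].append(score)
    # Average per dimension, then a flat overall average across dimensions.
    profile = {dim: sum(s) / len(s) for dim, s in by_dim.items()}
    overall = sum(profile.values()) / len(profile) if profile else 0.0
    return profile, overall

votes = [("timeliness", 0.5), ("timeliness", 1.0),
         ("accuracy", 0.5), ("accuracy", 0.5)]
profile, overall = relevance_profile(votes)
print(profile, overall)  # {'timeliness': 0.75, 'accuracy': 0.5} 0.625
```

A real system would of course weight raters, decay old votes, and track who rated from which perspective; the sketch only shows that the multi-dimensional aggregation itself is trivial for software to manage.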

If we had an intrinsic system, embedded in every aspect that touches data, which would maintain life cycle intention and actual status at every stage of use, the knowledge maturity and currency described in my other article above could become commonplace.

I suspect that business will adopt any technology that proves itself on the broader Web. Perhaps this aspect will ultimately evolve out of Web 3.0 or the "Semantic Web"?

Scaling Out Agility in B.I. Infrastructures

One of the challenges in applying agile methodologies to the Business Intelligence space is that traditional architectures layer tiers by technical function, introducing tight coupling of technology that makes changes expensive.


This tightly coupled dependency makes an overall B.I. solution more susceptible to change and support risks. Changes are hard to implement without disturbing related functionality that wasn't originally intended to change. And because changes are often "under the hood", end users will have little understanding of delivery delays ("technical debt") or of having to go through another Q/A step that doesn't seem to provide any additional functionality to them.


Along the concepts established by Service Oriented Architectures, if Business Intelligence infrastructure, tools, and interfaces become more service-oriented than platform-driven, the agility of B.I. (the content, the results) could be increased.



The linking arrows at each generic level indicate different ways to integrate across functions.
For example, at the User Interface level, hyperlinks facilitate cross-functional relationships, such as drill-downs and drill-throughs across disciplines (as appropriate, driven by security rules).

On the Process level, each subject matter has its own functionally independent component that can be referred to by other components, or call out to other components, without those components having to be aware of, let alone dependent on, each other. The key point is loose coupling.

Lastly, the Data level would integrate by remote database linking, replication, or on-demand loading.
Data models should be more modular, and again loosely coupled, for example horizontally partitioned across hosts.
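A minimal sketch of such horizontal partitioning, routing records to hosts by a stable hash of their key (the host names and hashing choice are illustrative assumptions, not a prescription):

```python
import hashlib

# Hypothetical data-warehouse nodes holding horizontal partitions.
HOSTS = ["dw-node-0", "dw-node-1", "dw-node-2"]

def partition_for(key, hosts=HOSTS):
    """Map a record key to a host; md5 keeps the mapping stable across runs."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return hosts[int(digest, 16) % len(hosts)]

# The same customer always routes to the same node, so each node can serve
# its slice of the data model independently of the others.
print(partition_for("customer-42"))
print(partition_for("customer-42") == partition_for("customer-42"))  # True
```

Note that simple modulo hashing reshuffles most keys when a host is added; production systems typically use consistent hashing for that reason.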


Now, let's note that layered tiers by functionality still make sense where system performance is more important than the agility of the business; a classical trade-off between speed and flexibility. New technologies and system architectures, such as Apache Hadoop, can help reconcile the two.

In the spirit of focusing on providing intelligence to the business, these technical/architectural concerns should ultimately be delegated to a cloud operated by service providers who focus on optimizing along those lines, so your company can focus on the pertinent business questions.

Monday, February 28, 2011

Convergence

In the 1990s there was a lot of buzz around "Convergence" in the Telecom/Internet space. This referred to the merging of technologies used for textual Internet data, telephone/voice calls, and video conferencing. The ambition was to have one unified communications network for all types of media. A lot of technologies (e.g. Asynchronous Transfer Mode) were specifically developed for that purpose. Yet, ultimately things simply converged into an already existing protocol, TCP/IP. As systems became faster, production cheaper, and the necessary support maturity for the established network infrastructure became common place, it was pragmatic to have the communication need merge into existing technologies. 

Consider the following evolutionary timeline between the dawn of computing and modern applications, starting from the bottom of the stack. Notice the tilt from the "technically focused" into the "user oriented" space over time. Early in the evolutionary cycle, development was driven by what was technically possible; progress was largely limited by the technology constraints of the time. Higher up in the evolutionary timeline, we notice the shift toward optimization for user value. The development goals are driven by the business goal, not technical feasibility.




If we consider the evolution "through the ages" among interrelated aspects, we get a better sense for the concept of convergence:



If we follow the trajectory of what has become Business Intelligence, we see a similar trend: innovation lies not so much in the individual functional tool or technical infrastructure space, but in the maturity of their integration. Along those lines, we can project where Business Intelligence may go.

What is today commonly referred to as "Business Intelligence" will eventually evolve into the following concepts:


Cloud Services
Abstraction of the underlying Information Technology & Software infrastructure from the user.

Semantic Web
Smart tagging of content, more meaningful search results, sets the stage for collective intelligence.

Social Media
Collaboration portals & market places for dynamic interaction and trade. Exchange of knowledge, virtual goods, services.

Crowd Sourcing
The effect of dynamic and self-motivated work distribution based on individual interest, capability and availability.

Collective Intelligence
The iterative, incremental and recursive knowledge evolution as a result of crowd sourcing. More powerful than any algorithmic search engine.


Also, notice the evolution of the network effect. Originally, things evolved in a single stack. Then more vertical stacks developed more or less in parallel, with little integration. And ultimately, the various concepts play into each other, in reciprocal and recursive fashion, causing the effect of emergence (complex behavior arising from comparatively simple parts).


The network effect and emergence are not to be underestimated in the evolution of what is known as "Business Intelligence" today. Ultimately they will lead various technologies, methods, and cultural paradigms to converge into the next level of intelligence, with business being just one part of it, but a part that depends on and is impacted by much of the rest.

Future, Agility & Pragmatism

Another shameless pitch for our sister blog "What the Future may hold":

Why is it so hard to predict the future?

I am cross-posting this because this question is very relevant to the Business Intelligence and Analytics space.

If you are in the business of predicting your business outlook, it is of little value if you don't frequently check course, to make sure you're on target with your prediction, or better yet, to see whether your anticipated target has moved and you may need to adjust your strategy.

This is where Agility & Pragmatism in Business Intelligence comes in.
Check your course often, adjust as needed

Saturday, February 26, 2011

Collective Intelligence

A few posts back I was struggling to find a term for the phenomenon of what I then called "macro intelligence". Turns out (credit where credit is due), a term has already been coined for that: collective intelligence



Diagram courtesy of Wikipedia.com - Author: Olga Generozova. The diagram is based on the types and examples of collective intelligence discussed in the books 'The wisdom of crowds' and 'Smart mobs'

The momentum in the media for this phenomenon only came with the proliferation of what we now call "social networks" on the internet. But in a way, the trend has been there all along; the media has just become more efficient and finally put the potential of high-velocity collaboration into the limelight.

Collective Intelligence was the concept I so ardently tried to wrap my head around when I aspired to integrate Business Intelligence with a more holistic approach to broad solutions for complex problems.

Cross Reference

Since Business Intelligence and Agility have a mutual relationship, allow me to reference a pertinent article from my blog about Agility here.

After all...

  • Business Intelligence implementations (on any level, whether technical, organizational, or cultural) benefit from an agile approach (evolutionary, incremental, iterative).
  • And Business Agility can be increased and supported by mature Business Intelligence practices.


So they go hand in hand, as these two blogs go.

The Human Side of Business Intelligence

For those who wonder whether there are more tangible recommendations coming in this series of thoughts about Business Intelligence, I would like to clarify a few things...

Intelligence is about empowerment and self-motivated learning, not being told what to do. As such, this is not a "how-to" guide. This forum is meant to stimulate thought, hence the popular format of asking questions instead of claiming static/universal answers. It is a journey of discovery, not one of finding the goals for you.

I feel that there are plenty of Web sites, discussion forums, articles, and blogs already that address the marketing side and technical aspects of "business intelligence".

The big opportunity for evolving more maturity in the B.I. space, the way I see it, is in the common sense department.

If B.I. is to provide intelligence to business, then the following is relevant:

  • businesses are operated by people, humans.
  • businesses serve people, humans

Therefore, BUSINESS intelligence is foremost about the intelligence of HUMANS.

Short of strategy consultants, professional coaches, and program managers, I don't see that reality addressed much in the general B.I. hype.

To address this space, the human side of business intelligence, is my mission in this blog.

A Menu of Topics

I have a couple of topics in the pipeline, which I would like to throw out there in summary. I am a big fan of interaction and feedback. Intelligence, after all, encompasses learning, and learning relies on feedback loops.

I appreciate any input on priorities, additional topics, scope, depth, and relevance of what I have in mind so far for future articles in this series...



Why Scorecards without Discussion are useless
Numbers (quantitative data) without context (qualitative information) don't explain anything.
A good business leader will use scorecards to initiate conversation with business stakeholders, inquire about context, and stimulate exchange on background and ideas to respond to realities. Reflexive response to numbers without context is NOT business intelligence.

Social Business Intelligence
Self-serve analysis tools integrating different users' interests, sharing the patterns with each other, surfacing correlations of interest and overlapping functions, increasing transparency and accountability. The dawn of B.I. Wikis.

The Power of Reflection
Reflexive behavior has its purposes. But it needs to be balanced with deliberation, an intentional delay in response, to consider. As any good diplomat knows, sometimes delaying action is a good response to a crisis, for example to avoid escalation or going down the wrong path.

The non-functional requirements
The following aspects are a case in point that "Business Intelligence" is not mainly about technology, software, or databases, but starts with people. A business, as we know it today, is still largely comprised of and driven by people, humans with individual personalities, preferences, and attitudes. Thus, it is important to recognize the non-functional prerequisites necessary to support any Business Intelligence effort.

More on the LEGO principle and how to apply it to B.I. 
  • Build complex solutions by combining the simple
  • How do you keep control over the emerging complexity?
  • Self managing parts emerge in responsibility for the whole
All these dependencies and assumptions 
Why the distinction between "structured" and "unstructured" data? 

Streaming Business Intelligence
No more "ETL"! The usage of data will have to be driven at demand time, not design time.

The historical strata of B.I. / Decision Support
At any given time we have to support and integrate that which we built in the past with that which we need today, and with that where we may go in the future.