Friday, March 11, 2011

Augmented Intelligence

The current state of affairs in Business Intelligence is a rather top-down, manually driven process.

Let's say a business user is interested in how a business process is performing. If they are lucky, a performance dashboard for monitoring key performance indicators (KPIs) already exists. If not, it will have to be developed: a request-response cycle through the whole IT/developer/analyst organizational stack. In more sophisticated businesses, possibly with sophisticated tools, the business user can configure new KPIs on the fly, assuming they have a proper understanding of KPI logic and proper tool usage.

Three issues stand out from this approach:
  1. Turn-around time (lack of agility, due to reliance on human-driven process)
  2. Business people's focus gets side-tracked by required tool & database knowledge
  3. Potential for error (as always in overloaded human responsibilities)
So what can we do? How about automating the mundane tasks and leaving humans with the work they are good at? Welcome to "Augmented Intelligence". Instead of going from one extreme (manual process) to the other (Artificial Intelligence), how about a balance: leave the tedious work to computers, and leave the more fluid, ambiguous information that requires human judgment to people. But how do we make sure that algorithms don't override human judgment, and vice versa, that humans don't second-guess proven algorithmic correlations of real data?

That's where feedback loops and an iterative approach come in. Delegate & monitor, trust-but-verify, do & review. And track the history of all decisions made. If business changes are applied, keep tracking the old reality and compare it to the new (something like A/B testing in marketing). You can only know if something works by comparing it to the alternative(s).
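As a rough illustration of the "track the old reality and compare it to the new" idea, here is a minimal Python sketch; the KPI values and names are invented purely for the example:

```python
from statistics import mean

# Hypothetical daily KPI observations collected before and after a process change.
kpi_before_change = [102, 98, 105, 97, 101, 99, 103]   # the "old reality"
kpi_after_change = [108, 112, 107, 111, 109, 113, 110]  # the "new reality"

def compare_realities(old, new):
    """A/B-style comparison: keep the old baseline around and report the delta."""
    old_avg, new_avg = mean(old), mean(new)
    return {
        "old_avg": old_avg,
        "new_avg": new_avg,
        "relative_change": (new_avg - old_avg) / old_avg,
    }

print(compare_realities(kpi_before_change, kpi_after_change))
```

The point is not the arithmetic, but that both realities stay tracked so the decision's history can be reviewed later.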

Make things easy for people, use common interfaces: Web, Email, Wikis, News/Event formats.

Simulate a work-flow they are used to from their personal life: Send messages. Read how-to articles. Get News. Write letters to the editor. Publish their ideas.

Get buy-in from your organization by cultural popularity, make it another social engagement opportunity. Accountability & motivation will evolve naturally as people enjoy these more flexible and bottom-up driven opportunities to make an impact.

Don't have the computers try to do the complicated things humans can easily do, and don't have people do dull tasks that are much better performed by computers.




Tuesday, March 8, 2011

Ambiguity: Requirements vs. Solution

During the course of every Business Intelligence initiative, project teams will encounter the pervasive challenge of ambiguity.

There are various project management approaches to tackle ambiguity, like the agile movement of iterative discovery.

Since B.I. often needs to address business processes, in terms of replicating implied business logic and deriving related metrics, it is interesting to look at a common pattern in the B.I. tool/service acquisition process:






Now, consider that this pattern does not just apply to company/3rd-party vendor relations, but also to internal customer/provider relationships, such as a corporate marketing department requesting self-serve functionality from the internal B.I. team.

You see how at each connection point another layer of potential confusion gets introduced, like in the old children's game of "telephone".

Of course pragmatic requirements management could alleviate some of those layers of ambiguity.

Monday, March 7, 2011

The implications of Predictive Analytics

Now that the concept of "Business Intelligence" has been milked for what it's worth, the latest hype is "Predictive Analytics".

Consider this: the majority of stock trading transactions are conducted by algorithms, which leverage a form of predictive analytics. Some of the major market fluctuations in recent years have been attributed to these algorithms all "running for the exit" at the same time, as most of them use similar logic.

Also, have you ever used your GPS navigator with real-time traffic data to obtain an alternate route, supposedly so you could bypass the jam? Did you find out that every other modern car had the same technology and got the same "hints", leading to a new jam at the bypass route? Another form of Predictive Analytics at work, with undesired outcome.

So isn't the whole intent of a Business Intelligence or Analytics initiative to get ahead of the competition?
But what if everybody leverages the same set of methodologies and tools? Won't we have similar situations as described above?

Since businesses are run by people, for people, they are a social system. And social systems are reflexive, they adapt their behavior based on what they think might happen. Any broadly spread analytical predictions following similar patterns are bound to alter the very future they attempt to predict.

Sunday, March 6, 2011

Agile Documentation

A lot of agile practitioners have misunderstood the focus on delivering tangible artifacts as "documentation is bad". Consider that often a deliverable is not code, but information. Especially in the Business Intelligence space, just delivering data without context does not add any value. If you are in an organizational culture where initial ambiguity stimulates fruitful conversations, more power to you. But more often than not, stakeholders (solution developers & business customers) are not co-located geographically, or even time-wise. So we have to consider a communication medium that works asynchronously (non-real-time), which conceptually amounts to the same thing as "documentation".

My observation is that the aversion against documentation tends to be a developer/implementer-driven phenomenon. That software developers and IT pros can afford to dislike creating documentation will change once IT becomes a buyers' market. Much of this "real programmers don't comment their code" macho attitude stems from a time when demand for IT/software development talent outstripped supply. But the inherent TCO (total cost of ownership) of poorly maintainable code (lack of documentation being a part of that) will drive pragmatic changes of attitude in business, which may very well find the business-ignorant "rockstar" developers on a dying branch of the IT tree.

Still, another impulse against documentation stems from the wisdom that outdated documentation is worse than none whatsoever. I would personally disagree with that generalized statement, by virtue of "it depends" reasoning. If there is clear life-cycle metadata on documentation, like a "last changed" timestamp, then as the user I can gauge how reliable the content could possibly be. The point is that documentation is not supposed to replace common sense and critical thinking on the document consumer's side.

It is also curious: having worked in a couple of agile-inspired project environments with an aversion towards documentation, I still saw impromptu diagrams drafted on whiteboards and captured with smartphone cameras (later posted to a wiki or sent by email). Now what was that, if not documentation?

The stuff we deal with in business processes, rules, data models & relationships, data flows, and process flows is far too complex not to use documentation artifacts to communicate it across people. Along the philosophy of a picture being worth a thousand words, a diagram or infographic is a precious way to capture & communicate this complexity. That is documentation; not necessarily formal, but a form of documentation.

Let's not view this in black & white: "documentation is so waterfall" vs. "without documentation I can't start my work".

The balance is in treating the documentation artifacts with the same agile spirit as our project tracking & software development. Iterate, increment, track status, review, improve, abandon if irrelevant (minimize work done).

This is as easy as maintaining "last modified" timestamps or "edited by" tagging, which wikis already do for you. Then your organization can agree on what constitutes "stale content", and to what extent "stale" implies "obsolete".
Just don't simply delete what you deem obsolete. Put that content into an Archive folder (or take "obsolete" tagged content out of your operational views/filters), so you can go back when the need arises to understand how a system evolved. If you have version tracking activated in your Wiki, you can cover change control for audit purposes that way, with even less overhead.
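For instance, here is a minimal sketch of the "stale content" idea, assuming a hypothetical list of wiki pages with their last-modified timestamps; the page names and the one-year threshold are made up for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical wiki page metadata: (title, last_modified)
pages = [
    ("ETL load schedule", datetime(2011, 2, 20)),
    ("Legacy KPI definitions", datetime(2009, 6, 1)),
    ("Dashboard how-to", datetime(2011, 3, 1)),
]

STALE_AFTER = timedelta(days=365)  # whatever the organization agrees constitutes "stale"
today = datetime(2011, 3, 6)

for title, last_modified in pages:
    if today - last_modified > STALE_AFTER:
        # Don't delete: tag/archive so history stays available for audits.
        print(f"ARCHIVE candidate: {title} (last modified {last_modified.date()})")
    else:
        print(f"Current: {title}")
```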

Documentation? Be pragmatic, and it won't contradict the agile spirit!

Solution Requirements Mismatch

Impedance mismatch is an issue that occurs when data models of different paradigms (e.g. object-oriented vs. relational) have to work together. They may conceptually contain the same information, but the way each paradigm represents the data (its model) does not translate well to the other.
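To make the analogy concrete, here is a small, purely illustrative sketch of the same information held as a nested object versus flat relational rows; the entities and values are invented:

```python
# Object-oriented view: a customer object owns its orders as a nested collection.
customer_object = {
    "customer_id": 42,
    "name": "Acme Corp",
    "orders": [
        {"order_id": 1, "amount": 250.0},
        {"order_id": 2, "amount": 75.0},
    ],
}

# Relational view: the same information flattened into two tables linked by keys.
customers_table = [
    {"customer_id": 42, "name": "Acme Corp"},
]
orders_table = [
    {"order_id": 1, "customer_id": 42, "amount": 250.0},
    {"order_id": 2, "customer_id": 42, "amount": 75.0},
]
# Translating back and forth between the two shapes is where the "impedance mismatch" lives.
```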

A similar effect can be observed in business settings, during requirements management for business intelligence solutions.
  1. Let's say a business customer communicates some functional requirements to a B.I. solution provider. Whether that's an internal developer, or an external vendor doesn't matter in this context.
  2. And let's say the B.I. solution provider can confirm the requirements and make a proposal about the planned solution.
  3. Both parties agree to move forward with a project, purchase, or consulting engagement.

When all is implemented and ready for use, the business customer realizes that the provided solution doesn't do what it should.

What happened?   Most likely...
  • the business customer expressed their wants, not the business' actual needs, and/or
  • the solution provider confirmed what they can or want to deliver, not what should be delivered.


Cynics will counter this conclusion, which strangely is often very apparent, with "hindsight is always 20/20", or "shoulda-woulda-coulda".

But wait a minute, wasn't learning one of the key aspects of intelligence? If these patterns are apparent, why not learn from them, apply them with each initiative moving forward?

So let's examine the above scenarios a bit closer:

Why do business customers often ask for requirements about what they want, instead of what the business needs?

Why do solution providers commit to delivering not what the customer actually needs, but instead what the provider is able to deliver, or is interested in delivering?
  • Sales-driven: eagerness to please customer (catering to broad desire for instant gratification)
  • Competitive pressures: no time to explore details of the requirements, the competition might snatch up the contract instead
  • Focus on profits: lack of selectivity about whether the provider is actually able to deliver the solution, since they don't want to miss out on the opportunity of doing business. Unfortunately, the lack of capability (or willingness to grow into a promised deliverable) is often not honestly communicated to the customer, which results in the over-promised/under-delivered effect, when providers confuse their capability with their motivation of "we can do it!"
  • Cultural factors, professional image/pride: the right thing to do would often be to advise a customer that what they require is not what they really need. But in our business culture it is rarely appreciated to push back on customers and say no to setups that have a low chance of success. The customer/provider relationship often subliminally serves social validation more than pragmatic business conduct. This dynamic affects the compatibility of agreed-upon requirements vs. the actual solution need.

In summary, the above phenomena can be expressed in four scenarios:



  1. lacks willingness & lacks understanding (willingness might have potential if understanding is addressed; see scenario 3)
  2. lacks willingness, but understands (abandon all ye hope, a.k.a. "politics")
  3. is willing, but not understanding (needs consulting)
  4. is willing, and understanding (perfect setup for success!)
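Purely as an illustration, the four scenarios above can be read as a simple two-by-two matrix of willingness and understanding; a minimal sketch (the labels are my own shorthand for the list above):

```python
def assess_setup(is_willing: bool, understands: bool) -> str:
    """Map the willingness/understanding combination to one of the four scenarios."""
    if not is_willing and not understands:
        return "1: lacks willingness & understanding (revisit once understanding is addressed)"
    if not is_willing and understands:
        return "2: politics (abandon all hope)"
    if is_willing and not understands:
        return "3: needs consulting"
    return "4: perfect setup for success"

print(assess_setup(is_willing=True, understands=False))  # -> "3: needs consulting"
```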

More on requirements analysis here.

Saturday, March 5, 2011

The Participation Web

So there is this huge movement of marketeers to "engage" consumers.

But that is the very crux of the problem. We've been passive consumers of information ever since TV became pervasive. The whole premise of the Internet was to encourage exchange, not just be yet another consume-only feed.

How about we talk about the Participation Web?

That requires two aspects:
  1. People actually participating, not just consuming
  2. And producers offering services and content that truly engage, and who encourage potential participants to get involved.
How do we get there?

It requires a cultural shift, probably more evolutionary (smooth transition) than revolutionary (abandon all that was before). There will be parallel paths, overlap, for quite a while.

There will be those opposed in principle, or due to vested interests in the old way.

There will be those skeptical, critical thinkers (which we need more of). Who perhaps might come up with a third path, alternatives, middle grounds, integrative approaches (very valuable, but from a demographics perspective, probably a minority).

And there will be those who are lost, unsure how to approach it. This last group is a huge opportunity for educators, trainers, coaches, consultants, and media platform & content providers.

Just make sure you keep above points 1 and 2 in mind!

Friday, March 4, 2011

Twitter Intelligence 2.0

As Twitter has become one of the world's largest event-based databases/warehouses/clouds, it would be neglectful not to investigate it further in the realm of Business Intelligence and related analytics. I referred to this in an earlier article as Twitter Intelligence (not sure if I coined the term, but it sounds good to me :-)

My fascination with Twitter is that of an analyst of social phenomena, and not that of a Content Junkie.
Hence, my natural inclination is to dissect Twitter into its elementary pieces and examine how they work together. Now what's really fascinating to find is that Twitter is inherently simple. In fact, so simple that I run into many cynics who don't "get" what this hype is all about. At first glance it is easy to miss the results of the network effect, the hive mind, and the collective intelligence that has emerged from Twitter, which only become apparent after some exposure, if not involvement.

At a conceptual level, we have 4 basic concepts in Twitter:
  • The Social realm (human short messages with socializing intent)
  • The Content realm (referral to cool/interesting Web content, via URL links)
  • The Contributor role (people who initiate/author/share stuff)
  • The Consumer role (the listener/reader of that stuff)




These are 4 pretty straightforward concepts. Where it gets interesting is that each of those realms can overlap and intersect with any of the others, in any combination. This is how, out of 4 simple elements, we derive 13 different scenarios: the network effect!

This principle was originally applied by Robert Metcalfe in a technical context, which led to the explosion of the data network now called the Internet.

There is yet another dynamic to all this; after all, 13 scenarios is not a whole lot of combinations to account for complexity. Each of these 13 scenarios can affect any other combination, recursively, iteratively, incrementally, infinitely. What we have here is a complex system of self-motivated, but environmentally influenced, "agents".
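As a toy illustration of how quickly these interactions compound, here is a sketch that simply counts the possible pairwise influences among the 13 scenarios — a deliberately crude, Metcalfe-style proxy for the network effect, not anything from the original diagram:

```python
from itertools import combinations

scenarios = [f"scenario_{i}" for i in range(1, 14)]  # the 13 labeled scenarios

# Every scenario can affect every other one, so even at the pairwise level the
# number of possible interactions grows roughly with n*(n-1)/2.
pairwise_influences = list(combinations(scenarios, 2))
print(len(scenarios), "scenarios ->", len(pairwise_influences), "pairwise interactions")
# 13 scenarios -> 78 pairwise interactions, before any recursion over time
```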

The intersections and overlap areas have been labeled with letters and numbers, due to space constraints in the diagram. But here is the key:

 A  Initiator of a general social conversation/discussion (typically topic/interest related)
 B  Listener to general conversation initiated by others
 C  Reader/Watcher of Web content referred to by others
 D  Producer or Referrer of Web content, referring it to generic audience (by topic/interest/subject matter)


  1. Social promotion of produced content
  2. Conversation provides promotion to contributor, and feedback from consumer
  3. Consumer finds content by monitoring social buzz
  4. Contributor provides the content the consumer needs/likes
  5. Contributed content, as well as Contributor get social exposure to Consumer (2 way: promotion & feedback/rating)

Below is just another insight into the relationships between content and social exchanges:


The orange elements are the reason why Twitter is labeled as a "social network", even though it gains real momentum with its connection to other Web content.

Thursday, March 3, 2011

Requirements Analysis

A while back I gave a presentation on requirements analysis, specifically targeted at the Business Intelligence space. It might be interesting to review it here.


The above link contains the whole slide deck. The pictures below are just some key points (excerpts).






Content Market Dynamics & Relationships

As I was analyzing how the online content market works, I drafted up this nifty little diagram to understand the relationships and dynamics involved. Thought I'd share...



Twitter Intelligence

Twitter Contribution Value Chain

As an engaged Twitter observer & contributor (@AnalyticsToGo) with a background in Analytics, I am fascinated by the vastness and richness of collective intelligence evolving there.

It occurred to me that there is an inherent ranking of contributors, based on some fundamental dynamics.



The originality starts at the bottom, and as the stack rises, the focus shifts from authoring to distribution.

This graph illustrates the relationships in more detail:




And each color-coded Twitter activity type has its own purpose:


Red: implicitly rate content by spreading it


Orange: help categorize content by marking it up (not scientific, but in the spirit of crowd wisdom)


Yellow: raise visibility of content that has gone largely unnoticed (help with discovery)


Green:  help categorize Web content by marking it up (similar to Orange, which is more refining)


Cyan: Discover new Web content, promote it to broader audience


Blue: Create new Web content, and promote it
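A very rough sketch of how one might bucket tweets into the activity types above from simple text markers. The mapping of markers (retweet prefixes, hashtags, links) onto the color-coded types is my own assumption, not something defined in the original diagram:

```python
def classify_contribution(tweet: str, authored_content: bool = False) -> str:
    """Crude heuristic bucketing of a tweet into one of the activity types above."""
    has_link = "http://" in tweet or "https://" in tweet
    if tweet.startswith("RT "):
        return "spread / implicit rating"           # Red-style activity (assumed marker)
    if "#" in tweet and not has_link:
        return "mark up / categorize conversation"  # Orange-style activity (assumed)
    if has_link and authored_content:
        return "create and promote new content"     # Blue-style activity (assumed)
    if has_link:
        return "discover and promote content"       # Cyan-style activity (assumed)
    return "general social conversation"

print(classify_contribution("RT @someone great post on #analytics"))
```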



If you're looking for more detailed assessment of Twitterers' contributions, I found this free service quite useful: http://klout.com/

Supporting Decisions

Considering that the concept of Business Intelligence grew out of the earlier concept Decision Support, I would like to take a few moments to review what contributes to good decision making.

  • Goals, Purpose, Reason
  • Context, Scenario
  • Knowledge, Experience, Intuition (as a derivative of the latter)
  • Analysis

The fact that a decision is to be made implies that there is a scenario requiring some action. This action can either be a response to something, or a self-motivated initiative. Either way, it follows that there is a reason for the need to make a decision. Thus, there is a goal to be achieved.

In order to decide (between alternate paths) one must first be aware of the situation and available options. This requires knowledge, or even understanding, about what is the scenario, and what might be the options to address the scenario.

Knowledge without context is not very useful. Analysis helps us ponder how things fit together. Even decisions apparently driven by intuition have an underlying analysis component to them. Either from previous occurrences of similar problems, where we did more elaborate analysis, and upon repetition of the problem can reuse the outcome. Or by the instinctive combination of everything we ever learned, the so called "sixth sense", by confidence in "gut feelings".

The relevant aspects of analysis in support of an impending decision are around the dependencies, impact, and consequences of our planned action. Or in business terms: the cost, benefit, and risk of our decision.
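As a trivial illustration of weighing cost, benefit, and risk across options, here is a sketch; the options, numbers, and scoring formula are invented purely for the example:

```python
# Hypothetical decision options with rough cost, benefit, and risk estimates (risk as 0..1).
options = {
    "build in-house dashboard": {"cost": 80, "benefit": 120, "risk": 0.3},
    "buy packaged B.I. tool":   {"cost": 150, "benefit": 160, "risk": 0.1},
    "do nothing":               {"cost": 0, "benefit": 0, "risk": 0.0},
}

def score(option):
    """Naive expected value: benefit discounted by risk, minus cost."""
    return option["benefit"] * (1 - option["risk"]) - option["cost"]

best = max(options, key=lambda name: score(options[name]))
for name, o in options.items():
    print(f"{name}: score {score(o):.1f}")
print("Preferred option:", best)
```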

Think about this in the context of your Business Intelligence efforts. Doesn't this logic seem very straightforward? Shouldn't B.I. initiatives be more manageable along those lines?

Customer Intelligence versus Market Intelligence

It has been my mission from the inception of this blog to address the business side of Business Intelligence. It was not my intention to become yet another technology-driven blog about "how-to" details, as there are plenty of them about already. There are also plenty of forums and channels that work the marketing side. Not just products, but evangelizing of methodologies, tools, architectures. My aim is more along the lines "teach a person how to fish", not cut-and-paste silver bullets. If I can stir debate, make people think, then I have realized my mission.

People can get worked up on terminology and semantics, as if the goal was to create ambiguity, versus resolving it. There is...
  • Business Intelligence, 
  • Customer Intelligence, 
  • Operational Intelligence, 
  • Market Intelligence,
  • Competitive Intelligence 
  • Social Intelligence, etc.

Let's face it, if you are interested in "Business Intelligence", chances are you look at this from a company perspective, representing a business function. The term "Business" itself can be very ambiguous.

Business, as in...
  1. a company (legally, organizationally, transactionally)
  2. conducting commerce (buying/selling/trading goods, expertise or labor/services)
  3. value (transactional activity, "thank you for your business")
  4. mission ("we are in the business of...")
  5. participation ("this will put us out of business!")

There are way more scenarios than the few I listed, but my point is to emphasize that the perspective, the context  in which you consider "business" intelligence matters.

When you are involved with the broader concept of Business Intelligence, what is it that you're really after?
  • Operational insight in your business? -> Operational Intelligence
  • Insights into customer behavior? -> Customer Intelligence
  • Understanding of the competitive environment? -> Competitive Intelligence
  • Identifying opportunities -> Market Intelligence

It is important to distinguish, and be aware what your focus is with your B.I. inspired Analytics initiatives.
They are obviously all business-related, given your professional role and interest. Operations, customers, markets, competition are all business concepts, in a way a subset of Business Intelligence. Just like national intelligence services have multiple purposes, such as, for example, security, defense, economic, political.

So consider Business Intelligence just the umbrella term, but be mindful of the specific focus your B.I. initiatives are after. Don't let the marketing terms confuse your core interest.


One of the hottest topics in the Business Intelligence realm is so-called "Customer Intelligence". There is a strong desire, and incentive, for a business to understand its customers. Merely "knowing" a customer is a very traditional concept, and was a natural given throughout the ages when commerce was conducted only locally and in person. The customer "knowledge" challenge has become more pertinent in the age of logistical distribution of labor and the proliferation of online marketplaces. What a successful business really needs is a good "understanding" of its customer base.

Concepts like understanding customer behavior need to be carefully communicated, as from a customer perspective the term "customer behavior" hints at being monitored, raising privacy concerns. And even the perception of privacy issues hurts a business' reputation, which in itself is a competitive asset, just as intelligence is.

Customer Intelligence typically manifests itself in terms of understanding patterns of (see the sketch after this list):
  • acquisition of new customers (e.g. advertising, referrals)
  • conversion (turning shoppers into buyers)
  • retention (loyalty, repeat purchases, maintain relationships)
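Here is a minimal sketch of how the conversion and retention patterns above translate into simple metrics; the transaction counts are made up for illustration:

```python
# Hypothetical monthly figures for an online shop.
visitors = 12_000       # shoppers who browsed
buyers = 540            # shoppers who purchased at least once (conversion)
repeat_buyers = 180     # buyers who had also purchased in an earlier month (retention)
prior_customers = 900   # customer base carried over from earlier months

conversion_rate = buyers / visitors
retention_rate = repeat_buyers / prior_customers

print(f"Conversion: {conversion_rate:.1%}")  # 4.5%
print(f"Retention:  {retention_rate:.1%}")   # 20.0%
```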
So what is Market Intelligence then? It is not my goal here to impose meanings on terms. As always, I prefer to stimulate thoughts around the concepts, not so much the wording.

What means "market" anyway? 
In the context of our Business Intelligence scoped considerations here, it implies...
  • a potential customer base (specific interest/demand in products, services)
  • our company's specialty ("we serve the tools market")
  • opportunity (potential to produce and market something in an unproven but valuable space)
Customer Intelligence is very focused, and can be a subset of Market Intelligence. The majority of the effort in figuring out customer behavior is not so much in tracking what customers are doing, but why. For a provider of services or goods it is immensely valuable to understand what customers want. But even more so, what motivates them, what their underlying interest in a product or service is. Tracking behavior can be largely automated in the age of online transactions. But the underlying dynamics of what leads people to buy is a trickier subject, and a rewarding challenge for any business analyst and analytics initiative.

Customer Intelligence tends to have more tangible qualities and seems more straightforward to address, and Market Intelligence might be a natural progression after that. Also, depending on the maturity of your business, your interest in C.I. vs. M.I. can vary. For example, startup businesses might be more inclined to explore market opportunities. Established businesses would benefit more from maintaining good relations with their existing customer base, not just from a loyalty standpoint, but also as potential referrers of new business by word of mouth.

Again, the details are debatable, what matters more is the conceptual awareness, to help you focus your B.I. efforts along the proper lines.

Wednesday, March 2, 2011

Solution Patterns

Within the IT/software space, the concept of patterns has emerged as a practical tool for designing complex systems. Patterns help categorize solutions and avoid re-inventing the wheel as much as possible. They help keep the focus on the similarities between solutions, and isolate the differences that necessitate customized development. Pattern-derived solutions are deemed more efficient at implementation time as well as support time for two major reasons:
  1. Time & effort is saved in "trying out what works" (establishing proof-of-concept models)
  2. The common underlying patterns help stakeholders communicate with common terminology, decreasing the risk of miscommunication during requirements gathering and acceptance testing.

Between the various sets of patterns, one can generally distinguish between the following pattern types:
  • Structure (architecture)
  • Behavioral (process)
  • Data (content)
  • Usage (presentation, interaction)

Within the Business Intelligence space there have evolved a couple of architectural patterns that recur in a broad range of solutions.

Structural (data models):
  • The Operational Data Store (ODS) first proposed by Inmon
  • The Kimball-inspired Dimensional Model (Fact & Dimension Tables)
Behavioral (processing):
  • ETL, the process of extracting source data, transforming it, and loading it into a target data warehouse (sketched after these lists)
  • ELT, extracting and loading first, then transforming the data locally in the data warehouse
Data (content)
  • Time grain (date vs. timestamp)
  • Measure (quantitative business metric) 
  • Criteria (qualitative, for grouping, filtering, sorting)

Usage (presentation & interaction):
  • OLAP (multi-dimensional cubes, drill-down, drill-through/across, rollup)
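For example, here is a minimal sketch of the ETL pattern listed above, with an invented in-memory source and target; in practice a dedicated ETL tool or SQL would do this, the point is only to show the three steps:

```python
# Extract: pull raw rows from a (here: in-memory) source system.
source_rows = [
    {"order_id": "1", "amount": "19.99", "country": "us"},
    {"order_id": "2", "amount": "5.00",  "country": "de"},
]

# Transform: enforce types, standardize codes, derive conformed fields.
def transform(row):
    return {
        "order_id": int(row["order_id"]),
        "amount": float(row["amount"]),
        "country": row["country"].upper(),
    }

# Load: append conformed rows into the target warehouse table (here: a list).
warehouse_orders = []
warehouse_orders.extend(transform(r) for r in source_rows)

print(warehouse_orders)
```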

But these are just the means. These patterns help implement a technical infrastructure, but they do not solve the business challenge of having proper intelligence (actionable insight) at hand.

So what are the application patterns, from a business perspective? Business solutions that directly correlate to one of the key success factors, such as increasing organizational efficiency, increasing customer satisfaction, or managing risk. Some examples that come to mind:

Increase Customer Satisfaction:
  • Recommendation Engines (e.g. Amazon's suggestions)
  • Stock Market Alerts (stock price trends triggering threshold events, notifications)
Risk Management: 
  • Fraud Detection (trigger alerts on unusual credit card activity; see the sketch after this list)
Organizational Efficiency:
  • Web Site Path Analysis (identify most commonly taken browsing paths, correlate to purchases)
  • Search Engine Marketing (keyword spend vs. conversion events)
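As an illustration of the fraud-detection pattern above, here is a toy rule that flags transactions far outside a card's usual spending; the amounts and the z-score threshold are invented for the example:

```python
from statistics import mean, stdev

# Hypothetical recent transaction amounts for one credit card.
history = [23.5, 41.0, 18.2, 55.9, 30.0, 27.4, 44.1]
new_transaction = 480.0

def is_unusual(amount, past, z_threshold=3.0):
    """Flag amounts more than z_threshold standard deviations above the historical mean."""
    mu, sigma = mean(past), stdev(past)
    return sigma > 0 and (amount - mu) / sigma > z_threshold

if is_unusual(new_transaction, history):
    print(f"ALERT: {new_transaction} looks unusual for this card")
```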

The reason why this understanding is useful before getting into the details of a B.I. project is that there may already be a standard pattern for what you try to achieve. Perhaps not a canned solution, but something you can adapt and build on, instead of having to reinvent the wheel.

When you hire seasoned B.I. specialists, you inadvertently get some pattern knowledge, based on how they have solved similar challenges in other environments before. That's why it is helpful to focus not only on technical or tool expertise when staffing, but also to leverage candidates' industry backgrounds as applicable.

Solution patterns are useful to communicate and collaborate on complex business intelligence aspects across functional and technical roles, as well as across companies.

The Ethics of Data Mining

For many years companies have been aware of the risks of neglecting proper data management, and data loss is just one of them. The opposite, making data available to unintended audiences, can have consequences just as devastating as a database crash without a backup in place.

In the age of global real-time communication, such acts of neglect or malfeasance can have far reaching consequences, beyond even the original stake holders in the data affected.

But today I want to strike on another point: the ethical imperative of data mining social activity.

Ethics always seems a welcome topic for broad and controversial debate. The question for a business is how do ethics map to tangible business objectives and results?

In this age of real-time propagation of opinions, reputation spreads fast, and damage control seems to be more expensive than prevention. The broader masses out there seem to have a general consensus as to what's ethical. All you have to do is listen and adjust.

You will want to weigh the cost versus the benefits of indiscriminately pushing for data mining social activity.
Does your immediate gain on the bottom line justify the strategic risk to your reputation?

Consider that in dealing with people you affect what you measure. You are not counting beans anymore, you are dealing with conscious humans, who can and often will alter their behavior if they are observed.

It is tempting to be driven by what's technologically possible. And in recent years, data mining tools have matured quite nicely to be usable by more general business audiences instead of highly trained scientists. Let's not forget that just because something is possible doesn't mean it always makes good sense. Think ahead: mine not just your customers' behavior, but mine the trajectory of your company's actions. Apply predictive modeling to your organizational handling of the business and the resulting outcome.

The fact that you are interested in data mining leads to the conclusion that you have a sense that what you do today has an impact on the results of tomorrow. So it should be easy for you to weigh the impact of your data mining efforts on your target audience. After all, data mining is a means to an end. And there is no point in the means jeopardizing the end, now is there?

Do not wait for privacy laws to mature, forcing you into a reactive mode adjusting your legal practices, and company culture in dealing with your market's sensitivities. Be a thought leader, a role model to the industry. It may very well be one of your competitive advantages. The trust of your customers itself becomes a vital factor of your business' success!

Tuesday, March 1, 2011

Data Relevance in the Knowledge Life Cycle

The latest article in my blog about future trends is pertinent to the B.I. space, as the information life cycle is a vital part of B.I. and, in the bigger picture, feeds into trends such as knowledge management.

To this day the B.I. industry has no established and integrated method to track data relevance or manage the information life cycle, to the extent the digital media industry has pioneered (to enforce DRM). A lot of the existing data life-cycle concepts in B.I. come out of the top-down architected Data Warehousing space; where applied, they are heavily controlled by human processes and a variety of tools, with each component and its interfaces to the rest leaving room for mistakes, ultimately leading to questionable outcomes.

The basic understanding is here today, with a practice referred to as "data quality". One of the DQ metrics is "relevance". Today this determination is a rather manual process, always open to arbitrary decisions with little bearing on the real world. The people who have to determine data relevance do not have all the insight of the users of the data, and the overhead for a few specialists to find out the broader scope & usage of data is often intolerable within the typical speed of business operations. Hence the need for a crowd-sourced kind of rating model for the relevance of data. Of course, that will be subject to distortion based on perspective. But that multi-dimensional approach is nothing well-thought-out software could not manage.
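A minimal sketch of what such a crowd-sourced relevance rating could look like; the data element names, vote values, and naive averaging scheme are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical relevance votes (1-5) submitted by data consumers, per data element.
votes = [
    ("customer.email_opt_in", 5),
    ("customer.email_opt_in", 4),
    ("legacy.fax_number", 1),
    ("legacy.fax_number", 2),
    ("order.ship_date", 5),
]

ratings = defaultdict(list)
for element, score in votes:
    ratings[element].append(score)

# Average the crowd's ratings; low averages become candidates for archiving/review.
for element, scores in ratings.items():
    avg = sum(scores) / len(scores)
    status = "review for retirement" if avg < 2.5 else "keep"
    print(f"{element}: avg relevance {avg:.1f} -> {status}")
```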

If we had an intrinsic system, embedded in every aspect touching data, which would maintain life cycle intention and actual status at every stage of use, the knowledge maturity and currency described in my other article above could become common place.

I suspect that businesses will adopt any technology that proves itself on the broader Web. Perhaps this aspect will ultimately evolve out of Web 3.0 or the "Semantic Web"?

Scaling Out Agility in B.I. Infrastructures

One of the challenges in applying agile methodologies to the Business Intelligence space is that traditional architectures layer tiers by technical function, introducing tight coupling of technology that makes changes expensive.


This tightly coupled dependency makes an overall B.I. solution more susceptible to change and support risks. Changes are hard to implement without disturbing related functionality that wasn't originally intended to change. And because changes are often "under the hood", end users will have little understanding of delivery delays ("technical debt") or of having to go through another Q/A step that doesn't seem to provide any additional functionality to them.


Along the concepts established by Service-Oriented Architectures, if Business Intelligence infrastructure, tools, and interfaces become more service-oriented, rather than platform-driven, the agility of B.I. (the content, the results) can be increased.



The linking arrows at each generic level indicate different ways to integrate across functions.
For example, at the User Interface level, hyperlinks facilitate cross-functional relationships, such as drill-downs and drill-throughs across disciplines (as appropriate, driven by security rules).

On the Process level, each subject matter has its own functionally independent component that can be referred to by other components, or call out to other components, without those components having to be aware of, let alone dependent on, each other. The key point is loose coupling.

Lastly, the Data level would integrate by remote database linking, replication, or on-demand loading.
Data models should be more modular, and again loosely coupled, for example horizontally partitioned across hosts.
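As a rough sketch of the loose-coupling idea at the Process level: here a dashboard component depends only on a small interface, not on how or where the KPI service is implemented; the class and KPI names are invented for illustration:

```python
from abc import ABC, abstractmethod

class KpiService(ABC):
    """Minimal contract the dashboard needs; the implementation behind it can change freely."""
    @abstractmethod
    def get_kpi(self, name: str) -> float: ...

class LocalKpiService(KpiService):
    # Could just as well be a remote/web-service implementation; the caller doesn't care.
    def get_kpi(self, name: str) -> float:
        return {"conversion_rate": 0.045, "churn_rate": 0.02}.get(name, 0.0)

def render_dashboard(service: KpiService):
    # The dashboard is coupled only to the KpiService interface, not to a platform.
    print("Conversion:", service.get_kpi("conversion_rate"))

render_dashboard(LocalKpiService())
```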


Now, let's take note that tiers layered by technical functionality still make sense where system performance is more important than the agility of the business; a classical trade-off between speed and flexibility. New technologies and system architectures, such as Apache Hadoop, can help reconcile the two.

In the spirit of focusing on providing intelligence to the business, these technical/architectural concerns ultimately should be delegated to a cloud operated by service providers who focus on optimizing along those lines, so your company can focus on the pertinent business questions.