Agile Data Management: Because Change is the Only Constant


Jeff Zabin

Guest blogger Jeff Zabin oversees Gleanster’s strategic direction and day-to-day operations. A bestselling business author, influential research analyst and former technology marketing executive, he also serves as research director.


“Change is the only constant.” “Nothing stays the same.” “The only thing we can be sure of is change.”

These aphorisms apply to virtually all aspects of life and business. Data management is no exception. The fact is, current data requirements don’t always reflect future needs. And those future needs can seldom be anticipated, as companies evolve and as the business environment in which they operate changes over time.

For most companies, being able to sense and respond to emerging trends, competitive activity, brand devaluation threats and unforeseen economic, regulatory and technological discontinuities in a timely fashion can spell the difference between life and death. In a world where business is in a constant state of flux, continuous innovation and tactical outmaneuvering – turning on a dime, when necessary, to revamp a marketing campaign, launch a new product or shift some aspect of operational performance into a higher gear – may be a company’s only path to survival and prosperity.

Hence the need to build business reporting and analysis tools on top of a flexible data management foundation. Business users need to be able to dynamically adjust the parameters of data access and integrate new data sources in a timely fashion. How else can they hope to generate new custom reports quickly, on demand? Needless to say, relying solely on static reports derived from predefined data sources is a recipe for disaster.

Unfortunately, according to new Gleanster research, more than half of all companies today are forced to go outside their standard reporting and analysis procedures at least once a month to get the information they need to make informed decisions. Business users complain that they commonly have to knock on the door of the IT department multiple times before they finally get the information view they’re seeking. Meanwhile, the IT department readily admits that integrating a new data source into a standard report can be a long and cumbersome process, given the constraints of its current data management environment.

Agility has become a big focal point for improving business intelligence, and there’s a lot of talk these days about the importance of agility as it relates to the end user interface. These interfaces can dazzle, with visualization tools providing layered access to operational data in a dizzying array of shapes and configurations. But while these self-service dashboards can provide unprecedented value, what is commonly overlooked – and what arguably matters even more – is the structural integrity and flexibility of the underlying data infrastructure. How quickly the IT department – or, better yet, business users themselves – can turn critical business data into actionable insights that drive sound decisions is at least as much about the availability, consistency and accuracy of the data itself as it is about being able to create pretty pie charts and bar graphs on the fly.

In other words, the soup is only as good as the ingredients that go into it. It isn’t enough to refer to data initiatives that simply use new interface tools as “agile”. Nor is it enough to take traditional data warehousing tools and try to architect new ways to analyze and integrate data faster and more frequently. Top Performers are differentiated not by the capability of a visualization tool but by the speed with which new data can be integrated.

Collaboration between IT and business users is key. Traditional data management initiatives often skip business requirements and dive directly into a costly extract, transform and load (ETL) effort, with business users asking for everything but the kitchen sink from the get-go. By contrast, agile initiatives emphasize refining and evolving data integration requirements on an ongoing basis. It is this give-and-take mindset that presents novel challenges – and opportunities – for organizations willing to adopt a more mature approach to data management. A flexible data management architecture allows companies to avoid costly and unnecessary data modeling efforts by letting business users set the requirements. Companies that give business users the ability to define requirements can achieve significantly faster turnaround times in response to business change events.
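To make that distinction concrete, here is a minimal, purely illustrative sketch in Python. It is not a description of any particular vendor’s product, and every name and field in it is hypothetical. The idea is simply that each data source is described by a declarative mapping a business analyst could own, so onboarding another source becomes a configuration change rather than a fresh hand-coded ETL job.

    # Hypothetical sketch: a declarative source registry, so adding a new data
    # source is a configuration change rather than a new hand-coded ETL job.
    import csv
    import io

    # The canonical reporting schema the business has agreed on.
    CANONICAL_FIELDS = ["customer_id", "region", "revenue"]

    # Each source is described by a simple column mapping that a business
    # analyst could maintain; the loading logic below never changes.
    SOURCE_REGISTRY = {
        "crm_export": {"cust_no": "customer_id", "territory": "region", "amt": "revenue"},
        "web_orders": {"user": "customer_id", "geo": "region", "order_total": "revenue"},
    }

    def load_source(name, raw_csv):
        """Map a raw CSV extract onto the canonical schema via its registered mapping."""
        mapping = SOURCE_REGISTRY[name]
        rows = []
        for record in csv.DictReader(io.StringIO(raw_csv)):
            mapped = {mapping[col]: val for col, val in record.items() if col in mapping}
            # Keep only the canonical fields, in a predictable order.
            rows.append({field: mapped.get(field) for field in CANONICAL_FIELDS})
        return rows

    # Onboarding "web_orders" required only the registry entry above, not new code.
    sample = "user,geo,order_total\n42,EMEA,199.00\n"
    print(load_source("web_orders", sample))
    # [{'customer_id': '42', 'region': 'EMEA', 'revenue': '199.00'}]

The point is not the code itself but its shape: the requirements live in data that can be revised through give-and-take with the business, while the integration logic stays put.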

And this brings us back to that universal truth: the only constant is change. Today, most companies view their data warehouses as stodgy museums, treating them as collections of static artifacts to be viewed by a passive audience. Modern approaches to data management, on the other hand, view data as a living organism that requires constant care and attention in order to grow and reach its full potential.


Jeff joined us for a webinar in June where he discussed Gleanster’s report “How Agile Data Management Enables Faster, Smarter Business Decisions” along with fellow analyst Tim O’Brien. We think you’ll find it interesting.

Watch the webinar »
