People, Process & Technology Levers to Optimize Analytics

COVID-19 is changing analytics spending, with 97% of companies pursuing cloud-based analytics and collaboration technologies for the growing number of analytics professionals now working remotely.

Today, businesses expect analytics to be fast, scalable and easy to use. The key to achieving this is to empower business users and reduce reliance on centralized shared services. The primary challenge remains the oft-cited split of "80% of time spent cleansing and prepping data and 20% on actual analysis", which increases the lead time to deliver insights and reduces the ability to act quickly.

An optimized analytics architecture bridges this gap by reducing the degree of separation between data and the business. Gartner defines this as self-service analytics.

Here are the key levers for optimizing self-service analytics and improving time to decision.

Effective Implementation of a Cloud BI / Analytics Platform

Self-service analytics is driven by BI/analytics platforms that have integrated reporting capabilities and are often augmented with AI/ML. Most of these platforms support ease and scale through a large library of API and database connectors for data integration and preparation, in addition to a slick UI for building and visualizing analyses.

A user-friendly UI is necessary to source, merge, clean and aggregate data accurately into a business semantics layer. Some vendors offer a strong, business-friendly ETL framework, while others rely on partners or cloud ETL tools.
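A minimal sketch of that source-merge-clean-aggregate flow, assuming a Postgres source reachable via SQLAlchemy (connection string, table and column names are all hypothetical):

```python
import pandas as pd
from sqlalchemy import create_engine

# Connect to a source system (connection string is a placeholder)
engine = create_engine("postgresql://user:password@warehouse-host:5432/sales")

# Source: pull raw orders and a customer dimension (hypothetical tables)
orders = pd.read_sql("SELECT order_id, customer_id, amount, order_date FROM raw_orders", engine)
customers = pd.read_sql("SELECT customer_id, region FROM raw_customers", engine)

# Clean: drop duplicates and rows missing keys, coerce types
orders = orders.drop_duplicates("order_id").dropna(subset=["customer_id"])
orders["order_date"] = pd.to_datetime(orders["order_date"])

# Merge and aggregate into a business-friendly semantic table
revenue_by_region = (
    orders.merge(customers, on="customer_id", how="left")
          .groupby(["region", pd.Grouper(key="order_date", freq="M")])["amount"]
          .sum()
          .reset_index(name="monthly_revenue")
)

# Publish the curated table for BI tools to consume
revenue_by_region.to_sql("sem_monthly_revenue_by_region", engine,
                         if_exists="replace", index=False)
```

A good self-service platform hides exactly this kind of plumbing behind its UI; the point of the sketch is what the semantic layer must do, not how users should do it.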

Good UI design is important for user adoption; your tool of choice should be as easy to use as common office software. Choose a BI platform that requires the least vendor or IT department support each time users want to change data or visualizations.

BI vendors who claim to enable self-service through cloud platforms ought to adhere to these basic tenets. A low-touch, plug-and-play model with minimal professional services is a very attractive proposition to a business leader who has a challenge to solve and no appetite for long-winded conversations.

Ensuring Data Quality & Understanding

While BI platforms improve data reusability and intelligence within the platform, upstream challenges in sourcing and organizing information remain. Many data support requests are driven by ambiguous data definitions and unstable data quality.

Business users must have an easy way to submit data requests through a service desk for new asks, workarounds or fixes; these need to be added to the engineering backlog as platform feature requests.

Rigorous processes to catalog enterprise data and publish a searchable lexicon for business users are important. But cataloging information across multiple data systems is not easy and is often deprioritized in favor of other business needs. Crowdsourcing and syndicating the tribal knowledge gathered while munging data can improve data understanding.
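One lightweight way to seed such a lexicon before investing in a full catalog tool is a shared, searchable data dictionary. The sketch below uses hypothetical field names and keeps crowdsourced notes alongside official definitions:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str            # canonical field name
    definition: str      # agreed business definition
    source_system: str   # system of record
    notes: list = field(default_factory=list)  # crowdsourced tribal knowledge

catalog = {
    "net_revenue": CatalogEntry(
        "net_revenue",
        "Gross bookings minus refunds and discounts, in USD.",
        "billing_db",
    ),
}

def search(term: str):
    """Return entries whose name, definition, or notes mention the term."""
    term = term.lower()
    return [
        e for e in catalog.values()
        if term in e.name.lower()
        or term in e.definition.lower()
        or any(term in n.lower() for n in e.notes)
    ]

# An analyst who learns a quirk while munging data records it for everyone
catalog["net_revenue"].notes.append("Refunds post with a 2-day lag; month-end numbers shift.")
print(search("refund")[0].name)  # -> net_revenue
```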

It is not enough to bestow a jazzy analytics platform on business users and expect them to self-serve; data is the base ingredient of any analytical recipe and needs to be treated as a strategic asset.

Successful Data Lake & Virtualization Strategy

BI/analytics software is best deployed as "close" as possible to the data to avoid network latency and traffic surges. While data warehouses and data marts have been common in the past, newer technologies are improving operational efficiency and cost. Cloud platforms offer component services supporting various data ingestion use cases. These services are usually "stitched" together to spin up an end-to-end analytics service and feed polyglot cloud data stores, collectively called data lakes.

A data lake is a product and should be designed accordingly. Product owners need to address scalability, storage, schema, retention, caching, monitoring, security and query requirements to support self-service. Synchronizing data across multiple sources with a cloud BI platform while introducing new datasets/sources should require minimal re-engineering. Cloud data lakes are usually built in agile sprints; it is critical to establish a definition of 'done' in line with business expectations.
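As a concrete illustration of one such design decision, the sketch below lands a curated dataset in an S3-backed lake as date-partitioned Parquet. The bucket and path are hypothetical, and it assumes pandas with pyarrow and s3fs installed:

```python
import pandas as pd

# Hypothetical curated dataset arriving from an ingestion pipeline
df = pd.DataFrame({
    "event_date": ["2020-06-01", "2020-06-01", "2020-06-02"],
    "region": ["emea", "apac", "emea"],
    "sessions": [120, 85, 140],
})

# Partitioning by date keeps queries pruned and makes retention simple:
# expiring old data is just deleting old date partitions.
df.to_parquet(
    "s3://analytics-lake/curated/web_sessions/",  # hypothetical bucket/prefix
    engine="pyarrow",
    partition_cols=["event_date"],
    index=False,
)
```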

Data virtualization offers BI platforms a unified, abstracted and encapsulated view for analytics when data is spread across heterogeneous stores. It helps integrate data on demand without moving it into a new store, and is a useful approach in scenarios where creating and maintaining a database for the integrated data would be too costly.
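For example, a federated query engine such as Trino can join tables that live in different stores without copying either side. The sketch below assumes a running Trino cluster with hive and postgresql catalogs configured (all names hypothetical), queried through the trino Python client:

```python
from trino.dbapi import connect

# Connect to a federated query engine; host and catalog names are hypothetical
conn = connect(host="trino.internal", port=8080, user="analyst")
cur = conn.cursor()

# One SQL statement joins a data-lake table with an operational database,
# without materializing either side into a new store.
cur.execute("""
    SELECT c.region, SUM(o.amount) AS revenue
    FROM hive.curated.orders o
    JOIN postgresql.public.customers c
      ON o.customer_id = c.customer_id
    GROUP BY c.region
""")
for region, revenue in cur.fetchall():
    print(region, revenue)
```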

Teams & Collaboration

Colocation perhaps has a new meaning: anyone who can be available for work and on all communication channels during nominally accepted business hours.

A key disruption from COVID-19 is going to be largely distributed teams that minimize on-premises workforce density. BI/analytics integration with chat, project management and visual communication tools is going to be more important than ever before.
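As a small illustration of such integration, a BI job can push metric alerts straight into a chat channel. The sketch below uses a hypothetical incoming-webhook URL and the common webhook payload pattern used by tools like Slack:

```python
import requests

# Hypothetical incoming-webhook URL provisioned by the chat admin
WEBHOOK_URL = "https://hooks.example.com/services/T000/B000/XXXX"

def post_metric_alert(metric: str, value: float, threshold: float) -> None:
    """Notify the channel when a tracked metric crosses its threshold."""
    if value < threshold:
        return
    message = f"{metric} hit {value:,.0f} (threshold {threshold:,.0f})"
    resp = requests.post(WEBHOOK_URL, json={"text": message}, timeout=10)
    resp.raise_for_status()

post_metric_alert("Daily active users", 15200, 15000)
```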

The analytics collaboration platform from Chisel Analytics enables organizations to screen and recruit consultants, float projects, share data, assign tasks, review progress and interact with experts across multiple channels, making analytics solutions easier to deliver.

Interactive solutions with features like multi-kernel support, secured data connections, and deployment and approval workflows can shape the future of distributed analytics delivery.

Postscript

A well-architected, optimized analytics solution takes tactical maintenance and integration burdens off IT support and enables business users to self-serve. The top reasons for doing so are reducing analytics cost, enhancing business processes, improving customer experience and strengthening collaboration, all of which help line-of-business users serve themselves.
