BI Trends 2013: BI Gartner event recap
The key themes that emerged from the three-day Gartner BI summit can be summarized as:
- Big Data
- High performance/ In Memory Analytics
So clearly, lightning did not strike at the Gartner BI summit 2013. These subjects have been the buzz for some time and would have been any BI practitioner’s first guess. However, the sessions provided good insight into these trends.
The summit started with the keynote session “New information, New challenges, New solutions”. The session emphasized Descriptive, Diagnostic, Predictive and Prescriptive analytics, which set the tone for the rest of the conference: all the sessions at the event had an undertone of these four stages of analytics. Apart from the speaker sessions, the Gartner BI event offered a good opportunity to network with industry peers. For instance, the ‘network breakfast’ had tables organized by industry group so that people interested in a particular industry could gather and interact.
The key takeaways based on the speaker sessions and interactions with delegates are summarized below:
1) Variety in Big Data
Based on the sessions at the Gartner event, Big Data can be defined as data sets characterized by a variety of data types (including unstructured data), or data sets processed using the Hadoop MapReduce framework.
Big Data was presented not as an emerging concept, but as a ‘must’ for any information management solution. Every data warehouse architecture referenced in the conference included the Hadoop Distributed File System (HDFS) as an integral part.
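The MapReduce model mentioned above is simple at its core: a map phase emits key-value pairs, and a reduce phase aggregates values per key. As a rough illustration (a minimal single-machine sketch of the programming model, not Hadoop itself), here is the classic word-count example:

```python
from itertools import groupby

def map_phase(doc):
    # Emit a (word, 1) pair for every word in the document,
    # mirroring what a Hadoop mapper would output.
    return [(word, 1) for word in doc.lower().split()]

def reduce_phase(pairs):
    # Sort by key, group, and sum the counts per word,
    # mirroring the shuffle + reduce steps.
    pairs = sorted(pairs)
    return {key: sum(count for _, count in group)
            for key, group in groupby(pairs, key=lambda kv: kv[0])}

docs = ["big data complements the warehouse",
        "big data is unstructured data"]
mapped = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(mapped)
print(counts["data"])  # 3
```

In a real Hadoop cluster the same map and reduce logic runs in parallel across many nodes over data stored in HDFS; the framework handles the distribution, sorting and fault tolerance.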
Key learnings that came across on Big Data:
Big Data does not replace a data warehouse – Big Data technology does not replace a data warehouse but rather complements it. For example, the data warehouse holds internal data on the customer, while Big Data further enriches customer information by incorporating external unstructured data from secondary sources, social media, etc.
Velocity and Quality paradigm – Big Data enriches information with the ‘Variety’ and ‘Volume’ of data, and also delivers data at high ‘Velocity’, but it has a huge barrier to cross on data quality. A Big Data solution does not go through a mature data cleansing lifecycle and is not mature enough for audit or regulatory (SOX, Basel) reporting. This re-emphasizes that the data warehouse and Big Data need to co-exist.
Pick the right signal from the noise – The keynote session from Nate Silver highlighted significant challenges with analytics on large volumes of data, the most prominent being that with a lot of data, many more patterns and signals emerge. It becomes important to identify the false positives and false negatives. The risk is that with more and more data available, businesses are given more “opportunity to cherry-pick the results they want to see.”
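The false-positive risk above is easy to demonstrate: if you screen enough metrics against an outcome, some will look strongly correlated purely by chance. A small illustrative sketch (hypothetical data, generated at random):

```python
import random

random.seed(42)

# Simulate 200 purely random "metrics" observed over 30 periods.
# By construction, none has any real relationship to the outcome.
n_periods, n_metrics = 30, 200
outcome = [random.gauss(0, 1) for _ in range(n_periods)]
metrics = [[random.gauss(0, 1) for _ in range(n_periods)]
           for _ in range(n_metrics)]

def correlation(xs, ys):
    # Pearson correlation coefficient, computed from scratch.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Count metrics that appear "strongly" correlated with the outcome.
false_signals = sum(1 for m in metrics
                    if abs(correlation(m, outcome)) > 0.4)
print(false_signals)  # several spurious "signals" despite pure noise
```

With 200 candidate metrics, a handful will typically clear the threshold by chance alone, which is exactly the cherry-picking opportunity Silver warned about.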
The underlying message on Big Data at the summit was: Big Data is not about doing more of the same thing. It is about doing things differently.
Although Big Data was the focus area of the event, the sessions clearly came with caveats from a technology perspective. Big Data is still emerging and comes with many complexities: challenges with open-source stability, the high cost of supporting a commercial Big Data environment, the difficulty of building a strong ROI case to initiate a Big Data program, etc. It is also important to know that anyone implementing Big Data needs to keep learning constantly, as this field is evolving.
2) In-Memory Analytics is mainstream
If the speaker sessions gave a sense of ‘what next’, the booths at the exhibition center disclosed the real trend. There were very few exhibitors with Big Data solutions, but all the product exhibitors, from SAS and TIBCO to MS and SAP, showcased processing power and ‘in-memory analytics’. In-memory analytics is an approach to querying data while it resides in a computer’s random access memory (RAM), as opposed to querying data stored on physical disks. This delivers much higher BI performance than traditional disk-based BI analytics.
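The in-memory idea is easy to see in miniature with SQLite, which supports both file-backed and RAM-resident databases behind the same SQL interface (a toy sketch with made-up sales figures, not a vendor's engine):

```python
import sqlite3

# ":memory:" keeps the entire database in RAM; passing a file path
# instead would give a disk-backed database with the same SQL API.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 120.0), ("APAC", 80.0), ("EMEA", 50.0)])

# Aggregation queries never touch disk, which is the core
# performance argument for in-memory analytics engines.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'"
).fetchone()[0]
print(total)  # 170.0
conn.close()
```

Commercial in-memory analytics platforms add columnar storage, compression and parallel scans on top of this basic idea, but the RAM-versus-disk trade-off is the same.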
Product demos focused on advanced visualization along with high performance, and it was very difficult to differentiate the offerings. Some of the notable features these products offered were:
Social Media Analytics – An add-on providing sentiment analytics with real-time slice-and-dice capability (IBM, Tableau)
In-Memory Statistical Services – The ability to run advanced statistical modeling on a high-performance analytics server. Vendors claim models run in minutes or seconds, enabling frequent modeling iterations (SAS, TIBCO Spotfire).
Big Data visualization – This was probably the most common feature across product vendors. The differentiators were holding very high data volumes (terabytes) in memory (SAP, Oracle) and advanced visualization. Products like Tableau, Spotfire and QlikView focused on advanced visualization and provided the best visual constructs for high volumes of data. High-performance analytics also indicated a trend toward self-service BI, empowering end users to analyze data and create reports.
3) Importance of MDM
For 20 consecutive years now, MDM has been trending!
But for a change, the discussion this time was not about yet another MDM or data quality tool, and even the ‘structured process’ around MDM was given less importance. The key focus this year was on the behavior and culture change required to make the MDM process, along with tools and technology, more effective.
MDM and Data Quality are no longer seen as IT roles but as a combination of IT and business. Interestingly, this has given rise to a new designation: the Chief Data Officer (CDO). The CDO is not an IT role; it cuts across IT and business. The objective of a CDO is to treat an organization’s data as an asset and control it through effective governance and stewardship processes.
Another MDM challenge foreseen was Big Data itself. While Big Data offers tremendous opportunity, it also requires a lot of external data from social media and secondary sources, which will make MDM challenges even more demanding.
Apart from the above themes, another observation at the summit was the long queues outside sessions on ‘ROI/TCO for a Business Intelligence Program’ and ‘how to market BI internally in an organization’. It was evident that although BI and Analytics sit at number one on CIOs’ to-do lists, there is still a lot of pressure on BI managers to deliver tangible outputs and build a strong business case for Business Intelligence in their organizations.
Please feel free to share your thoughts on the blog.