Data and Analytics Trends for 2017

An increasingly complex global regulatory environment for capital markets, coupled with the slow pace of global economic growth, is shrinking margins and intensifying competition for both buy-side and sell-side firms. 2017 is expected to be a pivotal year for the adoption of big data, analytics and artificial intelligence in capital markets, as business leaders increasingly turn to digital technologies as a lever to reduce costs and improve service delivery.

Data-driven insights not only enable better strategic decisions for business leaders but also provide timely operational intelligence to automate business processes and deliver more targeted, personalized service for an improved customer experience.

The following are some key technology trends for data and analytics that we are likely to see in 2017:

1. We will see steady growth in Hadoop and NoSQL adoption as organizations increasingly look to modernize their data platforms and augment existing analytics capabilities.

A recent report from Forrester predicts that the big data market will grow at a nearly 13% annual rate over the next five years, with “non-relational” platforms like Hadoop and NoSQL growing nearly twice as fast: NoSQL is expected to grow by 25% and Hadoop by 32.9% annually over that period.

According to Gartner, by 2017 all leading operational DBMSs will offer multiple data models, relational and non-relational, in a single DBMS platform. Gartner’s Magic Quadrant for Operational Database Management Systems 2016 includes several NoSQL databases, such as MarkLogic and MongoDB, as challengers.

The availability of big data experts will remain scarce relative to rising demand. Expectations of big data professionals are very high given the evolving technology landscape: developers are expected to be proficient in Hadoop, Spark and a variety of NoSQL databases, and to know a plethora of programming languages such as Java, Scala, R and Python. A good understanding of data modeling, ETL and data warehousing concepts is also a prerequisite for many use cases.

2. Growing interest in data lakes will continue. Next-generation data lakes will leverage semantic technologies to evolve into “smart” data lakes.

Semantic models provide a common business vocabulary and meaning across diverse data sources, both structured and unstructured. Smart data lakes use semantic graph query engines that link and contextualize enterprise data. This enables improved data governance in the data lake and supports self-service data discovery and analytics. It can also streamline regulatory reporting by efficiently linking data from heterogeneous systems and schemas.
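As an illustration, the linking idea can be sketched in a few lines of Python: records from two systems with different schemas are mapped onto a shared vocabulary of (subject, predicate, object) triples and then queried as a single graph. All field names and sample records below are hypothetical, and real smart data lakes use RDF stores and SPARQL engines rather than plain lists.

```python
# Minimal sketch of semantic linking in a "smart" data lake (illustrative only).

# Source A: a trading system with its own schema
trades = [
    {"trade_id": "T1", "cpty_code": "CP-42", "notional": 5_000_000},
]

# Source B: a CRM system with a different schema for the same counterparty
crm_accounts = [
    {"account": "CP-42", "legal_name": "Acme Capital LLC"},
]

# Map both schemas onto a common business vocabulary of triples
triples = []
for t in trades:
    triples.append((t["trade_id"], "hasCounterparty", t["cpty_code"]))
    triples.append((t["trade_id"], "hasNotional", t["notional"]))
for a in crm_accounts:
    triples.append((a["account"], "hasLegalName", a["legal_name"]))

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None acts as a wildcard)."""
    return [(s, p, o) for (s, p, o) in triples
            if (subject is None or s == subject)
            and (predicate is None or p == predicate)
            and (obj is None or o == obj)]

# Link a trade to the counterparty's legal name across the two systems
cpty = query("T1", "hasCounterparty")[0][2]
legal_name = query(cpty, "hasLegalName")[0][2]
print(legal_name)  # Acme Capital LLC
```

The point of the sketch is that neither source system had to change: the common vocabulary is layered on top, so new sources can be linked in without up-front schema consolidation.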

3. Demand for data virtualization will grow as organizations build increasingly complex hybrid data management ecosystems, with Hadoop, NoSQL and traditional databases deployed on premises as well as in the cloud.

Data virtualization integrates new and old data sources through a unified data access layer, avoiding time-consuming data movement and ETL. It should be an integral part of the data strategy for financial services firms.
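A toy version of that unified access layer can be sketched as follows (a hypothetical design, not any specific product): one Python function answers a query by combining live data from an in-memory relational database and a simulated document store at query time, with no ETL step copying data into a warehouse first.

```python
# Toy data virtualization layer: federate two heterogeneous sources at query time.
import sqlite3

# Source 1: a traditional relational database
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE positions (symbol TEXT, qty INTEGER)")
db.executemany("INSERT INTO positions VALUES (?, ?)",
               [("AAPL", 100), ("MSFT", 50)])

# Source 2: a NoSQL-style document store (simulated here as a list of dicts)
reference_docs = [
    {"symbol": "AAPL", "sector": "Technology"},
    {"symbol": "MSFT", "sector": "Technology"},
]

def virtual_positions_with_sector():
    """Unified view: join live data from both sources on demand, no ETL."""
    sectors = {d["symbol"]: d["sector"] for d in reference_docs}
    rows = db.execute("SELECT symbol, qty FROM positions").fetchall()
    return [{"symbol": s, "qty": q, "sector": sectors.get(s)} for s, q in rows]

print(virtual_positions_with_sector())
```

Because the join happens at query time, consumers always see current data from both systems; the trade-off, as with real virtualization products, is that query latency depends on the slowest underlying source.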

Forrester predicts the data virtualization market will double, from $3.3 billion in 2015 to $6.7 billion by 2021.

4. Demand for advanced analytics and artificial intelligence (AI) based solutions is on the rise.

Applications based on advanced analytics will utilize structured as well as unstructured data, including alternative data such as emails, news, social media, and satellite and drone imagery, to derive market insights. AI will power autonomous business processes (such as surveillance and fraud detection) and conversational applications (such as chatbots).
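As a concrete, deliberately simple example of an autonomous surveillance process, the sketch below flags unusually large orders with a z-score outlier test. The order sizes and the threshold of 2 standard deviations are illustrative assumptions; production surveillance systems use far richer statistical and machine learning models.

```python
# Toy surveillance check: flag order sizes more than 2 standard deviations
# from the mean (illustrative data and threshold).
from statistics import mean, stdev

order_sizes = [100, 120, 95, 110, 105, 5000, 98, 115]  # hypothetical orders

mu, sigma = mean(order_sizes), stdev(order_sizes)
flagged = [x for x in order_sizes if abs(x - mu) / sigma > 2]
print(flagged)  # [5000]
```

Rules like this one are easy to automate end to end, which is why surveillance and fraud detection are among the first processes firms hand over to analytics-driven systems.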

According to Gartner, artificial intelligence (AI) and machine learning have reached a critical tipping point and will increasingly augment virtually every technology-enabled service, thing or application. By 2018, Gartner expects most of the world’s 200 largest companies to exploit intelligent apps and utilize the full toolkit of big data and analytics tools to refine their offers and improve the customer experience.

5. Capital markets firms are gradually opening up to the public cloud, particularly on the buy side. Key use cases include data ingestion, processing, storage and compute-intensive analytics.

Wall Street regulator FINRA has recently claimed to have built the biggest database in history on the AWS public cloud. FINRA is one of three shortlisted bidders to build the consolidated audit trail (CAT) national market system for the US SEC. Another bidder, FIS, has built a prototype CAT solution on Google Cloud that can process 25 billion stock market events in an hour.

A recent research publication from Celent, entitled 'The cloud comes of age in capital markets', states that attitudes toward the cloud have softened in the last 12-18 months.

Saurabh Banerjee

Guest Author. Saurabh is a Senior Specialist at Sapient Global Markets. He has 20 years of industry experience in technology consulting and product development in the USA and India. Saurabh’s domain experience includes Financial Services, Healthcare and Telecom. In his current role, he is responsible for developing data science capabilities and innovation efforts at Sapient Global Markets. Prior to Sapient, Saurabh worked at PwC as an Associate Director in the Strategy & Architecture practice. He is passionate about advanced analytics and its impact on businesses, the economy and our daily lives.
