Zubin Dowlaty, futurologist and head of innovation and development at Mu Sigma, the world’s largest dedicated decision sciences and data analytics company, picks his top predictions for 2017…

Until now, enterprises have tended to talk about data analytics, AI and Intelligent Systems (operational systems which include predictive analysis) in mainly technological terms. 2017 is the year when discussions in IT departments will shift focus from technology and theory to real-world implementation and practice. We’ll see advanced analytics systems rolled out more systematically, and more broadly, throughout enterprises.

However, a recent report from Mu Sigma found that many enterprises are still not exploiting analytics effectively, despite the fact that two-thirds of senior decision-makers recognise the potential positive impact of such systems on business growth.

Greater emphasis is still needed on applying analytics to real-world business issues. This means putting an end to arguments about the relative merits of Spark, Hadoop or MapReduce, and instead putting Intelligent Systems into production and managing them effectively.

This will demand new thinking from enterprise decision-makers, many of whom are still wedded to the analytical models of the 1980s, managed through multiple dashboards and complex spreadsheets. The growing band of digital natives, by contrast, understands how to bring analytics into the real world.

Enterprise data analysts could do worse than take a leaf out of the book of the traditional DevOps function, which has brought radical change to IT management and governance. A similar, deliberately designed discipline is needed for analytics: a so-called ‘AnalyticsOps’ function.

We’ll see the long-overdue transformation of analytics model development, algorithms and problem-solving, as enterprises bring to them the same rigour they already apply to the management of IT assets.
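
As a sketch of what that rigour might look like in practice, the Python snippet below gates a model behind an automated quality check before it can be promoted, much as code must pass tests in a CI pipeline. The function names and the accuracy threshold are illustrative assumptions, not part of any particular ‘AnalyticsOps’ toolchain.

```python
# A minimal sketch of an 'AnalyticsOps'-style promotion gate: a model
# must pass an automated accuracy check before going to production.
# validate_model and ACCURACY_THRESHOLD are hypothetical names chosen
# for illustration.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

ACCURACY_THRESHOLD = 0.90  # hypothetical promotion criterion

def validate_model(model, X_test, y_test):
    """Return True only if the model meets the agreed quality bar."""
    return model.score(X_test, y_test) >= ACCURACY_THRESHOLD

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

if validate_model(model, X_test, y_test):
    print("Model passes the gate: promote to production.")
else:
    print("Model rejected: keep the current production version.")
```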

Moving from the Cloud to the Edge

The inexorable shift to the Cloud is good for many things, but it is not a cure-all, especially as we enter the heavily-instrumented world of the Internet of Things (IoT). Many applications will be better served by ‘fog’ or ‘edge’ computing.

Around 21 billion devices will be connected to the Internet by 2020, according to Gartner. It makes no sense to carry all the data generated by these devices into the Cloud: doing so would consume large amounts of bandwidth and slow processing, especially when handling larger files such as video content.

Edge computing, where data is processed close to where it is generated, in devices such as routers, gateways or smart mobile devices, enables a more efficient, distributed approach, especially where real-time decision-making is needed.
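
To make the pattern concrete, here is a minimal Python sketch in which raw readings are handled and summarised on the device, so only a compact payload ever travels upstream. The sensor, the alarm threshold and the window size are simulated placeholders of our own invention.

```python
# Edge-computing sketch: process raw sensor readings locally, act on
# them in real time, and forward only a small summary upstream.
import random
import statistics

def read_sensor():
    """Stand-in for a real sensor read (e.g. temperature in degrees C)."""
    return 20.0 + random.gauss(0, 2)

def handle_locally(reading):
    """Real-time decision made at the edge, without a cloud round trip."""
    if reading > 25.0:  # hypothetical alarm threshold
        print(f"ALERT: reading {reading:.1f} exceeds threshold")

def summarise(window):
    """Reduce a window of raw readings to a compact summary."""
    return {"mean": statistics.mean(window), "max": max(window), "n": len(window)}

window = []
for _ in range(100):      # 100 raw readings stay on the device
    r = read_sensor()
    handle_locally(r)     # immediate local action
    window.append(r)

print("Send upstream:", summarise(window))  # one small payload, not 100
```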

Natural language programming offers transparency and repeatability

Natural language programming is coming to the fore again as enterprise notebooks such as Jupyter, RStudio and Zeppelin grow in popularity among the scientific computing and analytics communities.

Online notebooks are becoming the norm for top data scientists. They combine human-readable rich text with executable code, so tasks such as data analysis can be performed, documented and re-run within a single document.
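
By way of illustration, a cell in such a notebook might look like the Python snippet below. The small inline dataset is invented; in a notebook the resulting summary table would render directly beneath the cell, alongside the narrative text.

```python
# A typical notebook cell: a few lines of executable code whose output
# appears inline next to the surrounding prose.
import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "revenue": [120, 95, 134, 101],
})

# Group and summarise; in a notebook this table renders in place,
# visible to technical and non-technical readers alike.
print(sales.groupby("region")["revenue"].agg(["mean", "sum"]))
```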

The growing use of enterprise notebooks highlights the fact that data analytics is no longer the preserve of geeks, but a multi-disciplinary ‘team sport’. Such tools provide a collaborative platform that gives data scientists and non-technical contributors a seamless, transparent view of analytics projects as they progress. They also promote knowledge sharing and transfer, reducing the impact of team members leaving and improving repeatability.

Maker Culture encourages experimentation

Big businesses are waking up to the potential of the rapid, inexpensive innovation strategies of the Maker Culture, and are moving to replace traditionally complex, rigid and expensive research and development processes. Applied to analytics, the same concepts enable experimentation with Intelligent Systems.

2017 will see enterprises making more use of such approaches. They shorten the development cycle by allowing quick, cheap prototyping. Concepts can be tested and improved in a short time-frame, before they are either built into enterprise-grade solutions or discarded with minimal losses.

One example could be a custom aerospace component, 3D-printed to incorporate various sensors, a Raspberry Pi processor and Wi-Fi connectivity. Deployed on a real test rig, the component can start streaming data almost immediately so its performance can be evaluated. Modifications can be made and tested in very short cycles, and the resulting final design and accumulated knowledge provide the confidence to start production at scale.
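
As an illustrative sketch only, such a prototype might stream its readings over Wi-Fi using nothing more than the Python standard library. The endpoint URL, component identifier and sensor read here are all hypothetical placeholders; a real rig would read from actual sensors attached to the Raspberry Pi.

```python
# Prototype telemetry sketch: post one JSON reading per second to a
# collector. All identifiers and the endpoint are hypothetical.
import json
import random
import time
import urllib.request

ENDPOINT = "http://test-rig.example.com/telemetry"  # hypothetical collector

def read_vibration():
    """Placeholder for a real accelerometer read."""
    return random.gauss(0.0, 0.5)

for _ in range(10):
    payload = json.dumps({
        "component_id": "proto-001",   # hypothetical identifier
        "timestamp": time.time(),
        "vibration_g": read_vibration(),
    }).encode()
    req = urllib.request.Request(
        ENDPOINT, data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=2)
    except OSError:
        pass  # in a quick prototype, dropped samples are acceptable
    time.sleep(1)  # one reading per second
```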

In summary, 2017 will see a shake-up of traditional analytics: new organisational structures and processes, the adoption of lower-cost technologies, and more experimental innovation strategies, all designed to meet the needs of modern analytics.
