Every time I attend a conference, AI is on the agenda somewhere. I have to admit this is not a new topic as such, strictly not, but it is interesting to see the growing interest from business executives.
We have all been speaking about the rise of AI, APIs, Big Data, cloud technology, the Internet of Things (IoT), and sensor technologies for years. I myself started implementing AI solutions in 2014. Yet the topic of AI still attracts people in much the same way as it did eight years ago, with many of the same questions, uncertainties, and concerns. Why?
It probably relates to the fact that business executives still have difficulty understanding how to apply AI in their daily operations. It also relates to the fact that AI vendors have not managed to really explain how to use AI in a particular industry, in specific business operations, and what the associated benefits are. As a result, business cases are not straightforward, especially in the current business context, beyond quick cost-reduction initiatives such as chatbots.
The main question here, to quote McKinsey, is: “Artificial Intelligence is getting ready for business, but are businesses ready for Artificial Intelligence?” Let us say that the awareness is here; the understanding, not quite yet.
Recently, beyond AI as such, I have seen the rise of the “Data / AI as-a-Service” topic. By this I mean the combination of APIs, AI algorithms, and cloud technology, on demand, to support business operations. This is very interesting.
With AI-as-a-Service, I am talking about combining all those technologies to access information from assets like electrical networks, production chains, or electric vehicles. The objective is to provide added-value information on consumption, equipment, and so on to business units and partners.
Of course, AI has the potential to transform business processes to create new offerings, optimise operations, and enhance the customer experience. But, clearly, to move forward, we have to go beyond traditional reporting and chatbots.
Could AI-as-a-Service be the remedy? Autonomous, self-learning, reasoning systems on demand; mature use cases, already implemented and tested elsewhere; ready-to-use algorithms; the flexibility of API technology; the scalability of the cloud; and so on.
For quite some time, many business executives confessed that they lacked an AI strategy. Limited or “maverick” initiatives flourished in disparate units across organisations: a chatbot here, some image recognition there, without any real coordination or linkage. They lacked AI strategies, but they also lacked a vision of how to apply AI consistently across their value chain and operating model.
Today, organisations are becoming increasingly mature: they are designing and deploying AI strategies, identifying clear use cases across their value chain, and looking exhaustively at their operating model. Most of this traction comes from the financial services, manufacturing, retail, energy, and automotive industries. In many cases, it was (and still is) mainly driven by cost-reduction objectives.
Now, however, it looks like the main driver is progressively shifting towards the customer experience. In financial services, for instance, the combination of Open Banking, API, and cloud technologies, together with a banking licence, fuels the rise of Banking-as-a-Service offerings. The banking business model might be changing before our very eyes.
Another striking example I have seen is within the energy sector.
In this specific case, the main objective for the organisation was to respond to real-time needs:
– How to optimise electric vehicle charging or production cycles?
– How to optimise the energy consumption of photovoltaic panels or air conditioning for instance?
Those are relevant questions to address, especially in the current economic and geopolitical context.
The company decided to expose several data sets “as a Service” to internal units and partners in order to display energy and equipment dashboards and alerts: alarms on equipment, time series on the energy consumption of buildings, and the like.
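To make the alerting idea concrete, here is a minimal Python sketch of what an equipment alarm over a consumption time series could look like. The function name, sample data, and threshold are purely illustrative assumptions, not the organisation's actual implementation.

```python
# Hypothetical sketch: raise alarms when a building's energy
# consumption time series exceeds a threshold. All names and
# values below are illustrative.

def equipment_alarms(series, threshold):
    """Return the (timestamp, value) pairs that exceed the threshold."""
    return [(ts, value) for ts, value in series if value > threshold]

# Example: hourly electricity consumption of a building, in kWh
consumption = [("08:00", 40.0), ("09:00", 95.5), ("10:00", 52.3)]
print(equipment_alarms(consumption, threshold=80.0))  # [('09:00', 95.5)]
```

A real deployment would of course evaluate such rules continuously against streaming data, but the core of an alert is just this kind of simple predicate over a time series.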
In short, from a technological perspective, the effort started with setting up a data factory, where data engineers, data analysts, and data scientists, regardless of their business unit, could test and implement their use cases. Data on the consumption of water, electricity, and gas was aggregated and then integrated with information on the equipment and the topology of the buildings. A data transformation pipeline, connected via APIs to the IoT automation platform, now consolidates this real-time IoT data, standardises it, and refines it.
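To illustrate the standardise-and-enrich steps just described, here is a small Python sketch: raw IoT payloads arrive with mixed units and field names, get mapped onto a common schema, and are then joined with building-topology metadata. Every name, field, and value here is an assumption made for illustration, not the organisation's actual pipeline.

```python
from dataclasses import dataclass, asdict

# Illustrative topology metadata, keyed by building (assumed data)
TOPOLOGY = {"B1": {"floors": 4, "zone": "north"}}

@dataclass
class Reading:
    building_id: str
    utility: str       # "water" | "electricity" | "gas"
    value: float       # kWh for energy, m3 for water/gas
    timestamp: str

def standardise(raw: dict) -> Reading:
    """Map a raw device payload onto the common schema, normalising units."""
    value = raw["value"]
    if raw.get("unit") == "Wh":   # normalise Wh -> kWh
        value = value / 1000.0
    return Reading(
        building_id=raw["building"],
        utility=raw["type"],
        value=value,
        timestamp=raw["ts"],
    )

def enrich(reading: Reading) -> dict:
    """Join the standardised reading with building topology metadata."""
    meta = TOPOLOGY.get(reading.building_id, {})
    return {**asdict(reading), **meta}

record = enrich(standardise(
    {"building": "B1", "type": "electricity", "value": 2500,
     "unit": "Wh", "ts": "2022-10-01T12:00"}
))
print(record["value"], record["zone"])  # 2.5 north
```

In the project described above, these steps run continuously on streaming data behind the API-connected pipeline; the sketch only shows the shape of the transformation itself.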
This sounds very technical, I agree. But what I like best is the business perspective: the connection with an ecosystem of partners and with marketplaces, offering access to data sets, pipeline and algorithm libraries, and off-the-shelf connectors that allow partners to enrich their own offers. This is very interesting!
This was a super cool project. As a result, the organisation will learn from this initiative, learn from the data, understand how people use buildings, and ultimately optimise its services, creating new opportunities for improvement.
And while those business perspectives are important, the key question remains: how to monetise? That will be the topic of another post. Probably. Meanwhile, let’s follow the developments!