This is a contributed piece for the Computer Weekly Developer Network written by Paul Jones, head of technology UK&I at SAS.
SAS is known for its work in data analytics, data management and Artificial Intelligence (AI).
Jones is a regular speaker at industry events and a published author.
His current role is to help organisations face their data and AI challenges by adopting a transformative, enterprise-wide analytical strategy to derive value from their data.
APIs have fundamentally altered the landscape of analytics by making it quickly consumable. They are central to how we now deliver analytical business value in the cloud [and many other services besides] using lightweight, composable architecture principles, allowing us to seamlessly plug new functions such as analytics into existing business processes.
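To make that concrete, here is a minimal Python sketch of what plugging analytics into an existing business process can look like: the order-handling code calls a scoring API over HTTP rather than embedding the model itself. The endpoint URL, token and payload fields are hypothetical placeholders, not a real SAS interface.

```python
# Minimal sketch: call an analytics scoring API from existing business logic.
# The URL, credential and field names below are illustrative assumptions.
import requests

SCORING_URL = "https://analytics.example.com/api/v1/score"  # hypothetical endpoint
API_TOKEN = "REPLACE_ME"                                     # hypothetical credential

def assess_order_risk(order: dict) -> float:
    """Send order attributes to the scoring API and return a risk score."""
    response = requests.post(
        SCORING_URL,
        json={"amount": order["amount"], "country": order["country"]},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()["risk_score"]  # response field name is an assumption

# The surrounding business process stays unchanged; only one call is added.
if __name__ == "__main__":
    order = {"amount": 1250.0, "country": "GB"}
    print("Risk score:", assess_order_risk(order))
```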
Analytical APIs and their management are an evolving art.
The more we learn and progress, the more we can benefit from various capabilities across the entire analytical lifecycle. This includes everything from enabling data collection from new sources to modelling and communicating the results to consumers.
Open big data, often powered through APIs, is now ubiquitous, so being able to draw down data from more sources is a key need for new analytics models. Some of the most valuable insights are obtained by extracting information from new or different data sources.
APIs are essential for this and, as such, the analytics economy has evolved rapidly alongside the data API economy.
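As an illustration of drawing down data from a new, API-driven source, the sketch below pulls JSON records from a hypothetical open-data endpoint into a pandas dataframe ready for modelling. The URL, query parameters and response key are assumptions about whichever open API you happen to use.

```python
# Minimal sketch: fetch records from an open data API and land them
# in a dataframe for modelling. URL and response shape are hypothetical.
import requests
import pandas as pd

OPEN_DATA_URL = "https://data.example.org/api/observations"  # hypothetical source

def fetch_observations(params: dict) -> pd.DataFrame:
    """Pull JSON records from the open API and return them as a dataframe."""
    response = requests.get(OPEN_DATA_URL, params=params, timeout=10)
    response.raise_for_status()
    records = response.json()["results"]  # assumed response key
    return pd.DataFrame.from_records(records)

if __name__ == "__main__":
    df = fetch_observations({"region": "UK", "limit": 1000})
    print(df.head())
```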
All of this has meant that the role of data scientists has changed in tandem.
Allowing business users to affect a business process by pushing out their AI via an API adds agility and impact, driving value, efficiency and automation into those processes. It’s a real business game changer.
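A minimal sketch of what pushing out AI via an API can mean in practice: a previously trained model is wrapped in a small HTTP service so any business process can call it. The model file, feature names and choice of Flask are illustrative assumptions; a production deployment would add authentication, input validation and monitoring.

```python
# Minimal sketch: expose a trained model behind an HTTP API.
# The model file and feature names are hypothetical.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

with open("model.pkl", "rb") as f:   # hypothetical serialised model
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()
    features = [[payload["feature_a"], payload["feature_b"]]]  # assumed inputs
    prediction = model.predict(features)[0]                     # assumes a scikit-learn-style model
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```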
APIs have enabled us to ensure that less and less of the responsibility for deploying AI models falls on IT teams. Open, self-documenting APIs can be integrated into analytical platforms to publish to an operational framework, enabling end users to automatically ‘publish’ business decisions. Lightweight, Open Container Initiative (OCI) compliant Docker containers are created to execute those AI decisions and are published to a container registry. These containers are completely portable and API-centric, and their lightweight infrastructure requirements allow a business, even a global one, to run this IP anywhere. We can now easily launch and maintain multiple AI models, and do so at significant scale.
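The publish step described above can be sketched with the Docker SDK for Python: build an OCI-compliant image from a local Dockerfile and push it to a container registry, from where it can be pulled and run anywhere. The registry address, image tag and build context are illustrative assumptions rather than a specific SAS workflow.

```python
# Minimal sketch: build an OCI-compliant image and push it to a registry
# using the Docker SDK for Python. Registry and tag are hypothetical.
import docker

REGISTRY = "registry.example.com"  # hypothetical registry
IMAGE_TAG = f"{REGISTRY}/analytics/scoring-service:1.0.0"

def build_and_publish(context_dir: str = ".") -> None:
    """Build the container image from a local Dockerfile and push it."""
    client = docker.from_env()
    image, build_logs = client.images.build(path=context_dir, tag=IMAGE_TAG)
    for line in client.images.push(IMAGE_TAG, stream=True, decode=True):
        # Each line is a JSON progress message from the registry push.
        print(line.get("status", ""))

if __name__ == "__main__":
    build_and_publish()
```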
There’s still room for improvement and we’re all on a path of discovery. Today there is still relatively restricted integration with enterprise-scale data sources, but I predict that will change over time. We will continue to see even more efficiency and simplicity in deployment patterns, so that real-time interaction becomes increasingly sophisticated.

Deployment of APIs can also still be a reasonably technical process, especially the last step: deploying the container image normally requires an IT skill set. There is a clear opportunity to make that even easier. We’re also working on the metadata that sits within AI and APIs to ensure we’re at the forefront of accountability and compliance.

There is no question that APIs linked to AI are realising their huge potential in altering the way we collect, manage and use data. We’ve seen more and more citizen data scientists in businesses (and those just exploring) and the evolution of APIs is what has made this possible. Our platform, which is cloud native, is built on open APIs in many layers. But really, it’s what our customers build using these assets, and what can be deployed, that’s important.

Analysts talk about two-thirds of AI models never going live because IT teams don’t know how to deploy them. That’s all changing now, thanks to easy-to-use APIs. Over time, APIs will continue to evolve to allow even more real-time value. There’s still room for growth and the AI applications this will deliver could be astounding.