Artificial Intelligence has entered public awareness, even though most of the concepts it rests on remain opaque to the general public.
Still, we should not be caught unprepared by this revolution. The ability to generate rules or algorithms that search for insight in our data, combining advanced statistical models and computational power, allows us to make the most of the available sources of information.
It is not an easy step, as it means revising existing organizational structures and acquiring new skills. Let's explore the subject in depth with Stefano Gatti, Head of Data Analytics at Nexi and co-author, together with Alessandro Giaume, of the recent "#AI Expert. Architects of the future" (Franco Angeli Editore).
Q: Why should the algorithmic organizations of the future be structured in an antifragile way and have liquid skills?
A: Algorithmic companies are, as we wrote in #AI Expert, an evolution of data-driven companies. An important part of their business model is algorithm-based.
Uber is an example of an algorithmic company: the price and a specific driver are assigned based on the geolocation data of passengers and drivers, on the number of requests in an area, on weather conditions and many other parameters.
Uber's algorithms use all this data both to maximize real-time revenues and profitability of the company and to meet the needs of drivers and passengers. To be antifragile, this type of company must continuously adapt to the contexts and scenarios of the external world.
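To make the idea of an algorithm-based business model concrete, here is a toy sketch of demand-driven pricing of the kind described above. This is not Uber's actual algorithm; every function name, parameter, and coefficient is hypothetical, chosen only to illustrate how demand, supply, and external conditions can feed into a single price.

```python
def estimate_fare(distance_km: float,
                  active_requests: int,
                  available_drivers: int,
                  weather_multiplier: float = 1.0,
                  base_fare: float = 2.5,
                  per_km: float = 1.2) -> float:
    """Illustrative dynamic fare: base cost scaled by demand and conditions."""
    # Demand/supply ratio in the area drives the surge factor;
    # clamp it so prices never drop below base or grow without bound.
    ratio = active_requests / max(available_drivers, 1)
    surge = min(max(ratio, 1.0), 3.0)
    price = (base_fare + per_km * distance_km) * surge * weather_multiplier
    return round(price, 2)

# Quiet area: more drivers than requests, no surge applied.
print(estimate_fare(distance_km=10, active_requests=5, available_drivers=10))
# Busy area in bad weather: surge and weather multiplier both raise the price.
print(estimate_fare(distance_km=10, active_requests=30, available_drivers=10,
                    weather_multiplier=1.2))
```

An antifragile version of such a system would continuously re-estimate these coefficients from live data rather than hard-coding them, which is exactly why the surrounding team needs more than data scientists alone.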
The management of algorithms is no longer in the hands of data scientists alone. To adapt quickly to change, the skills of these teams of AI experts need to be very liquid, which means other professionals must support the data scientists within a project team.
As highlighted in our book, the virtual table of the protagonists of AI projects has expanded, embracing other profiles that are becoming more and more important, such as the data engineer, the data architect, the data subject expert and others.
Q: What is the potential of cloud technology in enabling Artificial Intelligence?
A: The cloud is a strong accelerator of AI-driven projects for two main reasons. The first is the innovation potential it brings into companies, from small startups to large corporations, by giving them access to the tools of cloud giants such as Amazon, Microsoft and Google.
A further benefit is access to all the improvements released every month, something that seems increasingly difficult to achieve even in the data centers of large companies. In this sense, we can speak of the cloud as a strongly democratizing factor in the use of AI-driven systems.
The second reason is the elasticity of computational power that the cloud provides. This matters most in the algorithm-training phase, which requires far more power than is needed once the project is up and running. That's why the cloud is an enabler and an accelerator, especially in the set-up phase.
Q: What skills and knowledge do you think a professional should have to call themselves an "AI expert"?
A: The scenario has changed a lot in the last 10 years. When Drew Conway defined the basic skills for successful data science projects in 2010, he recommended mathematical and statistical knowledge, computer science knowledge and domain-specific expertise.
In these 10 years, it is failures above all that have highlighted the importance of other hard skills, as well as of soft skills. In particular, the ability to communicate projects and their evolution to all stakeholders within the company has emerged as an important "must have" for a team of AI experts. Knowledge of the legal and security aspects of data management is also fundamental, not only in the initial phase but throughout the entire life cycle of these projects.
In our book, we also focused on other strategic aspects, so that these projects do not get lost in the inevitable proofs of concept but become real factors of success in bringing a company into its "algorithmic" phase.