
An increasing number of companies are opting for data-driven strategies and embarking on marathon Data Science and Artificial Intelligence projects, in the hope of reaping the benefits of new technologies and data. What is the best way to ensure the success of Data Science or Artificial Intelligence projects? Which players will help drive the change, and what are the project design and deployment steps to take over time?

Data Science and AI: how to properly scope your business projects?

IT Departments tasked with deploying these strategies have had a history of adopting extremely technocratic approaches to cope with the constant pressure exerted on them by evolving business needs and technologies. However, Heads of data are increasingly choosing to bet on disruptive approaches, breaking away from existing systems to establish a data sharing culture. They are hence designing, deploying and imposing reform projects which, on paper, seem flawless.

Today, we are witnessing the emergence of an increasing number of collective approaches which involve the end user as much as IT Department players in the design of the data strategy as well as its implementation. However, many Data Science and Artificial Intelligence projects face set-up issues, and some are doomed to fail right from the start. The impact is often drastic, with profound implications, and the sudden change can scare away the recipients of the service.

Three categories of Data Science and Artificial Intelligence projects

The uses of Data Science and Artificial Intelligence (AI) are countless. It is nonetheless possible to set three concrete objectives that rapidly generate ROI: process automation, insight analysis (to understand the existing system, identify patterns or predict events) and improvement of service levels and engagement with end users. Let us take a closer look at these with a few examples of concrete use cases.

1. Robotic Process Automation (RPA)

The most basic and common form of Artificial Intelligence (AI) is the use of machines to automate manual tasks. It is also the most advanced process automation technique, which can be used, for instance, to:

  • Transfer data and emails gathered from various information systems to update customer databases or records of identified opportunities
  • Reconcile and control invoices and orders in order to make accounting processes more robust
  • Read and analyse contracts and legal documents to extract possible constraints and anomalies

RPA is the easiest and cheapest form of AI to implement. It usually shows results in the short term and its ROI is exceptional. However, despite its ease of implementation and significant benefits, many companies still have some way to go in the field.
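To make the invoice and order reconciliation example above more concrete, here is a minimal sketch in Python with pandas. The file names and columns (order_id, supplier, amount) are illustrative assumptions rather than a prescribed format:

```python
# Minimal, illustrative sketch of rule-based invoice/order reconciliation.
# File names and column names are assumptions for the example.
import pandas as pd

orders = pd.read_csv("orders.csv")      # assumed columns: order_id, supplier, amount
invoices = pd.read_csv("invoices.csv")  # assumed columns: invoice_id, order_id, supplier, amount

# Match each invoice to its order and flag discrepancies with explicit rules.
merged = invoices.merge(orders, on="order_id", how="left", suffixes=("_invoice", "_order"))

anomalies = merged[
    merged["amount_order"].isna()                                          # invoice without a known order
    | (merged["amount_invoice"] - merged["amount_order"]).abs().gt(0.01)   # amount mismatch
    | (merged["supplier_invoice"] != merged["supplier_order"])             # supplier mismatch
]

anomalies.to_csv("invoices_to_review.csv", index=False)
print(f"{len(anomalies)} invoice(s) flagged for manual review")
```

Because every flag can be traced back to an explicit rule, this kind of automation is easy to audit, which is part of what makes its ROI so quick to demonstrate.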

2. Data analysis

The second most common and known form of AI and Data Science is information analysis and value extraction from vast amounts of data (Big Data) for descriptive and predictive purposes. For example:

  • Predictive maintenance for industry chains
  • Identifying potential fraud in banking, insurance or healthcare systems
  • Automating customer targeting during marketing campaigns
  • Providing insurers with more accurate models regarding potential customers

The most advanced Machine Learning techniques, known as Deep Learning, support functions such as voice recognition, image recognition and Natural Language Processing (NLP).
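As an illustration of this second category, the sketch below trains a basic fraud-detection classifier with scikit-learn. The transactions.csv file, its columns and the is_fraud label are hypothetical; a real project would add feature engineering, threshold tuning and monitoring:

```python
# Illustrative supervised fraud-detection baseline (hypothetical dataset and columns).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

transactions = pd.read_csv("transactions.csv")                       # assumed to contain an is_fraud label
X = pd.get_dummies(transactions.drop(columns="is_fraud")).fillna(0)  # one-hot encode, fill gaps
y = transactions["is_fraud"]

# Hold out a test set to estimate how well the model generalises.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```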

3. Engagement and decision-making

The third form involves engaging the end user (employee or customer) in an interaction process with the machine in order to collect information or provide a service. For example:

  • Smart conversational agents (chatbots), available 24/7, that provide a range of services depending on the bot’s level of learning
  • Recommendation engines that help customers select services and products
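As a toy illustration of the second bullet, the sketch below builds an item-based recommendation engine from a small user-product interaction matrix using cosine similarity; the products and interactions are invented for the example:

```python
# Toy item-based recommendation engine (invented products and interactions).
import numpy as np

# Rows = users, columns = products; 1 = purchased, 0 = not purchased.
interactions = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 1],
])
products = ["home insurance", "car insurance", "savings plan", "life insurance"]

# Cosine similarity between product columns.
norms = np.linalg.norm(interactions, axis=0)
similarity = (interactions.T @ interactions) / np.outer(norms, norms)
np.fill_diagonal(similarity, 0)  # a product should not recommend itself

def recommend(user_index, top_n=2):
    """Score unseen products by their similarity to what the user already owns."""
    owned = interactions[user_index]
    scores = similarity @ owned
    scores[owned == 1] = -1.0  # do not re-recommend products the user already has
    return [products[i] for i in np.argsort(scores)[::-1][:top_n]]

print(recommend(0))  # suggestions for the first user
```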

The range of capabilities available to organisations is extremely appealing. However, many obstacles of different kinds (organisational, financial, technological, strategic) stand in the way of setting up an Artificial Intelligence project.

How to integrate new Data Science and AI technologies into your project?

I will present below an integration framework for new Data Science and Artificial Intelligence technologies, which can be applied to any project regardless of its scale.

1. Understanding current technologies

Before launching Data Science projects, it is essential to assess the maturity, relevance and purpose of technological solutions. For example, RPA is generally based on a set of transparent rules, which makes its results easy to interpret. Deep Learning, by contrast, excels at learning from large volumes of data but produces results that remain difficult to interpret, turning decision-making into something of a black-box process. This is problematic for highly regulated sectors such as financial services, where any action taken must be explained and justified.
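As a rough illustration of that contrast, the sketch below compares a transparent, rule-based decision (typical of RPA) with a prediction from a small neural network trained on synthetic data. The refund rule and the dataset are purely illustrative:

```python
# Illustrative contrast between a transparent rule and a harder-to-explain learned model.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# A rule-based decision (typical of RPA) can be read and justified line by line.
def approve_refund(amount, invoice_matches_order):
    return invoice_matches_order and amount < 500  # every rejection traces back to a rule

# A learned model may capture subtler patterns, but the reasoning behind a single
# prediction is spread across thousands of weights, which is harder to justify to a regulator.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(50, 50), max_iter=1000, random_state=0).fit(X, y)

print(approve_refund(120, True))                     # True, and we can say exactly why
print(model.predict(X[:1])[0])                       # a decision...
print(sum(w.size for w in model.coefs_), "weights")  # ...with no single rule to point at
```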

Unfortunately, many organisations choose inadequate technologies to address interesting issues. This is the result of several factors: resistance to change, lack of awareness of new technologies and lack of skills. Becoming aware of, and understanding, the new technologies emerging on the market requires a continuous technology watch and dedicated teams to test and validate their effective operation on, and compatibility with, current platforms. Naturally, in the absence of a structure dedicated to this activity, companies can turn to well-known consultancy firms whose core business is Big Data and Data Science.

In any event, it is essential to surround yourself with the very best expertise, whether internal or external. In view of talent scarcity and the speed at which technology is evolving, several groups have decided to create IT, strategy and even business expertise centres. These centres pool talent and provide know-how that internal customers leverage to work on priority, complex and innovative subjects.

2. Creation of a project portfolio

The second step in the launch process of a Data Science or AI project is an assessment of the capacity to carry out projects, set the course and develop a roadmap based on needs and priority levels.

In my opinion, regardless of the sector it will be applied to, the approach is always the same and includes the following assessment tasks:

a. Identifying the Data Science or AI use case

The first assessment determines the entity or team most likely to benefit from the data or AI. Having worked in expertise centres, I have seen entities, pressured by the #BigData and #AI buzz and by fear of missing the Data Science bandwagon, rush into technically unfounded projects. It is thus important to clearly identify the right use case, one that provides a genuine “not-to-be-missed” opportunity.

b. Evaluating the Data Science or AI use case

The next step is to evaluate the use cases that may benefit from Big Data technologies and cognitive services. This step is as crucial as the previous one, since it lends weight to the project by determining its importance or criticality. In my opinion, there are no specific tasks to perform at this “pre-project” thinking stage. Nevertheless, it is still a good idea to ponder the following types of questions:

  • Is the defined use case aligned with the company’s strategy?
  • How important/critical is the Data Science or Artificial Intelligence use case?
  • What positive or negative impacts will the use case application have if it succeeds/fails?
  • What are the potential obstacles or impediments that may hinder deployment of the Data Science or AI use case?
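One possible way to make these questions actionable is a simple weighted scoring grid, as in the sketch below; the criteria, weights and example use cases are illustrative, not a prescribed methodology:

```python
# Illustrative weighted scoring grid for prioritising Data Science / AI use cases.
CRITERIA_WEIGHTS = {
    "strategic_alignment": 0.30,  # aligned with the company's strategy?
    "criticality": 0.25,          # how important/critical is the use case?
    "expected_impact": 0.25,      # positive impact if the use case succeeds
    "feasibility": 0.20,          # how few obstacles stand in the way of deployment
}

use_cases = {
    "Predictive maintenance": {"strategic_alignment": 4, "criticality": 5, "expected_impact": 4, "feasibility": 3},
    "Marketing chatbot":      {"strategic_alignment": 3, "criticality": 2, "expected_impact": 3, "feasibility": 4},
}

def score(ratings):
    """Weighted average of 1-5 ratings against the criteria above."""
    return sum(CRITERIA_WEIGHTS[criterion] * ratings[criterion] for criterion in CRITERIA_WEIGHTS)

for name, ratings in sorted(use_cases.items(), key=lambda item: score(item[1]), reverse=True):
    print(f"{name}: {score(ratings):.2f}")
```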

c. Identifying risks associated with the idea

It is then imperative to identify all barriers to the implementation of high added value data projects. For example:

  • Bottlenecks: in some cases, the lack of data is due to poor data-sharing processes. The knowledge does exist within the company but often, for organisational reasons or because of internal conflicts, the data or the opportunity is not brought to light.
  • Data enhancement: in other cases, raw data exists and is accessible. But transforming raw data into exploitable information requires cumbersome processes that involve several people and complex management rules. It makes sense to work on simplifying these processes and to delegate repetitive tasks to the machine.
  • Lack of governance strategy: opening the system to facilitate data sharing is vital. Data may be accessible and intelligible, but the fact that it is fragmented across silos can render it useless.

d. Assessing feasibility: defining scope and mobilising resources

After having identified and evaluated the opportunity, then having studied the associated risks, it is only natural to address the cost issue in terms of human and financial resources. Most Data projects are nowadays branded “Agile” and observe the usual rituals of this method.

Members of teams working on these types of projects often face requests and ideas originating from different sponsors. If the team is not the right size and/or is poorly organised, striking the balance necessary to meet customer needs in terms of cost and deadline is simply impossible, especially in an unstable environment where priorities change every day. It is therefore a good idea to clearly define the project’s scope and to allocate all the necessary resources to it.

e. Selecting the technology

After a long period, determinedly and patiently spent evaluating the identified opportunities, it is time to pass the torch to the “technical” teams to select the solution or solutions best able to address the questions raised. While the maturity of the market means there is no real debate regarding which Data Science technologies to use, AI technologies are for their part not yet widely accessible. The deployment of cognitive services, from chatbots to facial or voice recognition, remains the prerogative of pioneering companies in the field of Artificial Intelligence.

The definition of a technological roadmap is essential. It helps take into account the volatility of technical offerings driven by constant innovation. As a result, all technological choices should be constantly reviewed by asking yourself the “right” questions: compatibility of the solution with the technological platform already in place, ease of use, installation and maintenance costs…

3. Launch of prototypes

The gap between the company’s ambitions and its ability to deploy Data Science and AI projects may be quite significant (lack of resources, organisation, technology…) and as such, constitute a risk for the successful execution and completion of the project.

The “prototyping” mode is, in my opinion, the best approach to adopt, as it makes it possible to run experiments and demonstrate the benefit and added value of innovative ideas. If the company has several projects in the pipeline, it is a good idea to rely on a selection and prioritisation framework. Once the validated projects get the green light, I recommend opting for a “small-steps” policy and setting objectives that can be achieved quickly. This will help gain the still-fragile trust of service buyers. At this stage, it is advisable to resist the temptations of glory and refrain from grand speeches aimed at firing up users’ expectations; you should avoid the risk of disappointing them if the announced objectives are not met.

Unlike “traditional” Business Intelligence projects, where technical solutions are mature, expectations clear and results more often than not satisfactory, Data Science and AI projects have the peculiarity of having, at the beginning, rather “exploratory” objectives (ensuring that there is enough data to learn, assessing data quality, checking the convergence of models…).
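Those exploratory objectives can often be checked with a few lines of code at the very start of a PoC. The sketch below assumes a hypothetical extract (poc_dataset.csv with a binary target column) and simply measures volume, missing values and a cross-validated baseline:

```python
# Illustrative exploratory checks for a Data Science PoC (hypothetical file and columns).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("poc_dataset.csv")  # assumed extract provided for the PoC, with a binary "target"

# 1. Is there enough data to learn from?
print("rows:", len(df), "| positive rate:", round(df["target"].mean(), 3))

# 2. Data quality: share of missing values per column.
print(df.isna().mean().sort_values(ascending=False).head())

# 3. Does a simple model converge and beat a naive baseline?
X = pd.get_dummies(df.drop(columns="target")).fillna(0)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, df["target"], cv=5, scoring="roc_auc")
print("cross-validated AUC:", round(scores.mean(), 3))
```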

This distinctive characteristic is an excellent argument to promote a PoC (Proof of Concept) initiative, followed by a prototype (or a pilot) and a limited-scope service. Once the latter has been tested and validated, it can then be extended or even subjected to an in-depth redesign. But the user’s confidence is already won and this is half the battle!

4. Project scalability

Many companies, however, have successfully launched Data Science projects only to see them get stuck in the pilot phase. The large-scale deployment of projects (from pilot phase to production) is no easy task.

On the one hand, prototype scalability and implementation sometimes require that technology bricks be changed, as reliability and quality levels must meet the company’s operational constraints. Furthermore, large-scale deployment, opening the service to a wider population and integrating more data in real time require the adoption of a post-implementation technology roadmap. For instance, shifting from an on-premise platform to cloud technologies, or enhancing it with them, will increase flexibility and autonomy of use, as well as help control usage-based billing.

On the other hand, because the implementation of Data Science and AI projects usually requires integration with existing production systems, there is a need for organisation, well-defined responsibilities and a communication plan.

Data Science or AI projects: opt for an incremental approach

We have thus seen that the best approach for setting up Data Science or AI projects is an incremental one. It places the voice of the end user at its heart and is supported by the IT Department, which plays a decisive role in the choice of technological tools and their popularisation amongst service beneficiaries.

Opportunity identification, evaluation and confirmation remain cumbersome and costly phases, but they are nonetheless necessary to ensure better control over projects. The prototyping phase which precedes implementation is imperative in this context, as no matter how much control you have over data, it is always full of surprises!

The recommended framework is by no means revolutionary and its set-up is no longer a secret for large IT firms. I am sincerely convinced that some companies’ relentless determination to develop their data skills and establish a data culture will eventually pay off.

