Evaluating new technologies and ways of working can be a real challenge for organisations, especially against the backdrop of the changes we have all had to make to our daily working practices over the past 18 months.
Having built and implemented intelligent document processing solutions ourselves, we are often asked how best to assess this new technology and how to ensure you select the vendors that best fit your own needs and goals.
To achieve a positive evaluation and a successful commercial relationship, it’s vital that both parties understand what is important and what they want to get out of the early stages of evaluation. Here are three tips we’d recommend from our experience of being on both sides of the process.
1. It’s (almost) never just about the technology
An ongoing trend in enterprise AI and automation is that organisations are looking for solutions that fit effectively into their existing technology landscape. This can be to uplift an existing capability or process with lower risk, cost and timescale, or to help define best-of-breed solutions as part of a new digital or platform strategy.
With platforms generally working as part of an overall solution or process, it’s vital to factor in the people and process aspects of that solution. Failing to get this right early, and running a short evaluation focused purely on the technology, risks sourcing a solution that ‘demos well’ but leaves additional complexity and challenges unaddressed until later in implementation, and potentially delivers a lower return on investment than planned.
Whilst it may seem like more work on your part initially, experienced vendors will work closely with you to understand the context of your data, documents and broader business processes, both to shape how they conduct an evaluation and to show how the solution as a whole would work in practice, at an appropriate scale.
2. Have the right stakeholders involved
Whilst executive sponsorship is important, we recommend having the evaluation led by an individual who has an appreciation of the overall process, the data and the stakeholders. That person will be able to run an effective evaluation, define the right success criteria and align business goals with the vendor.
You will also likely achieve much better outcomes when both line-of-business and technology stakeholders are involved for anything beyond a short proof of concept. Particularly within financial services, as you move closer to a live pilot or anything involving real data, having someone from a legal / compliance background involved in the process can prevent unwelcome surprises later on and avoid delays around things like data availability.
3. Agree on what’s important
It sounds obvious, but it’s often taken for granted that everyone knows what they want and, as importantly, how they want to assess it. Central to this are clear and measurable success criteria. In our experience, many teams aren’t really sure what success should look like or how to assess it. A good vendor will ask you to outline your goals and criteria for success; an excellent one will work with you to define them and mitigate the largest risks as early in the process as possible. A clear view of success gives both you and the vendor clarity on the final objective and on the type of engagement that will deliver the right supporting evidence.
Wherever possible, we always recommend defining and agreeing up front quantitative KPIs that link clearly to the use case or business driver. Within intelligent document processing, whilst broad measures like accuracy can be a helpful guiding benchmark, the more useful and impactful measures in the productivity space are often those that relate directly to human time and manual intervention (error correction) and to speed of process. We’d recommend supporting these with qualitative measures, particularly if the goal is to establish a long-term relationship to develop new use cases and opportunities. In these cases, speed of iteration and innovation can play a huge part alongside core platforms and capabilities.
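To make this concrete, here is a minimal sketch of how productivity KPIs of this kind might be computed from a simple evaluation log. The record structure, field names and figures are our own illustrative assumptions, not a prescribed format or part of any particular platform.

```python
# Illustrative only: hypothetical fields for a per-document evaluation record.
from dataclasses import dataclass

@dataclass
class DocumentResult:
    fields_extracted: int        # fields the platform attempted on this document
    fields_corrected: int        # fields a reviewer had to fix afterwards
    processing_seconds: float    # automated end-to-end processing time
    baseline_minutes: float      # time the fully manual process would have taken

def productivity_kpis(results: list[DocumentResult]) -> dict:
    """Correction rate, straight-through rate and estimated hours saved."""
    total_fields = sum(r.fields_extracted for r in results)
    total_corrections = sum(r.fields_corrected for r in results)
    straight_through = sum(1 for r in results if r.fields_corrected == 0)
    automated_hours = sum(r.processing_seconds for r in results) / 3600
    manual_hours = sum(r.baseline_minutes for r in results) / 60
    return {
        "correction_rate": total_corrections / total_fields if total_fields else 0.0,
        "straight_through_rate": straight_through / len(results) if results else 0.0,
        "estimated_hours_saved": manual_hours - automated_hours,
    }

# Example with two hypothetical documents.
print(productivity_kpis([
    DocumentResult(fields_extracted=40, fields_corrected=2, processing_seconds=90, baseline_minutes=25),
    DocumentResult(fields_extracted=35, fields_corrected=0, processing_seconds=75, baseline_minutes=20),
]))
```

The point is not the specific formulas but that each KPI maps directly to a business driver (review effort, touchless processing, time saved), so the evaluation produces evidence you can act on.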
Our Approach
For those at an early stage, looking to explore or define a new opportunity, a smaller-scale proof of concept can be appropriate: data volumes and organisational risk are kept low, and both time and resources can be constrained.
Where scaled productivity of an existing use case is the focus, a proof of value over a longer period, involving larger volumes of data, can be a more effective product evaluation. Depending on the use case, this can be run effectively on historical or sanitised data or, if resources allow, via a champion-challenger approach that runs the platform alongside the existing process.
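As a rough illustration of the champion-challenger idea, the sketch below compares field-level outputs from the existing process (champion) and the candidate platform (challenger) on the same documents. The data structures, document IDs and field names are hypothetical assumptions for illustration only.

```python
# Illustrative champion-challenger comparison: both processes handle the same
# documents, and we measure how often their extracted field values agree.
def agreement_rate(champion: dict[str, dict[str, str]],
                   challenger: dict[str, dict[str, str]]) -> float:
    """Share of field values where the challenger matches the existing process."""
    matches, total = 0, 0
    for doc_id, champ_fields in champion.items():
        chall_fields = challenger.get(doc_id, {})
        for field, value in champ_fields.items():
            total += 1
            if chall_fields.get(field) == value:
                matches += 1
    return matches / total if total else 0.0

# Example: two documents, three fields each; one value disagrees.
champion_run = {"doc-1": {"issuer": "Acme", "date": "2021-06-30", "amount": "1,000"},
                "doc-2": {"issuer": "Globex", "date": "2021-07-01", "amount": "250"}}
challenger_run = {"doc-1": {"issuer": "Acme", "date": "2021-06-30", "amount": "1,000"},
                  "doc-2": {"issuer": "Globex", "date": "2021-07-01", "amount": "255"}}
print(agreement_rate(champion_run, challenger_run))  # 5 of 6 fields agree -> ~0.83
```

In a real evaluation, disagreements would then be reviewed to establish which process was correct, feeding back into the KPIs agreed up front.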
Which approach is more appropriate depends on both the use case and the stage of your decision process. An experienced vendor will be able to work with you to uncover what you need to conduct an effective evaluation and ultimately reach your final decision.
About Freyda
Freyda is a cloud-based solution helping financial institutions to process, interpret and analyse data from their documents. Founded by a team of financial professionals, technologists and PhDs, the Freyda platform leverages machine learning, natural language processing and artificial intelligence to help organisations and their workers free up time and resources for high-value, high-impact work. For a demo, visit us at www.freyda.io