In part one of this blog, we looked at two Bridgeall innovation stories that resulted in positive outcomes – one that got us off and running as a business, and one that became a genuine runaway success. We observed that there appeared to be a close correlation between the effort and diligence applied at the idea exploration and selection stages, and the extent of the success that followed.
In this second part of the blog, we look at two further Bridgeall innovation stories that had very different outcomes and learnings.
The one that we killed
Just over a year before divesting our SaaS library software spin-off business, we embarked on another product innovation initiative – how hard could it be? We now knew the formula for success in innovation, right?
Through an acquaintance we were alerted to an opportunity in the Finance sector. The financial regulator was putting pressure on UK banks to ensure that independent price verification was carried out on financial trades. Our acquaintance was a sector domain expert – he knew the pain points, could help us define the requirements and USPs of our product, QA the solution, and arrange the all-important initial sales meetings with key prospects.
A ‘cursory’ glance at the sector told us all that we needed to know about the size of the opportunity – there was a plethora of tier 1 and tier 2 banks across the UK who were our prime target customers. The regulator was ‘promising’ to increase pressure for compliance, creating the perfect conditions for an urgent need, and a ‘brief’ search revealed no competitive solutions from other vendors on the market (yet). The project was good to go!
We engaged our domain expert on a fixed term contract, re-assigned a full development team away from fee-paying work, and jumped in head-first with a complex project to build the pre-production version of our system. With a new website, associated marketing materials and 3 sales meetings in the calendar, we were ready to take the market by storm!
Unfortunately, this story had a very different ending.
The feedback from our key target sales meetings was consistent and clear – unbeknown to us, at that time all tier 1 and many tier 2 banks had a strict ‘build before buy’ policy when considering new software solutions. Their large IT departments were geared up, ready to deal with the vast array of projects that regularly reared their heads. Many had already initiated projects to build their own versions of our solution. Our likelihood of winning our key identified targets was practically nil.
Our domain expert quickly departed on the news that the journey was going to be much more arduous than originally envisaged, and we were left with a nice piece of software without a home. Six months on from the initial idea, with considerable energy, financial cost and lost opportunity cost incurred, both the project and our product were dead. So, what went wrong?
Reflecting back, it is painfully clear that we disregarded pretty much everything that we had learned about a diligent innovation process – ignoring the basics of idea validation from the “Triangle of Innovation Success: Desirability, Feasibility and Viability”.
What made us take this approach? Ultimately, we convinced ourselves that, since we had such a small window of opportunity to get our product to the market ahead of anyone else, there was no time for due process to be followed.
What activities and uncertainties should have been explored before committing to the project?
- Idea Brainstorming: An ideation phase never formed part of our approach. We suffered a severe case of ‘innovation blindness’, being so consumed with the potential size of the opportunity that was presented to us. When people have few ideas (or worse, just one as in this case), it is more likely that they hold onto those that they do have in the hope that they can make them work.
- Desirability – Product & Vendor: Would the customer agree that our solution solved a current or future problem? Would our solution be an exciting and attractive means of solving that problem? Assumptions are dangerous. We didn’t confirm our theories with a single target customer before embarking on the build, but instead took the advice of our domain expert as ‘gospel’.
- Desirability – Regulatory: Our solution was designed and built to meet what we believed was a pending regulatory requirement that would place real pressure on banks. Were the regulations likely to be mandatory, or simply advisory? Could banks potentially meet the new regulations with light-touch process changes rather than new IT solutions?
- Feasibility – Organisational (Customer): What level of influence did the banks’ IT departments exert on software procurement decisions? What was the general IT policy in banks on software build versus buy? Would a cloud solution be deemed too high-risk by the banks’ IT security teams?
- Feasibility – Organisational (Bridgeall): Financial Services at the time was a new domain to Bridgeall – our lack of depth of experience was a critical risk. Was our domain expert ‘invested enough’ in our venture? Could the project still be delivered if that person left our company? What were the risks of that happening, and how could we protect ourselves from those risks?
We often talk today of applying “smart kill” thinking in innovation initiatives – terminating early idea exploration work as quickly as possible following the identification of one or more ‘unresolvable’ uncertainties.
The unfortunate truth with “the one that we killed” is that it was anything but a smart kill – by taking the approach of not identifying, exploring and resolving key uncertainties, we embarked on a painful venture that was likely doomed from the outset.
The one we’re doing now!
The preceding use of the terms ‘uncertainties’ and ‘smart kills’ brings us round neatly to the “why of smartcrowds”.
Rolling forward to 2015 – and having built the IT services arm of Bridgeall up to around 40 employees – we decided it was time to start looking to invest in the development of a new SaaS product. The brief was simple: we wanted to find a new SaaS solution that would take the UK by storm. The solution needed to have a low barrier to entry (no complex technical integrations) but be “sticky” enough that clients would renew their subscriptions with us each year.
To help us achieve this new audacious goal, we launched a “New Bridgeall cloud product” innovation challenge for all of our employees, facilitated by a short-term subscription to an innovation platform that we’d trialled earlier.
As the challenge drew to a close, and with lots of ideas (40+) submitted from across the business, we whittled the list down to the 10 most interesting submissions. For those shortlisted ideas, we configured the innovation platform to enable a group of experts in the business to “vet” (using 1-10 scoring) the ideas against an extensive set of criteria.
What came back from the exercise surprised and disappointed us in equal measure – every idea scored so poorly across the vetting criteria that our confidence to proceed was left in tatters. Worse, the associated vetting ‘comments’ explained in great detail why the ideas could never work, only adding to our fears.
Following a period of reflection, a colleague who is a trusted innovation practitioner of many years (and who now works with us) commented during a separate engagement that “there is only one thing that is guaranteed to be consistent with all ideas – people will always find a way to knock them down”.
This was our lightbulb moment – in effect, the vetting process that the innovation platform naturally followed exposed a flaw that we have since observed in many innovation programmes: a tendency to over-complicate the initial idea selection process, which often kills potentially great ideas before they ever get a chance to get off the ground.
Looking back at our own innovation learnings over the years, it became very clear that if we were going to utilise a software platform to help us manage our own innovation projects, it would need to fulfil 4 basic requirements (sketched in code after the list):
- Help us ‘spot’ and tag the ideas that look most exciting without the need for complex vetting;
- Facilitate rapid, flexible cycles of ‘exploration’ around the uncertainties that all high impact, breakthrough ideas naturally exhibit;
- Enable us to plan, assign and track the exploration work needed to resolve each uncertainty, positively or otherwise; and critically,
- Help us learn during each cycle of exploration. A key lesson from our previous innovation projects was that the learnings from each cycle often act as a catalyst for making adaptations, often small but sometimes significant, to the initial idea itself.
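To make these requirements a little more concrete, here is a minimal, hypothetical sketch (in Python) of the kind of model they imply – ideas carry tags rather than scores, uncertainties are explored and resolved, and learnings can adapt the idea itself. The class and field names are illustrative assumptions for this blog, not the actual smartcrowds design.

```python
# A minimal, hypothetical sketch of the workflow implied by the four requirements:
# ideas are tagged rather than scored, each idea carries a set of uncertainties,
# and repeated cycles of exploration resolve uncertainties and capture learnings
# that may adapt the idea itself. Names and fields are illustrative only.
from dataclasses import dataclass, field
from enum import Enum


class Resolution(Enum):
    OPEN = "open"
    RESOLVED_POSITIVE = "resolved_positive"
    RESOLVED_NEGATIVE = "resolved_negative"   # a candidate for a "smart kill"


@dataclass
class Uncertainty:
    question: str                       # e.g. "Do banks buy rather than build?"
    owner: str                          # who is assigned to explore it
    resolution: Resolution = Resolution.OPEN


@dataclass
class Learning:
    note: str
    prompted_adaptation: bool = False   # did this learning change the idea itself?


@dataclass
class Idea:
    title: str
    tags: list[str] = field(default_factory=list)   # "spotting" without complex vetting
    uncertainties: list[Uncertainty] = field(default_factory=list)
    learnings: list[Learning] = field(default_factory=list)

    def should_kill(self) -> bool:
        # Smart kill: stop exploring as soon as any uncertainty resolves negatively.
        return any(u.resolution is Resolution.RESOLVED_NEGATIVE
                   for u in self.uncertainties)
```

The should_kill check mirrors the “smart kill” thinking described earlier: exploration stops as soon as a single uncertainty proves unresolvable.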
Surprisingly, an exhaustive search for a platform that would meet these requirements returned zero results, and hence a new idea – entitled “A new approach to innovation management” – was added to our previous list of shortlisted ideas.
With some luck (or was it foresight?), we had previously developed a solution for our own internal use which facilitated the capture of end-of-project learnings from our project teams and customers, and the creation of action plans and assignment of tasks to track the work identified from the learnings. We were able to bring elements of this solution forward into our new product idea – which now had its own snazzy name – “smartcrowds”.
The why of smartcrowds
So, back to our original question – what was the “why of smartcrowds”?
In a nutshell, we know from personal experience that innovation can be difficult, but can bring immeasurable benefit to society.
Our Vision is to accelerate global innovation by helping organisations turn more ideas into more breakthrough change, more of the time.
In driving towards our Vision, all of our energies and focus go into the creation of ground-breaking products & services for organisational innovation which incorporate the best of what we’ve learned and continue to learn from our own innovation efforts.
Our solutions help companies of any shape or size identify more diverse & high impact ideas, break down the common barriers to innovation, and deliver repeatable positive outcomes from their innovation effort.