AI is just another biased team member in a blended team

Matt Shanks

Principal Product Designer & Author/Illustrator for children. I'm out to improve the quality of life for everyone who uses the things that I make.

Unless you’ve been living under a rock, you’ve likely heard the term AI, or Artificial Intelligence, a lot lately. Heralded as the next frontier in tech, the current writing about AI makes some pretty big promises. Find out more about those promises and why AI is just another biased team member in your blended team.

At Cogent, we work in blended teams for a reason. Product Managers, Designers and Engineers work closely with the Founder or someone from the business to build a great digital product. We work in this way because we know that without one of those team members, we’re going to build a more biased and less successful product.

The current writing about AI is making some pretty big promises, like that it’s going to bring on the next industrial revolution. Think mass layoffs, jobs vanishing into thin air while new jobs, ones we cannot even imagine yet, proliferate. It gives the sense that this will all happen overnight. Suddenly. Like we’re going to wake up one day and, boom, a robot took your job. We love a bit of drama, us humans. But in the same way that hiring only developers won’t build you a successful tech business, AI (or, more accurately, machine learning) can’t build anything worth scaling.

Parents, meet your new baby, AI

Humans learn about the world through experience, and artificial ‘brains’ are no different. In the same way that someone like me, a designer, will bring my own user-experience-biased view of the world to a problem and come up with a certain set of solutions that solve for that, so too will a developer bias theirs toward technology, or a product manager toward business outcomes.

We all live in bubbles. There’s evidence to suggest that these bubbles are only getting more pronounced. Our minds are trained on input from the world that we’ve accumulated throughout our lives. Our views on family, relationships, work, and leisure are all based on the very narrow experiences we’ve had. An individual human cannot experience everything, so we’ve evolved to work in blended teams to smooth out those biases.

Why do we think AI would be any different? Humans are to AI as parents are to children. We’re the ones who get to choose what we let our AI see and experience. And depending on the type of parent you are, you’ll hold different values and beliefs, and those will be translated, ever so subtly in some cases, into the way your algorithm learns.

When I was growing up, my parents played a lot of rock music. Neil Diamond, T-Rex, that sort of thing. As an adult, my understanding of rock music, what I like or dislike about it, is far deeper than my knowledge of, say, RnB or Hip-Hop. This in itself isn’t a bad thing. But if I were a robot deciding what all people want to listen to on the radio, I’d probably play mostly rock, the rock I thought was ‘good’. I wouldn’t even know where to start with other genres.
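The radio analogy can be sketched as a toy model. This is purely illustrative: the genre names, numbers, and function names are invented, and real recommender systems are far more sophisticated, but the failure mode is the same. A model trained on a skewed listening history fills the airwaves with the majority genre and starves everything else.

```python
from collections import Counter

# A toy "radio programmer" trained on one listener's history.
# The training data is heavily skewed toward rock -- the musical
# bubble the model grew up in. Genres and counts are made up.
listening_history = ["rock"] * 90 + ["rnb"] * 6 + ["hip-hop"] * 4

def train(history):
    """Learn genre 'preferences' as simple frequencies."""
    counts = Counter(history)
    total = sum(counts.values())
    return {genre: n / total for genre, n in counts.items()}

def recommend(model, slots=10):
    """Fill a playlist proportionally to the learned frequencies."""
    playlist = []
    for genre, weight in sorted(model.items(), key=lambda kv: -kv[1]):
        playlist += [genre] * round(weight * slots)
    return playlist[:slots]

model = train(listening_history)
print(recommend(model))  # nine rock slots, one RnB, no hip-hop at all
```

Nothing in the code is malicious; the bias comes entirely from what the ‘parent’ chose to put in the training data.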

Single-parenting doesn’t work for AI

Organisations are starting to give data scientists the reins over the AI and machine-learning realm, empowering them to choose the parameters by which we train algorithms: what data to use, what to leave out, and what data can be derived, and how. A data scientist would state their goal as something like, “I want to build the most accurate model possible and achieve the best outcome as I understand it.”

But the outcome for the data scientist isn’t always the best outcome for the business, or the technical platform, or the people who are using whatever the algorithm was designed for in the first place.

We acknowledge that, for children, being exposed to as many diverse experiences as possible is a great way to lead a more balanced life. As adults, we widely recognise travel – experiencing different foods, cultures, climates – as one of the best ways to learn and grow and become more empathetic.

It takes a village to raise an algorithm

So we come back to blended teams. There’s NO getting around this, not in the short-term. Algorithm design needs as much diversity in its creation as any other thing we’re building for humans at scale.

It takes a village to raise an algorithm. But then we have to hope that after that algorithm has lived a life and learned, it can return home to look after us – its ageing, less-than-ideal parents.

