Podcast Episode 15

Maximizing the Benefits of Data and AI

with Dr. Alex Antic, Managing Director of the Dr. Alex Antic Group

In this episode of Automating the Enterprise, host Dayle Hall interviews Dr. Alex Antic, the managing director of the Dr. Alex Antic Group. Dr. Antic is a well-respected leader in the field of data and analytics, and was recognized in 2021 as one of the top five analytics leaders in Australia.

Full Transcript

Dayle Hall:  

Hi, and welcome to our latest podcast for Automating the Enterprise. I’m your host, Dayle Hall, the CMO at SnapLogic. This podcast is designed to give organizations the insights and best practices on how to integrate, automate and transform the enterprise.

Our guest for today is a well-esteemed data and analytics leader. In fact, he was recognized in 2021 as one of the top five analytics leaders in Australia. His mission and dedication have always been for helping shape Australia’s future through the responsible use of data and analytics. Please welcome to the podcast, Dr. Alex Antic, the managing director of the Dr. Alex Antic Group. Alex, welcome to the podcast.

Alex Antic:

Thank you, Dayle. It’s an absolute pleasure to be speaking with you today.

Dayle Hall:

Yeah. Before we get started, Alex, I think it would be interesting for our listeners, because I'm not sure people will be fully aware of what you and the group do down there. So why don't we start with some background on your organization: what it is that you do, where you work, and how you actually came to this as the focus for your career.

Alex Antic:  

Sure. So what I do is I run a consulting, advisory and training business centered on data science and AI. I work with public and private sector organizations and also academia. So for me, it's great fun to be able to consult across all the sectors I've actually worked in in the past and to draw on that experience to help organizations align their use of data and analytics to drive strategic outcomes and really create impact and change.

The consulting work can be quite different across projects, and I enjoy the variability in the clients I work with. The advisory side can see me sitting on an advisory board one day, speaking with the C-suite, and the next day I could be working with people doing low-level coding, trying to get some insight into why a particular project isn't going in the direction it should be, based on the technology they're using or whatever problems they're facing.

And with academia, having been an academic before, I think the real key to the survival of academia long term is integration with industry, the public and private sectors as a whole, to make sure that both sides can work together. I've seen this fail at times. I've seen the different incentives on both ends make it very challenging. So it's a real pleasure to be able to try and make a difference with the groups I work with, and I thoroughly enjoy that.

Dayle Hall:

And in terms of your background, what you studied, your early career, did you always have an interest in data and analytics, or did you see an opportunity? Obviously, there's a very tech-heavy background to this. How did you get started?

Alex Antic:

So that was a very long time ago now. And I wish I could say I had some magical insight. Basically, from an early age, I had a real interest in mathematics and its ability to try and make sense of the world through logical thinking using various quantitative techniques and problem solving. And just as computers were becoming mainstream and the norm for us in our daily lives, I thought that studying both mathematics and computer science at university would be advantageous but also fun. I didn’t see exactly how advantageous it would become in terms of where we are these days with data science and AI. So that’s what really drove my transition from doing math and computing to pure math and applied mathematics, and then into data science after that.

The eventual transition to what we know as data science today felt completely natural, given its reliance on all the training I'd already had in math, stats and coding. And then over time, I just really developed this passion for sharing that knowledge with others and helping them make a difference using their data and analytics, which is becoming so much more accessible to people these days.

Dayle Hall:

Yeah. Well, like you, I feel like getting into this industry was a very long time ago for me, too. But if I look back to when we started talking about big data, and you look at what we have now around predictive analytics and the cloud data warehousing industry and so on, it's just amazing how data has become such a central part of pretty much everything. And what I'm really interested in, as you mentioned earlier, is the sectors you work in: education, public sector, private sector. People have similar thought processes around how to use data analytics. And obviously, the challenges are different, I get that. But do they look at it and go, okay, we want to do more with our data, but we just don't know where to start? Are the challenges similar? Or is it very different across those three sectors of education, public and private?

Alex Antic:

Great question. I think, in a nutshell, there's a lot of commonality. A lot of the same challenges are faced by people trying to understand: we're sitting on this great pile of data and we have some specific strategic outcomes we're aiming for, so how do we actually go from ingesting and storing this data to analyzing it, to then having some outcome that helps us drive better decision-making? Some people think it'll make decision-making easier. I don't think that's necessarily the case. I think using a data-driven paradigm really helps you make better decisions, assuming you've done everything in the best possible way.

So I think there’s a lot of common challenges and a lot of common pitfalls and mistakes that people make along the way. Yeah, we can discuss some of those. But it’s really about people understanding, how do I fundamentally align what I’m doing in data analytics to the strategic goals of my organization? When that breaks down, I think that’s when people can face problems, when they steer away from it. They get too caught up in the hype and the technology per se rather than thinking about it as a means to an end.

Dayle Hall:

And with the organizations you work with across different types of projects, do they come to you simply feeling they can do more? Or do they have specific use cases or problems they're trying to solve? I ask because over the course of this podcast, we've had a number of guests, and the thing I hear the most is that using AI to pull data in and make better decisions is usually more successful when they start with, we're trying to solve this use case or this business issue or this challenge. So is that consistent with what you hear? And do you advise organizations to really think about that upfront?

Alex Antic:  

100%. Success in this field is really driven by understanding what your business problem is, and then determining whether data and/or analytics is part of the solution, or whether it's politics and people, which is something different. Often, it's all mixed up together, so I sometimes feel like I'm doing data politics instead of data science. So yeah, I completely agree with that statement.

If a client comes to me and says, here's my data, what can you tell me about it? That's when I turn to them and say, we need to talk. It's more of an educational piece: let me help you understand what data science and AI is and isn't. And let's talk about your business problems. Don't talk to me about solutions and data. I just want to hear what your problems are. Then we can delve into whether the data and the technology you may have at your disposal actually help with that, or whether we need to rethink what you're doing, because I'm very agnostic about the technology I use. It's about developing solutions tailored to the client, depending on what they have access to or what they want to purchase, and then partnering with other organizations to deliver the best possible value. It's all driven by business problems. People should never think of the data and technology separately or on their own. It can't be thought of in an isolated way.

Dayle Hall:

Are any of those sectors further ahead? You would think that sometimes the tech companies, the people you advise there, may be further ahead because they're more used to dealing with new software and bringing in different processes. But do you find education is also advanced, or the government, because potentially they have, or would want to spend, more funding on it? Are any of them ahead of the others in using this kind of data and analytics?

Alex Antic:

Yes and no. It depends on what you mean by ahead. If they're using data and analytics to help solve their specific problems, then I think they're near the front of the pack, obviously. Each faces different challenges and is driven by different incentives. For the government, it's very much about social good. And they have strict frameworks and policies they need to work within, which can make things a bit slower and a bit more difficult. But that's understandable, given that as citizens, we have high expectations in terms of the data we'll share and how that data will be used. And there's strict regulation and legislation covering that. Even relatively simple things, like different agencies and departments sharing your information to deliver services and value to you, can be really difficult.

In the private sector, if you think about the big tech companies, Facebook, Apple, etc., from a technical perspective, they're probably ahead of everyone else given the vast amounts of data they have access to. So they can do things like deep learning at scale, whereas for many organizations, deep learning is not really at the forefront of what they're doing. They don't have enough data, or really the need, sometimes. It's often the simple solutions that deliver great value: automating and improving processes, increasing efficiency. That's where I think a lot of organizations are at the moment, trying to get that right, getting the fundamentals right, getting data governance right, having the right people in the right roles. That's where I find myself helping them much more than coming in and developing a large-scale, purely technical solution, which I greatly enjoy doing, but there's less need for that with many clients I work with, relative to just getting the foundations right and helping them understand how to actually succeed in this field.

Dayle Hall:

Right. So let's move on and talk a little bit about the organizations you deal with and how you advise them. What does a successful, sustainable data science program or project look like within an organization? And for these entities, whether companies or higher ed or governments, where do you advise them to start? What do their organizations look like?

Alex Antic:

Yeah, great point. I think one of the first things I like them to understand is what data science actually is. There's a lot of misinformation and misunderstanding out there. A lot of senior executives, in my experience, don't necessarily have the data literacy that I'd like to see in C-suite data leaders. I think that's still an area that's emerging.

Initially, I try to instill this notion of data science as really being the science of change. And there are two key elements there. There's the science part, which is this notion of exploration and uncertainty. And then there's change, which is really about transformational change in an organization. This can be very confronting for many organizations, because they focus on the idea that it's just technology driven: we'll just buy some software, bring in some data scientists, and suddenly the magic happens.

Quickly, they realize it's not really that simple. Especially when you're doing a long-term transformational change project in your organization, really shifting the dial to be truly data-driven, to ensure your decision-making is supported by evidence-based data and analytics, you need to commit to a long-term, strategic initiative around transformational change. And that's another thing many struggle to understand: it's not a quick fix, it's a long-term investment.

There are two barriers, which I call the two Rs, for companies that struggle. One is risk aversion. They're so scared, in many ways, of trying new things and dealing with uncertainty that they don't really invest in innovation in a true sense. The other R is resistance to change. They're so stuck in their old ways that, once again, it's fear holding them back. They don't really know how to move beyond that, how to shift perceptions on the use of data and create the right culture in their organizations.

So when it comes to advising them, I try to focus on what I call the three Ts: trust, technology and talking. The trust part is really around people and culture. It's having, first and foremost, the right leadership, such as at the chief data officer level or something similar. In most organizations I've worked with, where I've seen things go really well or really struggle, it's been because of senior leadership's support or lack thereof, and that can be a real deal breaker. I really think it's top down. If you're pushing from the bottom up, there's a huge journey ahead of you. I've seen this fail so many times, and I've been through it myself in the past. So having the support of senior leaders, I think, is absolutely paramount to success broadly. And part of that is having the right culture, which we can discuss in detail later.

That also hinges on them having the right data literacy: they understand what data is, they understand how to ask questions of their staff, and they understand, at least at a conceptual level, how to leverage existing technology to drive the strategic goals. And a lot of the people aspect is really about cross-disciplinary teams, not just having the traditional techie team sitting in a corner doing their own thing. They really have to be integrated into the whole business and be aware of how what they're doing will drive strategic success.

Dayle Hall:

That's very interesting. And I like the concept that it has to come from someone higher up in the organization who's going to push through things like resistance to change and risk aversion, because with any kind of change like this, you do need a champion. We talk about it when we're trying to close deals: who's the champion in your organization? That's very similar.

Is there a cultural challenge with that, too? Obviously, I'm a big believer that the culture of an organization comes from the leaders. That permeates down, and people resonate with it and take it forward. But beyond the leader, how much of the cultural aspect of the organization is a driving factor in, let's call it, future success? Even once you've advised someone that it will take a little longer, that it's not a quick fix and you won't get great results immediately, how does culture play a part in that?

Alex Antic:  

I'll drag up the quote often attributed to Peter Drucker, that culture eats strategy for breakfast. I think that is 100% relevant to the field of data and analytics.

Dayle Hall:

My CEO uses that, probably once a month. So he’s going to love this episode of the podcast.

Alex Antic:

That's good to hear, because too many times I've seen organizations focus only on their strategy. A strategy is absolutely paramount, especially a data strategy. And I always ask them, do you have a data strategy that supports your business strategy, so the two have a symbiotic relationship and are really intertwined? But culture doesn't come up often enough. It very rarely comes up. Sometimes someone will come to me and say, we know we have a poor data culture, can you help us fix it? And that's a great conversation to have. They've realized where there's a huge bottleneck and a huge opportunity to change things.

I truly believe, through experiences I've had as an employee and as a consultant, that culture can make or break success in this field, not just in how you invest and drive strategic outcomes through data science and analytics broadly, but in attracting and retaining the right people, which I'm seeing become difficult for many organizations in every sector through COVID and post-COVID. And I think culture is really what will get your staff behind what you're trying to do and how you're trying to deliver value to your customers, your clients, your citizens, whatever the case may be.

Having the right culture centered around data, helping people understand how to use data to make their jobs easier, more fun and, ultimately, more beneficial for the company or organization, is definitely worth investing in. And once again, those two points of failure, risk aversion and resistance to change, can really hold an organization back, because they don't realize just how important that cultural element is. Yet, as you'd imagine, it's one of the most difficult things to change. It's very challenging.

Dayle Hall:

Yeah, I can imagine. And like I said, I'm sure you see both sides: a very strong culture that can get behind it, and the other side where you think, ooh, this could be a problem. If you see an organization where you get the sense there's going to be resistance to change, or that the culture may not be ready for it, what do you generally advise them? What are the things you think they could do to prepare, or to be a little more open?

Alex Antic:

Yeah. In terms of the data culture specifically?

Dayle Hall:

Yeah.

Alex Antic:  

I think, to begin with, as we mentioned a moment ago, it's really about having senior leadership who back and embody that notion of a data-driven culture. So you either need that support or you need to try and shift their perception, which can be challenging. If there is some resistance, part of the answer is often data literacy training for them: helping them understand what data, analytics and AI really are, what they mean in their specific domain, what some of their peers are doing, and what the risks are if they don't really invest in it.

Because often, the questions I like to hear are: what does best practice look like in this field in my domain, what are some of my competitors, peers and collaborators doing, and how do I then do that in a way that really suits my business rather than just copying someone and investing blindly? Questions like that, to me, mean they've thought about this. They're asking really intelligent questions along the lines of, we want to change, how do we actually do that?

Beyond that, I discuss with them having a culture built on those notions of innovation and experimentation, the science part we talked about earlier, exploration: giving staff the freedom, tools and access to people that they need to explore and innovate, to find potential solutions they never would have thought of by integrating their business objectives with current and emerging technology. So they could be working with natural language processing in a way they've never thought of. Maybe they've realized they've got all these handwritten notes or textual information sitting there that they never truly explored, because humans can't really do that; they can't scale to get through everything in a feasible time.

Or they're looking at computer vision to tackle problems they'd never even considered. Ultimately, it comes down to what we've heard many times from the start-up world over the years: this notion of fail fast, fail cheap, but also learn heaps along the way. I think there's a lot of truth in that. It's giving your staff the ability to try things. And if something doesn't work, that's fine; don't label it as failure. That's when people become more resistant to innovating: they hold back, they don't ask as many questions, and that's bad. They need to be questioning the status quo and questioning why something hasn't been done.

Then it comes down to collaboration. People need to be collaborating. As I said earlier, you don't want data scientists and IT staff isolated from the business. They have to be very much integrated in what the business does. So you need to foster strong working relationships between different groups. Collaboration and diversity are so key in this field, especially when we talk later about responsible and ethical AI; people haven't really realized just how important it is to have diversity and inclusion in a complete sense.

Then there's this notion of sharing data, but also sharing information: what has failed, what has worked, what are the lessons learned. You want that shared beyond just one core team. I think those really are at the core of how to create the right culture.

Dayle Hall:

Yeah. You mentioned something interesting there about collaboration. And I don't want to delve into the pandemic and its impact on everyone's life; I think everyone's tired of talking about that. It was major, and I think we're still going through it. Life is very different. But when you talk about these kinds of projects, about making sure IT and the data science team collaborate with the rest of the organization, the business units, the executive team and so on, how have you seen that work progress or devolve through the pandemic? Has it made things harder for us, or has it opened up new opportunities? Because collaboration, for me, is key. And I think collaboration has been harder when you're not sat next to someone in the office. What have you seen in and around your business?

Alex Antic:

I've seen both sides. I've seen some people struggle with not having as much face-to-face time. I think there's still this prevailing expectation from senior executives that if they can't see you working, they have doubts about how hard you're working as a staff member. But really, the results speak for themselves in most organizations. And many I've worked with have realized that there are so many tools and capabilities available for collaborating virtually, in an online capacity, that it really shouldn't hold you back; it shouldn't make or break your business. It's really about how you support your staff and instill confidence and trust in them to do the job, while giving them the support structure they need. And that can vary from person to person: introverts and extroverts have different paradigms in terms of how they work.

So it's really about supporting the different needs of individuals, helping each group, if we want to segregate people into two groups like that, feel valued and have their needs met. And there's so much you can do through virtual platforms these days to promote that. It's less of an issue now. Coming out of COVID, I think many organizations are shifting to a part-time in-office basis, and that may be the ultimate way we work. But it's about helping the different groups work in the way that's best for them. That is really at the core of this, and it can vary from organization to organization. Just be in tune with what your staff need to be at their best and support that. I don't think it's much more complicated than that.

Dayle Hall:

Yeah. And given the wide variety of organizations you work with, have you seen that access to data and analytics has helped people be more productive? Has it helped employees be more successful, or actually more satisfied, because access to data lets them be more successful? I feel like I'm constantly recruiting. There are so many opportunities out there, and I think the war for talent got stronger during COVID, even though it seemed like people didn't want to move. I think people do want to move. And I just wonder whether this kind of data helps retain people. Are people going to move to jobs because they have better data and AI models, because they know they can be more successful?

Alex Antic:

That's a fantastic question, Dayle. I think what's important to me, to begin with, is getting all your staff, even those who don't work in data and analytics directly each day, excited about the data you hold as an organization and how it can be used to drive impact and change for you. That is really at the core: having people understand what data and analytics can do, what it's capable of, and how it can play a role, big or small, depending on their specific role and responsibilities.

And a part of that is giving them the freedom and responsibility to help turn data into actionable insights, and having them see that what they're doing is integrated into the broader strategic outcomes. That, I think, is absolutely vital. Part of that is this notion of democratizing data in your organization: getting it into the hands of everyone, to the point of what they need, and helping them start thinking about how they can better use data in their own roles and in the organization more directly.

And you're right, attracting and retaining key staff, especially in the tech field, is very challenging. I think it's becoming harder. There's a war for talent. And given how some organizations struggle with culture, as we spoke about earlier, and with various other elements, it can be hard to initially attract the right people. But once you've attracted them, how do you keep them long term, whatever long term is these days? I don't know. When I was starting my career, it was measured in years. Now it's measured in months. I'm not sure anymore.

Dayle Hall:

Sure. I agree with that.

Alex Antic:

You understand. I think a part of that is the senior data leader, not necessarily the CTO, but the mid-level data leader, who I think is actually at the crux of many of these organizations. They're the linchpin, because they're the ones who translate business problems into technical solutions. Ultimately, they traverse both sides, the business world and the technical world. I think it's one of the roles that can be the most difficult to fill successfully. But if you get that right, those people can make a huge difference.

So data leaders need to be able to attract top talent, and they need to be able to vet people. Something some clients will ask is: we need data scientists and senior leaders in the data and analytics field, but we don't have that background ourselves. We've never worked with or hired them. How do we know who to look for? A lot of people sound good on paper, but as we know, some people talk themselves up a bit too much. So how do we actually test how good they are? Having someone at that level who can bring in the expertise where you need it is absolutely paramount. And I always enjoy working with clients when they realize they don't have the skill set to do that and say, we need help with this. That's really important, because when they don't get it right, it can end in tears on both sides.

This senior data leader needs, I believe, to have tech credibility themselves. If staff members google this person, do they have some sort of presence? Do they have enough of a tech background that staff think, I actually want to work with this person, I can learn from them, they understand my world and what I do, they've had hands-on experience, they can help me? They can also sanity check, at a deeper technical level, what their staff are doing, provide career guidance and technical support, and help align what the staff are doing to the broad strategy of the organization so the staff feel valued.

One really important thing they do, which is often missed by very senior leaders or people who don't come from this field, is that they find the right problems to solve. Not just any problem: it has to be problems that have merit and a strategic outcome, and where they can quantify and show, look, you've invested in this capability, here's the value we've delivered for you.

And a part of that is being able to work closely with business leaders and senior leaders. They need to understand the world of business, whatever domain they work in. And they need to understand how to turn viable business problems into measurable technical outcomes. That, I think, is really key to success in this field: that mid-level layer where they straddle both sides. Yeah, I think that is absolutely paramount.

Dayle Hall:

Yeah. It feels like a minor point, but it's very important: finding the right problems to solve. And this goes back to the use cases and so on. I mean, if you've seen the latest MarTech landscape, there are 8,000 MarTech companies, and most of them probably say they do some level of AI or data-driven something. I think if you're going to use a technology or a process, or you're going to really do more with data, finding the right business problems to solve is the right first step.

And then there are people like you and others, and there's software and technology you can use to help get the most out of it, but always start with that business use case and make sure, like you said, that what it's solving for your organization is the right thing. And then if you've got the leader behind it, someone who understands and can inspire people, you're going to get more success out of your initiatives.

Alex Antic:

Exactly. And get the buy-in from senior executives, which can be hard for many organizations: having that ongoing funding to build from a proof of concept into a full-scale production solution. So yeah, definitely.

Dayle Hall:

Let's move on to the last topic, which is responsible AI, AI ethics and so on. While we've been talking about AI for a while, and about using data, the ethics piece, I think, is starting to get more discussion, for the right reasons. And I believe there's a lot people don't understand about what AI ethics actually means and why it's important. So before we get into specifics, how would you describe the principles of responsible AI and AI ethics to someone listening to this podcast who's thinking, oh, I've heard of that, but what does it mean?

Alex Antic:  

I'd keep it quite simple, to be honest, and that is to ensure that the human is at the center of your solution. If you're developing a system that leverages technology, it's about understanding what impact that can possibly have on the end user. You have to remember that once this technology is scaled up, you don't have one person making a decision that affects a small group of people; this could be one solution affecting thousands or millions of people. How do you make sure that's done in an ethical, fair, just manner?

And then depending on the domain, I'd probably speak about specifics that are related to them. But really, I think it's about fairness, or giving everyone, I guess, an equal standing in how a decision is made, and making sure that they have a right to question some of those decisions. They have some level of transparency into that, so they understand: this decision that was made about me, say, credit scoring or whatever it is that's often used, how was that made based on the information you store on me? And can you prove to me, basically, that it's a fair and just decision that's been made, rather than it being obscured in some way? I guess that's really at the core of it.

Dayle Hall:

Yeah. In one of the other podcasts that I've done, I was talking to an organization that was looking at human resources. And whilst they understood AI can help, if a vendor they were looking at could not explain how the algorithm and the AI work, where the data comes from and how they use it, not this marketing black box or it's-just-magic answer, they would not work with that vendor. The danger of just trusting a technology to do the right thing and collect data in the right way is just too much of a risk in all roles, but particularly in human resources. Like you said, it affects the individual.

So how do you protect against that? What are the things that organizations should think about when they're looking at these kinds of initiatives to make sure, for example, data is collected in the right way and used in the right way? How do you advise your clients?

Alex Antic:  

Great point. And there are, I guess, many different parts of that we can speak to, because it is a very important question. And I completely agree with their view of having some level of transparency and understanding in terms of how decisions are made.

I think it's important for the groups I work with to first ask themselves, why are you collecting the data in the first place? What data do you need? What will you use that data for immediately? And what purposes might you have down the line? And basically, if you were on the other end, how would you feel about that data being used for those ends? I mean, are you comfortable with that? Are there any potential issues or questions you have?

So to some groups I work with, I say, don't collect data just for the sake of it, not only from an ethical standpoint, but also because you need quality data that's fit for purpose. You can't just take any data and suddenly it's magic, you throw it into a black box and get a solution. I see a lot of misconceptions there. So I think it's important to focus on safe and secure capture and storage of this data, with overarching, clear ethical guidelines on what can be captured and why it's being captured and used. And specific guidelines and frameworks can then dictate the use of the data, depending on the outcome you're looking for. It could be data matching and de-identification for data sharing. But it all has to be done within the bounds of regulation and legislation, and just what is right.

So I think it's important for organizations to think about the end-to-end pipeline and data lifecycle: what are they collecting, how are they storing it, what's it being used for, when is the data deleted, how can someone delete their data, what metadata is being collected and stored around it, and what are the data governance processes? I think these are all important aspects, rather than just saying we'll collect data and have an outcome and not worry about all the gray bits on the sides. I mean, when it comes to ethical and responsible use of AI, they become absolutely paramount to having responsible solutions at the end.

Dayle Hall:

Right. And you mentioned something earlier on, a word which is obviously very hot today, and I want to talk about using the concept of responsible AI and ethical AI to protect against bias. And you mentioned diversity. So specifically around diversity and responsible AI, once you're pretty sure you're going to capture the data the right way, and I love what you said about making sure people understand why and how they're capturing data and how it will be used, how do you then protect against bias? And where does diversity play into that bias conversation?

Alex Antic:

Sure. And I'll speak a bit about the rulebook I tend to use, at a high level, for how to actually develop responsible AI in relation to bias in this context. So first of all, I think it's important for the organization to understand the business context they work within and what bias means for them. Typically, bias is really about the ways it can manifest itself and result in unfair outcomes. But often, many organizations don't really understand, how does this actually occur? Is it part of the data they've used to train a machine learning model? Is it part of the data they just happen to store? Is it about the people who do the analysis and decision-making, who may have- humans are biased, of course. Is it about them bringing in their own bias in terms of the data they chose to collect and how they've analyzed it?

Initially, it's about understanding, how can bias creep in? How do you identify bias? You'll never remove it, but how do you identify it and try and work with it? And then how do you create a definition for your organization in terms of how biased this data is, and how can you actually try and resolve that? What are you comfortable with? Some people will turn around to me and say, look, humans are biased. If AI merely reflects human bias, then why is that a concern?

And I think there are two really important parts to this that people need to understand. One is scale. Models, as we mentioned earlier, can have far-reaching implications. They can reinforce and perpetuate bias in a way that no single biased human ever could, in terms of how we use this technology. But they also have this potential to let us hide from our moral obligations and to justify immoral judgments. So if the head of a bank says, look, I understand the model we created was very much biased against, say, women instead of men, or against some minority group, and things like this have happened many times: look, it's the model and the data, it's not me as the CEO or the head of the organization, you can't blame me. Obviously, that's not something they can do. But that's where I've seen people try and get around it.

So while we can't completely eliminate bias, what we can do is work towards understanding it, identifying it and then reducing it in these systems that can scale. So there are three things, I think, that can be done, at least at a high level, what I call the three Ds. It's all about twos and threes for me: data, discovery and diversity. So with data, it's really about understanding the data both from a technical perspective and also from a domain perspective. This is actually crucial in understanding how bias is embedded within the data, and whether the data that you collect is fair and representative of the group that you're actually working to provide some solution and value to. It's really about understanding the data itself, not just as information or a technological widget, but also in the broader business context of what the data and information you're collecting actually do and mean. That's, once again, where the technical people and the business people who are domain experts have to really work together to understand it.
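The representativeness check described above can be sketched very simply. The group labels, sample sizes and population shares below are all hypothetical, and this is just one crude way to surface a representation gap, not a standard method:

```python
from collections import Counter

def representation_gaps(sample_groups, population_shares):
    """For each group, compute (share in sample) - (share in population).
    A large negative gap means the group is under-represented in the data."""
    total = len(sample_groups)
    counts = Counter(sample_groups)
    return {
        group: counts.get(group, 0) / total - pop_share
        for group, pop_share in population_shares.items()
    }

# Hypothetical training sample: 80 records from group "m", 20 from group "f",
# drawn from a population that is an even 50/50 split.
sample = ["m"] * 80 + ["f"] * 20
gaps = representation_gaps(sample, {"m": 0.5, "f": 0.5})
print(gaps)  # group "f" is under-represented by 30 percentage points
```

A gap like this is exactly the kind of finding that needs both the technical people and the domain experts in the room: is the skew an artifact of collection, and does it matter for the people the model will affect?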

Then the next D is really about discovery. I mean, how are fairness and bias actually defined? You talk to three different companies, you'll get three different definitions. For instance, is fairness defined based on the model's inputs or on its outputs? How do you define whether the model is a fair one? Is it the data you've put in, or the results that come out the other end of the solution you've created? When do you deem it fair enough to deploy a model? What measure do you use there? And what about data privacy, in terms of how do you ensure that the data you collected is stored, I guess, in a privacy-preserving manner? Is the model explainable? That's the one that comes up a lot. You talked earlier about the transparency around how the model works.

But then the question becomes a lot more nuanced: explainable to whom? To a developer? To someone trying to debug the model? To the manager of the team? To the CEO? To an auditor? To the customer? I mean, there are different levels of explainability. What is explainable to me will be very different to what's explainable to you. So there's always this trade-off between fairness and accuracy, especially with machine learning models, that needs to be juggled and accepted. And that is a nuance that I think some people at high levels miss. That can be quite difficult.

So for me, ultimately, it's about AI systems supporting us in making better decisions. But it's really up to us to define fairness, morality, privacy, transparency and explainability. And I think the future really lies in humans and machines working together to advance society, rather than us just being dependent on machines and AI. It's really about this integration, this human-machine relationship, which I think is the core.
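One common output-based notion of fairness in the discussion above is demographic parity: positive decision rates should be similar across groups. Here is a minimal sketch; the loan decisions and group labels are invented for illustration, and demographic parity is only one of many competing fairness definitions an organization might adopt:

```python
def demographic_parity_gap(decisions, groups):
    """Largest difference in positive-decision rates between any two groups.
    decisions: 0/1 model outputs; groups: the group label for each decision."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, label in zip(decisions, groups) if label == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return max(rates.values()) - min(rates.values())

# Hypothetical loan approvals: group "a" is approved 4 times out of 5,
# group "b" only once out of 5.
decisions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]
gap = demographic_parity_gap(decisions, groups)
print(round(gap, 2))  # 0.6
```

Whether a gap of 0.6 is acceptable, and whether parity of outputs is even the right definition, is exactly the decision each organization has to make explicitly for itself.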

Dayle Hall:

Yeah. Well, I have a feeling that we could talk for a lot longer. And I actually would like to, at some point in the future, do a follow-up and maybe drill a little more into the specifics around this. But for now, you've given me so many soundbites. What I love about these podcasts, Alex, is I always think about the people listening and what are the things they could take away, what are the things they've learned. And I feel smarter from talking to you. So I know that [it's a good podcast], right?

Some of the things you talked about, I think, are incredibly valuable. Make sure you understand what you are capturing and why from the data perspective. Really think about the bias that's in there and make sure you work with it so it doesn't perpetuate; you can't remove it, but if you identify it, it's easier to then deal with it. I love the concept we mentioned of finding a good leader, someone that's going to get people excited about it, particularly when you're trying to democratize the data: if people are behind it, they'll get more value, but they'll also feel much better about how they use it, which helps attract and retain people. I think in the future, people will want to stay at or join organizations because they have a very robust data model.

I liked that you talked about the collaboration between IT, data scientists and the rest of the organization. I think that is key. And I loved your concept of what you called data politics. I'd never heard anyone say that before, but data politics is a new one. So it's not just data discovery and data analytics, but the politics around that.

But my favorite quote, and when we put this podcast out, this would be my headline, is what you said around AI and ethics: ensure that the human is at the center of the solution. I think that is an excellent way to think about AI, responsible AI, and what we're going to use data for in the future.

So this was an amazing podcast. I appreciate you being part of this. Dr. Alex Antic, thank you so much for joining us on the podcast.

Alex Antic:

Thank you very much, Dayle. Absolute pleasure.

Dayle Hall:

That’s it for this episode of Automating the Enterprise. We’ll see you on the next one.