How This General Manager Transformed a $25B Company by Insisting on High Standards in AI

Published Thursday, February 26, 2026


Interviewer

Uh, this next question's about you specifically, and most specifically about overachieving. You know, recency bias is helpful, but pick any time in your career where you wouldn't compromise on achieving a great outcome, where the people around you were saying, hey, this is good enough. What do you want to talk about?

Candidate

Mhm. That's an interesting question; let me think about that one. First, let me make sure I understood it: you're asking whether there was a time where I did not compromise on my expectations of a certain outcome, process, or deliverable, versus the team arguing for a simpler solution.

Interviewer

Right, it's like, hey, we've done enough, can't we just hit the done button now and be done? And you're like, nope, we've got this bigger opportunity, let's get to the finish line.

Candidate

Yeah, um, let me make sure I, uh,

Interviewer

Sure, take a minute.

Candidate

Let me think of an example that's appropriate here. It's slightly dated, so bear with me on that, but I do feel this is a good example of what you're talking about. The year is 2017. I'm going to tell this like a story here,

Interviewer

You sound like a movie preview: "The year is 2017."

Candidate

Yeah, yeah. No, I remember this because it was one of the biggest learnings of my career, in AI and in the data space generally. So in 2017 I took ownership of our data science and AI function within Collins Aerospace. This is about a $25 billion business within Raytheon, and the,

Interviewer

Sorry, billion or million?

Candidate

Billion with a B. We had a data science and AI function that at the time was just incubated, brand new, meant to drive the adoption of these kinds of methods. We're an industrial aerospace company, very different from your average tech company, but we saw a lot of potential in these technologies and needed to make them part of the fabric of our enterprise. So the function was fairly new, about a year old; I was hired a couple of months after it started, and I took ownership of it over time as I got more responsibility in the organization. When I did, the team already had a set of objectives and key results, which was awesome to see the team progressing and thinking that way. But the challenge I saw at the time, which became a discussion spanning a few quarters, was that all of the OKRs boiled down to a single productivity metric: how many models the team delivered over a period of time, whether a month, a quarter, or a year. It was well intentioned, but talking to the main customers of this function, particularly the external early adopters who had raised their hands and said, we'll try whatever new innovation you come up with, told a different story.
The push for productivity was well-intentioned, but it caused unintended consequences for the quality of the predictions and the explainability of those models to our customer base. That's what I was hearing from the customers of this particular service. It caused real customer satisfaction issues, and the main problem we saw in the business was that our early adopter trials were not converting from trial to paying customers. That included our largest airline customer in the program, Singapore Airlines, obviously a large customer in Asia Pacific, which had raised its hand and said, we want to try what you're working on. And to be fair to the team, from an effort perspective they were doing everything their functional management was asking of them. The challenge was to move them toward a value-based OKR framework that measured outcomes more holistically. That's where the tension was: not being over-indexed on productivity, but balancing customer value against the productivity goals we had internally. So I worked with three team leaders: the leader of data engineering, the leader of program management, who interfaced with the customers, and the lead data scientist, basically the chief technology person for the data science team. Reflecting back on it now, there were two main things I learned.
One was that having the team define their measure of success in their own words was very important. When I asked those three team leaders what their own definition of success was, the customer was mentioned a couple of times, but the success of our product in the hands of our customers was the missing link in every one of those discussions. And the part we eventually realized was that it required us to go further in our measurements: not just measure what we can conveniently measure within the four walls of our operation, but also measure what the customer measures. Sometimes that meant estimating things that, from a process perspective, we did not directly control, in this case the customer's operation. The solution we eventually found was to be very data-driven about it while providing some allowance for imprecision. By imprecision I mean not being super precise, not letting the perfect be the enemy of the good. The concrete example: we needed to understand how many delays and cancellations our airline customers were taking, or how many aircraft-on-ground situations they had. Only the customers could provide that back to us, because they knew where they were flying their aircraft, but they considered this data competitively sensitive, confidential, and very specific to their operations, and they didn't want to share it with us at a granular level.
So what we were able to do was find alternate data sources to act as a proxy for those customer performance characteristics. The FAA in the United States publishes very detailed statistics on airline operating performance: delays, weather-related delays, and so on. We took that dataset and analyzed our own performance in the context of our customers, looking at whether an airline that adopted our product was more successful than its peers. Sure, it wasn't the highest-quality dataset, but even without the perfect dataset we could come up with a directional sense of whether we were doing well or not. At the same time, we could perform comparative analytics between customers, say, whether Singapore Airlines was seeing a similar or better benefit than Etihad Airways in the Middle East. That gave us a lot of insight: we might be doing something well with one customer in one region that had potential to work in another region, where it wasn't happening organically. So being data-driven like that was the main thing in measuring success. The second thing, for this to stick, was one process change we had a lot of discussion about: it can't just be applied retroactively to things that are already deployed. This way of looking at value also has to become an acceptance criterion for future development.
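The proxy approach described above can be sketched in a few lines. The example below is illustrative only: the airline names, flight counts, and delay figures are invented stand-ins, not the actual FAA dataset, and the real analysis would draw on something like the public on-time performance statistics the candidate mentions.

```python
def delay_rate(delayed: int, total: int) -> float:
    """Fraction of flights delayed -- a directional quality signal."""
    return delayed / total

# Hypothetical per-airline counts, in the spirit of public on-time data.
stats = {
    "AdopterAir": {"delayed": 1_800, "total": 20_000},  # adopted the product
    "PeerAirA":   {"delayed": 2_600, "total": 21_000},  # peer baseline
    "PeerAirB":   {"delayed": 2_400, "total": 19_000},  # peer baseline
}

adopter_rate = delay_rate(**stats["AdopterAir"])
peer_rates = [delay_rate(**v) for name, v in stats.items() if name != "AdopterAir"]
peer_baseline = sum(peer_rates) / len(peer_rates)

# Directional verdict: how much better is the adopter than its peer baseline?
relative_improvement = (peer_baseline - adopter_rate) / peer_baseline
print(f"adopter: {adopter_rate:.1%}  peers: {peer_baseline:.1%}  "
      f"improvement: {relative_improvement:+.1%}")
```

The point of the sketch is the shape of the comparison, not the numbers: a coarse public proxy still supports a peer-relative, directional read on customer value when the customer's own granular data is off-limits.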
The key tension the team and I worked through, and there were multiple voices, but the abstraction of the conversation, was: how do we make sure the team has enough time to innovate and isn't bogged down by customer value measurement all the time? The balance we struck was: sure, you can innovate as much as you want, as long as when something is not working we know about it, and we make a collective call on whether to keep putting investment dollars, time, and energy into it, rather than quietly moving on to the next bright idea. The process we put in place was what we called value reviews, every quarter: here's the portfolio of projects for the next quarter, and the team comes and demonstrates how far they've gotten, not just the amount of effort spent but the customer value created in terms of outcomes. That became a running process that made future development follow the same practices, instead of chasing issues after they happen, and it made sure we were aligned on the backlog before a lot of time, effort, and energy was spent. The team could have carried on judging themselves by the productivity metric alone, but I don't think they would have been very successful if they hadn't incorporated the customer value being driven out of their products and services. That was the net learning from it.
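The quarterly value review can be thought of as a simple gate over the project portfolio. This sketch is hypothetical: the project names, outcome metrics, and targets are invented to show the shape of the check, not any actual Collins Aerospace process.

```python
# Hypothetical quarterly "value review" gate: each project reports an
# outcome metric against its target, and anything short of target is
# flagged for a collective continue/stop call instead of auto-renewal.

projects = [
    # (name, outcome achieved, outcome target) -- invented numbers
    ("delay-prediction",  0.22, 0.15),  # e.g. 22% delay reduction vs 15% goal
    ("parts-forecasting", 0.04, 0.10),  # under target -> needs a call
    ("anomaly-detection", 0.12, 0.10),
]

def value_review(portfolio):
    """Split the portfolio into 'continue' and 'collective call' buckets."""
    keep, discuss = [], []
    for name, achieved, target in portfolio:
        (keep if achieved >= target else discuss).append(name)
    return keep, discuss

keep, discuss = value_review(projects)
print("continue:", keep)
print("needs a collective call:", discuss)
```

The design choice it illustrates: under-target projects aren't killed automatically; they're surfaced for the collective call the candidate describes, so innovation time is protected while failures stay visible.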

Interviewer

So I want to ask a clarifying question, because there was a lot of really good content in there, but it was a bit dense, and I want to make sure I understood where the breakpoint between good and great was. Let me just put it back to you: in two or three sentences or less, where was the breakpoint between the great outcome you were working towards and "hey, this is good enough"? I think I missed that part.

Candidate

Yeah. There were two pieces; I'll start with the first one. The good was: hey, we shipped a model, and it works, it doesn't break in production, it's workable code. That was "good" in the team's mind. The break was: sure, it works, its runtime is awesome, it has five-nines availability, we get all that, but the customer isn't getting any value in terms of reduced delays and cancellations or improved cost positions they're working with. From that perspective, we were simply missing any measurement of that value outcome, and any way of using it to steer further development. That was the main breakpoint. I'll pause there and see if that gives you more.

Interviewer

Help me understand a little bit: just given the nature of how the company was operating, and I'm keying off what you said, that you were a month into the role when you took this on. Maybe I missed the timeline. Why did you feel it was so important to push on this specific issue?

Candidate

Yeah. It started with two things, and the first was customer feedback. The first thing I did going into the role was interview all the customers, to the extent that I could, especially the most important ones. Like I said, we were still finding product-market fit with this particular team; we were fairly nascent, and the team was trying everything to see what would stick with the customer base. At the end of the day there are about 1,200 airlines in the world, and 300 of them constitute 80% of the fleet. So taking a sample of those customers, working with the top ones in each region, I had interviewed about 15 of them by the time I was 90 days into the role. Among them were the 5 early adopters that had raised their hands and already implemented the trial version of this particular service. And they provided very clear feedback: if we compare 6 months of operating with your service against operating without it, we're just not seeing enough benefit to warrant further investment of our time and energy as a customer in your solution. That was the feedback that motivated this belief that we have to keep customer value at the forefront of development efforts.
