Episode 4

23 Minutes

Leaders

From Paper to Prediction: AI's Role in Enhancing Biomanufacturing Operations
Mike Tomasco, Ex Vice President of Digital Manufacturing at Pfizer
Listen Now on:

Apple

Spotify

YouTube

Without a solution like Katalyze AI, typically a lot of paper and spreadsheets, maybe some off-the-shelf statistical analysis tools, and Six Sigma experts would be needed for months of retroactive analysis.


In this Episode

In this episode, Brandy and Mike Tomasco discuss the integration of generative AI in biopharmaceutical manufacturing, particularly how it can enhance shop-floor operations and improve efficiency. Mike, a former VP of digitization at Pfizer, emphasizes the potential of AI-driven tools like Katalyze AI to simplify complex tasks for operators by acting as a co-pilot. This technology can predict and suggest optimal actions based on historical data, helping to preempt issues and guide newer employees to operate with the expertise of long-time workers.

Topics Discussed in this Episode

00:52 > Applying Generative AI to Manufacturing Operations

04:25 > Katalyze AI’s Role in Enhancing Raw Material Data Integration

06:04 > Shift from Retroactive to Predictive Analysis in Manufacturing

07:58 > Building Trust in Predictive AI Through Explainability

12:00 > Collaborative Benefits of a Single Source of Truth in Manufacturing

17:26 > Challenges and Scalability of AI Models in Biopharmaceutical Manufacturing



Transcript

Brandy (00:01.943)

Hey Mike, welcome back to Katalyze AI. How are you doing today?

Mike Tomasco (00:06.634)

I'm great, Brandy, how are you doing? Thanks for having me back.

Brandy (00:08.523)

I'm doing, yeah, I'm doing really great. This is our part two. So many of you tuned into part one with Mike Tomasco. He is a former VP of digitization at Pfizer and now is doing some consulting work with various companies and organizations. Mike, welcome back. And we're going to get right into it. And the very first thing that I want to touch on based on our conversation last time is just

thinking about and talking through specific examples where AI, particularly generative AI, has been effectively integrated into operations.

Mike Tomasco (00:52.044)

Yeah, I think one of the best ways to think through how to apply generative AI to shop floor operations would be where can we make an impact in people's lives to make their lives easier? And one of the things that I always think about is the complex tasks on the shop floor. And if I think about what they're working through on a day-to-day basis, there's a lot of different inputs coming at an operator.

There's a lot of different computer systems that they have to use in their daily job to get through the day. And what they don't necessarily have is a workflow that naturally leads them through everything throughout the day. So conjure up the idea of a manufacturing co-pilot and what that might be able to do. Think of a screen that's composed of different tiles.

Brandy (01:20.098)

Hmm?

Mike Tomasco (01:45.492)

And one of them might be a chatbot-like device or some graphs and charts about how operations are going. And as you're going through your day, if you run into issues, you might be able to ask the chatbot, hey, what do I do here? Or maybe it's using some predictive algorithms behind the scenes and then saying, hey, you're going to have an issue. I suggest you do X, Y, Z next. The last time you had this problem, this is how you resolved it, and then you're starting to prevent these issues. So the opportunity has been created through technology to create these experiences for people that help lead them through their day or, even better, behind the scenes start to do some predictions about how your operations are going and then pepper in suggestions throughout your day of your next best step to keep things optimal and efficient.

Brandy (02:37.185)

Yeah, I mean, that makes a ton of sense. It's just this learning program in the backdrop, seeing how everything's operating, knowing that issues are going to occur, and those issues maybe have been seen before.

Mike Tomasco (02:52.094)

Exactly. So this means that you must have a large set of data somewhere behind the scenes about your operations. So maybe you've been collecting these things for a long time. Hopefully you've put them into some context for future use. Also, maybe you're bringing in other data sources from either the lab operations of your facility and even better yet, your quality operations. So when you start to consume this information into your generative AI models,

the recommendations that the model will give to you on the shop floor will be directly related to your prior experience versus just a general large language model's ability to summarize and write text. This is now summarizing and writing text based upon your exact experience and things you've run across before. And if you're not the expert that's had that issue before, well, think about the time to competence for somebody. Now,

we're kind of teaching people on the job on a day-to-day basis how to become that expert operator that's been around for 30 years.

Brandy (03:57.997)

Cause that's where that expertise comes in, right? It's just being there and doing it day in and day out. And when you're bringing in a new workforce, it's having to make those mistakes and learn from them. So companies like Katalyze AI, being able to kind of see the mistakes that have been made previously and help the new workforce come up to speed.

Mike Tomasco (04:25.654)

That's what's really exciting about Katalyze in particular. They have a very specific niche right now that they're focused on with raw material data and obtaining raw material data from suppliers electronically and then bringing it into this modeling system that's been set up around your specific processes and what's important, your critical process parameters and things like that for your operations, and tying those two things together: the external data coming through,

the internal data being aggregated and brought in here and then analyzed. So that's done behind the scenes. And now, experientially, when I'm on the shop floor, I have the ability to see how things are running, how to set up my jobs to be most effective based upon the predictions coming from the raw material quality and its attributes combined with what I expect out of my process. Now, if we're doing this right, we're monitoring it in real time, which is exciting,

and then predicting these outcomes and then ultimately driving a more efficient, higher-quality outcome than we would have had otherwise. And that's the key: consistent, high-quality output. Some of this capability will also allow us to layer in the ability to optimize yields with that same information and types of analysis. So you get a double positive impact of higher quality, higher yield.

What else do you want?

Brandy (05:55.509)

Yeah, I mean, that sounds optimal. So without a solution like Katalyze, what does that look like or what did that look like previously?

Mike Tomasco (06:04.608)

I mean, typically a lot of paper and spreadsheets, maybe some off-the-shelf statistical analysis tools. Oftentimes, if we think about how it's been done historically, it's through the operational excellence groups. So we will run our jobs and our batches through our processes. Maybe we will have an issue that will spur a quality investigation into what occurred and why. And then we keep that as historical information.

Brandy (06:08.141)

Yeah.

Mike Tomasco (06:31.542)

And then what we might do is set up a team of Six Sigma experts to go study this for a couple months and figure out what's really going on, do some retroactive statistical analyses and look at the data that way and start to hypothesize some improvement projects for continuous improvement that we then put into play. And maybe after a couple months, three, six months, we're now monitoring whether that's successful or not. So

that's all historically extremely important because we also have all that history and that data. If we use that as well, now we're building this continuous improvement loop and this operational excellence into a predictive model that prevents that stuff from ever happening. And we can focus on other downstream issues. That's the most exciting thing is taking that retroactive analysis we used to do and moving it into the predictive world. That's really where it becomes a game changer for us.


Brandy (07:28.865)

And you're talking about years of work.

Mike Tomasco (07:32.546)

It could be 20, 30 years of history. And there's ways now with technology to capture that, not just from documents and databases, but from people. And that's maybe the secret sauce of some of these new Gen AI tools is if we go through a structured interview process about how people approach problem solving and the ways they fix specific issues from the past,

Brandy (07:35.67)

Yeah.

Mike Tomasco (07:58.412)

that then becomes part of the corpus of information that's used in the Gen AI model to suggest how we prevent problems from happening in the future.

Brandy (08:07.565)

Yeah, it's pretty incredible. I mean, you've got, just with your history, your work, I mean, you've spent over 18 years in this industry. Thinking about all the people that you surrounded yourself with from this world, the expertise, all of that being fed into this generative AI modeling is incredibly impressive.

Mike Tomasco (08:32.674)

It's also maybe scary, right? So if we think about how that goes about, we're taking people's experience and knowledge, and you take the best of the best, and then you can provide that to anybody. So by scary, I'm like, wow, isn't that amazing that that can be done? 10 years ago, that wasn't possible. It was very difficult to do stuff like that, even five years ago. So the technology has come such a long way so quickly and it's gonna continue to get better.

So we can only just imagine where this stuff might go as we move forward.

Brandy (09:07.969)

Yeah, it's truly amazing. I mean, I think that, you know, one of the promises of AI is enhancing real-time decision-making on the production floor. How do you see a company like Katalyze AI facilitating these real-time decisions, especially in critical moments when human oversight is limited?

Mike Tomasco (09:29.046)

I think what's really unique about Katalyze is the relationship with the raw material providers. We in the industry have long looked for ways to obtain information from our suppliers and our partners throughout the end-to-end manufacturing processes. Because the other thing we have to realize is in biopharmaceutical manufacturing, it's very rare that a single plant

makes everything from the raw materials through the intermediates, through the finished good and the packaged good, right? Like it just doesn't work that way. It usually works across three or four different facilities. Some are owned by your company. Some are external partners as well. So not only do you have to worry about data from your raw material suppliers and your partners, if you want to look at the overall end -to -end quality of your product, you want to monitor each one of those steps. What Katalyze is focused on right now is

the raw materials at the beginning of this and bringing that data into the ecosystem upfront, where we would have had to take PDFs or pieces of paper, scan them in or transcribe them into our own systems. This is now being automated for us through Katalyze's system. And then you put it into the hands of the people that can configure and create the algorithms within the Katalyze system. And then

drive predictive output. So if we're looking to drive certain things in different parts of the processes, let's say process step A has a historic issue, we can study that process step, look at all of the inputs to it, understand what's going on and the impact through math, right? And then present it to a regular person, maybe somebody like me, working on the shop floor, who doesn't understand the intricacy of the math, right? They don't have to understand the math.

they have to understand what to do with the outcome. So we can take fewer data scientists and process scientists and expand what they can do across the organization and put it into the hands of a regular operator to do something with. All we have to do is teach the operator how to respond to what they're seeing and trust that it's gone through a validated process to give them that information. And then they can make that decision.

Mike Tomasco (11:50.248)

in real time as to what to do with it and link back to all these systems of record to make sure they're doing everything in a validated process.

Brandy (12:00.685)

Yeah, and I think that there's a pretty big collaborative piece here too, right? Where you have all of these various folks who are involved in this process being able to better collaborate, whether it be the data scientists, the operators, and other decision makers in these environments. How do you kind of see that coming into play? What does that collaboration look like? What is that optimal vision?

Mike Tomasco (12:30.382)

I think the single biggest aspect of it is that we're all working from the same source of information, right? Historically with our paper-based or Excel-based processes, everything that I do in my realm, I think is correct and good, but it's harder to share with others upstream or downstream of my process. So as we start to get into the connected world, okay, we're starting to connect

the workflow together, we're starting to connect the different groups together. But without that single trusted source of information, and then this validated set of algorithms and processes on top of it, I don't know if I'll be able to make decisions in the same way. So that collaboration is key that we all trust in the information that we're putting into the system. We understand the math that it's going through. And then

Even if we don't know how to do that math personally, we understand that we have people that do know how to do that. And then ultimately building trust in the output is what enables the decision making to occur. That's probably the hardest part in the process is how do you build trust in that output? And it really is through the transparency of that capability and going backwards through the system to understand this output came from these inputs that went through this math process, right? So.

When we teach people that, they're more likely to feel comfortable with what they're seeing. Being able to link that back to some key sources of information as well and understanding that will then allow you to take that action proactively versus running your process, letting it maybe fail and then retroactively go, yeah, look, the thing was right. That's not the outcome you're looking for. You want them to take that action and trust what's being presented to them.

But building trust in a human based upon what a machine is telling it is historically one of our more challenging things to do.

Brandy (14:35.489)

Yeah, and why is that?

Mike Tomasco (14:38.538)

I'm gonna say human nature. Maybe we want to fully understand what we're looking at. A lot of us are inquisitive and we wanna know how did something get from A to B to C. Don't just trust me that C is correct. And in pharmaceutical manufacturing, you can't just trust, right? You have to prove the entire process of GMP compliance, or good manufacturing practices, through the various

Brandy (14:58.925)

Right.

Mike Tomasco (15:07.532)

regulatory authorities, we have to prove that every step of building the building, installing the equipment and commissioning it and testing it, then putting the molecules through their processes, testing the outcomes and then proving the efficacy, it's all got to be tied together. So anytime you introduce something new, like an AI algorithm, if it's a black box algorithm, there is going to be zero trust in that.

You can't trust that that output is what you think it is. So it has to be explainable AI, so to speak. And once we have an explainable understanding of what that is and how we can walk through it, then we can start to build much more trust in it.

Brandy (15:35.073)

Yeah.

Brandy (15:51.553)

Yeah, absolutely. I mean, it's just, again, it kind of goes back to what you talked about in transparency. That trust comes from transparency and being able to see the whole picture and where it came from, because the outcome and the impact we're talking about is human lives at the end of the day.

Mike Tomasco (16:11.934)

Exactly. Exactly.

Brandy (16:16.151)

So the stakes are high. And how do you see companies embracing this technology and these tools? And like, what does that look like for output and quality and the ultimate impact, right, for most of these companies? And that's on human lives.

Mike Tomasco (16:34.624)

You know, I think what we're seeing is everybody is experimenting in trying to drive forward very quickly with how do we adopt these technologies, normal AI algorithms, generative AI, how do we adopt them into our processes? I don't know of a single company that's not exploring how to use these technologies in one way or another.

Oftentimes what happens is we get bogged down in two different things when we do this. One would be business cases. What's the value proposition of what we're proposing? And the other is, can we actually get past the pilot? Is it really gonna be something that we can do sustainably? We read a lot right now about where executives feel like they're not getting

Brandy (17:15.693)

Yeah.

Mike Tomasco (17:26.528)

the value from their generative AI investments that they've made so far. And there could be a lot of different reasons for that. Maybe we're shooting for too much value too quickly. Maybe we're trying to solve the wrong problems. What's always historically worked for me is looking at where we have a specific issue that needs an improvement. So a use case, let's say. Understanding that

We've spent years and years building the data infrastructure around our operations so that when I have a specific issue that I want to go solve, I have a trusted set of information that's been built up over the past several years to help us analyze that issue. And you typically will analyze it offline and understand what's going on. What you really want to move to is once you have a proven

model that will work for solving that issue. You want to move it back into a production zone. So real time, hopefully on the edge, you can run your algorithm and have it be predicting things and helping this whole co-pilot situation with your operators drive decisions. But the unit of deployment in a biopharmaceutical process is very, very small.

It's a step in a process for a molecule running in that process on that piece of equipment at that point in time. If you're running on a different piece of equipment, that algorithm is not going to be the same. If you're running a different molecule, it's not going to be the same. So the challenge is how do you generalize your solution to the point where it applies to more than one specific step of a process on a piece of equipment?

That's a real challenge when it comes to the manufacturing intelligence world and the AI world for any biopharmaceutical process.

Brandy (19:23.489)

Yeah. So it kind of sounds like currently people are perhaps just tackling like one thing, one use case and having a hard time identifying how this can be applied across the board.

Mike Tomasco (19:39.37)

It's the age-old question of how do you scale? So how do you scale an AI model that was used to solve a specific issue? Well, in my mind, you create a product out of it and you have to understand the inputs and outputs of what that product is doing. And that's how you can start to generalize and apply that product to different situations. So instead of looking at it as just a math problem to solve that step,

Brandy (19:42.251)

Yeah. Yeah.

Mike Tomasco (20:08.054)

You have to productize it and think a little bit more broadly about how it's going to interact with other things as well. Make it a little bit configurable with some variables and inputs. And that's something that you should be able to start to replicate elsewhere. But we're not seeing a ton of that currently in the industry.

Brandy (20:28.277)

And do you think it's just the barriers and the ROI relative to the immediate benefits of it?

Mike Tomasco (20:36.372)

It's interesting. So in the manufacturing intelligence space in particular for biopharma processes, if you can improve yield by a small percentage point, these projects pay for themselves a hundred times over. So I don't think it's the ROI itself. The barrier is before that. A lot of us talk about these problems, talk about how we want to approach these problems, but sometimes there's a

fear to pull the trigger on it because it's a disruptive change. And anytime there's a disruptive change that could have a really positive outcome, there's a huge risk to today's operations that there could be a negative outcome today. So if I have to do something that alters my operations, we better be sure that it's going to be correct. So that fear, that difference between the possible negative

value today versus the potential future positivity. That's something that we find not everybody understands, grasps, or believes in.

Brandy (21:42.369)

Yeah, yeah, I could definitely see that. I mean, and it feels like Katalyze is addressing especially the raw materials portion of the process, right? And that to me seems like a pretty significant impact to at least that piece, and that kind of being able to see the cost benefit to not only the quality, but also just being able to produce at a mass scale.

Mike Tomasco (22:07.202)

I mean, it's such a great place to start is on that raw material side, because if you start with high quality, consistent raw materials, and you know their parameters, so to speak, and they're the input to your process, you can better adjust and monitor your process instead of guessing as to how those raw materials might behave. It just starts you off in a better way.

And then you can worry about downstream process issues that might be a.

Brandy (22:40.141)

So it sounds to me like for the people listening in this space that this would be a good place to dip your toe in.

Mike Tomasco (22:49.666)

It doesn't hurt, right? You get to start somewhere. And what I always say is like, pick an area, one that makes sense to you, and do something. Try it. See how it works. And if it's working pretty well, start to think about how am I gonna scale this now as a product to everything across my organization, not just this one piece of it.

Brandy (23:13.633)

Yeah. Mike, thank you so much for coming back on the show, digging in deeper to AI and what this can look like on a mass scale. I really appreciate you taking the time.

Mike Tomasco (23:25.588)

Absolutely, thanks for having me. I look forward to seeing you again soon.

Brandy (23:29.321)

Alright, sounds good. Thank you, Mike.


Mike Tomasco

Mike Tomasco is a Vice President in Pfizer Digital with responsibility for leading Pfizer Global Supply’s (PGS) Digital Transformation. The goal of the program is to transform PGS through a business strategy driven focus on digitization applied end to end across Manufacturing and Supply Operations. Mr. Tomasco has experience across strategy, marketing, finance, manufacturing and information systems for multinational companies and has successfully led several major transformational initiatives.


About the Guest

Katalysts Podcast

Subscribe to Gain Insights About AI Solutions

"With Katalyze AI, we can analyze data in real-time and make informed decisions to optimize our processes." Chris Calabretta


More Resource Articles

August Report 2023

Effective Communication and Supplier Relationships in Biomanufacturing: Insights from Chris Calabretta

Chris Calabretta highlights how strong communication, supplier relations, and tech like Katalyze AI drive success in the complex world of biomanufacturing.

Learn more


August Report 2023

Navigating the Challenges of Procurement in Biomanufacturing: Insights from Chris Calabretta

Chris Calabretta shares insights on navigating biomanufacturing procurement challenges, highlighting resilience through events like the 2008 crisis and COVID-19.

Learn more


August Report 2023

Exploring the Future of Biomanufacturing: Insights from Industry Expert Michael Burton

Michael Burton highlights biomanufacturing's urgent need for innovation and efficiency, offering expert insights into the sector's challenges and emerging opportunities.

Learn more


Let's Partner!

Feeling ready… Talk to us today


© 2024 Katalyze AI. All Rights Reserved.
