1 00:00:00,409 --> 00:00:03,939 In a world where generative AI promises endless possibilities, 2 00:00:04,329 --> 00:00:07,789 how do you navigate its limitations in risk-averse environments 3 00:00:08,019 --> 00:00:10,289 like law, finance, or medicine? 4 00:00:10,849 --> 00:00:15,299 Today's guest shares why it's not just about using AI, but about knowing where 5 00:00:15,439 --> 00:00:19,699 and how to use it, especially when the margin of error is razor-thin. 6 00:00:20,369 --> 00:00:24,379 My name is Alexandre Nevski, and this is Innovation Tales. 7 00:00:39,872 --> 00:00:43,987 Generative AI is changing the game in many industries, but in certain 8 00:00:43,997 --> 00:00:47,467 areas where precision is critical, it's not as simple as jumping in. 9 00:00:48,327 --> 00:00:52,167 Today's guest, Sam Gobrail, has over 15 years of experience in 10 00:00:52,167 --> 00:00:56,187 making digital tools work for businesses and government agencies. 11 00:00:56,797 --> 00:01:00,547 His achievements include improving customer journeys for Fortune 100 12 00:01:00,577 --> 00:01:06,047 companies and using AI to cut federal processing times from hours to minutes. 13 00:01:06,777 --> 00:01:11,067 In this episode, Sam shares how to implement generative AI safely and 14 00:01:11,067 --> 00:01:13,277 strategically in complex environments. 15 00:01:13,717 --> 00:01:18,117 Whether you're rolling out AI across your organization or just starting 16 00:01:18,117 --> 00:01:22,487 to explore its uses, you'll find practical advice and fresh perspectives. 17 00:01:23,257 --> 00:01:26,217 Without further ado, here's my conversation with Sam Gobrail. 18 00:01:26,610 --> 00:01:27,900 Sam, welcome to the show. 19 00:01:28,200 --> 00:01:28,500 Thank you. 20 00:01:28,510 --> 00:01:29,270 Thanks for having me. 21 00:01:29,350 --> 00:01:30,140 Excited to talk today. 22 00:01:30,440 --> 00:01:31,970 Well, thanks for making the time. 23 00:01:32,450 --> 00:01:37,690 I love these opportunities to talk with fellow digital leaders. 24 00:01:38,000 --> 00:01:42,150 Today, you and I agreed to dive into a topic with which you have 25 00:01:42,150 --> 00:01:46,485 significant experience, and that's the challenges of leveraging generative 26 00:01:46,485 --> 00:01:49,023 AI in risk-averse environments. 27 00:01:49,323 --> 00:01:54,573 So to help our audience get their bearings, would you mind starting us off 28 00:01:54,573 --> 00:01:59,693 with a couple of examples of risk-averse environments in an enterprise context? 29 00:01:59,993 --> 00:02:00,643 Absolutely. 30 00:02:00,713 --> 00:02:02,623 I'll start with one that's close to my heart. 31 00:02:03,013 --> 00:02:07,903 I'm an attorney by training, so I went to law school and I still do 32 00:02:07,903 --> 00:02:09,903 a few pro bono cases on the side. 33 00:02:10,203 --> 00:02:15,083 So the legal field comes to mind as one that is very much risk averse. 34 00:02:15,383 --> 00:02:17,793 And when we're talking about risk averse, we're talking about 35 00:02:18,093 --> 00:02:20,063 the fault tolerance, right? 36 00:02:20,093 --> 00:02:25,778 The margin of error on whether you have 100 percent correct information 37 00:02:26,183 --> 00:02:28,543 or 95 percent correct information.
38 00:02:28,973 --> 00:02:34,713 That gap in places like the legal field is simply not acceptable, as opposed 39 00:02:34,713 --> 00:02:40,093 to, for example, you and I researching your next vacation, and a generative 40 00:02:40,093 --> 00:02:45,433 AI model says, "Oh, this place is $500," and it ends up being $550. 41 00:02:45,733 --> 00:02:46,783 That's probably okay. 42 00:02:47,063 --> 00:02:49,253 And I think those are the differences we're talking about. 43 00:02:49,313 --> 00:02:52,143 So the legal field certainly comes to mind. 44 00:02:52,613 --> 00:02:57,353 Others that also come to mind include tax professionals, and some parts of 45 00:02:57,363 --> 00:03:01,983 the medical field, depending on how severe and urgent those items may be. 46 00:03:02,283 --> 00:03:04,343 Right, let's dive straight into it. 47 00:03:04,353 --> 00:03:08,183 What makes generative AI so tough in such environments? 48 00:03:08,483 --> 00:03:08,963 Yeah. 49 00:03:08,963 --> 00:03:12,250 So generative AI has been such a great tool. 50 00:03:12,260 --> 00:03:13,300 It's taken the world by storm. 51 00:03:13,430 --> 00:03:14,030 I use it. 52 00:03:14,330 --> 00:03:15,300 I think you use it. 53 00:03:15,330 --> 00:03:17,370 We all kind of use it to some degree. 54 00:03:17,790 --> 00:03:20,280 But we can't mistake it for what it truly is, right? 55 00:03:20,280 --> 00:03:25,700 It is an algorithm predicting the next likely word, and it comes 56 00:03:25,700 --> 00:03:29,850 up with great responses, but it's not a peer of yours, right? 57 00:03:29,860 --> 00:03:32,640 If I'm an attorney doing research for a legal case, 58 00:03:32,940 --> 00:03:38,490 it's not the equivalent of my peer thinking along with me or one of my 59 00:03:38,490 --> 00:03:40,132 employees doing the research and writing. 60 00:03:40,780 --> 00:03:44,110 And so it's prone to make mistakes, right? 61 00:03:44,180 --> 00:03:47,340 Some of them are called hallucinations, where it sometimes just makes things up. 62 00:03:47,640 --> 00:03:49,000 Those have improved, for sure. 63 00:03:49,300 --> 00:03:52,240 And it doesn't always have the full breadth of knowledge, right? 64 00:03:52,240 --> 00:03:58,075 The context that we as humans may pick up, either from non-text context, right? 65 00:03:58,075 --> 00:04:02,825 A phone conversation with the client, the relationship I've had with the patient. 66 00:04:03,255 --> 00:04:07,895 Those kinds of things that may add important context to make an 67 00:04:07,945 --> 00:04:10,075 informed professional decision. 68 00:04:10,515 --> 00:04:12,115 It lacks those things as well. 69 00:04:12,365 --> 00:04:14,265 So that's the space we're talking about. 70 00:04:14,565 --> 00:04:15,085 Right. 71 00:04:15,188 --> 00:04:19,358 I think one way that I tend to think about this is from the 72 00:04:19,358 --> 00:04:23,838 point of view of the organizations, the businesses that we serve. 73 00:04:24,138 --> 00:04:29,368 Can you think of how you would compare and contrast working with clients 74 00:04:29,368 --> 00:04:34,288 that may have already had experiences with other artificial intelligence approaches, 75 00:04:34,468 --> 00:04:36,698 more traditional machine learning? 76 00:04:36,998 --> 00:04:38,948 And now they're considering generative AI. 77 00:04:39,188 --> 00:04:43,778 What kind of aspects do you highlight for them at the beginning of their 78 00:04:43,788 --> 00:04:47,138 journey, you know, to lay a solid foundation?
79 00:04:47,438 --> 00:04:49,998 Yeah, I think that is a great question. 80 00:04:50,398 --> 00:04:53,308 Many of the large organizations, right, 81 00:04:53,328 --> 00:04:56,108 have used predictive modeling, right? 82 00:04:56,348 --> 00:04:58,898 Traditional AI, right? 83 00:04:59,248 --> 00:05:03,188 For things like financial planning, forecasting, resource planning, right? 84 00:05:03,188 --> 00:05:05,638 These things that require a lot of data. 85 00:05:05,938 --> 00:05:08,978 The trends are pretty reliable, right? 86 00:05:08,978 --> 00:05:13,728 So what happened over the past 20 years is not a guarantee, but has a high 87 00:05:13,738 --> 00:05:15,388 likelihood of happening next year. 88 00:05:15,808 --> 00:05:19,488 And those are well proven, well used. 89 00:05:19,578 --> 00:05:21,598 I think here's where it gets tricky. 90 00:05:21,898 --> 00:05:23,728 Just sticking with finance for a second. 91 00:05:23,798 --> 00:05:27,058 Finance has used predictive modeling for a long time, forecasting 92 00:05:27,058 --> 00:05:28,478 models, all of those things. 93 00:05:28,778 --> 00:05:33,418 Would they, a finance professional, trust a generative AI 94 00:05:33,418 --> 00:05:37,448 model today to write their SEC filing? 95 00:05:37,748 --> 00:05:40,978 A government agency financial filing that has to get published. 96 00:05:41,198 --> 00:05:42,038 I don't think so, right? 97 00:05:42,038 --> 00:05:46,733 Because their job is literally on the line when they create that document. 98 00:05:46,803 --> 00:05:53,073 And so this is the nuance where an enterprise has opportunities to use 99 00:05:53,073 --> 00:05:58,013 generative AI at enterprise scale for really great things, but either they 100 00:05:58,013 --> 00:06:04,333 have to appreciate the time investment to review and really comb through the output, 101 00:06:04,413 --> 00:06:07,183 and therefore it's not a fully baked solution. 102 00:06:07,203 --> 00:06:08,963 It's a half-baked solution, and that's okay. 103 00:06:08,963 --> 00:06:10,883 It still saves time and energy, and that's great. 104 00:06:11,183 --> 00:06:13,913 Or they have to identify: 105 00:06:14,288 --> 00:06:18,708 here are the core items that are the reason we get paid 106 00:06:18,738 --> 00:06:19,738 what we get paid, right? 107 00:06:19,738 --> 00:06:20,948 So back to the attorney, right? 108 00:06:21,308 --> 00:06:25,268 The legal opinion is the reason that a client hires them. 109 00:06:25,748 --> 00:06:29,828 But there are many other parts of their business, what we could call admin or 110 00:06:29,828 --> 00:06:33,368 bureaucracy, whatever you want to call it, where the margin of error is acceptable. 111 00:06:33,708 --> 00:06:36,058 The emailing and accounts payable, right? 112 00:06:36,088 --> 00:06:40,958 And the filing system, like there's a lot of other things going on that you 113 00:06:40,958 --> 00:06:42,978 do have a higher risk tolerance for. 114 00:06:43,368 --> 00:06:46,158 And so it's parsing out: what is the core skill set, 115 00:06:46,458 --> 00:06:48,438 where I have very low risk tolerance, 116 00:06:48,738 --> 00:06:50,598 and what are the kind of peripheral items, 117 00:06:50,728 --> 00:06:54,528 where I have a higher risk tolerance and can actually gain a lot of efficiency 118 00:06:54,548 --> 00:06:58,228 and productivity to focus on why I do this job or why we're in 119 00:06:58,228 --> 00:06:59,288 this business in the first place. 120 00:06:59,588 --> 00:07:00,478 Yeah, that's true.
121 00:07:00,628 --> 00:07:06,908 Another way to look at this is that with generative AI, you can imagine 122 00:07:07,208 --> 00:07:12,058 automating quite a few more tasks than with traditional machine learning 123 00:07:12,058 --> 00:07:17,318 approaches, where you would be classifying, you know, helping make decisions, 124 00:07:17,318 --> 00:07:20,898 but you'd need to be very clear about: this is the task, this 125 00:07:20,898 --> 00:07:22,478 is the decision we need to be making. 126 00:07:22,488 --> 00:07:26,248 This is the data that we are going to use to train the algorithm. 127 00:07:26,603 --> 00:07:31,748 With, you know, the new breed of tools, we can think of a million 128 00:07:31,748 --> 00:07:36,278 different ways to apply them, but that's, I think, what you were saying, 129 00:07:36,278 --> 00:07:40,178 and I totally agree with it: yeah, you theoretically can, but in 130 00:07:40,188 --> 00:07:42,228 practice, there are some challenges. 131 00:07:42,228 --> 00:07:45,188 And I think that's what we want to explore more today. 132 00:07:45,238 --> 00:07:49,298 And so I wonder, Sam, if you can share with us an example of a 133 00:07:49,298 --> 00:07:55,718 project of deploying generative AI in such a risk-averse environment 134 00:07:56,038 --> 00:07:57,588 that we can sort of explore together. 135 00:07:57,888 --> 00:07:58,498 Sure. 136 00:07:58,548 --> 00:08:03,928 So one of my former clients, we're good friends. 137 00:08:04,338 --> 00:08:09,518 One of the things we had been talking and working on together is this idea, again 138 00:08:09,528 --> 00:08:14,468 around the legal field. There are two examples: one at the very 139 00:08:14,468 --> 00:08:19,928 kind of enterprise level, and one that has a lot of repeatability but could 140 00:08:19,978 --> 00:08:22,258 apply to even smaller entities, right? 141 00:08:22,258 --> 00:08:22,928 Smaller firms. 142 00:08:23,228 --> 00:08:26,508 The smaller one is very basic filings, right? 143 00:08:26,508 --> 00:08:30,048 So imagine if you have an outdated system. 144 00:08:30,118 --> 00:08:33,328 The legal system is fairly outdated most of the time, but 145 00:08:33,328 --> 00:08:34,738 you have a lot of filings, right? 146 00:08:34,768 --> 00:08:37,438 And those are essentially templates, right? 147 00:08:37,438 --> 00:08:41,268 You have a one-page or a two-page document and you're replacing 148 00:08:41,268 --> 00:08:46,778 maybe 10 words, 12 words, 15 words, maybe putting in a blurb, a paragraph. 149 00:08:47,078 --> 00:08:49,908 That's a perfect space for generative AI, right? 150 00:08:49,908 --> 00:08:52,538 It is not the core part of a legal case. 151 00:08:52,588 --> 00:08:55,308 It is: I need to do this kind of thing. 152 00:08:55,318 --> 00:09:01,118 It looks nearly the same every single time, but that's 20 minutes of time 153 00:09:01,118 --> 00:09:02,488 that I could be using for something else. 154 00:09:02,658 --> 00:09:08,363 So those kinds of things, again, back to even within the legal work itself: what is 155 00:09:08,363 --> 00:09:13,063 the core and the complex, and what is the mundane, repeatable work you can turn 156 00:09:13,063 --> 00:09:17,263 into a template, where you can use the facts of what's going on to fill in 157 00:09:17,273 --> 00:09:21,333 that paragraph or two that's a summary: hey, we're filing this kind of thing.
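[Editor's aside: the pattern Sam describes, a fixed filing template where deterministic substitution fills the dozen changing fields and the model only drafts the short summary blurb, might look something like the minimal sketch below. The template fields, the draft_filing helper, and the llm_complete hook are illustrative assumptions, not details from the conversation, and the draft would still go to an attorney for review.]

```python
from string import Template

def llm_complete(prompt: str) -> str:
    """Hypothetical hook for whichever LLM API the firm uses."""
    raise NotImplementedError("wire up your model of choice here")

# A one- or two-page filing reduced to a fixed template: only a handful
# of fields change between filings, so those are substituted
# deterministically and never touched by the model.
FILING_TEMPLATE = Template(
    "IN RE: $client_name\n"
    "Case No. $case_number    Filed: $filing_date\n\n"
    "NOTICE OF $filing_type\n\n"
    "$summary_paragraph\n"
)

def draft_filing(facts: dict) -> str:
    # The model is constrained to the one free-text piece of the
    # document: the short summary paragraph.
    summary = llm_complete(
        "Write a two-to-three sentence summary paragraph for a legal "
        "filing, using only these facts and nothing else: " + repr(facts)
    )
    return FILING_TEMPLATE.substitute(
        client_name=facts["client_name"],
        case_number=facts["case_number"],
        filing_date=facts["filing_date"],
        filing_type=facts["filing_type"],
        summary_paragraph=summary,
    )
```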
158 00:09:21,633 --> 00:09:26,738 At the enterprise level, we had talked about, and this goes back to a slightly 159 00:09:26,748 --> 00:09:31,168 older concept that I think is coming back again, around smart contracts. 160 00:09:31,228 --> 00:09:36,958 So this is not smart contracts, but the idea of large corporate contracts. 161 00:09:36,968 --> 00:09:40,788 So, whether those are contracts for digital products, license 162 00:09:40,788 --> 00:09:42,418 agreements, commercial agreements. 163 00:09:42,718 --> 00:09:44,788 Very lengthy, very long. 164 00:09:45,258 --> 00:09:49,708 And managing every version of those is a complicated endeavor, right? 165 00:09:49,708 --> 00:09:53,658 So imagine if you have a hundred vendors, a hundred different commercial agreements. 166 00:09:54,073 --> 00:09:59,803 What does clause number 15 say for vendor C? That gets messy quickly. 167 00:10:00,273 --> 00:10:03,843 So, similar concept here, right, around how do you use generative 168 00:10:03,843 --> 00:10:08,373 AI to say, "Hey, this is the way I want to structure my contracts." 169 00:10:08,673 --> 00:10:13,263 Assess how different things are, but also create the next one. 170 00:10:13,283 --> 00:10:16,803 'Cause I've already given you my standard clauses, right? 171 00:10:16,833 --> 00:10:18,103 My standard language. 172 00:10:18,503 --> 00:10:22,043 And what you're going to change is maybe client name, industry, date, the 173 00:10:22,043 --> 00:10:25,823 kinds of things that, again, you or I would go into a template and modify. 174 00:10:26,123 --> 00:10:31,353 But it also then pulls that data to tell you how fractured 175 00:10:31,383 --> 00:10:32,643 your commercial agreements are, right? 176 00:10:32,643 --> 00:10:34,963 Do you really have one unified commercial agreement? 177 00:10:35,388 --> 00:10:38,048 Or do you actually have a hundred versions of that commercial agreement, which 178 00:10:38,048 --> 00:10:42,168 means there's a corporate risk that people in this field would assess, right? 179 00:10:42,518 --> 00:10:46,318 So those are some examples of using it in a way that is value 180 00:10:46,318 --> 00:10:51,408 added, taking into account the risk tolerance and the risk averseness of 181 00:10:51,408 --> 00:10:53,288 the folks in these specific fields. 182 00:10:53,588 --> 00:10:58,858 Right, so having some guardrails, if I may use the word, 183 00:10:59,158 --> 00:11:03,438 in the form of a template reduces the chances of a hallucination. 184 00:11:03,848 --> 00:11:10,288 Is time saving, I wonder, the primary benefit that, in your experience, people 185 00:11:10,288 --> 00:11:13,138 are seeking today with generative AI? 186 00:11:13,438 --> 00:11:19,233 I think it's either time saving, and/or doing more than you have the 187 00:11:19,233 --> 00:11:23,523 capacity to do, which is a similar thing but a little different. 188 00:11:23,823 --> 00:11:25,403 In the example of the commercial agreements, right? 189 00:11:25,423 --> 00:11:29,833 If I had a team of four, they could review all of our agreements 190 00:11:29,873 --> 00:11:30,953 and give me that analysis. 191 00:11:31,253 --> 00:11:35,333 Without that team of four, that work simply isn't getting done. 192 00:11:35,373 --> 00:11:40,553 And so when my, let's say, executive asks, how fractured are our agreements? 193 00:11:40,553 --> 00:11:43,233 The answer is, I think we're pretty fractured. 194 00:11:43,492 --> 00:11:44,802 That's not a very good answer.
195 00:11:45,262 --> 00:11:51,202 And so this is work that cannot be done unless you scale up at a cost you probably 196 00:11:51,202 --> 00:11:53,292 are not ready or willing to take on. 197 00:11:53,792 --> 00:11:58,092 Or it's getting time back for the team that you do have to do the things that 198 00:11:58,092 --> 00:12:02,492 are more valuable, more important, and reduce errors and mistakes, right? 199 00:12:02,492 --> 00:12:04,932 Because humans don't like doing the same 200 00:12:04,942 --> 00:12:06,662 mundane task over and over again. 201 00:12:07,062 --> 00:12:11,742 And so it benefits accuracy, but also morale and culture, and has all these 202 00:12:11,752 --> 00:12:15,882 other benefits of letting your people do their best work, which is a great thing. 203 00:12:16,182 --> 00:12:22,872 From what you're saying, there are, you know, the time savings on the artifacts, 204 00:12:23,092 --> 00:12:28,052 the output of some business activity, in this case, contracts. 205 00:12:28,062 --> 00:12:37,572 So that is tricky, but some of the other benefits are that you can do tasks 206 00:12:37,572 --> 00:12:39,352 like, okay, compare these documents. 207 00:12:39,352 --> 00:12:42,282 This is what you've just used as an example. 208 00:12:42,302 --> 00:12:46,132 I was thinking maybe if there's a document that has gone through a number 209 00:12:46,132 --> 00:12:50,952 of revisions, there's been a number of changes, et cetera, like asking the 210 00:12:50,952 --> 00:12:57,302 model to summarize the changes, that output of the model isn't directly 211 00:12:57,302 --> 00:13:02,632 going to get used in the actual document that will get eventually 212 00:13:02,632 --> 00:13:05,392 signed, but it still has benefits. 213 00:13:05,412 --> 00:13:09,532 And it's time savings, but I like the way you said it, that it's also 214 00:13:09,542 --> 00:13:12,767 maybe tasks that no one would have bothered doing because, well, we're 215 00:13:12,767 --> 00:13:15,687 all prioritizing as we go. 216 00:13:15,987 --> 00:13:22,877 And yeah, the other cultural, morale aspect, where repetitive work 217 00:13:23,177 --> 00:13:24,907 becomes much more doable. 218 00:13:25,347 --> 00:13:29,437 Let's say because, for the repetitive aspect, you can rely on the model. 219 00:13:29,767 --> 00:13:33,177 So really, I like that set of benefits. 220 00:13:33,187 --> 00:13:38,507 Let's now talk about the pitfalls or the risks of adopting generative AI. 221 00:13:38,507 --> 00:13:43,697 And you already gave the example, but if you try and generalize again, 222 00:13:43,717 --> 00:13:49,007 thinking about a potential client with whom you're exploring, you 223 00:13:49,007 --> 00:13:50,237 know, the art of the possible. 224 00:13:50,537 --> 00:13:55,147 What do you emphasize for them in terms of risks and pitfalls? 225 00:13:55,447 --> 00:13:59,127 Yeah, I think some of the risks and pitfalls are trying 226 00:13:59,127 --> 00:14:01,297 to do too much too fast. 227 00:14:01,477 --> 00:14:06,437 It is very much an exciting time, and therefore we find people are 228 00:14:06,797 --> 00:14:10,547 overhyped and they think it'll solve all their problems quickly. 229 00:14:10,767 --> 00:14:15,487 And it has benefits, but it is slower than I think anyone wants it to be 230 00:14:15,497 --> 00:14:17,197 while also still being very exciting.
231 00:14:17,247 --> 00:14:19,812 Especially as we talk about enterprise scale: I want to 232 00:14:19,812 --> 00:14:20,982 roll this out to my employees. 233 00:14:21,282 --> 00:14:21,762 Okay. 234 00:14:21,812 --> 00:14:24,422 We'd love to do that, but it's probably not in three weeks. 235 00:14:24,452 --> 00:14:26,892 And that's some of the tension that folks feel. 236 00:14:27,342 --> 00:14:32,332 The other one I've seen is not integrating it into the 237 00:14:32,632 --> 00:14:37,612 ways of working and the tools that your teams use now, right? 238 00:14:37,662 --> 00:14:42,822 My personal favorite example is, I don't like Outlook. I never liked Outlook. 239 00:14:43,307 --> 00:14:47,417 But I've used Outlook for nearly 20 years now because that's what 240 00:14:47,457 --> 00:14:50,217 every company and organization uses. 241 00:14:50,684 --> 00:14:53,964 And we all laugh about how bad the search is, right? 242 00:14:54,044 --> 00:14:55,194 I can never find an email. 243 00:14:55,494 --> 00:14:58,664 It's a problem we've known about for a long time, yet we still all use 244 00:14:58,664 --> 00:15:00,724 it because it's the way we all work. 245 00:15:00,824 --> 00:15:01,984 There is search in Outlook? 246 00:15:02,004 --> 00:15:03,284 Yeah, exactly. 247 00:15:04,077 --> 00:15:04,757 Exactly. 248 00:15:05,127 --> 00:15:10,047 So that is the, how do you bring it to where your teams are? 249 00:15:10,397 --> 00:15:13,237 Because when you tell them, hey, you have to go sign up for a new 250 00:15:13,287 --> 00:15:16,237 thing, new login, new website. 251 00:15:16,617 --> 00:15:19,537 And by the way, it's not going to work as well as you hope, or would 252 00:15:19,537 --> 00:15:21,527 like it to work, the first few times. 253 00:15:21,827 --> 00:15:23,797 All you're going to do is upset people, right? 254 00:15:23,797 --> 00:15:25,187 You're not helping people at that point. 255 00:15:25,487 --> 00:15:29,827 How do you bring it to where they work or, arguably even better, on the back 256 00:15:29,827 --> 00:15:34,447 end, and just do some of the work on their behalf without even necessarily 257 00:15:34,457 --> 00:15:36,387 needing the human intervention as much? 258 00:15:36,827 --> 00:15:37,077 Yeah. 259 00:15:37,377 --> 00:15:41,447 So those are some examples where, you know, helping 260 00:15:41,447 --> 00:15:44,277 the people using it actually get comfortable using it 261 00:15:44,327 --> 00:15:46,627 is, I think, a big thing to tackle. 262 00:15:47,117 --> 00:15:51,657 So, again, it's great that you sort of went into this whole 263 00:15:51,657 --> 00:15:54,207 discussion of enterprise scale. 264 00:15:54,587 --> 00:15:57,997 Because this is something I get asked quite a lot. 265 00:15:58,377 --> 00:16:03,917 People, of course, since ChatGPT came out, have had personal experiences, 266 00:16:04,207 --> 00:16:10,147 sometimes in smaller startups or, yeah, smaller organizations. 267 00:16:10,257 --> 00:16:16,127 Just using a tool, or 10, is very easy, very straightforward, 268 00:16:16,547 --> 00:16:21,707 but you've already touched upon some elements that are specific to this 269 00:16:21,707 --> 00:16:26,897 enterprise context, where if you start doing that at scale with all the tools you 270 00:16:26,897 --> 00:16:34,307 already have, it just adds to the complexity of managing the organization.
271 00:16:34,597 --> 00:16:37,957 Can you think of any other aspects that you would want to highlight 272 00:16:37,967 --> 00:16:39,547 that are unique to enterprises? 273 00:16:39,847 --> 00:16:44,147 The other thing I would highlight is the maintenance of those systems, right? 274 00:16:44,197 --> 00:16:48,307 At enterprise scale, they are almost never straight out of the box. 275 00:16:48,367 --> 00:16:51,447 As they're improving, back to the guardrails for a second. 276 00:16:51,887 --> 00:16:56,527 The really nice thing is being able to take a pretty basic model, right? 277 00:16:56,527 --> 00:16:59,637 It understands general conversation. 278 00:16:59,667 --> 00:17:05,727 It can put out pretty well-written things, but then giving it your own data, right? 279 00:17:05,817 --> 00:17:08,137 A perfect example of this for enterprise is your 280 00:17:08,137 --> 00:17:09,617 knowledge management systems. 281 00:17:09,627 --> 00:17:13,797 So SharePoint, wherever you have your knowledge, being able to 282 00:17:13,797 --> 00:17:16,577 take a thousand PDFs, right? 283 00:17:16,637 --> 00:17:20,487 Or every standard operating procedure you've ever written, putting it in 284 00:17:20,487 --> 00:17:26,147 there and then saying, how do I do X process, or who do I contact for this? 285 00:17:26,427 --> 00:17:30,147 And it'll actually give you a good answer based on the standard operating 286 00:17:30,147 --> 00:17:31,317 procedure one of your employees wrote. 287 00:17:31,617 --> 00:17:35,717 So I think that has a lot of potential at the enterprise scale. 288 00:17:36,187 --> 00:17:38,377 But there's a maintenance piece to it, right? 289 00:17:38,377 --> 00:17:42,057 Which means there's a skill set, probably somewhere in IT, or you 290 00:17:42,057 --> 00:17:43,357 could have somebody come help you. 291 00:17:43,357 --> 00:17:47,847 But not just to build it and get it stood up, but to fine-tune it, 292 00:17:48,267 --> 00:17:49,805 add features and functionality. 293 00:17:50,487 --> 00:17:54,807 So there is that very literal cost of keeping this thing 294 00:17:54,817 --> 00:17:56,667 running and feeding it, et cetera. 295 00:17:56,987 --> 00:18:01,377 I think the benefits clearly outweigh that cost, but, 296 00:18:01,747 --> 00:18:06,277 especially as we talk about small and mid-sized enterprises, adding one or 297 00:18:06,277 --> 00:18:10,847 two or three people to an IT team may not be something that's on 298 00:18:10,847 --> 00:18:14,037 their mind when they're thinking about standing up these kinds of capabilities.
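[Editor's aside: the "give the model your own data" idea Sam describes for SOP and SharePoint libraries is commonly implemented as retrieval-augmented generation: find the handful of documents relevant to the question, then ask the model to answer only from them. A minimal sketch, assuming a hypothetical llm_complete hook and a toy word-overlap scorer standing in for a real embedding index:]

```python
def llm_complete(prompt: str) -> str:
    """Hypothetical hook for whichever LLM API the organization uses."""
    raise NotImplementedError("wire up your model of choice here")

def relevance(question: str, doc: str) -> int:
    # Toy scoring: count question words that appear in the document.
    # Production systems use embeddings and a vector index instead.
    words = set(question.lower().split())
    return sum(1 for w in doc.lower().split() if w in words)

def answer_from_sops(question: str, sops: dict) -> str:
    """Answer a 'how do I do X process?' question, grounded in the
    standard operating procedures employees actually wrote."""
    # Retrieve the three most relevant SOPs...
    ranked = sorted(sops.items(), key=lambda kv: relevance(question, kv[1]), reverse=True)
    context = "\n\n".join(f"[{name}]\n{text}" for name, text in ranked[:3])
    # ...and instruct the model to stay inside them.
    return llm_complete(
        "Answer using ONLY the procedures below. If they don't cover "
        f"the question, say so.\n\n{context}\n\nQuestion: {question}"
    )
```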
299 00:18:14,337 --> 00:18:14,851 Yeah, definitely. 300 00:18:15,211 --> 00:18:20,641 I think that, you know, this is something where enterprises have both 301 00:18:20,681 --> 00:18:25,361 extra risks to manage and, you know, mitigations to put in place. 302 00:18:25,361 --> 00:18:30,631 But it's also that they can afford to deploy these systems 303 00:18:30,631 --> 00:18:35,386 that are not generic, but are working with the enterprise's data. 304 00:18:35,516 --> 00:18:41,026 So that is maybe not the same when you're in a smaller or 305 00:18:41,026 --> 00:18:42,506 larger organization, of course. 306 00:18:42,856 --> 00:18:47,106 But you've, you know, mentioned skills, and that's one of the areas 307 00:18:47,106 --> 00:18:48,756 where I wanted to get your input. 308 00:18:49,056 --> 00:18:55,106 It would seem at this enterprise scale that it kind of requires new skills. 309 00:18:55,106 --> 00:18:58,846 So would you see those skills being internal, external, a mix of both? 310 00:18:59,032 --> 00:18:59,852 I've seen both. 311 00:18:59,942 --> 00:19:03,652 My favorite model, it's just how I like to work, it doesn't mean it's the right 312 00:19:03,652 --> 00:19:05,782 way, is the upskilling model, right? 313 00:19:05,782 --> 00:19:08,632 So to get off the ground, bring in the experts, get 314 00:19:08,632 --> 00:19:10,182 some help, that kind of thing. 315 00:19:10,662 --> 00:19:16,152 And having outside people do the long-term maintenance has some benefits. 316 00:19:16,172 --> 00:19:19,132 But what I really like is the idea of: you have a team. 317 00:19:19,587 --> 00:19:23,277 They're probably as excited as the rest of the world about this technology. 318 00:19:23,777 --> 00:19:27,697 Give them the chance and the opportunity and the development, while this thing 319 00:19:27,697 --> 00:19:31,317 is being built and developed and rolled out, to get upskilled, to get 320 00:19:31,317 --> 00:19:33,967 comfortable, and to take ownership of it. 321 00:19:34,037 --> 00:19:36,137 Because they're inside your organization. 322 00:19:36,137 --> 00:19:38,127 They know you, they know your systems, they know your people. 323 00:19:38,477 --> 00:19:42,887 So that transition, a handover and an upskilling that goes with it, 324 00:19:42,917 --> 00:19:45,057 I think I've seen it work beautifully. 325 00:19:45,277 --> 00:19:46,557 It's more intentional, though. 326 00:19:46,557 --> 00:19:50,777 You really have to not just focus on building it, but focus on the people 327 00:19:50,777 --> 00:19:53,847 that will maintain it coming along for the journey the entire time. 328 00:19:54,337 --> 00:20:00,917 For you, is this a decision that kind of works best in all circumstances? 329 00:20:01,097 --> 00:20:05,457 What about, you know, the potential future evolution of these technologies, 330 00:20:05,457 --> 00:20:10,557 and potentially, you know, trying to position yourself and your organization 331 00:20:11,017 --> 00:20:13,307 in a good place for whatever comes next? 332 00:20:13,317 --> 00:20:15,757 How does that factor in for you? 333 00:20:16,057 --> 00:20:17,937 I don't know if it applies in all situations. 334 00:20:18,137 --> 00:20:21,687 I've seen that where there's a desire and an intentionality, I think 335 00:20:21,687 --> 00:20:23,647 it can work in most instances. 336 00:20:23,677 --> 00:20:27,387 There are a few instances where, let's say, you have such limited 337 00:20:27,397 --> 00:20:28,977 internal capability, right? 338 00:20:29,147 --> 00:20:30,917 But these are not large enterprises, right? 339 00:20:31,187 --> 00:20:34,637 Or it's a capacity issue, not a skill issue, right? 340 00:20:34,907 --> 00:20:38,137 Hey, our folks could learn this, but it means they wouldn't do this 341 00:20:38,157 --> 00:20:39,417 other thing we need them to do. 342 00:20:39,447 --> 00:20:40,807 And we're not planning on hiring. 343 00:20:41,107 --> 00:20:42,547 So it's a strategic decision. 344 00:20:42,942 --> 00:20:43,712 We don't want to do this. 345 00:20:43,712 --> 00:20:45,297 And those are obviously okay. 346 00:20:45,742 --> 00:20:50,052 For our own purposes, I love solving the hard, complex problems. 347 00:20:50,112 --> 00:20:53,582 And I think I and the firm have built a reputation for that, right?
348 00:20:53,902 --> 00:20:59,419 Where, if you are trying to tackle something as complex as how do I bring 349 00:20:59,419 --> 00:21:04,467 artificial intelligence, generative AI, into my enterprise, we would be 350 00:21:04,467 --> 00:21:05,907 on the list of folks you would call. 351 00:21:06,007 --> 00:21:11,177 And so my hope is to go from complex problem to complex problem 352 00:21:11,237 --> 00:21:15,867 without needing as much of the maintenance piece, or sticking around 353 00:21:15,967 --> 00:21:17,437 unless they want it, which I'm happy to do. 354 00:21:17,457 --> 00:21:22,807 But that's where the sort of combined model comes in, where they do have 355 00:21:22,807 --> 00:21:24,587 internal people who are involved. 356 00:21:24,887 --> 00:21:32,657 But you're there to support them, to show them the way, advise, and be 357 00:21:32,657 --> 00:21:39,147 there, without necessarily expecting them to also be, at the same time, checking 358 00:21:39,147 --> 00:21:40,627 all the research papers every day. 359 00:21:40,927 --> 00:21:41,347 That's right. 360 00:21:41,497 --> 00:21:41,947 That's right. 361 00:21:41,997 --> 00:21:45,187 And that is part of the doing versus advising. 362 00:21:45,187 --> 00:21:46,597 That's a great point. 363 00:21:46,627 --> 00:21:47,017 Yes. 364 00:21:47,067 --> 00:21:51,137 And that relationship is ongoing, whether, you know, we're working 365 00:21:51,152 --> 00:21:53,682 together actively at this moment or not. 366 00:21:54,182 --> 00:21:56,402 Yeah, absolutely. 367 00:21:56,406 --> 00:22:00,436 And so, from a leadership perspective, what related 368 00:22:00,436 --> 00:22:05,966 strategies are relevant when thinking about, you know, deploying generative 369 00:22:05,966 --> 00:22:10,766 AI? You've already mentioned, obviously, the skills side of things, but are 370 00:22:10,766 --> 00:22:15,316 there any other risk mitigation approaches or things to do that you would advise? 371 00:22:15,616 --> 00:22:18,066 So I personally take a very human-centered approach. 372 00:22:18,116 --> 00:22:23,396 And regardless of technology, I always start with the people 373 00:22:23,396 --> 00:22:24,606 that will use it, right? 374 00:22:24,606 --> 00:22:29,116 So, many of our customers actually want to deploy it into their 375 00:22:29,416 --> 00:22:31,536 offerings to their customers, right? 376 00:22:31,536 --> 00:22:35,696 So they have a product, a business-to-business SaaS offering or 377 00:22:35,696 --> 00:22:36,876 something like that, right? 378 00:22:37,286 --> 00:22:39,156 And they're saying, "Hey, we have a suite of products. We want to 379 00:22:39,206 --> 00:22:41,846 add generative AI to it." Love it. 380 00:22:41,906 --> 00:22:42,186 Great. 381 00:22:42,206 --> 00:22:43,276 Let's talk about that. 382 00:22:43,706 --> 00:22:44,566 Do they want it? 383 00:22:44,656 --> 00:22:45,476 Do they need it? 384 00:22:45,556 --> 00:22:47,756 What part of their work are they interested in? 385 00:22:48,216 --> 00:22:51,326 To me, that's the biggest risk mitigation because the technology 386 00:22:51,326 --> 00:22:55,696 is solvable and in many instances now is solved and repeatable. 387 00:22:55,996 --> 00:22:59,866 The question becomes, are you investing in the right area?
388 00:22:59,996 --> 00:23:04,736 Back to the professionals, the way they work, and the risk averseness: are you 389 00:23:04,736 --> 00:23:08,336 touching a point that is very sensitive to your customers, and therefore they 390 00:23:08,336 --> 00:23:13,336 are much less likely to try or use the thing that you're now pitching them? 391 00:23:13,636 --> 00:23:18,386 Or are you hitting a wonderful pain point, like we both laugh about 392 00:23:18,426 --> 00:23:19,736 the pain points we have, right? 393 00:23:20,026 --> 00:23:23,976 They would go, "Oh, somebody thought of me and the pain I 394 00:23:24,106 --> 00:23:25,836 feel, of course I'm going to try." 395 00:23:26,546 --> 00:23:30,156 And that difference of adoption and usability and interaction and feedback 396 00:23:30,456 --> 00:23:32,146 can make or break any product. 397 00:23:32,296 --> 00:23:36,436 So to me, the big risk mitigation is taking that human-centered approach 398 00:23:36,446 --> 00:23:41,526 first, understanding where the people are, the way they work, their environment, 399 00:23:41,526 --> 00:23:45,700 their context, and targeting the right problem with the right solution. 400 00:23:46,000 --> 00:23:49,670 Great, the human-centered approach is an excellent point on 401 00:23:49,670 --> 00:23:52,100 which to start wrapping up. 402 00:23:52,100 --> 00:23:56,500 So usually at the end, I ask for a tool, book, or maybe a habit 403 00:23:56,800 --> 00:24:01,800 that has made an impact on you in the last 12 months, and why. 404 00:24:02,100 --> 00:24:04,230 I'll give you a tool and a habit. 405 00:24:04,470 --> 00:24:08,100 What I've been trying to do is this: like this conversation, I have really 406 00:24:08,100 --> 00:24:10,210 good conversations and idea sparks. 407 00:24:10,690 --> 00:24:11,420 I have a notebook. 408 00:24:11,420 --> 00:24:12,840 I usually jot them down quickly. 409 00:24:13,280 --> 00:24:17,280 Before this habit and this tool, it would take me a long time to 410 00:24:17,280 --> 00:24:21,390 go from an idea or a bullet to sharing that idea more broadly. 411 00:24:21,530 --> 00:24:26,510 So my habit is, I take that bullet and I expand it into a few bullets. 412 00:24:26,650 --> 00:24:27,360 Let's say 413 00:24:27,830 --> 00:24:29,430 eight bullets, nine bullets, right? 414 00:24:29,430 --> 00:24:30,770 Sentences, an outline. 415 00:24:31,080 --> 00:24:33,374 But that takes 10 minutes, 15 minutes. 416 00:24:33,374 --> 00:24:34,250 That doesn't take a lot of time. 417 00:24:34,550 --> 00:24:37,060 And then, because we're talking about generative AI, the one I've been 418 00:24:37,060 --> 00:24:38,840 using a lot recently is Google's Gemini. 419 00:24:39,140 --> 00:24:42,560 So I'll take the bullets into Gemini and I'll say, give me a first draft if I want 420 00:24:42,560 --> 00:24:45,090 to share this idea out with the world. 421 00:24:45,390 --> 00:24:47,840 It's usually only 70 percent good. 422 00:24:48,280 --> 00:24:52,250 But as anyone who's tried to write knows, going from a blank piece of paper 423 00:24:52,260 --> 00:24:54,650 to something is the hardest part. 424 00:24:54,760 --> 00:24:58,530 And so it shortens my cycle of getting an idea out, to share with 425 00:24:58,530 --> 00:25:01,900 my peers, to share with a client, to share with the world at large. 426 00:25:02,290 --> 00:25:06,270 And I'm trying to get in the habit of shortening that loop, 427 00:25:06,290 --> 00:25:09,830 of going from idea to getting an idea out and starting over.
428 00:25:10,150 --> 00:25:13,430 And I've done it a couple of times and it's worked well, but I still 429 00:25:13,730 --> 00:25:14,280 need to work on that part. 430 00:25:14,780 --> 00:25:16,060 That's an amazing tip. 431 00:25:16,230 --> 00:25:16,750 Thank you. 432 00:25:16,940 --> 00:25:17,870 Thank you very much. 433 00:25:18,210 --> 00:25:22,240 And the final question is, of course, we're living through, you know, times 434 00:25:22,240 --> 00:25:25,970 where everything's changing, but what is the one thing that you would imagine 435 00:25:25,970 --> 00:25:28,090 to remain constant 10 years from now? 436 00:25:28,390 --> 00:25:29,010 That's a great question. 437 00:25:29,160 --> 00:25:33,990 I think the world will not look that much different 10 years from now. 438 00:25:34,090 --> 00:25:35,600 It's an unpopular opinion, perhaps. 439 00:25:35,720 --> 00:25:40,030 I think in many ways, the tools we use will look different. 440 00:25:40,090 --> 00:25:43,020 If you think back 10 or 20 or 30 or 50 years, right? 441 00:25:43,300 --> 00:25:44,230 The cars looked different. 442 00:25:44,250 --> 00:25:45,550 The appliances were different. 443 00:25:46,030 --> 00:25:47,759 But you still needed community, right? 444 00:25:47,810 --> 00:25:51,920 You still wanted to hang out with friends, sports still existed, right? 445 00:25:52,350 --> 00:25:55,410 Most of the work that gets done still got done. 446 00:25:55,840 --> 00:26:00,320 So in some ways I'm very excited for the future, and in other ways I take 447 00:26:00,330 --> 00:26:05,870 comfort in knowing that the things that we know and love and cherish 448 00:26:06,170 --> 00:26:07,670 aren't going away anytime soon. 449 00:26:07,810 --> 00:26:10,590 So I'm hopeful and optimistic for the future. 450 00:26:10,989 --> 00:26:13,749 Wonderful. Thank you very much, Sam, absolute pleasure! 451 00:26:14,016 --> 00:26:14,526 Thank you. 452 00:26:14,526 --> 00:26:16,327 I really appreciated the conversation. 453 00:26:17,392 --> 00:26:21,102 Throughout today's conversation, Sam reminded us that the real challenge with 454 00:26:21,102 --> 00:26:24,042 generative AI isn't the technology itself. 455 00:26:24,602 --> 00:26:27,342 It's knowing how and where to apply it. 456 00:26:27,817 --> 00:26:31,347 Whether it's streamlining repetitive legal filings, managing complex 457 00:26:31,357 --> 00:26:35,707 contracts, or scaling knowledge systems across enterprises, Sam's 458 00:26:35,737 --> 00:26:41,177 insights revealed that success lies in balancing innovation with practicality. 459 00:26:41,777 --> 00:26:44,987 His human-centered approach, focusing on risk tolerance, clear 460 00:26:44,987 --> 00:26:49,497 workflows, and employee empowerment, offers a roadmap for leaders 461 00:26:49,567 --> 00:26:51,517 navigating risk-averse environments. 462 00:26:52,117 --> 00:26:54,587 Sam's journey challenges us to think deeply. 463 00:26:55,037 --> 00:26:57,977 Are we prioritizing tools that align with our core mission? 464 00:26:58,327 --> 00:27:00,927 Or are we chasing hype at the expense of long-term value? 465 00:27:01,827 --> 00:27:04,817 If you'd like to connect with Sam and learn about his work, 466 00:27:04,987 --> 00:27:05,777 you can find him on LinkedIn. 467 00:27:05,777 --> 00:27:08,197 The link is in the description. 468 00:27:08,677 --> 00:27:11,607 He's eager to grow his network and collaborate with leaders who 469 00:27:11,607 --> 00:27:15,177 share his passion for building practical, people-first solutions.
470 00:27:15,967 --> 00:27:20,227 As always, we have more exciting topics and guests lined up, so stay tuned 471 00:27:20,227 --> 00:27:24,617 for more tales of innovation that inspire, challenge, and transform. 472 00:27:25,237 --> 00:27:27,287 Until next time, peace.