1 00:00:00,146 --> 00:00:03,536 Your digital roadmap probably talks about efficiency. 2 00:00:04,076 --> 00:00:07,286 What if it also determined who gets a fighting chance to live? 3 00:00:08,276 --> 00:00:12,146 Stay with us as we explore the moral calculus of deploying AI in healthcare 4 00:00:12,596 --> 00:00:16,166 and the hard lessons every transformation team can apply across industries. 5 00:00:16,676 --> 00:00:19,526 My name is Alexandre Nevski, and this is Innovation Tales. 6 00:00:20,140 --> 00:00:22,720 Navigating change, one story at a time. 7 00:00:22,750 --> 00:00:26,950 We share insights from leaders tackling the challenges of today's digital world. 8 00:00:27,220 --> 00:00:30,730 Welcome to Innovation Tales, the podcast exploring the human 9 00:00:30,730 --> 00:00:32,590 side of digital transformation. 10 00:00:33,648 --> 00:00:36,408 You've heard the mantra: move fast and break things. 11 00:00:37,098 --> 00:00:40,073 In healthcare, however, the thing you break could be a person. 12 00:00:40,998 --> 00:00:45,868 That's why I've invited RJ Kedziora, co-founder of Estenda, HIMSS Davies 13 00:00:45,888 --> 00:00:51,048 and Health 2.0 Leadership Award winner and 25-year digital health veteran, to 14 00:00:51,048 --> 00:00:53,838 show how speed and safety can coexist. 15 00:00:54,288 --> 00:00:57,738 At Estenda, RJ sets the technical strategy for solutions 16 00:00:57,738 --> 00:00:59,148 that truly improve lives. 17 00:00:59,688 --> 00:01:04,248 Today he explains how ambient-listening Gen AI scribes cut clinicians' 18 00:01:04,248 --> 00:01:08,313 dreaded "pajama time", letting them spend more minutes with patients 19 00:01:08,818 --> 00:01:10,588 while the notes write themselves. 20 00:01:11,578 --> 00:01:16,378 Put simply, his blueprint helps anyone tasked with turning AI's promise into 21 00:01:16,378 --> 00:01:21,208 real-world impact, no matter where it sits on your transformation roadmap. 22 00:01:22,168 --> 00:01:25,658 Without further ado, here's my conversation with RJ Kedziora. 23 00:01:26,271 --> 00:01:27,321 RJ, welcome to the show. 24 00:01:28,049 --> 00:01:29,189 Alex, thanks for having me. 25 00:01:29,249 --> 00:01:30,629 Looking forward to the conversation. 26 00:01:31,769 --> 00:01:32,879 Thanks for taking the time. 27 00:01:33,479 --> 00:01:36,839 You specialize in digital health, but you also describe 28 00:01:36,839 --> 00:01:38,909 yourself as an aspiring author. 29 00:01:38,990 --> 00:01:42,260 The book you're writing describes your Productive Harmony framework. 30 00:01:42,380 --> 00:01:43,760 So what's the story there? 31 00:01:43,760 --> 00:01:45,290 What led you to take on this project? 32 00:01:45,466 --> 00:01:46,786 Yeah, thanks. 33 00:01:46,816 --> 00:01:50,986 The book's working title is The Productive Harmony Framework, and 34 00:01:50,986 --> 00:01:53,326 it comes from my life experience. 35 00:01:53,356 --> 00:01:54,376 I am an entrepreneur. 36 00:01:55,376 --> 00:02:00,866 I also race triathlons, and as you can imagine, there's a lot of activity, 37 00:02:01,016 --> 00:02:03,056 physical activity, that's part of that. 38 00:02:03,686 --> 00:02:07,946 And yes, that aspect is challenging, but one of the reasons I've gotten 39 00:02:08,006 --> 00:02:13,436 into triathlons is because of the mental challenge as well and what 40 00:02:13,436 --> 00:02:15,446 drives both of those components.
41 00:02:15,446 --> 00:02:19,226 As I sat down last summer and started thinking about, okay, I wanna write 42 00:02:19,226 --> 00:02:23,456 a book, it had been a passion in the back of my mind. I write 43 00:02:23,456 --> 00:02:27,476 a lot of white papers and journal articles, presentations, webinars. 44 00:02:27,526 --> 00:02:28,636 What is the next step? 45 00:02:28,636 --> 00:02:32,541 The evolution in my career and what interests me? 46 00:02:32,751 --> 00:02:33,921 And I was like, I want to write a book. 47 00:02:33,921 --> 00:02:38,931 So as I worked through that, everything kept coming 48 00:02:38,931 --> 00:02:40,851 back to the idea of energy. 49 00:02:40,901 --> 00:02:45,251 Whether it's physical energy or mental energy, to be able to get stuff done 50 00:02:45,251 --> 00:02:50,051 and be productive, and even more so getting it done right the first time. 51 00:02:50,771 --> 00:02:54,431 And what's fascinating is, as I started looking into this, the first thing 52 00:02:54,431 --> 00:02:57,671 most people think of, the first thing that I really started thinking of 53 00:02:58,091 --> 00:03:00,221 early in my career, was time management. 54 00:03:00,821 --> 00:03:04,751 Okay, how am I going to manage my time to get stuff done? 55 00:03:05,531 --> 00:03:07,631 There are only 24 hours in a day. 56 00:03:07,751 --> 00:03:09,821 You can't control time. 57 00:03:10,631 --> 00:03:12,161 And so what can you control? 58 00:03:12,311 --> 00:03:14,171 You can control yourself, your energy. 59 00:03:14,681 --> 00:03:21,341 How do you focus your energy to be able to maximize the time that you do have? 60 00:03:21,341 --> 00:03:25,391 You're probably gonna sleep 6, 7, 8 hours a night if you're lucky. 61 00:03:25,391 --> 00:03:26,891 So that's out the door. 62 00:03:26,891 --> 00:03:27,312 That is, 63 00:03:27,341 --> 00:03:32,316 you can't use that time to get work or personal projects done. 64 00:03:32,586 --> 00:03:37,236 That sleep time is very productive though, because it gives you the energy to be 65 00:03:37,236 --> 00:03:39,666 able to do all of these different tasks. 66 00:03:40,266 --> 00:03:43,796 So that's what drove the idea of the book, Productive Harmony. 67 00:03:44,276 --> 00:03:47,186 It was fascinating when I started it and was working with editors. 68 00:03:47,696 --> 00:03:52,766 I really thought I had to come up with everything, all of the ideas, all of the 69 00:03:52,766 --> 00:03:54,986 content, and that was very intimidating. 70 00:03:55,121 --> 00:03:58,991 And it was interesting 'cause they're like, how many words do you want it to be? 71 00:03:58,991 --> 00:04:03,521 That number has no meaning to me as a first-time author, or to you 72 00:04:03,521 --> 00:04:05,381 or most of the listening audience. 73 00:04:05,691 --> 00:04:08,421 I read lots of books, but I don't know. 74 00:04:08,811 --> 00:04:13,621 So we wound up settling on 50,000 words, which was roughly 200 pages. 75 00:04:14,411 --> 00:04:19,296 And I learned that no, the entire concept and idea didn't have to come from me. 76 00:04:19,386 --> 00:04:24,786 To be engaging, a good book has to be loaded with personal stories, 77 00:04:25,206 --> 00:04:28,906 research articles, and first-person interviews that I've done with 78 00:04:28,906 --> 00:04:30,256 different people on the topic.
79 00:04:30,256 --> 00:04:35,746 I reached out on different social media platforms and talked to PhD experts in 80 00:04:35,746 --> 00:04:40,121 time management, which is fascinating 'cause I didn't even know people were 81 00:04:40,121 --> 00:04:41,711 researching it from that perspective. 82 00:04:42,161 --> 00:04:46,241 So instead of just a book-writing exercise, it became a 83 00:04:46,241 --> 00:04:50,491 research exercise for me, which is something I do every day in my day-to-day life. 84 00:04:50,491 --> 00:04:55,141 So it became a much easier task, but very fascinating to dive into the 85 00:04:55,141 --> 00:04:57,881 idea of time management and energy. 86 00:04:58,601 --> 00:05:03,376 One of the interesting things that I've discovered in this: we as 87 00:05:03,376 --> 00:05:08,102 humans, humanity, have been thinking about time since very early on. 88 00:05:08,852 --> 00:05:10,382 Cavemen had to think about time. 89 00:05:10,382 --> 00:05:12,902 Okay, you only have so many hours, the sun's gonna go down. 90 00:05:13,202 --> 00:05:15,152 We need to get food, we need to have shelter. 91 00:05:16,182 --> 00:05:20,232 In the early 1900s there was a gentleman named Frederick Taylor. 92 00:05:20,249 --> 00:05:23,914 The science has since been called Taylorism, but he really kicked off the 93 00:05:23,914 --> 00:05:27,409 modern-day theory of time management: 94 00:05:27,409 --> 00:05:31,459 sitting there and looking at a task, breaking it down into components, and using 95 00:05:31,459 --> 00:05:36,349 a stopwatch to figure out how to optimize productivity from a time perspective. 96 00:05:37,129 --> 00:05:41,869 What history has forgotten, and what I uncovered in my research, was the 97 00:05:41,869 --> 00:05:47,179 fact that after two years, his efforts did not increase productivity but 98 00:05:47,179 --> 00:05:49,614 led to burnout, and he got fired. 99 00:05:49,804 --> 00:05:55,040 So it's amazingly fascinating to me that what's driving us all day to day 100 00:05:55,040 --> 00:05:59,180 is this time management and getting stuff done, when those early theories 101 00:05:59,450 --> 00:06:02,090 didn't quite pan out as we thought. 102 00:06:02,090 --> 00:06:07,355 So I think energy management today, in the 21st century, is a much more effective way 103 00:06:07,685 --> 00:06:12,525 of looking at how to plan out your day and maximize those hours that you do have. 104 00:06:12,804 --> 00:06:18,964 For me, I guess the first thing I think of when I hear managing my 105 00:06:18,964 --> 00:06:25,024 energy and increasing my productivity is avoiding context switches. 106 00:06:25,064 --> 00:06:28,964 I don't know if it's the same for other people, but they are extremely 107 00:06:28,964 --> 00:06:30,674 expensive in terms of energy. 108 00:06:31,124 --> 00:06:35,894 Changing between one task and a completely different task takes 109 00:06:35,894 --> 00:06:41,129 just so long, and I really cannot afford to do that 20 times a day. 110 00:06:41,249 --> 00:06:42,839 Complete burnout at the end of the day. 111 00:06:42,839 --> 00:06:44,369 And this is unsustainable. 112 00:06:44,879 --> 00:06:50,889 So I wonder, can you give us a teaser of what energy management 113 00:06:50,889 --> 00:06:51,909 means for you in the book? 114 00:06:52,209 --> 00:06:59,349 It is about focusing and looking at how energy affects me day to day.
115 00:06:59,559 --> 00:07:04,649 So in my particular case, I do get up early in the morning, 4:30, 4:45, and 116 00:07:04,649 --> 00:07:06,809 train for an hour and a half to two hours. 117 00:07:07,079 --> 00:07:11,379 That actually, maybe counterintuitively, gives me energy 118 00:07:11,949 --> 00:07:13,609 to be able to take on the day. 119 00:07:13,609 --> 00:07:18,199 On the days that I do rest and relax, there's maybe even some level of 120 00:07:18,199 --> 00:07:19,699 anxiety of, oh, am I doing enough? 121 00:07:19,699 --> 00:07:20,779 Am I training enough? 122 00:07:21,334 --> 00:07:25,084 But I need less coffee, less caffeine, on the days that I train, 123 00:07:25,084 --> 00:07:26,704 because it gives me energy. 124 00:07:27,244 --> 00:07:32,344 And I know I need to focus on tasks that need a lot of thinking, a 125 00:07:32,344 --> 00:07:34,144 lot of focus, early in the morning. 126 00:07:34,924 --> 00:07:39,325 Through the midday, I focus on administrative types of activities, which 127 00:07:39,330 --> 00:07:43,434 don't need as much time and attention but definitely have to get done. 128 00:07:44,064 --> 00:07:48,504 And then late afternoon I get a burst of creativity, where I try and focus 129 00:07:48,504 --> 00:07:53,184 on more of our marketing activities and what I'm gonna do next for this 130 00:07:53,184 --> 00:07:55,104 book and the next chapter in the book. 131 00:07:55,200 --> 00:07:57,000 The creativity really peaks there. 132 00:07:57,360 --> 00:08:00,670 So when you do think about energy, think of your day and how it 133 00:08:00,670 --> 00:08:02,350 flows throughout the day. 134 00:08:02,720 --> 00:08:07,550 The one thing too, to your point about not doing this context switching: 135 00:08:08,150 --> 00:08:13,310 it's horrible for using your energy, for being engaged 136 00:08:13,310 --> 00:08:15,050 in a task, staying on target. 137 00:08:15,945 --> 00:08:17,145 We all have our smartphones. 138 00:08:17,175 --> 00:08:18,405 We all have our cell phones. 139 00:08:19,245 --> 00:08:24,195 And yes, you turn off notifications, but for most people it just sits next to you. 140 00:08:25,485 --> 00:08:31,230 And what's fascinating is that putting the phone in another room made a 141 00:08:31,230 --> 00:08:36,480 significant difference in not having it front of mind and not distracting me. 142 00:08:36,480 --> 00:08:38,010 It's almost like fear of missing out. 143 00:08:38,015 --> 00:08:40,355 Okay, the phone's there, but I'm not touching it. 144 00:08:40,355 --> 00:08:43,445 I'm not focused on it, it's not dinging, I'm not getting the 145 00:08:43,445 --> 00:08:45,455 distractions, but it's still there. 146 00:08:45,455 --> 00:08:47,645 You just want to pick it up and look at it real quick. 147 00:08:47,645 --> 00:08:51,345 So having it in another room made a world of difference. 148 00:08:52,035 --> 00:08:56,070 And the other thing a lot of people talk about is AI these days. 149 00:08:56,120 --> 00:09:00,740 I do have a chapter in the book on the use of AI for productivity. 150 00:09:00,740 --> 00:09:04,900 And that, I think, is the biggest general use case as we talk 151 00:09:04,900 --> 00:09:08,850 about generative AI these days: it's a boon for society in terms 152 00:09:08,850 --> 00:09:10,920 of being much more efficient. 153 00:09:11,239 --> 00:09:14,729 Do you see that also as a tool to manage energy?
154 00:09:14,729 --> 00:09:21,599 I can imagine that sometimes we might choose, instead of doing a task the old 155 00:09:21,599 --> 00:09:26,639 fashioned way, ourselves with our own brain, to take it easy a little bit. 156 00:09:26,639 --> 00:09:29,309 We frame the task, but the AI does the heavy lifting. 157 00:09:29,309 --> 00:09:30,419 Is that how you see it? 158 00:09:31,784 --> 00:09:33,109 Yes, absolutely. 159 00:09:33,139 --> 00:09:39,729 I do use AI a lot in my everyday world in terms of marketing, 160 00:09:39,849 --> 00:09:43,249 product development, data analytics. 161 00:09:43,589 --> 00:09:46,534 We're implementing it in various capacities on some of the digital 162 00:09:46,534 --> 00:09:48,574 health projects that we're engaged with. 163 00:09:49,384 --> 00:09:55,834 As an example, I had to do user interviews with PhD 164 00:09:55,839 --> 00:09:57,634 experts in smell testing. 165 00:09:58,054 --> 00:10:04,549 Something that was a new area for me as an individual, but I interviewed dozens, 166 00:10:04,549 --> 00:10:09,439 if not hundreds, of people to figure out what a specific product should be. 167 00:10:09,439 --> 00:10:11,449 What should the business rules be? 168 00:10:11,449 --> 00:10:13,159 How will the system work? 169 00:10:13,728 --> 00:10:16,458 So I can do that, but it typically takes a couple hours 170 00:10:16,458 --> 00:10:18,168 to sit down and plan that out. 171 00:10:18,618 --> 00:10:24,978 Whereas with any of the Gen AI systems today, you just ask it to create a user 172 00:10:24,978 --> 00:10:28,728 interview, and then you give it a lot of background and context about what you 173 00:10:28,728 --> 00:10:32,958 want to do and what you're trying to accomplish, and it generates 10 questions, 174 00:10:33,168 --> 00:10:37,878 and you just say, okay, generate me 10 more, generate me 10 more, generate me 10 more. 175 00:10:37,908 --> 00:10:42,258 And within minutes, you have 50 questions that you can use to 176 00:10:42,258 --> 00:10:47,423 structure a user interview to figure out what makes them tick and how 177 00:10:47,423 --> 00:10:51,603 we can help them with an application focused on the idea of smell testing. 178 00:10:52,153 --> 00:10:54,373 It's very much an 80% solution. 179 00:10:54,373 --> 00:10:56,443 So I still go into those questions. 180 00:10:56,473 --> 00:11:00,143 Okay, what is a good flow as part of the conversation to 181 00:11:00,143 --> 00:11:02,323 make that work very effectively? 182 00:11:02,773 --> 00:11:07,143 But something that would normally take me a couple hours took minutes. 183 00:11:07,523 --> 00:11:13,358 Yeah, and from an energy perspective, it helps to have a thinking buddy. Obviously 184 00:11:13,358 --> 00:11:17,858 the ideal would be to have an actual human to brainstorm with, but that's 185 00:11:17,958 --> 00:11:21,138 limited to the most important use cases. 186 00:11:21,228 --> 00:11:25,475 And it's nice to have the ability to tap into a thinking buddy and research 187 00:11:25,475 --> 00:11:29,875 assistant that helps us go through whatever tasks we need to do.
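To make the workflow RJ describes concrete, here is a minimal sketch of batched interview-question generation. It assumes the OpenAI Python SDK with an API key in the environment; the model name, prompt wording, and context string are illustrative placeholders, not Estenda's actual setup.

```python
# Minimal sketch of the "generate me 10 more" interview-question loop.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical project background; in practice you'd supply your own context.
CONTEXT = (
    "We are designing a digital health application for smell testing. "
    "I will interview PhD experts in olfactory research to define the "
    "product's business rules and workflows."
)

def generate_questions(batch_size: int = 10, batches: int = 5) -> list[str]:
    """Ask the model for several batches of open-ended interview questions."""
    questions: list[str] = []
    for _ in range(batches):
        response = client.chat.completions.create(
            model="gpt-4o",  # any capable chat model works here
            messages=[
                {"role": "system", "content": "You are a UX research assistant."},
                {"role": "user", "content": (
                    f"{CONTEXT}\n\nGenerate {batch_size} open-ended user "
                    "interview questions, one per line. Do not repeat these "
                    "existing questions:\n" + "\n".join(questions)
                )},
            ],
        )
        # Keep non-empty lines; strip any leading list numbering.
        questions += [
            line.strip(" -.0123456789")
            for line in response.choices[0].message.content.splitlines()
            if line.strip()
        ]
    return questions

if __name__ == "__main__":
    for q in generate_questions():
        print(q)
```

Feeding the previously generated questions back into each prompt is one simple way to approximate that loop without collecting duplicates; the ordering and flow of the final interview still comes from the human, per RJ's "80% solution" point.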
188 00:11:30,335 --> 00:11:33,845 But I did want to pick your brain more generally now, since you've branched 189 00:11:33,845 --> 00:11:36,215 out into the digital health space. 190 00:11:36,305 --> 00:11:38,075 And that's your specialty, of course. 191 00:11:38,375 --> 00:11:42,725 I would like to understand what are the use cases that 192 00:11:42,725 --> 00:11:44,705 have already seen wide adoption. 193 00:11:45,255 --> 00:11:48,315 So on the traditional side of things, I think there are already quite a 194 00:11:48,315 --> 00:11:53,275 lot of machine learning algorithms used to augment and complement 195 00:11:53,375 --> 00:11:55,475 radiologists in detecting tumors. 196 00:11:55,485 --> 00:11:58,215 Identifying patterns, of course, with traditional machine learning. 197 00:11:58,965 --> 00:12:04,575 And on the Gen AI side of things, I think the first thing that comes to my mind in 198 00:12:04,635 --> 00:12:12,365 healthcare is assistants that doctors can use to reduce the amount of paperwork they 199 00:12:12,395 --> 00:12:14,765 need to do after the clinical encounter. 200 00:12:14,765 --> 00:12:17,555 So instead of having to go into the office and write up everything, 201 00:12:17,555 --> 00:12:23,315 they can talk to the AI as they're doing the procedure or whatever 202 00:12:23,315 --> 00:12:27,815 they're involved in, and get most of that documentation done on the fly. 203 00:12:28,265 --> 00:12:32,075 First of all, before we move on to other use cases: are those, in 204 00:12:32,075 --> 00:12:35,745 your opinion, established already, or is that still quite early? 205 00:12:36,155 --> 00:12:39,335 I am glad you mentioned machine learning, because I think of that 206 00:12:39,365 --> 00:12:41,705 very often as the forgotten AI. 207 00:12:41,820 --> 00:12:45,840 It has been around for a while and is making an impact. 208 00:12:46,270 --> 00:12:51,460 And its use in image analysis is probably the top use case from 209 00:12:51,460 --> 00:12:52,960 a machine learning perspective. 210 00:12:53,440 --> 00:12:59,620 'Cause a trained expert can look at a breast image, a brain image. 211 00:13:00,080 --> 00:13:03,920 We do a lot of work in diabetic retinopathy, which is looking at the 212 00:13:03,920 --> 00:13:07,820 retina at the back of your eye for detection of diabetic retinopathy. 213 00:13:07,820 --> 00:13:10,160 It's the leading cause of preventable blindness. 214 00:13:10,160 --> 00:13:14,440 So if you find evidence of it, then you can make a difference 215 00:13:14,440 --> 00:13:15,400 in that person's life. 216 00:13:15,400 --> 00:13:19,180 And so that's the first use case, fairly well established from a 217 00:13:19,180 --> 00:13:21,850 scientific computing perspective. 218 00:13:22,215 --> 00:13:26,865 The FDA here in the US, which is the regulatory body that has to approve 219 00:13:26,865 --> 00:13:31,250 these algorithms, these machine learning systems, I think it's close 220 00:13:31,250 --> 00:13:34,880 to 900 approvals now over the last decade or so. 221 00:13:35,180 --> 00:13:38,780 The vast majority of them, probably 80, 85%, are related to 222 00:13:38,780 --> 00:13:40,245 machine learning and imaging. 223 00:13:41,370 --> 00:13:44,870 It's a very well-established use case. 224 00:13:44,870 --> 00:13:49,295 They can do better than the practitioners, and they don't get tired. 225 00:13:50,175 --> 00:13:51,790 Call back to that idea of energy. 226 00:13:52,040 --> 00:13:56,180 If you're looking at a hundred images over a couple hours, you are gonna get tired. 227 00:13:56,180 --> 00:13:56,900 You're human. 228 00:13:57,440 --> 00:13:58,730 The computer doesn't. 229 00:13:59,840 --> 00:14:05,860 Now, if we look at generative AI, which is the latest craze today, it's barely two 230 00:14:05,860 --> 00:14:13,600 years old, ambient listening is the area where it's being most quickly adopted.
231 00:14:14,050 --> 00:14:19,385 And that is your use case of taking notes and writing those charts, and 232 00:14:19,745 --> 00:14:22,235 doctors and nurses talk about pajama time. 233 00:14:22,865 --> 00:14:26,605 My daughter happens to be a physical therapist, and she called on her way 234 00:14:26,605 --> 00:14:30,175 home from work last night and she's like, yeah, I need to sit down and write 235 00:14:30,175 --> 00:14:32,365 notes for the 25 patients that I saw today. 236 00:14:33,505 --> 00:14:35,965 I'm like, but you know you're done with work now. 237 00:14:35,965 --> 00:14:38,890 And she's like, no, I gotta do these notes now. 238 00:14:39,130 --> 00:14:41,950 Unfortunately, she doesn't have the ambient 239 00:14:41,950 --> 00:14:43,390 listening technology in place. 240 00:14:44,140 --> 00:14:50,260 But the idea is to let the AI system listen to the conversation that 241 00:14:50,260 --> 00:14:55,915 you're having in the room and take the notes, to make the overall process, 242 00:14:55,915 --> 00:14:57,565 the encounter, much more efficient. 243 00:14:58,075 --> 00:15:03,475 And what I like about that approach is it lets the doctors, the medical practitioners, 244 00:15:03,475 --> 00:15:06,535 get back to why they got into medicine. 245 00:15:06,625 --> 00:15:11,275 They got into medicine to talk to people, to help people, to engage with them, 246 00:15:11,545 --> 00:15:13,705 not to type notes on a computer keyboard. 247 00:15:14,245 --> 00:15:17,225 You hear stories about a lot of doctors who just have their back 248 00:15:17,225 --> 00:15:20,315 turned to the patient because they're typing away on their keyboard. 249 00:15:20,345 --> 00:15:21,725 That's not why they got into medicine. 250 00:15:22,240 --> 00:15:23,770 They wanna help that individual. 251 00:15:24,040 --> 00:15:29,180 So this opportunity, in a non-invasive way, allows the patient and medical 252 00:15:29,180 --> 00:15:33,950 practitioner to talk to each other, have that engaged conversation, and the 253 00:15:33,950 --> 00:15:36,290 AI can pick up and generate those notes. 254 00:15:37,460 --> 00:15:41,900 Then from an efficiency perspective, it could also get into medical coding from 255 00:15:41,900 --> 00:15:45,020 a billing perspective and help out there. 256 00:15:45,020 --> 00:15:50,925 So the big use cases in healthcare today are really around the idea of 257 00:15:50,925 --> 00:15:55,085 how we gain efficiency from an administrative, back-end perspective.
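As a rough illustration of the ambient-listening idea discussed here, a scribe pipeline reduces to two steps: transcribe the encounter audio, then draft a structured note for the clinician to review. The sketch below assumes the OpenAI Python SDK; the model names, file name, and prompt are placeholders, not how any particular vendor's product works.

```python
# A minimal sketch of an ambient-listening scribe pipeline, not any vendor's
# actual product: speech-to-text over the visit audio, then an LLM drafts a
# SOAP-style note for the clinician to review and sign off on.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def transcribe_visit(audio_path: str) -> str:
    """Speech-to-text over the recorded patient encounter."""
    with open(audio_path, "rb") as audio:
        result = client.audio.transcriptions.create(model="whisper-1", file=audio)
    return result.text

def draft_note(transcript: str) -> str:
    """Turn the raw transcript into a draft clinical note (SOAP format)."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": (
                "You are a clinical documentation assistant. Draft a SOAP note "
                "(Subjective, Objective, Assessment, Plan) from the encounter "
                "transcript. Flag anything uncertain for clinician review."
            )},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Hypothetical recording file name, for illustration only.
    note = draft_note(transcribe_visit("encounter_recording.wav"))
    print(note)  # the clinician reviews and edits before anything enters the EHR
```

The design point RJ raises still applies: the output is a draft, and a human reviews it before it reaches the record.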
258 00:15:55,461 --> 00:15:59,339 In some of your posts you were mentioning digital therapeutics. 259 00:15:59,659 --> 00:16:05,089 Is that also an area where AI is enabling new ways of using them? 260 00:16:05,519 --> 00:16:10,469 Digital therapeutics is an interesting term 'cause it is relatively new. 261 00:16:11,014 --> 00:16:15,274 I think of it as a special class of digital health applications or 262 00:16:15,304 --> 00:16:21,239 systems. You can walk into a drug store or pharmacy and buy all sorts of 263 00:16:21,239 --> 00:16:24,989 medications, vitamins, things over the counter without a prescription. 264 00:16:25,589 --> 00:16:28,619 Think of that as your general digital health application. 265 00:16:28,619 --> 00:16:32,969 There are probably 300,000 of those out there, depending on how you count. 266 00:16:33,734 --> 00:16:37,064 A digital therapeutic takes that to the next level. 267 00:16:37,334 --> 00:16:41,144 It's comparable to an actual drug or medication that went 268 00:16:41,144 --> 00:16:46,544 through extensive clinical trials and clinical regulatory approval, 269 00:16:46,544 --> 00:16:48,504 which here in the US would be the FDA. 270 00:16:49,044 --> 00:16:51,384 So it's an elevated system. 271 00:16:51,574 --> 00:16:54,904 It has that evidence, and it's prescribed to a patient just like a 272 00:16:54,904 --> 00:16:56,374 medication would be, 273 00:16:56,374 --> 00:16:59,764 to improve their condition. 274 00:17:00,269 --> 00:17:03,089 Mental health is a huge area for this. 275 00:17:03,139 --> 00:17:05,429 Some new migraine systems were just developed. 276 00:17:05,429 --> 00:17:09,739 So it's instead of taking a drug, which has lots of side effects. 277 00:17:09,739 --> 00:17:13,609 I'm sure you've seen those commercials on TV where it's 278 00:17:13,609 --> 00:17:15,949 just like, who would ever take this drug? 279 00:17:16,859 --> 00:17:19,679 For five minutes they've gone off on all the potential side effects. 280 00:17:20,219 --> 00:17:24,839 A digital health app doesn't have those same issues. 281 00:17:24,839 --> 00:17:28,054 You're not taking a physical drug at that point. 282 00:17:28,054 --> 00:17:29,764 And they can make a difference. 283 00:17:30,304 --> 00:17:32,134 And that's the idea of a digital therapeutic. 284 00:17:33,064 --> 00:17:37,174 Unfortunately, it's a little challenged right now from a reimbursement 285 00:17:37,174 --> 00:17:39,574 perspective here in the US. 286 00:17:39,654 --> 00:17:43,734 On January 1, just a couple months ago, there were some new approvals and 287 00:17:43,734 --> 00:17:46,809 methods to get them paid for. 288 00:17:47,199 --> 00:17:50,259 So the idea is the doctor prescribes them like a drug to the patient, and 289 00:17:50,259 --> 00:17:51,789 the insurance company is covering them. 290 00:17:52,389 --> 00:17:56,379 So the company creating the digital therapeutic can get 291 00:17:56,379 --> 00:17:57,759 a higher reimbursement rate. 292 00:17:58,539 --> 00:18:01,449 The EU has done a lot better with this, I think. 293 00:18:01,449 --> 00:18:05,409 It's still in its early stages, but they've tied that regulatory 294 00:18:05,409 --> 00:18:08,649 approval to reimbursement pathways. 295 00:18:08,989 --> 00:18:10,669 Here in the US it was like, oh, you're approved. 296 00:18:11,134 --> 00:18:12,574 Now, how do you get paid for it? 297 00:18:12,794 --> 00:18:16,509 In Europe, they tied those things together, which is a much better approach. 298 00:18:16,559 --> 00:18:22,469 I saw some startups in the healthcare AI space offering 299 00:18:23,549 --> 00:18:26,154 automated checkups through Gen AI. 300 00:18:26,204 --> 00:18:30,104 Making sure that somebody who has been prescribed a certain medication takes the right 301 00:18:30,104 --> 00:18:35,254 dosage on the right schedule, which is generally something that I guess nurses would do. 302 00:18:35,254 --> 00:18:39,484 But nurses can only visit so many patients in one day, or call 303 00:18:39,484 --> 00:18:40,974 so many patients in one day. 304 00:18:40,974 --> 00:18:45,904 And that enables augmenting their work, if I understand correctly, to 305 00:18:45,904 --> 00:18:48,904 help patients follow through with the 306 00:18:49,294 --> 00:18:53,604 treatment that has been prescribed, in an outpatient environment, 307 00:18:53,604 --> 00:18:54,534 if they're at home or something.
308 00:18:54,924 --> 00:19:00,714 And so is that the next level from the digital therapeutics, where it's not 309 00:19:00,714 --> 00:19:06,404 just helping with prescriptions for chemical drugs, but also encouraging 310 00:19:06,404 --> 00:19:12,014 people to do specific behaviors that then create better health outcomes? 311 00:19:12,606 --> 00:19:13,086 Yes. 312 00:19:13,136 --> 00:19:20,456 That's the amazing aspect of what this Gen AI is going to make possible. 313 00:19:21,176 --> 00:19:23,246 In a lot of cases, it is already capable of it today. 314 00:19:23,696 --> 00:19:25,646 You do have to be cautious. 315 00:19:25,646 --> 00:19:31,736 There are notes of caution around using the generic AIs out there, the ChatGPTs, 316 00:19:31,756 --> 00:19:36,556 the Claudes, because they're not designed for clinical medical use. 317 00:19:36,856 --> 00:19:41,296 They are very capable and they will give you answers that sound really 318 00:19:41,296 --> 00:19:47,176 good, but you do have to be aware of biases and the idea of hallucinations, 319 00:19:47,176 --> 00:19:49,246 where the AI might make something up. 320 00:19:49,666 --> 00:19:51,736 Ultimately, it wants to please you. 321 00:19:52,246 --> 00:19:55,306 It's not gonna sit there and tell you, I don't know the answer. 322 00:19:55,336 --> 00:19:58,126 Like you or I having a conversation, I can be like, oh, I don't know. 323 00:19:58,716 --> 00:20:00,036 It's gonna make something up. 324 00:20:00,496 --> 00:20:03,046 And it sounds convincing, which is challenging. 325 00:20:03,046 --> 00:20:07,876 But in a mental health capacity: there is a mental health crisis 326 00:20:08,026 --> 00:20:08,686 globally. 327 00:20:09,136 --> 00:20:11,086 There are not enough professionals 328 00:20:11,646 --> 00:20:12,786 to help all those people. 329 00:20:13,536 --> 00:20:18,036 So yes, how can we implement these AI systems to help those people, to have 330 00:20:18,036 --> 00:20:20,346 those conversations, to be the companion? 331 00:20:21,136 --> 00:20:24,696 The use case that you were talking about is patient follow-up. 332 00:20:24,746 --> 00:20:26,006 Are they taking the medication? 333 00:20:26,006 --> 00:20:27,506 Are they having complications? 334 00:20:28,106 --> 00:20:32,966 The AI systems today are very capable of having that conversation with an 335 00:20:32,966 --> 00:20:38,816 individual, getting that feedback, and then analyzing that feedback, not just 336 00:20:38,816 --> 00:20:44,696 from a content perspective, but also from a tone and an emotional perspective, 337 00:20:46,286 --> 00:20:52,246 which is just another level. You or I, if we really knew that patient, over time 338 00:20:52,246 --> 00:20:54,436 could develop that type of awareness. 339 00:20:54,436 --> 00:20:55,726 Is this person sad? 340 00:20:55,726 --> 00:20:56,446 Are they angry? 341 00:20:57,391 --> 00:21:02,201 It usually takes a little bit more experience to develop that, but the 342 00:21:02,201 --> 00:21:07,811 AI systems can tag a conversation as "this patient is showing signs of 343 00:21:07,811 --> 00:21:12,311 depression", and then maybe you have a human follow up with them. But the 344 00:21:12,311 --> 00:21:16,971 computer doesn't get tired and can have conversations all throughout the day.
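A minimal sketch of that follow-up triage idea: analyze a patient's check-in reply for content (adherence, complications) and tone, and flag the conversation for human follow-up. This again assumes the OpenAI Python SDK; the model name, rubric, and escalation logic are illustrative only, not a validated clinical instrument.

```python
# Sketch of AI-assisted patient follow-up triage: content plus tone analysis,
# with a flag that routes the case to a human (e.g. a nurse) when warranted.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
import json
from openai import OpenAI

client = OpenAI()

# Illustrative rubric; real deployments would use a clinically validated one.
RUBRIC = (
    "Return JSON with keys: took_medication (bool or null), "
    "complications (string), mood (one of: neutral, anxious, sad, angry), "
    "needs_human_followup (bool), reason (string)."
)

def triage_reply(patient_reply: str) -> dict:
    """Classify a patient's check-in message for adherence, symptoms, and tone."""
    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},  # ask for machine-readable output
        messages=[
            {"role": "system",
             "content": "You triage patient check-in messages. " + RUBRIC},
            {"role": "user", "content": patient_reply},
        ],
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    reply = ("I took the morning dose but skipped tonight, "
             "I've been too dizzy to get out of bed.")
    result = triage_reply(reply)
    if result.get("needs_human_followup"):
        print("Escalate to nurse:", result.get("reason"))
```

The point, as in the conversation, is augmentation: the system scales the routine check-ins, and the flagged cases go to a human.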
345 00:21:18,121 --> 00:21:22,146 You know, a nurse that's doing follow-up on dozens of patients might have 346 00:21:22,146 --> 00:21:25,086 to constrain those calls to three or five minutes to get through all 347 00:21:25,086 --> 00:21:29,976 of those, whereas the AI doesn't have that constraint. So you think of 348 00:21:30,056 --> 00:21:34,346 your mother, who might be elderly, at home or in an assisted living facility. 349 00:21:34,976 --> 00:21:37,946 It's a great opportunity to engage with them and have a conversation. 350 00:21:38,376 --> 00:21:41,556 And so how do you use that AI to even go beyond, 351 00:21:41,586 --> 00:21:43,596 hey, have you taken your medications today? 352 00:21:43,956 --> 00:21:46,416 Hey, how are you feeling today? 353 00:21:47,026 --> 00:21:49,366 To engage in a much more meaningful conversation. 354 00:21:49,733 --> 00:21:55,133 It does raise so many questions about the nature of care, just generally, 355 00:21:55,133 --> 00:22:00,853 ethically speaking. How do we feel if a loved one, one of our family members, 356 00:22:01,313 --> 00:22:07,388 instead of getting the visits from a nurse that they 357 00:22:07,388 --> 00:22:09,968 used to get, now has an AI calling them? 358 00:22:10,618 --> 00:22:14,158 But at the same time, I think what you were saying was more describing 359 00:22:14,278 --> 00:22:16,228 augmentation rather than replacement. 360 00:22:16,228 --> 00:22:21,898 And earlier you were talking about ambient listening, and I believe it was 361 00:22:21,898 --> 00:22:28,243 in Fei-Fei Li's book where she was also talking about this technology, 362 00:22:28,243 --> 00:22:33,743 a few years back when it was just the beginning, and how some of the medical 363 00:22:34,373 --> 00:22:40,073 professionals were quite concerned about this being used as a system to monitor 364 00:22:40,073 --> 00:22:41,663 performance or something like that. 365 00:22:41,663 --> 00:22:45,773 That also raises quite a few ethical questions. 366 00:22:45,773 --> 00:22:50,988 And you are the expert that deals with people adopting this 367 00:22:50,988 --> 00:22:52,278 technology day in, day out. 368 00:22:52,518 --> 00:22:54,313 How do you think about those ethical questions? 369 00:22:56,013 --> 00:22:58,203 You definitely have to address them. 370 00:22:58,543 --> 00:23:03,673 You definitely have to be aware of them, and there is a lot of fear, 371 00:23:03,673 --> 00:23:08,563 uncertainty, and doubt when it comes to these AI systems today. 372 00:23:09,273 --> 00:23:13,353 So you do have to think along those ethical guidelines of how are 373 00:23:13,353 --> 00:23:16,173 we going to be using this system? 374 00:23:16,173 --> 00:23:17,523 What do we have to consider? 375 00:23:17,523 --> 00:23:20,733 What are the risks to patient care and wellness? 376 00:23:20,936 --> 00:23:24,956 AI generally stands for artificial intelligence, but as you mentioned 377 00:23:24,956 --> 00:23:28,436 a minute ago, I really think of it as augmented intelligence. 378 00:23:28,706 --> 00:23:34,856 How do we use it to make our systems and processes better than they are today? 379 00:23:35,576 --> 00:23:38,306 And that's a fascinating possibility. 380 00:23:38,306 --> 00:23:43,096 So I don't want it to replace the nurse if someone's in 381 00:23:43,096 --> 00:23:44,506 an assisted living facility.
382 00:23:44,506 --> 00:23:49,876 That human contact is very important, but it's fascinating, because when the 383 00:23:49,876 --> 00:23:56,806 AI systems are compared against human interactions, time and time again they 384 00:23:56,806 --> 00:24:04,036 come back showing the AI as more empathetic in that conversation, in that note 385 00:24:04,156 --> 00:24:06,556 that was sent via email or text message. 386 00:24:06,556 --> 00:24:10,466 A lot of that, I think, is just that we're human, we're tired at the end of the day, and 387 00:24:10,466 --> 00:24:15,836 we only have so much capacity, so much energy, to apply to every one of these conversations. 388 00:24:16,316 --> 00:24:19,886 So it is an opportunity to improve that relationship. 389 00:24:19,886 --> 00:24:24,146 I still want the nurse checking on my mother and having that human interaction. 390 00:24:24,536 --> 00:24:28,551 But at some point, it's gonna be irresponsible to not take 391 00:24:28,551 --> 00:24:33,156 advantage of these tools and these technologies, because they can do good. 392 00:24:33,186 --> 00:24:34,806 They can make a difference. 393 00:24:35,316 --> 00:24:40,146 So yes, cautionary notes: we need to be aware of the risks, the biases that 394 00:24:40,146 --> 00:24:41,556 are inherent in these technologies. 395 00:24:41,971 --> 00:24:47,011 And the bias is there because it learned from information that we as 396 00:24:47,011 --> 00:24:50,011 humans created, and we as humans are biased. 397 00:24:50,011 --> 00:24:55,261 It's much easier to talk to an AI and help identify its biases than it is with you and me. 398 00:24:55,291 --> 00:24:57,571 We're generally not aware of our biases. 399 00:24:57,731 --> 00:25:02,921 That self-reflection and introspection is difficult, it's challenging, and 400 00:25:02,951 --> 00:25:07,041 do you do it in every moment of the day as you're providing care and treatment? 401 00:25:07,041 --> 00:25:11,841 You have the best of intentions to do the best job you can for every patient 402 00:25:11,841 --> 00:25:13,491 you see as a healthcare professional. 403 00:25:14,001 --> 00:25:18,811 But your history and situations are going to impact that. 404 00:25:19,391 --> 00:25:23,141 And that can be tuned out of the AI systems. 405 00:25:23,651 --> 00:25:27,821 Then there's the other thing, obviously: mistakes. We call them hallucinations 406 00:25:27,821 --> 00:25:30,971 these days, but it's basically an AI system making something up. 407 00:25:31,631 --> 00:25:34,781 But doctors, humans, we are not perfect. 408 00:25:35,381 --> 00:25:37,691 There's a reason for second opinions. 409 00:25:38,211 --> 00:25:42,571 And just as you talk to a colleague, you can also talk to the AI systems to get an 410 00:25:42,571 --> 00:25:49,161 informed opinion, which, interestingly, can be aware of the vast volumes of research 411 00:25:49,161 --> 00:25:51,351 that are generated each day, each year. 412 00:25:51,771 --> 00:25:53,361 You and I can't keep up with that. 413 00:25:53,701 --> 00:25:58,651 It's getting us to ask very unexpected questions about our 414 00:25:58,651 --> 00:26:00,056 own cognitive capabilities. 415 00:26:00,356 --> 00:26:05,464 You mentioned our emotional intelligence, our ability to read 416 00:26:05,464 --> 00:26:07,654 other people's emotions, to sympathize. 417 00:26:08,254 --> 00:26:11,854 We thought, I guess a few years back, that this was purely human.
418 00:26:11,854 --> 00:26:14,974 No way that artificial intelligence could get there. It does raise 419 00:26:15,024 --> 00:26:18,894 a number of questions about where we stand. 420 00:26:18,894 --> 00:26:21,233 But maybe that's a conversation for another day. 421 00:26:21,233 --> 00:26:26,264 I really wanted to get your perspective on the future of healthcare. 422 00:26:26,454 --> 00:26:31,384 You're involved in many projects using machine learning or artificial 423 00:26:31,384 --> 00:26:36,324 intelligence, or other technologies, but where do you see the field a few 424 00:26:36,324 --> 00:26:37,674 years from now, let's say five years? 425 00:26:38,231 --> 00:26:42,156 I think there's amazing potential; we're not even 426 00:26:42,156 --> 00:26:43,746 aware of what's gonna happen. 427 00:26:43,746 --> 00:26:49,176 My favorite quote these days is from Bill Gates: that we as humans tend to 428 00:26:49,356 --> 00:26:55,716 overestimate what's possible in two years and underestimate what's possible in 10. 429 00:26:56,196 --> 00:27:02,086 We have no idea what this technology is going to be capable of in 10 years. 430 00:27:02,766 --> 00:27:05,196 There's a lot of fear of it replacing us. 431 00:27:05,226 --> 00:27:06,936 I still think it's augmenting us. 432 00:27:07,566 --> 00:27:09,636 We still very much want that human touch. 433 00:27:09,636 --> 00:27:14,896 Even if the AI has empathy, we do wanna keep that human touch in it. 434 00:27:15,436 --> 00:27:17,086 It's gonna change things for the better. 435 00:27:17,536 --> 00:27:23,176 It is going to make a difference in the health and wellness of people globally. 436 00:27:23,756 --> 00:27:28,806 There are aspects of technical equity: can everybody get access to these systems? 437 00:27:29,601 --> 00:27:34,401 Those are challenges, yes, that we have to work on, but it is going to make a difference 438 00:27:34,491 --> 00:27:36,376 in the lives of millions of people. 439 00:27:36,881 --> 00:27:40,211 And as we're about to wrap up, I usually ask a couple of questions at the end. 440 00:27:40,211 --> 00:27:44,861 So the first one is: what is a book, tool, or habit that has made the most 441 00:27:44,861 --> 00:27:46,601 impact on you in the last 12 months? 442 00:27:46,663 --> 00:27:48,013 AI is right up there. 443 00:27:48,113 --> 00:27:52,918 I can't overemphasize how much of an impact AI has made. 444 00:27:53,458 --> 00:27:56,728 But one of the books that I keep coming back to and 445 00:27:56,728 --> 00:28:00,118 am starting to refer people to, besides, obviously, my own book, 446 00:28:00,643 --> 00:28:04,733 is one I discovered called When, from author Dan Pink. 447 00:28:04,733 --> 00:28:05,753 A really good book. 448 00:28:06,103 --> 00:28:09,913 We all think about why we do stuff or what we have to do. 449 00:28:10,363 --> 00:28:15,133 But his book does a deep dive into when it makes sense to do something, 450 00:28:15,463 --> 00:28:19,868 whether that's seasonally, what time of the year, or what time of day. 451 00:28:20,413 --> 00:28:22,783 And that's what tied into my energy thinking. 452 00:28:22,783 --> 00:28:26,033 He was talking about, when do you do a task? 453 00:28:26,033 --> 00:28:29,243 And for me it's, okay, I'm basing that decision on my energy. 454 00:28:29,973 --> 00:28:30,423 Wonderful. 455 00:28:30,423 --> 00:28:32,973 We'll make sure to provide the link in the description, next, 456 00:28:32,973 --> 00:28:34,103 of course, to your project.
457 00:28:34,743 --> 00:28:38,903 And the final question, mirroring what you were saying just a moment ago. 458 00:28:39,953 --> 00:28:43,133 We've discussed a lot of things that are changing, but do you 459 00:28:43,133 --> 00:28:47,123 see maybe one thing that is gonna remain the same 10 years from now? 460 00:28:47,623 --> 00:28:48,763 What is not gonna change? 461 00:28:48,793 --> 00:28:51,343 I think AI is going to change so much. 462 00:28:52,273 --> 00:28:56,323 I don't know that we'll recognize the world a few years from now, honestly. 463 00:28:56,828 --> 00:29:03,938 I think of AI as like the printing press, electricity, computing technology, 464 00:29:03,968 --> 00:29:07,658 the internet, smartphones, social media. 465 00:29:07,748 --> 00:29:12,788 Each of these changed society significantly. 466 00:29:13,383 --> 00:29:16,183 Absolutely. Super exciting for some of us. 467 00:29:16,648 --> 00:29:21,088 Maybe super scary for some others in our audience, but in any 468 00:29:21,088 --> 00:29:23,433 case, RJ, it's been a pleasure! 469 00:29:23,847 --> 00:29:24,782 Thank you, Alex. 470 00:29:25,002 --> 00:29:27,312 The core points of today's conversation are simple. 471 00:29:27,552 --> 00:29:31,752 Use AI to give time back to people, such as with ambient-listening digital 472 00:29:31,752 --> 00:29:36,162 scribes that capture clinical notes during the visit so doctors aren't up 473 00:29:36,162 --> 00:29:38,322 late doing "pajama time" paperwork. 474 00:29:38,752 --> 00:29:42,562 And not deploying AI tools that clearly help patients and expand access to 475 00:29:42,562 --> 00:29:47,632 healthcare may be as irresponsible as failing to address bias or safety issues. 476 00:29:47,892 --> 00:29:51,312 If you'd like to see the projects behind these lessons, check out the 477 00:29:51,312 --> 00:29:54,382 case studies RJ shares at estenda.com. 478 00:29:54,619 --> 00:29:58,939 As always, we have more exciting topics and guest appearances lined up, so 479 00:29:58,939 --> 00:30:03,799 stay tuned for more tales of innovation that inspire, challenge, and transform. 480 00:30:04,699 --> 00:30:05,539 Until next time. 481 00:30:06,009 --> 00:30:06,309 Peace. 482 00:30:07,683 --> 00:30:10,023 Thanks for tuning in to Innovation Tales. 483 00:30:14,763 --> 00:30:18,453 Get inspired, connect with other practitioners, and approach the 484 00:30:18,453 --> 00:30:20,493 digital revolution with confidence. 485 00:30:21,423 --> 00:30:24,813 Visit innovation-tales.com for more episodes. 486 00:30:26,153 --> 00:30:27,173 See you next time.