Alex: Have you noticed the growing divide between those who are thriving with artificial intelligence and those who are being left behind? Today my guest dives into what's driving this AI divide and why empowering everyday users is critical for a fair and inclusive future. He goes on to share practical advice and real-world examples to help you support those who are struggling to embrace AI and close the gap. My name is Alexandre Nevski, and this is Innovation Tales.

Artificial intelligence: revolutionary tech, or the ultimate headache? If you're a business leader trying to navigate this AI craze, you might feel a bit like you're standing in the cereal aisle with way too many options. Enter Justin Simpson, co-founder of Here.Now.AI, whose weekly newsletter helps over 20,000 readers make sense of AI without drowning in jargon. His mission? To help people and businesses use AI to work smarter, not harder. In today's episode, Justin tackles the AI divide, a problem where some people are zooming ahead while others are stuck at the starting line, wondering if this thing is worth their time. He'll share practical tips, real-life examples, and a no-nonsense approach to help you lead your team. Without further ado, here's my conversation with Justin Simpson. Justin, welcome to the show.
Justin: Good morning, Alex. Thank you for having me.

Alex: Oh, I'm delighted to have you. Your work with the Here.Now.AI newsletter has focused on making artificial intelligence accessible to everyone, not just the tech-savvy. But I'd like to start with a bigger-picture question: why do you see closing the AI divide as such a critical issue?

Justin: I lived through the first age of technology, when a lot of the benefit of that technology accrued to a small group of people. And once AI came out and one could see its power and the speed at which it's changing, I had great concern that, even more so and even more quickly, the benefits would accrue to those who understood and could leverage this new technology, versus those who were left behind. And I'm passionate that people not be left behind, because it's such a liberating and powerful technology. I think it has the ability to help across a broad spectrum of society, but society needs to embrace it, or at least play with it and engage with it, in order to get those benefits.

Alex: I definitely see where you're coming from, but somebody might argue that the AI divide isn't as urgent because tools are becoming more and more intuitive. How would you respond to that perspective?
Justin: Well, the consumer world, or space, if you like, is becoming more accessible, and that will continue. But in the workplace, we just had Meta's Mark Zuckerberg saying he expects his software engineers to be 30 percent more productive in 2025 because they're leveraging AI tools. The implication is that if you're not 30 percent more productive, you can expect to be replaced by someone who is. So I think there are a number of factors that could change dramatically. Take small businesses, for example. I engage with those even here in San Francisco, which is the world capital of AI, if you like, and there are small businesses that aren't using AI. I explain to them: this is a marketing department, a client communications department, a customer service department, all of which you can set up and use easily and quickly, now. If you don't, someone else is going to. And Fei-Fei Li, a professor of AI at Stanford's Center for Human-Centered Artificial Intelligence, sometimes called the godmother of AI, said your job won't be taken by AI, but your job could be taken by someone who knows how to leverage AI. I think people who are sitting back and waiting for it to become easy and accessible risk missing out.

Alex: I see.
So let's move on to the newsletter. Can you share your story, how you came up with the idea?

Justin: Yeah. Sitting here in San Francisco, of course, you see all the exciting stuff, the cutting edge, what's coming next, and that partly informs this urgency that I have. So I started the newsletter to say: for non-technical readers, let's find ways to help people understand what AI is, because there's a lot of big conceptual discussion about the long-term future and artificial general intelligence and all sorts of slogans and slang. But for most people, the best way to experience what AI is is to do something with it. So every week I put out three or four different applications that are intended to improve your life: at work, at home, or at play. Some of them aren't that serious. To me, playing around with creating AI music is, for most people, fun. For a few it's a career, something to be taken seriously, a creative endeavor, et cetera. But for most people, being able to write a few sentences of description and have it turned into an engaging song is a wow moment that helps them understand this is a powerful technology, rather than something that's going to transform their lives.
So I put out a somewhat random selection of those. To do that, I look at 30 or 40 apps every week to decide what seems to fall in the category of useful.

Alex: You said 30 or 40, but it can't be random. You probably have some way of selecting the ones that you consider would be the most interesting for your readers.

Justin: Yeah, there's an awful lot out there, and some of them are quite obscure, so they would appeal to a very niche audience. Some of them are put out by groups that don't identify themselves. I won't put something in the newsletter if I don't know exactly who's behind the software, and some folks, perhaps in their rush to get it out, forget to tell you who they are and what their story is. Some will expect you to pay first and then discover whether you like the software, so I generally don't include those; what I do include has a free trial or a lower tier that people can experiment with. And then there's the general test of thinking: if I'm a non-technical person, would this be interesting to me? Could I say, oh, I could try this in my professional life or in my home life?
Alex: And so can you share maybe an example or story from a reader who applied something they learned from your newsletter?

Justin: There are two, if I may. One is a fun one. I put out a story about an app which takes a photograph of a family member, or of yourself, and turns it into a Pixar-style cartoon. So many people came to me and said, I know I should be focusing on the things that help me at work, but that was such fun, I spent way too long doing that. And I think that's a success as well, because it's helping people understand the power of AI in its own way. But for something that's very useful: I put out a piece about a tool which I use myself all the time. I'm not gifted with high-speed typing, and so I saw an app called Flow, which lives natively on my Mac; I'm in the Mac environment. Any time I hold down a key, it takes my voice and records it. There are a lot of transcription apps, but this AI filter recognizes that I'm human. So I say "um" or "ah," or I correct myself; I say, I'll see you on Tuesday. No, Wednesday.
And what comes out, very quickly, at up to a couple of hundred words a minute, is a coherent version of what I've said, and that can be dropped in wherever I'm writing: a script for an AI generation, a letter, an email, et cetera. I wrote about that, and a lot of people have told me they've adopted it, because they also, like me, are not high-speed typists.

Alex: That's an amazing example, a very useful one. Can you think of anything more, let's say, controversial, or maybe an unexpected reaction you've received from something you shared in the newsletter?

Justin: Not so much from a specific app, but I think there is a lot of concern about AI in general, and about the fact that I take an optimistic view in the newsletter. I'm trying to encourage people to use it, so I'm not inclined to talk about the long-term consequences of AI, and I know several people have pushed back and said, why don't you engage with the risks of job loss, or the threat to different industries, et cetera? I'm aware of those, and I'm consciously thinking about what the implications for society are. But I choose, in my newsletter, not to engage with that. It's not intended to be a holistic philosophical view of AI. It's a very hands-on, light, experimental version.

Alex: Yeah, that makes sense.
And so if there were a simple exercise or experiment that our listeners could try today to understand the potential of AI, or to learn about a new tool, what would you recommend?

Justin: So for starters, I think ChatGPT is the main encounter with AI that everybody has had. And I encourage people to go and play with it, because it's not the same ChatGPT they might have played with six months ago. It's a very much more sophisticated animal. Even without leaving the ChatGPT environment now, for example, you can set up projects, where you keep a set of information, documents, and data related to a specific topic, and you have a conversation with just that particular topic. So it's not the general knowledge that ChatGPT has of the world; it's just the knowledge of your particular set of files and links, et cetera. You can also ask ChatGPT to send you a reminder. For example, every morning I get from ChatGPT a list of the top news stories in technology from the previous day, and I get a weather forecast for the day. Those are called tasks, and this is the beginning of what we're going to see a lot of in 2025: not just AI as a source of information, but AI as a partner that executes for you, on your behalf.
In the beginning, what I'm describing are quite simple tasks, but it's going to be a matter of weeks or months before regular folk, not just skilled programmers, can ask an AI to go away and research a holiday, including travel and hotels, and come back with a plan. Probably at this stage, people don't want to give it their credit card information, so it'll come back with a "here's what I'd book; if you like it, put in your credit card and it'll finalize." But that capability is coming very quickly. So I think it's about exploring every week: putting aside half an hour to play with ChatGPT or Gemini or whatever your preferred version is, and just pushing those technologies a little bit, in the hope that one of them sparks and you say, aha, I can use this at work; this would be a good feature.

Alex: I could contribute, maybe, that you could use ChatGPT to learn languages, in conjunction with maybe a Duolingo or something like that.
But if you're like me and need a lot of practice, which, I don't know, I avoid as much as possible when it comes to learning foreign languages, you can use the audio interface of the ChatGPT app to have a very natural conversation. Essentially, at the beginning you prompt the model to act as your kind of learning buddy, or maybe even a tutor, and you just say: I want to practice, in my case, Spanish; have a conversation with me. It's a use case which, to me, unlocks something. It's not replacing humans, but it creates a situation where, in this specific case, for me, there's a lot of stress involved when I'm practicing a foreign language, and there's much less stress when practicing with an AI. So, can you think of, once a person has become relatively familiar with ChatGPT and the new features that are being rolled out, what would be the next step? For somebody who is not techie, as you said, but is feeling the AI divide.

Justin: Somebody could think through their life and ask, where is there a highly repetitive task, for example, and can AI help me by replacing it? That would be one, and it might be at work or at home. Some people are comfortable, for example, saying, I'm going to let AI screen my emails and identify those which are important in some way.
For most of us, our life is such a cornucopia of different things that your home email is probably not a good candidate for scanning and summarizing by AI. But if you're at work and you have a work email address that is just customer feedback coming in, complaints or compliments or questions, et cetera, it's entirely possible, quite straightforwardly, to set up a filtering mechanism, particularly if you're already on Microsoft and you have Copilot, or if you're on the Google ecosystem and you have Gemini. I think experimenting with that is a fun way to understand the technology, and also to see the possibilities, so that you might say, great, I'm going to go ahead and, probably with a little bit of help, but possibly by yourself, implement something like that. But you do it at the individual level first, to see that you're comfortable with the output. That would be one. And then there's something analogous to your learning languages, which I think is an excellent example, by the way, and a really fun way to leverage this conversational ability of ChatGPT. I love the fact that in its latest iteration you can interrupt it, you can change its direction. You don't have to wait for it to finish an answer and then interact.
It's very much more conversational. But also, Google has a fantastic product called NotebookLM. And this could be for any topic that you want to learn. It could be something highly complex, with a lot of research material, websites, documents, et cetera. Or it could be something very straightforward: somebody has forwarded you a hundred-page PDF and you don't really want to read through it without understanding. In NotebookLM, you upload whatever the content is, or you point to the website or the YouTube talk, et cetera. Then you can engage with it however you choose. You can ask it to prepare a summary; you can ask it to prepare a Q&A. Most excitingly, though, you can ask it to create a podcast in which two people talk to each other, in a very engaging way, about the content. For somebody who wants to learn but is not really that engaged, it's a fantastic way to do that. And now, not only that, but you can push a button and the podcast stops, and one of the hosts says, I think we have a question from the audience here, and you interject your question.
It can relate to something they've said, or to the topic, and they answer your question live, in real time, which I think is an extraordinary demonstration of AI capability, but is also addressing ways to learn for people who don't like to absorb information just by reading the 100-page PDF, by doing it the old-fashioned way.

Alex: I'm so glad, Justin, that you brought up Notebook. I think it's definitely in that set of tools that should be in that second step, after getting familiar with chatbots. From my personal experiences with Notebook, I would recommend people try it, especially with YouTube, since you mentioned it, because YouTube is full of content for learning new skills, very practical skills, very often. For example, I was helping a friend of mine, a tattoo artist. She wanted a bit of help to go from image to video, essentially: images of her work made into clips for social media, et cetera. And there is a fair amount of tools out there. For some of us, at least in my case, I have been using YouTube to learn about a lot of these tools. Obviously, I don't know every one of them; just like other people, I have to do my research. And that research often took me to YouTube.
But it takes a lot of time, and a bit of emotional control not to click from the thing that you're researching to all of the cat videos that YouTube recommends, or whatever it is. So the pro tip I would have for our listeners is: if you're researching on YouTube, rather than listening to and watching all of the videos on a specific topic, grab their URLs, drop them into NotebookLM, and then have the podcast give you a summary, but one in which you can inject the questions that you actually have. In my case, working with this friend, I was able to get the system to generate a podcast specifically on, if you're a tattoo artist, this is what you can do with your designs. And I think that's one of those, you referred to them as aha moments before; I think that's definitely one of them.

Justin: Yeah.

Alex: I would want now to move on, because you've already mentioned using these tools at work at some point, once you're relatively comfortable with them. So I would like to move also to the activity that you have with businesses. I know you've been offering and running AI workshops. Can you tell us a little bit more about that?
Justin: Yeah, I think there's quite an interesting phenomenon going on at the moment. Surveys are obviously changing as quickly as the technology is changing, so you have to take a snapshot and assume that it's representative; it's just to give you an idea. But the finding, which I think has been repeated, and which in my experience is correct, is that fewer companies have worked out what they want to do as a company with AI than there are people wanting to experiment with and leverage AI. So you have this gap between what the corporate policy is, how should we use AI, versus the individuals who are going and saying, aha, I can make my life quicker, faster, more efficient by using these tools. What I'm trying to help with is closing that gap. And not closing it by coming at it like we might have in my earlier career, when, if a new technology came in, it was imposed from the top down. There's a new version of Microsoft Excel, and everybody in the entire company has to use exactly the same version, so we're inevitably two versions behind what's going on, but that's what you're required to do: you will learn it and you will do this. Very top-down, very hierarchical. The liberating power of AI is that it's not.
It's a bottom-up skill set. It's you looking at your work day and saying, gosh, if I could automate my email responses, or summarize documents, or prepare my PowerPoint presentations more quickly and more efficiently, that's going to make me better at my job. That's the nature of the liberation that comes from this AI technology. So those two have to be married together, and the workshops are an attempt to do that: to try to help everybody think through strategically, for a particular type of business, what are the sorts of things that AI lends itself to that will make the business better and more efficient, versus things where it might introduce risk? The poor guy in finance who thinks he's going to get a great boost in productivity by uploading confidential corporate financial information into ChatGPT probably needs to be constrained a little bit. But the person who's writing customer service responses using ChatGPT probably doesn't, and can leverage it, and the company as well as the individual can benefit from that. So it's thinking through that sort of task matrix of risk versus benefit, and thinking through ethical policies. Some AI chatbots already try to build in some sort of ethical limitations. Some, on the other hand, believe in entire freedom.
And so you can get some quite unpleasant stuff floating around in there. So what should my corporation do to think about that? And then thinking about, should we standardize on a particular system, or should we have a number of different approaches? Because the larger the corporation, the more the historic inclination is to pick one standard. But part of the AI revolution, and we're talking about the speed here, is that from one week to the next, one model is faster, better, quicker, more efficient, et cetera. So picking one in January of 2025, you really have no idea, by the time that system comes online at the end of 2025, whether what you've picked is correct. So you have to think about how you architect for that. The workshop incorporates all of that. And then also, there are still folk who are uncomfortable with the idea of prompting. In their minds, ChatGPT is a sort of Google replacement: I ask it a question, I get an answer, and that's it, and the answer is either good or it's bad. But actually, it's an interaction. It's a conversation. So part of the workshop is taking something where you start with a bland, generic question and get a bland, generic answer.
And over the course of 40 minutes, thinking about how to prompt to get the best out of ChatGPT, or Google, or whatever it is that you're using, and looking at the answer at the end and saying, gosh, it is better if I structure my prompt, if I position it in certain ways, if I make it clear what I want. If I say, write an article like The New York Times, or in the style of Axios, I'm going to get very different answers. So we play with those sorts of tweaks to help people understand it's not just a one-off: ask a question, get an answer.

Alex: That sounds very comprehensive, because you've talked about the risks and the guardrails or other measures that organizations can put in place, the benefits depending on the type of industry you're in, and then all the way down to how somebody uses prompts or goes through a conversation, and how that's different from doing a web search. So yes, it sounds very comprehensive. How much time do people invest to go through these workshops?

Justin: Generally, three hours is a sort of bare minimum, where you can cover at a high level what I've just described to you, but do it in a hands-on way. I really hate the idea of talking for three hours, almost as much as somebody hates the idea of listening to three hours of conversation.
357 00:21:48,395 --> 00:21:52,445 So three hours of engaged workshop really sort of covers those areas, 358 00:21:52,445 --> 00:21:56,205 and then a full day gives you more opportunity to dive into some 359 00:21:56,205 --> 00:22:00,335 of the topics in more detail and have more calibration to the audience, 360 00:22:00,365 --> 00:22:04,105 fitting the relevant content to their specific circumstances. 361 00:22:04,405 --> 00:22:04,615 Alex: All right. 362 00:22:04,615 --> 00:22:10,195 And I guess the various pieces that you were listing earlier are also 363 00:22:10,475 --> 00:22:12,515 somewhat of a menu, right? 364 00:22:12,605 --> 00:22:15,785 It's not that in three hours you necessarily do all of them. 365 00:22:15,805 --> 00:22:15,965 Yeah. 366 00:22:16,265 --> 00:22:16,835 Justin: Exactly. 367 00:22:17,105 --> 00:22:17,575 Exactly. 368 00:22:18,065 --> 00:22:18,335 Yeah. 369 00:22:18,535 --> 00:22:21,195 Alex: What was the most surprising transformation you've seen in the 370 00:22:21,205 --> 00:22:22,875 participants in one of the workshops? 371 00:22:23,175 --> 00:22:26,115 Justin: I work with a security business, which is 372 00:22:26,645 --> 00:22:28,895 basically run on an agency model. 373 00:22:28,895 --> 00:22:32,305 It's a guy who has contacts in the world of people who need 374 00:22:32,755 --> 00:22:35,215 security guards without taking them on as full-time employees. 375 00:22:35,225 --> 00:22:38,495 So he goes and gets the contracts and then places a security 376 00:22:38,495 --> 00:22:40,355 guard at specific locations. 377 00:22:40,655 --> 00:22:41,615 And it's a two-man show. 378 00:22:41,655 --> 00:22:46,025 His wife runs the books and the business, and he does the front office, 379 00:22:46,025 --> 00:22:49,405 if you like, selling the contracts and then placing the security guards. 
380 00:22:49,705 --> 00:22:55,335 And with a little bit of work from AI, he is putting out a newsletter 381 00:22:55,845 --> 00:22:59,465 which makes it look as if it's a 30-man corporate operation. 382 00:22:59,765 --> 00:23:01,115 Topics in security: 383 00:23:01,435 --> 00:23:03,895 should I have drones at my event? 384 00:23:03,915 --> 00:23:07,425 Things that are really engaging, really professionally done, 385 00:23:07,725 --> 00:23:11,455 and done by this two-man team with a lot of other things going on, because 386 00:23:11,595 --> 00:23:16,255 of the efficiency of AI that can create the content, formatted and 387 00:23:16,255 --> 00:23:20,325 presented and distributed in a very customer-friendly and efficient way. 388 00:23:20,625 --> 00:23:26,305 There's another very happy story of a person here in San Francisco who used 389 00:23:26,305 --> 00:23:29,935 to be a big radio personality working for one of the big stations and decided 390 00:23:29,945 --> 00:23:33,345 to go into the business of coaching: teaching people how to prepare for media 391 00:23:33,415 --> 00:23:37,375 interviews, teaching people how to do big public speeches, teaching people 392 00:23:37,375 --> 00:23:39,265 how to create a social media persona. 393 00:23:39,565 --> 00:23:43,715 And she will tell you that she doesn't start any of her engagements without 394 00:23:43,765 --> 00:23:47,985 first going in and having a conversation with ChatGPT about how to prepare. 395 00:23:48,025 --> 00:23:51,975 I'm going to talk to the executive of a corporation that does such and such, 396 00:23:52,635 --> 00:23:55,655 and he is interested in creating a social media personality. 397 00:23:55,655 --> 00:23:57,335 What are the ways that I might engage with him? 398 00:23:57,465 --> 00:24:02,575 Just for some extra ideas and creativity, some idea generation, and it becomes a habit. 
399 00:24:02,585 --> 00:24:05,715 So I think those are ways in which small businesses in those cases 400 00:24:06,015 --> 00:24:11,035 can appear much more responsive and effective as marketers. 401 00:24:11,640 --> 00:24:14,580 And there are other applications in that sort of small and medium-sized business 402 00:24:14,580 --> 00:24:17,935 which can also deliver, again, the aha moment that we're looking for. 403 00:24:18,480 --> 00:24:21,130 Alex: Wow, those are great examples. 404 00:24:21,760 --> 00:24:27,250 And so as we're about to wrap up, I usually ask for a book, maybe a tool or 405 00:24:27,250 --> 00:24:33,230 a habit that has made the most impact on you in the last 12 months, and why. 406 00:24:33,690 --> 00:24:39,730 Justin: Mustafa Suleyman, who's now head of AI at Microsoft, but before was one 407 00:24:39,730 --> 00:24:44,830 of the early pioneers, has written a book called The Coming Wave, which is a sort 408 00:24:44,830 --> 00:24:48,770 of analysis of the bigger picture that I don't write about in the newsletter and I 409 00:24:48,810 --> 00:24:51,880 don't generally talk about in my seminars. 410 00:24:52,180 --> 00:24:54,880 But I think it's important to understand. There are a few books coming 411 00:24:54,880 --> 00:24:59,440 out now on the topic, and it's important to get a grasp of what this means for 412 00:24:59,440 --> 00:25:02,420 society, because we're not talking about transformation for our grandchildren. 413 00:25:02,970 --> 00:25:04,790 We're talking about transformation in our lifetimes. 414 00:25:05,090 --> 00:25:06,290 I have more gray hair than you. 415 00:25:06,300 --> 00:25:09,290 My horizon, by definition, is a bit shorter. 416 00:25:09,895 --> 00:25:11,795 I'm seeing the transformation happen now. 417 00:25:11,795 --> 00:25:13,625 It's not something that I'm just thinking about. 418 00:25:13,925 --> 00:25:16,295 There is a negative side to that, which is job losses. 
419 00:25:16,585 --> 00:25:20,325 But there's also a very positive side, which is job creation, job 420 00:25:20,675 --> 00:25:26,545 reinvention, increased job satisfaction, et cetera. And it's starting to 421 00:25:26,545 --> 00:25:29,060 happen now. Because of human nature, 422 00:25:29,060 --> 00:25:32,710 it's not happening as quickly as the technology could allow, but everybody 423 00:25:32,730 --> 00:25:38,090 should have some idea, as well as some self-reflection: am I exposed 424 00:25:38,680 --> 00:25:41,890 in this way, and how could I benefit from it? 425 00:25:41,950 --> 00:25:46,450 And my hope is that with a little bit of thought and creativity, most 426 00:25:46,450 --> 00:25:50,490 folk can find a way to come out on the positive side of that, but I do 427 00:25:50,490 --> 00:25:56,380 know that disengaging or waiting or doing nothing is not the right answer. 428 00:25:56,380 --> 00:25:57,800 So the two go together. 429 00:25:57,810 --> 00:26:01,760 One is perhaps that book, The Coming Wave by Mustafa Suleyman, and the 430 00:26:01,810 --> 00:26:04,565 other is the habit of playing with AI. 431 00:26:04,825 --> 00:26:08,845 Experimenting, whether it's through my newsletter or just scanning the web or 432 00:26:08,845 --> 00:26:13,135 anything else, finding different ways to understand the power of the technology, 433 00:26:13,715 --> 00:26:17,405 and then hopefully thinking about how it might impact you in your life, at work or 434 00:26:17,405 --> 00:26:20,325 at home or at play, is not so bad either. 435 00:26:20,625 --> 00:26:23,475 Alex: All right, we'll make sure to link both the book and, of course, 436 00:26:23,475 --> 00:26:25,525 your newsletter in the description. 437 00:26:25,895 --> 00:26:30,605 And the final question is: is there something that you would expect 438 00:26:30,615 --> 00:26:32,745 to remain constant 10 years from now, 439 00:26:32,795 --> 00:26:33,165 and why? 
440 00:26:33,465 --> 00:26:36,735 Justin: Human creativity will be an essential component of this. 441 00:26:37,035 --> 00:26:39,645 I think the AI is going to get smarter and faster. 442 00:26:40,265 --> 00:26:47,160 I think more and more rote stuff will be done by AI agents rather than by humans. 443 00:26:47,660 --> 00:26:51,990 I think a lot of the material is going to be written by AI, but at the 444 00:26:51,990 --> 00:26:55,130 end of it, the stuff that's really going to matter is the stuff that 445 00:26:55,170 --> 00:26:59,190 isn't programmable, that just resides in the human brain and the human 446 00:26:59,210 --> 00:27:01,440 instinct and the human creative spirit. 447 00:27:01,920 --> 00:27:03,290 So to me, that's the constant. 448 00:27:03,290 --> 00:27:06,310 It's just a question of how it's engineered to maximize 449 00:27:06,310 --> 00:27:09,410 the benefit to the human rather than to the AI system. 450 00:27:09,710 --> 00:27:10,230 Alex: Brilliant. 451 00:27:10,800 --> 00:27:11,330 That's great. 452 00:27:11,490 --> 00:27:12,500 Thank you very much, Justin. 453 00:27:12,800 --> 00:27:13,320 Justin: Thank you, Alex. 454 00:27:13,320 --> 00:27:13,880 Great pleasure. 455 00:27:14,385 --> 00:27:16,285 Alex: Justin gave us a lot to think about today. 456 00:27:16,695 --> 00:27:20,835 From the growing AI divide to practical ways we can bridge it, he is adamant 457 00:27:21,070 --> 00:27:24,080 that AI isn't just for tech experts; it's for everyone. 458 00:27:24,548 --> 00:27:27,448 Staying engaged and experimenting with artificial intelligence 459 00:27:27,478 --> 00:27:29,288 is key to staying relevant. 460 00:27:29,898 --> 00:27:34,778 His insights challenge us to ask: how can we make AI not just a tool 461 00:27:34,778 --> 00:27:38,728 for ourselves, but a force for empowering those around us, our 462 00:27:38,728 --> 00:27:41,318 teams, communities, and organizations? 
463 00:27:41,622 --> 00:27:45,202 If you are ready to dive deeper into practical AI tools or want to bring 464 00:27:45,262 --> 00:27:48,902 actionable strategies to your workplace, check out Justin's newsletter. 465 00:27:49,312 --> 00:27:50,732 The link is in the description. 466 00:27:51,502 --> 00:27:56,242 It's a trusted hands-on guide for non-technical users, and his workshops are 467 00:27:56,242 --> 00:28:00,932 an excellent resource for organizations looking to break down barriers to AI 468 00:28:00,932 --> 00:28:04,897 adoption and craft a smarter, more inclusive roadmap for the future. 469 00:28:05,395 --> 00:28:09,035 As always, we have more exciting topics and guest appearances lined up. 470 00:28:09,415 --> 00:28:14,325 So stay tuned for more tales of innovation that inspire, challenge, and transform. 471 00:28:15,005 --> 00:28:16,885 Until next time, peace.