    Transform your leadership with AI strategy for executives

    Lead with confidence in an AI-enabled world and set yourself apart as a forward-thinking business leader. AI Strategy for Executives gives leaders the insight and tools to harness AI for innovation and strategic impact. Through this short course, you’ll explore emerging technologies; understand how AI adoption is reshaping organizations and decision-making; and learn to apply AI-driven strategies that enhance leadership effectiveness.

    Register now

    Leading with AI: AI Strategy for Executives

    Leading AI Adoption in Organizations

    In this short course, we will examine AI through three different lenses, giving executives a comprehensive perspective on how to make AI applicable to their careers and their organizations.

    1. AI technologies: Analyze the technologies behind AI systems, including machine learning, deep learning, and those associated with generative AI; understand AI from a historical perspective; and examine the state of the art in AI solutions.
    2. AI for individuals: Examine AI's impact on individuals, focusing on digital humans and the collaborative work of human and digital agents, and explore how generative AI can enhance personal productivity.
    3. AI for organizations: Discover how AI affects organizational processes and products, learn the main factors influencing AI investments, and determine best practices for implementing AI in organizations.

    Upcoming dates

    Dates: July 14 to August 18
    Delivery: Online (instructor-led), live 90-minute sessions
    Schedule: Tuesdays, 7 to 8:30 p.m. ET
    Duration: 6 weeks
    Price: $1,495

    Interested in this program but need different dates? Contact us at kelleypd@iu.edu to explore alternative options.

    Register

    Learning objectives

    • The different technologies related to AI
    • The differences between machine learning and deep learning
    • The technology underpinnings of generative AI
    • The future of work from an individual perspective
    • The main factors influencing AI investments in organizations
    • The issues involved in AI adoption and implementation

    Course details

    • Live-virtual online classes
    • Two 90-minute sessions for each of the three modules. Sessions occur once a week.
    • Attendance is expected, but sessions will be recorded for those who have conflicts.
    • Cost: $1,495
    • Discounts: Available for IU alumni, staff, groups of three or more participants, and purchases of four or more courses*

    *Purchase four or more courses in the Leading with AI series and receive a $1,500 discount: pay a total of $4,480 for four courses instead of the regular $5,980 (4 × $1,495), a savings of about 25%.

    Want to learn more?

    Fill out the form below to request more information.

    Showcase your new skills

    Each course offers the opportunity to complete an optional Action Learning Project, applying course concepts to a real organizational challenge you face. Participants who complete this project earn a digital badge, a verifiable credential you can showcase on platforms like LinkedIn.

    Build toward a professional certificate

    Complete four courses from the Leading with AI professional development series and earn the Kelley Professional Certificate in AI Leadership.

    Access a customizable supervisor request letter to support your case for attending a Kelley professional development course.

    Course outline

    Module 1 (two weeks):

    • Analyze technologies related to AI systems
    • Gain AI historical perspectives and a state-of-the-art take on AI solutions
    • Explore machine learning and deep learning
    • Focus on technologies associated with generative AI

    Module 2 (two weeks):

    • Examine the impact of AI on individuals
    • Explore digital humans and collaborative work between human and digital agents
    • Focus on how productivity may be enhanced by using generative AI

    Module 3 (two weeks):

    • Understand why AI impacts organizations
    • Address how AI can affect organizational processes and products
    • Learn best practices for implementing AI initiatives in organizations

    Elevate your career with AI: Inside Kelley’s AI Strategy for Executives course

    Artificial intelligence is transforming how organizations operate, but many leaders still aren’t sure how to begin applying it strategically. In this conversation, Alex Barsi Lopes, Grant Thornton Scholar and clinical professor of information systems, shares how professionals can start creating value for their organizations with AI business strategy skills.

    Description of the video:

[Kim Goad] Alex, Happy New Year.

[Alex Barsi Lopes] Happy New Year.

[Kim] Nice to see you. Thank you so much for coming in and doing this with us.

[Alex] Of course.

[Kim] To set the stage before we really get into the meat of the topic, would you mind sharing with the audience your background here at the Kelley School and your current role?

[Alex] Sure. I have been at the Kelley School since 2011, so it has been quite some time. I'm a clinical professor here, and right now I'm also in charge of all our technology programs. I'm associate chair for the MSIS, mostly managing our Online MS in IT Management and Online MS in Business Analytics. Before that I was part of KEEP, Kelley's executive education program, and before that I was part of Kelley Direct as well. So multiple different roles, all of them super exciting, and I'm happy to be doing what I'm doing here at Kelley.

[Kim] Well, thank you. You alluded to the work you've done with KEEP, or Kelley Executive Education Programs, and as an associate faculty chair, and from my experience working with you with some of our corporate clients and open enrollment programs, I honestly can't think of anybody better to have this conversation with. So I really appreciate that, and I appreciate that you are the faculty lead for this particular course in the Leading with AI series. We should probably say the title of it: AI Strategy for Executives.

[Alex] Yeah. We have actually had a variety of different titles for this course, and I think we are essentially settling on Leading with AI or some variation of that. But it is really a course that focuses on helping people be transformative in terms of how AI is impacting their organizations and their careers. That's the main focus: how to elevate yourself to a position where you can lead with AI, create strategies with AI, and really make AI something that adds value to you and your stakeholders.

[Kim] Good. You started to talk about the participants for this program. When you think about where someone is in their leadership journey or career, who is the ideal person to take this particular course?

[Alex] I would say it varies a little bit. I think people who are starting out in managerial roles, but also higher-level executives, because the course is very comprehensive. We attack this from three different angles. We have the technology angle, and this is a course co-taught by me, Professor Sagar Samtani, and Professor Alan Dennis. All of us have extensive experience working with clients and executives. Professor Dennis has been an entrepreneur and has [UNINTELLIGIBLE] companies. Professor Samtani has been in the key role of director of AI for TSMC, which is among the top ten companies in the world. So the idea is really to attack this from different angles: the technology side, the individual side, and the managerial, organizational side. That's the reason you can come a little earlier in your career and ask: what is happening with AI right now? What are the practical implications for me as a knowledge worker, as a contributor to my organization? But you can also come as someone looking to understand the effects of AI at an organizational level. How do I manage my employees in a situation where AI is becoming more and more prominent? And how do I leverage AI to really get value? That's one of the biggest challenges we have with AI right now: lots of technology, but not as much value being delivered thus far.

[Kim] Good. Yeah, there is so much out there. And when I think about the format of this course (I don't know that this has been made clear for the audience yet), it's a little unique within our Leading with AI series in that this one is live virtual, once a week for six weeks.

[Alex] Yes.

[Kim] And each session is what, 60 to 90 minutes long?

[Alex] About 90 minutes long. We start with two sessions about technology. We talk about GenAI and agentic AI: what is that, and what exactly does it entail? Sometimes we neglect to understand that there's a lot of AI that has been around for quite some time and actually delivers a lot of value: things like machine learning and more traditional deep learning. Those have been around and are very effective. They generate a lot of value for organizations, and sometimes we neglect them because everybody's excited about the shiny new thing. So this is something Sagar covers early: let's situate where the technology is, what you can do, what you cannot do, and what you should not do. Then Alan teaches two sessions about individual effects. What are the individual uses? If you're thinking about agents, how do you create AI agents that can help you in your workflows? How do you think about the next evolutions, like digital humans, where you can replace certain services with AI constructs that can interact in a deeper way than a regular chatbot? And then I wrap up the last two sessions, really talking about value with stakeholders and implementation of AI. What are the things you have to consider? What is AI governance? What are the data concerns we have with AI, things like privacy? How do you create a plan for organizations to absorb AI and not fall into that experimental trap of proofs of concept that never go anywhere? How do you go beyond that to something that can be moved into production, so it really starts delivering value?

[Kim] Yeah. You know, it's interesting that you talk about that, this whole idea of strategy around AI and how we decide which projects to go with so they don't just end up on the cutting room floor. You may not remember this, but it was just a handful of years ago that I first met you at a Kelley holiday party, and we were all talking about ChatGPT. It was new, students were using it, and we were concerned about the impact it was going to have in the classroom. Our consensus was: it's here, universities all across the globe are grappling with it, we've got to embrace it, but what does that look like? So speaking to the executives: for a lot of people, there are a lot of emotions, beliefs, and preconceived notions around AI. Maybe there's some paralysis or fear, or "I just want to avoid it because there's too much to do." On a personal level (and this might not be fair, because you've been a tech guy your whole life), when did the thinking start to change for you around how we embrace this in our work? Maybe give some examples, and then address those emotions for the executive.

[Alex] Of course. I have been teaching about AI in business since before AI was applied. I taught courses related to AI in business for seven or eight years before ChatGPT. Before ChatGPT was public, I used to demo large language models in my class, and I think that was the first light bulb. I would talk with students and try early pilot versions from OpenAI: "Okay, let's try this. What would you like to know?" The students created these queries together, and everybody looked at the outcome like, wow, this is something big that helps you get prepared, that goes well beyond the [UNINTELLIGIBLE]. That, to me, beyond teaching the traditional neural networks and machine learning material, was when I said: okay, this is something we really have to account for, something that really can transform organizations. And over the years we have been in contact with many different companies and organizations. I do a lot of interviews with tech leaders, just to see where things stand right now. AI is here to stay. And something we are seeing right now: in academia you might assume everybody's using AI, and you realize that's not really true. In many organizations we have people who have been avoiding it, people who say "this is not for me," or who feel unprepared; and I don't feel they have the right guidance about what is allowed and what is not allowed. So I think what we're trying to do in this course is help you feel comfortable using AI. You don't need to know about transformers or convolutional neural networks here and there; you need to know the power this technology brings and what its proper utilization is. That's something we address from the very early stages, especially with Sagar and Alan: they teach you how to create an agent, how familiar you should be with the capabilities, how to tackle workflows you can delegate to this AI intern, this unpaid, very energetic, somewhat naive intern, which is our AI. But you also have to figure it out at a higher level, for organizations: how do we redefine the work? And I think the more people understand that this is changing a lot (there is no question about that), the better they can position themselves in their organizations to take advantage and be more proactive, instead of letting things advance to the point of "this is happening now, I have to react." How do you make sure you're designing processes, redesigning workflows, creating ways of working in which both AI and humans collaborate in terms of what they each do best? AI cannot do empathy. [UNINTELLIGIBLE] AI cannot put itself in the place of someone who's facing a problem. Humans can do that. Humans can leverage the knowledge and the patterns that AI brings to actually solve problems with that empathy, that connection, that AI just isn't able to provide.

[Kim] Right. Good. I happen to know that with all of our classes, we bring in participants from varied industries. Someone may come to this class from the banking industry and be in a cohort with somebody from healthcare or manufacturing or utilities, you name it. And I also know that you, Sagar, and Alan are very equipped to pivot with whoever's in the room with you and to make it very contextualized to the participants' real-world experiences. So they're going to be coming with lots of different experiences and problems, and some notions about what they might be able to use AI for. You spoke to this a little bit ago: you've worked with students a lot through the Technology Consulting Workshop, and with companies through exec ed. What are some cool and varied things you've seen companies do as they started to get their toes in the water with AI? What are they doing, or how are you using it in your own work, that might surprise people?

[Alex] There are definitely two things here. First, one of my favorite quotes (it's actually the signature in my emails) is "The future is already here. It's just not evenly distributed." That's a tribute to William Gibson, one of my favorite sci-fi authors. And that's the thing I think is admirable about our classes: we have people with different backgrounds. What some organizations are doing might be more advanced in certain areas than what other organizations are doing. We're not talking about revealing trade secrets, but most of our classes involve sharing: "Hey, this is what I'm facing in my industry." It might be something that someone representing another industry has already faced and might have a solution for. And we truly believe the learning is all around us. It's not the sage on the mountain saying, "This is the truth." It's really the students collaborating, sharing their experiences, and learning from each other, which I believe is very important, because there are so many interesting cases. We have many cases right now where AI is helping, for example, in financial advising. It's a great area, because a lot of the prospectuses, a lot of the data, can be summarized in ways that are helpful to people who would otherwise read tons of them. I was at the University of Pittsburgh some time ago attending a presentation (Pitt has a dean for AI in medicine), and they were asking: how do you create systems that help new doctors who do not have a lot of their own experience to use the knowledge that is available, not as a crutch, not as a replacement, but as a coach, a support mechanism with which you can validate your ideas: "this is likely, maybe this is not likely, give me some alternative diagnoses here." That kind of conversation, in which an AI can be your assistant, your support, is really, really helpful, and we can apply it to a variety of different industries. Personally, I use a lot of AI for storytelling. I just did a class recently imagining a day in the future. The idea is really thinking about how, for a young professional, the world is going to be different if you really don't have as many constraints. So I use a lot of AI to create illustrations to tell a story: how someone wakes up in the morning and their alarm clock adjusts because an accident is blocking traffic; how someone uses AI to interview a potential candidate, with systems that help surface additional questions in reaction to the candidate's responses. If I just talk about this, it's very abstract, so I use a lot of image generation to say, "Here's the stage; let's go through this," and do a lot of storytelling. That's one of my favorite things to do.

[Kim] Oh, I love that. I was talking with a client the other day in the pharmaceutical industry, and they're using AI as a way to train new reps: the AI agent acts as a physician, and the reps are detailing that physician and practicing with AI.

[Alex] Things like technical sales as well. I was working with a client, and the idea was: sometimes you have to reply to these RFPs, these RFQs, and the information is there. We have previous experience serving similar customers. The more we can create a playbook, understand the main criteria for those RFPs, those requests for proposals, and build from the knowledge accumulated over time in slide decks and proposals, the more quickly we can craft something, with human supervision, and respond quicker, which increases the likelihood of getting the contract.

[Kim] Right. Yeah. I can think of so many success stories that clients have shared with me lately. Another client is consulting with a police force, and they are using AI now to figure out patrol scheduling: what areas should we send patrol cars to in our area? He said he did the same project for the same client 20 years ago, and it took him six months. Now, he said, it took more time to clean the data than it did to actually do the project.

[Alex] Yes.

[Kim] So lots of success stories, but can you think of where maybe you've experienced AI not living up to your expectations, or failing? You mentioned there are some things it just can't do, like empathy, for example. [laughs]

[Alex] Not there yet. I was actually just prepping some new content for classes about AI governance: how do you minimize risk, or anticipate risk so you can have mitigation? One of the biggest issues we're facing right now, especially with the need to keep humans in the loop, is accuracy. Part of the deal here is really to instruct people about their risk appetite and their risk tolerance. For certain things, being wrong is not a big deal: occasionally you'll get an inaccurate result, but in general you get good results, so you can move on. For other things, like medical diagnosis, you cannot be wrong. Again, not to say that humans are never wrong, but if you transfer that responsibility to a machine, to the AI system, it definitely cannot get it wrong, because sometimes people trust it a little too much. So I think what we're seeing is: what is people's level of comfort, and what is the appropriate level of certainty you need for the task at hand? There are stories of a chatbot selling airfare for one-third of the price.

[Kim] I was not so lucky to get in on that. [laughs]

[Alex] I wish. But again, this is tolerable; we can fix it. If someone gets the wrong diagnosis, that cannot be fixed. So it's that level of comfort, I think, that's the next barrier we're looking at as a way to keep increasing the reach of AI.

[Kim] Yeah. Good. Wow, there's so much, and six weeks in this course has got to fly by. If you had to boil it down to one main thing you would want people to walk away with, thinking of those learners or leaders one week out in the real world after finishing, what do you hope they've gained from the six weeks with you and Alan and Sagar?

[Alex] That's actually something the three of us have debated quite a bit, and to us it's this: you finish this week, and next week we want you to be able to do two things. One is to be involved in conversations that involve AI in the organization: really take that leadership and say, "I can talk with people, I can converse with people. I'm knowledgeable. I can give opinions, and I have a basis for my opinions." So get to that level of involvement with AI in the organization. The other part is very practical: think about something you can do. I always finish many of my classes by saying: tomorrow you're going back to work; look around. I'm sure you're going to be able to find a couple of opportunities where you can apply AI and deliver value. And not only look around; you should now feel prepared to go and start tackling those objectives, those projects. Before, it felt like, "Oh, interesting, it would be great to have that." We want to move you from "it would be great to have that" to "you are able to do it." You have the key knowledge, you have the capability; go and start working on this right away.

[Kim] Wonderful. Good. Well, what else would you want everyone to know before we wrap up? Is there anything else about the class we haven't talked about that you would hope they know?

[Alex] I think that it's a lot of fun. Teaching this class is just interacting and talking, and believe me, there are some detours you take in class, because you start to talk about some of the content and all of a sudden it's, "Oh, what if this were possible? What if this?" So there's a lot of nice guided speculation in the live session. Someone comes and says, "I have this issue I'm facing right now; what does everybody think?" I might offer an opinion, other students offer their opinions, and all of a sudden we're problem-solving, brainstorming, in the middle of class. So it is a lot of fun. The students are great, they're very engaged, and we have really awesome conversations. I remember the last time I taught this class: it was just so exciting. Big smiles; everybody was super excited.

[Kim] You know, it's interesting, because when we do those post-course surveys, that is one of the things that always ends up being a positive: what we think of as detours is often what people love the most, what came out in unexpected ways. You come in with a problem: "What does the class think about this?" I don't know if you've read the book "Range," but it's very much that idea. Or if you've taught design thinking: you come up with ideas for your own application from what you thought was a completely unrelated conversation. Those are the most fun.

[Alex] Absolutely.

[Kim] Yeah. Well, it's been fun talking with you too.

[Alex] Always.

[Kim] I always love seeing you. I miss seeing you up in our office suite. We should probably say that your class starts on March 24th?

[Alex] I believe so.

[Kim] And if there are questions about that, they can of course email kelleypd@iu.edu. Again, I appreciate you being here, and Happy New Year again, Alex.

[Alex] Happy New Year. Thank you for having me. It was great.

[Kim] Thanks.

    Meet your instructors

    Alex Barsi Lopes

    Alex Barsi Lopes, PhD, is a clinical professor of information systems and Grant Thornton Scholar at the Kelley School of Business at Indiana University Bloomington. He currently serves as associate chair for Kelley Executive Education Programs, in charge of Online MS programs and certificates. He has taught courses on digital transformation, management of the IT function, business process modeling and systems development, low-code development, agile development and organizational agility, design thinking, robotic process automation, intelligent automation and artificial intelligence applications, and databases/big data technologies.

    Alex created the Technology Consulting Workshop, served as its director from 2016 to 2020, and led student consulting engagements in Guatemala, India, and Thailand. His research focuses on online information goods, collaboration technologies, face-to-face and online social networks, and IS educational initiatives. His research appears in journals including Information Systems Research, Journal of Management Information Systems, and Communications of the ACM. Passionate about international business education, Alex has taken students to Brazil, Canada, China, Guatemala, India, Korea, Mexico, and Thailand, in addition to developing a speaker series about Ghana and Western Africa. Before earning his PhD from the University of Pittsburgh, Alex held consulting and corporate positions. Prior to joining Kelley, he was the director of the MS IS Program at the University of Cincinnati.

    Sagar Samtani

    Sagar Samtani, PhD, is an assistant professor and Grant Thornton Scholar of Information Systems at the Kelley School of Business at Indiana University. Sagar’s research aims to develop AI-enabled analytics based on deep learning, network science, and text mining approaches for dark web analytics; vulnerability assessment for advanced cyberinfrastructure; open-source software security; cyber threat intelligence; AI risk management; and mental health applications. Sagar has published over 60 journal, conference, and workshop papers on these topics in leading information systems, cybersecurity, and machine learning venues. He has received over $4M (in PI and Co-PI roles) in funding from the NSF SaTC, CICI, SFS, and CRII programs, as well as private sources. He holds leadership positions in leading industry entities, including seats on the CompTIA ISAO Executive Advisory Council and the DEFCON AI Village Board of Directors.

    Sagar has won several awards for his research, including the IU Outstanding Junior Faculty Award in 2023, the Kelley School of Business Early Career Research Impact Award in 2023, induction into the NSF/CISA CyberCorps SFS Hall of Fame in 2022, the AIS Early Career Award in 2022, the ACM SIGMIS (ICIS) Doctoral Dissertation Award in 2019, and several best paper awards. He has also won numerous teaching awards and distinctions for his courses on AI for cybersecurity, CTI, and business analytics, including the IU Trustees' Teaching Award and being named a Top 50 Undergraduate Professor by Poets&Quants in 2022, among others. Sagar has been cited in the Associated Press, Forbes, Miami Herald, Fox, Science Magazine, AAAS, The Penny Hoarder, and other venues. He is a member of INFORMS, AIS, ACM, IEEE, and INNS.

    Alan R. Dennis

    Alan R. Dennis is a professor of information systems and holds the John T. Chambers Chair of Internet Systems at the Kelley School of Business at Indiana University. He was named a Fellow of the Association for Information Systems in 2012 and received the LEO Award in 2021. His research focuses on artificial intelligence and cybersecurity.

    Alan is ranked among the top five most published information systems researchers of the last 30 years, and a recent Stanford study placed him in the top 1% of the most influential researchers in the world across all scientific disciplines. His research has been reported in the popular press almost 1,000 times, including in The Wall Street Journal, Forbes, USA Today, CBS, PBS, Fox, CBC, and CTV. He is a past president of the Association for Information Systems.

    Questions?

    Please reach out to us at kelleypd@iu.edu to learn more about the AI Strategy for Executives short course and other courses offered by Kelley School of Business Executive Education.

    Explore all courses in the Leading with AI series
