Kelley School of Business Indiana University

    Revolutionize your marketing: Join Kelley's cutting-edge AI marketing course

    In a world where AI is transforming marketing at lightning speed, staying ahead of the curve is crucial. In this AI marketing course—crafted by marketers, for marketers—you will learn how to harness AI in personal workflows to save time and money.

    Register now


    Short course overview: AI Applications in Marketing

    Equip yourself with a GenAI toolset designed for impactful AI adoption. Anchored in the marketing process, this course gives you the skills to leverage AI effectively: driving growth, reducing cycle times, and cutting costs. You'll dive deep into the market landscape, employ advanced analytics, create compelling content, and use GenAI tools for automation and monitoring. The course culminates in building a personalized marketing AI assistant, while you master the change management needed to turn AI into a competitive advantage.

    Upcoming dates

    Dates: August 4 to 27, and October 6 to 29
    Time: Tuesdays and Thursdays
    6 to 7:30 p.m. ET
    Delivery: Online, instructor-led
    Price: $1,995

    Register

    Interested in bringing this program to your company or organization? Email kelleypd@iu.edu to discuss our custom program options.

    Want to learn more?

    Fill out the form below to request more information.

    Who is AI Applications in Marketing for?

    Marketing practitioners and leaders who are hungry to get up to speed on AI for themselves, their teams, and their organizations, and who don’t want to be left behind. Anchored in the marketing planning process, the course uses hands-on learning and case studies to show how AI can help answer marketing questions. AI lets students build apps and actions that were previously the domain of coders. No prior AI or technical experience is needed to take full advantage of the course.

    What makes this AI for marketing course unique among AI courses being offered?

    The AI Applications in Marketing course:

    • Is taught by marketing leaders from industry and distinguished faculty
    • Provides a customized ChatGPT that stays with learners once the course concludes
    • Offers a continuous learning plan so the content stays relevant
    • Covers marketing analytics in addition to custom GPTs
    • Pushes beyond custom GPTs to cover AI agents, automations, assistants, APIs, and avatars
    • Brings learners up to speed quickly—it’s four weeks, live and online

    Access a customizable supervisor request letter to support your case for attending a Kelley professional development course.

    Course outline

    • Up-to-date coverage of the leaders, companies, and constructs changing business from the front lines of GenAI
    • The strongest marketing use case may not be content: it may be the democratization of advanced data analytics for your team
    • Sift through the thousands of GenAI apps to define your team’s content martech
    • The new 5As of marketing: Automations, Assistants, Agents, Avatars, and APIs

    Course details

    • Format: Live-virtual online classes
    • Schedule: Two 90-minute sessions per week for four weeks
    • Discounts: Available for IU alumni, staff, and groups of three or more participants
    • What’s included: A ChatGPT Team subscription is covered by your course fee.

    Why choose Kelley?

    • Learn from the Best: Gain insights from esteemed faculty guests and industry experts who bring AI and marketing expertise from the real world to the classroom.
    • Immediate Impact: Acquire practical skills in AI-driven marketing analytics, content creation, and automation that you can apply to your work right away for quick results.
    • Comprehensive Curriculum: Explore a wide range of AI applications in marketing, equipping you with a complete toolkit to transform your marketing strategies.
    • Expand Your Network: Connect with a diverse community of professionals, opening doors to new opportunities and collaborations in the business world.
    • Valued Credentials: Enhance your resume with a digital badge, showcasing your AI marketing expertise on platforms like LinkedIn.
    • Flexible Learning: Enjoy the convenience of a live-virtual format, allowing you to learn from anywhere while balancing other commitments.

    Showcase your new skills

    In addition to earning an AI marketing certificate of completion, you will earn a digital badge to showcase your skills on platforms like LinkedIn. These credentials show your network the concrete, in-demand skills you gained from this Kelley program.


    AI Applications for Business

    “[This webinar] may seem like it’s too much, too fast. That’s because it’s too much, too fast. This is the pace of AI, this is the pace of change, and this is what your teams are dealing with.”

    Hear from Paul Acito, adjunct faculty at the Kelley School of Business and president of Lyftbridge Innovation, as he highlights leaders in AI; dives into Google, OpenAI, and Anthropic tools and implementation; and shares case studies on how AI is rapidly transforming the business landscape around the globe.

    You’ll also be introduced to upcoming spring webinars in AI workplace transformation.

    Description of the video:

    WEBVTT 1 00:00:03.065 --> 00:00:04.675 [Garfield Bowen] What role does AI play 2 00:00:05.675 --> 00:00:07.695 in moving us into the future of innovation? 3 00:00:08.715 --> 00:00:12.655 And I think that is why, um, the things 4 00:00:12.655 --> 00:00:15.335 that we are doing at 3M are so very important, 5 00:00:15.915 --> 00:00:17.935 and we don't have all the answers, I'll tell you, 6 00:00:18.435 --> 00:00:23.265 but we've gotta make sure that we're not being too far 7 00:00:23.265 --> 00:00:26.385 behind where things are going. 8 00:00:26.805 --> 00:00:29.865 So how do we capitalize on the value 9 00:00:30.495 --> 00:00:34.305 that AI provides while at the same time being mindful 10 00:00:34.405 --> 00:00:35.985 of the risks that can be created? 11 00:00:42.275 --> 00:00:43.725 [Dr. Alain Barker] I've been pondering that, obviously for, 12 00:00:43.985 --> 00:00:45.085 uh, a few years now. 13 00:00:45.265 --> 00:00:48.605 And, um, I wanna throw out an idea. 14 00:00:49.585 --> 00:00:54.245 Um, I think that when we, when we create things and, 15 00:00:54.305 --> 00:00:56.765 and I, I see this happening all the time in, 16 00:00:56.825 --> 00:00:57.885 in, in the music world. 17 00:00:58.705 --> 00:01:02.405 We spend a lot of time, uh, emulating and perfecting 18 00:01:02.705 --> 00:01:05.845 and objective that we have identified before ourselves. 19 00:01:07.065 --> 00:01:11.485 And we think that by doing that, we've created something 20 00:01:13.235 --> 00:01:17.615 and with this surge of AI as a new technology, 21 00:01:17.915 --> 00:01:20.095 and we open our eyes and we look in front of ourselves, 22 00:01:20.095 --> 00:01:22.775 and we cannot believe that this technology is basically 23 00:01:23.775 --> 00:01:27.735 reproducing what we consider to be creative work. 
24 00:01:29.535 --> 00:01:33.655 I wonder whether we are, um, at a point where we have 25 00:01:33.655 --> 00:01:36.215 to start distinguishing between things 26 00:01:36.215 --> 00:01:38.095 that are being emulated, things 27 00:01:38.095 --> 00:01:39.815 that are being repeated at a very, very, 28 00:01:39.815 --> 00:01:41.055 very high quality level, 29 00:01:41.835 --> 00:01:46.275 and things that are meaningful in a creative sense, 30 00:01:46.735 --> 00:01:48.875 uh, in a sense of it being new, in a sense 31 00:01:48.875 --> 00:01:51.955 of it being impactful to the world that we live in, 32 00:01:52.015 --> 00:01:54.795 in a sense of it accelerating our ability 33 00:01:54.855 --> 00:01:57.115 to impact the world that we, that we live in. 34 00:01:59.545 --> 00:02:01.625 [Dr. Caroline Ylitalo] I am tasked with creating new technologies 35 00:02:01.685 --> 00:02:03.105 and developing new products. 36 00:02:03.785 --> 00:02:08.665 I have over 300 documented inventions for 3M that led 37 00:02:08.725 --> 00:02:11.025 to over a hundred patent applications 38 00:02:11.565 --> 00:02:13.665 and, uh, hundreds of millions of dollars 39 00:02:13.725 --> 00:02:14.985 of new product sales. 40 00:02:16.125 --> 00:02:18.145 So I'm at the forefront of invention. 41 00:02:18.505 --> 00:02:20.745 I used it to free up my time 42 00:02:20.885 --> 00:02:24.225 so I can focus on doing things I am good at, 43 00:02:24.235 --> 00:02:25.905 which is coming up with new ideas. 44 00:02:30.355 --> 00:02:34.175 [Sarah Bellamy] So if we're relying on instruments without this perceptual 45 00:02:34.495 --> 00:02:37.495 function, which like can't engage in discernment 46 00:02:37.495 --> 00:02:41.815 and meaning making, to expedite expedite these processes, 47 00:02:41.885 --> 00:02:43.855 this governance of our society in different ways, 48 00:02:44.835 --> 00:02:46.695 it really does feel like we're opting out 49 00:02:46.695 --> 00:02:47.895 from a moral standpoint. 
50 00:02:47.955 --> 00:02:48.975 And that scares me. 51 00:02:49.635 --> 00:02:51.895 Um, I really, I can't abide by that 52 00:02:51.895 --> 00:02:54.335 because for me, like from my perspective, 53 00:02:54.445 --> 00:02:56.695 that is the quintessential facet of being human. 54 00:02:56.805 --> 00:02:59.055 It's our reasoning, our sentience, our capacity to, 55 00:02:59.085 --> 00:03:00.365 to empathize and relate. 56 00:03:01.025 --> 00:03:03.805 And these are the things that are really at the foundation 57 00:03:03.805 --> 00:03:04.965 of our ethical positions. 58 00:03:04.965 --> 00:03:06.365 We have opinions, you know, 59 00:03:06.555 --> 00:03:08.165 they might sometimes be incorrect, 60 00:03:08.865 --> 00:03:10.965 but they're informed by our life experiences. 61 00:03:11.545 --> 00:03:13.365 And that's a composite process, right? 62 00:03:13.705 --> 00:03:14.805 And I think one of the things 63 00:03:14.805 --> 00:03:18.565 that's tricky about this is we're replicating only one 64 00:03:18.665 --> 00:03:21.325 aspect of this, our cognitive reasoning as a proxy 65 00:03:21.425 --> 00:03:24.285 for human engagement, and it that feels dangerous on a, 66 00:03:24.285 --> 00:03:26.325 on a number of of levels for me. 67 00:03:30.625 --> 00:03:32.615 [Paul Acito] Hello everyone, and thank you 68 00:03:32.675 --> 00:03:35.375 for joining this webinar on AI Applications 69 00:03:35.375 --> 00:03:37.415 in Business sponsored by the Kelley School 70 00:03:38.155 --> 00:03:39.655 at Indiana University. 71 00:03:40.415 --> 00:03:42.895 I wanted to share those four vignettes from our guest 72 00:03:42.925 --> 00:03:45.655 instructors that we, instructors that we use in the courses, 73 00:03:45.795 --> 00:03:48.815 but they're also, obviously many of my colleagues 74 00:03:48.835 --> 00:03:49.895 and former colleagues. 
75 00:03:50.705 --> 00:03:54.575 These video shorts also provide a window into the thoughtful 76 00:03:54.775 --> 00:03:56.055 approach that we're taking at Kelley 77 00:03:56.075 --> 00:03:57.975 around artificial intelligence. 78 00:03:58.955 --> 00:04:00.175 It was a warmup, uh, 79 00:04:00.195 --> 00:04:03.015 but it's kind of important to have a warmup 80 00:04:03.015 --> 00:04:05.375 because we're coming to you live from Minneapolis, 81 00:04:05.375 --> 00:04:09.095 where it's negative one degrees Fahrenheit for our European, 82 00:04:09.235 --> 00:04:12.655 uh, and other country and international, uh, visitors. 83 00:04:12.715 --> 00:04:17.615 Uh, that's negative 18 C uh, in January in Minnesota. 84 00:04:17.795 --> 00:04:19.535 We have to think warm thoughts, 85 00:04:19.535 --> 00:04:21.815 so please send your warm thoughts to us. 86 00:04:22.035 --> 00:04:23.095 Uh, it's cold. 87 00:04:23.595 --> 00:04:27.455 Uh, we're all wearing sweaters indoors and coats indoors. 88 00:04:28.835 --> 00:04:31.375 To continue with our warming trend, uh, I'd like 89 00:04:31.375 --> 00:04:34.455 to extend a special welcome to our Indiana University 90 00:04:35.075 --> 00:04:36.415 alumni, Kelley alumni, 91 00:04:36.995 --> 00:04:39.615 and attendees from the Chief Marketing, uh, 92 00:04:39.885 --> 00:04:42.295 Officer Collaborative who are 93 00:04:42.295 --> 00:04:45.255 among our 1,700 registrants for this course. 94 00:04:46.495 --> 00:04:48.755 You're gonna hear the word a lot today from me: 95 00:04:48.765 --> 00:04:51.915 leadership. Leadership and big leadership. 96 00:04:52.735 --> 00:04:56.515 Uh, it's important to have a strong leadership context, 97 00:04:56.745 --> 00:04:57.835 context right now. 98 00:04:58.495 --> 00:05:03.475 Uh, AI is the most important technical transformation 99 00:05:03.575 --> 00:05:07.515 of our lifetime, and we need big leaders to get it right. 100 00:05:09.435 --> 00:05:11.735 The purpose of this webinar really are twofold. 
101 00:05:12.075 --> 00:05:14.255 One, to provide a glimpse into 102 00:05:14.255 --> 00:05:18.055 how business can leverage starting in 2025, starting today. 103 00:05:18.485 --> 00:05:21.135 It's already started to accelerate growth, 104 00:05:21.235 --> 00:05:23.255 reduce cycle times, reduce costs, 105 00:05:23.795 --> 00:05:27.255 and just overall change the landscape for business 106 00:05:27.315 --> 00:05:28.655 and frankly for society. 107 00:05:29.635 --> 00:05:33.095 And two, to give you a quick introduction to the portfolio 108 00:05:33.115 --> 00:05:36.655 of courses that we have, coming from Kelley Exec Ed, starting 109 00:05:36.725 --> 00:05:39.775 with our third, uh, implementation 110 00:05:39.955 --> 00:05:43.695 of AI Applications in Marketing starting February 4th. 111 00:05:45.255 --> 00:05:47.875 You know, we're all mesmerized with ChatGPT, 112 00:05:47.875 --> 00:05:49.755 and we're astonished with the amount of capital 113 00:05:50.415 --> 00:05:53.955 and human capital that's been deployed against AI. 114 00:05:54.895 --> 00:05:56.435 But we're just getting started. 115 00:05:57.565 --> 00:05:59.805 I had a, uh, CEO who used to say, we're closer 116 00:05:59.865 --> 00:06:02.765 to the beginning than the end of this transformation. 117 00:06:04.415 --> 00:06:09.395 Compute, data, frontier models from OpenAI, Google, 118 00:06:09.505 --> 00:06:12.755 Anthropic, and others are on a three- to four-month cycle time. 119 00:06:13.255 --> 00:06:18.155 So you go on holiday for the holiday break, you come back, 120 00:06:18.895 --> 00:06:20.155 the landscape has changed. 121 00:06:21.355 --> 00:06:25.355 Companies and countries are vying for competitive advantage. 122 00:06:26.095 --> 00:06:28.795 Uh, leaders are repositioning their organizations 123 00:06:29.495 --> 00:06:30.595 for the new frontier. 
124 00:06:30.595 --> 00:06:33.395 And know that was not a coincidence that we started out with 125 00:06:33.465 --> 00:06:36.355 that song by, some of you recognized, which, 126 00:06:36.645 --> 00:06:38.955 which tells me something about you that that was, uh, 127 00:06:38.955 --> 00:06:39.995 called The New Frontier. 128 00:06:40.455 --> 00:06:44.835 So why take the risk to try to learn something now about AI, 129 00:06:45.255 --> 00:06:47.595 or worse yet, why take the risk to try 130 00:06:47.595 --> 00:06:49.955 to teach something about AI when things 131 00:06:49.955 --> 00:06:50.995 are changing so fast? 132 00:06:51.185 --> 00:06:52.595 Well, my colleagues 133 00:06:52.615 --> 00:06:56.155 and I hope to share what we know, not just about AI, 134 00:06:56.295 --> 00:06:58.275 but about the diffusion of innovations. 135 00:06:58.295 --> 00:06:59.915 We know a thing or two about innovation, 136 00:07:00.865 --> 00:07:04.115 even though this is an unprecedented invention. 137 00:07:04.745 --> 00:07:09.195 It's the invention of, uh, super intelligence. 138 00:07:09.775 --> 00:07:12.315 We will see AGI, 139 00:07:12.875 --> 00:07:16.595 artificial general intelligence this year, 2025. 140 00:07:17.955 --> 00:07:19.255 And let's not get confused. 141 00:07:19.845 --> 00:07:23.015 This is going to be a turbulent transformation. 142 00:07:23.965 --> 00:07:26.335 Just got some data from the World Economic Forum 143 00:07:26.335 --> 00:07:29.535 that reports that half of employers plan 144 00:07:29.555 --> 00:07:31.855 to reorient their businesses around AI. 145 00:07:32.635 --> 00:07:36.375 Two thirds plan to hire employees with AI skills. 146 00:07:37.225 --> 00:07:40.935 40% anticipate reducing their workforce, 147 00:07:41.225 --> 00:07:44.935 where AI can automate tasks and processes. 
148 00:07:46.135 --> 00:07:48.555 So mitigate, mitigating these risks will require 149 00:07:48.835 --> 00:07:52.235 compassionate leadership, harnessing its potential, 150 00:07:52.385 --> 00:07:55.155 will require inspired leadership and governance. 151 00:07:55.815 --> 00:07:56.955 But we're here 152 00:07:57.145 --> 00:07:59.355 because we believe in business as a force 153 00:07:59.455 --> 00:08:01.635 for positive change in the world. 154 00:08:01.735 --> 00:08:05.395 So let's dig into some of these details. 155 00:08:08.685 --> 00:08:09.685 I like to quote 156 00:08:09.865 --> 00:08:13.685 and follow the leaders, uh, in AI, particularly Sam Altman, 157 00:08:14.225 --> 00:08:16.885 Jensen Huang, and we'll hear from both of them, uh, during, 158 00:08:17.025 --> 00:08:19.685 uh, this and, and Dario Amodei, since they have 159 00:08:19.685 --> 00:08:20.845 access to AI models. 160 00:08:21.585 --> 00:08:23.405 And we won't see 161 00:08:23.405 --> 00:08:24.845 what they know for another one to two years. 162 00:08:24.865 --> 00:08:25.965 So imagine what they've got. 163 00:08:25.965 --> 00:08:28.965 They've got these very powerful models. 164 00:08:29.945 --> 00:08:33.025 And I don't know, I'm, I'm imagining 165 00:08:33.025 --> 00:08:34.185 that they're modeling the economy. 166 00:08:34.575 --> 00:08:36.705 They may be making stock picks, who knows? 167 00:08:37.165 --> 00:08:39.105 But they see what we can't. 168 00:08:39.485 --> 00:08:42.585 So anytime they signal something, I like 169 00:08:42.585 --> 00:08:44.745 to listen carefully, and I encourage 170 00:08:44.745 --> 00:08:45.825 you to do the same thing too. 171 00:08:45.845 --> 00:08:49.345 In our courses, we actually have a leaderboard of, of 172 00:08:49.365 --> 00:08:52.545 who these, these leaders are and who should listen to. 173 00:08:52.545 --> 00:08:54.745 But let's just take a listen to Sam Altman. 
174 00:08:54.745 --> 00:08:58.005 This was a recent, uh, interview he had with, uh, 175 00:08:58.165 --> 00:09:01.245 Y Combinator in San Francisco, which he used to run. 176 00:09:02.395 --> 00:09:05.895 [Sam Altman] It is easy when there's a new technology platform to say, 177 00:09:05.895 --> 00:09:09.335 well, because I'm doing some of AI, the, the rule, the laws 178 00:09:09.335 --> 00:09:10.735 of business don't apply to me. 179 00:09:10.735 --> 00:09:11.935 That's obviously not true. 180 00:09:13.785 --> 00:09:16.215 And, and remembering not to fall for that, 181 00:09:16.215 --> 00:09:17.135 and that you still have to build 182 00:09:17.135 --> 00:09:18.135 something of enduring value. 183 00:09:21.315 --> 00:09:23.255 [Paul] So this is where we're gonna spend the rest of our time. 184 00:09:24.145 --> 00:09:27.645 Uh, it's going to seem like too much, too fast. 185 00:09:28.585 --> 00:09:32.165 Uh, that's because it's too much, too fast. 186 00:09:32.505 --> 00:09:34.805 And this is, this is the pace of AI. 187 00:09:35.115 --> 00:09:36.565 This is the pace of change, 188 00:09:36.865 --> 00:09:38.485 and this is what your teams are dealing with. 189 00:09:39.185 --> 00:09:41.045 Um, in fact, this list is most 190 00:09:41.045 --> 00:09:42.445 notable in what it doesn't cover. 191 00:09:42.505 --> 00:09:45.525 It doesn't cover Copilot, doesn't cover xAI or Meta 192 00:09:45.985 --> 00:09:47.005 or Perplexity. 193 00:09:47.305 --> 00:09:49.765 It doesn't cover any of the tens of thousands 194 00:09:49.765 --> 00:09:51.365 of wraparound AI apps. 195 00:09:51.945 --> 00:09:55.765 Um, this first eight that we're going to cover, the, 196 00:09:55.765 --> 00:09:57.565 the ones in white are designed 197 00:09:57.565 --> 00:09:58.765 to give you something you can take 198 00:09:58.765 --> 00:10:00.085 and start applying in your business. 
199 00:10:00.085 --> 00:10:02.325 Yeah, some of 'em you're gonna have to pay money to, to use, 200 00:10:02.745 --> 00:10:03.765 uh, most are free. 201 00:10:04.265 --> 00:10:07.725 Uh, the last two are a tip off 202 00:10:07.725 --> 00:10:10.205 of what's coming next, proof of concept. 203 00:10:10.985 --> 00:10:13.885 Uh, this is about agentic workflows. 204 00:10:13.885 --> 00:10:16.045 You're hearing a lot about AI agents. I know that. 205 00:10:16.225 --> 00:10:18.645 And they can't really even define 206 00:10:18.645 --> 00:10:20.085 what an AI agent is clearly. 207 00:10:20.505 --> 00:10:24.165 But we'll take a stab at looking at how this might impact, 208 00:10:24.625 --> 00:10:26.965 uh, what goes on at your work desk, uh, today. 209 00:10:27.025 --> 00:10:30.725 So, um, to illustrate these models that, 210 00:10:30.725 --> 00:10:32.085 that we're gonna go through the first eight. 211 00:10:34.215 --> 00:10:38.765 Again, use your warm imagination, uh, for those 212 00:10:38.765 --> 00:10:40.245 of us freezing up here in Minnesota, 213 00:10:40.635 --> 00:10:42.165 imagine you just got a new job. 214 00:10:43.315 --> 00:10:45.405 Imagine that you got asked to run a P&L, 215 00:10:45.405 --> 00:10:48.005 new product program, on bicycling accessories. 216 00:10:48.005 --> 00:10:49.125 This case, it's reflectors. 217 00:10:49.945 --> 00:10:54.005 And we're not only new to the company, you're not only new 218 00:10:54.005 --> 00:10:55.445 to the company, but, uh, 219 00:10:55.625 --> 00:10:57.845 you don't really know anything about the bicycle 220 00:10:57.915 --> 00:10:59.605 accessories, but it got a good offer. 221 00:11:00.385 --> 00:11:04.045 And, um, congratulations. Good news is you got this offer. 222 00:11:04.045 --> 00:11:06.605 You got your big, big comp plan. 223 00:11:07.185 --> 00:11:09.605 Uh, you got a great new boss, new company, new product. 
224 00:11:10.025 --> 00:11:12.725 Uh, the bad news is they want a full review 225 00:11:12.725 --> 00:11:14.085 of the business next week. 226 00:11:14.965 --> 00:11:18.465 So we're hoping that we can leverage AI 227 00:11:19.365 --> 00:11:20.745 and get us off to a good start. 228 00:11:20.745 --> 00:11:22.665 Start. And the reason I'm going through this more 229 00:11:22.665 --> 00:11:24.425 or less academic case study, actually, this is kind 230 00:11:24.425 --> 00:11:26.305 of a real company, but we'll get to that later. 231 00:11:27.125 --> 00:11:31.745 The, the idea is that this will illustrate 232 00:11:32.335 --> 00:11:36.185 that AI goes so far beyond cute TikTok videos 233 00:11:36.765 --> 00:11:39.865 and, um, you know, writing 234 00:11:41.665 --> 00:11:42.985 teenagers' essays for, for high school. 235 00:11:43.045 --> 00:11:47.505 It, it, if you're not pushing beyond the free version 236 00:11:47.505 --> 00:11:51.185 of ChatGPT, I think you are going 237 00:11:51.185 --> 00:11:53.165 to be amazed at some of this. 238 00:11:53.265 --> 00:11:56.045 Now, we had 1,700 people sign up for this. 239 00:11:56.545 --> 00:12:00.555 So when you get 1,700 people, you get a normal distribution. 240 00:12:00.585 --> 00:12:02.995 That means there's a hundred of you on the phone 241 00:12:03.025 --> 00:12:05.195 that probably know more than I do, uh, 242 00:12:05.195 --> 00:12:08.435 and have probably forgotten, uh, more than I know about AI. 243 00:12:09.055 --> 00:12:10.795 And there's probably a hundred who 244 00:12:11.675 --> 00:12:13.085 haven't hit the keyboard yet. 245 00:12:13.425 --> 00:12:15.485 And then the rest of us are in the middle somewhere. 246 00:12:15.625 --> 00:12:19.805 So my hope for this time that we have together is 247 00:12:19.805 --> 00:12:23.725 that you get one or two new applications that you can, 248 00:12:23.905 --> 00:12:26.485 you can use some of this may frankly astonish you. 
249 00:12:26.715 --> 00:12:28.245 It's, it's, it's pretty incredible. 250 00:12:28.665 --> 00:12:33.565 So let's get started, started. Google has been busy. 251 00:12:34.325 --> 00:12:37.045 I think if you rewind about four to six months, 252 00:12:37.045 --> 00:12:38.125 people were concerned 253 00:12:38.435 --> 00:12:40.685 that Google was falling behind somehow. 254 00:12:40.985 --> 00:12:42.685 I'm, I'm not very worried about Google. 255 00:12:42.885 --> 00:12:45.725 I don't worry about Google or Microsoft or the rest. 256 00:12:46.325 --> 00:12:50.005 I like to say, who's gonna win in AI? Meet the new boss, 257 00:12:51.025 --> 00:12:52.075 same as the old boss. 258 00:12:52.095 --> 00:12:54.515 You're, you're starting to pick up on a classic rock, 259 00:12:55.095 --> 00:12:56.355 uh, theme here as well. 260 00:12:56.935 --> 00:12:59.555 But the point is that 261 00:13:00.575 --> 00:13:03.515 the Google is just been on a tear lately. 262 00:13:03.935 --> 00:13:08.815 And let's take a look at one way Google's implementation 263 00:13:08.855 --> 00:13:10.455 of AI can help us. 264 00:13:10.955 --> 00:13:12.135 We need to do some research. 265 00:13:12.275 --> 00:13:14.135 We don't know anything about this bicycle market, 266 00:13:14.755 --> 00:13:17.215 so let's do some research. 267 00:13:17.315 --> 00:13:20.455 But let's use Google Deep Research. 268 00:13:21.165 --> 00:13:24.255 This is currently only available, so this is a pay -for one. 269 00:13:24.595 --> 00:13:26.415 Uh, but I expect this will be available soon. 270 00:13:26.415 --> 00:13:27.855 It's, it's kind of in beta. 271 00:13:28.515 --> 00:13:32.455 So the features that differentiate deep research are 272 00:13:32.455 --> 00:13:34.415 that you get citations, which is really great. 273 00:13:34.955 --> 00:13:36.415 Um, it's deep research. 274 00:13:36.635 --> 00:13:40.095 Um, what I'm gonna show you took six minutes to do. 
275 00:13:40.925 --> 00:13:42.295 It's a summary 276 00:13:43.115 --> 00:13:46.015 and a competent generation of a report. 277 00:13:46.175 --> 00:13:50.175 I hope you'll agree. And importantly, no ads. 278 00:13:50.715 --> 00:13:55.355 So let's take a look. So you can see here I'm 279 00:13:55.355 --> 00:13:59.875 picking, um, Google 1.5 Pro with deep research. 280 00:14:03.155 --> 00:14:06.355 I give it a kind of a quick prompt, said, 281 00:14:07.585 --> 00:14:11.215 write me a marketing analysis on the US market 282 00:14:11.395 --> 00:14:12.655 for bicycle accessories. 283 00:14:13.515 --> 00:14:15.175 So it starts the research process. 284 00:14:20.345 --> 00:14:24.245 At first, it picks up 27 websites. So again, this is key. 285 00:14:24.825 --> 00:14:26.565 Uh, I think it gives you some confidence 286 00:14:26.565 --> 00:14:28.565 that it's not hallucinating when you can actually go back, 287 00:14:28.615 --> 00:14:31.005 click on and find the source document, 288 00:14:37.965 --> 00:14:39.985 and then it says, hey, here's my research plan. 289 00:14:40.375 --> 00:14:43.425 What do you think? I'll analyze results, create a report. 290 00:14:43.815 --> 00:14:47.065 Look good, sounds good. Now again, this took six minutes. 291 00:14:47.125 --> 00:14:48.225 So I've accelerated this. 292 00:14:48.575 --> 00:14:51.525 Then I went back and found 42 resources, 293 00:14:52.585 --> 00:14:56.365 and it generates a Word document, 294 00:14:56.365 --> 00:15:00.935 or in this case a Google Docs that is a pretty comprehensive 295 00:15:00.995 --> 00:15:03.935 and well-structured report on the 296 00:15:03.935 --> 00:15:05.055 bicycle accessories market. 297 00:15:05.155 --> 00:15:07.895 So that's Google Deep research. 298 00:15:13.285 --> 00:15:17.125 Next, we kind of are under some pressure 299 00:15:17.125 --> 00:15:18.845 because we've never worked in this industry. 
300 00:15:20.265 --> 00:15:23.765 We have to do a financial statement 301 00:15:24.625 --> 00:15:26.805 and we don't even know where to start. 302 00:15:27.745 --> 00:15:30.125 So even though we have little domain expertise, 303 00:15:30.275 --> 00:15:33.605 this doesn't stop our division president from putting us on 304 00:15:33.605 --> 00:15:34.965 the agenda next week 305 00:15:35.505 --> 00:15:38.525 and insisting on a full set of financial forecasts. 306 00:15:39.495 --> 00:15:40.695 I don't know about you, but I've worked 307 00:15:40.715 --> 00:15:41.775 for these folks before. 308 00:15:42.235 --> 00:15:45.055 That's exactly how it happens. So where do we start? 309 00:15:45.685 --> 00:15:47.575 Well, you know, the way I would start is 310 00:15:48.145 --> 00:15:49.895 let's take a look at somebody else's P&L, 311 00:15:49.895 --> 00:15:51.855 maybe a publicly traded company. 312 00:15:53.105 --> 00:15:54.195 Shimano comes to mind. 313 00:15:54.335 --> 00:15:55.395 If you've ever ridden a bike, 314 00:15:55.395 --> 00:15:57.275 you've probably ridden on Shimano gears. 315 00:15:58.335 --> 00:16:01.435 So wouldn't it be great though, as we're looking at these, 316 00:16:01.435 --> 00:16:04.235 because, you know, I, I can read a P&L 317 00:16:04.235 --> 00:16:05.995 with the best of 'em and a balance sheet, 318 00:16:05.995 --> 00:16:07.955 but you know, I might miss something and, 319 00:16:08.055 --> 00:16:09.315 and time is of the essence. 320 00:16:09.415 --> 00:16:10.555 So wouldn't it be great 321 00:16:10.555 --> 00:16:13.355 to have an expert looking over our shoulder to help 322 00:16:13.355 --> 00:16:14.435 with a P&L assessment? 323 00:16:15.095 --> 00:16:17.115 That's where Google Stream Realtime, 324 00:16:17.335 --> 00:16:20.395 an experimental app you can use today, comes in, 325 00:16:22.635 --> 00:16:26.375 Oh, there we go. 326 00:16:26.795 --> 00:16:29.405 Oh, excuse me. 327 00:16:32.135 --> 00:16:36.335 There we go. 
So the first thing you do, and this is a little bit scary, is give Google complete control of your screen, or at least visibility of your screen, and it can hear you. So you'll hear a recorded version of me talking to Google Gemini.

[Recorded Paul] Hey Gemini, can you take a look at a P&L for me?

[Gemini] Sure, I can take a look at the P&L. Could you please show it to me?

[Recorded Paul] Yeah, can you see that?

[Gemini] Yes, I can see the P&L, also known as the profit and loss statement. It appears to be for Shimano, and it shows the financial results for the first nine months of fiscal years 2023 and 2024. Do you have any specific questions about it?

[Recorded Paul] Yeah, since we're through the nine-month period on both of these columns here, give me some observations from a professional cost accounting standpoint, particularly focused on growth, cost, and income. And take a look, for example, at net sales, and anything you can tell me about what's going on between these two periods.
[Gemini] Okay, I can do that. Looking at the profit and loss statement, net sales decreased from 375,264 million yen in the first nine months of fiscal year 2023 to 334,879 million yen in the first nine months of fiscal year 2024. This indicates a significant drop in sales. Additionally, the cost of sales also decreased, but it appears to be by a lower percentage than the net sales, resulting in a lower gross profit.

[Recorded Paul] Hey, let's take a look at the balance sheet and see if there are any surprises there for you. Start off with just looking at current assets.

[Gemini] Alright, I can look at the balance sheet with you, focusing on current assets. I can see that total current assets increased slightly from 652,753 million yen in fiscal year 2023 to 658,909 million yen in the first nine months of fiscal year 2024. Do you want to go into more detail?

[Paul Acito] Okay, before I leave here: if you do try this, one thing I learned is that the bigger the font, the better accuracy you get. I'm just learning how to use this; these are brand new tools, by the way.
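The year-over-year moves Gemini read off are easy to sanity-check. Here is a quick sketch using the figures quoted above (millions of yen); the helper function is just for illustration.

```python
# Sanity-check the changes Gemini reported, using the Shimano figures
# quoted above (millions of yen, first nine months of each fiscal year).
net_sales_fy23 = 375_264
net_sales_fy24 = 334_879
current_assets_fy23 = 652_753
current_assets_fy24 = 658_909

def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"Net sales:      {pct_change(net_sales_fy23, net_sales_fy24):+.1f}%")
print(f"Current assets: {pct_change(current_assets_fy23, current_assets_fy24):+.1f}%")
```

Net sales fall roughly 10.8 percent while current assets tick up under one percent, matching Gemini's "significant drop in sales" and "increased slightly" readings.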
So we've got a market research report, we've got a model P&L, and we know some things about the market. Let's flip over to OpenAI and custom GPTs. If you're not using custom GPTs, a couple of things. One: you can only develop custom GPTs if you've got a $20-a-month subscription to ChatGPT, which is about what it costs to get a good Netflix subscription. You can use custom GPTs that someone else develops with a free account. But for me, this is kind of the gateway application for AI. If you can develop custom GPTs, you are going to get significantly more leverage out of your use of the model than you can without them, and I think you'll see that in the next example.

So we've got this market study. We've got some ideas around a P&L, but there's a good deal of uncertainty. And you've talked with the team, and they've told you a little bit about your new boss: they like to beat up scenarios.
So you developed, with the help of the Kelley School, a scenario-planning custom GPT that cranks out scenarios, because of course the only thing you know about a forecast with a hundred percent certainty is that it's wrong. So it's always great to do scenario planning; I love this "art of the long view" approach to strategic planning. So let's take a look at a custom GPT implementation, and I'll walk you through what's happening here. You can see this is a ten-dimension scenario planner: it looks at social, economic, and political trends. We've given it this bicycle market study that we did with Deep Research from Google.

The other thing you'll note here is the interaction of these tools. These tools all have relative strengths, and the combination of them is what I feel is extremely important for business applications. So again, it makes quick work of this. And funny how impatient we get: we didn't have this three months ago, and now it feels like it's taking too long. And look at this, all the way to a PowerPoint slide.
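The scaffolding such a scenario planner starts from can be sketched in a few lines: pick a handful of key uncertainties, enumerate the combinations, then prune to the two or three worth planning against. The dimension names and values below are illustrative stand-ins, not the ten dimensions the custom GPT actually uses.

```python
from itertools import product

# Illustrative uncertainties only; the custom GPT in the talk works
# across ten dimensions of social, economic, and political trends.
uncertainties = {
    "economy": ["downturn", "growth"],
    "regulation": ["light-touch", "strict safety rules"],
    "consumer": ["value-driven", "eco/style-driven"],
}

def scenario_skeletons(dims):
    """Yield every combination of uncertainty outcomes as a dict."""
    keys = list(dims)
    for combo in product(*dims.values()):
        yield dict(zip(keys, combo))

# 2 * 2 * 2 = 8 raw combinations, to be pruned down to a handful
# of named scenarios worth building plans around.
skeletons = list(scenario_skeletons(uncertainties))
```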
That's a pretty good Venn diagram. And we've got three scenarios: the smart safety-first scenario, the eco-style revolution, and the value-driven basics. These are three scenarios that give you a good scaffolding for your planning.

Alright, another one from OpenAI. Now we've got three scenarios, but you've got tough management. I always liken upper management to my dog: I give my dog a biscuit, she eats it, and then she immediately forgets I gave her a biscuit and wants another biscuit. Has anybody ever worked for a manager that's like my golden retriever? Our financially driven leadership team will want to see those financials and those scenarios, but I anticipate the next thing they're going to ask is: what about a P&L structure for each of the scenarios? This is a demanding group, so we're lucky we have ChatGPT with Code Interpreter, and we have some capabilities there. We can do a new product forecast for each scenario, then a P&L. But wait, new product forecasting is a tough deal, because you have no history.
So I remember, a long, long time ago, longer ago than for you, learning something called the Bass model when I was in school, in my undergraduate and graduate courses. But it's always been a little clunky to use. I bet ChatGPT with its Code Interpreter can use it. So let's ask ChatGPT to develop three forecasts, one for each scenario; three P&Ls, based on the Shimano model; and three complete scenario representations. Let's see how it does. So we feed it the P&Ls, we feed it the market study, and we feed it the Venn diagram so it knows the three scenarios it's planning toward.

I know it's going fast, but here's an estimated market size, predicted based on a prompt. You give it some assumptions about how fast you think you can penetrate, based on this Bass model, which actually works pretty slick here, and there you can see the Greek notation. Well, I've not used it in a few years. [pause] And I'm getting a little greedy here, 'cause time is of the essence.
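The Bass model mentioned above is compact enough to state directly: new adopters each period are (p + q·N/m)(m − N), where m is market potential, N cumulative adopters so far, p the coefficient of innovation, and q the coefficient of imitation. A minimal discrete-time sketch; the parameters are illustrative (p near 0.03 and q near 0.38 are commonly cited averages) and the market size is made up.

```python
def bass_forecast(m, p, q, periods):
    """Discrete-time Bass diffusion.

    m: total market potential (eventual adopters)
    p: coefficient of innovation (external influence)
    q: coefficient of imitation (word of mouth)
    Returns new adopters per period.
    """
    cumulative = 0.0
    adopters = []
    for _ in range(periods):
        new = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new
        adopters.append(new)
    return adopters

# Illustrative run: a made-up one-million-unit market over five periods.
yearly = bass_forecast(m=1_000_000, p=0.03, q=0.38, periods=5)
```

With no sales history, the whole forecast hangs on the assumed m, p, and q, which is exactly why the demo feeds the model scenario assumptions about how fast you think you can penetrate.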
I'm gonna ask ChatGPT to develop my whole PowerPoint presentation for me. Let's see how it does with that. [pause] Pretty good: it goes through the market. There's our Venn diagram, and here's a P&L for each of the three scenarios, and some recommendations for this three-to-five-year strategic plan. So again, these models can do a ton of stuff to help you. They're not always right; you still have to supervise them. But you still have to supervise your interns and your marketing assistants and your product managers as well.

So, another dilemma. We know that our upper management is hungry for information. They're very enthusiastic about this project that you've been assigned to, and you're gonna present them with these three scenarios. But again, anticipating what leadership and you need to do: wouldn't it be great to be able to pressure-test the P&Ls once they decide which one they like and which one they'll fund? So next up is Anthropic's Claude, and a feature called Artifacts.
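Under the hood, that kind of pressure-testing widget is just a P&L expressed as a function of its drivers: change an assumption, re-run, compare. A minimal sketch with made-up numbers, not the actual model Claude generated.

```python
def pnl(units, price, cogs_ratio, opex):
    """Tiny driver-based P&L: tweak a driver and watch profit respond."""
    revenue = units * price
    gross_profit = revenue * (1 - cogs_ratio)
    operating_income = gross_profit - opex
    return {"revenue": revenue,
            "gross_profit": gross_profit,
            "operating_income": operating_income}

# Pressure test: what if cost of goods creeps up five points?
base = pnl(units=50_000, price=40.0, cogs_ratio=0.60, opex=500_000)
stressed = pnl(units=50_000, price=40.0, cogs_ratio=0.65, opex=500_000)
# Operating income falls from about 300,000 to about 200,000:
# a five-point COGS slip costs a third of the profit.
```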
And if you don't code and write Python, congratulations: this afternoon, you can. Here I'm gonna ask Claude to take one of the P&L models; I think I gave it the most likely one, to start out in the middle. And I thought, wouldn't it be interesting to have an interactive app to share with my team, to go back and forth and really be able to pressure-test this live, maybe even put it up in front of that management team and make a good impression in my first two weeks on the job?

And you can see here, in a second, this makes quick work of it. It writes the Python code; that's what you saw scrolling by. And here is a fully functional widget, or app, which doesn't necessitate pulling up a clunky Excel application in front of the leadership team. The only thing worse than making a presentation in front of management is making a presentation in front of management where you have to pull up Excel and do PowerPoint live.

So now we've got a research study, and we've got strategic planning scenarios.
We've got a full deck for presentation. We've even got an interactive app. And we got our green light. Go. They bought off on one of the P&Ls; I don't remember which. But now it's time to communicate, communicate, communicate. You've gotta get going, 'cause you've got a forecast to make that you just sold. So let's turn a little bit more now to content development.

They didn't give us the budget. They gave us the forecast, but they didn't give us the budget. They didn't give us the headcount. They didn't okay any international travel; did I mention that we do business in 60 countries? And we can barely afford social media. So, one way to get the program quickly into the hands of our global team: what if we create an AI version of a program guide? I know many of you have probably already discovered NotebookLM, a very powerful app from Google that is, again, free to use, but with incredible power. So let's take a listen and a note here.
You start out by going in and giving it some resources. These can be YouTube videos, transcripts, websites. In this case, I gave it our market study. It can produce an outline, it can produce an FAQ, and imagine sharing all of this with your team, immediately generated from your business plans and your market studies. But the real kicker is this feature called audio summaries. Listen.

[AI voice 1] Hey everyone, and welcome to another deep dive. Today, we'll be exploring the world of bicycle accessories. Specifically, we're gonna zero in on reflectors.

[AI voice 2] Sounds good.

[AI voice 1] You know, especially since you've been looking into cycling gear lately, I thought, why not shine some light on this often overlooked but vital piece of equipment?

[AI voice 2] Yeah, good idea.

[Paul] What's interesting about that is, it is good that those podcasters are talking about my work, but...

[AI voice 1] Hey everyone, and welcome to another deep dive. Today we'll be exploring the world of bicycle accessories. Specifically, we're gonna zero in on reflectors.

[AI voice 2] Sounds good.
[AI voice 1] You know, especially since you've been looking at cycling gear lately, I thought, why not shine some light on this often overlooked but vital piece of equipment?

[AI voice 2] Yeah, good idea.

[AI voice 1] So the big question we're gonna tackle today is: is there a real opportunity in the bicycle reflector market? We're gonna sift through the data, look at some trends, and try to get a clear picture of this niche, but potentially lucrative, market.

[Recorded Paul] Hey, so I was wondering, if you're gonna take a deep dive in the market, are you gonna be looking at any particular years, and which segments are you gonna be looking at?

[AI voice 1] That's a great question, and yes, we absolutely are gonna dig into the market details.

[AI voice 2] We definitely plan to look closely at specific years and market segments. In fact, we've got some data from 2023 and projections all the way out to 2030.

[Paul Acito] So beyond these two synthetic podcasters talking about your study and the data that you gave them, you can interject, steer the conversation, and all of it's captured in your notebook.
I couldn't resist. That's great, but if we're sending this around the world to our teams, what if we gave it a little more personality?

[AI voice 1] Hey everyone, and welcome to another deep dive. Today we'll be exploring the world of bicycle accessories. Specifically, we're gonna zero in on reflectors.

[AI voice 2] Sounds good.

[AI voice 1] You know, especially since you've been looking into cycling gear lately, I thought, why not shine some light on this often overlooked but vital piece of equipment?

[AI voice 2] Yeah, good idea.

[AI voice 1] So the big question we're gonna tackle today is: is there a real opportunity in the bicycle reflector market? We're gonna sift through the data, look at some trends, and try to get a clear picture of this niche, but potentially lucrative, market.

[AI voice 2] Okay.

[AI voice 1] So let's jump right in. First up, let's talk market size.

[AI voice 2] Okay.

[AI voice 1] Now, the US bicycle accessories market is already pretty substantial, right?

[AI voice 2] Oh yeah. It was valued at a cool $2.1 billion in 2023, and projections for 2024...

[AI voice 1] Let me guess, even bigger.
[AI voice 2] You bet. It's estimated to hit [trails off]

[Paul Acito] Pretty cool. Did I mention we do business in 60 countries? Thirty languages. [Overlapping Spanish and English AI voices, unintelligible] One idea. Even without a communications budget, you can really scale your operation. And again, this is early days. This is the worst AI you are ever going to work with. It's only gonna get better, smoother, and easier to use.

One thing the management team was insistent upon: they really liked the proposal for your brand voice, but they wanna make sure that you stay on brand voice. They've had issues before, as they go international and across divisions, with the brand voice getting off track. And this is where Claude style guides come in. Claude actually has a style guide feature where it'll either walk you through how to create a brand voice or style guide if you don't have one, or you can upload yours and then ask it to write.
In this case, I'll ask it to write an email welcoming a customer, or responding to a customer who didn't get the right order, or whatever communication you need to make, using your voice and making sure that you are on brand for all your communications. And you can see here, I just say crank me out an email. It does it instantly, and it uses the terminology, the vocabulary, the style, the dos and the don'ts of your brand voice. This is available for free on Claude this afternoon.

But again, did I mention we do business in 60 countries? This style guide can also translate your brand voice into 30 languages.

So why show off all this? I suspect you're already leveraging ChatGPT, Claude, Gemini, xAI, Meta, Perplexity, and the rest. But many organizations still have a policy prohibiting their use within the company you work for, and it's important that you get governance.
Let's round out our business planning. Remember, I said we're not gonna do an awful lot with content, because that's been well covered in the media, but I thought I'd show you one example of Sora, which is brand new from OpenAI, and how you can use it to develop social media ads or even online ads. This is Sora, and this is one use case: taking a stock photo from Adobe Stock and asking it to animate it. So it started with a picture, not a video, and introduced motion into it. And with a common video editor like Adobe Premiere, or others, you can pretty much drop your logo in, do some experimentation, and get your product in motion, literally. By the way, traffic is one of the harder things for these models to do right now. And again, it's early days; Sora was released less than 30 days ago, I think.

So that's it. If I've done my job right, you should have seen at least one or two new AI applications for business.
This is a sliver of what can be done with AI for business. But if you stick around, wait, there's more: two more quick examples, and you can see what our agentic future holds for us in 2025.

We'll start out by listening to the CEO of the world's most valuable company; you can see the quote from Jensen here on agents.

[Jensen Huang] AI has been advancing at an incredible pace. It started with perception AI: we now can understand images and words and sounds. Then generative AI: we can generate images and texts and sounds. And now agentic AI: AIs that can perceive, reason, plan, and act.

[Paul Acito] So while we're still defining agentic AI, we can be off to the races in terms of our use. You've probably heard about Anthropic's Claude computer control. If you haven't, this is a program whereby you can offer control of your keyboard, mouse, and apps to Claude. Let me repeat that: entire control of your computer to the AI. What could possibly go wrong? So here's an example. You'll note that I'm using a container.
I didn't let it have access to my whole computer. Of all of the apps and applications that I'm showing you today, this is the one I would not recommend that you use. If you do, be sure to containerize it with one of the containers that's available online, usually for free. But here I've given it a one-line prompt: go download some pictures or images of cyclists. That's all I asked it to do. I didn't tell it where to look. I didn't tell it how to look. It went in, figured out which search engines to use, and goes through and downloads the pictures, as you can see, to my Downloads folder. And there you are. And that was just one simple example.

But where I think the big payout is going to be in 2025 is with AI automations and agents. We've got this built into some of our courses already. This is the logo for make.com; if you're familiar with Zapier or Make or Plum or n8n or the rest, this is one of these AI automation and agent environments.
Let's imagine, once again, back to our business with its budget cuts: you didn't get the requisition signed for the market research analyst you wanted. So you designed one, and his name is Paul. All you have to do is send Paul an email. Here we have an email that says: please design the following survey. It asks what kind of use respondents make of the different tools, includes an open-ended question, and tries to collect their job title.

So all you do, just as you would if Paul were your human market research analyst, is send Paul an email. Our agent gets this email, it's automatically filed in a specific folder, and that triggers an AI automation, which goes through and designs the survey. It designs the survey in Google Forms in this instance, and you can see the form that goes out. It's a pretty competent online survey, sent out to a prescribed email list.
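The trigger-and-act pattern behind this "email Paul" demo can be sketched roughly as follows. The flow was actually built in a no-code tool like make.com; this Python version uses a stubbed-out AI step and hypothetical helper names purely to show the shape of the automation.

```python
# Sketch of the automation: a new email in a watched folder triggers a step
# that drafts a survey. The LLM call is stubbed; a real build would hit a
# model API, or wire the same flow in make.com / Zapier / n8n with no code.

def draft_survey(request_body: str) -> dict:
    """Stand-in for the AI step that designs the survey from the email text."""
    return {
        "title": "Tool Usage Survey",
        "questions": [
            "Which of these tools do you use, and how?",          # usage question
            "What else would you like to tell us? (open-ended)",  # open-ended question
            "What is your job title?",                            # job title field
        ],
    }

def on_new_email(email: dict, surveys: list) -> None:
    """Trigger: fires when an email lands in the watched 'survey-requests' folder."""
    if email["folder"] == "survey-requests":
        surveys.append(draft_survey(email["body"]))

outbox: list = []
on_new_email({"folder": "survey-requests", "body": "Please design the following survey..."}, outbox)
print(outbox[0]["title"])  # Tool Usage Survey
```

The second automation the speaker describes next (due date passes, data is analyzed, a report is produced) is just another trigger of the same shape.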
And once the due date has passed, that triggers a second AI automation, which goes through, analyzes the data and the sentiment, and produces a market research report. So this is where this is headed. In 2025 you will hopefully be creating your own team, so leave space on your org chart for AI agents.

So that's what I had. I'm going to give one more plug here and then hand things over quickly to Kim; I think we're right on the money in terms of time. This is the course that we're doing for the third time, and it's frankly all new. If you take the course, you'll have access to the updated content as things progress. It's called AI Applications in Marketing, and it runs in the spring; obviously marketing is going to be an early mover in the application. There's a link to the signup. We'll show off a little bit of the faculty that we've got.
Not all of these are confirmed for February, but these are all speakers we have used in the past, including people from Nvidia, Intel, three CEOs of ad agencies, 3M, and of course Kelley's own professors. So with that, I'll turn things over to Kim. Thank you very much for your kind attention, and send warm thoughts.

[Kim Allison] Thank you so much, Paul. Hi everyone, my name's Kim Allison with Kelley Executive Education. Thank you all for joining us today. I hope you've enjoyed this session. If you want to learn more about AI applications and dive a little deeper into some of these topics, we have a great course coming up called AI Applications for Marketing. Paul Acito and some outstanding faculty are going to be teaching this program. The course will be offered online with some live meetings Tuesdays and Thursdays. We have a QR code here on the slide, so feel free to scan it and take a look at more details about the course.
If you have any questions, feel free to reach out to me anytime; I'm going to put my email address in the chat. And again, if you're not familiar with Kelley Executive Education, we provide a range of courses and certificate programs to meet the needs of working professionals. We cover a number of topics around AI, leadership, operational excellence, finance, project management, and more. Anyway, I'll add my email to the chat, so if you have any questions, reach out anytime. And Paul, I'll hand it back to you.

[Paul] Great. We're at time, and that's fine, so you can get back to work. Or, if you're using AI, you can be more productive and stick around for some Q&A. I'd be happy to stick around for five, ten, fifteen minutes; I don't have another appointment until 12 our time.
If you want to put a question in the chat, especially about the course, or any other thoughts or things you'd like me to share, please do, because I can't really put you on speaker. Meanwhile, I'm going to brag a little bit: these are some of the comments our students have made about the AI in marketing course. And we're going to have an entire portfolio of courses: AI applications in business and strategy, Python for marketers. It's going to be a fun year.

[Voice in the background] Could you ask Kim to drop the link to register for the course in the chat?

[Paul] Hey, Kim, if you're still on, can you drop the link to register for the course in the chat?

[Kim] Absolutely.

[Paul] Oh, okay, you're still there. I was going to say, I could probably figure that out.

[Kim] Yeah, absolutely. I'll drop it in the chat for anyone that needs more information. And Paul, we have a couple of questions about privacy. Could you touch a little bit on that?
Candace was wondering: what's the privacy around the information you're feeding to AI? For example, is ChatGPT or Google capturing what you're asking for and connecting it back to your company?

[Paul] Yes. In the course, we cover privacy, bias, and security quite a bit. I'm going to answer: it depends. Privacy is a big deal, data privacy is a big deal, so you should always proceed with caution. But there are things you can do if you have a paid version. I'm not working for ChatGPT, but if you have a paid version of ChatGPT, you can toggle off the data sharing; in other words, they won't train on your data. I feel pretty good about that. Now, if you're working for a multibillion-dollar organization, you've got to go talk to your IT, marketing, and financial governance teams to do that. So the answer is: it depends. You can protect it.
Some of these experimental apps vary: NotebookLM is okay, but Google Deep Research is probably training on that data. So my advice, if you're using a free app: first of all, never put any sensitive data anywhere in a nonsecure system. And if you want to analyze P&L-type material, for example, don't put that into something like Stream Live at this point, because they could be training on it. You're going to have to read the fine print on some of these more experimental ones. But these companies are smart; they know you're not going to use their apps if they're stealing your data, so they'll make opting out available, though usually behind a paywall. So I think you have to do a deep dive. We cover all of this in the coursework: where and how to set it up. But you can set your preferences in the paid versions of ChatGPT and, likewise, Gemini so that they will not train on your data. That's pretty good.
And in the course, again, we give everybody a ChatGPT Team license and then walk everyone through how not to share the data. So it's a long answer, but it's a great question. What I would cap off with is this: don't let yourself or your organization use the privacy risk to slow adoption. There's a phenomenon called BYOAI, bring your own AI. If an organization thinks it's stopping its employees, students, or faculty from using AI because of the security risk, they just bring their own AI to work. So there's this whole movement, and the entire reason I'm doing these courses is to make sure you don't have to wait for the enterprise. If your enterprise is a year behind, that's like ten years in AI years, the way dog years work. So you have to continue to hone your skill set; this stuff is not waiting for you to figure it out. The skills you develop for prompting and for understanding the capabilities will keep paying off, because this is only going to get better. Anything else? Any other questions, Kim?
[Kim] No, I just wanted to quickly mention something I forgot before. For everyone that attended the webinar, we're offering a $500 discount if you sign up for one of our Kelley Executive Education courses, including the AI course we're talking about. If you use code PD500, which I added to the chat, you'll receive $500 off. I just wanted to mention that quickly in case anyone is interested.

[Paul] That's a serious discount.

[Kim] Yeah. And then we had a couple of questions asking you to talk a little more about the AI marketing course you're going to be offering in February. Could you go into a little more detail on that?

[Paul] Yeah, sure. It's going to be a little different this time around. We're going to have more emphasis on what we call playground activity, which is using these apps and learning how to develop the apps.
That's from feedback we've gotten from previous sessions, and we got great reviews, as you can see, but we also cover a great deal. We start out by looking at leadership in AI, because you have to watch this space. The fact that you're on this Zoom call today puts you in the top one half of one percent of people thinking about this in business right now. So you're not late, but you're behind, so to speak, if that makes any sense, because we're all behind. The course proceeds in four modules across eight sessions, so twice a week for 90 minutes, and I think we've done a very thoughtful job of how we arranged it. We start out with discussions around a little bit of history, but it's all about AI applications for business: privacy concerns, bias, security, some of the history, and who some of the leaders are.
Things like: did you know that if you add up the top AI companies, the combined market cap is 18 trillion dollars, which is the GDP of China? That's an example of the kind of investment that's going on. We walk through all of that, and the reason is that this course is geared for leaders. You don't need to be a CMO, but if you've been asked, or you would like, to lead the AI deployment, or at least help foster it in your organization, that's what this is for. I'm trying to train leaders, and we have something nobody else does: our GenAI code of ethics. You'll have an opportunity to sign it, and it's a commitment that says: look, I'm going to use AI for good. Because, as you can imagine, these are some very powerful tools, and their powers of persuasion, and their ability to scrape data from the internet or violate privacy, are incredible. We'd like Kelley-trained people, and certainly people trained on my philosophy of business, to use it for good.
We have Tim Lemper from our law school coming in; he's a Harvard Law graduate who talks about fair use and copyright, so you'll get some clarification on that. Again, it's not always a satisfying answer, because you still have to pay attention. We have Sarah Bellamy coming in from the Penumbra Center for Racial Healing to talk about coded bias and how we're training these systems to be biased, just like the internet and, frankly, society in some cases. We have a whole session on AI analytics; I tipped my hat earlier to the Bass model. All those formulas and all the complex multivariate statistics you were trained on in business analytics, the ones you used to have to walk down the hall and deal with a grumpy PhD in statistics to get anything done with, are all three clicks away now. It's just incredible. And then we have Kelly King coming in, who teaches innovation and communications at the Kelley School of Business at IU, and who will walk you through all of this.
So you can create your own ads, your own Facebook posts, Insta posts, and TikTok videos in a matter of seconds, with quality that's approaching Hollywood quality. And then we round out with what I call the five A's. I always forget one A, so let's see if I can do this: automations, assistants, agents, avatars, and APIs. I got them all. It's one step beyond custom GPTs: still no code, or potentially low code, but things you can do that will blow the socks off anything else, including a very big emphasis on those automations we just showed, so that you can create your own team of agents to go do stuff for you. And this is marketing at this point. Again, sorry for the long answers; did I mention I'm a lecturer? Anything else, Kim?

[Kim] We have one question asking: can you speak a little more to AI prompt engineering as a skill set, beyond the regular research that people may use in AI?

[Paul] Yeah. Prompt engineering is a skill.
I would not recommend investing in it as a specialty, though you do need to get good at prompting, especially in video generation and image generation, where there are some tricks. But the AI is increasingly doing the prompt engineering for you. Both OpenAI and Anthropic now have prompt creators. My experience is that you come up with a prompt that's, for lack of a better term, an amateur prompt, like when I do it. I put it into the Claude prompt generator and it comes out like code. It's unbelievable. You still have to go through it, and you should still know the principles of a good prompt: you need to be specific, you need to define a role, and other things. But this idea that there's a career path in prompt engineering, I don't buy. If you're planning on writing apps by using English prompts, you probably do need to be an expert at prompt engineering.
But for what we're doing with business, for the most part, prompt engineering is better handled by learning the principles, which we teach in the course, and then using the tools available in some of the frontier models to develop prompts. Does that make sense?

[Kim] Yeah, that's great. Paul, one last question. Someone asked: how familiar do I have to be with AI to prepare for the AI in Marketing course? Do you need any background, or what should they prepare?

[Paul] The answer is: come along, and I'll tell you why I wouldn't let that worry you for a second. Here's why: we have created, go figure, an AI agent. You will take a survey that's generated by AI and asks about your basic level of familiarity. In other words, it will ask questions like: do you have a paid subscription to ChatGPT? Because that tells us something about how you're using it.
Or: how familiar are you with prompt engineering, as an example? Then we generate a personalized report. So before module one, you might have to watch a movie or maybe read an article; it's pretty light homework. What do you know about the history? In fact, the only homework we're giving out at this point, I think, is a couple of really good movies on AI; not sci-fi movies, but documentaries. So the answer is: there are no prerequisites at this point. You don't need to know anything about Python, or any of that. It's a class of 30 to 50 students; a couple of people will be brand new, a couple of people will know more than I do and will share that with the class, and everyone else is in the middle. It's a really nice cohort, and the cohorts have pulled together in an incredible fashion, by the way.
They want to continue to connect, and we're going to offer those opportunities after the course. So, anything else, Kim? Is that a good answer?

[Kim] Great, yeah, I think that's it. If you have any other questions, you can go ahead and type them in the chat. If not, we hope you have a wonderful day, and thank you so much for joining us. I hope this was beneficial.

[Paul] Yeah, thank you. Thanks, everybody. Everybody send warm thoughts up to Minneapolis.

[Kim] Thank you all.

[Paul] Hang on, Kim, did we get another question? Is the course offered at other times? It depends: if we fill this one up, or if many of you say, hey, I can't make those dates, we'll schedule another one.

[Kim] Absolutely.

[Paul] And I'll revise my comment about prerequisites: if you don't like the sound of my voice, you probably shouldn't sign up, because you're going to get a lot of me.

[overlapping voices] Thank you all. Salud. Thanks. Bye. Take care. Thank you.

    Meet your instructors

    Frank Acito

    Frank Acito is professor emeritus of marketing at the Kelley School of Business. His areas of expertise include strategies for analyzing structural models, alignment of marketing and sales strategies, sampling characteristics, and the use of marketing research by managers. He served as director of the Kelley Institute for Business Analytics and as chair of the marketing department at Kelley. He earned a PhD and an MBA at the State University of New York at Buffalo and a BSEE from Cornell University.

    Paul Acito

    Paul Acito is CEO of Lyftbridge, an AI-infused marketing agency, and an adjunct faculty member at the University of Minnesota and St. Catherine University. Previously, he worked at 3M, where he began as a marketing intern and rose to the position of chief marketing officer. In his time at 3M, he worked in the United States, Thailand, Belgium, and Japan. Paul was also a vice president for the medical device manufacturer Medtronic, an instructor at the University of Minnesota, and an adjunct faculty member at Indiana University. He earned his MBA from the Kelley School.

    Kelly King

    Tim Lemper

    Tim Lemper teaches in Kelley’s Undergraduate Program, Full-Time MBA program, and Kelley Direct Online MBA program, as well as in Kelley’s partner MBA program at Sungkyunkwan University (SKKU) in Seoul, South Korea. He has also taught courses for the Media School, the Cybersecurity Risk Management program, and the Wells Scholars program at IU.

    Tim serves as Kelley’s faculty liaison for undergraduate student engagement, having previously served nearly a decade as chairperson of Kelley’s Academic Fairness Committee. He is also director of the Legal Advising Forum at the Johnson Center for Entrepreneurship & Innovation and is affiliated with the Center for Intellectual Property Research at IU’s Maurer School of Law.

    Tim is a co-author of a business law textbook, and his teaching and research focus on the law of unfair competition, intellectual property law, marketing law, First Amendment and constitutional law, and media law. In 2012, in direct response to Tim’s research, Public Law 112–190 was passed by Congress and signed by President Obama to amend the federal trademark dilution statute.

    Tim has a Juris Doctor from Harvard Law School, a Master of Science in Public Administration and Public Policy from the London School of Economics and Political Science, and bachelor’s degrees in political science, history, and English from Indiana University, where he also completed the Liberal Arts and Management Program. He is a former Harry S. Truman Scholar and Herman B Wells Scholar, and currently serves as chairperson of IU’s Nominating Committee for the Rhodes, Marshall, Mitchell, and Churchill Scholarships.

    Tim has a broad range of professional experience, including practicing as an intellectual property and First Amendment litigator, serving as a staff assistant to US Senator Richard Lugar, working as a consultant to the Air Staff for the United States Air Force at the Pentagon, interning in the British House of Commons, and involvement with several businesses and nonprofit organizations.

    Questions?

    Please reach out to us at kelleypd@iu.edu to learn more about the AI Applications in Marketing short course and other courses offered by the Kelley School of Business Executive Education program.

    Social media

    • LinkedIn for the Kelley School of Business Executive Education Program
    • Blog for the Kelley School of Business Executive Education Program
    • Copyright © 2026 The Trustees of Indiana University