… defense artificial intelligence. My name is Lieutenant Commander Arlo Abrahamson, and I'll be moderating today's press briefing. It's my pleasure to introduce the director of the Department of Defense Joint Artificial Intelligence Center, Lieutenant General Michael Groen. Lieutenant General Groen is joined today by Dr. Jane Pinelas, who is the Chief of Test and Evaluation for the JAIC, and Ms. Alka Patel, who is the Chief of Responsible AI. We'll begin today's press briefing with an opening statement, followed by questions. We've got people on the line and of course folks in the room, and I think we'll be able to get to everybody today. So with that, sir, over to you for the opening statement.

Thank you, Arlo. Well, good afternoon, and greetings to the members of the defense press corps. We're really glad to be here with you today. I hope many of you got the opportunity to listen in to at least some of the AI symposium and technology exchange that we had this week. It was our second annual symposium, and we had over 1,400 participants in three days of virtualized content. I want to say thank you first of all to all those senior leaders who participated in that dialogue over the past three days. We heard from senior leaders across the department, including Deputy Secretary Hicks and the Honorable Robert Work, former deputy secretary and currently the vice chair of the National Security Commission on Artificial Intelligence. Ms. Michèle Flournoy joined us this week as well; she has been a tireless advocate for AI in the policy community. A number of other senior defense officials participated too. We especially thank the Vice Chairman, General Hyten, and the U.S.
Special Operations Command commander, General Clarke, who brought not only their insights but the voice of the warfighter into our conversation; it was really valuable to have them here as part of the session. We have all benefited from the terrific work of the NSCAI, the National Security Commission on Artificial Intelligence, and remain grateful for their insights into how we truly achieve the modernized force that we need. Finally, we never forget the enormous support we get from Congress, who continue to recognize the transformational nature of our current challenges. Congress's steadfast support for the DoD's AI initiatives is one of the keys to victory.

The level of the dialogue and the participation in this symposium from senior leadership really demonstrates how seriously the department and our broader national security community take artificial intelligence, and the generational opportunity we have to preserve our military advantage through broad artificial intelligence implementation at scale across the force. As this symposium demonstrated, we have true leadership from the top in bringing data, artificial intelligence, and new technical approaches to our most difficult challenges. And that's really important, because the competition is clearly working hard on this. Many have cited the ruthless efficiency of totalitarian organizations like the Chinese Communist Party or Russia. America and our ethically aligned international partners have always counted on the innovation of free societies, and I'm happy to report after this symposium that the lights of American innovation are on, they're shining brightly, and they're really positioned to help us as we make our way through this transformation.
This is truly the challenge of a generation, and it's clear that our industry partners and industry leadership are equally concerned and engaged in meeting the demands of this competition. I hope you caught Deputy Secretary Hicks's keynote on Tuesday, where she discussed a brand-new AI and data acceleration initiative. That, along with her recent signing of a memorandum affirming the department's commitment to responsible AI, is really important. The juxtaposition of these two announcements clearly marks the department's intent to modernize our capabilities, but to always do so standing on a rock-solid ethical foundation, and I think it's a powerful signal that both of these things have happened in the last 30 days. Symposiums like these are important; they help us build and strengthen the AI ecosystem that will help drive broad transformation across the department. This is our vision, and it's what we focus on in the Joint Artificial Intelligence Center every day. Through fora like these, we broadly enable defense transformation by illuminating, ideating, and integrating our thinking about the transformation underway. At the symposium we talked about implementation, platforms, technologies, scale, data, and lots of other technical aspects of artificial intelligence implementation. But at the pointy end of the spear of those capabilities, the hard work of creating successful environments and implementing AI in dangerous, challenging warfighting environments right at the edge is what really matters, and that's what we wanted to focus on this week. Accelerating capabilities to our warfighters at the tactical edge was really at the heart of our conversation.
The explosion of innovation that we uncovered this week is really encouraging, and the seriousness with which this group takes on our most pressing national challenges is really humbling. As you may have heard on Tuesday from the Deputy Secretary, the AI and Data Accelerator seeks to expand our understanding. We want to understand things like latency challenges. We want to understand reliability and uptime requirements. We want to understand restrictive policy environments that may be holdovers from an earlier age yet still hold us back in our implementation of AI, especially at the edge. We want to discover the technical, bureaucratic, process, and cultural obstacles to change and remove them from the path of our warfighters. That's what the AI and Data Accelerator is all about, and I hope we have some questions about that later. We want to understand the challenges to implementation of this technology. I think there are a couple of things that you may have taken away from Deputy Secretary Hicks's comments, as I certainly did. The first is a department-level incentivization of this experimentation. The department leadership knows our challenges, and they want to accelerate the transformation. They have made AI a priority for resourcing, and we have an awareness on both sides of the Potomac, in Congress as well, that this transformation to data-driven, artificial intelligence, human-machine teaming is a really important transformation that we need for a modernized force. A second thing we heard Tuesday is that the ADA starts with real warfighting challenges.
Our combatant commanders have some of the most intense decision-making environments, but they have yet to have the opportunity to apply the latest tools to responsive decision support. We want to correct that, and we want to do it in a repeatable way. We also want to do it in a way that scales: if we make progress at one combatant command and help their decision processes, we expect to be able to rapidly scale those capabilities across other combatant commands to help their decision making as well. And we want to do that in a way that illuminates a path for software capabilities that might be different from our historic norm. The shifting balance from hardware to software-defined capabilities will really require us to think differently about how we approach development. Through the ADA, we are teaching ourselves how to implement software-based capabilities, how to support them in infrastructure, and how to achieve them at scale. A few things I know we will discover: this is not a transformation that you can make at the surface or with a series of shiny objects. We will have to dig deep into AI architecture, data curation, and network planning, and we'll have to ensure our development and operational platforms for decision support are secure, reliable, and tested. A new coat of paint will not get us the transformed decision-making and tempo-generating machine that a modernized defense capability demands. There are very clear implications of a transformed defense environment. Foundationally, it depends on transforming the Department of Defense's technical operating model. The business model of what the Department of Defense does for the nation doesn't change, but the operating model for how we accomplish those goals certainly will.
We should think of this as the beginning of a joint operating system. We might compare that to a specific vendor ecosystem or vendor architecture; there are many examples out there where the pieces fit together by design. This is what a new operating model looks like for defense: pieces that purposely fit together, situational awareness that is automatically generated and widely shared, any sensor available to feed any decision maker. The Deputy Secretary's AI and Data Accelerator sets us on that path. To be honest, there's very little magic here. We have multiple models to copy in the commercial environment, the industrial environment, and elsewhere. This is all about making the Department of Defense as productive and efficient as any modern, successful, data-driven enterprise. As we look at this, it's pretty easy to see the scale of the challenge that we face. In some ways this transformation will require an integrated operating environment that could actually make jointness look easy. Here's what I mean by that: operating with data and human-machine teaming in every domain, and integrated across domains, demands a level of process and technical integration and data commonality that far exceeds what we practice today. What we're talking about here implies a much higher level of integration in platforms, data, and domain awareness than our current standards. It is truly transformational, and it is truly necessary. We won't achieve it with a scattered yard of shiny objects and stovepiped developments. We need to begin planning and developing for a purposeful operating system that stitches our various capabilities together. We look forward to your questions on this, for sure.
Before we get into questions, and before I close here, I just want to acknowledge two really important leaders who are here with me today. One is Alka Patel, esquire if I may, whom I know some of you know very well. Alka has been an irresistible force in building our ethical foundations and baselines, and I hope you have some questions for her today. The other is Dr. Jane Pinelas, who is a thought leader and a real leading technical expert on the emerging discipline of AI testing and evaluation, a critical component of AI development and AI integration. I look forward to hearing from Dr. Jane, and I hope you have some questions for her as well. These women have led the JAIC and the DoD through critical junctures in our development, and I hope you take advantage of the opportunity to speak with them. We're grateful for their leadership and will continue to lean on them as we mature our responsible AI and testing and evaluation initiatives for the future. With that, we're very happy to take your questions. Thank you.

Thank you, General Groen. The first question will go out to Tony from Bloomberg News. Sir?

Excuse me. On test and evaluation: I've covered DOT&E a lot over the years. Are you working with them in terms of testing metrics and modeling criteria for determining effectiveness and suitability, what would pass and what would fail in terms of an AI construct? And for the general: can you give a couple of examples of where, in the next couple of years, AI might be fielded? You came up with this really neat U-2 copilot last year. Is the U-2 at some point soon going to have an AI copilot, basically?

Sure, great question, Tony. Jane, do you want to go first, please?

Sure. So we work with DOT&E extensively; we talk to them several times a week.
They are a very important partner for us in test and evaluation, as are the service operational test commands; we have the OSD component and then the service components as well. We primarily interact with their chief scientist, Dr. Greg Zacharias, and we work with them on a variety of issues: anything from test planning and updating test planning guidance, to how to actually write a test and evaluation master plan, to how you measure the security of an AI-enabled system, to operationally testing an AI-enabled system and the infrastructure that's involved, etcetera. So we work with them and coordinate with them probably a few times every week.

Do you have any current systems in testing you can talk about? Just give a couple of examples.

So the JAIC performs testing for all of the JAIC's acquired systems, and we are actually partnering with DOT&E for a couple of them. Our various systems are at different stages of development. Some of them, for instance, are force protection tools. We're currently testing both at the algorithm level, where we have vendor models and we evaluate whether those models are accurate in terms of their predictions against a withheld test data set, and also, to the extent that some of our systems are fielded now, we're able to evaluate their effectiveness with the human. We care about things like human-systems integration, and in fact the JAIC recently came out with our human systems integration framework, which we were able to distribute to all of our DoD test partners; it helps others evaluate their systems for human factors as well.

Okay, thanks. That's good. Great.
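To make that algorithm-level testing concrete, here is a minimal sketch of scoring a vendor model against a withheld test set. The file names, label column, accuracy threshold, and the use of scikit-learn are illustrative assumptions, not the JAIC's actual tooling:

```python
# Minimal sketch: scoring a vendor model against a withheld test set.
# File names, the label column, and the threshold are hypothetical.
import joblib
import pandas as pd
from sklearn.metrics import accuracy_score, classification_report

# Load a withheld test set the vendor never saw during training.
test = pd.read_csv("withheld_test_set.csv")          # hypothetical path
X_test = test.drop(columns=["label"])
y_test = test["label"]

# Load the vendor-delivered model artifact (hypothetical file).
model = joblib.load("vendor_model.joblib")

# Score predictions on the withheld data only.
y_pred = model.predict(X_test)
acc = accuracy_score(y_test, y_pred)
print(f"Held-out accuracy: {acc:.3f}")
print(classification_report(y_test, y_pred))

# Simple pass/fail gate against a pre-agreed requirement from the test plan.
REQUIRED_ACCURACY = 0.90                             # hypothetical threshold
print("PASS" if acc >= REQUIRED_ACCURACY else "FAIL")
```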
And then, just quickly, with respect to the types of AI that we'll continue to field: obviously there are layers of AI. Those AIs that run specific systems, like the copilot in the U-2, are mostly service-developed; they're service-led developments for specific pieces of equipment or weapons systems or other constructs. One of the things that we're really focused on, because we think it matches the maturity of AI technology today as we're implementing it, is decision support: teeing up good decisions for commanders, helping commanders make decisions based on sound data, either patterns in historical data or knowledge of things that are happening on the battlefield with the red force or with the blue force. If we can enable good decision making and have informed decision makers, we think that is the most significant application of artificial intelligence, and then we'll continue to go from there into other functions. The list is endless: moving logistics successfully around the battlefield, understanding what's happening based on historical pattern and precedent, understanding the implications of weather or terrain on maneuvers. All of those things can be assisted by AI. So you will see a rapid proliferation of really enabling tools for decision makers across the wide range of warfighting functions.

Is any of this part of the Pacific Deterrence Initiative, to field AI-enabled predictors over there?
I don't have a great example of anything that's part of the Pacific Deterrence Initiative; I'm not aware of what the Pacific Deterrence Initiative specifically is going to resource. Historically, we've had a European Deterrence Initiative for years, and in those kinds of environments combatant commanders have the ability to experiment with the implementation of capabilities that they didn't have before. So I don't know any specifics of PDI, but I suspect that those kinds of things are on the table at least.

Okay, we're going to go to the phones for the next question. The next question will go out to Sydney Freedberg, Breaking Defense. Sydney, go ahead. ... I believe there may be an audio issue, so I'll summarize. Sydney's question is about the AI and data initiative, and I think there are a few reporters that had similar lines here: how will the JAIC work with the chief data officer, and how will they work with the combatant commands? Some details on that, sir, and to the team.

Yeah, okay, great. Thank you, Sydney. So the ADA, the AI and Data Accelerator initiative, is something that is moving really fast, and frankly I think it makes a lot of the historical defense process kind of uncomfortable, because we're moving so quickly. At its core, it really started from a series of combatant command exercises where combatant commanders wanted to try new things. They wanted to experiment with data-driven decision making, making sense out of noise, and creating options for commanders to consider in execution.
This idea of rapid idea generation and support really drove us to a conversation about, okay, how do we really accelerate the data readiness of our combatant commanders and the artificial intelligence tools that they have at their disposal to make good decisions? The combatant commanders were chosen specifically because they have their own exercise environments, but they also have real decision environments, really the toughest decision environments of anybody, and yet they often don't have a lot of tools to deal with those kinds of things. So we wanted to help them with that. It was clear that there were two lines of effort, or two real problems, that we wanted to address. The first one was data readiness. As in any large enterprise, if you're going to use artificial intelligence and start bringing those sorts of data-driven tools to bear, you have to understand your data, you have to clean up your data, and you have to get the data where you want it. And so data curation, data conditioning, data quality control, and data management all become really important functions. Combatant commanders and their staffs are built to fight the U.S. joint force; they're not built to do those technical functions. They need help. So the first part of the ADA is to bring in data teams, operational data teams we'll call them, and they will work with combatant commands, command staffs, headquarters, and commanders to get the data in a good place, to explore all the sources of data that combatant commanders can use in their decision making, and then to create access to that data.
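As an illustration of the curation, conditioning, and quality-control functions described here, a minimal sketch of the kind of steps an operational data team might script; the source file, column names, and rules are hypothetical, and pandas stands in for whatever tooling such a team actually uses:

```python
# Minimal sketch of data conditioning by an operational data team.
# The source file, column names, and rules are all hypothetical.
import pandas as pd

raw = pd.read_csv("staff_reports.csv")               # hypothetical source

# Quality control: drop exact duplicates and records missing key fields.
clean = raw.drop_duplicates().dropna(subset=["report_id", "timestamp"]).copy()

# Conditioning: normalize types and formats so sources can be joined later.
clean["timestamp"] = pd.to_datetime(clean["timestamp"], utc=True)
clean["unit_name"] = clean["unit_name"].str.strip().str.upper()

# Curation: keep only the fields the decision workflow actually uses,
# and land them where downstream tools can reach them.
curated = clean[["report_id", "timestamp", "unit_name", "status"]]
curated.to_parquet("curated/staff_reports.parquet")  # hypothetical sink
```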
The second piece of this is the challenge of process flow. Today in our joint force we have many processes that are a series of stovepipes: individual efforts, with individual systems and individual sources of data, each contributing one piece of knowledge to a commander's decision-making environment. What we want to do is take all of those stovepipes and turn them into a collection of observations that are integrated and fused in a way that helps the commander make better decisions from a fused picture of what's going on, not having to assemble in his or her head the contributions of 30 different systems. So after cleaning up the data environment, we look at the workflows at a combatant command, and there are multiple workflows, as you might imagine, for all the different functions that occur under the auspices of that headquarters, that really could use machine assistance. And so we're going to help build machines that make the decision processes smoother, that make the processes of integration at the combatant command smoother, and help them with this. We're going to do that piece of it, the AI piece of it, with something that we call fly-away teams. We have a persistent engagement with our combatant command headquarters, but what we will do, when they're ready, linked into their decision cycles, their exercise cycle, their experimentation cycle, when they have the time to look at this, is fall in on their efforts and help them experiment in this space.
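The fused picture described here is, at bottom, a join of formerly stovepiped sources on a shared key. A minimal sketch under that assumption, with hypothetical file and column names:

```python
# Minimal sketch: fusing stovepiped observations into one picture.
# File names and the shared key ("unit_id") are hypothetical.
import pandas as pd

# Each stovepipe contributes one piece of knowledge about the same units.
radar = pd.read_parquet("curated/radar_tracks.parquet")
logistics = pd.read_parquet("curated/logistics_status.parquet")
intel = pd.read_parquet("curated/intel_reports.parquet")

# Outer joins on a common key yield a single fused view, so the commander
# reads one picture instead of assembling 30 systems in his or her head.
fused = (
    radar
    .merge(logistics, on="unit_id", how="outer")
    .merge(intel, on="unit_id", how="outer")
    .sort_values("unit_id")
)
print(fused.head())
```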
We're going to experiment with process flow, we're going to experiment with workflow, and if we can develop something that works well for them, ideally we'll leave it in place, and they'll be one step better than they were before. Then we'll come back and do it again, and make them one step better again. Through this series of experimental activities, we hope to really start to gain real capability. And if this sounds familiar, it's because this is a conventional software engineering approach. We're talking about largely software-derived capabilities, so it only makes sense for us to use a software engineering approach for testing, experimenting with, and implementing these capabilities. One of the magics of doing this in a software way is that if you can create opportunities for one combatant command to streamline their decision processes, that scales pretty readily to other combatant commands, who have very similar challenges. They may have different data, they may have a different theater, but the challenges and the staff actions are largely the same. And so we hope to be able to experiment rapidly and then scale across the joint force as we can.

Thank you, sir. Ma'am, thank you very much. About the limitations of AI: what are some areas that AI cannot do, and how will you cover those areas?

So that's a great question. We spend a lot of time thinking about what AI can do, and obviously any process that is data-driven, or requires inputs from a broad spectrum of data producers, is a place where it is very natural for artificial intelligence to help humans sort through large volumes of data. AI is good at that sort of thing.
And that's really the sweet spot for where we want to help commanders. AI may not be as useful in decision making that is integrated with humans and human emotions, working with individuals, I would submit. I mean, there are still ways that artificial intelligence can help in those interactions. The defense health enterprise, for example, has AI applications for lots of different aspects of treatment of a variety of illnesses, both physical and mental, and so there is a lot of work that AI can do to help doctors make better decisions and to help organizations make better policy. But those don't jump out at you quite as cleanly as the ones that work with tactical data, situational data on the battlefield, or logistics data; AI falls naturally into those very data-driven enterprises. So we're going to pick carefully which functions have the data to actually support a data-driven analytical engine, and which ones we may want to hold off on until we have more mature technology.

Okay, the next question goes out to Jackson Barnett from FedScoop. Go ahead, Jackson.

Thank you very much for doing this. My question is directed toward Ms. Patel. What is the implementation or other type of guidance that you have created for understanding the ethics principles and responsible AI? And how does the new memo signed by Deputy Secretary Hicks change or alter in any way the timeline for developing such guidance?
Sure, thanks, Jackson, thanks for the question, and I really appreciate your diligence in holding me accountable in terms of our efforts on responsible AI at the department. So, as you alluded to, the Deputy Secretary of Defense signed a memo on May 26 that's really focused on how we implement responsible AI at the department. In addition to affirming the AI ethics principles, which were adopted last year, this memo actually sets out six foundational tenets, and those foundational tenets lay out the structure for our strategy and implementation plan going forward. So we've taken a step forward there. To answer your question more specifically: I think you're aware we have a responsible AI subcommittee that was convened last year. We meet on a monthly basis, and we've met over 12 times over the last year, with representation from individuals across the department. So it's not just JAIC individuals who are working on and trying to solve this problem, but really a cross-sectional representation from the entire department, and all of those discussions are what led to identifying what those foundational tenets are. We've taken that step, and it came through a lot of learning and experimentation, as the general was talking about earlier. The next piece, as you'll see in the memo, is very specific action items and deliverables with corresponding timelines, and you'll see that many of those timelines are fairly short, in the sense that we recognize the urgency around this work and how important and critical it is. Therefore, by September or October we will have a final version of a responsible AI strategy and implementation plan for the department.
One other thing I will just add briefly is that in addition to that memo, this Tuesday at the symposium, both the Deputy Secretary of Defense and the general again highlighted the priority of responsible AI for the department, and we also announced the release of a responsible AI RFI, a request for information, through our acquisition vehicle, Tradewind. In that RFI, what we're asking for is for individuals from industry, academia, and nonprofit organizations, from all sectors, who have subject matter expertise, who have solutions, services, products, and best practices in the responsible AI area, to respond to that RFI. That information will actually inform and guide what the department needs to do to build that operating infrastructure that you were alluding to, general, and really build it across the department. Because what we've learned with AI is that it's not just about the technology; there are a number of different pieces that impact this, so we have to look at it holistically. That has really been the focus of our efforts, and it's come to a culmination in the last couple of weeks, between the memo and the RFI. We've also recently had the third convening of the AI Partnership for Defense, where we bring our international partners together, and responsible AI is at the heart of those conversations. There have been three convenings to date, and all of them have had responsible AI as a foundation.
Additionally, there are other efforts on the acquisition side, contracting vehicles that are to be released, and two of them specifically, one on data readiness and the other on testing and evaluation, are really critical vehicles in terms of how we actually operationalize the principles. The last thing I'll just mention is that talent is also something that we're thinking about. This is still a fairly new area, so how do we make sure that we are bringing in the necessary talent to think about all the areas of responsible AI, as well as internally thinking about workforce education, to upskill our workforce to really be able to address this issue? So there are a lot of different pieces that we're looking at and working on holistically. We're building this plane as we fly it, so to speak. There is no playbook; you've seen that the tech industry has been working on this for years, and there isn't one solution. But we are making progress, and hopefully by the end of the fall you will see a published DoD responsible AI strategy and implementation guidance.

If I can just pile on for one second. I think what was enormously encouraging to me, as the new administration came on board early this year and settled into their jobs: those of us in this business often get very excited about the technology and the technological aspects of it. But as Deputy Secretary Hicks took her position, her first impulse, her first attention to the artificial intelligence conversation, was all about ethical foundations.
And I was, as maybe a technology person, set back on my heels for a second there. But how encouraging that is, and what a wonderful way for us to start our interaction, to really put a mark down by this administration for responsible AI. An ethical AI baseline as a baseline for everything we do in this space is so critically important. I think it was just so insightful to make that the first thing that we did here in the department, and so I'm very encouraged by that. And as Alka can attest, we are really making good progress based on that baseline now.

And if I can add a little bit as well: we've been able to tie our human systems integration framework in a really big way to the responsible AI principles. That's an important tie-in, because it means that to the extent that some of responsible AI can be tested against our framework, we're not adding time or any kind of financial cost; it's something that we're doing already, because ultimately using AI-enabled systems responsibly is very much connected to using them effectively. To give you a couple of examples: we measure whether the warfighter has the information that they need to know, when they need to know it, in a way that they understand. That's a very common human factors question, and it also ties very much to the responsible principle and to the traceability principle.
When we talk about whether the operator can use the system to do precisely what they want to do with that system, in human factors we may call that usability or function allocation; in responsible AI terms, that's governability, et cetera. So there are a lot of these really important ties, which means that some of these principles are not going to be as difficult, as new, or as costly to assess as one might imagine, even though we didn't necessarily ask these questions of conventional systems previously.

That's right. The next question will go to Luis Martinez from ABC News. Go ahead.

Thank you, Arlo. General, you spoke about building new machines, but I think what I would like to know is, tangibly, how does one see AI in the military framework? How does a commander, a tactical commander, a warfighter, hear "AI"? I'm asking, do they grasp what it is, what they will see? Tony asked about an autopilot; that sounds like something that's internal to an aircraft. Is it tied to a broader network while it's in the air? What can someone actually see? Or is it, as you said, software-driven to the point where the only thing you will see is the team that is being created?

That's a great question. Part of the challenge of this transformation is that there's an educational aspect to it, to be sure.
What we're trying to accomplish, when I talked about an operating model and a defense operating system, is this idea of access to large volumes of data wherever they are, and being able to write applications against that data: to navigate through traffic or through some battlefield situation, or to make some decision about a supply movement or transaction. This comes so naturally to the younger members of our force. They grew up in this environment, writing apps against data; many of them do this as a hobby. So we have great swaths of the force that grew up as digital natives or near digital natives, and they understand this implicitly. Because their minds have been trained that way, they can see the advantages of things operating at scale. They see how large online marketplaces work, how having access to that data means being offered potential things to buy, maybe with recommendations based on things you've searched for before. They understand that implicitly, and all of those models fall right into military processes; for almost every commercial application there is a military analog, and those algorithms and the processes for using artificial intelligence just fall right in there. For older folks, not me, but people much older than me, it doesn't come as naturally. So some of the senior folks think of AI as a black box that's going to come in and make their decisions for them. We're getting past that, as a larger and larger proportion of the force really understands that, no, actually, we're taking your decision processes, we're taking all of the hard data work, and we're making that really easy for you. That's what this is all about. Commanders own decision processes, and what we're trying to do is give them tools to make better decisions based on data. The number of commanders who are now starting to appreciate how this works is growing rapidly. You can really see the light bulbs coming on just in the last six or eight months that I've been here in the department. It's incredible to me how fast, on a Department of Defense scale, this transformation is taking hold, and how more and more people understand that it's not just about the shiny objects, it's not about the black box. It's about the architecture and about decision making: responsible decision making, predictive decision making, based on a level of confidence that you get from understanding what's actually going on around you. Humans are famously really bad at operating on large volumes of data, but we're really good at intuiting kind-of-right answers. We're building both, and we're bringing them together, and a bigger and bigger swath of the department leadership, commanders especially, really are starting to take hold of this, and they want it. I think that is only accelerating in the department now. It's really exciting to see.

Okay, we'll go out to the phones. We have Will Knight on the line from WIRED. Will, go ahead.

Hello, thank you. Yes, I wanted to ask a question about the kind of data and tools that you'll be using from industry.
There was a report out of Georgetown a couple of days ago talking about the risks posed by AI data and tools built around that data. So I'm wondering, when you're going to be using a lot of tools and data coming out of industry, how are you going to be sure that that data hasn't been poisoned? One of the recommendations in this report is that you have a red team, a machine learning red team, to test tools to make sure that they cannot be used or misused by an adversary. So I'm wondering how you are going to be vetting that.

Great question, Will. I'm going to start, and then I'm going to turn it over to Jane here in just a second. I think there are a couple of aspects here that are really important. One, I would suggest that the idea that in a human-driven environment we have no risks is not true, right? In many cases, by bringing in algorithms and protecting and securing our data, we can actually get to ground truth and make better decisions without some of the risks that humans bring into the chain. I don't say that to downplay the risks of artificial intelligence, but in every aspect of this business there's the comparative: if we don't use machines, what do we do? So in this environment, when we start to bring in data, you're absolutely right. Just as when the first tank was invented, the next thing invented was an anti-tank grenade, and when the first ship was invented, the next thing was a cannonball or a missile or something that would sink a ship, in the evolution of AI, especially as it's applied to military systems, that same dynamic is surely going to be present. And so working through the dynamic of artificial intelligence, anti-artificial intelligence, and anti-anti-artificial intelligence, this cycle of development and of securing your data, is going to continue. We are highly cognizant of the research, and we're highly cognizant of the implementation. We have great relationships with academic environments and with commercial environments that really help us keep on the cutting edge, so we understand where the threats are and what the threats are. We will never be able to eliminate all threats in this competition of AI countering AI, but what we want to be is as informed as possible about what is possible at the cutting edge, and how we best secure our systems and make sure that we're still informing good decision making. Sorry, Dr. Jane, please.

To build a little bit on the general's statement: part of operational testing is testing your system in its realistic operational conditions, against a realistic adversary, with the information that is available. To that end, we test our systems for a variety of robustness and resiliency issues, the first one being resilience to cyber threats. Of course, there is also being resilient to even just natural perturbations. Think of a sensor, maybe in a computer vision problem: that sensor could get attacked itself, but it could also just be cloudy that day, or the image could be blurry for whatever reason. And then, of course, we actually have a red team at the JAIC that tests our systems with respect to real adversarial threats.
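To make that concrete, here is a minimal sketch, in Python, of the kind of natural-perturbation check described above: degrade the inputs the way a cloudy day or an out-of-focus sensor would, then compare accuracy before and after. The classifier and the synthetic imagery are stand-ins invented purely for illustration; nothing here is an actual JAIC model, dataset, or threshold.

import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def model(images: np.ndarray) -> np.ndarray:
    """Stand-in classifier: predicts class 1 when mean image brightness > 0.5."""
    return (images.mean(axis=(1, 2)) > 0.5).astype(int)

def blur(images: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Simulate an out-of-focus or hazy sensor."""
    return np.stack([gaussian_filter(im, sigma) for im in images])

def darken(images: np.ndarray, factor: float = 0.5) -> np.ndarray:
    """Simulate a cloudy day or low-light collection."""
    return np.clip(images * factor, 0.0, 1.0)

def accuracy(images: np.ndarray, labels: np.ndarray) -> float:
    return float((model(images) == labels).mean())

# Synthetic stand-in imagery: class-1 images are brighter than class-0 images.
labels = rng.integers(0, 2, size=200)
images = rng.uniform(0.0, 0.4, size=(200, 32, 32)) + 0.45 * labels[:, None, None]

for name, batch in [("clean", images), ("blurred", blur(images)), ("darkened", darken(images))]:
    print(f"{name:8s} accuracy: {accuracy(batch, labels):.2f}")

A fielded evaluation would of course run real models against operational data, and the red-team work goes further than this, crafting deliberately adversarial inputs rather than benign weather-and-optics perturbations.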
Having said that, once the model is actually deployed, once the tool is deployed, there are additional things that we worry about as far as robustness and resilience. We worry about data drift, we worry about model drift. These are all runtime monitoring types of questions, because monitoring these systems doesn't stop once the systems are deployed, which is kind of how we've traditionally tested things in the department. We partner on this with a few federally funded research and development centers, with DARPA, and with a couple of university affiliated research centers as well, because a lot of this research is both operational in nature and somewhat academic in nature too. So those are the important relationships there. And then, as far as data poisoning very specifically, at the JAIC we have a variety of operational data that we're able to share in a very secure way with our developers, so that their models are developed on extremely operationally relevant data. But of course now we also have the data decrees that recently came out from the CDO's office, which I think will be a nice next step to provide secure data sharing between organizations, because as we develop these data of utmost quality, we need to ensure their security, as you mentioned.

And if I could also add just a few additional comments in terms of some of the efforts that we're also doing, because data is critical to all those principles as we think about responsible AI. One of the efforts that we've done at the JAIC recently, and you've heard me talk about this before, is the use of data cards.
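As one illustration of the idea, here is a minimal sketch of what a data card might capture as structured documentation that travels with a dataset; the fields and values below are assumptions invented for this example, not an official DoD or JAIC schema.

from dataclasses import dataclass

@dataclass
class DataCard:
    name: str
    intended_use: str          # the use case the data was selected for
    provenance: str            # where the data came from
    collection_dates: str
    known_gaps_or_biases: str  # limitations reviewers should weigh
    train_test_split: str      # how training and testing data were separated
    approved_by: str           # governance sign-off

card = DataCard(
    name="coastal-imagery-v3",
    intended_use="Ship-detection experimentation only (hypothetical).",
    provenance="Archived sensor collection, 2018-2020 (hypothetical).",
    collection_dates="2018-01 to 2020-12",
    known_gaps_or_biases="Few night-time images; northern latitudes overrepresented.",
    train_test_split="80/20 split by collection date to avoid leakage.",
    approved_by="Data governance board, 2021-06 (hypothetical).",
)
print(card.intended_use)

The point of the artifact is that provenance, known limitations, the train/test separation, and the governance sign-off travel with the data rather than living in someone's head.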
If we go back to the earlier stages of the development lifecycle and think about when we're designing and developing, so when we're designing the use case: have we identified the right sets of data? Do we really need all that data? Have we looked at that data, and have we separated training versus testing data, and so forth? We use tools such as data cards for documentation purposes, and we also have a governance process. This all comes down to thinking through risk mitigation, right? How do we mitigate the risk as much as we can? How do we monitor, when it comes to runtime monitoring once systems are deployed, to really be able to identify when there is data drift, so to speak? And how do we build a robust governance system or structure around that, so that we make sure that our use of data and our selection of data are aligned with our principles and with the scope and intent of the projects as well? So the data decrees, but also the data ethics principles that are out there as well, all of these go hand in hand, frankly.

Thank you, ma'am. Would you like to ask a question?

Yeah. Can you talk a little bit about the concern about the competition with China and Russia in the field of AI, especially if there may be different ethical constraints?

Yes, it's a great question, thanks. Clearly one of the things that we pay keen attention to, as we've already talked about, is the sound ethical baseline on which we base our AI development. It begins with AI principles and works its way through things like test and evaluation; it works its way through responsible AI integration.
So we have this entire process that is built foundationally on trust, and the result of our process, beginning right from those very principles, is building in trust: building in trust through testing, building in trust through evaluation, building in trust through human systems integration, building in trust in operational employment doctrine, to make sure that we're using our AIs where it's appropriate and where they can be value-added; validating and verifying our AI algorithms so that we can be sure that they not only perform as they are designed to, but also perform to design in the context that we want them to and achieve the right effects that we want them to achieve. So we have this very complex, multi-step process under the heading of responsible AI that is just foundational to the way we do AI, right? Everything we do has to pass those tests. And as a result, we think that actually creates tempo for us, because a trusted AI is an AI that a commander or an operator will use, will be comfortable using, and will know when it can be used and when it cannot be used. I answer your question that way because some would contrast the speed and tempo of an authoritarian regime, like the Russian regime, that develops, say, a weaponized AI capability without the ethical baseline and sort of the self-questioning all the way through. Well, then you may not be able to use that weapon effectively. You may not have the trust of the operators or a commander that those things will be effective. We think that we actually gain tempo and speed and capability by bringing AI principles and ethics right in from the very beginning. And we're not alone in this. We currently have an AI Partnership for Defense with 16 nations, all of whom embrace the same set of ethical principles and have banded together to help each other think through and work through how you actually develop AI in this construct. We just had our third meeting of the AI Partnership for Defense with 16 nations; we just added three this last go-around. All of these nations want to approach the same AI development from the same ethical baseline, and so it's an enormously powerful team. When we get together, it's not just a talk shop about philosophy; we actually share real examples of how you can develop AI in ethical ways, how you can build trust in your operators, all of the aspects that make this effective. We think doing it this way makes our AI actually much more effective as a capability than it would be if we just handed out an algorithm that wasn't tested, that wasn't trusted, that we weren't sure where it would work or where it wouldn't work. That kind of distrust would come from AI development that's not based on those same kinds of principles, that doesn't adhere to the transparency and accountability of process that ours does. Thanks for the question.

We'll go out to the phones, to Inside Defense.

My question was actually already answered, so I'll hand it back.

Okay. Yes, of course, sir.
Yeah, I'm relatively new to the AI stuff. Can you explain what is ethical AI versus what might be unethical, with some practical examples, and what is the department doing to ensure that it always has ethical AI?

So I'll start with the easy stuff; I'll start with the ethical principles, right, because that's where we start this conversation. Our ethical principles include reliability, transparency, and equitability, and then move on from there, so that we have not only fair, transparent, and traceable AI algorithms, but also a sense of reliability: we know they work. Then that goes to the next AI principles, which include responsibility and governability. If you're building to those principles, first of all you're ensuring that your AI actually works as it's designed to work. You're assuring that you understand any biases those systems might have, and almost any AI will have a natural bias, or will grow a natural bias as it's trained over time; this is a real aspect of artificial intelligence development that you have to pay keen attention to. And if you understand how the AI helps your decisions, it's traceable, so that you actually know how the algorithm works and how it comes to the conclusions, or the predictions, it comes to. If you understand those things and you build your AI consistent with those principles, then you kind of graduate to the next step, which is: okay, does it do this in a responsible way, and does it do this in a governable way? Can you actually ensure that the AI is acting responsibly, in the human systems integration environment or in a verification and validation environment, where you're trying to test an AI algorithm in the context it is supposed to perform in? And then finally, is it governable? At the end of the day, can you pull the plug on it and decide: you know what, I don't trust the data that's coming out of this particular algorithm in this context, so I'm not going to use the algorithm for that decision making. That's the core of what ethical AI is, rather than a black box, where you listen to the black box and whatever the black box tells you to do, that's what you're going to do. That's an unethical application of artificial intelligence, in our mind. Okay, you can probably answer this question much better than I can, so please.

I think, job well done there, General. We could talk about this for hours, so I want to be mindful of time, but let me just go back for a quick second and say the department already has a strong, enduring foundation and history in ethics, right? So to go back to your question earlier: we are the U.S. Department of Defense, and we have that strong foundation; just because others may or may not doesn't mean we don't continue with our values and lead with our values. I just wanted to reiterate that point. When we talk about AI ethics, it's still a fairly new area, right? It's being developed, and there are sort of two ways to think about it. One is from the philosophical perspective, thinking about the context of: should we or should we not use AI for a certain use?
We've seen other countries that use it for tracking purposes, using facial recognition in instances where you would not want it to be utilized, in ways that are not consistent with our values. The other aspect of this is thinking about ethics from an applied perspective, which is what we're really talking about when it comes to our principles, right? The principles, the five principles the general is talking about, set out the values. The next step is: how do you actually take those values and make them into process-driven steps, into checkpoints, into guardrails? As we're building these technologies, which behave in ways that earlier technology did not, we need to make sure that we're building those safeguards into the process when we think about the use case and the potential harms it might cause, because this is a sociotechnical issue, right? This isn't technology where you're always going to get the same output all the time; that's not how this technology works. So we think about what those safeguards and guardrails look like for the use case, for the data, for the model, for the output. It's more process-driven, and the principles outline how we think about it from a higher-level values perspective. But the implementation, that strategy and implementation plan I was alluding to earlier, is: how do we identify the actual processes? How do we think about the people who are responsible for those different steps? How do we think about those process flows? And how do we think about governance, to make sure that we always stay consistent and in line with the principles?

Lee Hudson, Aviation Week. I understand with the ADA initiative you're working with the COCOMs, but I wanted to see how you're actually going to be interacting with the individual services. For example, will you be participating in the Army's Project Convergence or any of the ABMS demonstrations that the Air Force is doing? If you could talk about that.

We absolutely will, and we are partnered very closely with all of the service development efforts, and this is what gives us confidence as we go into this ADA environment. We have technologies in the JAIC that we think are going to be really helpful for combatant commanders, but we also know that we have a deep bench of AI capabilities that have been developed by the services, and those are also really good tools that we might want to bring into an ADA environment. So this is not just JAIC technology that we're talking about; we're talking about technology that's already been built and tested and employed by the services, that we might be able to bring to the combatant commands' decision-making space really quickly. So I think we are key partners with both the ABMS series of exercises and with Project Convergence; we have a great relationship with the services, and we're working closely with the Navy on their Project Overmatch. The thing about this technology in the department is that there are so many parties eager to get started on this journey that we have flowers blooming all over the place, right?
People are doing really good work, and what we want to do is, one, illuminate that: where somebody has something that's scalable from one service to another, or to a defense agency, or to a combatant command, we want to be kind of keepers of best practice, so that we understand what's available and can make it broadly available across the force. And then that works two ways, because as we proliferate, say, technology that the Navy is working on, and we talk to a defense agency about it, that defense agency might also have some best practices that we can bring back to the Navy. So one of the key elements of the JAIC, what we think is an important part of our mission, is to be this broker of best practice, right? We learn from the people we work with, and then we can teach the people we work with, again from that body of knowledge that we collect. So, absolutely. And the more we integrate, the better our collective output is going to be.

Thanks, Lee. And the last question of the day goes to Jared from Federal News Network. Jared, go ahead.

Okay, Jared is not on the line, so we'll go... I'm sorry, can you guys hear me now?

Oh, Jared, there you are.

Yeah, I didn't hit the mute button; my fault. Sorry about that. Thanks for doing this, everybody. I wanted to go back to the ADA initiative and the OCONUS cloud strategy the department just put out recently. It talked in a fair amount of detail about some of the challenges COCOMs have just in terms of basic IT infrastructure: a lack of access to commercial cloud services, reach-back to CONUS. And I'm just curious, if that's right,
how much is that going to be a hindrance to these fly-away teams? I mean, can they do real, meaningful AI implementation if they're working in just a basic, nuts-and-bolts IT environment that's kind of primitive and siloed by modern standards?

Great question, Jared, and honestly that's the reason we're doing ADA, right? What we want to do is experiment in the environments where we expect our algorithms to work. You can do it in a lab, but when you bring that lab-tested capability out to the combatant commander, or out somewhere on the tactical edge, you're going to realize: holy cow, the latency here is horrible, or it's intermittent; holy cow, the reliability and the uptime of the servers that I require are not sufficient. By doing this ADA experimentation in the places where we expect our algorithms to work, we will discover the bureaucratic obstacles, the cultural obstacles, and the technical obstacles to making these things successful, and then we can bring that back. So we're great partners, we mentioned the CDO and their role in the data enterprise, but we're also great partners with the CIO, right, the chief information officer, who actually owns, operates, builds, and fixes these networks. We can use what we observe in the real working environment to help inform upgrades to networks, upgrades to architecture, re-architecting things that may have to be completely redone in a data-driven environment. And then also, from a policy perspective, maybe it's insufficient to have an authority-to-operate structure where you make decisions about what can go on what network in a very deliberate way; maybe that's something we could update. That's the kind of stuff that we hope to understand: policy obstacles, cultural obstacles, technical obstacles, network obstacles. If we learn what those obstacles are, then we can address the real problems of AI implementation. And, I'm sorry to continue to go on here, but I think it's a really important question, because we can do design documents in the lab and build AI in the lab forever, but until we can actually employ it in the environments it is expected to operate in, where it is expected to work, we're not going to know, and that's unacceptable to us. ADA is exactly designed for that purpose: we can find out for sure, does this work or does it not work? Thanks for the question.

Can I follow up real quick, just to ask: do you know how soon these teams are going to deploy, and where you're going to recruit from to actually staff them up?

Yeah. So we're going to push our first data reinforcements out, I think, within 30 days or so, and we'll be working with combatant commands on fly-away teams, largely from the JAIC, plus folks that will contract to come with us, within 60 or 90 days. So this is coming really quickly. And as I indicated before, combatant commands are busy people and busy staffs, and they have a lot of things going on.
They have large chunks of the world that they're responsible for. So we have to be very attuned to what their battle rhythm is and when they're available to commit to experimenting with us, and we'll align to their schedule to make sure that it works effectively for them. And then we'll report the results across the force. Through repetition, we expect to do this about once a quarter, where we'll actually get into an experimentation cycle with a combatant command and try to build capability about once a quarter. The data aspect of this is a little bit more continuous, so we'll have people there on a long-term, steady-state basis to help them shape their data and get their data into the right condition.

Can I ask a quick question before you wrap?

Okay, who's on the line, please?

This is Jack Poulson from Tech Inquiry.

Hi, Jack. Go ahead, sir.

So there have been several companies that the Department of Defense has procured from, whether X-Mode Social or Clearview AI or others, that have arguably violated the AI principles' commitment to auditable data trails, whether that's X-Mode Social having sourced some of its data reportedly from a Muslim prayer app, or Clearview AI being sued in the state of Illinois for the way it scraped social media. I guess I'd be curious if you could detail what the retrospective process might have looked like for those companies, in doing an analysis of whether these companies have met the Defense Innovation Board's recommended AI principles.

Yeah, sure. I'm happy to try to address this.
At this 1635 01:01:07,080 --> 01:01:09,302 principles , right , Which were founded 1636 01:01:09,302 --> 01:01:11,302 on on as you alluded to the defense 1637 01:01:11,302 --> 01:01:13,469 innovation boards uh recommendations . 1638 01:01:13,469 --> 01:01:16,230 Um So so one of our efforts actually 1639 01:01:16,230 --> 01:01:19,180 that we are doing through our trade 1640 01:01:19,180 --> 01:01:22,390 wind project at the jake is actually 1641 01:01:22,390 --> 01:01:24,612 looking at how we are going to build in 1642 01:01:24,612 --> 01:01:27,290 responsible A . I . Recommendations and 1643 01:01:27,290 --> 01:01:29,340 practices within the Ai acquisition 1644 01:01:29,340 --> 01:01:32,510 process . So to your point , I think as 1645 01:01:32,510 --> 01:01:34,677 we work with vendors , we want to make 1646 01:01:34,677 --> 01:01:37,220 sure that their processes , their 1647 01:01:37,220 --> 01:01:39,442 practices are there principles are that 1648 01:01:39,442 --> 01:01:41,700 they may have on their end aligned with 1649 01:01:41,700 --> 01:01:44,230 ours . And so often times we talk about 1650 01:01:44,230 --> 01:01:46,063 technical interoperability , but 1651 01:01:46,063 --> 01:01:48,230 there's also this aspect of principles 1652 01:01:48,230 --> 01:01:50,341 and practices interoperability and so 1653 01:01:50,341 --> 01:01:52,452 um what we're looking to do and we're 1654 01:01:52,452 --> 01:01:54,230 working with the responsible Ai 1655 01:01:54,230 --> 01:01:57,660 institute via our trade Wind um vehicle , 1656 01:01:57,670 --> 01:02:00,170 is to actually map out the ai 1657 01:02:00,170 --> 01:02:02,410 acquisition lifecycle and find all the 1658 01:02:02,410 --> 01:02:04,632 different various entry points where we 1659 01:02:04,632 --> 01:02:07,420 can ask certain questions right ? Um 1660 01:02:07,430 --> 01:02:09,652 and do our own due diligence . So , for 1661 01:02:09,652 --> 01:02:11,319 example , understanding these 1662 01:02:11,319 --> 01:02:14,730 organizations practices around um data , 1663 01:02:14,730 --> 01:02:17,230 how they are , how they are are doing 1664 01:02:17,230 --> 01:02:20,330 their own data governance , um where 1665 01:02:20,330 --> 01:02:22,274 the data is coming from , thinking 1666 01:02:22,274 --> 01:02:24,510 about and asking about their own ethics 1667 01:02:24,510 --> 01:02:26,510 maturity assessment . So does that 1668 01:02:26,510 --> 01:02:29,380 organization have principles even and 1669 01:02:29,380 --> 01:02:31,602 if so , what are they doing to put them 1670 01:02:31,602 --> 01:02:33,713 into practice , understand what their 1671 01:02:33,713 --> 01:02:35,713 supply chain might look like from a 1672 01:02:35,713 --> 01:02:37,713 responsible ai perspective . And so 1673 01:02:37,713 --> 01:02:39,769 that's actually something that is on 1674 01:02:39,769 --> 01:02:41,824 top of mind for us and that's one of 1675 01:02:41,824 --> 01:02:41,820 our efforts that we're working on 1676 01:02:42,000 --> 01:02:44,111 through our trade wind project and so 1677 01:02:44,111 --> 01:02:47,430 hopefully um again , uh as part of the 1678 01:02:47,430 --> 01:02:50,230 memo that has a number of deliverables , 1679 01:02:50,230 --> 01:02:52,008 there is one that is focused on 1680 01:02:52,008 --> 01:02:54,100 acquisition and I think that uh 1681 01:02:54,110 --> 01:02:56,430 deliverable , which is you uh sometime 1682 01:02:56,430 --> 01:02:58,670 in the fall , early fall address , is 1683 01:02:58,670 --> 01:03:00,781 really what you're trying to get at . 
Yeah, I think that's a great question, and I'm glad you asked, because, just as evidenced this afternoon, we often think of AI ethics, AI ethical principles, and responsible AI in the context of the algorithms themselves. Right? And one of the most important pieces that we're doing, and Alka alluded to it here, is building an AI acquisition capability that produces AI acquisition experts, so that in the future, as this capability matures, we'll actually have people who are trained to look at that sort of thing, and I could see these kinds of questions becoming part of the source selection criteria for vendors as we bring vendors onto new projects. And so I think that speaks to a systemic implementation of responsible AI principles and ethical principles right from the get-go, before we ever even bring somebody on contract to do this. So we can certainly look inside our own house, and I think we're getting better at learning how to look outside our house to make sure that the wrong kind of practices don't come into the tent.

Okay, thank you very much. That will conclude today's press conference. Some of you have my contacts if there are follow-ups; for those who don't, you can contact the OSD Public Affairs Duty Officer, and they'll connect you with me. So thank you very much for attending.

Can I make one more comment? So, hey, for those of you who are regular members of the circuit here: we in the JAIC have been served just tremendously by our Public Affairs Officer, Commander Abrahamson.
And I'll tell you what, he has done just a fantastic job, and I hope you've had the same experience. For me, he's been thoughtful, he's been very patient, and he's pulled these events together for a long time. Arlo is going to be moving on. He's moving up, you know, one success after another for this guy. He's going to go back to work for the Navy for a little while, and we hope we see him again back in the joint force here soon. You know, maybe he'll take Mr. Kirby's job someday. But I tell you what, just in the presence of those of you who have worked with Arlo, I want to say: thank you, Arlo, for what a great job you've done as a PAO.

Thank you, I appreciate it. My pleasure.

All right, thank you very much, everybody.