Good afternoon, everybody. We had a little bit of technical difficulty there. Before we get started, I'll do just a really brief roll call. As most of you know, I'm Lieutenant Commander Arlo Abrahamson, and I'll be moderating today's press conference. Today's press conference will cover artificial intelligence capabilities and initiatives in defense. Our host for today's press conference is Lieutenant General Michael Groen, the director of the Joint Artificial Intelligence Center. We'll begin today's briefing with opening remarks from General Groen, and then we'll continue with a question and answer session. We do have a hard stop today at about 1340 because of another event that will be going on here at two o'clock, so we just ask everybody to keep their questions brief: just one question and, at the most, one very brief follow-up question. I don't have the list that I normally have, so I'll just go ahead and do a quick roll call before we start the opening remarks, and please identify yourselves and your news organizations when you ask a question. So if I could do a quick roll call out to the lines really quick, go ahead out to the phones.

Okay, I think we still have a technical issue. So what we'll do is we'll go ahead and get started. General Groen will deliver his opening remarks, and we will try to get the phone lines patched in here. So without further ado, sir, if you could deliver your opening remarks and get us started.

Thanks, Arlo. Okay, good afternoon. Welcome. I'm Mike Groen, lieutenant general, United States Marine Corps. I'm the new director of the Joint Artificial Intelligence Center, the JAIC. I'm very glad for the opportunity to interact with you, and I look forward to our conversation today.
It's my great privilege to serve alongside the members of the JAIC, but also the much larger numbers across the department who are committed to changing the way we decide, the way we fight, the way we manage and the way we prepare. It's clear to me that we do not have an awareness problem in the department, but like with any transformational set of technologies, we have a lot of work to do in broadly understanding the transformative nature and the implications of AI integration. We're challenged not so much in finding the technologies we need, but rather in getting about the hard work of AI implementation.

I've often used the analogy of the transformation into industrial-age warfare: literally lancers riding into battle against machine guns, flying machines that scouted positions or dropped bombs, massed long-range artillery, or even poison gas used as a weapon at an industrial scale. That transformation, which had been underway for decades, suddenly coalesced into something very lethal and very real, an understanding that came at great cost. Another example is blitzkrieg, literally lightning war, which leveraged technology known to both sides but was used by one side to create tempo that overwhelmed the slower, more methodical force. In either case, the artifacts of the new technological environment were plain to see in the society that surrounded the participants. These transformational moments were eminently foreseeable, but in many cases not foreseen. I would submit that today we face a very similar situation. We're surrounded by the artifacts of the information age. We need to understand the impacts of this set of globally available technologies on the future of warfare. We need to work hard now to foresee what is foreseeable.
We have a tech-native military and civilian workforce that enjoys a fast-flowing, responsive and tailored information environment at home when they're on their mobile phones. They want that same experience in the military and department systems that they operate. Our warfighters want responsive, data-driven decisions. Our commanders want to operate at speed and with a mix of manned and unmanned capabilities. Our citizens seek efficiency and effectiveness from their investments in defense. Artificial intelligence can unlock all of these. We're surrounded by examples in every major industry of data-driven enterprises that operate with a speed and efficiency that leaves their competitors in the dust. We want that. Most important of all, we need to ensure that the young men and women who go in harm's way on our behalf are prepared and equipped for the complex, high-tempo battlefields of the future. I often hear that AI is our future, and I don't disagree with that. But AI also needs to be our present. As an implementation organization, the JAIC will continue to work hard with many partners across the department to bring that into being.

So let me just talk a little bit about our priorities in the JAIC today, and then you can ask questions. In JAIC 1.0 we helped jump-start AI in the DoD through pathfinder projects we called mission initiatives. So over the last year, year and a half, we've been in that business. We developed over 30 AI products, working across a range of department use cases. We learned a great deal and brought on board some of the brightest talent in the business. It really is amazing. When we took stock, however, we realized that this was not transformational enough. We weren't going to be in a position to transform the department through the delivery of use cases.
In JAIC 2.0, which is what we're calling our effort now, we seek to push harder across the department to accelerate the adoption of AI across every aspect of our warfighting and business operations. While the JAIC will continue to develop AI solutions, we're working in parallel to enable a broad range of customers across the department. We can't achieve scale without having a broader range of participants in the integration of AI. That means a renewed focus on the Joint Common Foundation, which most of you are familiar with: the DevSecOps platform that is the key enabler for AI advancement within the department. It's a resource for all, but especially for disadvantaged users who don't have the infrastructure and the tech expertise to do it themselves. We're recrafting our engagement mechanism inside the JAIC to actively seek out problems and help make others successful. We will be more problem pull than product push.

One thing we note is that stovepipes don't scale, so we'll work through our partners in the AI Executive Steering Group and the subcommittees of that group to integrate and focus common architectures, AI standards, data-sharing strategies, educational norms and best practice for AI implementation. We'll continue to work across the department on AI ethics, AI policy and AI governance, and we'll do that as a community. We'll also continue to work with like-minded nations to enhance security cooperation and interoperability through our AI Partnership for Defense. All of the JAIC's work comes back to enabling that broad transformation across the department. We want to help defense leaders see that AI is about generating essential warfighting advantages. AI is not IT. It's not a black box that a contractor is going to deliver to you.
It's not some digital gadget that an IT rep will show you how to log into. Our primary implementation challenge is the hard work of decision engineering. It's commanders' business at every level and in every defense enterprise: how do you make your warfighting decisions? What data drives your decision-making? Do you have that data? Do you have access to it? It's driving leaders to think, you know, I could make a better decision if I knew X. The JAIC wants to help leaders at every level get to that X. We want data-informed, data-driven decisions across warfighting and functional enterprises. We want to understand the enemy and ourselves and benefit from data-driven insights into what happens next. We want the generation of tempo to respond to fast-moving threats across multiple domains. We want recursive, virtualized war gaming and simulation at great fidelity. We want successful teaming among manned and unmanned platforms, and we want small-unit leaders who go into harm's way to go with a more complete understanding of their threats, their risks, their resources and their opportunities. We're grateful to Congress, we're grateful to DoD leadership, to the enthusiastic service members who are helping us with this, and to the American people for their continued trust and support. I really appreciate your attention and look forward to your questions. Thank you very much.

Thank you, sir. Appreciate that. We'll go to the phones now. The first question is gonna come from Sydney Freedberg from Breaking Defense.

Hello, General, Sydney Freedberg here from Breaking Defense. Thank you for doing this. And apologies if we ask you to repeat yourself a little bit, because those of us on the phone line weren't dialed in until you started speaking.
You've talked repeatedly about the importance of this being commanders' business, AI being commanders' business, about the importance of this not being seen as, you know, nerd stuff. How do you actually socialize and institutionalize that across the Defense Department? Clearly there's a lot of high-level interest from your service chiefs in AI, certainly a lot of lip service at least to AI, and people put it in the briefing slides. But how do you really familiarize, you know, not the technical people but the commanders, with the potential of this, working as the JAIC with apparently a limited number of people? You can't send a missionary out to every office, you know, in the Pentagon to preach the virtues of AI.

Great, great question, Sydney. And so this really is the heart of the implementation challenge: getting commanders and senior leaders across the department to really understand that this is not IT. AI is not IT. This is warfighting business. It is assessment and analysis, analysis of warfighting decision-making or enterprise decision-making in our support infrastructure and in our business infrastructure. If you understand it that way, then we open the doors to much better and much more effective integration into our warfighting constructs, our service enterprises and our support enterprises across the department, and we really start to get traction. This is why we focus on the Joint Common Foundation, because of what we find. I think there are two aspects that are important. There's the Joint Common Foundation, which provides a technical platform. So now we have a technical platform. It'll become IOC early in 2021, and then we will rapidly change it.
We expect to do monthly updates of tools and capabilities to that platform. But that platform now provides a technical basis, especially for disadvantaged users who don't have access to data scientists, who don't have access to algorithms, who are not sure how to leverage their data. We can bring those folks to a place where now they can store their data. They might be able to leverage training data from some other program. We might be able to identify algorithms that can be repurposed and reused, you know, in similar problem sets. So there's that technical piece of it. There's also what I call the soft services side of it, which is that now we help them with AI testing and evaluation, for verification and validation, those critical AI functions, and we help them with best practice in that regard. We help them with AI ethics and how to build an ethically grounded AI development program, and then we create an environment for sharing all of that through best practice. In addition to the platform piece of this, we're building what we call our missions directorate. We're recrafting that to be much more aggressive in going out to find those problems, find those most compelling use cases across the department, that we can then bring back home: help that user understand the problem, help that user get access to contracting vehicles, help that user get access to a technical platform, and do everything we can to facilitate a thousand AI sprouts across the department, so that it really starts to take hold and we start to see the impact on decision-making.
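[Editor's note: to make the idea of algorithms being "repurposed and reused in similar problem sets" concrete, the following is a minimal, hypothetical Python sketch of one common pattern: fine-tuning a model trained for one task on a new but related problem. The pretrained backbone, the class count and the data loader are illustrative assumptions for the example only; this is not an actual JAIC or Joint Common Foundation interface.]

```python
# Hypothetical illustration only: reusing a model trained elsewhere for a
# related problem set, in the spirit of "repurpose and reuse" described above.
# The backbone, class count and data loader are placeholders, not DoD assets.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network trained on some other, larger dataset (an ImageNet
# backbone stands in here for "training data from some other program").
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the reused feature extractor so only the new task head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer for the new, similar problem set (e.g., 5 classes).
num_new_classes = 5
model.fc = nn.Linear(model.fc.in_features, num_new_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def fine_tune(model, data_loader, epochs=3):
    """Train only the new head on the consumer's own labeled examples."""
    model.train()
    for _ in range(epochs):
        for images, labels in data_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
```

[In this pattern only the small new output layer is trained on the consumer's own data, which is one reason reuse can help teams that lack large labeled datasets or resident data scientists.]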
Thanks, sir. The next question is coming from Khari Johnson from VentureBeat. Khari, if you're still on the line, go ahead, sir.

Okay, Khari is not on the line, so we're gonna go to the next question, which is from Jasmine from National Defense. Jasmine, if you're on the line, go ahead.

Thank you, sir. You know, defense companies face a volley of attacks from adversarial nations attempting to steal their IP and get peeks at sensitive information. How is the JAIC keeping the important work it does with industry safe from these countries or bad actors who may want to steal and replicate it?

Yeah, great question, Jasmine. And you know, we're reminded every day that the artificial intelligence space is a competitive space, and there are a lot of places where we compete. Probably the first thing I would throw out there is cybersecurity. Obviously, we participate along with the rest of the department in our cybersecurity initiatives here in the department to defend our networks, to defend our cloud architectures, to defend our algorithms. But in addition to that, we have developed a number of cybersecurity tools that can help industry detect those threats. And then the third thing I'd throw on there is our effort now to secure our platform, so obviously we'll use defense-certified accessibility requirements. What we're focused on is building a trusted ecosystem, because one of the things that will make this powerful is our ability to share. So we have to be able to ascertain our data; we have to know its provenance. We have to know that the networks we pass that data on are sound and secure. We have to create an environment where we can readily move, you know, through containerization or some other method, developments or code that's done on one platform to another platform. So to do all of this securely and safely is a primary demand signal on the Joint Common Foundation, and it is on all of our AI developments across the department and on the other platforms that are out there across the department. We are wide awake to the threat posed by foreign actors, especially those who have a proven track record of stealing intellectual property from wherever they can get their hands on it. We're going to try to provide an effective defense to ensure that doesn't happen.
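[Editor's note: one simple, generic way to "ascertain our data" and track its provenance before it is shared, as described above, is to record a cryptographic checksum and origin metadata for each dataset. The sketch below is a hypothetical Python illustration; the file layout and metadata fields are invented for the example and are not a JAIC or DoD specification.]

```python
# Hypothetical sketch: recording provenance and an integrity checksum for a
# dataset before it is shared, so a recipient can verify it was not altered.
# Paths and metadata fields are illustrative placeholders only.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(dataset: Path, source: str, handling: str) -> Path:
    """Write a provenance manifest alongside the dataset file."""
    manifest = {
        "file": dataset.name,
        "sha256": sha256_of(dataset),
        "source": source,        # where the data came from
        "handling": handling,    # sharing caveats / access controls
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    out = dataset.parent / (dataset.name + ".manifest.json")
    out.write_text(json.dumps(manifest, indent=2))
    return out

def verify(dataset: Path, manifest_path: Path) -> bool:
    """Recompute the checksum and confirm the data matches its manifest."""
    manifest = json.loads(manifest_path.read_text())
    return sha256_of(dataset) == manifest["sha256"]
```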
Okay, the next question is gonna go out to Brandi Vincent from Nextgov. Go ahead, ma'am.

Hi, thank you so much for the call today. My question is on the Joint Common Foundation. You mentioned the soft services that it'll have, and I read recently that there will be some to keep users aware of ethical principles and other important considerations that they should make when using AI in warfare. Can you tell us a little bit more about how the platform will be used with the Pentagon's ethical priorities, and, from your own experience, why do you believe that that's important?

Yeah, great, great question. I really think this is so important, and I'll tell you, I didn't always think that way. When I came into the JAIC job, I had my own epiphany about the role of an AI ethical foundation in everything that we do, and it just jumps right out at you. Many people might think, well, yeah, of course, we do things ethically, so when we use AI we'll do that ethically as well. But I think of it through the lens of the law of war: the determination of military necessity, limiting unnecessary suffering, all of the principles of the law of war that drive our decision-making actually have a significant impact on the way that we organize and fight our force today. And you can see it. The fact that we have a very mature targeting doctrine, and a targeting process that is full of checks and balances, helps us to ensure that we're complying with the law of war. This process is unprecedented, and it is thoroughly ingrained in the way we do things. It changes the way we do business in the targeting world. We believe that there's a similar approach for AI and ethical considerations. So when you think about the AI principles, the ethical principles, these things tell us how to build AI and then how to employ it responsibly. So when we think about building AI, we want to make sure that our requirements and our outcomes are traceable. We want to make sure that it's equitable. We want to make sure that our systems are reliable, and we do that through test and evaluation in a very rigorous way. But then we also want to ensure that as we employ our AI, we're doing it in ways that are responsible and that are governable, so that we know we're using an AI within the boundaries for which it was tested, for example, or we use an AI in a manner where we can turn it off, or we can ask it, in some cases: hey, how sure are you about that answer? What is your assessment of the quality of the answer you provide? An AI gives us a window to be able to do that.
Honestly, we and the nations we're working with in our AI Partnership for Defense really are kind of breaking ground here in establishing that ethical foundation, and it will be just as important and just as impactful as the application of the law of war is on our targeting doctrine, for example. So if you have that, it's really critical. Now, there are not that many experts, ethicists, who really understand this topic and can communicate it in a way that helps designers design systems, helps testers test systems, and helps implementers implement them. And so we have some of them in the JAIC. They're fantastic people, and they punch way above their weight. We're really hoping to give access to their expertise across the department by linking it to the Joint Common Foundation. Thanks for the question. I think that's a really important one.
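[Editor's note: a purely illustrative sketch of the "how sure are you about that answer" idea General Groen describes: a model wrapper that reports a confidence score with every prediction and abstains, deferring to a human, when confidence falls below a threshold. The model interface (a scikit-learn-style predict_proba), the labels and the threshold are assumptions made for the example, not any fielded DoD system.]

```python
# Hypothetical sketch: a classifier wrapper that reports how confident it is
# and abstains (defers to a human) when that confidence is below a threshold.
# The model, labels and threshold are illustrative placeholders only.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class Decision:
    label: Optional[str]   # None means "abstain and defer to a human"
    confidence: float      # the model's own estimate of answer quality

class GovernableClassifier:
    def __init__(self, model, labels, min_confidence: float = 0.8):
        self.model = model                  # any model exposing predict_proba()
        self.labels = labels
        self.min_confidence = min_confidence

    def decide(self, features: np.ndarray) -> Decision:
        """Return a prediction plus confidence, or abstain if unsure."""
        probabilities = self.model.predict_proba(features.reshape(1, -1))[0]
        best = int(np.argmax(probabilities))
        confidence = float(probabilities[best])
        if confidence < self.min_confidence:
            return Decision(label=None, confidence=confidence)
        return Decision(label=self.labels[best], confidence=confidence)
```

[A result like Decision(label=None, confidence=0.54) is the system's way of saying it is not sure enough to act, which is one concrete form of the "governable" principle mentioned above.]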
Okay, the next question goes out to Jackson Barnett of FedScoop. Jackson, go ahead, sir.

Hi, thank you so much for doing this. Could you say what your expectation is, or even the baseline requirements, for what everyone needs to understand about AI? When you talk about trying to enable AI across the department, what is it that you hope that those being, say, a commander out in the field or people working in the back-office parts of the Pentagon, what do people need to know about AI for your vision of enabling AI across the department to work?

Yeah, great question, Jackson. So the most important thing, I think, is what I alluded to in my opening comments: that AI is about decision-making, not decision-making in the abstract, but decision-making in the finite, in the moment, in the decision, with the decision-maker. That really defines, like, how do I want to make that decision? What process do I use today? And then, what data do I use to make that decision today? In many cases, historically, a lot of our warfighting decisions are made by kind of seat-of-the-pants judgment, you know, individuals with lots of experience and a mature understanding of the situation, but doing decision-making without necessarily having current data. We can fix that. We can make that better. And so, as ways for us to do that, we have to help people visualize what AI means across the department and what an AI use case looks like. It's really easy for me to start at the tactical level. You know, we want weapons that are more precise. We want weapons that guide on command, you know, to human-selected targets. We want threat detection, automatic threat detection and threat identification, on our bases. We want better information about the logistics support that is available to our small units. We would like better awareness of the medical situation, you know, perhaps remote triage and medical dispatch processes. Everything that you can imagine doing in a commercial environment today, here in the United States, we want to be able to do those same things with the same ease and the same reliability on the battlefield. Reconnaissance and scouting, you know, with unmanned platforms.
You know, equipment that's instrumented, that's gonna tell us if it thinks it will fail in the next hour, the next flight or whatever. Team members that have secure communications over small distances. You know, all that tech exists today. And if you move up the value chain, you know, up into theater-level, combatant command decision support, visibility of data across the theater: what an incredible thing that would be to achieve, available at the fingertips of a combatant commander at any time. Today those combatant commanders are really alone and unafraid, in many cases, in the geographical regions around the world. They have to make real-time decisions based on imperfect knowledge, and so they do the best they can. But I think our commanders deserve better than that. They should be able to decide based on data, where we have data available and where we can make that data available for them. Then there are things like, at a service level, you know, human capital management. Think Moneyball, right? Like, I need that kind of person for this job. I'm looking for an individual with this kind of skills. Where can I find such a person? When is that person going to rotate? The services that we can provide service members: you know, I don't know how many man-hours I've spent standing in line at an administration section, you know, in my command, waiting for somebody to look at my record book or change an allowance or something like that. Why do we do that? You know, I haven't set foot in a bank for years. Why would I have to set foot into an admin section to be able to do these kinds of processes? This is kind of, you know, this is the broad visualization that includes, you know, support and enabling capabilities, but it extends all the way to the warfighting decision-making. It's necessary, right? We have to do this. It will make us more effective and more efficient.

Thank you, sir. The next question comes from Lauren Williams from FCW. Lauren, if you're on the line, go ahead, ma'am.

Yes, thank you for doing this. Sir, as you're talking about the new capabilities, the data strategy came out, and obviously that is a very important part of making AI work. Can you talk a little bit about what the JAIC is going to be doing in the near future, like what we can expect to see, you know, in terms of implementing the data strategy and what the JAIC's role is gonna be there?

Great question, Lauren. So the data strategy, for those of you who don't know, comes from the chief data officer, within the chief information officer's suite. So what the CDO organization has done is kind of create a vision and a strategy for how we are going to manage the enormous amount of data that's gonna be flowing through our networks, that's going to be coming from our sensors, that's going to be generated and curated for AI models, and everywhere else we use data. You can't be data-driven as a department, you can't do data-driven warfighting, if you don't have a strategy for how to manage your data. And so as we establish the Joint Common Foundation, but also as we help other customers, you know, execute AI programs within their enterprises, we will help the CDO implement that strategy: so, things like data sharing.
So, data sharing is really important in an environment where we have enormous amounts of data available to us broadly across the department. We need to make sure that data is available from one consumer to another consumer, and hand in hand with that is the security of that data. We need to make sure that we have the right security controls on the data, so that data is shared, but it's shared within a construct where we can protect it. One of the worst things that we could do is create stovepipes of data that are not accessible across the department and that result in the department spending millions and millions of dollars, you know, re-analyzing data, re-cleaning data, re-purposing data, when that data is already available. So we're working with the CDO, and then we'll work across the AI Executive Steering Group to figure out ways: how do we not only share models, but how do we share code? How do we share training data? How do we share test and evaluation data? These are the kinds of things that a data strategy will help us with, kind of putting the lines in the road so we can do it effectively but do it safely at the same time.

Thank you, sir. We've got two other journalists on the line, and I want to try to get to them before we've got to cut off. So the next question is gonna go to Scott from Federal News Network. Scott, if you're on the line, go ahead, sir.

Hi, General, thanks for doing this. I'm just curious about your priorities for 2021. You know, you're getting more money than you were a couple of years ago, considering that your organization is growing, and you've started to work within some of the combatant commands. So, you know, where are you going to be investing money, and where are we going to see the JAIC start to grow?

Great question, Scott.
So, as we look at it, you know, one of the challenges of where we are in this evolution of the JAIC and the department is that we have a pipeline of use cases that vastly exceeds our resources. And so this is part of our enablement process. We want to find the most compelling use cases that we can find: the things that are most transformational, the things that will have the broadest application, and the things that will lead to, you know, innovation in the space. And so there's a balance here that we're trying to achieve. On the one hand, we're working some very cutting-edge AI technologies with consumers, some pretty mature consumers, consumers who are, you know, working at the same level we are, and in partnership. On the other side of the coin, we have partnerships with really important enterprises and organizations who haven't even really started their journey into AI. And so we've got to make sure that we have the right balance of investment: in high-tech AI that moves the state of the art and shows the pathway for additional AI development and implementation, and then also helping consumers, you know, with their first forays into the AI environment. And that includes things like, you know, doing data readiness assessments. So, as I mentioned in my opening remarks, in recrafting our missions directorate we're creating fly-away teams, if you will, that can fall in on an enterprise or a potential AI consumer and help them understand their data environment, help them understand what kinds of things they're gonna have to do to create an environment that can support an artificial intelligence set of solutions. So we'll help them with that.
And when we're done helping them with that, then we'll help find them the AI solution. In an unlimited budgetary environment, we might build that algorithm for them. In a limited budget environment, sometimes the best thing we can do is link them to a contractor who may have demonstrated expertise in their particular use case. In some cases, it may just be helping them find a contract vehicle so that they can bring somebody in. In any case, we'll inform them on the ethical standards, we'll inform them on best practices for testing and evaluation, and we'll help them do their data analysis. And so our resourcing now is spread between high-end use cases; use cases that we're building purposefully because we want to build those to meet specific needs; the common foundation, and building that common foundation; and then helping a broader base of consumers take AI on board and start to respond to the transformation by looking at their own problem sets, facilitated by us. So we'll have to, you know, it's a very nuanced program of how you spread the resourcing to make sure all of those important functions are accomplished. Thanks for the question.
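[Editor's note: an illustrative, hypothetical example of the kind of first-pass check a "data readiness assessment" might include, scanning a labeled dataset for missing values, unlabeled records and class imbalance before any model work begins. The field names and the imbalance heuristic are invented for the example; this is not a JAIC methodology.]

```python
# Hypothetical sketch of a first-pass data readiness check: how complete is
# the data, how much of it is labeled, and how imbalanced are the classes?
# Field names and thresholds are illustrative placeholders only.
from collections import Counter

def assess_readiness(records, label_field="label"):
    """Summarize completeness and label balance for a list of dict records."""
    total = len(records)
    missing_label = sum(1 for r in records if not r.get(label_field))
    fields_with_gaps = Counter(
        field for r in records for field, value in r.items() if value in (None, "")
    )
    label_counts = Counter(r[label_field] for r in records if r.get(label_field))
    largest = max(label_counts.values(), default=0)
    smallest = min(label_counts.values(), default=0)
    return {
        "records": total,
        "unlabeled_fraction": missing_label / total if total else 0.0,
        "fields_with_gaps": dict(fields_with_gaps),
        "classes": dict(label_counts),
        # crude imbalance ratio; a large value suggests the rare class needs more data
        "imbalance_ratio": (largest / smallest) if smallest else float("inf"),
    }

# Example usage with toy records (purely illustrative):
sample = [
    {"sensor_id": "a1", "label": "vehicle"},
    {"sensor_id": "a2", "label": "vehicle"},
    {"sensor_id": "a3", "label": ""},          # unlabeled example
    {"sensor_id": "a4", "label": "person"},
]
print(assess_readiness(sample))
```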
Yeah, that's a great question. Primarily, we are limited in the data we have in many cases, and in the good data: the good labeled data, the well-conditioned data. And so creating the standards and the environment so we can build high-quality data is an important step that we will accomplish through the JCF, and we'll help other consumers in that same role. But then, once we have good data, we have to protect it. We had the security conversation a little while ago, but we protect it through the right security apparatus so that we can share effectively yet ensure that the data remains protected. We have to protect test and evaluation data; we have to protect labeled and conditioned data, for a lot of different reasons: for operational reasons, for technical reasons, and because it is a valuable resource. We have to protect the intellectual property of government data, and we have to use it effectively to ensure that we have access to rapid and frequent algorithm updates without paying a proprietary price for data that the government doesn't own, or data the government gave away. We want to make sure that we have an environment that makes sense for that situation.
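As a concrete illustration of the overfitting concern the question raises, and of why held-out test and evaluation data is worth protecting, the sketch below compares training accuracy against accuracy on data the model never saw. The model, split sizes, and names are placeholders rather than a description of any fielded system; a large gap between the two scores is the classic signal that a model has memorized its training data and needs more data or retraining.

    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    def overfit_gap(X, y, seed=0):
        """Train on one split, score on a protected held-out split, and report the gap."""
        X_tr, X_val, y_tr, y_val = train_test_split(
            X, y, test_size=0.2, random_state=seed, stratify=y)
        model = RandomForestClassifier(n_estimators=200, random_state=seed).fit(X_tr, y_tr)
        train_acc = accuracy_score(y_tr, model.predict(X_tr))
        val_acc = accuracy_score(y_val, model.predict(X_val))
        # A large train-minus-validation gap suggests overfitting; validation accuracy
        # that decays over time suggests drift and a need for periodic retraining.
        return train_acc, val_acc, train_acc - val_acc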
What your question reminds all of us of, though, is the technology of adversarial AI: the opportunities for AI exploitation, spoofing, or deception. That research environment is very robust, and obviously we pay very close attention. We do have a pretty significant powerhouse bench of AI engineers, and experts in data science as well, who keep us up to date and abreast of all of the developments in those threatening aspects of artificial intelligence, and we work those into our processes to the degree we can. We're very sensitive to the idea of over-conditioned or overfitted data. We're very sensitive to the issues of AI vulnerability and adversarial AI, and we're working on how we build robust algorithms. In many cases, the science of responding to adversarial AI and the threat that it poses is a very immature science. And so, from an implementation perspective, we find ourselves working especially with our academic partners and our industry partners to really help us understand where we need to go as a department to make sure that our AI algorithms are safe and protected, and our data is the same: safe, protected, and usable when we need to use it. All of these are artifacts of AI implementation that the department is learning as we go, and the JAIC is trying to show the way and get the conversation going across the department so that we don't have to discover it serially; we can discover it in parallel, with all of us learning together. So we'll keep pushing that. But your point is very well taken, and it's an important consideration for us: making sure that we have reliability in the outcomes of all of our artificial intelligence efforts.
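To make the adversarial AI threat concrete, the sketch below implements the well-known fast gradient sign method (FGSM), in which an input is nudged in the direction that most increases a model's loss so that a person sees essentially the same image while the model misclassifies it. It assumes a generic PyTorch image classifier and is offered only as an illustration of the attack class being discussed, not of any capability or countermeasure in use.

    import torch
    import torch.nn.functional as F

    def fgsm_example(model, x, y, epsilon=0.03):
        """Craft an FGSM adversarial example for inputs scaled to [0, 1]."""
        x_adv = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        loss.backward()
        # Step in the sign of the gradient to maximally increase the loss,
        # then clamp back to the valid input range.
        return (x_adv + epsilon * x_adv.grad.sign()).clamp(0.0, 1.0).detach()

    # Robustness-minded teams typically fold such examples back into training
    # ("adversarial training") and track accuracy on both clean and perturbed data.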
Okay, thank you, ladies and gentlemen, and thank you, General Groen, for your time today. Just a reminder for the folks out on the line: this broadcast will be replayed on DVIDS, and we should have a transcript up on defense.gov within the next 24 hours. If you have any follow-on questions, you can reach out to me at my contacts, which most of you have, or you can contact the OSD Public Affairs duty officers. Thank you very much to everybody for attending today. Thank you.