…the US Secretary of Defense. As the leader of the Department of Defense, Secretary Austin is in charge of leading over two million service members and civilians and, importantly, continues a long and distinguished career of service to this country. I would also like to welcome back to the stage the Honorable Robert Work, the vice chair of our Commission, who will participate in a short fireside chat following the Secretary's remarks. With that being said, ladies and gentlemen, please give a warm welcome to Secretary Austin.

Let's see if I can get this mic adjusted for normal-sized people.

Well, good afternoon, and thank you, General Kumashiro, for that kind introduction and for your important work with this commission. Thanks to all of you for being here, including those who are joining us online. And the fact that so many of us can gather safely in person here in Washington is a testament to what science and leadership can do.

You know, it says a lot about this commission that you pulled together such an impressive lineup of speakers for the summit. You also brought together your impressive commissioners, including Eric Schmidt, Safra Catz, Gilman Louie, Chris Darby, and Katharina McFarland. It truly is great to see so many friends here, including Bob Work, who has made such tremendous contributions as a commissioner and as a distinguished leader of the department.

As this commission has argued, cooperation is key to ensuring that the forces of technology support the forces of democracy. I couldn't agree more, and I'm very grateful for all that you do.
Now, I'd like to talk today about some big changes that we're bringing to the Department of Defense with respect to artificial intelligence, and they represent some big changes to some old ways of thinking. This commission calls AI the most powerful tool in generations for benefiting humanity. It's a capability that this department urgently needs to develop even further.

AI is central to our innovation agenda, helping us to compute faster, share better, and leverage other platforms. And that's fundamental to the fights of the future. And so we are all now present at the creation, all part of a new age of technology.

As President Biden has said, we are determined to work with like-minded partners to shape the rules and norms that will govern those sweeping advances. And that means standing up for democratic values, even and especially in times of great change. And it means ensuring that technologies like AI are, as the President has put it, used to lift people up and not used to pin them down.

And so we are renewing our efforts to posture ourselves for what I would call the future fight. Now, obviously, if it comes down to fighting, we will do so, and we will win, and we will win decisively. But our first goal should be to prevent conflict and to deter adversaries. And that demands of us a new vision for deterrence in this century.

We call this vision integrated deterrence, and I'll have more to say about this in the weeks to come. But basically, integrated deterrence is about using the right mix of technology, operational concepts, and capabilities, all woven together in a networked way that is so credible, flexible, and formidable that it will give any adversary pause.
Integrated deterrence means working closely with our friends and partners, just as this commission has urged. It means using some of our current capabilities differently. It means developing new operational concepts for things that we already do. And it means investing in cutting-edge capabilities for the future, in all domains of potential conflict.

America's integrated deterrence relies on both innovation and investment, and we understand that those are interwoven. Innovation requires the resources to develop new ideas and scale them appropriately. And investment pays off when it's focused on the challenges of tomorrow and not yesterday.

Tech advances like AI are changing the face and the pace of warfare. But we believe that we can responsibly use AI as a force multiplier, one that helps us to make decisions faster and more rigorously, to integrate across all domains, and to replace old ways of doing business.

AI and related technologies will give us both an information and an operational edge, and that means a strategic advantage.

But we know that truly successful adoption of AI isn't just like, say, procuring a better tank. A closer analogy might be the department's use of computers, which began with a few critical applications and over the decades became embedded in nearly every military system that we have.

In a future that increasingly feels as if it's already here, AI holds the promise of superior performance across a wide range of platforms and systems. Over just the past decade, progress in AI research, especially in machine learning, has vastly expanded. And we see AI as a transformative technology, one that will require new processes, new policies, and new procedures across the department.

Used right,
AI capabilities can play a critical role in all four areas of the joint warfighting concept that I approved this spring: joint fires; joint all-domain command and control; contested logistics; and information advantage.

Today, across the department, we have more than 600 AI efforts in progress, significantly more than just a year ago. And that includes the Artificial Intelligence and Data Acceleration initiative, which brings AI to bear on operational data. It includes Project Salus, a predictive tool for finding patterns in COVID-19 data that the department built from scratch with some top Silicon Valley companies, starting last March. And it includes the Pathfinder project, an algorithm-driven system that helps us to better detect airborne threats by using AI to fuse data from military, commercial, and government sensors in real time.

And of course, the department, and especially DARPA, have a long history of AI research. You know, in the 1960s, DARPA research shaped the so-called first wave of AI. And today, through its multi-year investment of more than $2 billion, DARPA's AI Next campaign is paving the way for the future third wave.

I recently visited the professionals up at DARPA, and I was so impressed to learn about their more than 60 programs that are applying AI, including using it to detect and patch cyber vulnerabilities. And we're just getting started. And DARPA is just one of the many research, test, and evaluation organizations across the department.

As this commission has recommended, we elevated the Joint Artificial Intelligence Center so that it reports directly to the Deputy Secretary, ensuring that we have the focus from senior leaders needed to drive AI transformation.
And over the next five years, the department will invest nearly $1.5 billion in the center's efforts to accelerate our adoption of AI.

Done responsibly, leadership in AI can boost our future military tech advantage, from data-driven decisions to human-machine teaming. And that could make the Pentagon of the near future dramatically more effective, more agile, and more ready.

But obviously, we aren't the only ones who understand the promise of AI. China's leaders have made clear that they intend to be globally dominant in AI by the year 2030. And Beijing already talks about using AI for a range of missions, from surveillance to cyber attacks to autonomous weapons. And in the AI realm, as in many others, we understand that China is our pacing challenge. And we're going to compete to win, but we're going to do it the right way. We're not going to cut corners on safety, security, or ethics. Our watchwords are responsibility and results, and we don't believe for a minute that we have to sacrifice one for the other. We're going to rely on the longstanding advantages of our open system, our civil society, and our democratic values. That's our road map to success, and I wouldn't trade it for anyone else's.

You know, American power has long been rooted in American innovation, and that's even truer today. Our powerhouse universities and our nimble small businesses are brimming with good ideas. And we're working as their partners through initiatives like our recently launched innovation consortium, which brings together small companies in a problem-solving network to tackle some of the government's hardest tech challenges.
But ultimately, AI systems only work when they are based in trust. And we have a principled approach to AI that anchors everything that this department does. We call this Responsible AI, and it's the only kind of AI that we do.

Responsible AI is the place where cutting-edge tech meets timeless values. And again, you see, we don't believe that we need to choose between them, and we don't believe that doing so would work.

This commission speaks of establishing justified confidence in AI systems. And we want that confidence to go beyond just ensuring that AI systems function, to also ensuring that AI systems support our founding principles. So our use of AI must reinforce our democratic values, protect our rights, ensure our safety, and defend our privacy.

Of course, we clearly understand the pressures and the tensions, and we know that evaluations of the legal and ethical implications of novel tech can take time. AI is going to change many things about military operations, but nothing is going to change America's commitment to the laws of war and the principles of our democracy.

So we've established core principles for Responsible AI. Our development, deployment, and use of AI must always be responsible, equitable, traceable, reliable, and governable. And we're going to use AI for clearly defined purposes. We're not going to put up with unintended bias from AI. We're going to watch out for unintended consequences. And we're going to immediately adjust, improve, or even disable AI systems that aren't behaving the way that we intend.

And to underscore this culture of responsibility, in May the department reaffirmed its commitment to our AI ethics principles. And that includes training a workforce ready for Responsible AI, establishing structures for oversight, and cultivating a robust ecosystem for Responsible AI.

And I should note the outstanding efforts of our Deputy Secretary, Kathleen Hicks, in this crucial effort; an amazing job by a very, very talented professional.

Now, our wider vision of integrated deterrence relies on our unmatched network of allies and partners worldwide, and so does our approach to Responsible AI. We're working together with other like-minded friends to advance global norms grounded in our shared values. And so the department and 15 of our allied and partner countries are meeting several times a year in the AI Partnership for Defense.

As we've accelerated our integration of AI, we have of course relied heavily on expert advice, including recommendations from this commission. You pushed us to increase our investments in AI development and fielding, and in June we announced the creation of the Rapid Defense Experimentation Reserve, which helps us get promising tech across the so-called Valley of Death and into new prototypes, capabilities, and concepts.

And you've urged us to build the technical backbone to support AI systems throughout their life cycle. And we've just launched the department's new AI and Data Acceleration initiative, which will help us harness data at scale and speed, and it will speed up the gains from leveraging AI.

Your report also recommends that the department's budget focus more on science and technology. And you know what? You're exactly right. And that's why this year's budget asks for $112 billion for research, development, testing, and evaluation. It is the department's largest R&D request ever. And in that request, AI
is one of the department's top tech modernization priorities. And we're not just investing in individual AI applications, either. We're investing in the infrastructure and the reforms to make our efforts more effective.

And our final and most important investment is in our people. We're going to have to do a lot better at recruiting, training, and retaining talented people, often young people, but people who can lead the department into and through the AI revolution. And that means creating new career paths and new incentives. And it means including tech skills as a part of basic training programs. And it means a significant shift in the way that this institution thinks about tech.

You know, some of our troops leave homes that are decked out in state-of-the-art personal tech, and then they spend their workday on virtually obsolete laptops. You're familiar with this story. And we still see college graduates and newly minted PhDs who would never think about a career in the department. So we have to do better. We have to do better.

Emerging technologies must be central to our strategic development. We need to tackle our culture of risk aversion. We need to smarten up our sluggish pace of acquisition. And we need to more vigorously recruit talented people, and not scare them away. In today's world, and in today's department, innovation cannot be an afterthought. It is the ball game.

And again, as President Biden has noted, we're going to see more technological change in the next 10 years than we saw in the last 50. And we know that some of our competitors think that they see an opening. But we're determined, as the President says, to develop and dominate the products and technologies of the future. That's central to our agenda.
And that mission is far easier because of two of America's greatest assets: the creativity of an open society and the ingenuity of an open mind. We're going to need the help of our friends in all of this, and believe me, we're going to continue to lean on you. But we are going to get this done, and we're going to get it done right, and we're going to get it done together. Thank you very much.

Thank you, Secretary Austin, first for taking time out of your schedule and coming over this afternoon. As you know, Secretary Mark Esper attended our inaugural kickoff, so having you bookend the end of our work has been a special treat for us. And thanks for the motivating words; they're music to the ears of all of our commissioners and, I'm sure, to the whole audience here. I think you laid out that the department really is building up momentum on the adoption of AI, and you mentioned some of it: the publication of the Responsible AI principles, the Chief Data Officer and his data decrees, the data and AI accelerator. An awful lot of stuff going on. And it looks like you finally solved JEDI.

Oh man, I never thought it would end.

But now that you've been here for six months, what do you see as the next wave of accelerated work to really get AI-enabled applications adopted at scale across the department?

So, am I still on here, Bob? Yes, sir. Okay, so three things that I would point to, Bob. We're pushing experimentation, and you mentioned the AI accelerator. That's a key mechanism that we'll use to get products and capabilities to our combatant commanders as quickly as we possibly can.
The second thing that I would say we're doing is establishing protocols for responsibility, and you heard me talk about that earlier. I think we can ill afford to move so fast that we forget about our responsibilities in terms of behaving in an ethical fashion. And then the third thing, I would say, is that we're going to build good governance processes along the way, and we've already embarked on this path. It's all about governance, in my mind: establishing the right mechanisms to provide the right oversight, to ensure that you have discipline in the system, I think is awfully important. So I think those three muscle movements will keep us in the right place.

Sounds great, sir. The FY22 budget submission, as you noted, provides the biggest increase in R&D spending in recent years; it is the biggest R&D budget in DoD's history. That is very good news. How are you going to ensure that the department prioritizes AI and the other emerging technologies that are going to give the future joint force an advantage within this increase?

As you would imagine, Bob, I'm really proud of the fact that we're investing $112 billion in R&D. I mean, as I pointed out earlier, that's the largest investment in R&D that the department has seen, ever. And 25 percent of that is focused on emerging technologies. And I think it's that kind of investment that really will ensure that we maintain and increase our competitive edge as we look towards our peer competitors, the Chinas and the Russias of the world. I would also point out, though, that this is a 50 percent increase from what we saw in 2019. So it's
pretty substantial in terms of what we're investing in emerging technologies. But when you consider emerging technologies, as you pointed out, it's not just about AI; it's also about hypersonics, it's about biotechnology, it's about microelectronics. So there are a number of things that I think we'll have to continue to invest in along the way. But I'm pretty excited, if we get the budget through, pretty excited about the capabilities that we're going to curate going forward.

Sir, we have time probably for one short comment. Today's theme is really working with our partners, and you mentioned that one of the things that you wanted to do as Secretary was really work with partners on AI. Do you have any ideas right now about how the department is going to go about that?

You heard me mention earlier, I think in my remarks, what we're doing in terms of our AI partnership. We've partnered with 15 other nations on AI and Responsible AI, and so I think that's a good start. I think that number will mushroom going forward, and I think it's all about sharing ideas and best practices, and it's all about making sure that we establish norms and encourage others to follow those norms along the way. But, you know, as you are, I'm a big believer in our alliances and our partnerships. I think they magnify our capacity and our capabilities. So you'll see us continue to refurbish and strengthen those alliances going forward, and AI will be a pretty significant part of that.
Well, thank you, Mr. Secretary. I think the department and the nation are very lucky to have you at the helm right now, with everything going on from Haiti to China, sprinkle in a little Cuba. Your inbox doesn't ever seem to go down. So the fact that you could take time out of trying to work your inbox down and visit with us this afternoon, we really appreciate it, and we wish you the best of luck.

Well, thanks for having me here. And I note that you had the rest of the national security team here today as well, with Jake Sullivan, and Tony Blinken, I think, is coming up. So we spend a lot of time together, just based upon the things that you pointed out in terms of the issues that we have on our plate. But I would say that it's a great team to be a part of.

And one final comment: you know, it's a good thing that I can't really see who's in the audience, but I just noticed that Sally Donnelly is sitting right in front of me there, and I'm surprised that I didn't get heckled during my presentation. But Sally, it's good to see you. And thanks to all of you for what you continue to do to support a very worthy cause, and something that I think we really have to sprint on. So thanks a lot, and thanks for allowing me to be here today.