Anna Delaney: Hi, I'm Anna Delaney, director of productions at Information Security Media Group. Today I'm joined by Peter Yapp, a partner at law firm Schillings and former deputy director of the UK's National Cyber Security Centre, NCSC. Thanks for joining us, Peter.

Peter Yapp: Thank you, Anna.

Anna Delaney: So Peter, I'm curious to know what you've been getting up to since leaving the NCSC. From what I understand, you've joined Schillings as a partner to lead their cyber team. What does this involve, and where do your current priorities lie?

Peter Yapp: It's the cyber and information security team, so it's a crossover between the cyber world and the physical world, where those two worlds collide. I oversee a practice that promotes protective work: assessing people's cybersecurity through penetration testing, red teaming and vulnerability scanning; making sure the policies and procedures that are needed are in place; making sure the training is there; and perhaps just dropping in and offering advice when people are short on resources or short on time. The reason we're concentrating on the protective side is that people will always come to us in a crisis. We're known as a crisis firm, so we're used to handling crises, and what I'm trying to do is help people not get into them in the first place. I think it's far more cost-effective to do that; it just requires a little bit of prior thought. It's great that people remember Schillings and phone us up in a crisis, but I'd much rather have that relationship before a crisis, to stop a crisis happening.

Anna Delaney: How has your previous role as deputy director at the NCSC shaped your current position?

Peter Yapp: I think the crisis management piece is obviously really important. I ran the incident response team at the NCSC for about a year and saw all sorts of things during that time. I got used to handling the questions that were coming in from all angles, sometimes from very high up as well. Whenever there's a significant cyber breach, you can imagine that the people occupying number 10 Downing Street are going to be interested; they're going to want to know what's happening and what we're doing about it.
Those kinds of external, almost communications-type pressures, and the handling of them, are so important. That's a fundamental thing I learned: it's not so much about the technical side of things, it's about the communication side. It's about making sure you keep everyone in the loop and up to date, and that you're on the front foot: you've got things prepared, you've got things to say at the appropriate times, and you're not scrambling around making something up on the fly, whether you're communicating to the general public, to your shareholders or even internally to your own business. So there's something about that crisis piece. And then what I fundamentally learned was that it is much better to be proactive than reactive, and that fed into the business I'm doing now.

And supply chain, the importance of supply chain. Once businesses have got their own house in order, they shouldn't stop at that point; they really should look at their supply chain as well. There were so many incidents on my watch that involved the supply chain, and it was as if people just didn't have it in their field of vision. Supply chain has crept up on us over the last three or four years: suddenly we're outsourcing far more than we were, much more is in the cloud, and it's a major consideration. It is an extension of your business. And there's the collateral damage that's caused when your supply chain is hit.

The third big lesson I learned at the NCSC was that if all you're focusing on is the threat, if all you're looking at is who might attack me, then you're missing a whole piece of this. Attackers are scanning the internet looking for vulnerabilities, and then they pick their targets. So you might get picked out just by mistake, just because you had a big hole in something, an unpatched vulnerability. You get landed on not because you're a pharmaceutical company that's developing a vaccine, but just because you're a company with a vulnerability that you should have patched, and someone has come across it.

Anna Delaney: So focus on your vulnerabilities first?

Peter Yapp: Absolutely. Get your own house in order. Look at what you're exposing to the internet, look at the easy ways into your environment, sort that out first, then look at your supply chain, because they're feeding into that.
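To make that "look at what you're exposing to the internet" advice concrete, here is a minimal sketch of an external-exposure check in Python. It is an illustration rather than anything Yapp prescribes: it attempts TCP connections to a short, assumed list of commonly exposed ports on a placeholder hostname, and it should only be pointed at systems you are authorized to test.

```python
import socket

# Ports that are commonly exposed by accident (illustrative list, not exhaustive).
COMMON_PORTS = {
    21: "FTP", 22: "SSH", 23: "Telnet", 80: "HTTP",
    445: "SMB", 3389: "RDP", 5900: "VNC", 8080: "HTTP-alt",
}

def check_exposure(host: str, timeout: float = 2.0) -> list[tuple[int, str]]:
    """Return (port, service) pairs on `host` that accept a TCP connection."""
    open_ports = []
    for port, service in COMMON_PORTS.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append((port, service))
        except OSError:
            pass  # closed, filtered, or unreachable
    return open_ports

if __name__ == "__main__":
    # "example.com" is a placeholder; substitute an asset you own.
    for port, service in check_exposure("example.com"):
        print(f"Port {port} ({service}) is reachable")
```

A real review would go further, using a full scanner and comparing results against an asset inventory, but even a check at this level can surface accidentally exposed services.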
Anna Delaney: And to use one of my least favorite expressions of the moment, we are living in unprecedented times. COVID-19 has obviously introduced an array of additional security challenges for businesses, and many now find themselves dealing with the need to collect vast amounts of customer data, from pubs taking our names to airport testing. Where do you feel businesses are struggling with this task?

Peter Yapp: I think they're still trying to come to terms with the vast increase in the amount of data they've got to hold. Lots of these businesses are not used to holding that kind of personal data. Then, how do you secure that data? How do you delete it after 21 days? That's not a normal business practice; businesses are not geared up to wipe all traces of this personal data within 21 days, and that has a whole set of problems of its own. And of course there's GDPR. GDPR will have hit many, many businesses, but perhaps not in this kind of way; this type of data, personal data, is fundamental to GDPR. So how do you notify all those people of the purpose for which you're collecting it? How do you minimize the data you're collecting, and how do you protect the data you're holding? That means looking at things like encryption, which they may not have had to look at before, and looking at how to protect their systems from attackers, which, as a pub or restaurant, they may never have considered. And then, finally, there are the technical problems of deleting something absolutely after 21 days. What do you do about your backups, if you have backups? Have they captured that data? Have they incrementally captured that data? How are you going to get rid of it? That's a real fundamental problem for what are often quite small businesses.
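The 21-day deletion problem is, at its core, a retention job. Below is a minimal sketch under assumed conditions: contact records live in a SQLite table named visitors with an ISO-8601 collected_at timestamp. Both the schema and the table name are hypothetical, introduced only for illustration.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 21  # e.g., the COVID-19 contact-data retention period

def purge_expired(db_path: str) -> int:
    """Delete visitor records older than the retention period; return the count removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:  # commits on success
        cur = conn.execute(
            # Assumes collected_at is stored as an ISO-8601 UTC string,
            # so lexicographic comparison matches chronological order.
            "DELETE FROM visitors WHERE collected_at < ?",
            (cutoff.isoformat(),),
        )
        return cur.rowcount

if __name__ == "__main__":
    removed = purge_expired("contact_log.db")
    print(f"Purged {removed} expired records")
```

As Yapp notes, deleting rows from the live database does nothing about copies already captured in backups; those need their own retention process.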
Anna Delaney: So what's your advice to businesses? How can they protect their customers' data, but also their own systems?

Peter Yapp: I think there's a set of fundamentals they should follow. I'd call them fundamental, not necessarily easy, but they're the baseline. I wouldn't jump into some hugely sophisticated solution, and I wouldn't be persuaded by someone who says they have the answer to all of this in a package you can buy, because I think that's most unlikely. There are probably five things you need to look at. First, make sure your antivirus is there, across all your devices. If you look at how this data is being collected, it's being collected on iPads, laptops and mobile phones, so make sure all of those devices are protected with antivirus.

Second, make sure everything's up to date, so it's all patched. These are fundamental things we've heard many times before, but some of these businesses probably didn't care too much about patching. Maybe they treated it like many of us treat it at home: it's just an inconvenience when it comes up on your phone. "Would you like to update to the latest software?" "Oh no, I'm doing something at the moment, I'm collecting this person's data." So you put it off and put it off. That patching is fundamental, not because you get the latest version with the latest additions and improvements, but because security fixes are often bundled into the update, and often they're not going to tell you that; it's in the small print on the fifth page that it's a security update as well.

Third, make sure you've locked down accounts so that only the people who need to see the data see the data. Don't leave everything open; don't have a very flat structure that everyone can get into. Fourth, make sure that if you're buying new kit, hardware and software devices, and with all internet of things-type devices, they haven't just got the default admin username and password. We still come across that in our reviews, and I know we were coming across it 10 years ago and shouting about it; that lesson hasn't landed everywhere. And fifth, wrapping around all of that, and for some of these small businesses maybe they haven't thought about this before, just have a firewall; sit all of this behind a firewall that protects you. Most businesses adopted that years ago, but there will still be some small businesses that haven't.
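The default-credentials point lends itself to a simple audit. The sketch below is a hypothetical illustration, not a published Schillings or NCSC tool: it tries a few well-known factory logins against the web interface of a device, assuming the device uses HTTP basic authentication, and it should only be run against kit you own.

```python
import requests

# A few factory credentials that ship with consumer kit (illustrative, not exhaustive).
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("admin", "1234"),
    ("root", "root"),
]

def audit_device(base_url: str) -> list[tuple[str, str]]:
    """Return any default username/password pairs the device's web UI accepts."""
    accepted = []
    for username, password in DEFAULT_CREDENTIALS:
        try:
            resp = requests.get(base_url, auth=(username, password), timeout=5)
            if resp.status_code == 200:  # login was not rejected
                accepted.append((username, password))
        except requests.RequestException:
            break  # device unreachable; stop probing
    return accepted

if __name__ == "__main__":
    # "192.168.1.1" is a placeholder address; only audit devices you own.
    for user, pwd in audit_device("http://192.168.1.1"):
        print(f"Device still accepts default credentials: {user}/{pwd}")
```

Many devices use form-based logins rather than basic auth, so a real audit would need to match each device's interface; the point is simply that checking for factory defaults can be automated.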
Anna Delaney: This focus on the fundamentals shines a light on the need for businesses to build strong security infrastructure. How should they do that? Where should their priorities lie?

Peter Yapp: For strong infrastructure you're talking about a whole series of things: people, processes, technology, supply chain. You must look at all of that, and of those, people are going to be the most important. Maybe cybersecurity sometimes concentrates too much on the kit and the software and the complexity of it, but what actually matters is having empowered staff, staff who are prepared to say, "This looks odd," who can flag something up and feel confident that if they do, they're not going to be told "don't be silly" or "machines always slow down in the afternoon." You don't want that kind of response. You want to bake it into the culture that they can tell someone, that there's somewhere to report it, that action is taken, and that no one gets told off for it. No one gets punished for clicking on a phishing email. It might not have been the right thing to do, but you're not going to get punished, because you told us about it; that's the important thing. So fundamentally, to get that security infrastructure within a business, you've got to bring your people on board.

Then there are some technical things. It's always good to stand back and have a review of everything you have in place, some kind of cybersecurity assessment. Usually someone independent is going to give you a complete, unadulterated view, with no conflicts of interest of the "this was the system I put in, and now I'm testing it" kind. Then you build on the process side: you have all the policies and procedures in place and you make sure they work. You have a strong password policy. The NCSC, for example, advocates three random words, and the reason is that you can remember three random words. Try it: nothing that connects you to your hobbies, your family life or the football team you support, but three truly random words. Then don't change that password until you need to, until it looks like it might have been breached, which hopefully it never will be. This encourages people to have quite complex, quite strong passwords that they hold on to. It also encourages people to use different passwords for different systems, so that an attacker who compromises one system doesn't get into all of them.
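The three-random-words approach is straightforward to implement. Here is a minimal sketch using Python's cryptographically secure secrets module; the short embedded wordlist is a placeholder, where a real generator would draw from a dictionary of several thousand words.

```python
import secrets

# Placeholder wordlist; in practice, load a large dictionary file.
WORDLIST = [
    "carpet", "lantern", "otter", "brisket", "meadow", "cobalt",
    "trowel", "pigeon", "velvet", "anchor", "thistle", "ember",
]

def three_random_words(separator: str = "-") -> str:
    """Generate an NCSC-style passphrase from three randomly chosen words."""
    return separator.join(secrets.choice(WORDLIST) for _ in range(3))

if __name__ == "__main__":
    print(three_random_words())  # e.g., "otter-cobalt-thistle"
```

Using secrets rather than random matters here: passphrase generation needs an unpredictable, cryptographically secure source of randomness, and the strength of the result grows with the size of the wordlist.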
Peter Yapp: Another fundamental is backing up your data. I talked about the 21 days with COVID-19, but backing up is really important, particularly with ransomware. I think in the UK we've been quite lucky with the level of ransomware attacks; I don't think the U.S. has been so lucky. But I do see that as a threat coming down the path for us. With ransomware, you want to be in a position where you never have to pay the ransom. It's not a question of "do you or don't you": you don't need to, because you've got complete backups that are independent from your live system. You're not backing up dynamically into the cloud, which will get you ransomwared both on premises and in the cloud at the same time; you've got something separate that you can recover from, and then you're not going to have to pay out the, in some cases, quite huge ransoms that have been paid in the U.S. recently. So backing up is really important. And we've talked about getting your own house in order and scanning for your own vulnerabilities; that's very important too.

Then one of the areas I picked up from my time at the NCSC is flat networks. Once an attacker has got into a network, the flatter that network is, the easier it is for them to traverse around the world and get into lots of other systems. Flat networks came in to save money, basically: less maintenance and admin, fewer routers and switches. You can see a CIO, with the business imperative of keeping costs down, saying this is a great solution to save money. Security should have been a separate voice in that conversation. They should have been asking: for the business as a whole, is a flat network the best option going forward? What happens if we are breached? What happens to our reputation, and then what is the value of the business? So I think there's a real question for some quite large organizations about how much they should segregate their networks. Are they segregated enough? Could they stop an attacker just going through the entire worldwide business? I think they should be able to. That security infrastructure is so important, and it might be at the most difficult and most expensive end of all the things I've talked about, but it should absolutely be a consideration. It should be a business consideration, not in terms of "let's go for the lowest-cost network," but "let's go for the balance between security, potential reputation risk and cost."
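One way to probe the segmentation question, whether an attacker could traverse the entire business, is to check from inside one segment which other segments' services are reachable. Below is a minimal sketch; the segment names, gateway addresses and port are invented examples, and it should only be run on networks you administer.

```python
import socket

# Hypothetical segment gateways and a service port to test against.
SEGMENTS = {
    "finance": ("10.1.0.1", 445),
    "hr": ("10.2.0.1", 445),
    "manufacturing": ("10.3.0.1", 445),
}

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds from this machine."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Run from a workstation in one segment: anything reachable from here
    # is a path an attacker on this machine could also take.
    for name, (host, port) in SEGMENTS.items():
        status = "REACHABLE" if reachable(host, port) else "blocked"
        print(f"{name:14s} {host}:{port} -> {status}")
```

On a well-segmented network, a workstation in one segment should see most other segments reported as blocked; a wall of "REACHABLE" results is the flat-network symptom described above.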
Anna Delaney: And finally, Peter, you've been a leader in computer forensics for nearly three decades now. What do you feel has been the greatest lesson of that period?

Peter Yapp: It still surprises me how remarkably resilient digital information is. If you think something's lost or deleted or destroyed, it probably isn't. Multiple copies are often made, and they may reside on a device, in the cloud or in a backup. Data is always somewhere. When someone says, "Can you find this email?", usually you can, because it ends up in multiple places: it's copied in, it's blind-copied in, and you just don't know how widely it goes. So when people despairingly ask whether there's any chance of getting this data back, often the answer is yes. Not every time, but often. Getting an expert to forensically search for key data will often uncover information that an average user just wouldn't be able to find.

I'll give you an example that brings the physical and the cybersecurity worlds together. This is a recent one: a smartphone with incriminating evidence on it was smashed to pieces and left on the road. You'd think that's gone, that there's no way anyone is ever going to get it back. But by carefully reconstructing that phone, it was possible to power it on; it just couldn't retain enough power for us to see what was on the screen. So my team sourced another identical phone and removed its cover with a heat gun. This is not a quick or easy process, I have to say; the days of being able to unscrew the front of a mobile phone, if you remember the Nokias and the like, are long gone. We got the top off the new phone; the top on the damaged one was smashed anyway, so it was easily lifted off. We took out the memory chips, put them into the new phone, and recovered all of the data. You would think it was gone forever; certainly if you looked at the phone, you'd think there was no chance at all, but it was all recovered in its original state. And we were able to do it under controlled conditions, so it was evidentially admissible.

Anna Delaney: Quite incredible. Peter, thank you so much for your time today. It's been highly informative.

Peter Yapp: Excellent. Thank you.

Anna Delaney: Once again, I've been speaking with Peter Yapp, partner at law firm Schillings. For Information Security Media Group, I'm Anna Delaney.