Man, Thursday and Friday were nuts!
Sep. 9th, 2017 11:32 pm
Wednesday night I got an email from a friend whom I used to work with. She'd gone to a doctor that afternoon and their office was in a kind of chaos: the office had been hit by at least two different kinds of ransomware attacks. She wanted to know if I could help.
That night I did some research on the particular attacks and found that they were variants of the same core malware; both worked by exploiting weak Windows RDP (Remote Desktop Protocol) passwords. RDP is a back door into a server that techs use for management. It should NEVER be left open to the internet! There are other, more secure, ways to manage servers. If it must be left open, then it needs a VERY secure, i.e. LONG and complicated, password on it.
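For the curious, a quick way to check whether a server is exposing RDP to the world is just to probe TCP port 3389, RDP's default. Here's a minimal sketch in Python; a real audit would run it from outside the office network, and the host you'd point it at is your own public address:

```python
import socket

def rdp_port_open(host, port=3389, timeout=2.0):
    """Return True if the given TCP port (RDP defaults to 3389) accepts connections."""
    try:
        # create_connection does the DNS lookup and TCP handshake for us
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # refused, timed out, unreachable -- all mean "not reachable from here"
        return False
```

If this comes back True when run from outside your network, RDP is reachable from the internet and needs to be closed off or locked down.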
Obviously it did not.
A friend of the doctor's is their main IT guy, but he's not local, and he's decent but not top drawer. This problem apparently was discovered before Wednesday, and their guy (let's call him Bob) was building a new server for them with the latest versions of Windows Server and SQL Server. The software that their clinic uses is mainly built on SQL Server, and here's the really sucky part: the old server was running Windows Server 2008 R2 and SQL Server 2008, plugged straight into the internet through a vendor-provided consumer router. No hardware firewall.
*facepalm*
I didn't bother checking the patch level on their Windows Server 2008 R2; it was kind of pointless. I did note that their SQL Server 2008 was well below the final patches released for it, not that it mattered, since all of its databases had been encrypted.
The new router, though consumer grade, is fully patched. The new server is fully patched. A new Cisco firewall is on order. That's the best that we can do right now.
I was there Thursday from 11am to 8pm, then worked at home from 10pm to midnight copying, compressing (7-Zip), and uploading a big analytics file to a forensics company, which sent us a utility to try to figure out what happened. Friday I only put in five hours, finishing up an inventory of all of the computers (they'd never had one) to figure out what should be tossed and what could be upgraded to get them all up to Windows 10 Pro, and writing up some reports.
One woman complained to us that her computer was really slow. And it was. It was absolutely horribly slow! I was afraid that it had something nasty running under the covers, but then I opened up Control Panel, did some poking around, and found that it was a Pentium 4 with 2 gig of memory running Windows 10. The Windows Experience Index, or whatever they call it now, was 1.0. So we ordered her a new computer.
I always had a three-step plan when it came to buying computers to make them last longer and save money. When a new OS came out and the original started slowing down: add memory. That usually sped things up. Next OS comes out: install a better video card. Next OS comes out: buy a new computer. All of their computers are running at least 4 gig of memory, and odds are they're all using motherboard-based video. I'm hoping we might be able to do memory upgrades and install video cards to bring some of these up to spec for about $100-150 each instead of tossing them. We shall see. I'll do some more inventory work next week now that we have a better idea as to what's out there.
This weekend I'm writing up a report more detailed than the single-page invoice that just had bullet points as to what I did. I'm also burning a DVD with bootable malware/virus inspection software that'll look deeper into the OS than something like Symantec can, and since you're booting from read-only media, it can find bootkits that are otherwise invisible. I'll get to inspect all of the workstations! That'll make everyone oh so very happy to have their computer denied to them for however long it takes.
The tragic thing is that their backups weren't running properly because they had a terrible internet connection that couldn't handle the transfer. The software did a nightly backup to their vendor, but it had been failing. And they weren't doing anything locally, so they didn't really have a fall-back point to recover from. Their practice software vendor was able to restore from an earlier backup, but I don't know how old it was or whether there was any corruption in it; I'll be finding that out Monday. This gets their patient information back, which is critical. Their insurance claims are also processed online, so those should be safe. But anything stored only locally may be lost.
And the horrible thing about that is the way the database is configured! I'm a database guy; I've been working with SQL Server for 25 years, since the first Microsoft version came out running on LAN Manager/OS/2. The vendor's configuration is VERY bad, but I won't improve it unless they say it's OK. We're going to set up local backups, and I've impressed upon the office manager the importance of rotating backup media and having a fire-proof safe in-house for storing it. So eventually they'll be in a much better place.
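For reference, a local SQL Server backup boils down to a T-SQL BACKUP DATABASE statement run on a schedule. Here's a minimal sketch that builds a timestamped one; the database name and backup path below are made up for illustration, since the vendor's real names aren't mine to share:

```python
from datetime import datetime

def backup_statement(db, backup_dir, now=None):
    """Build a timestamped T-SQL BACKUP DATABASE statement for a nightly local full backup.

    db and backup_dir are placeholders -- substitute the real database name
    and a path on rotating media. Run the result via sqlcmd or an Agent job.
    """
    stamp = (now or datetime.now()).strftime("%Y%m%d_%H%M%S")
    path = f"{backup_dir}\\{db}_{stamp}.bak"
    return (
        f"BACKUP DATABASE [{db}] TO DISK = N'{path}' "
        f"WITH INIT, CHECKSUM, NAME = N'{db} nightly full';"
    )
```

CHECKSUM makes the backup verify page checksums as it writes, which helps catch exactly the kind of silent corruption I'm worried about in the vendor's restore.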
The big question is whether or not they have to notify all their patients. I don't think this represents a HIPAA information spill. These ransomware encryptions are fully automated attacks by bots; I've never heard of data being exfiltrated and used for further extortion, which would be a much more targeted attack. I'm going to have to tell the doctor who owns the practice to talk to his attorney about that point, because it's far outside my ability to give him a recommendation.