
Ransomware attack today for me


markstanco


My hack a few years ago was much less serious than yours, but still extremely annoying. Some fucking cunt hacked my email and sent out well-worded emails from me to all of our customers and prospects that included "proposals" (malware). Jesus, I ate a lot of fucking crow that day.

I also stopped watching porn on my work computer.


11 hours ago, Updawg said:

Change passwords often, keep your shit patched/updated and tell your users not to open or click on shit from email

That is what is so difficult. You can train, train, train, add warning headers to the top of every external email, and ding people who keep failing internal phishing tests, but at the end of the day it only takes one person to let someone onto your network and wreak havoc. :(


11 hours ago, Updawg said:

Your AP, accounting, and HR people will be the next to get spear phished.

Can't second this enough: your accounts payable folks are either GOING to be a target or are CURRENTLY a target.

If Facebook and Google can get scammed for more than $100M via invoice fraud, anyone can be scammed...

https://www.cnbc.com/2019/03/28/how-to-avoid-invoice-theft-scam-that-cost-google-facebook-123m.html

Quote

 

Invoice fraud results in an immediate financial loss. And it’s on the rise.

According to the FBI, the amount of money that scammers attempted to steal through business e-mail compromise grew 136% between December 2016 and May 2018. Overall, e-mail scammers targeted more than $12 billion worldwide between October 2013 and May 2018.

In a typical invoice fraud, hackers take over or convincingly spoof the email address of a known business partner, like an attorney or vendor. The criminal may carefully monitor the usual interactions and payment processes between the business and the other party. Then, the criminal sends a convincing invoice or asks for a wire transfer for services rendered. Often, the business’s accounting office doesn’t realize it’s fraud and releases the funds.

That was the case with one owner of a small accounting firm in Brooklyn, New York, who wished to remain anonymous. In 2016 and 2017, an administrative assistant received several emails from an email address that appeared to belong to a business partner requesting payment for legal services, with wire addresses at legitimate banks. The assistant was in charge of releasing funds for routine invoices and complied. The scam cost the firm nearly $700,000 in one year — about half his average yearly revenue.

If you have a business, large or small, and aren't actively working with your accounting and A/P folks on this, you are begging for someone to hit you. It may not be a large amount of money at first, but it is happening.

My last organization got scam faxes with invoices (same concept, just using faxes instead of emails) on an almost daily basis.

Train your people, encourage them, and reward them for identifying phishing attempts.

(And this isn't even spear phishing, where they get a spoofed email from the CEO/CFO with legitimate partner names and information, everything right except the wiring instructions, asking for something to be paid by EOD or ASAP.)
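Not vouching for this as a complete control, but here's a rough Python sketch of the kind of cheap sanity check an A/P inbox can run: flag invoice emails whose sender domain isn't on your known-vendor list or whose Reply-To points somewhere other than the From address. The domains and addresses below are made up; the real defense is still verifying wire instructions out-of-band.

```python
# Rough sketch of a cheap invoice-email sanity check (all domains/addresses here
# are hypothetical). It only catches the crudest spoofs and is not a substitute
# for verifying wire instructions out-of-band with the vendor.
from email import message_from_string
from email.utils import parseaddr

KNOWN_VENDOR_DOMAINS = {"quanta-computer.example", "lawfirm.example"}  # assumed allow-list

def flag_suspicious_invoice(raw_email: str) -> list:
    """Return a list of warnings for an inbound invoice email."""
    msg = message_from_string(raw_email)
    warnings = []
    _, from_addr = parseaddr(msg.get("From", ""))
    _, reply_addr = parseaddr(msg.get("Reply-To", ""))
    from_domain = from_addr.rsplit("@", 1)[-1].lower() if "@" in from_addr else ""

    if from_domain not in KNOWN_VENDOR_DOMAINS:
        warnings.append(f"sender domain '{from_domain}' is not a known vendor")
    if reply_addr and reply_addr.lower() != from_addr.lower():
        warnings.append(f"Reply-To ({reply_addr}) differs from From ({from_addr})")
    return warnings

if __name__ == "__main__":
    sample = (
        "From: Accounts <billing@quanta-computer.example>\n"
        "Reply-To: payments@totally-legit.example\n"
        "Subject: Invoice #4821 - updated wire instructions\n\n"
        "Please remit payment to the new account below.\n"
    )
    for warning in flag_suspicious_invoice(sample):
        print("WARNING:", warning)
```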


10 hours ago, markstanco said:
11 hours ago, Updawg said:
Change passwords often, keep your shit patched/updated and tell your users not to open or click on shit from email

Good point here. They are all required to ask their supervisor before clicking on "blue text" within an email. That wasn't the issue here. Brute force was.

glad you are already on top of this! good job!


Paths with SQL changed as well. It's amazing how this wrecks shit. I also spent all day on other small stuff, such as turning sales orders into PDFs and reports into XLS docs.

I can't remember if I stated this above, but our server was originally named DDGxxxxxxx, and the IT guy setting it up named it DGxxxxxx. Not his fault: DGxxxxxx was printed on the server, so my last IT company must have fat-fingered the keyboard. That created a whole slew of new issues.


Every night I google ransomware. If you google it now, you will see stories from one day ago. Every night.

So, IT gurus, what is the most vulnerable thing you shake your head at? I think mine was leaving the administrator account enabled; it's now disabled. Also, our RDP was open then and is now closed. Anything else?
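Since open RDP came up: here's a quick Python sketch for double-checking from the outside that the ports you think you closed actually are closed. The hostname is a placeholder; run it from a machine outside your network (or better, pay for a proper external scan).

```python
# Quick check that RDP (TCP 3389) and friends are no longer reachable from outside.
# Run it from a machine OUTSIDE your network; "yourcompany.example" is a placeholder.
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in (3389, 445, 21):  # RDP, SMB, FTP -- common entry points for ransomware crews
    state = "OPEN (close this!)" if port_is_open("yourcompany.example", port) else "closed/filtered"
    print(f"port {port}: {state}")
```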


The only way to be 100% invulnerable is to unplug. Of course, we all know that isn't possible. Keep your servers, desktops, and firewalls patched, educate your users, follow industry security standards, and perform regular security audits, and you should be OK.

 


I deal with software and services more than hardware. One thing I've noticed is that the consultants and offshore developers they hire know nothing about how to secure their stuff (OAuth2, client certificates, etc.). We have one lady who still doesn't know the difference between a JWT and an SSL certificate's public key, even after I explained it to her. We wasted a week because she kept asking us for a "JWT key" and no one knew wtf she was talking about. It only got resolved when I got access to the system and did that part of her job for her.
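For anyone else who's fuzzy on the distinction she kept tripping over: a JWT is just three base64url-encoded segments of signed JSON (header.payload.signature); it has nothing to do with the public key in a server's SSL/X.509 certificate. A stdlib-only toy example, purely to make the structure concrete (the secret and claims are made up):

```python
# Toy HS256 JWT built with the standard library, just to show what a JWT is:
# three base64url segments (header.payload.signature), signed with a shared
# secret -- not a certificate and not the cert's public key.
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

secret = b"demo-shared-secret"  # HMAC secret for the demo, nothing to do with SSL
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"sub": "user123", "role": "ap-clerk"}).encode())
signature = b64url(hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest())
token = f"{header}.{payload}.{signature}"
print("JWT:", token)

# Anyone can *read* the payload without any key -- JWTs are signed, not encrypted.
middle = token.split(".")[1]
padded = middle + "=" * (-len(middle) % 4)
print("decoded payload:", json.loads(base64.urlsafe_b64decode(padded)))
```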


The most important safeguard is regular offsite backups. If you had those, you would have been down no more than a day, if that. You are never going to idiot-proof your network, so it's best to have those backups as your last line of defense. Another thing you might consider is a firewall that does geolocation blocking. It's not foolproof, but it will make sure that a good chunk of bad actors outside of the US don't have access to you.
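Seconding the offsite backups. As a rough illustration only, here's what a minimal nightly offsite push might look like in Python, assuming boto3 and an S3-compatible bucket; the paths and bucket name are placeholders, and a real setup should add encryption, rotation, and alerting on failure.

```python
# Minimal sketch of a nightly offsite push, assuming boto3 is installed and an
# S3-compatible bucket is configured; the source path and bucket name are placeholders.
import datetime
import tarfile

import boto3

SOURCE_DIR = "/srv/company-data"      # hypothetical local data path
BUCKET = "example-offsite-backups"    # hypothetical bucket name

def make_archive() -> str:
    stamp = datetime.date.today().isoformat()
    archive = f"/tmp/backup-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SOURCE_DIR, arcname="company-data")
    return archive

def push_offsite(archive: str) -> None:
    s3 = boto3.client("s3")
    key = f"nightly/{archive.rsplit('/', 1)[-1]}"
    s3.upload_file(archive, BUCKET, key)
    print(f"uploaded {archive} -> s3://{BUCKET}/{key}")

if __name__ == "__main__":
    push_offsite(make_archive())
```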


Reuse of passwords is an extremely common security failure, as is neglecting to configure and use 2FA. Simply addressing those two things will do wonders to shrink your attack surface and prevent the preventable. Hate, Viper, and Kevwun are all correct as well: the only way to be 100% secure is to unplug; if you have to be plugged in, at least work to get your staff/IT team trained to make better decisions; and as one last safeguard, keep regular offsite backups AND TEST RESTORING FROM YOUR BACKUPS REGULARLY!

There's no silver bullet for this sort of thing, and the best way to address security concerns is through defense in depth. It's not so much that you're going to make yourself impenetrable, but you can reduce the blast radius when you do get pwned and reduce the frequency of security incidents.

On the note of reused passwords, it's always fun to check out https://haveibeenpwned.com/ to see which of your old emails/logins is out on the internet with a known password you currently use.
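If you'd rather script that check than paste passwords into a website, the Pwned Passwords range API lets you do it without sending the password anywhere: you hash it locally and only the first five characters of the SHA-1 hash go over the wire. A small Python sketch, based on my understanding of HIBP's k-anonymity endpoint:

```python
# Check a password against the Pwned Passwords k-anonymity range API: the password
# is hashed locally and only the first five hex characters of the SHA-1 ever leave
# your machine.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-reuse-check-sketch"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

if __name__ == "__main__":
    hits = pwned_count("hunter2")
    print(f"seen in {hits} breaches" if hits else "not found in known breaches")
```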


Day 15 in the books! Feeling a little better; I'm only waking up once or twice a night versus every hour.

Hardware and software are 100% back in order as of today. We now have a total of 9 backups between the cloud and the local hardware we installed Monday, and the cloud copies are real time minus about 15 minutes of data entry. We paid for real-time backups that run numerous times a day with a month of retention, and the local ghost hardware also holds a month's worth.


1 minute ago, markstanco said:

Day 15 in the books! Feeling a little better; I'm only waking up once or twice a night versus every hour.

Hardware and software are 100% back in order as of today. We now have a total of 9 backups between the cloud and the local hardware we installed Monday, and the cloud copies are real time minus about 15 minutes of data entry. We paid for real-time backups that run numerous times a day with a month of retention, and the local ghost hardware also holds a month's worth.

Yeah, but now your entire workforce will get coronavirus and half will die, and then comes bankruptcy.

Good job on getting back up to speed, though.


On 2/26/2020 at 11:48 PM, markstanco said:

So IT gurus, what is the most vulnerable thing you shake your head at?

I was at ground zero when the EBT system (food stamps) went tits up and caused grocery store chaos across the country. That was probably the biggest "WTF were you thinking" I've seen that actually made the news. I've seen a lot worse, but those came with NDAs; this food stamp fiasco was such a clusterfuck there was no hiding it.


markstanco said:

Day 15 in the books! Feeling a little better; I'm only waking up once or twice a night versus every hour.

Hardware and software are 100% back in order as of today. We now have a total of 9 backups between the cloud and the local hardware we installed Monday, and the cloud copies are real time minus about 15 minutes of data entry. We paid for real-time backups that run numerous times a day with a month of retention, and the local ghost hardware also holds a month's worth.

Looks like it would have been easier to just have paid them.

9 hours ago, markstanco said:

Day 15 in the books! Feeling a little better; I'm only waking up once or twice a night versus every hour.

Hardware and software are 100% back in order as of today. We now have a total of 9 backups between the cloud and the local hardware we installed Monday, and the cloud copies are real time minus about 15 minutes of data entry. We paid for real-time backups that run numerous times a day with a month of retention, and the local ghost hardware also holds a month's worth.

Glad to hear that you're out of the wilderness on the recovery. The next order of business after getting your backup solution in place is to test your restore workflow.

I can't tell you how many Fortune 500 enterprises get caught with their dicks in their hands during a DR event because they never bothered to test whether they could actually rebuild from the backups.


2 hours ago, Hate said:

All of them, because there is only one way to test that, which is to actually restore from said backups.

Well, the idea is that you test your backup and restore solution in a granular fashion (app by app or service by service) during maintenance windows, so you at least have some visibility into potential issues and recovery timelines before a real fire-drill DR scenario. You'll never be able to fully test your entire DR plan (in a non-cloud environment, at least) outside of an actual recovery event, but even smaller test runs can help identify otherwise unknown critical issues.
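To make that concrete, a granular restore drill can be as simple as pulling one backup archive into a scratch directory and checking file hashes against a manifest written at backup time. A rough Python sketch; the archive name and manifest format ("<sha256>  <relative path>" per line) are assumptions, not anyone's actual tooling:

```python
# Sketch of a small restore drill: pull one backup archive into a scratch directory
# and verify file checksums against a manifest captured at backup time. The archive
# name and manifest format ("<sha256>  <relative path>" per line) are assumptions.
import hashlib
import pathlib
import tarfile
import tempfile

def sha256_of(path: pathlib.Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(archive: str, manifest: str) -> bool:
    ok = True
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)  # restore into scratch space, never production
        root = pathlib.Path(scratch)
        for line in pathlib.Path(manifest).read_text().splitlines():
            expected, rel = line.split(maxsplit=1)
            restored = root / rel
            if not restored.exists() or sha256_of(restored) != expected:
                print(f"MISMATCH: {rel}")
                ok = False
    return ok

if __name__ == "__main__":
    print("restore OK" if verify_restore("backup-2020-03-01.tar.gz", "manifest.sha256") else "restore FAILED")
```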


I get it; I've just experienced it at a Fortune 500 company that had to report to the SEC and FTC that our DR solutions were airtight when in fact they weren't. That was 12 years ago, so things may have changed, but I seriously doubt it. The bigger the company, the less likely they are to be fully compliant, IMO. I could be wrong.


9 hours ago, Captainant said:

Glad to hear that you're out of the wilderness on the recovery. The next order of business after getting your backup solution in place is to test your restore workflow.

I can't tell you how many Fortune 500 enterprises get caught with their dicks in their hands during a DR event because they never bothered to test whether they could actually rebuild from the backups.

This is what caused the EBT fiasco I mentioned above. They failed over to a "hot site" that hadn't fully synced in the three years prior to the failover.


On 2/26/2020 at 11:48 PM, markstanco said:

Every night I google ransomware. If you google it now, you will see stories from one day ago. Every night.

So, IT gurus, what is the most vulnerable thing you shake your head at? I think mine was leaving the administrator account enabled; it's now disabled. Also, our RDP was open then and is now closed. Anything else?

Don't fault yourself. Some of them (unethical hackers) are very good at what they do. In addition to googling ransomware, make sure your team keeps up with CVE listings and the other public databases of known vulnerabilities. One of the best things you can do is protect yourself from publicly known vulnerabilities and have a patch management standard in place.

Patch management. Patch, patch. I hear that word a lot from experienced pros.

It's much easier to have standards, procedures, tabletop exercises, an incident response team (IRT), and so on at larger corps.
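On the CVE point, that monitoring can be scripted instead of googled every night. A rough Python sketch that pulls recent entries from NVD's public API by keyword; the endpoint and response shape are my assumptions about the NVD 2.0 REST API, and you'd want an API key and rate limiting for anything run on a schedule:

```python
# Rough sketch of pulling recent CVEs by keyword so "keep up with CVE databases"
# can be a scheduled job instead of a nightly Google. The endpoint and response
# shape are assumptions about NVD's public 2.0 REST API.
import json
import urllib.parse
import urllib.request

def recent_cves(keyword: str, limit: int = 5) -> list:
    params = urllib.parse.urlencode({"keywordSearch": keyword, "resultsPerPage": limit})
    url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?{params}"
    with urllib.request.urlopen(url) as resp:
        data = json.loads(resp.read())
    return [
        {
            "id": item["cve"]["id"],
            "summary": item["cve"]["descriptions"][0]["value"][:120],
        }
        for item in data.get("vulnerabilities", [])
    ]

if __name__ == "__main__":
    for cve in recent_cves("remote desktop"):
        print(cve["id"], "-", cve["summary"])
```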

 


Fortune 100 here... last year we got into a legal spat with one of our prime vendors, who had the audacity to send all our employees an email outlining how we were screwing them and why they turned us off.

Narrator: we were screwing them. We had developed entirely new, unauthorized applications based on their source platform, running in their infrastructure.

TL;DR: don't forget supplier diversity; at least know your options.

