Since news of the Capital One breach broke, the cybersecurity world has been speculating about how exactly the attack went down. Did one AWS contractor take advantage of privileged access granted by her former employer, or did she exploit a service misconfiguration? Meanwhile, security executives are scrambling to protect their organizations against such attacks; the betrayal of trust by a malicious insider is a gut punch to those who need to give employees, contractors, and business partners access to sensitive data.
When people hear about insider threats, they instinctively think about greedy or disgruntled employees misusing access for money or revenge. But the reality is that more often than not, insider threat is the result of mistakes or negligence by users.
Can an administrator who disregards security best practices, such as using “admin” as a password or failing to install new patches promptly, be considered a threat? Absolutely – willful negligence should definitely count as an insider threat. A well-intentioned but careless or apathetic administrator is just as capable of damaging an organization’s reputation and bottom line as a malicious insider. In 2018 alone, sheer negligence and carelessness accounted for an alarming 64% of breaches.
Unintentional insider threat occurs in all industries. In a recent case out of Silicon Valley, 540 million records containing information on Facebook users and their activities were left unprotected because of misconfigured AWS S3 buckets belonging to third parties Cultura Colectiva and At the Pool. Farther afield, poor system management, a lack of employee training, and network flaws and misconfigurations allowed hackers to breach the health data of 1.5 million Singaporeans, including the country’s prime minister.
The Facebook and SingHealth breaches make two important points. One, insiders are not just employees; they also include third parties such as contractors, business partners, and guest workers. These extensions of a company’s network stretch the definition of a trusted user. And two, unintentional insider threat comes in many forms.
Examples of Unintentional Insider Threat
Neglect and poor security hygiene can cause data exposure and irreparable damage to organizations and the civic lives of the customers they serve.
1. Accidental Exposure
In 2018, Georgia Tech mistakenly emailed the personal information of nearly 8,000 College of Computing students to their peers. The sensitive, personally identifiable information included ID and telephone numbers, dates of birth, addresses, GPAs, and national origins. The incident eroded student and public trust in a school renowned for its strength in computer science.
2. Service Misconfigurations
An 854 GB MongoDB database lacking password protection and login authentication leaked the resumes of 200 million Chinese job seekers. Misconfigurations like this hand hackers valuable ammunition for subsequent phishing attacks. Exposed data of this kind can also give organizations an unfair advantage in negotiations, or the ability to screen out candidates based on information that should remain private.
3. Exposed, Lost, or Improperly Discarded Files & Equipment
A software developer in the NSA’s TAO group took classified material, documents, and hacking tools home from 2010 to 2015, intending to work after hours toward a promotion. The improper removal and storage of those files put the intelligence community’s capabilities and methods at severe risk, rendering some of them unusable.
Because the room for human error is only growing, we believe intent should not be the sole determinant of whether someone is considered an insider threat. In fact, we may want to start using the term insider risk in its stead.
Defending Against Insider Threat
There is no shortcut or magic bullet to defend against insider threat. Organizations need to establish a robust security program built on the Protect-Detect-Respond framework. Here are some core best practices to establish:
- Access Lifecycle Management Based on the Principle of Least Privilege: From the moment a user is onboarded until they leave the organization, continuously ensure that the user is granted access and privileges only for the services required to do the job, and only for the time that such access is required.
- Stringent Service Configuration Management: Ensure that services are configured in accordance with security best practices and organizational policies, and that configuration changes are tightly controlled and audited. Configuration management is particularly important in the cloud; while the service provider is responsible for the security of the services they provide, properly configuring and using the service is your responsibility.
- Visibility, Early Detection and Rapid Response: Do you know what your admins are doing across all services, including SaaS and IaaS? Is someone creating a backdoor account in Salesforce? Is a user downloading unusually large volumes of data or behaving “suspiciously”? Getting early warning about emerging threats and risks allows security teams to nip damage in the bud before an incident blows up.
- Incident Response: Prepare for incidents as though it’s a matter of when, not if. Have a consolidated audit trail that lets you see what users did in different services and allows you to investigate incidents quickly and effectively. This is an area where training and fire drills help.
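To make the configuration-management practice above concrete, misconfiguration checks can be expressed as simple policy rules run against a service’s settings. This is only a minimal sketch; the configuration fields and rule set below are hypothetical stand-ins, not any particular cloud provider’s API:

```python
def audit_service_config(config):
    """Return a list of policy violations for one service's settings.
    Each rule mirrors a misconfiguration pattern discussed above:
    public exposure, missing authentication, and default passwords."""
    violations = []
    if config.get("public_access", False):
        violations.append("storage is publicly accessible")
    if not config.get("auth_required", True):
        violations.append("no authentication required")
    if config.get("admin_password") in ("admin", "password", ""):
        violations.append("default or empty admin password")
    return violations

# A bucket-like service left open to the world, with a default password:
risky = {"public_access": True, "auth_required": False, "admin_password": "admin"}
print(audit_service_config(risky))  # three findings

# A locked-down configuration produces no findings:
safe = {"public_access": False, "auth_required": True, "admin_password": "s8#Lq2vT"}
print(audit_service_config(safe))  # []
```

Running checks like these automatically on every configuration change, rather than during periodic audits, is what keeps a drifting setting from becoming the next exposed database.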
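The early-warning detection described in the visibility bullet above can be sketched as a per-user baseline comparison. The names and threshold here are hypothetical, and a real system would use richer behavioral models, but the core idea is flagging activity that deviates sharply from a user’s own history:

```python
from statistics import mean, stdev

def flag_unusual_download(history_mb, today_mb, z_threshold=3.0):
    """Flag today's download volume if it exceeds the user's historical
    daily mean by more than z_threshold standard deviations.
    history_mb: list of past daily download totals in megabytes."""
    if len(history_mb) < 2:
        return False  # not enough baseline data to judge
    mu = mean(history_mb)
    sigma = stdev(history_mb)
    if sigma == 0:
        return today_mb > mu  # flat baseline: any increase is unusual
    return (today_mb - mu) / sigma > z_threshold

# A user who normally downloads ~100 MB/day suddenly pulls 5 GB:
baseline = [90, 110, 105, 95, 100, 98, 102]
print(flag_unusual_download(baseline, 5000))  # True: worth investigating
print(flag_unusual_download(baseline, 115))   # False: within normal range
```

An alert like this does not prove malice; it simply gives the security team a reason to look before the incident blows up.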
Get Started Today
Implementing these best practices will not be an overnight accomplishment. In fact, organizations often take years to establish a mature security posture against insider threat. However, here are three areas you can focus on today to get disproportionate returns:
- Lock down production environments and administrator privileges: Restrict access to production environments based on need and usage. This means granting access to the right person, at the right time, and for the right purposes. Continuously monitor privileged accounts and revoke privileges when they are no longer needed.
- Enable logging across all apps and systems: Organizations often fail to take full advantage of their logging and audit capabilities. Ensure that logging is enabled and functioning in every environment.
- Finally, educate employees on security and privacy best practices: Training modules should cover common-sense topics like using MFA and password managers, and guarding data and company equipment in the office, at home, and while traveling. Provide additional training for employees who handle sensitive information such as healthcare, financial, or student data.
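The “right person, right time, right purpose” rule above amounts to making every privilege a time-bounded, revocable grant rather than a permanent entitlement. As a minimal sketch (the class and field names are hypothetical, not any particular IAM product):

```python
from datetime import datetime, timedelta, timezone

class AccessGrant:
    """A privilege granted to one user for one service, valid only
    until its expiry time or until explicitly revoked."""
    def __init__(self, user, service, duration_hours):
        self.user = user
        self.service = service
        self.expires_at = datetime.now(timezone.utc) + timedelta(hours=duration_hours)
        self.revoked = False

    def is_active(self):
        return not self.revoked and datetime.now(timezone.utc) < self.expires_at

def authorize(grants, user, service):
    """Allow access only if the user holds an active grant for this service."""
    return any(g.user == user and g.service == service and g.is_active()
               for g in grants)

# Grant an admin eight hours of access to the production database:
grants = [AccessGrant("alice", "prod-db", duration_hours=8)]
print(authorize(grants, "alice", "prod-db"))  # True: grant is active
print(authorize(grants, "alice", "billing"))  # False: never granted
grants[0].revoked = True                      # offboarding revokes it
print(authorize(grants, "alice", "prod-db"))  # False: revoked
```

Because every grant expires on its own, forgetting to offboard a contractor fails safe: access lapses instead of lingering indefinitely.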
Insider threat is a human issue and humans remain, even at their best, imperfect. By taking human error into account, security teams will be one step closer to squashing insider threat for good.