Cyber espionage: Gathering sensitive or classified data, trade secrets or other forms of intellectual property that a threat actor can use for an advantage.
Financial crime: Illegal activities whose primary goal is to make money.
Hacktivism: An individual or group who utilises hacking techniques to promote a political or social agenda.
Information operations: Coordinated actions taken to influence, disrupt or exploit an adversary's decision-making process.
Analyst Tradecraft: 1. Intelligence Analysis: Like detectives piecing together clues, CTI analysts use reasoning to figure out what happened and why. 2. Technology Expertise: Analysts need to understand hardware and software engineering, systems integration, networks and protocols, and exploits and vulnerabilities in order to spot issues.
Challenge of attribution and response: When attempting to find out who is behind an attack, incident responders typically assess both the indicators of compromise (IoCs) and the attack tactics, techniques and procedures (TTPs) observed during the attack. IoCs are a good place to start, but attacker infrastructure such as IP addresses and domains can be easily spoofed or rotated, obfuscating the attacker's real identity.
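A minimal sketch of why IoC-only matching is brittle (all indicator values and events below are hypothetical, using RFC 5737 documentation addresses). The set lookup stops firing the moment the adversary rotates infrastructure, while their TTPs tend to persist:

```python
# Atomic IoCs: cheap to match, but equally cheap for an adversary to change.
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}
KNOWN_BAD_DOMAINS = {"news.example-c2.com"}

def matches_ioc(event: dict) -> bool:
    """Flag an event whose destination hits the current watchlist."""
    return (event.get("dst_ip") in KNOWN_BAD_IPS
            or event.get("dst_domain") in KNOWN_BAD_DOMAINS)

events = [
    {"dst_ip": "203.0.113.7", "dst_domain": "news.example-c2.com"},
    # Same actor, freshly registered domain and new IP: no IoC match at all.
    {"dst_ip": "192.0.2.88", "dst_domain": "cdn.fresh-c2.example"},
]

for e in events:
    print(e, "-> IoC hit" if matches_ioc(e) else "-> missed (rotated infrastructure)")
```

This is why analytics keyed on TTPs (the delivery and persistence behaviours themselves) are more durable than indicator lists alone.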
2 types of thinking: System 1 - intuitive and fast; permits quick judgements and is how we perceive the world around us. System 2 - analytical, slow and deliberate; activated when we do something that does not come naturally and requires thinking through.
Cognitive bias in CTI: Cognitive biases are mental shortcuts that sometimes lead us astray. Think of them as illusions for the brain.
5 most common analytical traps: 1. Failing to consider multiple hypotheses or explanations. 2. Ignoring inconsistencies. 3. Rejecting evidence that does not support the hypothesis. 4. Lacking sufficient resources to capture key evidence. 5. Improperly projecting past experience.
Failure to consider visibility: A form of failing to consider multiple hypotheses or explanations. Different organisations have different views of the threat landscape, shaped by their environment, country and industry. Example: a suspicious email with an unknown backdoor sent to the CFO looks targeted, but if the same activity is hitting customers of European banks broadly, it may instead be regionally focused cyber crime.
Mixing facts with assessments: Results in failure to cope with evidence of uncertain accuracy. Example: the Team Wombat domain news.myworldnews.com resolved to the same IP address as mail.mediacorp.com (fact). Possible misinterpretation: concluding that mail.mediacorp.com is attributable to Team Wombat (assessment).
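A minimal sketch of keeping the fact (co-resolution) separate from the assessment (shared ownership); the passive-DNS records below are hypothetical. On shared hosting, one IP can serve thousands of unrelated domains, so co-resolution alone is weak evidence:

```python
# Hypothetical passive-DNS records: ip -> domains observed resolving to it.
PASSIVE_DNS = {
    "198.51.100.10": ["news.myworldnews.com", "mail.mediacorp.com",
                      "blog.smallbiz.example", "shop.flowers.example"],
}

def co_resolving(domain: str) -> list[str]:
    """FACT: every other domain seen resolving to the same IP as `domain`."""
    return [d for ds in PASSIVE_DNS.values() if domain in ds
            for d in ds if d != domain]

neighbours = co_resolving("news.myworldnews.com")
print(f"fact: co-resolves with {len(neighbours)} other domains: {neighbours}")
# ASSESSMENT (analyst judgement, requires further evidence): are any of
# these attributable to the same actor, or is this just shared hosting?
```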
Failing to properly vet sources: Threat intelligence lives and dies on the quality of its inputs: garbage in, garbage out. However, many organisations start their threat intelligence program by signing up for a series of open source threat feeds without a proper vetting process in place, which can result in a flood of alerts that are difficult to trust or differentiate.
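A minimal sketch of basic feed vetting under assumed trust weights (the feed names, trust scores and indicators below are all hypothetical): deduplicate indicators across sources and require corroboration, or a trusted source, before an indicator may alert.

```python
from collections import defaultdict

# Assumed per-source trust weights; unknown sources default to 0.1.
FEED_TRUST = {"commercial_vendor": 0.9, "osint_feed_a": 0.4, "osint_feed_b": 0.3}

def vet(feed_entries, min_score=0.7):
    """Aggregate per-indicator trust across feeds; alert only above threshold."""
    scores = defaultdict(float)
    for source, indicator in feed_entries:
        scores[indicator] += FEED_TRUST.get(source, 0.1)
    return {i: s for i, s in scores.items() if s >= min_score}

entries = [
    ("osint_feed_a", "203.0.113.7"),
    ("osint_feed_b", "203.0.113.7"),    # corroborated across two OSINT feeds
    ("osint_feed_a", "192.0.2.55"),     # single low-trust sighting: suppressed
    ("commercial_vendor", "198.51.100.23"),
]

print(vet(entries))  # only corroborated or well-sourced indicators alert
```

Real platforms also age out stale indicators and feed accuracy back into the source scores, but the principle is the same: vet before you alert.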
Failure to account for human action: In computer operations we deal with data, but it is easy to forget that there is a person behind the keyboard. Our minds naturally want to sort and categorise information to make sense of the environment, and are not always comfortable with grey areas.
Common Biases: 1. Confirmation Bias: Seeing what you expect to see, like ignoring evidence against your belief. 2. Ambiguity Effect: Avoiding decisions because of incomplete information. 3. Bandwagon Effect: Believing something just because everyone else does.
Impact on Cybersecurity: Bias can cause analysts to misjudge situations, like assuming an attack on multiple targets must be highly organized without verifying the evidence.
Bias is inherent, and even awareness of biases is not enough to neutralise them. What can we do? Heuer suggests that when presented with an outcome, we ask ourselves the following questions: 1. If the opposite outcome had occurred, would I be surprised? 2. If this report had told me the opposite, would I believe it? 3. If the opposite outcome had occurred, would it have been predictable given the information that was available?
Structured Analytical Techniques: Frameworks to ensure logical and unbiased analysis. Pros: 1. Promote collaboration and clarity. 2. Show the reasoning process behind conclusions, making them more transparent.
Intelligence lifecycle: 1. Planning and requirements, 2. Collection, 3. Analysis, 4. Production, 5. Dissemination and feedback.
Planning and requirements: stakeholders, business needs and information concerns are defined.
Collection: raw internal and external data gathered from information sources: open source, commercial and sensitive.
Analysis: collation and aggregation via a threat intel platform or analyst best practices.
Production: estimative language, challenge analysis.
Dissemination and feedback: role-based intelligence reporting; feedback loop firmly established.
Refer to the case study for more details.
Diamond Model: Connects the dots between attackers, victims, tools, and infrastructure. Four Elements: 1. Adversary: The attacker or group. 2. Infrastructure: Tools and assets like servers used in the attack. 3. Capability: The methods or techniques used (e.g., malware). 4. Victim: The target.
Considerations for the diamond model: 1. Timestamp: date and time the intrusion event occurred. 2. Result: outcome of the intrusion (success, failure or unknown). 3. Direction: how the event moved through the network or host (e.g. victim-to-infrastructure, adversary-to-infrastructure, bidirectional). 4. Methodology: category of event (port scan, spear phishing). 5. Resources: elements required for the intrusion (e.g. particular software, knowledge, funds, facilities, access rights). 6. Socio-political: the relationship between adversary and victim. 7. Technology: the technology involved in the adversary's capabilities and use of infrastructure.
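A minimal sketch of a single Diamond Model intrusion event as a data structure, combining the four vertices with the meta-features listed above; all field values in the example are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DiamondEvent:
    # The four core vertices
    adversary: str                # attacker or group (often starts as "unknown")
    capability: str               # method or technique, e.g. a malware family
    infrastructure: str           # assets used, e.g. a C2 domain or server
    victim: str                   # the target
    # Meta-features
    timestamp: datetime           # when the intrusion event occurred
    result: str = "unknown"       # success | failure | unknown
    direction: str = "adversary-to-infrastructure"
    methodology: str = ""         # e.g. spear phishing, port scan
    sociopolitical: str = ""      # relationship between adversary and victim
    technology: str = ""          # tech behind capability and infrastructure
    resources: list[str] = field(default_factory=list)

event = DiamondEvent(
    adversary="unknown",
    capability="macro-based backdoor",
    infrastructure="news.myworldnews.com",
    victim="finance department workstation",
    timestamp=datetime(2024, 3, 1, 9, 30),
    result="success",
    methodology="spear phishing",
    resources=["phishing kit", "VPS hosting"],
)
print(f"{event.adversary} -> {event.victim} via {event.capability}")
```

Pivoting between vertices (e.g. from a known capability to the infrastructure that delivers it) is how analysts grow a single event into a campaign picture.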
Example: LAPSUS$ used social engineering to breach companies like Okta and Microsoft, demonstrating how attackers exploit human and technical weaknesses. Refer to the case study for more details.
Cyber Kill Chain (7 stages): 1. Reconnaissance: Spying on the target to find weaknesses. 2. Weaponization: Creating tools like malicious emails or files. 3. Delivery: Sending the malicious tool to the target. 4. Exploitation: Activating the tool to break in. 5. Installation: Planting backdoors for ongoing access. 6. Command and Control (C2): Controlling infected machines remotely. 7. Actions on Objectives: Achieving the attacker's goal, like stealing data or causing disruption.
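A minimal sketch (the observed log details below are hypothetical) of mapping confirmed activity onto the seven stages to show how far an intrusion progressed; disrupting any single stage breaks the chain.

```python
KILL_CHAIN = ["reconnaissance", "weaponization", "delivery", "exploitation",
              "installation", "command_and_control", "actions_on_objectives"]

# What we have actually confirmed from logs and forensics so far.
observed = {
    "delivery": "phishing email with .doc attachment",
    "exploitation": "macro executed on document open",
    "installation": "persistence via registry run key",
}

for i, stage in enumerate(KILL_CHAIN, 1):
    marker = "OBSERVED" if stage in observed else "-"
    print(f"{i}. {stage:<22} {marker:<9} {observed.get(stage, '')}")

# The deepest confirmed stage (installation, but no C2 traffic yet) tells
# the responder where to focus containment and what evidence to hunt next.
deepest = max((KILL_CHAIN.index(s) for s in observed), default=-1)
print("deepest stage reached:", KILL_CHAIN[deepest] if deepest >= 0 else "none")
```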