Apurv Singh Gautam

Undercover Operations: Scraping the Cybercrime Underground

Web scraping is a powerful and essential capability for cybercrime intelligence professionals.

January 15, 2025

This blog is jointly authored by Apurv Singh Gautam, Sr. Threat Research Analyst at Cyble, and Sean O’Connor, co-author of the SANS FOR589™: Cybercrime Intelligence™ course.

In the rapidly evolving cybercrime landscape, staying ahead of malicious actors requires a proactive approach to gathering and analyzing data. One of the most powerful tools in the arsenal of cybercrime intelligence analysts is web scraping.

What is Web Scraping?

Web scraping is the process of automatically extracting data from websites and web pages, enabling organizations to collect vast amounts of information quickly and efficiently.

Web scraping plays a crucial role in cybercrime intelligence by enabling analysts to keep a close watch on dark web forums, marketplaces, and other online chat platforms where cybercriminals gather and exchange information. By systematically collecting data from these sources, analysts can unearth valuable insights into emerging threats, vulnerabilities, and malicious actors' tactics, techniques, and procedures (TTPs). These insights can guide decision-making, bolster incident response capabilities, and fortify overall cybersecurity.

In this blog post, we will explore the various aspects of scraping operations on the cybercrime underground. We will delve into the different scraping methods, discuss the key use cases in cybercrime investigations, and examine the challenges of anti-scraping mechanisms and techniques to bypass them. Importantly, we will provide insights into the strategic decision-making process behind scraping operations and highlight the importance of data storage and analysis using tools like Elasticsearch and Kibana.

By the end of this post, readers will have a comprehensive understanding of how scraping operations can be leveraged to enhance cybercrime intelligence efforts. This knowledge will empower you, as a cybersecurity intelligence analyst, with the best practices and considerations for implementing effective scraping workflows (see Figure 1). So, let's dive in and uncover the power of scraping in the fight against cybercrime.

Figure 1: Scraping Workflow (LLM Generated Image from Napkin AI)

Scraping Toolsets

Scraping operations involve leveraging different software tools, libraries, and frameworks to extract data programmatically, enabling efficient and scalable information gathering (see Figure 2).

Figure 2: Scraping Toolsets (LLM Generated Image from Napkin AI)
  • Python Libraries and Frameworks: The most popular tools for these scraping tasks include Python libraries and frameworks. BeautifulSoup is a Python library that simplifies parsing HTML and XML documents with an intuitive interface for navigating and searching parsed data, making it a user-friendly choice for web scraping. Another key library, Requests, facilitates HTTP requests and is often used with BeautifulSoup to fetch web pages and extract data, providing a clean API for handling requests, cookies, and authentication. Scrapy, a more robust Python framework, offers a comprehensive ecosystem for building and managing scrapers, with features like request handling, data extraction, multithreading, and pipeline management. Another application-specific library is Telethon, which is used to scrape Telegram messages.
  • JavaScript Scraping: For scraping JavaScript-based sources, Puppeteer is a powerful Node.js library that enables programmatic control of headless Chrome or Chromium browsers. It is ideal for scraping dynamic web pages and handling complex scenarios like user interactions and JavaScript rendering. Browser automation tools such as Selenium and Playwright are also widely used. Selenium supports multiple programming languages and simulates user interactions, making it effective for scraping JavaScript-heavy websites. Playwright, developed by Microsoft, provides a unified API for automating interactions across different browsers, offering a modern alternative to Selenium.
  • Proxies: Proxies play a crucial role in scraping operations by enhancing privacy and bypassing restrictions. Tools like Privoxy, a non-caching web proxy with advanced filtering capabilities, support traffic routing through networks like Tor. Another proxy tool, Proxychains, forces applications to route TCP connections through Tor, SOCKS4/5, or HTTP(S) proxies. Tor itself is a widely used anonymity network for routing traffic. A minimal sketch combining Requests, BeautifulSoup, and a Tor proxy follows this list.
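
To ground these toolsets, here is a minimal sketch that fetches a page through the local Tor SOCKS proxy with Requests (this requires the `requests[socks]` extra) and parses it with BeautifulSoup. The target URL and CSS selector are hypothetical placeholders, not a real forum:

```python
# Minimal sketch: fetch a page over the local Tor SOCKS proxy and parse
# thread titles with BeautifulSoup. URL and selector are placeholders.
import requests
from bs4 import BeautifulSoup

# "socks5h" (not "socks5") resolves DNS through Tor, so .onion hostnames
# never hit the local resolver.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}
HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

resp = requests.get(
    "http://forum.example.onion/threads",  # hypothetical target
    proxies=TOR_PROXIES,
    headers=HEADERS,
    timeout=60,
)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
for link in soup.select("a.thread-title"):  # hypothetical selector
    print(link.get_text(strip=True))
```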

Cybercrime Intelligence Use Cases for Scraping

Scraping operations play a crucial role in gathering valuable intelligence for combating cybercrime. By leveraging automated scraping techniques, cybercrime investigators and analysts can collect and analyze vast amounts of data from cybercrime sources. Some of the key use cases of scraping operations on the cybercrime underground include monitoring cybercrime forums and marketplaces, detecting data leaks and breaches, tracking and profiling threat actors, and investigating cybercrime networks and infrastructure (see Figure 3).

Figure 3: Scraping Use Cases (LLM Generated Image from Napkin AI)
  1. Monitoring Cybercrime Forums and Marketplaces: The cybercrime underground is a hotbed for cybercriminal activities, with numerous forums, marketplaces, and chat platforms for sharing technical knowledge, selling stolen data, and coordinating attacks. Scraping these forums and marketplaces provides valuable insights into the latest trends, tools, and techniques cybercriminals use. By monitoring conversations and tracking the sale of illicit goods, analysts can stay ahead of emerging threats and proactively defend against potential attacks.
  2. Detecting Data Leaks and Breaches: The scraped data from these forums and marketplaces can be used to search for specific keywords or patterns, such as company names, product names, or sensitive data, to identify potential exposures quickly. Early detection of data leaks allows organizations to take swift action, notify affected parties, and mitigate the breach's impact. A toy keyword-matching sketch follows this list.
  3. Tracking and Profiling Threat Actors: Cybercriminals often leave digital footprints across multiple online platforms, including forums, social media, and code repositories. By scraping these sources, analysts can gather information about specific threat actors, their aliases, and their activities. This data can be used to build comprehensive profiles of cybercriminals, understand their motivations, and track their movements across the web. Profiling threat actors helps in attribution efforts and enables targeted investigations and takedowns.
  4. Investigating Cybercrime Networks and Infrastructure: Cybercriminals often rely on a complex network of infrastructure, including command and control (C2) servers, proxy servers, and bulletproof hosting providers. Analyzing indicators of compromise (IOCs) from scraped data and enriching them with domain registration records, IP address ranges, malware analysis, and server configurations can help uncover these infrastructure components and map out the cybercrime infrastructure. Analysts can also identify key players, disrupt operations, and gather evidence for legal proceedings.
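
As referenced in use case 2 above, leak detection over already-scraped data can be as simple as a keyword watchlist. The sketch below is a toy illustration; the watchlist terms and sample records are invented:

```python
# Toy sketch: scan scraped posts for watchlist terms that may indicate a
# data leak. Terms and records are illustrative only.
import re

WATCHLIST = [r"examplecorp\.com", r"example\s*corp", r"vpn credentials"]
PATTERN = re.compile("|".join(WATCHLIST), re.IGNORECASE)

scraped_posts = [
    {"author": "actor1", "body": "Selling 40k examplecorp.com logins"},
    {"author": "actor2", "body": "Fresh combolists, hit me up"},
]

for post in scraped_posts:
    match = PATTERN.search(post["body"])
    if match:
        print(f"ALERT: '{match.group(0)}' mentioned by {post['author']}")
```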

By leveraging automated scraping techniques, cybercrime intelligence analysts can gather valuable data, identify trends and patterns, and make informed decisions to prevent, detect, and respond to cyber threats.

Anti-Scraping Mechanisms

Website owners and administrators implement anti-scraping mechanisms to protect their content and prevent unauthorized data collection. These mechanisms pose significant challenges for cybercrime intelligence professionals who rely on scraping to gather valuable data. Some of the common anti-scraping techniques employed by websites include:

  1. Access Control and Authentication: CAPTCHAs and human verification challenges, such as distorted text or image recognition tasks, are among the most widely used anti-scraping measures. These challenges, designed to differentiate human users from automated bots, significantly impede data collection efforts.
  2. User Agent Detection and Blocking: Websites can analyze the user agent string and headers the client sends to identify and block requests originating from scraping scripts. The website may deny access or serve alternate content if the user agent string does not indicate a legitimate human-like browser request.
  3. IP Address Tracking and Blocking: Websites can monitor the rate and pattern of requests from specific IP addresses. If an IP address is detected making an unusually high number of requests within a short time frame, the website may flag it as a potential scraper and block or throttle its access.
  4. DDoS Protection and Rate Limiting: Distributed denial of service (DDoS) protection measures, such as introducing delays in content delivery, can effectively limit the rate at which scrapers can access and extract data. These delays are designed to distinguish between legitimate human users and automated scripts. Website administrators utilize services like Cloudflare and Imperva to provide DDoS protection and rate limiting.
  5. Dynamic Content Rendering and JavaScript Obstacles: Some websites heavily rely on JavaScript to render content dynamically, making it challenging for traditional scraping tools to extract data. The desired information may not be present in the initial HTML response and requires executing JavaScript code to load and display the content.
  6. Cookie Change and Account Blocking: Website administrators may monitor and block user accounts that exhibit suspicious scraping behavior. Moreover, periodically changing cookies is another strategy website administrators employ to deter scraping. Scrapers that rely on static cookies may encounter difficulties when the cookies are frequently updated.
  7. Fingerprinting and Browser Profiling: Advanced anti-scraping systems employ fingerprinting techniques to identify and block automated scraping tools. These techniques analyze various client attributes, such as browser version, installed plugins, screen resolution, and system fonts, to create a unique fingerprint. These fingerprinting mechanisms can detect whether a request comes from a genuine browser.

Anti-Scraping Countermeasures

To successfully navigate the challenges posed by anti-scraping mechanisms, cybercrime intelligence analysts must employ a range of countermeasures and best practices. These techniques aim to mitigate the risk of detection and ensure the continuity of scraping operations. Some of the strategies to bypass anti-scraping measures and conduct effective scraping for cybercrime intelligence purposes include:

  1. Handling CAPTCHAs and Solving Challenges: Dealing with CAPTCHAs is one of the most significant hurdles in scraping operations. While some CAPTCHAs can be solved using automated optical character recognition (OCR) techniques, more sophisticated challenges often require human intervention. CAPTCHA-solving/bypass services, such as Anti-Captcha or 2Captcha, can be integrated into scraping workflows to outsource CAPTCHA-solving to human workers, allowing scrapers to proceed without manual intervention.
  2. Rotating User Agents and Headers: One of the most basic countermeasures is to rotate user agent strings and headers to mimic legitimate browser requests. By using a pool of diverse user agent strings and regularly rotating them between requests, scrapers can avoid detection based on a single, consistent user agent. Customizing headers such as "Referer" and "Accept-Language" can also help make requests appear more human-like.
  3. Utilizing Proxy Servers and IP Rotation: Employing proxy servers and implementing IP rotation strategies can help circumvent IP-based blocking and rate limiting. By routing requests through a network of proxy servers, scrapers can distribute their traffic across multiple IP addresses, making it harder for websites to detect and block individual scraping sessions. Proxy and VPN services offer various proxies, such as residential, data center, and mobile, each with advantages and use cases.
  4. Mimicking Human Behavior and Introducing Random Delays: To avoid triggering rate-limiting mechanisms and appearing as a bot, scrapers should introduce random delays between requests and mimic human browsing behavior. This can involve adding random pauses, varying the time intervals between requests, and simulating human-like actions such as scrolling, clicking, and mouse movements. Tools like Playwright and Selenium can be used to mimic human behavior and introduce delays. A condensed sketch of countermeasures 2 through 4 follows this list.
  5. Handling Dynamic Content and JavaScript Rendering: For websites that heavily rely on JavaScript to render content dynamically, scraping requires the use of headless browsers or tools like Puppeteer or Selenium. These tools can simulate user interactions, execute JavaScript code, and retrieve fully rendered HTML content. By leveraging these technologies, scrapers can extract data from websites that would otherwise be challenging to scrape using traditional methods.
  6. Continuously Changing Cookies and Rotating Accounts: Websites use cookies to track users' sessions and behavior. A scraper can avoid being recognized as a single continuous user by continuously changing cookies. This can be done manually by clearing cookies at regular intervals or automatically by scripting the scraper to change or reset cookies periodically. This technique helps evade tracking mechanisms that rely on cookie data to identify and block scrapers. Similarly, if the website requires user accounts for access, rotating between different accounts can prevent any single account from being flagged for excessive use or unusual activity. This practice helps distribute the load and masks the automated nature of the interactions.
  7. Bypassing Fingerprinting and Browser Profiling: Analysts need to be aware of the browser fingerprinting methods and take steps to mimic a genuine browser environment. This can involve using headless browsers with configurations resembling popular browser setups and periodically rotating browser profiles to avoid detection. Services like Am I Unique or Cover Your Tracks can be used to check browser fingerprinting.
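
The helper below condenses countermeasures 2 through 4: it rotates user agents and proxies and adds jittered delays between requests. The user-agent strings and proxy endpoints are placeholders, not working infrastructure:

```python
# Sketch: rotate user agents and proxies and add human-like jitter
# between requests. Strings and proxy endpoints are placeholders.
import random
import time

import requests

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",        # placeholder
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) ...",  # placeholder
]
PROXIES = ["http://proxy1.example:8080", "http://proxy2.example:8080"]

def polite_get(url: str) -> requests.Response:
    headers = {
        "User-Agent": random.choice(USER_AGENTS),
        "Accept-Language": "en-US,en;q=0.9",
        "Referer": "https://www.google.com/",
    }
    proxy = random.choice(PROXIES)
    time.sleep(random.uniform(3, 10))  # random pause to mimic a human
    return requests.get(
        url,
        headers=headers,
        proxies={"http": proxy, "https": proxy},
        timeout=30,
    )
```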

Anti-scraping mechanisms are constantly evolving, and websites may implement new measures to deter scrapers. As cybercrime intelligence professionals, your adaptability is key. You must continuously monitor your scraping operations, detect any changes or disruptions, and adapt your techniques accordingly. Staying up-to-date with the latest scraping technologies, tools, and best practices is essential to maintain effective scraping efforts (see Table 1).

| Anti-Scraping Mechanism | Intent | Countermeasure | Impact on Scraping Operations |
|---|---|---|---|
| CAPTCHAs and human verification challenges | Differentiate human users from automated bots | Utilize CAPTCHA-solving services or incorporate human intervention | Increases the complexity and time required for scraping |
| User agent detection and blocking | Identify and block requests from known scraping tools or libraries | Rotate user agent strings and customize headers to mimic legitimate browser requests | Requires additional effort to maintain a pool of diverse user agent strings |
| IP address tracking and rate limiting | Detect and block requests from IP addresses making excessive requests | Employ proxy servers and implement IP rotation strategies to distribute requests across multiple IP addresses | Increases the cost and complexity of scraping infrastructure |
| Dynamic content rendering and JavaScript obstacles | Prevent scraping of content loaded dynamically through JavaScript | Use headless browsers or tools like Puppeteer or Selenium to render and extract dynamically loaded content | Increases the complexity and computational resources required for scraping |
| Fingerprinting and browser profiling | Identify and block automated tools based on unique browser characteristics | Use headless browsers with configurations that closely resemble genuine browser environments and rotate browser profiles | Requires continuous monitoring and adaptation to avoid detection |
| Browser automation detection | Identify and block requests from automated browser tools like Selenium or Puppeteer | Minimize the use of automation-specific code and utilize techniques like WebDriver spoofing or undetected Chrome variants | Requires staying updated with the latest detection methods and countermeasures |
| CAPTCHA and challenge-response evolution | Prevent automated solving of CAPTCHAs using advanced challenge-response mechanisms | Continuously monitor and integrate the latest CAPTCHA-solving techniques and services | Increases the complexity and cost of handling CAPTCHAs in scraping operations |

Table 1: Anti-Scraping Mechanisms and Countermeasures

Scraping Decision Making

Effective scraping operations require careful planning and strategic decision-making. Several factors must be considered to ensure that scraping efforts are targeted, efficient, and aligned with the overall objectives of the investigation. The key aspects of decision-making when planning and executing scraping tasks for cybercrime intelligence purposes include:

  1. Identifying Relevant Data Sources: Analysts must carefully select the websites, forums, marketplaces, and other online platforms likely to contain relevant information. Factors to consider include the data source's reputation and credibility, the volume and quality of the available data, and the information's relevance to the specific case at hand.
  2. Evaluating Technical Feasibility and Resource Requirements: A scraping operation's technical feasibility and resource requirements must be carefully evaluated. This involves assessing the complexity of the target websites, the presence of anti-scraping mechanisms, the need for specialized tools or expertise, and the decision between self-built or paid tools. Analysts should consider factors such as the scalability of the scraping infrastructure, the storage and processing capacity for the collected data, and the availability of skilled personnel to design and execute the scraping tasks.
  3. Designing Resilient and Adaptable Scraping Workflows: Cybercrime intelligence professionals should design scraping workflows resilient to changes in website structures, anti-scraping measures, and data availability. This involves implementing robust error handling, monitoring mechanisms, and failover strategies to ensure the continuity of scraping operations.
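
For instance, the robust error handling called for in point 3 might look like the following sketch, which retries transient failures with exponential backoff and logs persistent failures for analyst follow-up; the function name is illustrative:

```python
# Sketch: retry transient fetch failures with exponential backoff and
# surface persistent failures to monitoring via logging.
import logging
import time

import requests

log = logging.getLogger("scraper")

def fetch_with_retries(url: str, max_attempts: int = 4) -> str | None:
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            return resp.text
        except requests.RequestException as exc:
            log.warning("attempt %d/%d failed for %s: %s",
                        attempt, max_attempts, url, exc)
            time.sleep(2 ** attempt)  # backoff: 2s, 4s, 8s, ...
    log.error("giving up on %s; flag for analyst review", url)
    return None
```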

Storage and Analysis

Efficient storage, processing, and analysis of the data collected through scraping operations are crucial for deriving actionable insights in cybercrime investigations. Any database can store the data, but the Elastic (ELK) stack provides a robust and scalable solution for managing and exploiting scraped data.

Elasticsearch: Elasticsearch is a distributed, open-source search and analytics engine that forms the core of the ELK stack. It provides a scalable and efficient platform for storing, searching, and analyzing large volumes of structured and unstructured data.

Logstash: Logstash is a data processing pipeline that integrates with Elasticsearch to ingest, transform, and load data from various sources.

Kibana: Kibana is a powerful data visualization and exploration tool that complements Elasticsearch and Logstash in the ELK stack. It provides an intuitive web interface for querying, visualizing, and dashboarding the data stored in Elasticsearch.

To leverage the Elastic stack for scraped-data storage and analysis, cybercrime intelligence analysts must integrate their scraping operations with it. This involves configuring the scraping tools or scripts to output the collected data in a format compatible with Logstash's input plugins, such as JSON or CSV. Once Logstash ingests the scraped data, it can be processed, enriched, and transformed using Logstash's pipeline configuration. The transformed data is then indexed in Elasticsearch, where it becomes available for searching, querying, and visualization through Kibana. Kibana's Discover and Visualization tools (see Figures 4 and 5) can analyze and visualize the data using tables and charts.
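
As a minimal sketch of the ingestion step, the snippet below indexes a scraped post directly with the official `elasticsearch` Python client rather than routing through Logstash, a reasonable shortcut for small collections. It assumes a local Elasticsearch node; the index name and fields are illustrative:

```python
# Sketch: index one scraped post into a local Elasticsearch node.
# Index name and document fields are assumptions for illustration.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

doc = {
    "forum": "exampleforum",              # placeholder source name
    "author": "desorden",
    "title": "Selling company database",  # placeholder post title
    "body": "...",                        # raw post text goes here
    "scraped_at": datetime.now(timezone.utc).isoformat(),
}
es.index(index="cybercrime-posts", document=doc)
```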

Figure 4: Kibana Discover Tool

Figure 5: Kibana Visualize Tool

The Elastic stack's combination of search and analytics capabilities, enrichment features, and intuitive visualization and exploration tools enables efficient and effective extraction of insights from collected data.

Case Study

CHAOTIC SPIDER, also known as "Desorden" and previously operating as "ChaosCC," is a financially motivated cybercriminal entity active since September 2021. The group specializes in data theft and extortion, focusing its efforts on enterprises in Southeast Asia, particularly in Thailand since 2022. Desorden employs SQL injection attacks to compromise web-facing servers, exfiltrating data without resorting to encryption or destruction tactics commonly seen in double-extortion schemes. Their primary activities include targeting prominent organizations such as Acer, Ranhill Utilities, and various Thai firms, with stolen data sold on cybercriminal forums like RaidForums and BreachForums (see Figure 6). The group's last known activity was recorded in October 2023, emphasizing their preference for secrecy through secure communication channels such as Tox and private messaging. Desorden’s operational focus highlights the urgent need for regional businesses to implement robust cybersecurity defenses, including multi-factor authentication, enhanced monitoring, and employee security awareness training, to mitigate potential threats.

Figure 6: DESORDEN post on BreachForums

Profiling Desorden's activities offers critical insights into their methods and objectives. Manual profiling through cybercrime forums provides a detailed analysis of their data sales and targeted industries but is time-intensive and subject to challenges such as post deletions and forum volatility (see Figure 7).

Figure 7: DESORDEN posts on BreachForums

Automated profiling, on the other hand, leverages tools like Elasticsearch and Kibana to index and query forum data efficiently, offering enhanced visibility and historical insights into the group's operations across multiple forums. For example, Kibana allows analysts to quickly identify posts authored by Desorden, uncovering a range of high-profile victims across Southeast Asia (see Figure 8).

Figure 8: DESORDEN posts on cybercrime forums visualized in Kibana
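
The same pivot can be expressed programmatically. Below is a hedged sketch that queries the index from the ingestion example for posts matching the actor's handle; the index and field names are assumptions carried over from that sketch:

```python
# Sketch: pull all indexed posts attributed to one actor handle, newest
# first. Index and field names follow the earlier ingestion sketch.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="cybercrime-posts",
    query={"match": {"author": "desorden"}},
    sort=[{"scraped_at": {"order": "desc"}}],
    size=50,
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["scraped_at"], hit["_source"]["title"])
```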

While automated systems provide scalability and operational security, they benefit from being complemented by human intelligence (HUMINT) to add context and nuance. This blended approach ensures a comprehensive understanding of threat actor behaviors, enabling organizations to stay ahead of evolving cyber threats posed by entities like CHAOTIC SPIDER.

The case study demonstrates the practical application of scraping operations in investigating data leaks on the cybercrime underground, from identifying and analyzing leaked credentials and sensitive information related to a specific organization to profiling the actors selling them.

Utilizing Large Language Models (LLMs) for Scraping Operations

The advancements in LLMs have opened up new possibilities for enhancing cybercrime intelligence, including scraping operations and data analysis. Some applications of LLMs in the context of scraping and analyzing data for cybercrime investigations include:

  1. Automated Scraping Script Generation: Script generation and improvement are widely used features of LLMs. By providing the LLM with a description of the target website or data source and the desired data fields to extract, LLMs can generate a customized scraping script in a specific programming language. This can save significant time and effort in the development process, especially when dealing with multiple data sources or frequently changing website structures. LLMs can also refine an already-built script for scaling purposes, making the process more efficient and productive.
  2. Text Summarization and Insight Generation: LLMs can generate summaries and extract key insights from large volumes of scraped text data. The model can be trained to identify and highlight the most relevant information from scraped forum discussions, chat logs, or marketplace listings by fine-tuning an LLM on domain-specific cybercrime-related content. For example, an LLM can generate concise summaries of lengthy threads discussing new attack techniques, summarize key points from posts related to a specific cybercrime campaign, or identify emerging trends and patterns across multiple data sources, keeping analysts informed and up-to-date with timely intelligence. A sketch of this use case follows the list.
  3. Multilingual Analysis and Translation: Cybercrime activities often span across different countries and languages. LLMs with multilingual capabilities can be leveraged to analyze scraped data in various languages and provide automated translations. For example, an LLM can be used to detect the language of scraped text automatically, translate the content into a common language (e.g., English) for analysis, or identify key entities and relationships across multiple languages. This can significantly expand the scope and effectiveness of cybercrime investigations by incorporating data from diverse linguistic backgrounds.
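
Below is a hedged sketch of the summarization use case using the OpenAI Python client; the model choice and prompts are illustrative, and any comparable chat-completion API could be substituted. Real deployments should also weigh data-handling policies before sending scraped content to a third-party API:

```python
# Sketch: summarize a scraped forum thread for an analyst. The model
# name and prompts are illustrative choices, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

thread_text = "..."  # concatenated posts from one scraped thread

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model
    messages=[
        {"role": "system",
         "content": ("Summarize cybercrime forum threads for an analyst. "
                     "List actors, tools, targets, and claimed data.")},
        {"role": "user", "content": thread_text},
    ],
)
print(response.choices[0].message.content)
```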

It's important to note that while LLMs offer promising applications in scraping operations, they also have limitations and potential biases. LLMs are trained on vast amounts of data and may sometimes generate irrelevant or incorrect information. Therefore, analysts play a crucial role in carefully reviewing and validating LLM outputs for accuracy and relevance; human review remains an integral part of the process.

Final Thoughts on Scraping the Underground

Web scraping is a powerful and essential capability for cybercrime intelligence professionals, offering unmatched efficiency in gathering and analyzing data from the cybercrime underground. By leveraging tools like Python libraries, browser automation frameworks, and proxies, analysts can monitor forums, marketplaces, and communication platforms where cybercriminals operate. However, the work doesn’t stop there—understanding and countering anti-scraping mechanisms, integrating robust analysis tools like the ELK stack, and strategically managing operations are critical to success.

With the right tools and techniques, web scraping allows analysts to detect emerging threats, track malicious actors, and provide actionable insights to bolster organizational defenses.

Interested in mastering web scraping techniques and integrating them into your threat intelligence operations? The SANS FOR589: Cybercrime Intelligence course dives deep into these strategies, offering hands-on training and insights into navigating the ever-evolving cybercrime landscape. Register today or request a live demo to see how the FOR589 course can transform your approach to cyber intelligence.

Let’s take your scraping operation skills to the next level!


Tags: Digital Forensics, Incident Response & Threat Hunting
