
Logger 6.5 Demo Script

  • Writer: Pavan Raja
  • Apr 8, 2025
  • 17 min read

Summary:

This document shows how Logger can be used as a powerful tool for network analysis and malware investigation, particularly in the context of email and network security. Here is an executive summary based on the content of the report.

**Executive Summary:** This report outlines a comprehensive methodology for using Logger to generate detailed reports on user logins over a one-day period. The primary focus is on enhancing the standard "Demo User Logins" report with visual enhancements such as value-based coloring and sparklines, which are designed to provide deeper insight into login patterns and trends.

**Key Findings:**

1. **Report Generation Process:** The process involves saving a basic version of the report, running it in Logger's Smart Format, and adjusting settings for time duration (one day before now) and scan limit (set to 0). This initial setup lays the groundwork for further customization with visual tools such as value-based coloring and sparklines.

2. **Enhanced Report Features:** The enhanced version of the report includes additional elements that support more interactive and informative analysis. Expanded views on user names display colored grids based on data metrics and sparklines showing trend information, allowing users to dynamically explore detailed login patterns within the specified time frame.

3. **Technical Enhancements in Logger:** The document highlights how Logger has evolved from basic functionality (such as scheduled Logger lookups and URL analysis) to more sophisticated features introduced in later versions (such as smart reports with sparklines). These enhancements demonstrate the software's continuous improvement and adaptability to evolving analytical needs.

4. **User Interface Updates:** Notable changes include refinements to user interface elements for better usability and clarity, along with corrections that address technical issues related to data aggregation and security use cases.

**Conclusion:** This report serves as a practical guide to leveraging Logger's capabilities to create actionable insights from network security data. By following the outlined steps, users can generate customized reports that are visually enhanced for a better understanding of login activity and trends over time. The revision history section documents the software's development journey and its responsiveness to user feedback and technological advances in network analysis.

**Recommendations:** Organizations looking to enhance their security operations should consider using Logger to generate detailed reports on login activity, and upgrading to recent versions that incorporate smart report features such as sparklines for richer data visualization. Applying value-based coloring and sparklines in the enhanced "Demo User Logins" report can reveal user behavior, anomalies, and unusual activity patterns that are not apparent from basic metrics alone.

**Legal Considerations:** Use of Logger is at your own risk, as stated in the document's legal disclaimer; the report does not guarantee specific outcomes from using the software. The disclaimer also clarifies trademark usage and acknowledges Micro Focus International plc's ownership of certain intellectual property rights.

Details:

This document, marked "Micro Focus Confidential—subject to use restriction," is for internal use only at Micro Focus and provides a guide to installing a demonstration license for ArcSight Logger version 6.5. It covers the software's functionality and features across several sections, including installation instructions, usage scenarios (such as data analysis and compliance), and technical details. **Contents:**

  • **Overview**: Describes the purpose of the demonstration script for ArcSight Logger version 6.5.

  • **Installation of Demonstration License**: Instructions on how to set up a demonstration environment using a virtual machine running at least Logger v6.5.

  • **Lookup Files**: Information and guidance related to configuring lookup files in Logger.

  • **Categorization, Data Analysis, Searching, Security Use Case, Compliance Use Case, IT Ops Use Case, Application Development Use Case, NetFlow Use Case, Raw Events and Regex Use Case, Additional Use Cases, Analyzing Machine Transactions, Scheduled Updates for Logger LOOKUP, EVAL: URL Analysis, insubnet operator, EVAL: Decoding URLs, Dashboard for 15 Tables, FortiGate - Logger partnership, Dynamic Analysis using Smart Reports, Sparklines**: Detailed guides on specific functionalities and features of ArcSight Logger.

  • **Appendix A – Revision History**: Track changes made to the document over time.

  • **Micro Focus Trademark Information, Company Details**: Legal information about Micro Focus and its products.

**Key Points from the Overview:** The demonstration script is designed to showcase the capabilities of ArcSight Logger version 6.5. It includes a note on configuring Event Broker Receivers if a demonstration involves both Logger and Event Broker, and the installation instructions are crucial for setting up the demonstration environment accurately.

The document then outlines the steps for installing an Autopass license in Logger 6.5, which manages capacity licenses centrally using ArcMC as an ADP license server. The process involves applying a base license to Logger and a capacity uplift license in ArcMC, followed by configuring Lookup Files for static correlation:

1. **Switch on ArcMC as an ADP license server** within the ArcMC GUI and confirm the action with "Yes". ArcMC now serves as the central repository for all Logger licenses.

2. **Apply the base license in Logger**: Navigate to System Administration → License and Update, then apply your ADP base license. This completes the licensing setup, making the device ADP managed.

3. **Monitor license usage in ArcMC**: If you exceed the limit indicated by the graph on the Dashboard, apply a capacity uplift license in ArcMC under Administration → System Admin, and verify that the new capacity is reflected in the license information.

4. **Configure Lookup Files**: Use these files for static correlation and event enrichment. For example, import a list of known malicious IP addresses into a Lookup Table named Malicious_Addresses, then use the lookup operator in Logger's search to correlate events with entries in that table, narrowing results from over 100,000 potential matches to just a few dozen relevant entries (a query sketch follows this section).

This setup streamlines license management and enhances event analysis through static correlation with Lookup Files.

The script also covers improvements to the Logger user interface that aid in indexing fields for quicker searches. Certain fields can be super-indexed (shown in darker green), providing faster results when frequently accessed or less common entries are sought. It then demonstrates enriching event data with contextual information from Lookup Files, for example categorizing malicious IP addresses and their connections to Apache web servers via a successful connection (indicated by the /Success categorization). The demonstration includes steps for creating new Lookup Files from search results, visualizing data through charts, and navigating saved searches and filters. Categorization also simplifies searching across different devices such as Windows, UNIX, or Mainframe during login processes, providing uniformity and consistency in search results across device types.

The next portion covers using Logger to analyze network traffic, particularly HTTPS traffic between servers and web servers, and emphasizes the importance of unified profiling across all devices in an organization to ensure consistent categorization and analysis.
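To make the Lookup File step concrete, here is a minimal sketch of a Logger search using the lookup operator. The table name Malicious_Addresses and the /Success categorization come from the script above; the lookup column name (address) and the exact operator syntax are assumptions, so verify them against the Logger Search Guide for your version:

    categoryOutcome="/Success" | lookup Malicious_Addresses address as sourceAddress

In this sketch, the condition on the left of the pipe narrows events to successful connections, and the lookup stage correlates those events against the imported list of known malicious IP addresses, narrowing the result set as described above.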
The report generation process involves simple search conditions, such as entering destinationPort=443 to monitor HTTPS traffic on that port. The search can be customized further by changing fields or adding filters for more targeted retrieval and analysis. Logger also lets users create reports on failed logins, which are crucial for audit requirements and security monitoring; it supports various device types without requiring modifications to the report template, because new events are automatically categorized regardless of the underlying device vendor. The system can also generate statistical reports using predefined or custom filters such as "Demo_https_top", analyzing the type and volume of traffic from servers handling HTTPS connections.

A key strength highlighted here is Logger's ability to analyze large volumes of data efficiently, detecting patterns and supporting investigation of suspicious activity through simple search conditions and indexed fields. Statistical field summaries, provided by default or through customizable filters, help identify relevant information about network traffic even without prior knowledge of what to look for.

The script then covers detecting rare events within large datasets. An operation such as top or tail is appended to a search to control how results are summarized, for example "top <field name>" or "tail 5"; the query "destinationPort=443 | tail 5" focuses on a handful of events involving port 443. Categorization and normalization group and classify traffic by attributes such as destination port 443, allowing quick analysis over a short period and summarizing traffic between source and destination addresses.

Superindexing is introduced as a way to search efficiently through vast amounts of data, even when looking for rare events. Superindexes are automatically maintained fields for important metadata such as IP addresses, hostnames, and usernames, and they significantly improve search performance on large datasets. Superindexed searches can scan tens of millions of events per second, with performance ranging from 50 to 100 million events per second depending on whether Logger is deployed as software or as an appliance, and they remain efficient even when searching for non-existent terms such as an impossible IP address. In security use cases such as incident response and forensics, Logger super indexing has been used to investigate users who have recently left the company. Custom login banners can also be implemented in Logger, allowing companies to display their policies and require users to acknowledge them. The dashboard feature lets users customize panels to suit their interests, with options to change formats or view detailed event information by hovering over chart slices.
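As a minimal sketch of the searches described in this section (the tail example is quoted from the script; the field chosen for top is illustrative and other field names may suit your data better):

    destinationPort=443
    destinationPort=443 | top sourceAddress
    destinationPort=443 | tail 5

The first query filters on the indexed destinationPort field, the second summarizes the most frequent values of a chosen field among those events, and the third limits the output to five matching events as described above.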
Logger also offers compliance and network operations dashboards that track aspects such as intrusion events, configuration changes, failed logins, user privilege modifications, and network traffic statistics like NetFlow by destination port; these can be customized to an organization's environment and requirements.

The script then walks through using Logger search effectively, applying specific keywords and conditions to retrieve information from data sources such as servers, events, and user activities related to finance and FTP communications. The process involves several interactive steps: refining keywords, adjusting advanced search settings, customizing graphical representations, exporting results, navigating the system's features, and running reports for management summaries. This method simulates real-world investigations in which what exactly is being sought may not be known until a thorough search has been conducted.

The script goes on to show how Logger assists in demonstrating regulatory compliance. Key points include:

1. **Report Generation**: Logger generates reports in formats such as Adobe PDF, MS Word, MS Excel, and CSV. Reports can be produced on demand or scheduled for automatic delivery via email.

2. **Customization Options**: Logger can display a custom login banner that companies use to communicate their policies to users. This feature is optional and customizable per company requirements.

3. **Compliance Use Case**: Logger helps manage differing log retention requirements, automated log review, and real-time alerting for compliance issues:

  • Each user has a personalized dashboard showing relevant PCI compliance events based on their role.

  • Role-based access control ensures users only view pertinent data and events.

  • Users can explore detailed reports via drill-down capabilities directly from the interface.

4. **Compliance Insight Packages**: The PCI Compliance Insight Package includes pre-built "top" reports that are particularly helpful for compliance efforts, as they reflect insights based on customer feedback.

5. **Real-Time Alerts**: Logger offers numerous preconfigured alerts for various compliance controls, so issues can be detected as they occur rather than after a report is generated. Users can modify existing alerts or create new ones to suit their requirements.

6. **Storage and Retention Policies**: Configurable storage groups allow different retention periods based on log type (for example, PCI logs must be retained for at least one year). Logger automatically handles log maintenance, which is crucial for compliance-related data retention.

7. **Security Features**: The auditor account has read permissions only, maintaining a controlled environment in which auditors can verify settings but not alter them.

In short, Logger assists with many aspects of regulatory compliance, from generating detailed reports and providing real-time alerts to managing storage requirements efficiently.

The script then describes managing log streams and storage groups with a focus on IT Operations (IT Ops) use cases. An admin user can restrict auditors to read-only access to configuration settings, a setup intended for compliance scenarios where role-based access control and segregation of duties are critical. For the IT Ops use case titled "Web Server Down," the process involves:

1. Logging in as an admin.
2. Navigating to the Analyze, Search page and entering a search term related to the web server down event.
3. Using drop-down menus and filters to narrow down logs from a North Korean IP address that might indicate a denial-of-service attack.
4. Searching for events related to configuration modifications on the web server, noting any changes made by user mike that could be linked to the server's malfunctioning state (a hypothetical query is sketched below).
5. Generating reports with standard or custom-defined parameters through Logger's reporting feature, filtering relevant data from different devices and configurations.
6. Using connectors for easier event categorization, so investigation is not hindered by the need for expertise in every technology area.
7. Adjusting search terms dynamically to incorporate more detail about affected systems or device interactions during troubleshooting.
8. Using reports as a quick reference for previously recorded configurations and modifications of devices under observation, adapted to specific user requirements or changes in the operational environment.

The script also summarizes Logger's ability to analyze and visualize real-time data from sources such as web servers, application logs, and network traffic, supporting use cases including the IT Ops demonstration, application development, and NetFlow analysis. For the IT Ops demonstration, users open the main menu in Logger, click "Analyze" and then "Live Event Viewer," and are advised to contact Mike and his supervisor to understand why changes are being made during business hours while monitoring a web server in real time.
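A hypothetical query for step 4 of the "Web Server Down" walk-through might look like the following. The host name web1.arcnet.com appears later in the script for the Live Event Viewer, categoryBehavior is a standard ArcSight categorization field, and the user name mike comes from the narrative above, but the exact query and field values used in the demo are not reproduced here and should be treated as assumptions:

    web1.arcnet.com AND categoryBehavior="/Modify/Configuration" AND mike

Full-text terms (the host name and user name) can be combined with indexed field conditions, so the same approach applies whether or not the events have been fully parsed.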
The Live Event Viewer lets users search for specific events related to the web server by entering terms such as "web1.arcnet.com". The tool handles multiline logs, which is particularly useful for applications developed in languages such as Ruby, Java, or Python, and it captures error messages effectively.

In the application development use case, Logger's ability to ingest multi-line log files is highlighted. This enables users to parse complex log entries from environments such as Java, C++, or Ruby applications, and to analyze errors by filtering events marked as ERRORS within the Application Development field set.

The NetFlow use case focuses on monitoring network traffic directed at Microsoft SQL Servers. By entering the search term "netflow dpt=1433", Logger shows which sources are communicating with the SQL servers, based on fields such as source addresses and byte statistics. This information can be visualized with dynamic charts that update automatically according to user settings. Overall, Logger provides real-time monitoring and analysis of complex data streams from multiple sources, with flexible charting options and exportable reports.

The script then turns to NetFlow data and raw events for network performance analysis, giving a step-by-step guide to searching these datasets with specific keywords and regex patterns to identify issues such as latency (RTA) and failed logins. Key points include:

1. **NetFlow Data Analysis**: A quick search within NetFlow data finds the most popular destination ports in the environment, visualized through charts displaying counts by destination port (dpt). This helps in understanding traffic distribution and potential bottlenecks.

2. **Raw Events Search**: To detect latency issues such as high round-trip averages (RTA), a search is run against raw events logged without CEF formatting. Logger provides access to all raw event content, which can be filtered with keywords such as "loss nagios ALERT" to focus on relevant performance data from network devices and servers.

3. **Regex Use Case**: To analyze RTA values greater than 1 ms, the Regex Helper is used to parse the raw events and extract the field containing the measurement (RTA). The Regex Helper automatically recognizes fields and assigns meaningful names to the parsed variables, simplifying complex data extraction from raw logs.

4. **Discover Fields Function**: To find failed logins in raw events, the "Discover Fields" capability identifies relevant fields, such as usernames, that are not normally parsed, uncovering valuable information even from unparsed event data.

5. **Exporting Results**: Logger offers multiple options for exporting search results, including saving locally or converting them to PDF or CSV, which is useful for reporting and further network performance analysis.
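A minimal sketch of the NetFlow and raw-event searches above. The base query netflow dpt=1433 and the keywords "loss nagios ALERT" are taken from the script; the chart and rex stages, including the regex pattern and field names, are illustrative assumptions (in the demo the Regex Helper generates the extraction automatically):

    netflow dpt=1433 | chart count by sourceAddress
    loss nagios ALERT | rex "RTA=(?<RTA>[0-9.]+)"

The first pipeline summarizes which sources are talking to the SQL Servers; the second extracts the round-trip average from unparsed Nagios alert text so it can be charted or filtered, for example to flag values above 1 ms.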
Overall, this part of the script demonstrates how to leverage Logger's advanced features to extract meaningful insights from complex network data, using specific keywords and regex patterns tailored to issues such as high latency and failed logins.

The script next outlines a method for analyzing and masking credit card numbers in the Logger VM, with instructions for navigating the interface, using the Regex Helper to extract relevant data, and applying custom filters to obscure sensitive information before generating reports suitable for distribution:

1. **Accessing the Search Page**: Begin at the search page in the browser. Recent credit card transactions can be reviewed here, with options to adjust the time window if necessary.

2. **Entering the Query and Filtering Data**: Enter 'sha1hex' in the query field and click 'Go!' to filter the displayed data on this criterion. If required, modify the search time window to cover the last hour for more recent transactions.

3. **Selecting a Field Set**: Choose the "All Fields" field set from the dropdown menu, which includes all available event details relevant to the transactions under review.

4. **Using the Logger Regex Helper**: In the middle or bottom of the browser, click '+' to expand an event view, then apply the Regex Helper by clicking '+' again and selecting the RAW (Extract Fields) icon.

5. **Extracting Credit Card Numbers**: Scroll down in the pop-up window to locate Number_3, where the Regex Helper parses raw events for credit card numbers, which can then be renamed 'ccnum'. After renaming the field, confirm your selection and click 'Go!' to run the extraction.

6. **Masking Data**: Use Logger's masking capabilities on the credit card numbers, showing only the first digit series, which identifies the card type (3-6). This involves regex patterns linked to different card types and renaming fields as necessary.

7. **Preparing Reports**: Once masked, use the newly created 'ccnum' column in the grid for reporting while sensitive data remains protected. Searches can be refined through fields such as 'firstnum' before exporting reports in a suitable format such as PDF.

8. **Advanced Analytical Features**: Logger also supports higher-level analytical tasks, including transactional grouping and real-time file processing without additional connectors, and it integrates with Mail Transfer Agent (MTA) logs such as POSTFIX for more comprehensive analysis of machine transactions.

This section provides a structured approach to handling sensitive data in the Logger VM while introducing analytical features that go beyond simply detecting the presence or absence of credit card numbers.

ArcSight Logger is a powerful tool for log management and analysis. It can parse logs, create receivers for events, and group events into transactions based on common attributes such as QueueID. Grouping similar events together organizes large volumes of data, making it easier to analyze and investigate patterns or issues related to those specific events (a query sketch follows).
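A hedged sketch of the masking workflow and the transaction grouping just mentioned. The term sha1hex, the field names ccnum and firstnum, the POSTFIX source, and the QueueID attribute come from the script; the regex, the substr() function, and the transaction syntax are assumptions to be checked against the Logger Search Guide for your version:

    sha1hex | rex "(?<ccnum>\d{13,16})" | eval firstnum=substr(ccnum,1,1)
    postfix | transaction QueueID

The first pipeline extracts candidate card numbers and keeps only the leading digit (which identifies the card type) for reporting; the second groups POSTFIX events sharing a QueueID into transactions so related mail events can be analyzed together.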
The tool supports exporting and importing content such as alerts, dashboards, filters, parsers, saved searches, and source types, which facilitates sharing and reusing configurations among different users. For instance, Logger can export reports, queries, and templates for further analysis in external tools or systems.

Logger also includes a drillable dashboard feature that lets users analyze data by drilling down into specific event details. This is particularly useful for visualizing the behavior of email events, such as POSTFIX Mail, through charts showing subprocesses (error, smtp, or queue manager), counts of emails grouped by transaction ID, and transaction lengths sorted by event count. The TRANSACTION operator groups events that share specific criteria, such as email events with the same QueueID; it assigns a unique transaction ID to each group and tracks how many events each transaction contains.

Logger can also schedule updates from external sources such as the Tor Exit Nodes list, integrating current data directly into analysis without manual intervention. Lookup tables, which serve as static correlation sources for Logger searches, can be updated through a scheduled process; the provided script retrieves and formats data on Tor Exit Nodes every 30 minutes or on demand.

In summary, ArcSight Logger offers extensive capabilities for log management and analysis, including customizable parsing, flexible search, drillable dashboards, and integration with external data sources through scheduled updates. This flexibility makes it valuable not only for IT professionals but also for analysts who work with large volumes of log data without being domain experts in the specific area of interest (such as email or network security).

The script also demonstrates several search commands for network analysis and malware investigation. The lookup command compares IP addresses against a list stored in Logger (such as Tor exit nodes); the output shows which addresses have matches in those lists, aiding network security checks. URL analysis calculates the length of URLs and uses it as an indicator of potential malware or malicious activity, which is useful because unusually long URLs can suggest suspicious behavior. The insubnet operator searches within variable-length subnet mask (VLSM) ranges, helping identify network activity occurring within specified IP ranges and subnets. These methods provide quick and efficient ways to analyze large datasets and detect potential threats or suspicious activity within a network environment (a hypothetical sketch follows).
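Pulling the remaining operators together, here is a hypothetical set of searches based on the descriptions above. The lookup table name Tor_Exit_Nodes and its column ip, the len() function, the choice of base conditions, the example subnet, and the exact placement of the insubnet operator are all assumptions; treat these as sketches rather than verified syntax:

    destinationPort=443 | lookup Tor_Exit_Nodes ip as sourceAddress
    deviceVendor="Cisco" | eval url_length=len(requestUrl) | sort url_length
    sourceAddress insubnet "10.0.0.0/8"

The first search correlates event source addresses with the scheduled Tor exit node list, the second computes URL lengths so unusually long URLs stand out, and the third restricts results to activity inside a given subnet range.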
For URL decoding, encoded web URLs entered into the search box are decoded automatically for easier readability. The script demonstrates this with Cisco firewall events, focusing on the 'requestUrl' field, which is URL-encoded; the decoded results are displayed alongside the original encoded version, aiding analysis and decision making.

The script also introduces a new dashboard feature that lets users view up to 15 panels of aggregated data from different fields simultaneously, explaining how to access the dashboard and use it to analyze top values across various fields. It concludes with an overview of Smart Reports and Sparklines, tools for dynamic analysis and visualization of large datasets through interactive reports and graphical representations within the Logger interface.

**Executive Summary:** This report outlines the process for generating a "Demo User Logins" report using Logger, a software tool for network security analysis. The objective is to demonstrate how to create and run enhanced reports that include value-based coloring, sparklines, and multiple levels of grouping, applied to provide deeper insight into user login data over a one-day period.

**Steps to Generate the Report:**

1. **Save the Report**: Select "Demo User Logins" from the report list and check the box marked "Create Report." Click "Save" to proceed, confirming any prompts about overwriting existing filters if necessary.

2. **Run the Report**: Navigate to the "Reports" section, click "Explorer," then expand "Default Logger Search Reports" to locate "Demo User Logins." Right-click and select "Run in Smart Format." Adjust the settings for Start (one day before now) and Scan Limit (0), then click "Apply."

3. **Review the Report Output**: On completion, observe the bar chart displaying counts of specific tuples over the past day. Switch to the grid view by clicking "Grid" at the bottom. Results can be filtered dynamically using column headers such as "count," and settings such as sorting or hiding columns are available through right-click options.

4. **Enhanced Report**: To run an enhanced version, follow similar steps but use "Demo User Logins Enhanced." This report includes additional visual enhancements such as value-based coloring and sparklines. Expand a user name to see detailed views of outcomes, including colored grids based on the data and sparklines indicating trends.

**Revision History:** The document tracks updates across Logger versions, from version 6.1 (which added functionality such as scheduled Logger lookups and URL analysis) to version 6.5 (which incorporates the smart reports introduced in Logger 6.4 for dynamic analysis using sparklines). Notable changes include user interface (UI) updates and corrections to security use cases and technical bugs affecting data aggregation.

**Legal Disclaimer:** This document is not a commitment to deliver any material, code, or functionality, and it does not guarantee that Logger will meet your requirements or expectations. Use of the software is at your own risk. Micro Focus International plc holds the trademark for "Micro Focus" and related logos.
This report format effectively guides users through the process of generating customized network security reports using Logger, while also providing a historical record of software enhancements and corrections.

Disclaimer:
The content in this post is for informational and educational purposes only. It may reference technologies, configurations, or products that are outdated or no longer supported. If you have any comments or feedback, kindly leave a message and it will be responded to.
