IT Resource Logging - Reporting and Review Procedures

These procedures are in support of the IT Resource Logging Standard (S-11). Audit logs are subject to regular review, with the frequency of review driven by the criticality of the IT Resource and the underlying Information Assets. Where needed, information owners and/or data stewards will collaborate with IT administrators to define review procedures and a review frequency commensurate with the level of criticality or with specific regulatory requirements. Factors influencing frequency of review include:

  • University classification of the data being stored, processed, or transmitted by the IT Resource and any associated risk, such as data subject to regulatory or industry-specific standards, including but not limited to HIPAA, FERPA, GLBA, Controlled Unclassified Information (CUI), and PCI.
  • Criticality of the IT Resource or Information Assets supporting (1) University scholarship, research, and instructional activities, (2) business or administrative operations of the University, (3) access to University services, or (4) student and campus life activities.

Security information and event management (SIEM) or other solutions incorporating event thresholds and providing alerts or reporting may be used to facilitate the monitoring or review processes. Where requested or required, IT Security & Policy (itpolicyanswers@purdue.edu) provides services to assist in centralized log collection and SIEM reporting.
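
Where centralized collection is in place, individual systems can forward events using standard syslog facilities. The following is a minimal Python sketch of such forwarding; the collector hostname, port, and message format are placeholders rather than actual central logging endpoints.

    # Forward application events to a central syslog collector for SIEM ingestion.
    # The collector address below is hypothetical; substitute the values provided
    # by the log-collection service.
    import logging
    import logging.handlers

    collector = logging.handlers.SysLogHandler(
        address=("siem-collector.example.edu", 514))
    collector.setFormatter(logging.Formatter(
        "myapp: %(asctime)s %(levelname)s %(message)s"))

    logger = logging.getLogger("audit")
    logger.setLevel(logging.INFO)
    logger.addHandler(collector)

    logger.info("user=alice action=login result=success src=192.0.2.10")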

Auditable events and reporting needs may change as business needs or regulatory requirements change, and when improvements are made to incident identification and response procedures.

Reporting and Events to Monitor

Authentication and Authorization Reports provide a view into the main means of controlling access to systems and data.

  • Login failures and successes, including attempts against disabled/service/non-existent/default/guest/suspended accounts

  • Multiple login failures followed by a success for the same account (this may require rule-based SIEM correlation; a minimal correlation sketch appears after this list)
  • Privileged account access (success, failure) including administrator accounts, root, su use, Run As use, or other system and platform relevant equivalents
  • VPN authentication and other remote access logins (success, failure) including source IP address
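
The correlation noted above can be expressed as a simple rule. The Python sketch below is illustrative only; the event format, failure threshold, and time window are assumptions and should be tuned to the SIEM actually in use.

    # Flag accounts with several failed logins followed by a success
    # within a short window (a common brute-force indicator).
    from collections import defaultdict
    from datetime import timedelta

    THRESHOLD = 5                    # failures before a success is suspicious
    WINDOW = timedelta(minutes=10)   # correlation window

    def correlate(events):
        """events: time-ordered (timestamp, account, outcome) tuples,
        where outcome is 'failure' or 'success'."""
        failures = defaultdict(list)
        alerts = []
        for ts, account, outcome in events:
            if outcome == "failure":
                failures[account].append(ts)
            elif outcome == "success":
                recent = [t for t in failures[account] if ts - t <= WINDOW]
                if len(recent) >= THRESHOLD:
                    alerts.append((account, len(recent), ts))
                failures[account].clear()
        return alerts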

Change Reports identify critical security changes to systems and networked assets, such as configuration files, accounts, regulated or sensitive data, and other system or application components.

  • Additions/changes/deletions to users, groups
  • Additions of accounts to administrator/privileged groups
  • Additions/changes/deletions to network services
  • Changes to system files – binaries, configurations (see the integrity-check sketch after this list)
  • Changes to other key files
  • Application installs, updates (success/failure) by system, application, user
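
Change detection for key files is commonly implemented by comparing cryptographic hashes against a recorded baseline. The Python sketch below is a minimal illustration of that approach; the watched paths and baseline location are examples, not required values.

    # Compare current SHA-256 hashes of watched files against a stored baseline
    # and report any changes, then refresh the baseline.
    import hashlib
    import json
    import pathlib

    WATCHED = ["/etc/passwd", "/etc/ssh/sshd_config"]       # illustrative paths
    BASELINE = pathlib.Path("/var/lib/fim/baseline.json")   # illustrative location

    def digest(path):
        return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

    def check():
        baseline = json.loads(BASELINE.read_text()) if BASELINE.exists() else {}
        current = {p: digest(p) for p in WATCHED}
        for path, value in current.items():
            if baseline.get(path) not in (None, value):
                print(f"CHANGE DETECTED: {path}")
        BASELINE.parent.mkdir(parents=True, exist_ok=True)
        BASELINE.write_text(json.dumps(current, indent=2))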

Network Activity Reports can identify suspicious system or network activity.

  • Outbound connections from internal and DMZ systems by source and destination
  • Internal systems listening on non-required ports, indicating deviation from the baseline and from least privilege (see the port-baseline sketch after this list)
  • Top-talker internal systems as sources of multiple types of Intrusion Detection System (IDS), Intrusion Prevention System (IPS), or Web Application Firewall (WAF) alerts
  • VPN network activity by username and session count
  • Wireless network activity, including rogue AP detection and rogue AP association logs
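
One way to spot systems listening on non-required ports is to compare observed listeners against an approved baseline. The Python sketch below assumes a Linux host with the ss utility available; the baseline port set is hypothetical.

    # List TCP/UDP listening ports and report any not in the approved baseline.
    import subprocess

    BASELINE = {22, 443}   # hypothetical approved listening ports

    def listening_ports():
        out = subprocess.run(["ss", "-tuln"], capture_output=True,
                             text=True, check=True).stdout
        ports = set()
        for line in out.splitlines()[1:]:   # skip the header row
            fields = line.split()
            if len(fields) >= 5:
                port = fields[4].rsplit(":", 1)[-1]   # local address:port column
                if port.isdigit():
                    ports.add(int(port))
        return ports

    unexpected = listening_ports() - BASELINE
    if unexpected:
        print(f"Ports outside baseline: {sorted(unexpected)}")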

Resource Access Reports identify system, application, and database resource access patterns and can be used for activity auditing, trending, incident detection, and capacity planning, as well as to reveal insider abuse or attacks.

  • Top internal clients blocked by proxy from accessing prohibited sites, malware sources, etc.
  • File, network share or resource access (success, failure)
  • Top database users (to be useful for security review, this should exclude known application access to the database; ideally, a production database should have no direct access from users or developers)
  • Summary of query types, excluding known application queries, to help detect anomalous database access (see the query-summary sketch after this list)
  • Privileged database user access and activity
  • Users executing INSERT, DELETE database commands (excluding known application queries)
  • Users executing CREATE, GRANT, schema changes on a database
  • Summary of database backups (to review who performed backups and identify any performed without authorization)
  • Emailed attachment content types, sizes, names
  • Internal systems sending mail excluding known mail servers
  • Log access summary
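
To make database reports such as the query-type summary useful, known application accounts are excluded so that direct access stands out. The Python sketch below is illustrative only; the log line format and the application account name are assumptions.

    # Summarize query types per user, skipping known application accounts.
    import collections
    import re

    APP_ACCOUNTS = {"app_service"}                    # hypothetical application account
    LINE = re.compile(r"user=(\S+)\s+query=(\w+)")    # e.g. "user=bob query=DELETE"

    def summarize(lines):
        counts = collections.Counter()
        for line in lines:
            m = LINE.search(line)
            if m and m.group(1) not in APP_ACCOUNTS:
                counts[(m.group(1), m.group(2).upper())] += 1
        return counts

    # Example: summarize(open("/var/log/db/query.log")) might yield
    # {('bob', 'DELETE'): 3, ('carol', 'GRANT'): 1}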

Malware Activity Reports identify malicious software and events.

  • Malware detection trends with outcomes (cleaned or not)
  • Detect-only events from anti-virus tools (not cleaned)
  • Anti-virus protection failures
  • Internal connections to known malware IP addresses, detected by a firewall or other tool against a public blacklist (see the blacklist-matching sketch after this list)
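
Detection of connections to known-malicious addresses can be as simple as matching firewall log destinations against a published blacklist. The Python sketch below assumes iptables-style "DST=" entries and a plain-text blacklist file; both are placeholders for whatever feed and log format are actually in use.

    # Flag outbound firewall log entries whose destination IP is on a blacklist.
    import re

    DST = re.compile(r"DST=(\d{1,3}(?:\.\d{1,3}){3})")   # iptables-style field

    def load_blacklist(path):
        with open(path) as fh:
            return {line.strip() for line in fh if line.strip()}

    def flag_connections(log_path, blacklist):
        hits = []
        with open(log_path) as fh:
            for line in fh:
                m = DST.search(line)
                if m and m.group(1) in blacklist:
                    hits.append(line.rstrip())
        return hits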

Regulatory and Industry Specific Standards

Specific audit log and review requirements are often imposed by regulatory or industry standards affecting certain types of information, including, but not limited to, the Payment Card Industry Data Security Standard (PCI-DSS), the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Protecting Controlled Unclassified Information (CUI) in Nonfederal Information Systems (NIST SP 800-171), the Federal Information Security Management Act (FISMA), and DFARS 252.204-7012. These resources should be reviewed for additional requirements that may go above and beyond the guidance provided by the IT Resource Logging Standard and these procedures. (Note: Although the versions cited here were current at the time of publication, always reference the most current version of these control frameworks.) Consult with the appropriate University Data Steward or IT Security & Policy (itpolicyanswers@purdue.edu) for assistance in meeting compliance.

Payment Card Industry Data Security Standard (PCI-DSS)
(Special note regarding compliance with this section: due to the reduction of University infrastructure subject to PCI DSS, these items are not currently required.)

Logging and auditing standards as required by the current version of the PCI-DSS and as they are updated and changed by the PCI Security Standards Council (PCI SSC) must be met. The following logging and auditing requirements are current for PCI-DSS version 3.2 compliance:

    1. Antivirus logging must be enabled (Requirement 5.2.d).
    2. Implement audit trails to link all access to system components to each individual user (Requirement 10.1).
    3. Implement automated audit trails for all system components to reconstruct the following events (Requirement 10.2):
      1. All individual user accesses to cardholder data (Requirement 10.2.1)
      2. All actions taken by any individual with root or administrative privileges (Requirement 10.2.2)
      3. Access to all audit trails (Requirement 10.2.3)
      4. Invalid logical access attempts (Requirement 10.2.4)
      5. Use of and changes to identification and authentication mechanisms—including but not limited to creation of new accounts and elevation of privileges—and all changes, additions, or deletions to accounts with root or administrative privileges (Requirement 10.2.5)
      6. Initialization, stopping, or pausing of the audit logs (Requirement 10.2.6)
      7. Creation and deletion of system-level objects (Requirement 10.2.7)
    4. Record at least the following audit trail entries for all system components for each event (Requirement 10.3); a structured-record sketch covering these fields appears after this list:
      1. User identification (Requirement 10.3.1)
      2. Type of event (Requirement 10.3.2)
      3. Date and time (Requirement 10.3.3)
      4. Success or failure indication (Requirement 10.3.4)
      5. Origination of event (Requirement 10.3.5)
      6. Identity or name of affected data, system component, or resource (Requirement 10.3.6)
    5. Secure audit trails so they cannot be altered (Requirement 10.5).
      1. Limit viewing of audit trails to those with a job-related need (Requirement 10.5.1).
      2. Protect audit trail files from unauthorized modifications (Requirement 10.5.2).
      3. Promptly back up audit trail files to a centralized log server or media that is difficult to alter (Requirement 10.5.3).
      4. Write logs for external-facing technologies onto a secure, centralized, internal log server or media device (Requirement 10.5.4).
      5. Use file-integrity monitoring or change-detection software on logs to ensure that existing log data cannot be changed without generating alerts (although new data being added should not cause an alert) (Requirement 10.5.5).
    6. Review logs and security events for all system components to identify anomalies or suspicious activity (Requirement 10.6).
      1. Review the following at least daily (Requirement 10.6.1):
        1. All security events
        2. Logs of all system components that store, process, or transmit cardholder data (CHD) and/or sensitive authentication data (SAD)
        3. Logs of all critical system components
        4. Logs of all servers and system components that perform security functions (for example, firewalls, intrusion-detection systems/intrusion-prevention systems (IDS/IPS), authentication servers, e-commerce redirection servers, etc.). 
      2. Review logs of all other system components periodically based on the organization’s policies and risk management strategy, as determined by the organization’s annual risk assessment (Requirement 10.6.2).
      3. Follow up exceptions and anomalies identified during the review process (Requirement 10.6.3).
    7. Retain audit trail history for at least one year, with a minimum of three months immediately available for analysis (for example, online, archived, or restorable from backup) (Requirement 10.7).
    8. Ensure that security policies and operational procedures for monitoring all access to network resources and cardholder data are documented, in use, and known to all affected parties (Requirement 10.8).
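
Applications that generate their own audit trails can capture the Requirement 10.3 fields as a structured record. The Python sketch below is a minimal illustration; the handler, field names, and sample event are assumptions, not a prescribed format.

    # Emit an audit record containing the six entries required by 10.3.
    import json
    import logging
    from datetime import datetime, timezone

    audit_log = logging.getLogger("pci.audit")
    audit_log.setLevel(logging.INFO)
    audit_log.addHandler(logging.StreamHandler())   # replace with a file/syslog handler

    def audit(user, event_type, success, origin, resource):
        record = {
            "user": user,                                         # 10.3.1 user identification
            "event_type": event_type,                             # 10.3.2 type of event
            "timestamp": datetime.now(timezone.utc).isoformat(),  # 10.3.3 date and time
            "result": "success" if success else "failure",        # 10.3.4 success or failure
            "origin": origin,                                     # 10.3.5 origination of event
            "resource": resource,                                 # 10.3.6 affected resource
        }
        audit_log.info(json.dumps(record))

    audit("alice", "cardholder-data-access", True, "192.0.2.10", "payments-db")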

Health Insurance Portability and Accountability Act (HIPAA)

The Health Insurance Portability and Accountability Act of 1996 (HIPAA; Pub.L. 104–191, 110 Stat. 1936, enacted August 21, 1996) and the Health Information Technology for Economic and Clinical Health Act (HITECH; enacted under Title XIII of the American Recovery and Reinvestment Act of 2009, Pub.L. 111–5) require any organization with patient health information to protect the confidentiality, integrity, and availability of that information. The HIPAA Security Rule specifically applies to electronic protected health information (EPHI). Audit security controls and supporting controls are specifically mandated in the following sections:

Security Management Process – Implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports. (§164.308(a)(1)(ii)(D)) 

Access Control – Assign a unique name and/or number for identifying and tracking user identity. Ensure that system activity can be traced to a specific user. Ensure that the necessary data is available in the system logs to support audit and other related business functions. (§164.312(a)(2)(i))

Audit controls – Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information. (§164.312(b))

The following logging and auditing requirements must be met for HIPAA:

      1. Create, document, and implement policies and standard operating procedures for system activity and user audit review on a regular and consistent basis.
      2. Ensure that all users have been assigned a unique identifier.
      3. Determine and document the specific activities that will be tracked or audited.
      4. Select, configure, deploy, and document tools for auditing and system activity reviews.
      5. System, security, and application logs must be collected through the PSS enterprise log management system.
      6. User activity log and audit data related to all activities associated with treatment, payment, and healthcare operations must be retained and archived for a period of seven years.
      7. The PSS SIEM must be configured to alert security analysts when specific conditions defined by the HIPAA Security Officer (or designee) are encountered; a minimal alert-condition sketch appears after this list.
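
Alert conditions defined by the HIPAA Security Officer can typically be expressed as simple predicates over normalized events. The Python sketch below shows one hypothetical condition (EPHI access by an account outside an approved group); the account list and event fields are assumptions for illustration only.

    # Return True when an event should raise a SIEM alert under this condition.
    APPROVED_EPHI_USERS = {"clinic_nurse1", "clinic_md2"}   # hypothetical accounts

    def should_alert(event):
        """event: dict with at least 'user' and 'resource_class' keys."""
        return (event.get("resource_class") == "EPHI"
                and event.get("user") not in APPROVED_EPHI_USERS)

    assert should_alert({"user": "helpdesk7", "resource_class": "EPHI"})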

All IT units have a responsibility to ensure that any information system that stores, processes, transforms, and/or transmits EPHI meets the requirements of this standard and the HIPAA privacy and security rules and HITECH as defined in the Federal Register. Failure to comply with the requirements of this standard may result in disciplinary actions or sanctions in accordance with University policy and procedures and applicable state and federal laws. Violating HIPAA and/or HITECH may result in civil and/or criminal penalties, including fines and jail time.

Protecting Controlled Unclassified Information (NIST SP 800-171)

The National Institute of Standards and Technology (NIST) has created Special Publication 800-171, “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations,” to provide the security controls required to protect the confidentiality of Controlled Unclassified Information (CUI). The following logging and auditing security controls must be implemented for compliance with NIST Special Publication 800-171:

Basic Security Requirements:

  1. Create, protect, and retain information system audit records to the extent needed to enable the monitoring, analysis, investigation, and reporting of unlawful, unauthorized, or inappropriate information system activity. (Requirement 3.3.1)
  2. Ensure that the actions of individual information system users can be uniquely traced to those users so they can be held accountable for their actions. (Requirement 3.3.2)

Derived Security Requirements:

  1. Review and update audited events. (Requirement 3.3.3)
  2. Alert in the event of an audit process failure. (Requirement 3.3.4)
  3. Use automated mechanisms to integrate and correlate audit review, analysis, and reporting processes for investigation and response to indications of inappropriate, suspicious, or unusual activity. (Requirement 3.3.5)
  4. Provide audit reduction and report generation to support on-demand analysis and reporting; see the report-generation sketch after this list. (Requirement 3.3.6)
  5. Provide an information system capability that compares and synchronizes internal system clocks with an authoritative source to generate time stamps for audit records. (Requirement 3.3.7)
  6. Protect audit information and audit tools from unauthorized access, modification, and deletion. (Requirement 3.3.8)
  7. Limit management of audit functionality to a subset of privileged users. (Requirement 3.3.9)
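
Audit reduction (Requirement 3.3.6) collapses raw records into summaries suitable for on-demand reporting. The Python sketch below assumes records that begin with an ISO 8601 timestamp followed by an event name; actual record formats will differ.

    # Reduce raw audit lines to per-day, per-event counts for reporting.
    import collections

    def reduce_audit(lines):
        counts = collections.Counter()
        for line in lines:
            parts = line.split()
            if len(parts) >= 2:
                day, event = parts[0][:10], parts[1]
                counts[(day, event)] += 1
        return counts

    for (day, event), n in sorted(reduce_audit([
            "2024-12-19T08:00:01Z login-failure user=x",
            "2024-12-19T08:00:05Z login-failure user=x"]).items()):
        print(day, event, n)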

Federal Information Security Management Act (FISMA) or DFARS 252.204-7012

If your project requires compliance with the Federal Information Security Management Act (FISMA) or DFARS 252.204-7012, please contact Purdue Systems Security via itpolicyanswers@purdue.edu for assistance in meeting compliance.

 Revised December 19, 2024: Reviewed and administrative changes made.