Most of the computer security white papers in the Reading Room have been written by students seeking GIAC certification to fulfill part of their certification requirements and are provided by SANS as a resource to benefit the security community at large. SANS attempts to ensure the accuracy of information, but papers are published "as is". Errors or inconsistencies may exist or may be introduced over time as material becomes dated. If you suspect a serious error, please contact email@example.com.
When the Internet was invented in the late 1960s as a research network linking specific colleges and the US Department of Defense (DoD), no one envisioned that these networks would one day be connected into a single global one.
Blackboard Learn (Bb Learn) is an application suite providing educational technology to facilitate online, web-based learning. It is typical to see Bb Learn hosting courses and content. Common add-ons include the Community and Content systems.
By: Robert Sorensen (posted on September 21, 2011)
Today's computing environment is fraught with treachery. It used to be that one could surf the web without any thought of infection or loss of private information. Those times have changed! One might argue that the safest connection to the web is no connection at all; however, this is not feasible in today's socially networked world (Powell, 2011). The target presented to hackers has only grown.
When an administrator builds a Linux server, they make many decisions. One of the most difficult is deciding which packages to install. Linux distributions, upon installation, try to pass package selection off as an easy choice. The administrator must simply choose a function from the list and the installation program will automatically install all the necessary software to provide that service. The installation usually works and the resulting machine performs the desired task. Administrators focused only on functionality consider themselves finished and move on to the next task.
Many currently available penetration testing resources lack a report-writing methodology and approach, which leaves a significant gap in the penetration testing cycle. A report, by definition, is a statement of the results of an investigation or of any matter on which definite information is required (Oxford English Dictionary).
A penetration test is useless without something tangible to give to a client or executive officer. A report should detail the outcome of the test and, if you are making recommendations, document those recommendations for securing any high-risk systems (Whitaker & Newman, 2005). Report writing is a crucial part of any service offering, especially for IT service and advisory providers. In penetration testing, the final deliverable is a report that describes the services provided and the methodology adopted, as well as the testing results and recommendations. As one project manager at a major electronics firm said, "We don't actually manufacture anything. Most of the time, the tangible products of this department [engineering] are reports." There is an old saying in the consulting business: "If you do not document it, it did not happen" (Smith, LeBlanc & Lam, 2004).
With today’s technology there exist many methods to subvert an information system, compromising the confidentiality, integrity, or availability of the resource. Due to the abstract nature of modern computing, the only way to be reliably alerted to a system compromise is to review the system’s actions at both the host and network layers and then correlate those two layers to develop a thorough view of the system’s behavior. In most instances, the computer user has no indication that malicious software is present and therefore cannot be relied upon to determine whether their system is compromised.
Defense-in-Depth is a term commonly used when describing a layered model for protecting computing environments; by having multiple layers of protection, from the perimeter of the network to each computing system at the core, a security-related failure at any single layer should not compromise the confidentiality, integrity, or availability of the overall system. In this day and age, simple reliance on firewalls for protection is generally considered imprudent (Brining, 2008), for they offer no network-level protection in the case of failure, poor configuration, software misbehavior, or unauthorized access attempts posing as legitimate traffic; nor can they offer any protection when communications circumvent the firewall itself.
There is substantial industry documentation on web browser security because the web browser is currently a frequently used vector of attack. This paper investigates current literature discussing the threats present in today's environment.
The original Sudo application was designed by Bob Coggeshall and Cliff Spencer in 1980 within the halls of the Department of Computer Science at SUNY/Buffalo. Sudo encourages the principle of least privilege: a user operates with the bare minimum of privileges on a system until the user requests a higher level of privilege in order to accomplish some task.
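The least-privilege model described above can be illustrated with a minimal sudoers fragment (a sketch only; the user, group, host, and script names are hypothetical, and the syntax follows the standard sudoers format):

```
# Hypothetical example: allow user "alice" to restart the web service,
# and nothing else, without granting her a full root shell.
# Always edit this file safely with: visudo
alice      webhost01 = (root) /usr/bin/systemctl restart httpd

# Members of the "backupops" group may run one specific backup script
# as root on any host, without a password prompt.
%backupops ALL = (root) NOPASSWD: /usr/local/bin/run-backup.sh
```

With rules like these in place, alice would type `sudo systemctl restart httpd` and authenticate with her own password; her privileges are elevated only for that one command, which is the essence of the least-privilege approach Sudo encourages.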
Seeking and achieving formal Certification and Accreditation of systems designed for use within the Department of Defense is a statutory requirement and a necessary part of a system's overall Information Assurance program.
The goal of this document, within the scope of the GSEC1 SANS2 option 2 practical exam, is to present a solution enabling a company to manage and apply computing security rules on mainframe and midrange systems, as well as on facilities-management systems that must comply with additional security rules specific to customers.
Organizations that design, develop, test, and support IP-based products present unique security challenges in a converged services network. In an ideal scenario, the engineering labs where these activities take place are insulated from the corporate environment to prevent interactions that could compromise corporate network confidentiality, integrity, and availability.
The principal objective of "Patch Management and the Need for Metrics" is to demonstrate that organisations cannot meaningfully assess their security posture, with reference to their patch status, without the use of appropriate metrics.
Corporate websites get defaced; the business activities of organizations get crippled; identities get stolen; confidential information is made public - all because information and resources were not secured and the precautions necessary to protect against attacks were not taken.
Systems maintenance, including operating system and software upgrades and patch management, has long been a major factor in security-related incidents. Application upgrades and patches can be equally necessary to system integrity, yet are equally likely to be ignored.
A case study detailing the implementation of a business continuity plan for a regional newspaper. This study covers the requirements-gathering process, testing, and implementation of a series of plans jointly developed by members of the newsroom, IT, online staff, and operations.
Web Filters are designed to improve the security and productivity of a network, but as with anything else, they must be implemented correctly to work properly. To ensure a Web Filter is implemented successfully, several factors need to be considered.
This paper attempts to build on the foundational article submitted by Steven Bourdon in March 2002 (Bourdon), by providing a security-focused overview of the basic tasks performed when installing a standalone OpenVMS server.
This paper discusses the process of integrating a credit card application with the front end of existing accounting and payment-processing applications, the information risk analysis required, and the action plan to implement the mitigating controls.
This paper looks at Vulnerability Identification Studies, which focus on identifying the enticements, common vulnerabilities, and information leakage that account for most of the IT (Information Technology) risk we face today.
With a few careful considerations for data redundancy and archival, centralized network security management can take full advantage of defense-in-depth and a hardened security posture.
In the event of a successful attack, limiting the amount of damage and quickly redistributing the assets to maintain a minimum essential infrastructure is critical in keeping the defense and national economy functioning.
The Open Source Risk Mitigation Process described in this paper is a tool for corporations to use when trying to understand why a seemingly simple decision to use "free" Open Source software should be taken very seriously.
This paper outlines the components of the RILOE, detailed features and functionality of the card, pre-installation tips, physical installation instructions, physical setup instructions, and initial setup configuration parameters.
This paper examines the long-standing vision of one senior OMB manager to reinforce a seven-year-old plan he helped draft that uses the Federal IT budget planning process to accomplish these three principal objectives.
This paper presents one methodology for identifying, evaluating and applying security patches in a real world environment along with descriptions of some useful tools that can be used to automate the process.
This document endeavors to provide the reader with a solid understanding of the certification process, the order in which the steps should be completed, and some lessons learned from actual experience.
The purpose of this paper is to take the wide variety of federal government laws, regulations, and guidance, combined with industry best practices, and define the essential elements of an effective IT security program.