
Cobell v. Norton

October 20, 2005


The opinion of the court was delivered by: Royce C. Lamberth, United States District Judge


This matter comes before the Court on the plaintiffs' Motion [2926] for Preliminary Injunction, which alleges that the Department of the Interior has insufficient computer security to adequately safeguard the electronically stored Individual Indian Trust Data of which it is a custodian. The Court has considered the plaintiffs' motion, the defendants' opposition, and the plaintiffs' reply, along with the documentary and testimonial evidence adduced at the fifty-nine day evidentiary hearing the Court conducted to resolve this motion. The Court concludes that the plaintiffs' motion will be granted.


This is not the first time this Court has addressed the security posture of Interior's information technology ("IT") systems. Interior's numerous and complicated IT systems house and provide access to a massive volume of Individual Indian Trust Data ("IITD") stored in electronic form, which data is essential both to the day-to-day management of the trust and to completion of the full and accurate accounting of the trust that has been mandated by this Court. In light of evidence that Interior's IT security was seriously deficient, the Court has found it necessary to disconnect Interior's IT systems from the Internet more than once before and has granted various other forms of relief to protect electronic IITD.

A. Factual and Procedural History

As early as April 4, 2000, the Court noted problems with Interior's ability to secure electronic trust data. The Court was "alarmed and disturbed," for example, "by the revelation that the [Bureau of Indian Affairs ("BIA")] has no security plan for the preservation of [Indian trust] data." Tr. (Hrng., Apr. 4, 2000), at 11. The Special Master was thus assigned the task of investigating the extent of Interior's IT security problems. During the pendency of the Special Master's initial inquiry, in February, 2001, the D.C. Circuit affirmed this Court's finding that Interior, one of the federal government's trustee-delegates for the Individual Indian Money Trust ("IIM Trust"), breached its fiduciary duty to provide the trust beneficiaries with a full and accurate accounting of their trust assets. See Cobell v. Norton ("Cobell VI"), 240 F.3d 1081 (D.C. Cir. 2001). As relevant here, the Court of Appeals noted that "the federal government will be unable to provide an adequate accounting without computer systems, staffing, and document retention policies that are adequate for the task," and remanded the case to this Court for further proceedings. Cobell VI, 240 F.3d at 1109.

On November 14, 2001, the Special Master filed a 154-page report with the Court in which he observed that IITD was housed on Interior IT systems that had "no firewalls, no staff currently trained/capable of building and maintaining firewall devices, no hardware/software solution for monitoring network activity including but not limited to hacking, virus, and worm notification ...."*fn1 Report and Recommendation of the Special Master Regarding the Security of Trust Data at the Department of the Interior, at 141 (Nov. 14, 2001). The Special Master also noted Interior's "serious lack of wide area networking and security personnel in general," and that "[t]he BIA is also far behind the other bureaus in Interior regarding staffing of messaging systems and infrastructure support." Id.

On December 5, 2001, in response to this report, the Court entered a temporary restraining order requiring "that defendants shall immediately disconnect from the Internet all information technology systems that house or provide access to individual Indian trust data [and] that defendants shall immediately disconnect from the Internet all computers within the custody and control of the Department of the Interior, its employees and contractors, that have access to individual Indian trust data." Temporary Restraining Order [1036], issued Dec. 5, 2001, at 2.*fn2 At Interior's request, the Court entered a consent order on December 17, 2001, which authorized Interior to reconnect IT systems to the Internet upon presenting evidence of increased security to the Special Master and gaining his approval. See Consent Order [1063], issued Dec. 17, 2001. Specifically, the Consent Order provided that

the Special Master shall verify compliance with this Consent Order and may conduct interviews with Interior personnel or contractors or conduct site visits wherever information technology systems or individual Indian trust data is housed or accessed.

Consent Order [1063] at 7. Pursuant to this directive, the Special Master hired first IBM, and then in March 2002 the Security Assurance Group ("SAG"), to provide independent evaluations of IT security at various Interior sites and to conduct external penetration testing of Interior's IT systems. External penetration testing, which involves simulating an attempt by an outside "hacker" to gain access to an IT system, was governed by rules of engagement agreed to by the Special Master and Interior.

On behalf of the Special Master, SAG performed security assessments and penetration testing of Interior's IT systems between March 2002 and July 2003, finding numerous vulnerabilities that called into question Interior's IT security-related certifications to the Court. For example, despite the Bureau of Land Management's ("BLM") August 11, 2003 certification that its intrusion detection system ("IDS") is "monitored by network security personnel on a daily basis," BLM Cert. (Aug. 11, 2003), at 34, SAG's earlier penetration testing of BLM, conducted between February 10, 2003 and March 26, 2003, showed that "throughout all Phases of the testing ... no effort was made by BLM administrators to restrict, block, or deny access from the source of the attacks. SAG believes that none of SAG's activities were detected at any time." Internet Assessment of DOI/BLM Networks (Mar. 27, 2003), at 1.

SAG also found serious vulnerabilities in the Minerals Management Service's ("MMS") IT systems, see generally Assessment of Minerals Management Service-Camarillo Revisit (Mar. 26, 2003), which Interior later informed the Court had not been corrected. See Defs.' Comments on IT Security Repts. Filed by the Special Master in Accordance with this Court's January 21, 2004 Order (Feb. 12, 2004), at 4 n.7. Similar vulnerabilities were identified in IT systems at the Bureau of Reclamation ("BOR"). See Assessment of Bureau of Reclamation-Sacramento Revisit (Mar. 24, 2003). In assessing IT security at one Office of Surface Mining ("OSM") branch office, SAG found that "the Intrusion Detection System had not been monitored or reviewed by anyone for approximately forty-five days and that an additional system was connected to the Internet for twenty-six days with no Intrusion Detection System implemented at all." Cobell v. Norton ("Cobell XI"), 310 F. Supp. 2d 77, 82 (D.D.C. 2004) (citing Assessment of Office of Surface Mining-Pittsburgh Revisit (June 2003)).

The Consent Order procedure for reconnection worked well for nearly two years, and the Special Master eventually approved some ninety-five percent of Interior's systems for reconnection. See Cobell XI, 310 F. Supp. 2d at 82. However, an April 2003 incident concerning the "unplugging" of a network cable at the Office of Surface Mining "at the exact time the agency was aware the Special Master's contractor was performing penetration testing on that system" escalated over the course of the late spring and early summer, resulting in the breakdown of the relationship between Interior and the Special Master. See id. at 82; see also Cobell v. Norton ("Cobell IX"), 274 F. Supp. 2d 111, 114--24 (D.D.C. 2003) (detailing the events surrounding this breakdown).

On June 26, 2003, after it had become clear that the Consent-Order process had failed, the plaintiffs moved for a temporary restraining order to require Interior to disconnect IT systems housing or accessing IITD from the Internet. The Court held a hearing and entered the TRO the following day, June 27, 2003. See TRO [2118] (June 27, 2003). Noting that "the parties continue to be at an impasse as to the manner in which the Consent Order should be implemented ... [and that] the Court has no confidence that this impasse will be resolved," the Court issued a preliminary injunction on July 28, 2003 staying the Consent Order and requiring that Interior "immediately disconnect from the Internet all Information Technology Systems within [its] custody or control ... until such time as the Court approves their reconnection to the Internet." Cobell IX, 274 F. Supp. 2d at 133--35. The Court allowed Interior IT systems connected to the Internet as of the date the preliminary injunction was issued to remain connected if they "impact[ed] life or property," or if Interior certified to the Court that the connected systems either did not house or access Individual Indian Trust Data or were secure from unauthorized access from the Internet. See id. at 135--36. In light of Interior's objection to continuing oversight by the Special Master, see id. at 123--24, the Court decided to make the necessary determinations regarding reconnection of Interior's systems itself. See id. at 133. Interior appealed.

During the pendency of Interior's appeal from the July 28, 2003 preliminary injunction, after reviewing Interior's certifications for Internet-connected IT systems submitted on August 11, 2003 in accordance with that injunction and finding them to be both procedurally and substantively defective, the Court, on March 15, 2004, entered a preliminary injunction that required Interior to disconnect (or to keep disconnected) from the Internet Interior's IT systems at certain bureaus and offices, including the BIA, regardless of whether they housed or accessed Individual Indian Trust Data. See Cobell XI, 310 F. Supp. 2d at 96--97. The March 15, 2004 preliminary injunction superseded and replaced the Court's July 28, 2003 preliminary injunction. See Preliminary Injunction Order [2531], issued Mar. 15, 2004, at 1. Interior systems essential to protecting against fires or other threats to life or property, as well as IT systems at Interior's National Park Service ("NPS"), Office of Policy Management and Budget ("OPMB"), and United States Geological Survey ("USGS"), were exempted from disconnection. See Cobell XI, 310 F. Supp. 2d at 100--01. Finally, the March 15, 2004 preliminary injunction provided that reconnection would be possible upon Court approval of a reconnection plan to be submitted by the Secretary of the Interior. See id. at 101. Again, Interior appealed.

The D.C. Circuit consolidated Interior's appeals and vacated this Court's March 2004 preliminary injunction in an opinion issued December 3, 2004. See Cobell v. Norton ("Cobell XII"), 391 F.3d 251 (D.C. Cir. Dec. 3, 2004). Interior argued on appeal that the issuance of the IT security-related preliminary injunction was illegitimate because Cobell VI restricted this Court's remedial authority to established breaches of fiduciary duty and held only that Interior had breached its duty to render an accounting of the IIM trust. See Cobell XII, 391 F.3d at 257. The Court of Appeals, however, explained that Cobell VI "did not limit the district court's authority to exercise its discretion as a court of equity in fashioning a remedy to right a century-old wrong or to enforce a consent decree[,]" and that "maintaining adequate computer systems, along with staff and document retention policies, is critical to the completion of an adequate accounting." Id. Thus, the Court concluded, equitable relief to ensure the security of electronic IITD was well within this Court's authority.

In response to Interior's argument that this Court's jurisdiction is limited to typical Administrative Procedure Act-style review of Interior's actions, the Court of Appeals explained that this Court "retains substantial latitude, much more so than in the typical agency case, to fashion an equitable remedy because the underlying lawsuit is both an Indian case and a trust case in which the trustees have egregiously breached their fiduciary duties." Id. at 257--58. Additionally, rejecting Interior's argument that this Court's IT security-related injunctions violated separation of powers principles by essentially "taking over" the Department, the Court of Appeals reasoned that "the injunction ... requires the Secretary to develop IT security programs" and does not "include particular tasks for Interior to perform based on policies developed by the district court." Id. at 258. Indeed, "[t]he injunction does no more than to ensure that the Secretary is 'tak[ing] reasonable steps toward the discharge of the federal government's fiduciary obligations to IIM trust beneficiaries ....'" Id.

The Court of Appeals vacated and remanded the IT security injunction on procedural rather than substantive grounds, holding that this Court had erred first in declining to consider Interior's August 11, 2003 certifications regarding IT security and second in issuing the preliminary injunction without holding an evidentiary hearing. See Cobell XII, 391 F.3d at 258--62. In the wake of the decision of the Court of Appeals, the plaintiffs requested that the Court hold an emergency status conference to determine how to proceed in addressing Interior's Indian trust-related IT security issues going forward. See Pls.' Request [2776] for Emergency Status Conference Regarding the Security of Electronic Trust Records (Dec. 3, 2004); Pls.' Renewed Request [2804] for Emergency Status Conference Regarding the Security of Electronic Trust Records (Jan. 4, 2005).

On April 8, 2005, while the plaintiffs' requests were pending, Interior filed with the Court a Notice [2924] to the Court Regarding Inspector General's "Notice of Potential Findings and Recommendation" with Respect to Information Technology Systems ("Defs.' Notice"), which recounted the results of a recent penetration test of BLM's IT systems conducted by a contractor hired by Interior's Inspector General ("IG"). Specifically, the IG's Notification explained that:

Given the poor state of network security at [BLM] and the weak access controls we encountered on many systems, it is safe to say that we could have easily compromised the confidentiality, integrity, and availability of the identified Indian Trust data residing on those systems. However, due to the various court orders protecting Indian Trust data, the [Inspector General] carried out no further testing that could have jeopardized [BLM] Indian Trust systems. No information was collected, no data was manipulated, and no system was actually compromised.

See Defs.' Notice at 2. Four days later, the plaintiffs filed a motion for a temporary restraining order to disconnect Interior's IT systems housing or accessing IITD from the Internet, along with the present motion for a preliminary injunction to the same effect. The Court held a hearing on the plaintiffs' TRO motion on April 20, 2005, at which the motion was taken under advisement and preparations were made to conduct an evidentiary hearing regarding the plaintiffs' accompanying motion for preliminary injunction. It is from the evidence presented at that fifty-nine day evidentiary hearing that the Court herein makes findings of fact and conclusions of law.

B. Statutory and Regulatory Framework

The statutory and regulatory requirements reviewed herein are not at issue on the present motion, but they provide the only available baseline standard for government IT security against which to measure Interior's accomplishments in that arena. Title III of the E-Government Act of 2002, Pub. L. No. 107-347, is the Federal Information Security Management Act ("FISMA"). FISMA permanently reauthorized the IT security requirements set out in the Government Information Security Reform Act ("GISRA"), which expired by its own terms in November 2002. The version of FISMA enacted in the E-Government Act replaced the earlier version of FISMA enacted as part of the Homeland Security Act, Pub. L. No. 107-296.

i. FISMA Requirements

FISMA requires that government agencies submit annual IT security reports to the Office of Management and Budget ("OMB"), the House Committees on Government Reform and Science, the Senate Committees on Governmental Affairs and Commerce, Science, and Transportation, the appropriate authorization and appropriations committees of Congress, and the General Accounting Office ("GAO"). See 44 U.S.C.A. § 3544(c)(1) (Supp. 2005). Each agency's annual report must include information about risk assessments, security policies and procedures, individual system security plans, IT security training, annual testing and evaluation of IT security, remediation processes, IT security incident reporting processes, and continuity of operations planning. See 44 U.S.C.A. §§ 3544(c)(1), 3544(b)(1)--(8) (Supp. 2005).

Aside from the reporting requirement, FISMA requires that agencies "develop, document, and implement an agencywide information security program ... to provide information security for the information and information systems that support the operations and assets of the agency, including those provided or managed by another agency, contractor, or other source[.]" 44 U.S.C.A. § 3544(b) (Supp. 2005). This agencywide IT security program must be approved by the agency's director, see 44 U.S.C.A. §§ 3543(a)(5), 3544(b) (Supp. 2005), and must include "periodic assessments of the risk and magnitude of the harm that could result from the unauthorized access, use, disclosure, disruption, modification, or destruction of the information and information systems that support the operations and assets of the agency." 44 U.S.C.A. § 3544(b)(1) (Supp. 2005). The IT security policies and procedures set forth in the agency's program must, among other things, be "based on the risk assessments" that FISMA requires be conducted, 44 U.S.C.A. § 3544(b)(2)(A) (Supp. 2005), and must "ensure compliance with" the "information security standards promulgated under section 11331 of title 40" as well as "any other applicable requirements[.]" 44 U.S.C.A. §§ 3544(b)(2)(D)(iii), 3544(b)(2)(D)(iv) (Supp. 2005).

FISMA places primary responsibility for "developing and overseeing the implementation of policies, principles, standards, and guidelines on information security" on the Director of OMB, who must report to Congress annually regarding executive agencies' compliance with FISMA's directives. See 44 U.S.C.A. § 3543(a)(1) (Supp. 2005). OMB's principal IT security policy is set forth in OMB Circular A-130. Appendix III to OMB Circular A-130 requires the certification and accreditation ("C&A") of agencies' IT systems, meaning that each agency is required to implement "a minimum set of security controls to be included in Federal automated information security programs," and to "[e]nsure that a management official authorizes in writing the use of [an IT system] based on implementation of its security plan." OMB Circ. A-130, App. III. OMB requires C&A for all "general support systems," which are defined as:

information resources under the same direct management control which shares common functionality. A system normally includes hardware, software, information, data, applications, communications, and people. A system can be, for example, a local area network (LAN) including smart terminals that supports a branch office, an agency-wide backbone, a communications network, a departmental data processing center including its operating system and utilities, a tactical radio network, or a shared information processing service organization (IPSO).

OMB Circ. A-130, App. III. C&A is also required for all "major applications." Id. A "major application" is:

an application that requires special attention to security due to the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of the information in the application. Note: All Federal applications require some level of protection. Certain applications, because of the information in them, however, require special management oversight and should be treated as major. Adequate security for other applications should be provided by security of the systems in which they operate.

Id. The terms "information system" and "IT system" will be used interchangeably herein to mean both "general support systems" and "major applications." OMB Circular A-130, Appendix III specifies that C&A is necessary to the provision of "adequate" security for IT systems, which OMB defines as "security commensurate with the risk and magnitude of the harm from the loss, misuse, or unauthorized access to or modification of information. This includes assuring that systems and applications used by the agency operate effectively and provide appropriate confidentiality, integrity, and availability [of data], through the use of cost-effective management, personnel, operational, and technical controls." OMB Circ. A-130, App. III.

Additionally, FISMA requires agency compliance with the Department of Commerce's National Institute of Standards and Technology's ("NIST") "standards and guidelines pertaining to Federal Information Systems," including "information security standards that ... provide minimum information security requirements" that "shall be compulsory and binding" on federal agencies. 40 U.S.C.A. §§ 11331(a)(1), 11331(b)(2)(A)--(B) (2000); see 44 U.S.C.A. § 3544(b)(2)(D)(ii) (Supp. 2005) (requiring agency IT security plans to comply with NIST guidance). NIST's relevant IT security guidelines are reflected in NIST Special Publication 800-18, which establishes specific requirements for agencies' system security plans ("SSP"); NIST Special Publication 800-30, which establishes a methodology for assessing and managing IT security risks as required by FISMA; and NIST Special Publication 800-37, which establishes the requirements for C&A of IT systems. NIST Special Publications 800-34, 800-47, 800-50, 800-61, and 800-70 provide guidance on IT security contingency planning, securing information system interconnections, IT security awareness and training, IT security incident response planning, and IT security configuration checklists, respectively. Also relevant is NIST's Federal Information Processing Standards Publication 199 ("FIPS 199"), on which NIST advises agencies to rely for guidance in determining the sensitivity of data and systems implicated by IT security problems, an essential step in completing FISMA-mandated risk assessments.

ii. Certification and Accreditation

NIST Special Publication 800-37, then, implements the C&A requirements of FISMA as specified in OMB Circular A-130, Appendix III. NIST defines IT security accreditation as "the official management decision given by a senior agency official to authorize operation of an information system and to explicitly accept the risk to agency operations, agency assets, or individuals based on the implementation of an agreed-upon set of security controls." Ron Ross, Marianne Swanson, et al., Information Security: Guide for the Security Certification and Accreditation of Federal Information Systems, NIST Special Publication 800-37, at 1 (May 2004), United States Department of Commerce, National Institute of Standards and Technology [hereinafter "NIST SP 800-37"]. Security certification is defined as

a comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. The results of a security certification are used to reassess the risks and update the system security plan, thus providing the factual basis for an authorizing official to render a security accreditation decision.

NIST SP 800-37, at 1.*fn3 It should be noted that "the level of effort for security certification and accreditation (expressed in terms of degree of rigor and formality) should be scalable to the FIPS 199 security category of the information system." Id. at 25. That is, the higher the risk rating assigned to the IT system under FIPS 199, the more comprehensive and penetrating the C&A process must be in order to be considered adequate under NIST standards.

The C&A process is divided into four phases: (1) the initiation phase, designed to "ensure that the authorizing official and the senior agency information officer are in agreement with the contents of the system security plan, including the system's documented security requirements, before the certification agent begins the assessment of the security controls in the information system"; (2) the security certification phase, designed to "determine the extent to which the security controls in the information system are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system" and to "address[] specific actions taken or planned to correct deficiencies in the security controls and to reduce or eliminate known vulnerabilities in the information system"; (3) the security accreditation phase, designed to "determine if the remaining known vulnerabilities in the information system (after the implementation of an agreed-upon set of security controls) pose an acceptable level of risk to agency operations, agency assets, or individuals" and resulting in either the granting of authorization to operate ("ATO") the system, the granting of an interim authorization to operate ("IATO") the system, or a denial of authorization to operate; and (4) the continuous monitoring phase, which "provide[s] oversight and monitoring of the security controls in the information system on an ongoing basis and ... inform[s] the authorizing official when changes occur that may impact the security of the system." See NIST SP 800-37, at 2.

For the purposes of the C&A process, an "authorizing official" is defined as "a senior management official or executive with the authority to formally assume responsibility for operating an information system at an acceptable level of risk to agency operations, agency assets, or individuals." NIST SP 800-37, at 13. Interior usually designates an executive at the Assistant-Secretary level as the authorizing official for C&A of IT systems. See Pls.' Ex. 395 ("Department of the Interior (DOI) Information Technology (IT) Security Program: DOI Certification and Accreditation (C&A) Guide, Version 1.1", July 10, 2003) ("Pls.' Ex. 395"), at 19.*fn4 However, the authorizing official may delegate his or her responsibilities in the C&A process to a representative, see NIST SP 800-37, at 13, whom Interior calls the "designated authorizing agent" ("DAA"). See Pls.' Ex. 395, at 19. A "certification agent" is defined as "an individual, group, or organization responsible for conducting a security certification." NIST SP 800-37, at 15. NIST cautions that "[t]o preserve the impartial and unbiased nature of the security certification, the certification agent should be in a position that is independent from the persons directly responsible for the development of the information system and the day-to-day operation of the system." Id. In the main, Interior has chosen to designate as the certification official the Chief Information Officer ("CIO") of the Bureau within which the system subject to C&A operates. See Pls.' Ex. 395, at 20.

The initiation phase of the C&A process is concerned principally with reviewing system risk assessments and finalizing the SSP. Primary responsibility for the tasks that must be completed in the initiation phase of the C&A process is placed on the Information System Owner ("ISO"), or the "agency official responsible for the overall procurement, development, integration, modification, or operation and maintenance of an information system." NIST SP 800-37, at 14. Much of the preparation undertaken during this phase relies on a previously completed risk assessment and SSP for the relevant system. See id. at 26--27. During the initiation phase, the ISO is responsible for "reviewing the system security plan and confirming that the contents of the plan are consistent with an initial assessment of risk[,]" id. at 27; confirming that the FIPS 199 "security category of the information system has been determined and documented" in the SSP, id. at 28; and confirming identification and documentation in the SSP or risk assessment of the system's potential threats, known flaws and weaknesses, extant security controls, and risk to agency operations, agency assets, and individuals, id. at 29--31. After completing these tasks, the ISO must inform the "senior agency security officer, authorizing official, certification agent, user representatives, and other interested agency officials" of the need to C&A the system. Id. at 32.

Once informed of the need to begin the C&A process for a system, the authorizing official or DAA must coordinate with the senior agency information security officer ("SAISO"), the ISO, and the certification agent to "[d]etermine the level of effort and resources required" for the C&A of the relevant system. See NIST SP 800-37, at 32. Then, the DAA, SAISO, and certification agent "[r]eview the FIPS 199 security categorization" of the system to determine whether "the assigned impact values ... are consistent with [the] agency's actual mission requirements[,]" id. at 33, and review the SSP to determine whether it accurately reflects the risks documented in the system's risk assessment. See id. The ISO next modifies the SSP as needed on the basis of the results of these reviews, and the revised SSP is reviewed by the DAA and SAISO to determine if it presents acceptable risk. See id. at 34--35.

The certification phase of the C&A process, which is conducted primarily by the certification agent and the ISO, is designed to test the adequacy of the security controls present on a system and to document the results of that testing for use by the DAA. See NIST SP 800-37, at 35. The principal activities during this phase are the performance of a system security test ("SST") of the system and the creation of a plan of actions and milestones ("POA&M") document when vulnerabilities are detected during the SST.

An SST can include the use of an automated vulnerability scanning tool, which "is used to scan a group of hosts or a network for known vulnerable services (e.g., systems allowing anonymous File Transfer Protocol [FTP], sendmail relaying)." Gary Stoneburner, Alice Goguen, & Alexis Feringa, Risk Management Guide for Information Technology Systems, NIST Special Publication 800-30, at 17 (July 2002), United States Department of Commerce, National Institute of Standards and Technology [hereinafter "NIST SP 800-30"]. Another SST methodology is a security test and evaluation ("ST&E"), which "test[s] the effectiveness of the security controls of an [individual] IT system as they have been applied in an operational environment." NIST SP 800-30, at 17. Or, an SST can involve penetration testing, which "test[s] the IT system from the viewpoint of a threat-source and ... identif[ies] potential failures in the IT system protection schemes." Id. A certification agent may choose one or a combination of these SST methods. A POA&M "describes actions taken or planned by the information system owner to correct deficiencies in the security controls and to address remaining vulnerabilities in the information system (i.e., reduce, eliminate, or accept the vulnerabilities)." NIST SP 800-37, at 39. A POA&M "identifies: (i) the tasks needing to be accomplished; (ii) the resources required to accomplish the elements of the plan; (iii) any milestones in meeting the tasks; and (iv) scheduled completion dates for the milestones." Id.

In the certification phase, the certification agent works with the ISO to gather any documentation required to conduct a thorough SST, see id. at 36, selects or designs an appropriate methodology for conducting the SST, and then performs the SST itself. See id. at 36--37. The certification agent then prepares the SST report and transmits it to the ISO. See id. at 37--38. The ISO must update the SSP to account for the risks identified in and modifications to the system's security controls undertaken as a result of the SST report, and prepare the POA&M based on the results of the SST. See id. at 38--39. When these tasks are completed, the ISO assembles for delivery to the DAA the "final security accreditation package," which must include the SST report, the POA&M document, and the updated SSP. See id. at 39. These are the documents on which the DAA relies in making his or her accreditation decision in the next phase of the C&A process.

The DAA has primary responsibility for completing the necessary tasks in the security accreditation phase of the C&A process. This phase involves assessment of the IT security risks identified for the subject system, and either acceptance of those risks or denial of accreditation on the basis of unacceptable levels of risk. The DAA determines the level of risk from the security accreditation package provided by the ISO, and decides whether "the risk to agency operations, agency assets, or individuals" posed by continuing to operate the system is acceptable. See NIST SP 800-37, at 40--41. "If, after assessing the results of the security certification, the [DAA] deems that the agency-level risk is acceptable, an [ATO] is issued. The information system is accredited without any restrictions or limitations on its operation." Id. at 41. "If, after assessing the results of the security certification, the [DAA] deems that the agency-level risk is unacceptable, but there is an important mission-related need to place the information system into operation, an [IATO] may be issued." Id. NIST describes the consequences of an IATO:

The interim authorization to operate is a limited authorization under specific terms and conditions including corrective actions to be taken by the [ISO] and a required timeframe for completion of those actions. A detailed [POA&M] should be submitted by the [ISO] and approved by the [DAA] prior to the [IATO] taking effect. The information system is not accredited during the period of limited authorization to operate. The [ISO] is responsible for completing the corrective actions identified in the [POA&M] and resubmitting an updated security accreditation package upon completion of those actions.

Id. (emphasis in original). If the DAA determines that the agency-level risk of operating a system is unacceptable and decides not to issue an IATO, accreditation is denied and "the information system is not authorized for operation." Id.

After the accreditation decision is made, the DAA prepares for inclusion in the final accreditation package a letter documenting his or her determination, including "the rationale for the decision, the terms and conditions for information system operation, and required corrective actions, if appropriate." NIST SP 800-37, at 41--42. This letter and the other documents that comprise the final accreditation package are then transmitted to the ISO and other interested agency officials and made available to "auditors and oversight agencies," such as OMB. Id. at 42. Finally, the ISO updates the SSP again based on the DAA's assessment of the risk of operating the system. Id. at 42--43.

iii. Security Monitoring After Certification and Accreditation

NIST explains:

[S]ecurity accreditation is part of a dynamic, ongoing risk management process. An information system is authorized for operation at a specific point in time reflecting the current security state of the system. The inevitable changes to the information system (including hardware, firmware, software and people) and the potential impact those changes may have on agency operations, agency assets, or individuals, require a structured and disciplined process capable of monitoring the effectiveness of the security controls in the information system on an ongoing basis.

NIST SP 800-37, at 9--10. In light of this policy, NIST makes clear that the purpose of the continuous monitoring phase of the C&A process "is to provide oversight and monitoring of the security controls in the information system on an ongoing basis and to inform the [DAA] when changes occur that may impact the security of the system." Id. at 43. Certain kinds of changes to an IT system, as well as federal or agency policies, may require that systems undergo repeated C&A. See id. The ISO has primary responsibility for the tasks required during the continuous monitoring phase.

The ISO is required to document any "proposed or actual changes to the information system (including hardware, software, firmware, and surrounding environment)[,]" analyze such changes to determine their impact on system security, and update both the SSP and POA&M accordingly. See NIST SP 800-37, at 43--44, 46. In addition, the ISO is responsible for selecting and testing, on a continuous basis, a set of security controls that exist on the relevant IT system. See id. at 44--45. The set of security controls selected for continuous monitoring should be a representative sampling of controls in operation on the system, but may also include certain controls "considered more critical ... because of the potential impact on the information system if those controls were subverted or found to be ineffective." Id. at 45. Such decisions should be based on "the agency's priorities and the importance of the information system to the agency." Id. at 44. Finally, the ISO must provide to the DAA and SAISO periodic system security status reports "address[ing] vulnerabilities in the information system discovered during the security certification, security impact analysis, and security control monitoring ...." Id. at 47. The frequency of these reports is at the discretion of the agency, but they should be submitted "at appropriate intervals to transmit significant security-related information about the system." Id.

OMB requires that an agency's IT systems undergo C&A at least every three years. See OMB Circ. A-130, App. III. While agencies may require re-C&A more regularly, see NIST SP 800-37, at 5, Interior has no standing policy to that effect. NIST requires that subsequent C&A "should begin, as in the original security accreditation, with the Initiation Phase," and proceed, again, through the entire process. See id. at 47. In addition, re-C&A is required whenever an IT system undergoes a "significant change," see OMB Circ. A-130, App. III, examples of which may include but are not limited to:

(i) installation of a new or upgraded operating system, middleware component, or application; (ii) modifications to system ports, protocols, or services; (iii) installation of a new or upgraded hardware platform or firmware component; or (iv) modifications to cryptographic modules or services. Changes in laws, directives, policies, or regulations, while not always directly related to the information system, can also potentially affect the security of the system and trigger a reaccreditation action.

Id. at 5 n.10.

FISMA also requires that "[e]ach year each agency shall have performed an independent evaluation of the information security programs and practices of that agency to determine the effectiveness of such programs and practices." 44 U.S.C.A. §3545(a)(1) (Supp. 2005). For agencies that have Inspectors General, such as Interior, the "annual evaluation required by [FISMA] shall be performed by the Inspector General or by an independent external auditor, as determined by the Inspector General of the Agency[.]" 44 U.S.C.A. § 3545(b)(1) (Supp. 2005). These annual evaluations are independent of the C&A process, and must include both "testing of the effectiveness of information security policies, procedures, and practices of a representative subset of the agency's information systems" and "an assessment of compliance with ... [FISMA] ... [and] related information security policies, procedures, standards, and guidelines ...." 44 U.S.C.A. §§ 3545(a)(2)(A), 3545(a)(2)(B)(i)-- (ii) (Supp. 2005). The results of these annual IG evaluations are submitted to the Director of OMB for use in OMB's annual FISMA-required report to Congress. See 44 U.S.C.A. § 3545(e)(1) (Supp. 2005).

The IG's annual FISMA review of IT security is not necessarily as detailed as the C&A process. OMB explains that "[t]he necessary depth and breadth of an annual FISMA review depends on several factors such as: 1) the acceptable level of risk and magnitude of harm to the system or information; 2) the extent to which system configurations and settings are documented and continuously monitored; 3) the extent to which patch management is employed for the system; 4) the relative comprehensiveness of the most recent past review; and 5) the vintage of the most recent in-depth testing and evaluation as part of system certification and final accreditation." OMB Mem. 03-19 (August 6, 2004), at 7. Thus, for systems that have undergone full C&A within the past year "and received final (not interim) authority to operate, [have] documented configuration settings, employ[] automated scanning tools ...; and [have] an effective patch management capability, a simple maintenance review using NIST's self assessment tool may meet the FISMA ... requirement." Id. For systems that do not satisfy some or all of these criteria, the IG's "annual testing and evaluation must be far more comprehensive" to comply with FISMA. See id.

iv. IT Security Risk Assessment

As Interior's system-level and agency-level IT security risk assessment practices became one focal point of the evidentiary hearing on the present motion, the Court will give a brief overview of the relevant requirements in this connection.

NIST notes that "assessment of risk" is an important activity in "an agency's information security program that directly support[s] security accreditation and [is] required by FISMA and OMB ...." NIST SP 800-37, at 4. "Risk assessments influence the development of the security controls for information systems and generate much of the information needed for the associated system security plans." Id. While the rigor and formality of risk assessments may vary among agencies and depending on the FIPS 199 classification of the IT system in question, "[a]t a minimum, documentation should be produced that describes the process employed and the results obtained." Id. at 5.

FISMA requires that NIST develop "standards to be used by all agencies to categorize all information and information systems collected or maintained by or on behalf of each agency based on the objectives of providing appropriate levels of information security according to a range of risk levels," as well as "guidelines recommending the types of information and information systems to be included in each such category." 15 U.S.C.A. § 278g-3(b)(1)(A)--(B) (Supp. 2005); see also 44 U.S.C.A. § 3543(a)(8)(B) (Supp. 2005) (incorporating this requirement of the National Institute of Standards and Technology Act into FISMA by reference). Accordingly, FIPS 199:

establishes security categories for both information and information systems ... based on the potential impact on an organization should certain events occur which jeopardize the information and information systems needed by the organization to accomplish its assigned mission, protect its assets, fulfill its legal responsibilities, maintain its day-to-day functions, and protect individuals.

FIPS 199 (Feb. 2004), at 1. The security categorization of a system is a function of the security categorization of the data on that system, FIPS 199, at 4--5, and the data and system security categorizations, in turn, "are used in conjunction with vulnerability and threat information in assessing the risk to an organization" of continuing that system in operation. See FIPS 199, at 1.

FIPS 199 categorizes information (as opposed to information systems) "according to its information type ... [or] a specific category of information (e.g. privacy, medical, proprietary, financial, investigative, contractor sensitive, security management) defined by an organization or, in some instances, by a specific law." FIPS 199, at 1 n.1 (emphasis in original). Security categorizations for information are assigned on the basis of "the potential impact on organizations or individuals should there be a breach of security" that adversely affects one or more of FISMA's three "security objectives." Id. at 2. These security objectives are: (1) confidentiality, compromise of which results in "unauthorized disclosure of information"; (2) integrity, compromise of which results in "unauthorized modification or destruction of information"; and (3) availability, compromise of which results in "disruption of access to or use of information or an information system." Id. The security categorization process requires that, for a particular information type, a rating of low, moderate, or high risk be assigned for each of the security objectives depending on the likely consequences of their compromise. Id. at 3.

If a loss of confidentiality, integrity, or availability of a specific information type would cause "limited adverse effect on organization operations, organization assets, or individuals," that information type should receive a rating of "low risk" for the specific security objective being evaluated; if a loss of confidentiality, integrity, or availability would cause "a serious adverse effect," the information type should receive a "moderate risk rating" for the security objective at issue; and the information type should be rated "high risk" for the relevant objective if a loss of confidentiality, integrity, or availability would have "a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals." FIPS 199, at 2--3. If, for example, the impact rating for financial information on a given system is moderate for confidentiality, but high for both integrity and availability, then the security categorization ("SC") for the financial information should be represented thus:

SC (financial information) = {(confidentiality, moderate), (integrity, high), (availability, high)}. See id. at 3 (presenting other examples).

Assigning a security categorization to an information system requires analysis of "the security categories of all information types resident on that system. For an information system, the potential impact values assigned to the respective security objectives ... shall be the highest value (i.e. high water mark) from among those security categories that have been determined for each type of information resident on the information system." FIPS 199, at 4. To reuse the example set forth above, if the information system containing the financial information contains only one other information type, say individual medical records, then the security categorization for this second information type must be taken into consideration. The medical information has the security categorization:

SC (medical information) = {(confidentiality, high), (integrity, low), (availability, low)}.

When considered alongside the security categorization for the financial information, FIPS 199 requires that the information system receive the following security categorization:

SC (information system) = {(confidentiality, high), (integrity, high), (availability, high)}.

See id. at 4 (giving other examples). The security categorizations of information and information systems, again, are relevant to risk assessments, which are a central part of FISMA's overall information security regime.
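The "high water mark" rule described above is, in substance, a simple aggregation procedure. The following sketch illustrates it in Python; the function name, the dictionary representation of a security categorization, and the numeric ordering of levels are illustrative conventions chosen here, not anything prescribed by FIPS 199 itself.

```python
# Illustrative sketch of the FIPS 199 "high water mark" rule for
# deriving a system's security categorization from the categorizations
# of the information types it houses. The names and data structures
# here are hypothetical; FIPS 199 states only the rule.

LEVELS = {"low": 0, "moderate": 1, "high": 2}
OBJECTIVES = ("confidentiality", "integrity", "availability")

def system_categorization(information_types):
    """For each security objective, take the highest impact value
    assigned to that objective among all information types resident
    on the system."""
    return {
        objective: max(
            (it[objective] for it in information_types),
            key=lambda level: LEVELS[level],
        )
        for objective in OBJECTIVES
    }

# The two information types from the example in the text:
financial = {"confidentiality": "moderate", "integrity": "high", "availability": "high"}
medical = {"confidentiality": "high", "integrity": "low", "availability": "low"}

print(system_categorization([financial, medical]))
# {'confidentiality': 'high', 'integrity': 'high', 'availability': 'high'}
```

Run on the financial and medical information types from the example, the sketch reproduces the system categorization given above: high for all three objectives, because each objective's highest rating among the two information types is high.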

NIST Special Publication 800-30 sets forth a nine-step process for system-level risk assessments, which are generally completed prior to or during the initiation phase of the C&A process. See NIST SP 800-30, at 8.*fn5 First, the IT system's characteristics must be identified, including its "criticality," or importance to the agency, and the sensitivity of both the system and the data it houses. Id. at 4. Then, potential threat sources (e.g., malicious hackers, environmental dangers, etc.)*fn6 and the system vulnerabilities that threat-sources may be able to exploit must be identified, along with the security controls operating on the system that may neutralize either vulnerabilities or threat sources. See id. at 12--20. A "threat" is a conceptual outgrowth of a threat-source/vulnerability pair: the potential that a given threat-source may be able to exploit a given vulnerability is the actual "threat." See id. at 12. This notion of a "threat" is not to be confused with the likelihood of exploitation, which is a separate concept: the potential for exploitation of a vulnerability is either a positive value or zero, because "a threat-source does not present a risk when there is no vulnerability that can be exploited." Id. at 12. Thus, a threat exists when a threat-source can be paired with a vulnerability exploitable by that threat-source, without regard to the likelihood that the vulnerability would actually be exploited.

Assessing the likelihood of exploitation is a distinct step in risk assessment. See NIST SP 800-30, at 21. The "overall likelihood rating" for a threat-source/vulnerability pair depends on consideration of "[t]hreat-source motivation and capability[,]" as well as the "[n]ature of the vulnerability [and the] [e]xistence and effectiveness of current [security] controls." Id. The likelihood of exploitation of a given vulnerability will be low if "[t]he threat-source lacks motivation or capability, or controls are in place to prevent, or at least significantly impede, the vulnerability from being exercised"; the likelihood rating will be medium if "[t]he threat-source is motivated and capable, but controls are in place that may impede successful exercise of the vulnerability"; and the likelihood rating will be high if "[t]he threat-source is highly motivated and sufficiently capable, and controls to prevent the vulnerability from being exercised are ineffective." Id.

The likelihood rating for a particular threat-source/vulnerability pair is combined with the impact of exploitation to arrive at a measurement of the risk presented by the threat-source/vulnerability pair. See NIST SP 800-30, at 24. The impact of a threat-source/vulnerability pair is classified as "low" in magnitude if "[e]xercise of the vulnerability (1) may result in the loss of some tangible assets or resources or (2) may noticeably affect an organization's mission, reputation, or interest." Id. at 23. The magnitude of the impact is "medium" if "[e]xercise of the vulnerability (1) may result in the costly loss of tangible assets or resources; (2) may violate, harm, or impede an organization's mission, reputation, or interest; or (3) may result in human injury." Id. Impact magnitude is high if "[e]xercise of the vulnerability (1) may result in highly costly loss of major tangible assets or resources; (2) may significantly violate, harm, or impede an organization's mission, reputation, or interest; or (3) may result in human death or serious injury." Id. Note that any potential for human injury requires an impact rating of at least medium.

Determining the magnitude of the impact requires consideration of the nature of the vulnerability in relation to FISMA's three security objectives: confidentiality, integrity, and availability of information and information systems, and the effects on agency operations, agency assets, and individuals of the compromise of one or more of these objectives. See id. at 22. The nature of a given vulnerability may reveal a potential impact on the security objectives with respect to data housed on an IT system, the IT system itself, or both. Evaluating the nature of the risk in relation to the FIPS 199 security categorization of either the system or the information it houses will often determine the magnitude of the impact of a given threat-source/vulnerability pair. Recall that assigning security categorizations to information and information systems requires assessing the potential impact of their compromise on organizational operations, organizational assets, or individuals, FIPS 199, at 2--3, which are some of the factors to be considered in system-level risk-magnitude analysis. See id. at 21.

For example, if a financial institution utilizes an IT system that houses financial data, several kinds of threat-source/vulnerability pairs may exist. Weak passwords for employee-level system access may pose a threat to the confidentiality of the financial information, but not to the integrity or availability of that information or to the system itself if the level of access that an unauthorized user might obtain by "cracking" one of these weak passwords does not allow alteration of the data or manipulation of the operations of the system. If, however, the FIPS 199 security categorization for the financial data is "high" for confidentiality, then the potential impact of this threat-source/vulnerability pair might be high. Or, a vulnerability in an internet-based application running on the system may allow a hacker to gain control over the entire system such that he or she could fully compromise its availability. If, however, the financial information housed on that system is encrypted, and the encryption key is not itself stored on the system, then the hacker will not be able to threaten the confidentiality or integrity of the financial data, despite total control over the system. In this case, the impact-magnitude should be determined with reference to the FIPS 199 security categorization of the financial information with respect to availability, and that of the information system with respect to all three security objectives. Finally, if weak password protection could allow an unauthorized user to gain sufficient user privileges on the system to alter the financial information and manipulate system controls, then the threat-source/vulnerability pair potentially impacts the confidentiality, integrity, and availability of both the financial information and the system itself. Each of the FIPS 199 security categories for both the financial information and the information system must be considered to assign an impact-magnitude in this example.

The likelihood rating for the threat-source/vulnerability pair must be combined with the impact-magnitude as determined above to determine the threat-source/vulnerability pair's level of risk to the IT system. See NIST SP 800-30, at 24. For example, if the likelihood of a threat-source exploiting a given vulnerability is low, but the magnitude of the impact is high if the vulnerability is in fact exploited, an overall risk rating of "medium" may be assigned to the threat-source/vulnerability pair. A "risk-level matrix" should be developed to reflect the rating for "mission risk" for each threat-source/vulnerability pair, see id. at 24--25, and senior management should take certain actions depending on the resulting risk ratings. For high risk items, a system may only be accredited and continue in operation if corrective actions are taken as soon as possible; for medium risk items, a plan for corrective actions must be developed "within a reasonable period of time"; and for low risk items, corrective actions may be taken or the risk may simply be accepted. See id. at 25. Recommended corrective actions should be included with the results of the risk assessment in the official report for senior management. See id. at 26.
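The combination of a likelihood rating and an impact magnitude into an overall risk level can be sketched as a lookup in a risk-level matrix. The specific cell assignments below are an assumption made for illustration only: NIST SP 800-30 permits agencies to weight the matrix to suit their own missions, and this sketch simply follows the example in the text, under which a low-likelihood, high-impact pair may be rated medium overall.

```python
# Illustrative risk-level matrix in the spirit of NIST SP 800-30.
# The cell assignments are hypothetical; an agency may assign the
# qualitative levels differently when building its own matrix.

RISK_MATRIX = {
    ("low", "low"): "low",
    ("low", "medium"): "low",
    ("low", "high"): "medium",       # the example given in the text
    ("medium", "low"): "low",
    ("medium", "medium"): "medium",
    ("medium", "high"): "medium",
    ("high", "low"): "low",
    ("high", "medium"): "medium",
    ("high", "high"): "high",
}

def risk_level(likelihood, impact):
    """Combine a likelihood rating and an impact magnitude into an
    overall risk rating for a threat-source/vulnerability pair."""
    return RISK_MATRIX[(likelihood, impact)]

print(risk_level("low", "high"))   # medium
print(risk_level("high", "high"))  # high
```

Under the management-action rules described above, a "high" result from such a matrix would require corrective action as soon as possible as a condition of accreditation, while a "low" result could simply be accepted.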

"Residual risk" is the aggregate of the risk-levels of threat-source/vulnerability pairs that remain after an organization completes any risk mitigation activities undertaken on the basis of recommendations made in the initial risk assessment report. See NIST SP 800-30, at 40. Risks might be mitigated by implementing new or enhanced IT security controls that eliminate system vulnerabilities (e.g., software patches that ameliorate program weaknesses), "reduce the capacity and motivation of a threat-source" (e.g., physical restrictions on employee access to a computer workstation that accesses sensitive information or systems), or reduce the impact magnitude of an item (e.g., modification of "the relationship between an IT system and the organization's mission" or assets). See id. at 39--40. If such controls are implemented, then risk should be reassessed after their implementation to determine residual risk. The residual risk will be identical to the risk reflected in the initial risk assessment report if no corrective actions are taken or insufficient time has elapsed for remedial efforts to be completed before the DAA reviews the risk assessment for accreditation purposes.*fn7

A DAA's decision to accredit an IT system must be based on analysis of that system's residual risk: he or she must consider the agency-level risk posed by accrediting and continuing in operation the particular IT system in light of its residual risks. Upon completion of this residual risk analysis, the DAA will either "sign a statement accepting any residual risk and authorizing the operation of the ... IT system[,]" or, "[i]f the residual risk has not been reduced to an acceptable level," the DAA will not accredit the system and "the risk management cycle must be repeated to identify a way of lowering the residual risk to an acceptable level." Id. at 40.

v. Interconnecting IT Systems

NIST Special Publication 800-47 "provides guidance for planning, establishing, maintaining, and terminating interconnections between information technology (IT) systems that are owned and operated by different organizations." Joan Hash, Tim Grance, et al., Security Guide for Interconnecting Information Technology Systems: Recommendations of the National Institute of Standards and Technology, NIST Special Publication 800-47, at ES-1 (Aug. 2002), United States Department of Commerce, National Institute of Standards and Technology, available at [hereinafter "NIST SP 800-47"] (marked for identification as Pls.' Ex. 118). NIST SP 800-47 offers guidance for four "phases" in the life cycle of an IT system interconnection: planning, implementation, maintenance, and termination. See NIST SP 800-47, at ES-1. Because the majority of Interior's IT system interconnections with which the Court is concerned for present purposes have already been planned and implemented, and are currently functioning as parts of Interior's IT infrastructure, NIST's guidance on maintenance of IT systems interconnections will be emphasized here.

NIST defines an IT system interconnection as "the direct connection of two or more IT systems for the purpose of sharing data and other information resources." NIST SP 800-47, at 2-1. IT system interconnections may be beneficial for a number of reasons, including their capacity to reduce operating costs, increase the functionality of the connected systems, increase the efficiency of system and organizational operations, and provide centralized access to data. See id. "Organizations can connect their IT systems using a dedicated line that is owned by one of the organizations or is leased from a third party," or they may "connect systems over a public network (e.g., the Internet), using a virtual private network (VPN)[,]" which NIST defines as "a data network that enables two or more parties to communicate securely across a public network by creating a private connection, or 'tunnel,' between them." Id. NIST cautions, however, that transmitting data between IT systems across a VPN interconnection increases the risk that the data "can be intercepted by unauthorized parties," which "necessitat[es] the use of authentication and data encryption to ensure data confidentiality and integrity." Id. Thus, "[t]he decision to pass data over a public network should be based on an assessment of the associated risks" conducted in accordance with NIST SP 800-30. See id. at 2-2.

NIST explains that there may be "varying levels of system interconnection" defined by access limitations that may be imposed "dependent on [an organization's] mission and its security needs." NIST SP 800-47, at 2-2. Organizations implementing IT system interconnections may choose, based on mission, needs, risks, and other relevant factors, to create a "limited interconnection, whereby users are restricted to a single application or file location[;] ... [a] broader interconnection, enabling users to access multiple applications or databases[;] ... [or] an interconnection that permits full transparency and access across their respective enterprises." Id. Whatever the level of system access facilitated by an interconnection, "interconnecting IT systems can expose the participating organizations to risk," including "security failures" that may "compromise the connected systems and the data that they store, process, or transmit." Id. Indeed, "if one of the connected systems is compromised, the interconnection could be used as a conduit to compromise the other system and its data[,]" a problem that is exacerbated by the fact that "the participating organizations have little or no control over the operation and management of the other party's system." Id.

In view of these risks, NIST advises that during the planning and implementation phases of the life cycle of an IT system interconnection, the participating organizations create a formal agreement or memorandum of understanding "regarding the management, operation, and use of the interconnection." Id.; see also id. at 3-5--3-6 (discussing the formulation and desired contents of an interconnection agreement); id., Appx. A, at A-1--A-7 (giving more detail on this process, presenting an example agreement). The parties to an IT system interconnection agreement, during the planning phase, should "[i]dentify the sensitivity level of data or information resources that will be made available, exchanged, or passed one-way only across the interconnection" in order to "determin[e] the security controls that should be used to protect the connected systems and data[,]" as well as specifically enumerate "security controls that will be implemented to protect the confidentiality, integrity, and availability of the connected systems and the data that will pass between them." Id. at 3-3. NIST also insists that a critical prerequisite to establishing an IT system interconnection is that the putatively connected systems undergo a full C&A. See id. at 3-2. The very first step of the implementation phase, NIST advises, should involve the participating parties' "implement[ing] appropriate security controls," see id. at 4-2, including firewalls, intrusion detection systems, "mechanisms to record activities occurring across the interconnection" (auditing systems), id., systems for identification and authentication of authorized users, logical access controls that limit the functionality of accessible applications and systems to authorized activities, virus scanning, encryption of data, and physical and environmental security on both ends of the interconnection. See id. at 4-2--4-3.

While an IT system interconnection is operational, participating organizations should "review the security controls for the interconnection at least annually or whenever a significant change occurs to ensure they are operating properly and are providing appropriate levels of protection." NIST SP 800-47, at 5-2. NIST allows that either or both of the participating organizations, or "an independent third party," may conduct these annual security reviews, in accordance with the agreement between the parties. See id. NIST adds that annual security testing should include penetration testing of the connected IT systems, that "[s]ecurity risks and problems should be corrected or addressed in a timely manner," and that "[c]orrective actions should be documented, and the records should be stored in a secure location." Id. In addition to these annual general security assessments, "[o]ne or both organizations should analyze audit logs at predetermined intervals to detect and track unusual or suspicious activities across the interconnection that might indicate intrusions or internal misuse," and "[a]utomated scanning tools should be used to scan for anomalies, unusual patterns, and known attack signatures." Id. However, NIST advises that a system administrator should manually review audit logs at regular intervals to catch problems that automated scanning tools might overlook. See id.

Within Interior, several types of IT system interconnections are operational, running between Interior's different bureaus and offices, between Interior's bureaus or offices and IT systems maintained by private contractors or Indian Tribes, and between Interior's bureaus and offices and other governmental organizations. Additionally, Interior operates what is referred to as a "network backbone" that is intended to connect all bureau and office IT networks through a single tunnel. Currently, this backbone is called Interior's Virtual Private Exchange ("VPX"), but there are plans to transition to the Enterprise Services Network ("ESN"), which is already in development and, indeed, in use by some bureaus and offices. Both of these major Interior interconnections will be examined more closely herein, as well as some of the other more limited kinds of interconnections listed above. NIST SP 800-47 does not specify whether IT system interconnections between semi-independent elements of a larger organization must conform to these same security standards. However, it will become apparent that operating IT system interconnections even among the sub-parts of a single organization entails risks that must be addressed. NIST's framework is relevant, then, in that it provides the only available governmental guidance on what constitutes good security practice for the operation of IT system interconnections.


A. Annual FISMA Reporting by Interior's Inspector General

i. Overview

Prior to the enactment of FISMA, Interior's IG had "very little" involvement in IT security testing. See Tr. (Hrng., May 20, 2005 AM Sess.), at 11 (testimony of Earl Devaney, Inspector General of the Department of the Interior). The IG's office "had a unit of auditors in Denver, Colorado who were, by professional training, auditors and more or less self-taught and had gone to courses in IT" who were "performing occasional reviews and audits of the department's information systems[.]" Id. When Inspector General Earl Devaney*fn9 took over in August, 1999, see Tr. (Hrng., May 20, 2005 AM Sess.), at 8 (testimony of Devaney), he began a process of "shifting our resources from [the Denver] unit to a more robust unit ... in Washington" so that the IG's office could have "more capacity to implement the FISMA requirements." Tr. (Hrng., May 20, 2005 AM Sess.), at 11 (testimony of Devaney).

Devaney explained that after FISMA took effect, "sometime in 2002," most of the Inspectors General viewed the statute as "sort of an unfunded mandate that IGs do this work without the resources to accompany it[.]" Tr. (Hrng., May 20, 2005 AM Sess.), at 11 (testimony of Devaney). FISMA requires IGs to submit IT security evaluation reports to OMB at the end of each fiscal year ("FY").*fn10 To conduct the FY 2003 FISMA evaluation, Devaney "tried to ... borrow from other areas in [the IG] program[.]" Id. In part for this reason, the IG's FISMA report for FY 2003 was necessarily more limited in scope, involving reviews of "policies, procedures, [and] training," id. at 23, as well as "security practices and general and application controls, [and] ... security ... documents, such as security plans and risk assessments as of July 31, 2003." Pls.' Ex. 14 (Notice of Filing Under Seal of the Department of the Interior's "Report on the Implementation of the Federal Information Security Management Act (FISMA) FY 2003" and the Department of the Interior's Office of the Inspector General's "Annual Evaluation of the Information Security Programs of the Department of the Interior" (Report No. 2003-I-0066, September 2003)) ("Pls.' Ex. 14"), at bates page ("bp.") DEF0043818.

For the FY 2004 and 2005 FISMA reports, Devaney is "using a slug of money that was offered to us by the department, and using some of our own money. I think in terms of percentages, it's probably two-thirds the department's and one third [the IG's]." Tr. (Hrng., May 20, 2005 AM Sess.), at 26 (testimony of Devaney); see also Pls.' Ex. 1 ("Memorandum of Understanding/Intra-Agency Agreement Between the United States Department of the Interior, Office of the Secretary, and the United States Department of the Interior, Office of Inspector General") ("Pls.' Ex. 1"), at 1, bp. DOIITE018000009--DOIITE018000011 (reimbursable support agreement ("RSA")). Devaney explained that this funding from the department "is essentially two-year money, 2004 and 2005. Starting in 2006 and on out from there, I don't see the department giving us any more money, and at that point I'll have the capacity to do [FISMA reporting] myself." Id. at 27. Accordingly, for the FY 2006 and FY 2007 budget proposals, Devaney has been "submitting budgets that are asking for money to do [FISMA] work." Id. Though, "technically speaking," the Secretary of the Interior has some control over the content of the IG's budget requests, "[t]his particular secretary has, to [Devaney's] knowledge, never lowered [the IG's] budget." Id. at 14. These changes in the budgeting and planning for FISMA reporting are necessary, Devaney explained, because "the nature of what we're doing under FISMA has now evolved ... from looking at policies, procedures, training, to actually getting into testing the systems, and that's a highly technical area ...." Id. at 23.

Interior's IG has also implemented personnel changes to meet the evolving requirements of FISMA reporting. The majority of the IT security testing for FY 2003 and FY 2004 was performed by a group within the IG's audits department called the National Information System Office ("NISO"). See Tr. (Hrng., May 20, 2005 AM Sess.), at 11, 18 (testimony of Devaney); Tr. (Hrng., May 25, 2005, PM Sess.), at 38 (testimony of Sandy) (giving the group's name). However, Devaney explained, as FISMA reporting "is getting more complicated and bigger, ... I felt ... that eventually we were going to have to have a self-contained unit that would do little else but FISMA." Id. at 18. Thus, the IG's office is "transferring some of the people in Denver into this new FISMA unit [that] ... would be housed under the CIO's office under ... the management piece of my organization." Id. at 19.

The "management piece" of the IG's office Devaney referenced is the Office of Administrative Services and Information Management ("ASIM"), of which the CIO's office is one division. The functions and a number of the staff of the Denver NISO, which had conducted most of the IG's IT testing and was formerly headed by Diann Sandy,*fn11 see id. at 24, are currently being transferred to the IG's headquarters in Washington. See id. This new central FISMA unit, called the National Security Management Unit ("NSM") and headed by Roger Mahach, formerly Interior's Departmental Information Technology Security Manager ("DITSM"), should be completely assembled by October 1, 2005. See id. at 20--21. At that time, the Denver audit staff's involvement in FISMA reporting will be phased out. See id. at 21.

Creating the NSM is one piece of Devaney's larger "vision" for the IG's role in Interior's IT security going forward. He elaborated:

The strategic vision is to ... do the work we have been doing, to continue to look at policies, procedures, guidance, look at training, look at the certification, accreditation, all of the paper ... [and] to add to that the penetration testing we're doing, and we're also going to try and draw from our ... investigative component, and our audit component, any information with respect to IT systems that they come across in their normal duties, and have [this funneled] into this unit in Washington to make sure that we capture the entire [holistic] picture ... of all our work across the country that has anything to do with IT systems. It's a much more robust and [holistic] approach than we've had before.

Tr. (Hrng., May 20, 2005 AM Sess.), at 48 (testimony of Devaney). Having a FISMA-specialized group, Devaney notes, allows the IG to deploy "a cadre of highly technical people supervised by my CIO," one Michael Wood, who "has a technical background as well." Id. at 22. In addition to the current NSM staff, Devaney has included in his FY 2006 budget proposal a request for funds sufficient to hire at least four additional NSM staff members with technical expertise, so that the IG will have the capacity to conduct the full panoply of IT security testing "in-house." See id. at 29--31. "[M]y goal would always be to be able to do everything in-house ... so that we can totally control the situation." Id. at 30. Currently, however, the IG's office has no such internal capacity, and must retain outside contractors to conduct various kinds of IT security evaluations, such as the external penetration testing conducted as part of the IG's FY 2005 FISMA evaluation. See id. at 79

("Q: And did you believe you were in a position to provide ... an independent [IT security] evaluation [in FY 2004 and FY 2005]?

A: I believed that we could hire a contractor to do that for us.

Q: Okay.

In other words, at that point in time you didn't have the expertise to do it?

A: Correct.").

Around the time that the IG's FY 2004 FISMA evaluation report was nearing completion, the IG's NISO prepared a list of the "eight key areas" of consideration for evaluating Interior's compliance with FISMA. See Pls.' Ex. 116 (document entitled "Assignment Workpaper; Subject: Connecting the information relating to the OIG evaluation report, FISMA public law, and DOI Guidance," prepared by Kathryn Saylor (Sept. 14, 2004)) ("NISO Eight Key Areas"); see also Tr. (Hrng., May 25, 2005, PM Sess.), at 73 (testimony of Sandy) (authenticating the document as produced by her NISO staff). These eight areas of consideration will frame the discussion of the IG's findings during its FY 2003, FY 2004, and FY 2005 FISMA evaluations.

First, the agency's development, documentation, and maintenance of risk assessments for IT systems is evaluated. See Pls.' Ex. 116, at bp. DOI_OIG_IT0027883 (NISO Eight Key Areas); Tr. (Hrng., May 25, 2005, PM Sess.) (testimony of Sandy) (explaining what is considered in this area, including sensitivity ratings assigned to data, assessments of different kinds of threats and vulnerabilities, and the "determination as to whether that system is high, medium or low risk in the areas of confidentiality, availability, and integrity"). Second, the IG considers whether the department has created satisfactory plans that incorporate IT security into the life cycles of systems. See Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027884; Tr. (Hrng., May 25, 2005, PM Sess.), at 84--86 (testimony of Sandy) (explaining that this item basically calls for an evaluation of system security plans, including "the management controls, the technical controls, and the operational controls that surround and are used to safeguard the data and information in that system," which should be embodied in some form of written document).

The IG's third principal area of evaluation for FISMA compliance involves determining whether the department has and is correctly maintaining plans for providing adequate information security, see Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027884, or the practice of "keeping the [security] plan up to date. In other words, as part of your security plan and the result of your security tests and evaluation, the plans need to be updated. You find possibly a control is not working like you thought; therefore you need to address that to keep that plan in place." Tr. (Hrng., May 25, 2005, PM Sess.), at 86 (testimony of Sandy). Sandy explained that the documentation that must be produced to memorialize the process of updating SSPs "can be an addendum to the actual original security plan or it can just be a brand new security plan for that system," and that SSPs should be reviewed and updated "at least every three years, or sooner if there's been significant changes." Id. Fourth, the IG considers whether Interior is conducting adequate IT security training for employees, contractors, and other individuals whose duties involve IT security responsibilities. See Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027884. Sandy made clear that the training requirement "should include everyone"--including private contractors or Indian Tribes--who has IT security responsibilities because they interact in some appreciable way with Interior's IT infrastructure. See Tr. (Hrng., May 25, 2005, PM Sess.), at 87 (testimony of Sandy).

The fifth key area for FISMA evaluation involves determining whether Interior is performing appropriate testing and evaluation of security controls for IT systems on an annual basis. See Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027885. Sandy explained that this requires looking at "the annual evaluation that should be conducted on your controls to determine whether they are still operating as you intended," which is "testing the individual bureaus [within Interior] do on their own system ... an internal testing and evaluation." Tr. (Hrng., May 25, 2005, PM Sess.), at 96 (testimony of Sandy). Documentation of these annual security evaluations, according to Interior policy, is prepared using the guidance provided in NIST Special Publication 800-26, which governs IT system security self-assessment. See generally Marianne Swanson, Security Self-Assessment Guide for Information Technology Systems, NIST Special Publication 800-26 (Nov. 2001), United States Department of Commerce, National Institute of Standards and Technology, available at 800-26/sp800-26.pdf, with revised questionnaire incorporating baseline security controls from NIST SP 800-53,*fn12 available at 800-53v1.doc [hereinafter "NIST SP 800-26"]. Sixth, the IG considers the adequacy of Interior's "process for planning, implementing, evaluating and documenting remedial action to address any deficiencies in the information security policies, procedures, and practices of the agency[.]" Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027885. This item, Sandy explained, involves review of Interior's implementation and management of the POA&M process. See Tr. (Hrng., May 26, 2005, AM Sess.), at 10--11 (testimony of Sandy).

The seventh key area of FISMA evaluation requires that the IG examine Interior's "procedures for detecting, reporting, and responding to security incidents[,]" including whether Interior is in compliance with the requirement that IT security incidents be reported to the relevant "federal information security incident center." Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027885. Within Interior, the centralized incident reporting application is known as "DOICIRC," and the general federal IT security incident reporting center, formerly known as FEDCIRC, is now called U.S. CERT and is operated by the Department of Homeland Security. See Tr. (Hrng., May 26, 2005, AM Sess.), at 20--21 (testimony of Sandy). Eighth and finally, the IG's annual FISMA evaluation requires consideration of Interior's IT security contingency, or "continuity of operations," planning. See Pls.' Ex. 116 (NISO Eight Key Areas), at bp. DOI_OIG_IT0027886. Sandy explained that FISMA requires that Interior have contingency plans for each IT network or system that "identify the critical resources that need to be brought up first in the case of a disaster, whether it's nature or man made, or a system failure." Tr. (Hrng., May 26, 2005, AM Sess.), at 24--25 (testimony of Sandy). Sandy further detailed relevant contingency planning considerations, including questions like "[h]ow do you move it, where to you have a back-up site, can you get that data, can you bring it back up ...." Id. at 24 (testimony of Sandy). FISMA also requires that contingency plans be in written form, see id. at 25, and that contingency plans be tested "[a]t least annually." See id. at 27 (testimony of Sandy).

Pre-FISMA IT Security Assessments--Before FISMA's enactment in 2002, IT security for governmental agencies was governed in general by GISRA. Independent evaluations of Interior's compliance with IT security requirements, to the extent that they were conducted at all, were handled by Diann Sandy's Denver-based team under the IG's audits department. See Tr. (Hrng., May 26, 2005, AM Sess.), at 59--60 (testimony of Sandy) (explaining that her group compiled GISRA evaluation reports that formed the basis of OMB's general GISRA reports to Congress). In OMB's 2002 report to Congress under GISRA, Interior's IT security posture for FYs 2001 and 2002 is described in overview. See Pls.' Ex. 123 (extract from document entitled "Office of Management and Budget, FY 2002 Report to Congress on Federal Government Information Security Reform" (Rpt. No. A-IN-MOA-0099-2003, May 16, 2003)) ("2002 GISRA Rep."), at bp. DOI_OIG_IT001676.

For FY 2002, out of 224 total systems identified, Interior had 42 systems that were "assessed for risk and assigned a level of risk," 70 systems with an "up-to-date IT security plan," 49 systems "authorized for processing following certification and accreditation," 175 systems "operating without written authorization," 109 systems with "security control costs integrated into the life cycle of the system," 51 systems for which the "security controls have been tested and evaluated in the last year," 63 "systems with a contingency plan," but only 23 "systems for which contingency plans have been tested[.]" Pls.' Ex. 123 (2002 GISRA Rep.), at bp. DOI_OIG_IT0016766. These numbers showed relative improvement over those reported for FY 2001. See id. OMB noted, however, that the "DOI and the DOI IG report that the lack of a credible DOI IT system inventory casts doubt on the accuracy of various statistics and performance results contained in the DOI FY 2002 GISRA submission." See id. at bp. DOI_OIG_IT0016770.

Additionally, OMB's 2002 GISRA report noted management and policy problems related to Interior's IT security program. For example, OMB observed that "program officials, such as the Assistant Secretaries and Bureau heads, deputies, and assistant directors have not been held accountable for carrying out their [IT security] responsibilities," Pls.' Ex. 123 (2002 GISRA Rep.), at bp. DOI_OIG_IT0016770, and that "all [departmental] policies and guidance were not implemented by the Bureaus[,] [a]ll systems were not identified, certified, accredited, and authorized to operate[,] [p]rocedures were not developed to validate whether all Bureaus have effectively implemented federal and DOI IT policies, procedures, standards, and guidelines[,] ... [and] procedures were not established to keep DOI IT security policies and guidance up to date." Id. at bp. DOI_OIG_IT0016773.

OMB's GISRA report to Congress is based on self-reporting by the various governmental agencies that are evaluated; part of Interior's submission to OMB for this purpose is an "IT Security Scorecard" created by the departmental CIO's office. See, e.g., Pls.' Ex. 124 (email from Diann Sandy to Jennifer Schafer, Subject: "DRAFT IT Security Scorecard for June 2003", July 7, 2003), Attachment (document entitled "Department of Interior IT Security Scorecard, June 2003"), at bp. DOI_OIG_IT0016124. Sandy's office took issue with some of the information Interior provided to OMB during the compilation of the 2002 GISRA Report, "primarily with the fact that DOI was acknowledging that systems were certified and accredited when, in fact, there was no defined process at that time." Tr. (Hrng., May 26, 2005, AM Sess.), at 77 (testimony of Sandy). Indeed, in Interior's CIO's response to OMB's draft of the FY 2002 GISRA report, the following comment from the Inspector General's office was included.

Overall, the IG's information is depicted accurately in the Office of Management and Budget's report of the U.S. Department of the Interior, for the Government Information Security Reform Act. However, we have concerns regarding the number of Department of Interior systems authorized to process after certification and accreditation. The Department reported that 49 systems had been authorized to process after a certification and accreditation. We noted in the IG's report that the Department did not have a certification and accreditation process. Further, in OIG reviews of Departmental components of IT systems, rarely were systems certified and accredited and if the systems were identified as certified and accredited, the inappropriate level of management was certifying and accrediting the systems. One of the DOI components had a documented certification and accreditation process, however, the process was broken and not implemented.

Pls.' Ex. 125 (email from Kamela White, OMB, to Stephen King, DOI OCIO, Subject: "Re: DOI Comments on DRAFT FY 2002 GISRA Report and DOI Summary", May 6, 2003), Attachment (email from Stephen King, DOI OCIO, to Kamela White, OMB, Subject: "DOI Comments on DRAFT FY 2002 GISRA Report and DOI Summary", May 2, 2003), at bp. DOI_OIG_IT0023724. While Sandy noted that there had been some improvement in the state of Interior's C&A process from FY 2002 under GISRA to FY 2003 under FISMA, see Tr. (Hrng., May 26, 2005, AM Sess.), at 77 (testimony of Sandy), she would go on to note numerous IT security problems that have existed since the preparation of the IG's FY 2002 GISRA report and remain today.

FY 2003 FISMA Evaluation--The IG's FY 2003 FISMA evaluation was managed by Diann Sandy and conducted primarily by her NISO. See Tr. (Hrng., May 25, 2005, PM Sess.), at 43 (testimony of Sandy). Devaney's consistently increasing focus on his office's FISMA reporting responsibilities is reflected in the broadening scope of the FISMA-related activities his staff has undertaken from year to year. In FY 2003, for example, the IG's office analyzed previously conducted reviews of "security practices and general and application controls over information systems supporting telecommunications, energy and water operations, scientific research and mapping, park operations, and financial operations included in financial statement audits; and ... DOI's management of Web sites[;]" reviewed FY 2003 reports concerning Interior's IT security prepared by the Government Accountability Office ("GAO") and OMB; and examined "[i]nternal reviews performed and documents provided by the DOI ... [CIO] and bureaus and offices." Pls.' Ex. 14, at bp. DEF0043818.

Along with analyzing previous review documents, the FY 2003 IG FISMA evaluation included a first-hand review of "DOI's and bureaus' and offices' security management policies, procedures, and practices documents, such as security plans and risk assessments," and testing of "information system security controls as part of [the IG's] detailed review of general controls over information security at U.S. Geological Survey [("USGS")], National Park Service [("NPS")], Bureau of Reclamation [("BOR")], and DOI Web sites." Id. at bp. DEF0043818--DEF0043819. The IG's actual security testing covered "98 systems including 5 that were operated and maintained by contractors and 405 Web site component systems." Id. at bp. DEF0043819.

In the IG's 2003 FISMA report to OMB, NISO noted numerous areas where Interior's IT security program was not in compliance with applicable standards. See generally Pls.' Ex. 120 (memorandum from Diann Sandy, Manager, NISO, to CIO, Department of the Interior, Subject: Evaluation of the Department of the Interior's Information Security Program (Report No. 2003-I-0066)) ("2003 FISMA Rep."). As in the FY 2002 GISRA report, the IG noted in 2003 that "DOI has not ensured that all of its security policies have been implemented and integrated," and that "bureau and office senior level management were not always held accountable for ensuring that Federal and DOI policies, procedures, practices, and control techniques were implemented." Pls.' Ex. 120 (2003 FISMA Rep.), at bp. DOI_IT0018062--DOI_IT0018063; see also Tr. (Hrng., May 26, 2005 PM Sess.), at 56--57 (testimony of Sandy) (discussing these organizational and accountability issues). The IG also observed that:

[a]ll systems operated for or on behalf of the DOI including DOI Commissions such as the Indian Gaming Commission; outsourced Web sites; universities and colleges; state, local, and tribal governments; and hosting of Web sites for organizations such as not-for-profits are not included in information system inventories. For example, in at least three bureaus, information systems personnel did not consider that outsourced Web sites or contractor operated and managed applications used to collect and process DOI information should be included as part of the bureaus' system inventory.

Pls.' Ex. 120 (2003 FISMA Rep.), at bp. DOI_IT0018067. Indeed, this faulty system inventory, Sandy explained, gave rise to "some concerns ... that it was easy for the bureaus to improve their [DOI and OMB] score without improving what they had done from a security management perspective by merely lumping systems together[;] ... merely by reducing the number of systems, the bureau was able to show improvement for conducting the required reviews ...." Tr. (Hrng., May 26, 2005, PM Sess.), at 73 (testimony of Sandy). Among other recommendations, the IG advised Interior to "[e]stablish and periodically provide a training program that addresses the requirements needed for any position including program officials and system owners and federal or contractor employees with significant information and information system security responsibilities." Pls.' Ex. 120 (2003 FISMA Rep.), at bp. DOI_IT0018071.

In recounting the NISO's activities during the compilation of the IG's FY 2003 FISMA report, Sandy recalled encountering problems with implementation of the Secretary's order that all bureaus and offices with 5,000 or more employees establish a CIO's office, see Tr. (Hrng., May 26, 2005, PM Sess.), at 59--60 (testimony of Sandy); Pls.' Ex. 128 (document entitled "A-IN-MOA-0099-2003, Fiscal Year 2003 FISMA"), at bp. DOI_OIG_IT0015686--DOI_OIG_IT0015686 (NISO "found that the CIO positions for at least two bureaus were not filled by the established milestone date"), operation of systems lacking a certification and accreditation or even an IATO, see Tr. (Hrng., May 26, 2005, PM Sess.), at 61 (testimony of Sandy), implementation of an effective IT security training program for Interior personnel and contractors, see id. at 63--64 (testimony of Sandy), and monitoring of contractor access to and management of Interior's IT systems. See id. at 65 (testimony of Sandy). Sandy's NISO team found that bureaus and offices were conducting annual security assessments according to the outdated NIST SP 800-26 self-assessment guidance rather than the new 800-30 independent assessment requirements, see id. at 79--80 (testimony of Sandy), and were implementing "dial-up access to DOI's networks ... without [security] controls being effectively implemented." Id. at 78; see Pls.' Ex. 120 (2003 FISMA Rep.), at bp. DOI_IT0018067.

Other IT security problems uncovered during the IG's FY 2003 FISMA evaluation included inadequate POA&M processes and documentation, see Tr. (Hrng., May 26, 2005, PM Sess.), at 82--83 (testimony of Sandy), managerial acceptance of IT security risks without adequate supporting documentation as required by FISMA, see id. at 84--86 (testimony of Sandy), and acceptance of IT security risks by Interior employees other than the DAA in violation of FISMA and OMB standards, see id. at 85--88 (testimony of Sandy). One particularly troubling instance involved the CIO of BIA signing C&A documentation for BIA's TrustNet system as both the certifying agent and the accrediting authority. See Tr. (Hrng., May 27, 2005, AM Sess.), at 8 (testimony of Sandy).*fn13 Sandy testified that this practice both violated FISMA and OMB requirements and involved an inherent conflict of interest, see id. at 9 (testimony of Sandy), and that the problem was reported to the departmental CIO and later corrected. See id. In summary, NISO found that Interior's IT security program was significantly deficient in all of NISO's Eight Key Areas of FISMA evaluation. See Tr. (Hrng., June 1, 2005, AM Sess.), at 4--5 (testimony of Sandy).

FY 2004 FISMA Evaluation--The IG's FISMA evaluation covered more ground in FY 2004, even though the NSM had not yet been completely assembled and most of the work was still being conducted "under the Office of Audits [by] Roger LaRouche and Diann Sandy." Tr. (Hrng., May 20, 2005 AM Sess.), at 48 (testimony of Devaney); Tr. (Hrng., May 25, 2005, PM Sess.), at 44 (testimony of Sandy) (explaining her managing role in the IG's FY 2004 FISMA evaluation). Roger Mahach, who now heads the IG's NSM group, was hired in June 2004, see id. at 43, so that while "he was consulted about his thoughts" on the FY 2004 FISMA evaluation, "the work ... started much earlier in the [fiscal] year, and [was] being rolled up at the end of the [fiscal] year." Id. at 49. In addition to reviewing updated versions of the same kinds of reports and documentation that were considered in FY 2003, the IG "tested controls over 20 of DOI's 157 information systems-9 major applications and 11 general support systems. These tests included the performance of limited non-intrusive scanning of DOI networks and devices, such as servers and firewalls, which were accessible from the Internet." See Pls.' Ex. 15 (letter from Earl Devaney, Inspector General for the U.S. Department of the Interior, to Joshua Bolton, Director, Office of Management and Budget (Oct. 12, 2004)) ("IG 2004 FISMA Letter"), Enclosure (document entitled "United States Department of the Interior, Office of the Inspector General: Annual Evaluation, DOI Information Security Program (Report No. A-EV-MOA-0006-2004, Oct. 2004)") ("2004 FISMA Rep."), at 1.

The IG's FY 2004 evaluation considered Interior's compliance with FISMA's requirements that "federal agencies ... implement security programs that protect information systems from unauthorized access, use, disclosure, disruption, modification, or destruction," id., including mechanisms designed to: "assess risks and implement policies and procedures to reduce risks; test and evaluate security controls; plan for continuity of operations; maintain subordinate plans for providing information security; plan for security throughout life cycle of systems; plan corrective actions; train employees and contractors; and detect, report, and respond to security incidents." Id. This is a substantially more detailed list of features of a FISMA-compliant IT security plan than was presented in the FY 2003 IG report, see Pls.' Ex. 14, at bp. DEF0043818, underscoring Inspector General Devaney's testimony regarding the evolving and increasingly technical nature of FISMA reporting in general. See Tr. (Hrng., May 20, 2005, AM Sess.), at 23 (testimony of Devaney).

Sandy's NISO team identified many of the same deficiencies in FY 2004 that were observed during the FY 2003 FISMA reporting period. See generally Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 3 (summarizing findings). Interior's self-reporting scorecard for FY 2004 graded the department's IT security program at 67.5 out of 100. See Pls.' Ex. 158 (document entitled "Department of the Interior, IT Security Scorecard" (Apr. 30, 2004)), at bp. DOI_OIG_IT0021994. This failing grade, Sandy indicated, was likely more favorable than the grading Interior would receive from OMB, so that a failing self-scorecard all but guarantees a failing OMB scorecard. See Tr. (Hrng., June 1, 2005, PM Sess.), at 24 (testimony of Sandy). The IG's overall evaluation for FY 2004 emphasized that "despite sound guidance from the Office of the Chief Information Officer [of Interior], we continue to identify weaknesses in bureau and office implementation of IT security requirements." See Pls.' Ex. 15, Enc. (2004 FISMA Rep.) (cover letter from Devaney to the Secretary of the Interior). The IG elaborated in the statement of results:

We found that DOI has effectively designed its information security management program to meet the requirements of FISMA .... However, despite these efforts, our review of information and actions reported by bureaus indicated that they have not consistently followed DOI guidance in implementing their security programs. In particular, our tests of 20 systems, 19 of which were certified and accredited by the bureaus, identified weaknesses in the conduct of a majority of the system certifications and accreditations. In our opinion, this demonstrates a clear need for qualitative examination by the CIO of reported bureau accomplishments.

Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 3.

Contributing to Interior's poor performance, Sandy explained, were several recurring problems. The Fish and Wildlife Service ("FWS") and NBC failed to take the risks of interconnections between IT systems into consideration when conducting risk assessments. See id. at 33--34 (testimony of Sandy); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 4; Pls.' Ex. 160 (document entitled "Finding Outline, Subject(s): DOI guidelines are not clear on 800-30 risk assessment requirements", prepared by Stacey Crouser, NISO (Oct. 12, 2004)) (reporting that "[o]f the 20 systems [NISO] tested, 19 were certified and accredited, but 12 of 19 of the systems certified and accredited (63 percent) did not have risk assessments that followed NIST SP 800-30 guidance"), at bp. DOI_OIG_IT0027586; Pls.' Ex. 162 (document entitled "Assignment Workpaper, Subject: Risk Assessments," prepared by Harriet Thiesen, NISO (Sept. 21, 2005)), at bp. DOI_OIG_IT0029067--DOI_OIG_IT0029068 (discussing absence of consideration of IT system interconnections from risk assessments). BLM failed to ensure that risk assessment documentation was placed at each physical location of a system, including field offices. See Tr. (Hrng., June 1, 2005, PM Sess.), at 35--36 (testimony of Sandy); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 4; Pls.' Ex. 162, at bp. DOI_OIG_IT0029067--DOI_OIG_IT0029068. And the department itself failed to ensure uniform IT security policies across all bureaus and offices operating "enclaves," or collections of systems and networks that are interconnected throughout a bureau and that are certified and accredited as a single composite GSS. See Tr. (Hrng., June 1, 2005, PM Sess.), at 36--37 (testimony of Sandy).

In addition, NISO's FY 2004 FISMA investigation found that Interior was not taking the steps necessary to ensure that Interior systems hosted or operated at non-governmental contractor facilities, such as BIA's Indian trust system backbone TrustNet, have the level of security required by FISMA. See Tr. (Hrng., June 1, 2005, PM Sess.), at 40; see also Pls.' Ex. 163 (document entitled "Assignment Workpaper, Subject: Record of Observation- Contractor Operated Services and Facilities," prepared by Stacey Crouser, NISO (Sept. 29, 2004)), at bp. DOI_OIG_IT0028443--DOI_OIG_IT0028444 (also discussing the same problem with respect to an NBC system operated by a contractor at a remote facility). "Based on a review of [the contract between BIA and the system operator], there are no requirements for [the contractor] to follow NIST, DOI, and OMB guidelines and no requirements for independent audits. ... Additionally, there is no requirement for [the contractor] to develop the required certification and accreditation documentation ...." Pls.' Ex. 163, at bp. DOI_OIG_IT0028443. Sandy testified that it is Interior's responsibility to ensure that any contractor with access to or control over any of Interior's information or information systems complies with FISMA requirements for securing those IT assets. See Tr. (Hrng., June 1, 2005, PM Sess.) at 42 (testimony of Sandy). Compounding this problem, Sandy's team found that Interior had not even fully identified all systems that are "outsourced" to be operated and maintained by private contractors as FISMA requires. See Tr. (Hrng., June 1, 2005, PM Sess.), at 43--44 (testimony of Sandy); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 7 ("DOI had no specific methodology to identify all contractors with access to DOI systems."); Pls.' Ex. 164 (document entitled "Assignment Workpaper, Subject: Contractor Operated Meeting FISMA, OMB, and DOI Policies," prepared by Stacey Crouser, NISO (Aug. 27, 2005)), at bp. DOI_OIG_IT0028422 (noting that "[o]f the 12 systems[] [NISO] found that were contractor operated facilities or operations, DOI had only classified 58% of them as contractor operated facilities or operations (7/12 = 58%)").

The NISO reported significant deficiencies in SSPs for Interior systems, including instances of SSPs lacking a complete list of all active security controls on a system, thereby frustrating attempts to evaluate the effectiveness of those controls, and SSPs that failed to identify the relevant security contact person for the system. See Tr. (Hrng., June 1, 2005, PM Sess.) at 62--63 (testimony of Sandy) (indicating that roughly 40 percent, or 8 of 19, of the SSPs tested were deficient under NIST standards); id. at 64 (testimony of Sandy) (Many SSPs "didn't do a really good job of identifying who the [IT security] contact was [for the system]. A lot of management information was the same person or different people. You couldn't really tell who was overall responsible for the system, and the security of it."); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 5--6; Pls.' Ex. 168 (document entitled "Finding Outline, Assignment Number: A-EV-MOA-0006-2004, Assignment: Evaluation of DOI's Information Security policies-procedures-practices-and controls, Program Name: Administration and Background, Finding Number: 1.7," prepared by Stacey Crouser, NISO (Oct. 12, 2004)), at bp. DOI_OIG_IT0027618. A number of NBC's SSPs were found deficient during this review. See Tr. (Hrng., June 1, 2005, PM Sess.), at 68--73 (testimony of Sandy) (discussing problems identified in NBC's SSPs); Pls.' Ex. 169 (document entitled "Assignment Workpaper, Subject: Summary of NBC System Security Plans," prepared by Stacey Crouser, NISO (Sept. 16, 2004)), at bp. DOI_OIG_IT0028440--DOI_OIG_IT0028442.
For example, the SSP for the NBC Denver Data Center General Support System Enclave, one of two NBC GSSs that support numerous important applications for various bureaus and offices, had no "real description of the applications supported and who the users are" and did not incorporate completed interconnection agreements for all the other Interior bureaus and third parties who connect to the system as required by NIST SP 800-47. See Pls.' Ex. 169, at bp. DOI_OIG_IT0028440.

Additionally, Sandy's team found that SSPs "were not being updated based on the results of security tests and evaluations and risk assessments," Tr. (Hrng., June 1, 2005, PM Sess.), at 64 (testimony of Sandy), and had, in the main, deficient contingency plans. See id. at 65 (testimony of Sandy). See also Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 5 (noting that while 16 of the 19 C&A'd systems that were evaluated had some form of contingency plan, "we found deficiencies in 12 of these [16] plans"). For example, the contingency plan for BIA's TrustNet system, the network "backbone" for BIA's Indian Trust systems, "was limited to only technical procedures and did not identify a team for the recovery operations or include the specific steps to recover from a disruption in service. Additionally, the plan did not show the order of priority for recovering critical applications." Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 5.

For its required annual security testing and evaluation, Interior was relying primarily on monthly automated scanning with a tool called "Nessus," and was only setting the scans to test for vulnerabilities enumerated on the FBI/SysAdmin, Audit, Network, Security (SANS) "Top 20" list of IT security weaknesses. See Tr. (Hrng., June 2, 2005, AM Sess.), at 9 (testimony of Sandy); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 4; see also Tr. (Hrng., June 1, 2005, PM Sess.), at 75--76 (describing the FBI/SANS Top 20 list as "the ones that come from FBI that have been identified as the most critical weaknesses throughout the IT world that ... can allow hackers or crackers to get into systems real easily"). While this level of vulnerability scanning is certainly preferable to nothing at all, Sandy detailed at length the reasons why scanning only for the 20 most serious weaknesses does not result in an adequate representation of the risk that a network might be subject to unauthorized access from the Internet. See Tr. (Hrng., June 1, 2005, PM Sess.), at 75--102 (discussing problems with reliance on the FBI/SANS Top 20 list); see also Pls.' Ex. 172 (document entitled "Assignment Workpaper, Subject: Expanded Scope for Nessus Scans," prepared by Hector DeJesus, NISO (Sept. 10, 2004)), at bp. DOI_OIG_IT0029061--DOI_OIG_IT0029065 (identifying various "vulnerabilities that are categorized by Nessus as High Risk factors and Medium Risk Factors that are not included in the [FBI/SANS] top twenty scan" on a number of USGS servers); Pls.' Ex. 175 (document entitled "Assignment Workpaper, Subject: USGS and MMS Important Vulnerabilities," prepared by Hector DeJesus, NISO (Aug. 24, 2004)), at bp. DOI_OIG_IT0029025--DOI_OIG_IT0029039 (identifying numerous high risk vulnerabilities on USGS networks and one on an MMS network that were not included in the FBI/SANS Top 20 list).
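The arithmetic underlying Sandy's criticism can be illustrated with a short sketch. The plugin identifiers, severities, and counts below are invented for illustration and are not Interior's actual scan data or Nessus's real plugin IDs; the sketch only shows how restricting a scan to a fixed "Top 20" subset hides serious findings that an unrestricted scan would report.

```python
# Hypothetical illustration: restricting a vulnerability scan to a fixed
# "Top 20" plugin set understates the true risk picture. All values are
# invented for the example.

# Findings an unrestricted scan might report: (plugin_id, severity)
all_findings = [
    (10001, "High"), (10002, "High"), (10003, "Medium"),
    (10004, "High"), (10005, "Medium"), (10006, "Low"),
]

# Suppose only two of those plugins correspond to Top-20 checks.
top20_plugins = {10001, 10003}

# What a Top-20-only scan would surface:
top20_view = [f for f in all_findings if f[0] in top20_plugins]

# What a full scan would flag as serious (High or Medium risk):
full_view = [f for f in all_findings if f[1] in ("High", "Medium")]

print(len(top20_view))  # 2 findings visible to the restricted scan
print(len(full_view))   # 5 serious findings the full scan surfaces
```

On this toy data, the restricted scan reports two findings while three additional High- or Medium-risk weaknesses go unreported, which is the pattern NISO documented on the USGS and MMS networks.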

Sandy's team also found numerous deficiencies in bureaus' and offices' POA&M policies and procedures, as it had in FY 2003. See Tr. (Hrng., June 2, 2005, AM Sess.), at 38 (testimony of Sandy); Pls.' Ex. 15, Enc. (2004 FISMA Rep.), at 6--7 (finding that while 84 percent of "bureaus recorded known weaknesses in their POA&Ms most of the time[,]" there was nevertheless "a need to ensure that all reported weaknesses are recorded, priorities are assigned to correct all weaknesses, and costs needed to remedy weaknesses are always identified"). At the POA&M program management level, for example, Sandy found that FWS, BIA, and NPS had no POA&M policies or procedures in place; OSM had policies, but not procedures; BLM had policies and procedures that were not being implemented by BLM state offices; and BOR had both policies and procedures for POA&M management in place. See Tr. (Hrng., June 2, 2005, AM Sess.), at 41--42 (testimony of Sandy); Pls.' Ex. 181 (document entitled "Assignment Workpaper, Subject: Bureau POA&M Policies and Procedures," prepared by Stacey Crouser, NISO (Aug. 5, 2004)), at bp. DOI_OIG_IT0027996--DOI_OIG_IT0027997 (reporting these results).

For example, while BLM's POA&M policies and procedures called for each state office to compile separate POA&Ms to be incorporated into the bureau-level POA&M, NISO found that state offices simply were not compiling their individual POA&Ms. See Tr. (Hrng., June 2, 2005, AM Sess.), at 52--65 (testimony of Sandy). NISO focused particularly on the California and Idaho state offices of BLM, which were found not to be in compliance with BLM's POA&M policy. See id. at 61--62 (testimony of Sandy) (discussing both state offices); Tr. (Hrng., June 2, 2005, PM Sess.), at 64--66 (testimony of Sandy) (discussing problems with POA&M policy implementation at BLM's California State Office); Pls.' Ex. 184 (document entitled "Record of Discussion, Subject: POAM at the state offices," prepared by Kathryn Saylor, NISO (Aug. 11, 2004)), at bp. DOI_OIG_IT0028006 (indicating that BLM's Idaho state office has no POA&M process at all, and that on the BLM's California State Office POA&M "the priority column was not filled in therefore, there is no prioritization of weaknesses for correction ... [and] [o]nly 2 of the 31 weaknesses had resources identified" for correcting the weakness).*fn14 Both these offices connect to BLM's National Information Resource Management Center ("NIRMC"), which in turn supports a number of Indian Trust systems and applications that access IITD. See Tr. (Hrng., July 25, 2005, AM Sess.), at 25--33 (testimony of James Rolfes, Dir., BLM IT Incident Command Center); Tr. (Hrng., June 28, 2005, PM Sess.), at 106--13 (testimony of Ronnie Levine, BLM CIO). More generally, the IG reported that BLM's POA&M for the BLM enclave reflected resource allocation issues, explaining that:

in Bureau of Land Management's POA&M for the BLM Enclave, there were 20 weaknesses reported as high priority. The POA&M identified the resources need to correct only 5 of these high priorities. However, other weaknesses classified as medium in this same system had the resources identified. Consequently, it is difficult for ...
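The completeness standard the IG applied to these POA&Ms (every recorded weakness should carry a priority and the resources needed to correct it) can be sketched as a simple check. The records below are invented for illustration; they are not drawn from BLM's actual POA&M.

```python
# Hypothetical sketch of the IG's POA&M completeness check: each recorded
# weakness should have a priority assigned and correction resources
# identified. The entries are invented for illustration.

poam = [
    {"weakness": "Unpatched web server", "priority": "High", "resources": "$40k"},
    {"weakness": "No audit logging",     "priority": "High", "resources": None},
    {"weakness": "Weak passwords",       "priority": None,   "resources": None},
]

missing_priority = [e["weakness"] for e in poam if not e["priority"]]
missing_resources = [e["weakness"] for e in poam if not e["resources"]]

print(missing_priority)   # weaknesses that cannot be ranked for correction
print(missing_resources)  # weaknesses whose remediation cost is unknown
```

A POA&M failing either check leaves management unable to rank weaknesses for correction or to budget for their remediation, which is the deficiency the IG described in the BLM enclave's plan.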
