By Kathryn Rock
In October 2022, the White House released a framework with five main principles, the “Blueprint for an AI Bill of Rights,” to guide the design, use, and deployment of AI and automated decision systems. The blueprint stated that the framework should apply to housing, credit, employment, and financial services. On October 30, 2023, the White House issued an Executive Order (EO) imposing requirements on the development and use of AI and automated decision systems. While its immediate impact falls on federal agencies, the EO has long-term implications for private sector adoption and use of AI.
The EO aims to ensure that automated decision systems using AI or machine learning embed data security, equity, and fairness considerations into the development and deployment process. The EO focuses on:
1. Standards for AI Safety and Security — The EO requires the National Institute of Standards and Technology to establish standards for adversarial testing (“red-team testing”) of foundation models for flaws or vulnerabilities that pose a serious risk to national security, national economic security, or national public health and safety. This includes “dual-use” foundation models: AI systems that, in addition to posing the risks above, have applications in both civilian and military or other sensitive sectors. The EO relies on the Defense Production Act to impose reporting requirements on developers of these dual-use foundation models regarding planned activities related to their development and production.
2. Advancing Equity and Civil Rights — In addition to enforcing existing authorities to protect the rights and safety of Americans, the EO provides guidance to landlords, federal benefits programs, and federal contractors to prevent AI from being used to exacerbate discrimination. The White House will coordinate with federal agencies and the Department of Justice to address algorithmic discrimination in housing, healthcare, lending, and employment.
3. Opportunities for Collaboration and Partnership — The EO emphasizes the need for collaboration and competition in the AI space to promote innovation.
Companies that currently use, or intend to use, automated decision systems built on AI or machine learning in regulated industries, or that contract with the federal government, should pay close attention to the standards developed under the new EO. Robust model risk management and documented controls for monitoring AI systems are critical to meeting heightened expectations for transparency, protection of consumer data, and the equitable use of AI in decisions surrounding lending, healthcare services, and employment.
Firms should take an inventory of their current AI capabilities and identify where automated decision systems are deployed throughout the organization. Guidehouse can work with your team to evaluate current AI systems and processes, identify gaps and weaknesses, and develop a roadmap for enhancing AI capabilities. A key consideration for managing AI risk is developing an AI Governance Framework to ensure the responsible use and deployment of AI systems organization-wide.
As firms navigate the implications of the new EO on AI, it is essential to have a partner who can provide expert guidance and support. Guidehouse has a deep understanding of regulated industries and extensive experience in responsible AI.
Guidehouse can comprehensively review, assess, and validate automated decision systems. We can also help your organization maximize the predictive power of automated decision systems while implementing industry standards for data privacy and protection, model monitoring and risk mitigation, and the equitable and responsible use of AI.
Guidehouse is a global consultancy providing advisory, digital, and managed services to the commercial and public sectors. Purpose-built to serve the national security, financial services, healthcare, energy, and infrastructure industries, the firm collaborates with leaders to outwit complexity and achieve transformational changes that meaningfully shape the future.