Senate Bill 3062
119th Congress (2025-2026)
GUARD Act
Introduced in Senate on Oct 28, 2025
S. 3062 (Introduced-in-Senate)


119th CONGRESS
1st Session
S. 3062


To require artificial intelligence chatbots to implement age verification measures and make certain disclosures, and for other purposes.


IN THE SENATE OF THE UNITED STATES

October 28, 2025

Mr. Hawley (for himself, Mr. Blumenthal, Mrs. Britt, Mr. Warner, Mr. Murphy, and Mr. Kelly) introduced the following bill; which was read twice and referred to the Committee on the Judiciary


A BILL

To require artificial intelligence chatbots to implement age verification measures and make certain disclosures, and for other purposes.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. Short title.

This Act may be cited as the “Guidelines for User Age-verification and Responsible Dialogue Act of 2025” or the “GUARD Act”.

SEC. 2. Findings.

Congress finds the following:

(1) Artificial intelligence chatbots are increasingly being deployed on social media platforms and in consumer applications used by minors.

(2) These chatbots can generate and disseminate harmful or sexually explicit content to children.

(3) These chatbots can manipulate emotions and influence behavior in ways that exploit the developmental vulnerabilities of minors.

(4) The widespread availability of such chatbots exposes children to physical and psychological safety risks, including grooming, addiction, self-harm, and harm to others.

(5) Protecting children from artificial intelligence chatbots that simulate human interaction without accountability is a compelling governmental interest.

SEC. 3. Definitions.

In this Act:

(1) AI COMPANION.—The term “AI companion” means an artificial intelligence chatbot that—

(A) provides adaptive, human-like responses to user inputs; and

(B) is designed to encourage or facilitate the simulation of interpersonal or emotional interaction, friendship, companionship, or therapeutic communication.

(2) ARTIFICIAL INTELLIGENCE CHATBOT.—The term “artificial intelligence chatbot”—

(A) means any interactive computer service or software application that—

(i) produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; and

(ii) accepts open-ended natural-language or multimodal user input and produces adaptive or context-responsive output; and

(B) does not include an interactive computer service or software application—

(i) the responses of which are limited to contextualized replies; and

(ii) that is unable to respond on a range of topics outside of a narrow specified purpose.

(3) COVERED ENTITY.—The term “covered entity” means any person who owns, operates, or otherwise makes available an artificial intelligence chatbot to individuals in the United States.

(4) MINOR.—The term “minor” means any individual who has not attained 18 years of age.

(5) REASONABLE AGE VERIFICATION MEASURE.—The term “reasonable age verification measure” means a method that is authenticated to relate to a user of an artificial intelligence chatbot, such as—

(A) a government-issued identification; or

(B) any other commercially reasonable method that can reliably and accurately—

(i) determine whether a user is an adult; and

(ii) prevent access by minors to AI companions, as required by section 6.

(6) REASONABLE AGE VERIFICATION PROCESS.—The term “reasonable age verification process” means an age verification process employed by a covered entity that—

(A) uses one or more reasonable age verification measures in order to verify the age of a user of an artificial intelligence chatbot owned, operated, or otherwise made available by the covered entity;

(B) provides that requiring a user to confirm that the user is not a minor, or to insert the user's birth date, is not sufficient to constitute a reasonable age verification measure;

(C) ensures that each user is subjected to each reasonable age verification measure used by the covered entity as part of the age verification process; and

(D) does not base verification of a user's age on factors such as whether the user shares an Internet Protocol address, hardware identifier, or other technical indicator with another user determined to not be a minor.

SEC. 4. Criminal prohibitions.

(a) In general.—Part I of title 18, United States Code, is amended by inserting after chapter 5 the following:

“CHAPTER 6—ARTIFICIAL INTELLIGENCE


“Sec.

“91. Artificial intelligence chatbots.

“§ 91. Artificial intelligence chatbots

“(a) Definitions.—In this section:

“(1) ARTIFICIAL INTELLIGENCE CHATBOT.—The term ‘artificial intelligence chatbot’—

“(A) means any interactive computer service or software application that—

“(i) produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; and

“(ii) accepts open-ended natural-language or multimodal user input and produces adaptive or context-responsive output; and

“(B) does not include an interactive computer service or software application—

“(i) the responses of which are limited to contextualized replies; and

“(ii) that is unable to respond on a range of topics outside of a narrow specified purpose.

“(2) MINOR.—The term ‘minor’ means any individual who has not attained 18 years of age.

“(3) SEXUALLY EXPLICIT CONDUCT.—The term ‘sexually explicit conduct’ has the meaning given the term in section 2256.

“(b) Solicitation of minors.—

“(1) OFFENSE.—It shall be unlawful to design, develop, or make available an artificial intelligence chatbot, knowing or with reckless disregard for the fact that the artificial intelligence chatbot poses a risk of soliciting, encouraging, or inducing minors to—

“(A) engage in, describe, or simulate sexually explicit conduct; or

“(B) create or transmit any visual depiction of sexually explicit conduct, including any visual depiction described in section 1466A(a).

“(2) PENALTY.—Any person who violates paragraph (1) shall be fined not more than $100,000 per offense.

“(c) Promotion of physical violence.—

“(1) OFFENSE.—It shall be unlawful to design, develop, or make available an artificial intelligence chatbot, knowing or with reckless disregard for the fact that the artificial intelligence chatbot encourages, promotes, or coerces suicide, non-suicidal self-injury, or imminent physical or sexual violence.

“(2) PENALTY.—Any person who violates paragraph (1) shall be fined not more than $100,000 per offense.”.

(b) Technical and conforming amendment.—The table of chapters for part I of title 18, United States Code, is amended by inserting after the item relating to chapter 5 the following:

“6. Artificial intelligence .......... 91”.

SEC. 5. Covered entity obligations.

(a) Creation of user accounts.—A covered entity shall require each individual accessing an artificial intelligence chatbot to make a user account in order to use or otherwise interact with such chatbot.

(b) Age verification.—

(1) AGE VERIFICATION OF EXISTING ACCOUNTS.—With respect to each user account of an artificial intelligence chatbot that exists as of the effective date of this Act, a covered entity shall—

(A) on such date, freeze any such account;

(B) in order to restore the functionality of such account, require that the user provide age data that is verifiable using a reasonable age verification process, subject to paragraph (4); and

(C) using such age data, classify each user as a minor or an adult.

(2) AGE VERIFICATION OF NEW ACCOUNTS.—At the time an individual creates a new user account to use or interact with an artificial intelligence chatbot, a covered entity shall—

(A) request age data from the individual;

(B) verify the individual’s age using a reasonable age verification process, subject to paragraph (4); and

(C) using such age data, classify each user as a minor or an adult.

(3) PERIODIC AGE VERIFICATION.—A covered entity shall periodically review previously verified user accounts using a reasonable age verification process, subject to paragraph (4), to ensure compliance with this Act.

(4) USE OF THIRD PARTIES.—For purposes of paragraphs (1)(B), (2)(B), and (3), a covered entity may contract with a third party to employ reasonable age verification measures as part of the covered entity's reasonable age verification process, but the use of such a third party shall not relieve the covered entity of its obligations under this Act or from liability under this Act.

(5) AGE VERIFICATION MEASURE DATA SECURITY.—A covered entity—

(A) shall establish, implement, and maintain reasonable data security to—

(i) limit collection of personal data to that which is minimally necessary to verify a user’s age or maintain compliance with this Act; and

(ii) protect such age verification data against unauthorized access;

(B) shall protect such age verification data against unauthorized access;

(C) shall protect the integrity and confidentiality of such data by only transmitting such data using industry-standard encryption protocols;

(D) shall retain such data for no longer than is reasonably necessary to verify a user’s age or maintain compliance with this Act; and

(E) may not share with, transfer to, or sell to, any other entity such data.

(c) Required disclosures for artificial intelligence chatbots.—

(1) DISCLOSURE OF NON-HUMAN STATUS.—Each artificial intelligence chatbot made available to users shall—

(A) at the initiation of each conversation with a user and at 30-minute intervals, clearly and conspicuously disclose to the user that the chatbot is an artificial intelligence system and not a human being; and

(B) be programmed to ensure that the chatbot does not claim to be a human being or otherwise respond deceptively when asked by a user if the chatbot is a human being.

(2) DISCLOSURE REGARDING NON-PROFESSIONAL STATUS.—

(A) IN GENERAL.—An artificial intelligence chatbot may not represent, directly or indirectly, that the chatbot is a licensed professional, including a therapist, physician, lawyer, financial advisor, or other professional.

(B) OTHER LIMITATIONS.—Each artificial intelligence chatbot made available to users shall, at the initiation of each conversation with a user and at reasonably regular intervals, clearly and conspicuously disclose to the user that—

(i) the chatbot does not provide medical, legal, financial, or psychological services; and

(ii) users of the chatbot should consult a licensed professional for such advice.

SEC. 6. Prohibition on minor use of AI companions.

If the age verification process described in section 5(b) determines that an individual is a minor, a covered entity shall prohibit the minor from accessing or using any AI companion owned, operated, or otherwise made available by the covered entity.

SEC. 7. Enforcement.

(a) In general.—In the case of a violation of section 5 or 6, or a regulation promulgated thereunder, the Attorney General may bring a civil action in an appropriate district court of the United States to—

(1) enjoin the violation;

(2) enforce compliance with section 5 or 6, or the regulation promulgated thereunder; or

(3) obtain civil penalties under subsection (c) of this section, restitution, and other appropriate relief.

(b) Attorney General powers.—

(1) INVESTIGATORY POWERS.—For the purpose of conducting investigations or bringing enforcement actions under this section, the Attorney General may issue subpoenas, administer oaths, and compel the production of documents or testimony.

(2) RULEMAKING.—The Attorney General may promulgate any regulations necessary to carry out this Act.

(c) Civil penalties.—

(1) IN GENERAL.—Any person who violates section 5 or 6, or a regulation promulgated thereunder, shall be subject to a civil penalty not to exceed $100,000 for each violation.

(2) SEPARATE VIOLATIONS.—Each violation described in paragraph (1) shall be considered a separate violation.

(d) State enforcement.—In any case in which the attorney general of a State has reason to believe that an interest of the residents of that State has been or is threatened or adversely affected by the engagement of any covered entity in a violation of this Act or a regulation promulgated thereunder, the State, as parens patriae, may bring a civil action on behalf of the residents of the State in a district court of the United States or a State court of appropriate jurisdiction to obtain injunctive relief.

(e) Relationship to State laws.—Nothing in this Act or an amendment made by this Act, or any regulation promulgated thereunder, shall be construed to prohibit or otherwise affect the enforcement of any State law or regulation that is at least as protective of users of artificial intelligence chatbots as this Act and the amendments made by this Act, and the regulations promulgated thereunder.

SEC. 8. Effective date.

This Act and the amendments made by this Act shall take effect on the date that is 180 days after the date of enactment of this Act.