Last action was on September 29, 2025.
Current status: Read twice and referred to the Committee on Commerce, Science, and Transportation.
View Official Bill Information at congress.gov
This Act may be cited as the "Management of Individuals’ Neural Data Act of 2025" or the "MIND Act of 2025".
It is the sense of Congress that—
(1) - an individual’s neural data and other related data can be monetized and used to shape individual behavior, emotional states, and decision making in ways existing laws do not adequately address;
(2) - vertical corporate integration of neurotechnology, artificial intelligence systems, wearable devices, digital platforms, and global data infrastructure may create interconnected systems with insufficient transparency, accountability, or user control regarding the use of such data;
(3) - such concentration increases the risk of behavioral influence, cognitive manipulation, erosion of personal autonomy, and the exacerbation of existing social and economic disparities, particularly in the absence of enforceable privacy protections, including protections of neural data and other related data;
(4) - the absence of a comprehensive Federal standard for the collection, processing, and international transfer of such data presents risks to civil liberties and to national security, given the dual-use potential of and foreign interest in the data assets of the United States;
(5) - strong protections for such data are essential to safeguard privacy, prevent discrimination and exploitation, and ensure that innovation in neurotechnology applications proceeds with accountability and public trust; and
(6) - while this Act focuses primarily on neural data, related biometric and behavioral data that can reveal mental states may pose similar risks and warrant comparative analysis to identify broader privacy gaps.
In this Act:
(1) Artificial intelligence - The term "artificial intelligence" has the meaning given such term in section 5002 of the National Artificial Intelligence Initiative Act of 2020 (15 U.S.C. 9401).
(2) Commission - The term "Commission" means the Federal Trade Commission.
(3) Federal agency - The term "Federal agency" has the meaning given the term "agency" in section 551 of title 5, United States Code.
(4) Neural data - The term "neural data" means information obtained by measuring the activity of an individual’s central or peripheral nervous system through the use of neurotechnology.
(5) Neurotechnology - The term "neurotechnology" means a device, system, or procedure that accesses, monitors, records, analyzes, predicts, stimulates, or alters the nervous system of an individual to understand, influence, restore, or anticipate the structure, activity, or function of the nervous system.
(6) Other related data - The term "other related data"—
(A) - means biometric, physiological, or behavioral information that does not directly measure the neural activity of an individual’s central or peripheral nervous system, but can be processed, analyzed, or combined with other data to infer, predict, or reveal cognitive, emotional, or psychological states or neurological conditions; and
(B) - may include heart rate variability, eye-tracking patterns, voice analysis, facial expression recognition, sleep patterns, or other signals derived from consumer devices, wearables, or biosensors.
(a) Study and report
(1) Study
(A) In general - The Commission shall conduct a study on—
(i) - what additional authorities, if any, the Federal Government needs to regulate neural data and other related data that can reveal an individual’s mental state or activity, and to establish appropriate privacy protections for individuals in the United States;
(ii) - best practices for privacy and data security for the private sector to protect such data; and
(iii) - the extent to which existing laws, regulations, and governing frameworks, including the Health Insurance Portability and Accountability Act of 1996 (Public Law 104–191), govern the use, storage, processing, portability, and privacy of such data, any gaps in law that should be addressed, and potential additional protections for such data that fall outside the scope of such Act.
(B) Consultation - In conducting the study described in subparagraph (A), the Commission shall consult with—
(i) - the Director of the Office of Science and Technology Policy;
(ii) - the Commissioner of Food and Drugs;
(iii) - other relevant Federal agencies determined appropriate by the Commission; and
(iv) - representatives of the private sector, academia, civil society, consumer advocacy organizations, labor organizations, patient advocacy organizations, and clinical research stakeholders, including medical and health care professionals.
(2) Report - Not later than 1 year after the date of enactment of this Act, the Commission shall—
(A) - submit to Congress a report on the study conducted under paragraph (1) that—
(i) - includes the information described in subsection (b); and
(ii) - describes a regulatory framework that maximizes opportunities for responsible innovation in neurotechnology while minimizing the risks of harm that arise from such innovation, such as discrimination, profiling, surveillance, manipulation, and the misuse of neural data and other related data in employment, healthcare, financial services, education, commerce, and public life; and
(B) - publish the report on the website of the Commission.
(b) Report contents - The report described in subsection (a)(2) shall include—
(1) - an analysis on—
(A) - the collection, processing, storage, sale, and transfer of neural data and other related data; and
(B) - all relevant uses of neurotechnology, neural data, and other related data for understanding, analyzing, and influencing human mental states and behavior;
(2) - a summary of the ethical, legal, and regulatory landscape surrounding neural data and other related data that can reveal an individual’s mental state or activity, including any existing guidelines related to—
(A) - the collection of such data;
(B) - consent for the collection, use, and transfer of such data;
(C) - individual rights relating to such data;
(D) - predictive modeling; and
(E) - using such data to infer or influence behavior;
(3) - an assessment of—
(A) - how neural and other related data is collected, processed, and transferred in interstate commerce, and the benefits and risks associated with the collection and use of such data, including how such data may serve the public interest, improve the quality of life of the people of the United States, or advance innovation in neurotechnology and neuroscience; and
(B) - how the use of such data may pose risks to individuals, including vulnerable populations, across different contexts or use cases;
(4) - recommendations for the categorization and oversight of neural data and other related data uses, including—
(A) - a framework that—
(i) - distinguishes categories of such data, classifying such data based on both the potential for beneficial use cases (including medical, scientific, or assistive applications), and the potential for individual, societal, or group-level harm arising from misuse;
(ii) - describes the properties of such data based on its capacity to directly or indirectly identify an individual or to reveal or infer sensitive personal information about an individual; and
(iii) - suggests corresponding governance requirements such as heightened oversight, stricter consent standards, prohibited use cases regardless of individual consent, enhanced access restrictions, and cybersecurity protections;
(B) - standards for computational models of the brain and guidance on assessing harms in contexts where such data is integrated with artificial intelligence or used as part of a system designed to influence behavior or decision making;
(C) - an analysis of whether, and if so how, individuals may be exposed to unfair, deceptive, or coercive trade practices through the misuse of neural data and other related data across different environments, and recommendations for safeguards to prevent such harms; and
(D) - recommendations for categorizing certain applications of neural data and other related data, or certain practices regarding such data, as impermissible, such as those designed to manipulate behavior or erode privacy with respect to an individual’s mental state or activity;
(5) - an examination of how the application of artificial intelligence to neural and other related data that can reveal an individual’s mental state or activity may reshape the risks, oversight demands, and ethical considerations associated with such data;
(6) - recommendations for consumer transparency, consent frameworks, and neural data and other related data use restrictions, such as—
(A) - limiting such data use to only clearly disclosed purposes;
(B) - restricting the resale of such data to third parties or the use of such data for individual profiling or targeted advertising;
(C) - the use of separate, conspicuous consent mechanisms for the use of such data in developing or deploying computational models of the brain;
(D) - the public disclosure of—
(i) - intended uses for such data, sharing practices, and artificial intelligence applications; and
(ii) - policies related to the retention and deletion of such data; and
(E) - prohibited use cases, regardless of individual consent;
(7) - recommendations regarding applications of neural data and other related data in specific areas, including—
(A) - sectors or practices that raise concerns about privacy, manipulation, discrimination, inequality, or long-term harm, such as—
(i) - employment practices, such as in hiring, surveillance, or performance evaluation;
(ii) - educational settings and other settings involving children under the age of 13 and teens;
(iii) - insurance, financial, and housing services;
(iv) - neuromarketing and behavioral shaping, including the targeting of consumers;
(v) - commercial surveillance;
(vi) - monetization models, such as data brokers, that aggregate or sell neural data and other related data;
(vii) - the transfer of neural data and other related data through acquisitions, mergers, or bankruptcy proceedings;
(viii) - law enforcement and the criminal justice system; and
(ix) - sectors where algorithmic recommendation or design patterns intentionally amplify addictive use or behavioral manipulation;
(B) - how existing Federal statutes enforced by the Commission, including the Federal Trade Commission Act (15 U.S.C. 41 et seq.) and other consumer protection laws, apply to neural data and other related data; and
(C) - whether there are regulatory gaps in protecting the privacy of children and teens, including the applicability of the Children’s Online Privacy Protection Act of 1998 (15 U.S.C. 6501 et seq.) and related laws to neural data and other related data;
(8) - an analysis of the potential security risks associated with the collection, use, and transfer of neural data and other related data, including—
(A) - an assessment of current cybersecurity and data protection requirements applicable to entities that collect, process, or store neural data or other related data, including any gaps in such requirements where such entities fall outside existing Federal standards, such as the Health Insurance Portability and Accountability Act of 1996 (Public Law 104–191);
(B) - an assessment of interagency review models to determine whether certain exports, public releases, or commercial uses of neurotechnologies, including their component parts and integration with artificial intelligence systems, should be subject to restrictions or enhanced controls;
(C) - an examination of foreign investment risks in neurotechnology firms;
(D) - recommendations on actions the Government and nongovernment actors can take to ensure transparency and due diligence in international partnerships involving such data;
(E) - supply chain risks involving components used in neurotechnology that are acquired from foreign countries; and
(F) - the implications of storing and processing such data locally versus in cloud environments;
(9) - recommendations for incentive structures that promote ethical innovation in neurotechnology while prioritizing consumer protection, and descriptions of how such structures can be aligned with existing regulatory and certification pathways or requirements, such as the development of—
(A) - voluntary standards tied to business incentives, such as research and development tax credits and expedited regulatory pathways;
(B) - financial support for responsible scientific inquiry and innovation in neurotechnology, conducted in ethically governed and controlled environments, with safeguards to prevent misuse or harmful applications;
(C) - regulatory sandbox mechanisms to enable early-stage neural data applications to be tested with agency oversight, informed consent, and structured risk review;
(D) - policies that promote long-term support for users of brain-computer interfaces, such as interoperability standards and post-trial maintenance practices;
(E) - competitive incentives, such as procurement preferences for companies that meet specified standards relating to the use of neurotechnology;
(F) - public-private partnerships to develop open standards and ethical practices regarding the treatment of neural data and other related data; and
(G) - ways the Centers for Medicare & Medicaid Services and the Food and Drug Administration can coordinate on the use and approval of neurotechnology to reduce reimbursement and coverage barriers;
(10) - a proposed framework for enforcement mechanisms, remedies, and penalties for the misuse of, gross negligence regarding the use of, and unauthorized collection, use, transfer, or disclosure of neural data and other related data; and
(11) - other analysis and recommendations determined appropriate by the Commission.
(c) Annual updates - Not later than 1 year after the date the Commission submits the report to Congress under subsection (a), and not less frequently than annually thereafter, the Commission shall publicly update the findings in such report to—
(1) - reflect evolving advancements in neurotechnology, neural data and other related data use cases, and the associated risks involved with such advancements and use cases; and
(2) - assess whether additional reports or updates to any guidance are necessary to ensure that privacy, particularly as it relates to neural data and other related data, continues to be protected.
(d) Authorization of appropriations - There is authorized to be appropriated $10,000,000 for purposes of carrying out this section.
(a) Guidance to Federal agencies
(1) In general - Not later than 180 days after the Commission submits the report described in section 4(a)(2), the Director of the Office of Science and Technology Policy, in consultation with the Commission and the Director of the Office of Management and Budget, shall develop guidance, using such report to inform such guidance, regarding the procurement and operational use by Federal agencies of neurotechnology that collects, uses, procures, or otherwise processes neural data or other related data. Such guidance shall identify—
(A) - prohibited, permissible, and conditionally permitted use cases of such neurotechnology that are consistent with such report;
(B) - technical, procedural, and ethical safeguards regarding each use case of such neurotechnology; and
(C) - requirements for transparency, limitations regarding the purposes for which such neurotechnology can be used, individual opt-in consent mechanisms regarding the use of such neurotechnology, and protections for the privacy of the people of the United States.
(2) Binding guidance - Not later than 60 days after the Director of the Office of Science and Technology Policy develops the guidance under paragraph (1), the Director of the Office of Management and Budget shall issue binding implementation guidance to each Federal agency pursuant to the guidance developed under paragraph (1).
(b) Prohibition
(1) In general - The head of a Federal agency may not procure or operate any neurotechnology that collects, uses, procures, or otherwise processes neural data in a manner inconsistent with the guidance issued under subsection (a)(2).
(2) Effective date - Paragraph (1) shall take effect on the date that is 1 year after the date on which the Director of the Office of Management and Budget issues guidance in accordance with subsection (a)(2).