AI Legislation Tracking Portal Launched

As a tool for researchers, lawmakers, journalists and the public, the CNTR AISLE Portal provides analysis of state- and federal-level AI bills pending across the U.S.

PROVIDENCE, R.I. [Brown University] - As the federal government and state legislatures across the nation grapple with how to regulate rapidly evolving AI technologies, a team of Brown University researchers has developed a tool to help policymakers, journalists and the public track and better understand AI legislation.

The CNTR AISLE Portal is a public database that aggregates AI legislation pending at the federal level and in all 50 states, and provides analysis by trained evaluators on what aspects of AI policy those bills cover. The portal was developed by a team of faculty, students and staff at the Center for Technological Responsibility, Reimagination and Redesign (CNTR), a project in Brown's Data Science Institute that aims to align computer science education and research with the needs of end users and the broader public.

"Over the last three years, over 1,000 AI-related bills have been introduced in the U.S.," the AISLE team wrote in launching the portal. "With AISLE, we will help the public, journalists, researchers and policymakers identify key policy trends and assess the maturity of these proposals."

The AISLE Portal includes a bill library that aggregates all AI-related bills listed in a larger legislative database called LegiScan. A subset of the bills in the library has been evaluated by the AISLE policy team - a group of 17 undergraduates and five graduate students trained to assess bills according to the AISLE framework. The framework is a set of 159 questions designed to assess the extent to which each bill addresses any of six general categories: accountability and transparency, data protection, bias and discrimination, education, synthetic content, and the labor force.

For each bill evaluated, the portal includes a "bill profile" that summarizes its content according to the AISLE framework. The profile includes the percentage of questions related to each category that the evaluator answered in the affirmative. There's also a graphic illustrating each bill's areas of impact. The library is searchable by keyword and sortable by state to help users find bills that may affect their communities or areas of interest.
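The article does not publish the portal's code, but the scoring it describes - each framework question belongs to one of six categories, and a bill profile reports the share of questions per category answered in the affirmative - can be sketched as follows. The category names come from the article; the function name, data shapes and example answers are invented for illustration.

```python
from collections import Counter

# The six categories named in the article.
CATEGORIES = [
    "accountability and transparency",
    "data protection",
    "bias and discrimination",
    "education",
    "synthetic content",
    "labor force",
]

def bill_profile(answers):
    """Compute per-category affirmative percentages for one bill.

    answers: list of (category, answered_yes) pairs, one per
    framework question the evaluator answered for this bill.
    """
    total = Counter(cat for cat, _ in answers)          # questions asked per category
    yes = Counter(cat for cat, ok in answers if ok)     # affirmative answers per category
    return {
        cat: round(100 * yes[cat] / total[cat], 1) if total[cat] else 0.0
        for cat in CATEGORIES
    }

# Hypothetical bill evaluated on four questions in two categories.
example = [
    ("data protection", True),
    ("data protection", False),
    ("synthetic content", True),
    ("synthetic content", True),
]
profile = bill_profile(example)
# data protection scores 50.0, synthetic content 100.0, the rest 0.0
```

A profile like this is what the portal's "areas of impact" graphic would visualize: a vector of six percentages, one per category, which also makes bills directly comparable for the keyword and state searches the library supports.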

Suresh Venkatasubramanian, a professor of computer science and data science at Brown who leads CNTR, said the team was intentional about developing objective standards to evaluate each bill.

"The goal here is not for us to say which bills we think are good and which ones are bad," Venkatasubramanian said. "Instead, we want to provide an easily digestible format for people to see what kinds of topics each bill covers and better understand where policymakers are in terms of addressing developments in AI."

To date, the team has evaluated around 100 bills, and they plan to add more analyses on a rolling basis.

Ultimately, they hope to evaluate enough bills to tease out large-scale trends in AI legislation and governance. The team has already started to develop those kinds of insights based on a preliminary version of the framework developed last year; some are included in a report issued in January in conjunction with the American Civil Liberties Union.

"With the analysis data that AISLE has provided, it is possible to understand which topics come in and out of the spotlight in each year's legislative session, such as the rise in attention paid to the consequences of AI-generated synthetic content," Venkatasubramanian said of the report's results. "We were also able to analyze similarities between bills to understand how ideas spread and diffuse across different states, and how 'template' bills influence how legislators draft legislation."

The team is hopeful that more trends and insights will emerge as additional bills are evaluated under the latest version of the framework.

"Over time, I'm hoping this will become something that's useful for policymakers - legislators and legislative staff who are working to understand this landscape and actually writing bills," said Tomo Lazovitch, an assistant professor of the practice in AI governance and policy at Brown who leads the AISLE policy team. "I think it could be helpful when drafting legislation to be able to say, 'Let's look at how other states are handling this,' and be able to search the portal and find related bills."

Wilber Sean Anterola, a first-year undergraduate at Brown, said that working as a member of the AISLE policy team has been a valuable educational experience.

"It's been a great experience for me personally in expanding my perspective," Anterola said. "I have been very interested in leaning into tech policy, and this system has been a way for me to get in. We haven't had any centralized system for keeping track of all these bills, so I think this is going to be a great thing for students, researchers and the general public."

The work of the CNTR AISLE project is just getting started, and the team is planning to add new features to the portal in the coming weeks. But as 2026 legislative sessions open around the country, the team hopes the portal will be useful to a wide array of users.

"When we started work on AISLE, we hoped that the system we were building would be useful to policymakers, the press and the public," Venkatasubramanian said. "But as our team has grown, and as the work has developed, I've come to realize how invaluable AISLE is as an educational experience for the many students in technical and non-technical disciplines interested in AI policy. It has also become clear that AISLE lays the foundation for long-term scholarly research on how efforts to shape this critical and transformative technology are evolving over time."

/Public Release. This material from the originating organization/author(s) may be point-in-time in nature and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions and conclusions expressed herein are solely those of the author(s).