
2,212 studies used disappearing federal climate justice tools. We analyzed them all.

Here’s what the data tells us about how they’re used and why researchers rely on them.


Animated collage of papers and headlines related to environmental justice research.
Illustration by Elham Ali/PEDP

This story is part of Made Possible, a PEDP series about the people whose research, teaching, and advocacy depend on federal environmental justice tools to serve their communities.

Key points

  • Publications using five major federal environmental justice (EJ) tools dropped sharply in 2024—months before formal rollback orders—amid data outages, grant freezes, and leadership turnover.
  • These tools supported research across health care, education, government, and nonprofit sectors on issues like air quality, housing security, and disaster risk.
  • Rebuilding them is costly and slow, especially for underfunded public universities, making it harder to train future researchers, hold agencies accountable, and design effective climate policy.

The hottest place in America in June 2021 wasn’t Arizona or Texas. It was Oregon.

That month, Portland—typically known for its mild summers—reached 116 degrees, hot enough to melt power cables and buckle highways. Sixty-nine people died inside their homes, smothered by a heatwave so unprecedented that scientists called it “virtually impossible” without climate change.

The irony felt unbearable for Dr. Jola Ajibade, an associate professor of Environmental and Climate Justice at Emory University in Atlanta who previously taught at Portland State University in Oregon. She grew up under the humid 91-degree skies of Lagos, Nigeria, and had moved halfway across the world expecting gentler summers, only to find herself experiencing climate change’s impacts firsthand in her own living room.

The phenomenon, known as a heat dome, forms when a ridge of high pressure traps hot air over an area, turning entire cities into ovens.

Her air conditioner had given out, and the walls of her home in Beaverton, Oregon, radiated heat like a kiln.

“I honestly thought I was going to have a heat stroke,” Dr. Ajibade said. “I was scared for my family.”

To see who bore the worst of that heat, she turned to the Environmental Justice Screening tool, or EJScreen, a now-defunct federal tool for identifying where environmental and social inequities overlap. The pattern became clear. Environmental justice (EJ) is the idea that everyone deserves to live and thrive in safe, healthy environments and to have an equal voice in the decisions that affect them.

Why this matters

Public environmental data turns lived experience into evidence that can, among other things, justify cleaner air investments, stronger flood protections, and cooling centers in heat-burdened neighborhoods. When tools such as EJScreen disappear, so does the proof communities need to protect themselves and have their voices represented in science and policy.

After the heat dome, Dr. Ajibade set out to understand why the same neighborhoods kept suffering first and worst. In her study, “Disaster vulnerability hotspots in the Portland metro-region,” she used EJScreen data to map neighborhoods where social vulnerability and long-standing disparities leave some communities less resilient than others.

The results pointed to a familiar storyline of inequity. Rockwood, Cully, and Canby, communities in the Portland metro region, are burdened by poverty, single-parent households, aging infrastructure, and the generational reach of redlining, the discriminatory practice of denying mortgages and loans to Black and Brown residents. She called them “resilience gaps” because these are communities where people face the greatest risks but have the fewest resources to bounce back. The same year, poor air quality from wildfire smoke and a severe snowstorm that cut power for days hit those communities hardest.

Dr. Ajibade’s paper is one of 2,212 publications we analyzed at the Public Environmental Data Partners (PEDP) to understand how researchers across disciplines use federal environmental justice tools. These publications—spanning academic journals, government reports, and community projects—relied on one or more of five federal datasets:

  • The Climate and Economic Justice Screening Tool (CEJST), which defines disadvantaged communities for Justice40 (a Biden-era directive to federal agencies requiring that 40 percent of the benefits of climate and clean-energy projects be directed to disadvantaged communities).
  • The Environmental Protection Agency’s (EPA) EJScreen, which identifies disproportionate pollution burdens based on environmental and demographic data.
  • The Environmental Justice Index (EJI), which ranks counties by their cumulative environmental and health inequities.
  • FEMA’s Future Risk Index (FRI) and the Climate Risk Viewer (CRV), which project where hazards will intensify.

By late 2024, with the Trump administration incoming, PEDP volunteers began archiving these tools in anticipation of their removal. Their fears became reality in early 2025 when federal environmental websites and dashboards vanished, leaving researchers, agencies, and communities without crucial tools. Working with partners like the Environmental Policy Innovation Center and the Environmental Data & Governance Initiative, PEDP mirrored datasets, rebuilt interfaces, and restored public access at screening-tools.com.

By the numbers

Our analysis found that publications citing federal EJ tools began falling in 2024, months before the Trump administration formally rolled back Justice40 and other climate programs in 2025.

Even before executive orders landed, the data infrastructure was already unraveling.

EJScreen, the most widely used tool, peaked in 2024 with more than 340 studies before plunging to 196 the next year. CEJST followed a similar arc, while FEMA’s newer tools never gained full traction before declining.

Policy alone didn’t cause the usage drop. Late 2024 brought a wave of uncertainty that rippled through scientific institutions. Federal data users began reporting outages for the official CEJST site and expressed concerns about the tool’s reliability. CEJST’s download portal failed weeks before it was removed from the White House website. The disappearance of tools and websites interrupted scientific work already in progress.

An Office of Management and Budget memo (M-25-13) paused climate-related grants, halting hundreds of research projects nationwide. The EPA alone suspended 477 environmental justice grants worth $1.7 billion and placed nearly 170 staff on leave; the Administration dissolved the White House Environmental Justice Advisory Council. These actions indefinitely paused community projects in Oregon, including new Climate Resilience Hubs and pollution reduction efforts. The EPA and U.S. Department of Energy both sent “stop work” notices to grant recipients in late January 2025, instructing them to cease spending on projects funded by Inflation Reduction Act climate programs until further notice.

Partnerships froze. Planned evaluations of Justice40 benefits disappeared.

It wasn’t the first time researchers feared what would happen to climate and EJ datasets under a Trump Administration. In 2016, groups like Data Refuge scrambled to back up climate databases ahead of the first transition. Similar “defensive archiving” returned in 2024, as scientists mirrored datasets to independent servers before websites went dark.

Ninety-four percent of publications cluster around EJScreen and CEJST, unsurprising given the two tools’ longer availability.

Newer tools like FEMA’s Future Risk Index show smaller but notable footprints. The circle’s tiny, but it’s there.

“Climate Change Risk for LGBT People in the United States” used FEMA’s Future Risk Index (FRI) to map climate hazards with demographic data.

Dr. Ari Shaw of UCLA’s Williams Institute noted that the FEMA composite risk score “linked structural risk metrics to lived realities,” underscoring why inclusive disaster planning matters.

Most publications applied or mentioned federal EJ tools. Some used them for statistical modeling or geospatial overlays; others critiqued or adapted their indices to fit local contexts.

Among them was “Community Outbreak Preparedness Index (COPI),” which applied the Environmental Justice Index (EJI) to track community vulnerability to communicable disease.

“Having a national-scale tool like EJI makes our work more efficient and consistent,” said Dr. Ghosh, one of the study’s authors. “Without it, we couldn’t easily account for community vulnerability.”

Across all publications, health, governance, equity, and climate risk were the dominant topics. Publications also spanned sectors from health care to nonprofits.

CEJST guided regional equity planning too.

Urban Institute researcher Dr. Christina Plerhoples Stacy said the tool “lent credibility and weight” to local climate investment decisions in Rochester, New York.

Top publishers included ProQuest, OSTI.gov, HeinOnline, Elsevier, and Springer Nature, reflecting a mix of academic, government, and grey literature sources.

What we heard

Through interviews with researchers, we heard that federal EJ tools lower the barrier to doing science and research that matters. They provide reliable, uniform, and easy-to-use environmental and demographic data. Without them, rebuilding equivalent datasets would cost billions of dollars and take years.

“There’s just no way we could have done most aspects of our work without publicly accessible data. Ninety percent of what we did relied on those datasets,” Dr. Ajibade said. “Before tools like EJScreen, people had to build data from scratch or patch it together. It’s time-consuming, expensive, and often not comparable across places. These tools let us work at census-tract and county levels without spending months and money we don’t have.”

For many researchers, public tools and datasets are the great equalizer. They narrow the gap between large, well-funded universities and smaller public colleges that lack the staff, computing power, or budgets to build datasets from scratch. In 2023, the National Center for Science and Engineering Statistics reported that 20 universities received over one-third of all federal research and development funding, while the remaining 640 shared the rest.

A recent analysis of proposed funding cuts found that public institutions will bear the brunt of lost research grants. “Publicly accessible datasets level the playing field,” said Dr. Ajibade. “At a public university, we don’t have the internal funds wealthier private institutions have.”

These tools also shape how students learn and how early-career scientists build their skills. Across the country, professors use them to teach data literacy and replication, core tenets of scientific integrity. “With EJScreen, I can show students exactly where to get the data and how to repeat the analysis in Texas, in Florida, wherever,” said Dr. Ajibade.

Before it was taken down in early 2025, EPA’s EJScreen Office Hours and Training Portal offered free workshops for educators and researchers. Those sessions vanished when the site went dark. “We’re equipping the next generation,” Dr. Ajibade said. “We want people involved in the science. These tools connect people to what’s real.”

For early-career researchers, EJ tools connect community experiences with empirical evidence. Brandon Lewis, a doctoral candidate at Yale University, used EJScreen to study air pollution from North Carolina’s hog farms. At a local conference, he saw pictures of residents who wore masks to block the odor from nearby Concentrated Animal Feeding Operations. “EJScreen gave me a framework to link what people described—their health issues, the smell in the air—to quantifiable exposure data,” he said. “Take that away, and it’s harder for communities to back up their experiences with numbers.”

The power of maps to confirm lived inequity isn’t new. In 1987, the Toxic Wastes and Race report used national maps to show hazardous sites clustered in poor communities of color, visual proof of what affected residents had long known. The removal of these mapping tools erases that visibility. EJScreen had become central to environmental justice advocacy; as recently as 2023, the Center for Public Integrity advised communities to cite EJScreen data in discrimination complaints to the EPA. Community members successfully used the tool to strengthen environmental justice claims.

Researchers describe these tools as a public service and a measure of government accountability. “If we want the U.S. to retain leadership in science and environmental work, these public datasets are essential,” Dr. Ajibade said. For more than a century, the federal government has published environmental data through public reports, congressional testimony, and now open-data websites. That transparency is the foundation of what scholars call “data justice,” the idea that information about the environment belongs to the people it affects. Open data allows journalists and researchers to verify findings, hold agencies accountable, and keep policy grounded in evidence.

These tools also shape civic dialogue and action. Teachers use them to spark kitchen-table conversations about pollution and health. Journalists cite the tools to bring attention to environmental justice disparities. In 2024, the Union of Concerned Scientists used EJScreen to identify Tyson Foods meatpacking plants discharging waste into historically overburdened communities, prompting formal recommendations and a comment that urged the EPA to strengthen water protections against contamination from the meat and poultry industry.

“These tools help us talk, even when we disagree,” said Babu Gounder, a doctoral candidate at the University of Illinois Urbana-Champaign. His research, “The (Un)Just Transition in Ecomodernist Climate Policy,” critiqued CEJST for excluding race, a factor central to defining equity in federal climate investments, and called for continued assessment of the tool. “The tool is a step in the right direction,” he said. “But if we ignore unequal responsibility for emissions or omit race, we risk reinforcing the very disparities we’re trying to fix. The point isn’t to discard the tool, but to evolve it.”

“We don’t tell students what to think,” Dr. Ajibade said. “We give them the tools. They map cancer rates, toxic sites, vulnerabilities and they see the injustices in their own communities.”

When Dr. Ajibade learned that PEDP had restored those tools, she called it “a beacon of hope.” “I shared the link with colleagues immediately,” she said. “Last semester we couldn’t use EJScreen in the classroom because we didn’t know where to find it. Now I can start teaching my students again with it.”

Data and methods

You can access the source data for this article in this GitHub repository. We last refreshed it on September 15, 2025.

To find publications, we used a tool called Publish or Perish, which pulls citation data from multiple sources. For each climate tool, we wrote search queries using both its full name and its acronym. We checked results in Web of Science and Scopus, but we chose Google Scholar as the main source because it includes both academic and grey literature, such as policy reports, theses, and government documents, that the other databases don’t cover. Searches covered each tool’s release year through 2025 (e.g., CEJST ≈ 2022–2025). Since the tools launched in different years, we didn’t treat raw citation counts as a tool’s “impact score.”
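As a rough illustration of that query strategy, the sketch below pairs each tool’s full name with its acronym in a single OR query per tool. Both the exact strings we fed to Publish or Perish and the long-form names shown here are approximations, not the project’s verbatim queries.

```python
# Hypothetical reconstruction of the query pattern: match a study if it
# cites either a tool's full name or its acronym. The full names below
# are approximations, not the project's exact search strings.
TOOLS = {
    "CEJST": "Climate and Economic Justice Screening Tool",
    "EJScreen": "Environmental Justice Screening and Mapping Tool",
    "EJI": "Environmental Justice Index",
    "FRI": "Future Risk Index",
    "CRV": "Climate Risk Viewer",
}

# Build one OR-query per tool, quoting each phrase for exact matching.
queries = {
    acronym: f'"{full_name}" OR "{acronym}"'
    for acronym, full_name in TOOLS.items()
}

for acronym, query in queries.items():
    print(f"{acronym}: {query}")
```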

In total, we collected 2,212 studies. We then drew a 10% stratified random sample (n = 221) by tool. To ensure sufficient representation of smaller tools (e.g., FRI, CRV), we allocated a minimum of 10 studies per tool, with the remaining cases distributed proportionally to corpus size. This oversampling of smaller tools supports tool-specific agreement checks and descriptive comparisons, while slightly under-sampling the largest tools.
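The allocation logic can be sketched as follows, assuming a pandas DataFrame with one row per study and a “tool” column; the column name, the random seed, and the exact rounding are illustrative assumptions rather than our production code.

```python
import pandas as pd

MIN_PER_TOOL = 10       # floor for small tools such as FRI and CRV
SAMPLE_FRACTION = 0.10  # 10% overall sample

def stratified_sample(corpus: pd.DataFrame, seed: int = 42) -> pd.DataFrame:
    """Draw a stratified sample with a per-tool minimum allocation."""
    target = round(len(corpus) * SAMPLE_FRACTION)  # ~221 of 2,212 studies
    counts = corpus["tool"].value_counts()

    # Guarantee each tool its floor, capped by how many studies it has.
    floors = counts.clip(upper=MIN_PER_TOOL)

    # Spread the remaining slots proportionally to each tool's share.
    remaining = target - floors.sum()
    extra = (counts / counts.sum() * remaining).round().astype(int)
    per_tool = (floors + extra).clip(upper=counts)

    parts = [
        corpus[corpus["tool"] == tool].sample(n=int(n), random_state=seed)
        for tool, n in per_tool.items()
    ]
    return pd.concat(parts, ignore_index=True)
```

Rounding can leave the final total a study or two off the exact 10 percent target, which is harmless for a descriptive sample.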

We manually categorized each sampled study by topic and by type of tool use (applied, evaluated, critiqued, or mentioned). Two collaborators independently categorized the same sample, and we measured inter-rater agreement with Cohen’s kappa (κ = 0.16 and 0.32 for the two coding schemes, respectively), a statistic that captures how much agreement exists beyond chance. Although agreement was modest, the disagreements revealed meaningful nuances in how we each interpreted the topic and tool-use categories. These differences were instructive rather than problematic: they helped us clarify definitions, strengthen our shared understanding, and refine the coding guidebook before we extended the coding from the labeled subset to all studies. After cleaning false positives, we ended up with 1,826 publications.
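For readers unfamiliar with the statistic, here is a minimal sketch of how an agreement score like ours can be computed with scikit-learn; the labels are invented for illustration and are not drawn from our coding sheets.

```python
from sklearn.metrics import cohen_kappa_score

# Invented example: two coders classifying the same five studies
# by type of tool use.
coder_a = ["applied", "mentioned", "critiqued", "applied", "evaluated"]
coder_b = ["applied", "applied", "critiqued", "mentioned", "evaluated"]

# Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
# agreement rate and p_e is the agreement expected by chance from each
# coder's label frequencies. 1.0 is perfect agreement; 0 is chance level.
print(f"kappa = {cohen_kappa_score(coder_a, coder_b):.2f}")
```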

We also manually reviewed 387 publishers and sorted them into nine categories, such as commercial publishers, university presses, and government repositories. We based each decision on trusted sources such as the publisher’s own site, university or government libraries, or society pages. If we determined that no real publisher had released the study, we marked it as “None” and explained why. Short notes with links are included for every entry. Two collaborators independently categorized the same sample, and inter-rater agreement was high (κ = 0.90).

Like any dataset, this one has its limits. Major publishers like Elsevier and Springer Nature take nearly a year from submission to publication, so when citations and publications dip after an administration change, some portion of the dip may be attributed to the lag between when research is done and when it finally appears in print. The dataset is also just a snapshot in time. Our last data pull was on September 15, 2025, so publications from the rest of the year are not captured. Additionally, results are shaped by search terms. If a study used one of the five tools without naming it directly in the title, abstract, body of the text, or references, it may not appear in the dataset. Finally, we confined our scope to five federal climate and environmental justice tools, which means the analysis reflects only a slice of the larger evidence infrastructure.

Last updated: October 28, 2025.

Acknowledgements

We are grateful to the authors of the publications featured in this analysis for sharing and trusting us with their stories and insights. Your work continues to advance both science and service to your communities.

Our thanks also go to the volunteers at Public Environmental Data Partners (PEDP) whose dedication, perspective, and hard work made this inaugural piece possible.

Credits

Reporting and Data Visualization

Elham Ali

Data Analysis and Quality Assurance

Elham Ali, Mary Beth Bosco, Cole Reardon, Chris Zuppa

Editing

Chris Zuppa, Mary Beth Bosco