Rachel Peterson announced on March 28, 2026, that Meta Platforms will finance 10 gas-fired power plants to power its expanding Louisiana AI campus. The company reached a final agreement with New Orleans-based utility Entergy to build and operate seven new facilities, supplementing three previously approved units. The plants will support the Hyperion AI data center complex in rural Richland Parish. Scaling operations to this level requires a vast infusion of energy, and the new plants will provide 7.5 gigawatts of capacity, a roughly 30 percent increase over Louisiana's entire grid capacity. Peterson, Meta's vice president for data centers, said the project ensures that other ratepayers will not subsidize these infrastructure costs.
Expansion plans for the Hyperion site now put total development costs at $27 billion. The initial December 2024 proposal covered 2,250 acres, but the company quietly acquired another 1,400 acres earlier this year. A joint venture with Blue Owl Capital is the financial backbone of the multi-phase hub. Mark Zuckerberg has described the campus footprint as comparable to a meaningful portion of Manhattan, and the facility will draw enough electricity to power more than 5 million homes. Regulators at the Louisiana Public Service Commission must still grant final approval for the seven newest gas plants.
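As a rough sanity check, the capacity figures quoted above are mutually consistent. The following sketch uses only numbers stated in the article; the per-home load and implied statewide capacity are derived quantities for illustration, not official figures.

```python
# Back-of-envelope check of the capacity figures quoted in the article.
# Inputs come from the article; outputs are derived, illustrative values.

new_capacity_gw = 7.5       # capacity of the ten gas-fired plants
share_of_grid = 0.30        # stated share of Louisiana's grid capacity
homes_powered = 5_000_000   # "more than 5 million homes"

# If 7.5 GW is ~30% of the grid, the implied statewide capacity is:
implied_grid_gw = new_capacity_gw / share_of_grid

# Implied average continuous load per home:
watts_per_home = new_capacity_gw * 1e9 / homes_powered

print(f"Implied statewide grid capacity: {implied_grid_gw:.0f} GW")
print(f"Implied average load per home: {watts_per_home:.0f} W")
```

The result, about 1,500 watts of average load per home, is in line with typical US household consumption, suggesting the "5 million homes" comparison is a standard capacity-to-households conversion rather than an independent estimate.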
Hyperion Infrastructure and Louisiana Energy Grid Impact
Blue Owl Capital and Meta intend for Hyperion to function as a long-term anchor for next-generation AI development. The project includes 2.5 gigawatts of renewable energy capacity and battery storage to complement the gas-fired plants. Despite these green investments, the reliance on fossil fuels underscores the immediate energy demands of training and serving large language models. Data centers of this magnitude consume electricity at rates that strain existing utility frameworks. Entergy has prioritized the Meta partnership to secure long-term industrial demand in the northeastern quadrant of the state. Construction schedules suggest the first of the new plants will come online within 24 months.
Richland Parish officials anticipate the facility will transform the local economic profile. High-voltage transmission lines will cut across the 3,650-acre site to feed the server racks. Cooling the AI hardware will require a substantial water management strategy alongside the electrical demands. Louisiana regulators are weighing the impact of such a large industrial load on regional rate stability. Documents filed with the Public Service Commission indicate that Meta will bear the full cost of financing the seven new generation units. Previous approvals for the first three plants set a precedent for rapid utility expansion in the region.
Privacy Violations in AI Training Pipelines
Privacy concerns now overshadow the physical expansion of the Meta hardware footprint. A federal lawsuit alleges that the company misled users regarding the security of its Ray-Ban Meta smart glasses. Marketing materials from September 2023 claimed the devices were built with privacy at their core. Yet, an investigation by Swedish publications Svenska Dagbladet and Göteborgs-Posten found that human contractors review sensitive user footage. These workers are located in Kenya and process data to train the AI models powering the glasses. Evidence suggests contractors have viewed footage of users in bathrooms and in various states of undress. One Kenyan worker detailed the level of exposure during the training process.
“In some videos you can see someone going to the toilet, or getting undressed. I don’t think they know, because if they knew they wouldn’t be recording,” a contractor for the Meta training pipeline said.
Data sharing for AI training purposes allows the company to route footage to these human reviewers. The lawsuit contends that Meta was not transparent about the extent of human involvement in data processing. Users who opted into data sharing effectively invited remote observers into their private lives. Federal agents and court officials have been spotted wearing the devices in sensitive environments. The risk of capturing financial documents or private conversations is still a central point of the litigation. Meta maintains that the glasses use encryption and that data sharing is a voluntary choice for the consumer. Contractors in Kenya reportedly see everything from living rooms to intimate personal moments.
Liability for Psychological Impact of Algorithms
Legal pressure intensified after a court found Meta and YouTube liable for negligence regarding a minor. The ruling focused on the psychological toll of infinite scroll features and algorithmic recommendations, and damages were awarded for the specific harm caused by these engagement-driven systems. The verdict establishes a new baseline for how social media companies must protect younger users. The court reviewed internal documents detailing the addictive nature of the scrolling architecture and concluded that negligence in system design led to documented psychological trauma for the plaintiff. The ruling creates significant financial and regulatory risk for the Meta business model.
Infinite scroll designs are intended to maximize time spent on the platform. Critics have long argued that these features exploit neurological vulnerabilities in children and adolescents. While Meta has introduced parental controls, the court determined these measures were insufficient to prevent harm. The jury found that the company failed to implement adequate safeguards despite knowing the potential risks. This decision follows years of public testimony regarding the impact of social media on youth mental health. Liability for algorithmic harm may force a redesign of the primary user interfaces for Instagram and Facebook. The legal precedent could trigger a wave of similar lawsuits across the United States and Europe.
Corporate accountability for AI and algorithm design is a growing focus for federal regulators. The Louisiana infrastructure projects demonstrate the physical scale of Meta, but the lawsuits highlight the ethical gaps in its software. Between the power-hungry Hyperion campus and the privacy breaches in the Ray-Ban line, the company faces scrutiny on multiple fronts. Local communities in Richland Parish are watching the regulatory process for the gas plants closely. At the same time, users are questioning the safety of wearing cameras linked to overseas training hubs.
The intersection of enormous energy consumption and intrusive data collection defines the current operational climate for the tech giant. Regulatory approval for the new power plants will depend on the ability of Entergy to balance industrial needs with public interest.
The Elite Tribune Perspective
Why does Meta Platforms believe it can consume a third of a state’s energy grid while simultaneously peeping into the bathrooms of its customers? The audacity of the Hyperion project is matched only by the sheer incompetence of its privacy safeguards. Mark Zuckerberg is essentially building a private utility company in Louisiana to feed an AI beast that requires the voyeuristic exploitation of Kenyan laborers. We are told these smart glasses are the future of human-computer interaction, but the reality is a pipeline of digital stalking disguised as innovation.
The Louisiana Public Service Commission should think twice before handing over 30 percent of the state’s grid capacity to a company that cannot even keep its users’ clothes on in its own databases.
The negligence verdict regarding infinite scroll proves that the legal system is finally catching up to the predatory architecture of Silicon Valley. For years, these companies treated the mental health of minors as an acceptable externality in the pursuit of ad revenue. Now, the bill is coming due. If Meta cannot manage the psychological safety of its current platforms, it has no business building a Manhattan-sized AI hub that will only accelerate these harms. The $27 billion price tag for Hyperion is a monument to corporate ego, built on the backs of fossil fuels and the compromised privacy of an unsuspecting public.