Legal Roundup: Shell Under Fire, MLB Scores a Legal Win and More

A fire at a Houston-area refinery has led the State of Texas to sue Shell for pollution in an unresolved case that highlights the importance of environmental safety.
By: Jared Shelly | September 3, 2023

Texas Takes Shell to Court Over Houston Plant Fire

The Case: After a Houston-area petrochemical and refinery complex went up in flames, the State of Texas sued Shell in Travis County district court, alleging that the blaze “caused ‘mass quantities’ of airborne contaminants and illegal flows of wastewater into nearby waterways,” according to the Wall Street Journal. The state claims that 68.7 million gallons of wastewater were “unlawfully discharged into a stormwater pond and into the nearby Houston Ship Channel.”

Scorecard: The case has recently been filed and has not yet reached a resolution. Texas is seeking damages exceeding $1 million, alleging both air and water contamination as a result of the fire.

Takeaway: The case offers a critical reminder, especially to those in potentially hazardous industries, to prioritize environmental safety and adhere strictly to state and federal regulations. Lapses can result in not only financial penalties but also damage to the company’s reputation. Companies should invest in regular safety training and reviews to prevent such accidents.

Aside from the state’s case, Shell faces private lawsuits from over two dozen employees and contractors who claim the fire injured them and exposed them to harmful chemical fumes.

MLB Umpire Strikes Out in Discrimination Lawsuit Appeal

The Case: Angel Hernandez, a Cuban-born MLB umpire since 1993, sued the league in 2017, claiming he was overlooked for crew chief and World Series assignments due to his race and ethnicity. Hernandez “claimed he had been discriminated against because he was passed over for crew chief five times between 2011 and 2018, and last umpired a World Series in 2005,” according to Reuters.

U.S. District Judge Paul Oetken in Manhattan dismissed the case, saying Hernandez failed to demonstrate that Major League Baseball treated him unfairly or that its policies had a disproportionate negative impact on minorities. Hernandez appealed to the 2nd U.S. Circuit Court of Appeals in Manhattan. The league named its first Black crew chief in 2020 and first Latino crew chief in 1985.

Scorecard: Hernandez struck out again. “In a 3-0 decision, the 2nd U.S. Circuit Court of Appeals in Manhattan rejected the Cuban-born umpire’s arguments,” according to Reuters. In explaining why Hernandez was passed over, the defendant cited “a missed home run call in 2013 that Hernandez failed to acknowledge, and what [Joe] Torre called Hernandez’s ‘overly confrontational style,’ ” according to Reuters.

Takeaway: Fans know Hernandez as one of the poorest-performing umpires in the game and frequently vent about his missed calls online. Perhaps that reputation hurt his ability to win a crew chief position.

Also, plaintiffs like Hernandez must provide concrete, statistically significant evidence to support claims of discrimination. And Judge Oetken’s rationale underlines the principle that seniority does not automatically imply greater qualification. Organizations, while being sensitive to longevity of service, must make decisions based on a holistic assessment of an individual’s qualifications, competence and performance.

Music Labels Sue Internet Archive Over Digital Records

The Case: The nonprofit Internet Archive — home to millions of songs, books, movies and The Wayback Machine — faces a lawsuit brought by Universal Music Group, Sony Music Entertainment and others.

Filing in New York federal court, the publishers claim that the site’s repository of digitized vintage records constitutes copyright infringement. The suit “said the Archive’s ‘Great 78 Project’ functions as an ‘illegal record store’ for songs by musicians including Frank Sinatra, Ella Fitzgerald, Miles Davis and Billie Holiday,” according to Reuters. With 2,749 copyrights in question, damages in the case could run up to $412 million, according to the plaintiffs.

Scorecard: The case has recently been filed and has not reached a resolution.

Takeaway: This case highlights the importance of intellectual property rights, particularly in the digital age. Companies and organizations must ensure that their digitization and distribution efforts do not infringe upon the rights of copyright holders. Doing so means understanding the balancing act between access and copyright. The case could set a precedent for how courts weigh universal access to cultural materials against the legal rights of copyright owners.

Tutoring Firm Settles AI Bias Lawsuit in U.S. Agency First

The Case: In 2022, the U.S. Equal Employment Opportunity Commission sued three Chinese companies operating under the “iTutorGroup” brand name.

iTutorGroup hires Americans to teach English to Chinese students remotely. The EEOC alleged that iTutorGroup violated the Age Discrimination in Employment Act by programming its recruitment software to automatically reject female applicants aged 55 or older and male applicants aged 60 or older, resulting in the rejection of more than 200 qualified applicants.

The suit “was the first by the U.S. Equal Employment Opportunity Commission involving a company’s use of AI to make employment decisions,” according to Reuters. In 2021, the EEOC launched an initiative to ensure that AI software used by U.S. employers complies with anti-discrimination laws.

Scorecard: iTutorGroup has agreed to settle the case, paying $365,000 to more than 200 job applicants.

Takeaway: Reuters reports that a whopping “85% of large U.S. employers are using AI in some aspects of employment,” and this case could set a precedent for its usage. This integration spans preliminary screening processes, chatbots for HR, and software that offers performance reviews and promotion suggestions.

While AI can streamline and make recruitment processes more efficient, there is significant concern — shared by worker advocates and policymakers — regarding the potential for unintentional biases to be reinforced by AI software. Essentially, biases from real-world data can be “learned” and perpetuated by these systems, potentially leading to discriminatory practices. &

Jared Shelly is a journalist based in Philadelphia. He can be reached at [email protected].
