On September 23, 2024, Principal Deputy Assistant Attorney General Nicole M. Argentieri announced updates to the U.S. Department of Justice’s (“DOJ”) guidance on its Principles of Federal Prosecution of Business Organizations, as implemented through the Evaluation of Corporate Compliance Programs (“ECCP”). The ECCP is “meant to assist prosecutors in making informed decisions as to whether, and to what extent, the corporation’s compliance program was effective at the time of [an offense under investigation], and is effective at the time of a charging decision or resolution, for purposes of determining the appropriate (1) form of any resolution or prosecution; (2) monetary penalty, if any; and (3) compliance obligations contained in any corporate criminal resolution (e.g., monitorship or reporting obligations)” with DOJ.
The ECCP was updated last year with new policies addressing a corporation’s access to, and retention of, employee electronic communications, as well as a company’s compensation structure for executives and employees. This year’s updates focus on three new areas of evaluation:
- How companies are assessing and managing risk related to the use of new technology such as artificial intelligence (“AI”);
- Companies’ “speak up” cultures; and
- Compliance programs’ appropriate access to data, including to assess their own effectiveness.
The widespread availability of AI tools has enabled the growing use of “deepfakes,” whereby a person’s voice and likeness can be replicated so seamlessly that impersonations are impossible to detect with the naked eye (or ear). These deepfakes pose substantial new risks for commercial organizations. For example, deepfakes can threaten an organization’s brand, impersonate its leaders and financial officers, and enable unauthorized access to its networks, communications, and sensitive information.
In 2023, the National Security Agency (NSA), Federal Bureau of Investigation (FBI), and Cybersecurity and Infrastructure Security Agency (CISA) released a Cybersecurity Information Sheet (the “Joint CSI”) entitled “Contextualizing Deepfake Threats to Organizations,” which outlines the risks that deepfakes pose to organizations and recommends steps that organizations, including national critical infrastructure companies (such as financial services, energy, healthcare, and manufacturing organizations), can take to protect themselves. Loosely defining deepfakes as “multimedia that have either been created (fully synthetic) or edited (partially synthetic) using some form of machine/deep learning (artificial intelligence),” the Joint CSI cautioned that the “market is now flooded with free, easily accessible tools” such that “fakes can be produced in a fraction of the time with limited or no technical expertise.” Thus, deepfake perpetrators could be mere amateur mischief makers or savvy, experienced cybercriminals.
As a privacy officer, what keeps you up at night?
Is it the ransomware boogeyman, or perhaps the data breach creeps?
Whatever it may be, Epstein Becker Green litigators J.T. Wilson III, Stuart Gerson, and Brian Cesaratto are here to shed light on the subject in this episode of Speaking of Litigation.
Searching the internet for “AI and litigation” reveals countless results about how AI will either replace lawyers or transform the legal profession. These results are unsurprising. Since the early 2010s, articles on AI’s potential impact on lawyers have popped up every few months. But these results miss the bigger and more important picture: AI will likely spawn a wave of litigation stemming from its use, and that litigation will fuel the rise of AI lawyers who specialize in the complexities of AI.
In fact, we are already seeing the rise of AI lawyers, as a handful of lawsuits surrounding AI have been filed this year. Below is a summary of those proceedings and where they stand.
By now, the story of two New York attorneys facing scrutiny for citing nonexistent cases generated by the artificial intelligence (“AI”) tool ChatGPT has made national (and international) headlines. Late last month, a federal judge in the Southern District of New York sanctioned the attorneys and their firm $5,000. The court’s decision (Roberto Mata v. Avianca, Inc., No. 22-cv-1461-PKC (S.D.N.Y. June 22, 2023) (ECF No. 54)) provides a humbling reminder of both an attorney’s responsibility to ensure the accuracy of his or her filings and the limits of certain technologies in the legal profession.
New episode of our podcast, Speaking of Litigation: From chart-topping artificial rap songs to employment screening tools, artificial intelligence (AI) is not only a societal phenomenon but also a growing legal dilemma.
Trial lawyers around the globe are focused on the emergence of AI-related disputes in and out of the courtroom.
Epstein Becker Green attorneys Teddy McCormick, Jim Flynn, and Ali Nienaber illustrate the influence that AI has on litigation, employment practices, music, and more.
Advances in artificial intelligence (“AI”) continue to present exciting opportunities to transform decision-making and targeted marketing within the world of consumer products. While AI has been touted for its capabilities in creating fairer, more inclusive systems, including with respect to lending and creditworthiness, AI models can also embed human and societal biases in a way that can result in unintended, and potentially unlawful, downstream effects.
When we talk about bias, we mostly focus on accidental bias. But what about intentional bias? The following ...