Welcome
Welcome to our tenth issue of Decoded for 2025 - our technology law insights e-newsletter.
We are very pleased to announce that Suzanne Y. Pierce has joined Spilman's Roanoke office as a Member. Suzanne focuses her practice on corporate law, advising clients in mergers and acquisitions, intellectual property, data privacy and compliance, and commercial real estate transactions. With experience that spans both domestic and international markets, she helps businesses navigate every stage of growth – from formation and financing to strategic transactions and succession planning. Click here to learn more about how Suzanne is adding to our team.
We hope you enjoy this issue and thank you for reading.
Nicholas P. Mooney II, Co-Editor of Decoded; Chair of Spilman's Technology Practice Group; Co-Chair of the Cybersecurity & Data Protection Practice Group; and Co-Chair of the Artificial Intelligence Law Practice Group
and
Alexander L. Turner, Co-Editor of Decoded and Co-Chair of the Cybersecurity & Data Protection Practice Group
“SB 243 sets mandatory safety protocols to shield children and at-risk users from documented dangers tied to AI chatbot interactions.”
Why this is important: California’s new law SB 243 marks one of the most aggressive steps yet toward regulating the safety of AI chatbots and protecting minors from online harm. The law, which goes into effect January 1, 2026, imposes mandatory safeguards on companies whose chatbots interact with children or other vulnerable users, reflecting a growing recognition that unregulated AI systems can cause real-world harm.
The legislation comes in response to several tragic incidents linking AI chatbot conversations to youth suicides, including the death of a Colorado teen after disturbing exchanges with an AI companion. In light of these incidents, SB 243 establishes duties for AI developers: verifying user ages, disclosing when users are speaking to a bot, preventing sexualized or self-harm content from reaching minors, and reporting safety measures to the state. SB 243 is significant because it creates direct liability for companies that fail to meet these standards, with penalties reaching $250,000 per violation. The law represents a shift in how lawmakers view AI: as a product capable of psychological influence that warrants public safety regulation.
More broadly, SB 243 signals the start of state-level AI governance, especially in areas where federal oversight remains limited. Other states are likely to follow California’s lead, shaping a new compliance landscape for companies developing or deploying conversational AI. The convergence of product liability, safety, and child protection underscores a new era of AI accountability. --- Alison M. Sacriponte
“CRISPR gets a major boost from DNA-coated nanostructures that supercharge delivery and precision.”
Why this is important: Northwestern University chemists have developed a new nanostructure that could solve one of CRISPR’s biggest challenges: safely and efficiently delivering its gene-editing machinery into cells.
The innovation, called lipid nanoparticle spherical nucleic acids (LNP-SNAs), packages all of CRISPR’s components (the Cas9 enzyme, guide RNA, and a DNA repair template) inside a lipid nanoparticle that is wrapped in a dense shell of DNA. This DNA coating protects the CRISPR cargo, determines which tissues the particles target, and helps them enter cells more effectively.
In laboratory tests using human and animal cells, LNP-SNAs entered cells up to three times more efficiently than standard lipid nanoparticles, caused less toxicity, and increased gene-editing success threefold. They also improved the accuracy of DNA repair by more than 60 percent.
The study, published in PNAS on September 5th, emphasizes that a nanomaterial’s structure, rather than just its chemical makeup, can determine its effectiveness. This finding supports the principles of structural nanomedicine, an emerging field developed by Northwestern’s Chad A. Mirkin, who led the research.
Traditional CRISPR delivery methods, such as viral vectors and standard lipid nanoparticles, have significant limitations. Viruses can trigger immune reactions, while lipid nanoparticles often trap CRISPR components in endosomes before they reach the cell nucleus. The SNA architecture overcomes these obstacles by using DNA’s natural affinity for cell receptors to promote uptake and targeting.
The system performed well across several human cell types, including skin, kidney, and bone marrow cells. Mirkin’s team plans to test LNP-SNAs in live disease models next. Because the platform is modular, it can be adapted for different diseases and therapeutic uses.
Flashpoint Therapeutics, a start-up that licensed the technology from Northwestern, is commercializing it with the goal of moving it into clinical trials. Mirkin said that combining CRISPR with SNA technology could unlock the full potential of gene editing and enable the development of a new generation of safer and more precise genetic medicines.
This CRISPR delivery breakthrough represents a model case of technology transfer: a university lab discovery, protected through IP, licensed to a startup, and developed toward real-world medical impact. Successful translation of this nanotechnology into approved therapies could yield licensing revenue for Northwestern, support additional research funding, and bring novel gene-editing treatments to patients, fulfilling the societal mission of university technology transfer. --- Shane P. Riley
“Data center demand is surging for computing facilities that can consume as much power as entire cities, but America's electrical grid is struggling to keep pace.”
Why this is important: Artificial Intelligence (AI) is not new, but it has certainly taken center stage in national discussions of energy production and development. The article underscores what is no surprise: the U.S. energy grid is old and in need of serious renovation. On top of existing development and consumer needs, AI data centers are requesting, and requiring, multiple gigawatts of power. Some analysts estimate that trillions of dollars are likely to be invested between now and 2030, with as much as $3 trillion by 2028. A key roadblock is access to fundamental components needed for infrastructure upgrades, such as transformers. With supply chain delays, fluctuating resource markets, and ever-increasing competition from multiple market sources, the growing demand on the grid is also delaying other legitimate projects, which are stuck behind a backlog of planned but possibly hypothetical AI data center projects. Acres of space, gigawatts of power, and potential environmental consequences are the holy trinity of the AI frenzy. How the grid will sustain itself and expand to meet these demands is yet to be determined. --- Sophia L. Hines
“It will affect Medicare patients, and the doctors and hospitals who care for them, in Arizona, Ohio, Oklahoma, New Jersey, Texas, and Washington, starting Jan. 1 and running through 2031.”
Why this is important: The Centers for Medicare & Medicaid Services (CMS) is expected to begin a pilot program that uses AI technology to weed out wasteful “low-value” services. The process will be akin to prior authorization. The timing of the announcement was awkward, given that the Trump administration had previously unveiled a voluntary effort by private health insurers to revamp and reduce their own use of prior authorizations. The pilot will initially target procedures involving skin and tissue substitutes, electrical nerve stimulator implants, and knee arthroscopy, on the belief that those procedures are particularly vulnerable to fraud, waste, and abuse. There is real concern in the medical community that the AI initiative is designed to reduce medical services and create a process in which a patient must appeal a denial to potentially obtain coverage. A majority of physicians surveyed believe that the denials will exacerbate avoidable patient harms and escalate unnecessary waste. There is also a fear that the program will not be transparent as to what will or will not be covered, leaving patients without needed and traditionally offered care. --- Bryan S. Neft
“Industry leaders should engage with state lawmakers on AI bills, given many legislators have limited experience with the technology or in healthcare more broadly, experts said during a HIMSS summit.”
Why this is important: With the use of AI spreading throughout our economy, there has been a significant increase in proposed legislation to regulate it, most of it at the state level. Over 1,000 AI-related bills have been introduced in state legislatures throughout the country, an increase of 300 to 500 bills over 2024. There is a need for these state-level laws regulating healthcare-related AI technology because of the federal government’s lack of oversight and its desire to deregulate the development of AI.
Many of these bills prohibit, or require monitoring of, insurers’ use of AI to handle claims. Large insurers like UnitedHealthcare, Humana, and CVS have already faced governmental scrutiny and have even been sued for using predictive algorithms to deny claims. In one class action, UnitedHealthcare was sued on allegations that its claims-handling AI had a high error rate when evaluating claims, that it overruled the recommendations of patients’ physicians, and that UnitedHealthcare refused to have the AI’s decisions reviewed by a human. Humana was sued because its claims-handling AI allegedly denied rehabilitation care for elderly patients despite recommendations made by the patients’ physicians. Cigna was also subject to a class action because its claims-handling AI denied claims without human review. These lawsuits also alleged disparate impact, a lack of human oversight, and reliance on biased data in the AI algorithms. State legislators are seeking to ensure that, if AI is used to handle insurance claims, the process is neutral, fair, and subject to human oversight.
State legislators are also concerned about AI’s impact on mental health. Increasingly, people are turning to chatbots for mental health support, and there is concern that AI is not capable of delivering effective mental health care. More alarming are the cases of teenagers who have died by suicide after becoming consumed by their relationships with AI companions. State legislators are seeking to understand these issues and to ensure proper safeguards are in place to protect people when they are at their most vulnerable.
As with all new technologies, the technology is often far ahead of regulation. State legislators are now playing catch-up, trying not only to understand AI but also to rein it in before their constituents are harmed by it. Time will tell whether these efforts will be successful. --- Alexander L. Turner
“The goal is simple and serious: understand how space changes our bodies so crews can travel safely to the Moon and, later, to Mars.”
Why this is important: NASA is preparing to launch AVATAR, a groundbreaking biological experiment that will study how deep space affects human cells. Aboard Artemis II, the mission will carry tiny organ-on-a-chip devices containing living cells donated by the astronauts themselves. Over a 10-day orbit around the Moon, these chips will help NASA understand how radiation and microgravity change human biology, which will be crucial knowledge for keeping crews healthy on future Moon and Mars missions.
Each chip is about the size of a thumb drive and mimics the function of real tissues like bone marrow, heart, or brain. For this flight, NASA is focusing on bone marrow, which produces blood and immune cells but is highly sensitive to radiation. Using the astronauts’ own bone marrow cells allows scientists to directly compare how the same person’s tissues respond in space versus on Earth, eliminating guesswork.
While Orion travels around the Moon, the chips will run automatically inside a small, self-contained system that maintains temperature and nutrients. After splashdown, researchers will analyze them using single-cell RNA sequencing to measure how genes behave under space conditions, identifying changes linked to radiation, immunity, or aging.
The goal is to enable personalized medical care in space, which will customize countermeasures and supplies to each astronaut’s biology rather than relying on generic kits. The findings may also guide safer mission planning and spacecraft design.
Beyond spaceflight, organ-chip technology has major potential on Earth. It can speed up drug testing, reduce the need for animal experiments, and support more precise medical treatments. Though AVATAR’s first flight lasts only days, it marks the start of a new era where NASA can monitor astronaut health in real time, laying the groundwork for safer, longer journeys into deep space.
AVATAR sits at a crossroads of public research and private innovation. The core organ-chip technologies likely remain privately patented, while NASA may claim co-ownership or use rights for space-adapted variants used in future missions. Future discoveries could generate new patent portfolios and licensing opportunities across both the aerospace and healthcare industries. --- Shane P. Riley
“A report from industrial cybersecurity firm Dragos highlights growing risks of business interruption and supply-chain disruptions.”
Why this is important: Information Technology (IT) does, and should, garner attention and resources to ensure protection, maintenance, and versatility in the modern era. Operational Technology (OT), which covers physical processes often found in critical infrastructure, is increasingly coming under attack by cybercriminals seeking to disrupt and hold hostage as many systems as they can. Cyberattacks, whether they involve ransomware, malware, or the corruption and disruption of operational technology, are likely to cost companies billions of dollars annually in ransom payments, insurance claims, and other forms of increased protection. Cybersecurity firms are using probability statistics and a decade's worth of cyber breach and insurance claim data to estimate that business interruptions to operational technology could reach a global aggregated total of $31 billion over the next 12 months. Attacks have successfully targeted supply chain operations and online transaction capabilities. “The three OT security controls most associated with risk reduction were maintaining a comprehensive incident-response plan, using defensible architecture and performing continuous monitoring to preserve visibility into a network.” Preparation is key. Run audits and update systems regularly. Assuming that a hack has not occurred simply because all systems are running is not a safety plan. Having appropriate breach-response protocols in place, from staff to third-party vendors, can make the difference between a few hours of downtime costing a few dollars and months of downtime costing millions. --- Sophia L. Hines
“Attacks are down year-over-year; however, successful attacks are proving even costlier to mitigate, according to the Mid-Year Risk Report from the cyber risk management company Resilience.”
Why this is important: Healthcare has long been targeted by cybercriminals because of the high value of patient records and the fact that hospitals cannot tolerate disruption that risks patient safety. The industry is extensively targeted by ransomware groups because there is a higher probability that the ransom will be paid to prevent publication of stolen data and ensure a fast recovery. Data on cyberattacks indicate that providers are improving at attack prevention, but successful attacks are resulting in higher payouts.
A potential cause of cybersecurity gaps is a focus on compliance with the HIPAA Security Rule, which is more than two decades old. A focus on compliance may help avoid penalties by regulators, but may not be effective in reducing risks or adequately protecting against modern threats. To address these gaps, practices should prioritize improving their cybersecurity posture by, among other things, implementing training programs that address phishing, social engineering, and proper data handling procedures. --- Joseph C. Unger
The Rise of Data Centers and Electricity Demands on Virginia, Ohio and North Carolina
By Jason E. Wandling
Data centers have generated unprecedented controversy across the country over the past two years, but have attracted the most attention in Virginia, North Carolina, and Ohio. Each of those states is currently experiencing a strong surge in electricity demand driven by the expansion of data centers, which has caused consternation among existing power utility customers and the data centers' new neighbors. This piece examines each of those states' experiences with data centers.
Click here to read the entire article.
This is an attorney advertisement. Your receipt and/or use of this material does not constitute or create an attorney-client relationship between you and Spilman Thomas & Battle, PLLC or any attorney associated with the firm. This e-mail publication is distributed with the understanding that the author, publisher and distributor are not rendering legal or other professional advice on specific facts or matters and, accordingly, assume no liability whatsoever in connection with its use.
Responsible Attorney: Michael J. Basile, 800-967-8251