Volume 4, Issue 7, 2023

Welcome

Welcome to the seventh issue of Decoded for 2023.


We are very pleased to announce that we have opened a new office in Jacksonville, Florida – marking our official entrance into the Sunshine State and expanding our footprint to eight offices in total.


Kevin L. Carr, a longstanding partner of the firm and co-chair of our Labor & Employment Law Practice Group, is helming the Jacksonville office.


The new office is a strategic expansion to better meet the needs of our growing client base in the southeastern United States. If you or a colleague are interested in representation in Florida, we encourage you to click here to learn more.


We hope you enjoy this issue and, as always, thank you for reading.



Nicholas P. Mooney II, Co-Editor of Decoded, Chair of Spilman's Technology Practice Group, and Co-Chair of the Cybersecurity & Data Protection Practice Group


and


Alexander L. Turner, Co-Editor of Decoded and Co-Chair of the Cybersecurity & Data Protection Practice Group

You may be Entitled to Financial Compensation…for Your Data

“Without all of our writings and photos that AI companies are using to train their models, they would have nothing to sell.”


Why this is important: This opinion piece makes an interesting argument that everyone in America should be compensated for the data they share with the public and that eventually makes its way into Generative AI platforms. Without that data, the article argues, AI companies would have nothing to sell. It would cost AI companies a fortune to create the data that powers AI platforms themselves, and that synthetic data likely would result in artificial output. The article suggests a fund be created that AI companies pay into in order to receive a license that permits them to use public data when training their platforms. The money in that fund then would be split equally among all Americans. This removes the problem of trying to determine who contributed what data to which AI platforms. The authors of this plan liken it to Alaska’s Permanent Fund, into which oil revenues are paid and from which funds are sent to every Alaskan every year. The proposition discussed in the article falls in line with the data dignity movement and likely will be a topic that continues to garner interest as AI platforms become more ubiquitous in our everyday lives. --- Nicholas P. Mooney II

Examining Health Data Privacy, HIPAA Compliance Risks of AI Chatbots

“Healthcare organizations seeking to reap the benefits of AI chatbots must consider the HIPAA compliance and data privacy risks that come along with them.”


Why this is important: The article discusses the growing use of AI chatbots in healthcare, such as Google's Bard and OpenAI's ChatGPT, and the challenges they pose in terms of patient privacy and compliance with HIPAA regulations. Two viewpoint articles published in JAMA explore the risks associated with using AI chatbots in healthcare and offer suggestions on how providers can navigate HIPAA compliance.


The first viewpoint highlights that the risk with AI chatbots lies in their chat functionality, where the clinical details shared with the chatbot may be stored on servers owned by the company that developed the technology, potentially leading to unauthorized disclosure of Protected Health Information (“PHI”). To avoid this, healthcare providers are advised to avoid entering any PHI into chatbots or to manually deidentify the data before entering it.


The second viewpoint argues that AI chatbots cannot fully comply with HIPAA regulations and that HIPAA itself may be outdated and inadequate in addressing AI-related privacy concerns. The authors advocate for novel legal and ethical approaches to safeguard patient privacy when using these technologies.


Both viewpoints emphasize the need for caution when using AI chatbots in healthcare and stress the importance of protecting patient data and confidentiality. They suggest that relying solely on HIPAA compliance may not be sufficient and that additional measures and training should be implemented to mitigate privacy risks associated with AI chatbots. --- Shane P. Riley

Mouse Study: Chemotherapy Works Better with Nanotechnology

“Nanotechnology may provide more effective treatment for patients with peritoneal cancer.”


Why this is important: Modern medicine has come very far in treating cancer; it is one of our greatest successes. One of the challenges is that there are so many different forms of cancer. Cancer in the abdominal region is difficult to treat with chemotherapy, because it often is discovered later, and the chemical treatments are flushed out of the system quickly. Around 1951, R. Buckminster Fuller filed for a patent on a geodesic dome structure (not mentioning a similar structure patented in 1926 in Germany). Fuller promoted the structure and proved its merits for architecture. In 1996, Doctors Smalley, Curl, and Kroto received a Nobel Prize for discovering a way to create a nano-sized geodesic structure from carbon atoms that they called the “Buckyball.” The thought was that it could be used in medicine to encase other molecules for various medical purposes, particularly diagnostic procedures. Recently, Astrid Hyldbakk of the Norwegian University of Science and Technology used a similar nano-structure to “house” chemotherapy drugs in experiments on abdominal cancers in mice and rats. The experiments successfully improved treatment efficacy. We don’t know that this will work in humans, but this technique of encasing treatments also may have use in other cancers. --- Hugh B. Wellons

What does the 'XRP Not a Security' Ruling Mean for Other Cryptocurrencies?

“The recent ruling in the SEC-Ripple case doesn't quite end the lack of clarity around crypto regulation.”


Why this is important: The crypto industry celebrated a partial win for Ripple in court as a judge ruled that XRP is not a security. This ruling is seen as a positive sign for the cryptocurrency industry, but legal experts emphasize that more regulatory clarity is still needed. The U.S. Securities and Exchange Commission (“SEC”) considers most cryptocurrencies to be securities, leading to enforcement actions against issuers and exchanges. The recent ruling on XRP could potentially force the SEC to take a more hands-off approach in the future.


The ruling may also impact other cryptocurrencies previously deemed securities, leading to increased flexibility for exchanges in listing tokens. However, the decision could still be appealed by the SEC, and the final regulatory classification of crypto assets may come from Congress through new legislation.


Some legal experts believe that the court made an error in not considering programmatic sales of XRP on exchanges as investment contracts, arguing that Ripple is the principal promoter of XRP regardless of purchasers' awareness.


Overall, while the ruling is seen as a positive development, there is still uncertainty and the need for further legal clarity in the cryptocurrency industry. --- Shane P. Riley

4 Ways Educators are Configuring AI for Classroom Use

“Some innovative teachers see generative AI as a tool to produce lesson prompts, help students avoid future digital divides, and more.”


Why this is important: With the rise of artificial intelligence (“AI”) and the popularity of new online tools like ChatGPT, professionals in all industries are grappling with how best to use, or not use, AI in the work that they do, while managing the concern that these tools could eventually replace them altogether. One area particularly affected is education: Are students cheating if they use AI? Will AI degrade the value of an education? Can AI eventually replace a teacher? Rather than dwell on these negatives, this article discusses some ways that teachers are adapting to AI without compromising their standards.


First, Danny Liu of the University of Sydney in Australia is helping to educate educators on what generative AI tools are and how to use them. Liu stresses the importance of a well-thought-out and well-written prompt to produce the best results. If teachers learn to craft specific prompts, the AI is likely to return something useful in the classroom. Part of this education effort is also to remove some of the initial fear-based stigma around the use of AI. It was once thought that having access to the internet during the school day would give students too much opportunity to cheat, but teachers have largely adapted and have made use of the internet an integral part of their curricula. The same could be true of AI.


In California, Peter Paccone is following this initiative by integrating ChatGPT into his work and allowing his students to use it. Paccone may use it to help create tests, write prompts, and form study guides. The students may use it to generate some results that contribute to a broader classroom discussion. As Paccone sees it, even if they are using AI, they are thinking and participating, which is the point of the exercise. Other teachers are using AI to generate summaries of presentations, fill-in-the-blank exercises, and other classroom materials to save time without losing value.


Of course, there are concerns about academic integrity when using tools that are so skilled at mimicking natural human language. That is why some teachers are requiring short and varied writing samples from their students at the beginning of the semester so they can get a sense of their voice and are better able to point out language that may have been generated by AI.


Lastly, educators embracing AI are acknowledging that students must have access to these tools during their education or they could fall behind in a world that is rapidly adapting to them. Just as a divide formed when some students had home computers and internet access while others did not, students who are banned from using AI could one day enter a job market that expects a proficient understanding of how to use these tools. In this respect, it is an educational imperative to adapt to the use of AI and find ways to integrate the tools into school work rather than restrict or outright ban their use. This takeaway, it seems, is what all sectors and industries are dealing with right now – adapt or be left behind. --- Shane P. Riley

FCC Chair Proposes $200M Investment to Boost K-12 Cybersecurity

“The funds would go toward a three-year pilot program aimed at enhancing cybersecurity protections for school and library networks.”


Why this is important: Chairwoman Rosenworcel’s announcement and her proposal are the first step in the rulemaking process aimed at boosting federal investment in K-12 cybersecurity. Her proposal will require approval from the full commission before a proposed rule would formally issue for public comment. Interested parties should watch for further announcements that will trigger public response and comment deadlines.


This initial movement by the FCC bodes well for a showing that federal authorities are giving serious consideration to leveraging significant resources toward combatting cyber threats. With recent attacks taking aim at education services and networks, including the New York City Public Schools and the California State Teachers’ Retirement System, the threat is truly nationwide. Interconnected networks are only as secure as the least protected entry point, whether that’s an out-of-compliance library catalogue terminal or the central student database. It will be important to involve additional resources, rather than relying on already-strained budgets alone, to address the problem in any meaningful way. --- Brian H. Richardson

Ransomware Criminals are Dumping Kids’ Private Files Online After School Hacks

“They describe student sexual assaults, psychiatric hospitalizations, abusive parents, truancy — even suicide attempts.”


Why this is important: We have previously discussed the increase in cyberattacks, especially ransomware attacks, on school districts’ and other educational institutions’ computer networks. Educational institutions are often targeted because educational records include a great deal of personal information and because schools, due to a lack of funds, have lagged behind other industries in implementing adequate cybersecurity protections. Student records contain sensitive personal information, including information about sexual assaults, health and psychiatric records, Social Security numbers, and home addresses. It is not just student records that can be exposed as the result of a breach, but also employees’ records and personal information. School district technology budgets are often allocated for teaching tools rather than data security.


Recently, Minneapolis Public Schools (“MPS”) experienced a ransomware attack on its computer network. MPS followed federal guidelines when it refused to pay the $1 million ransom. In response to MPS’s refusal to pay, the cybercriminals released online 300,000 files containing the personal information of tens of thousands of students. Despite promises by MPS to be transparent regarding the breach, it has not notified the victims about whether their personal information was stolen. The disclosure of students’ sensitive information on the Internet, where it can remain in perpetuity, caused many students to be retraumatized.


What can you do in response to a data breach at your child’s school? Unfortunately, not much. The Family Educational Rights and Privacy Act of 1974 (“FERPA”) bars the disclosure of a student’s personal information without authorization from the student’s guardian or the student themselves, depending on the student's age and educational status. However, FERPA does not provide for a private cause of action in the event that a student’s records are breached. The only recourse students or their guardians have in the event of a FERPA violation is to report the incident to the U.S. Department of Education for investigation. Moreover, there is no federal law that requires school districts to notify students that their records were compromised by a data breach or ransomware attack. This does not let school districts or other educational institutions off the hook in the event of a data breach. Even though FERPA does not provide a cause of action for the improper disclosure of student records, school districts and other educational institutions can still be found liable for negligence if they did not take reasonable steps to safeguard their computer networks. There is also the risk of regulatory action by the U.S. Department of Education if the breach is reported pursuant to FERPA. Additionally, in the absence of a federal data breach notification law, schools and educational institutions need to be sure that they are in compliance with state data breach notification laws or risk litigation not only by students and/or their guardians, but also by state attorneys general. If your school district or educational institution needs help creating policies and procedures to prevent a data breach, or if you need to respond to a data breach, please contact Spilman’s Cybersecurity and Data Privacy Practice Group. --- Alexander L. Turner

Charles Schwab Announces TD Ameritrade Data Breach

“Tens of thousands of clients could have been affected in huge attack resulting from vulnerabilities found in MOVEit file transfer software.”


Why this is important: Another data breach. What’s the big deal? Well, this one affects millions, and it was a broad attack that also worked against CalPERS and Genworth Financial. Similar attacks against other financial services firms may be undiscovered at this point; we still may not know all the financial companies where this attack was successful. Just as important, this breach potentially provides access to people’s retirement and investment accounts. Imagine finding out that all of your retirement savings were affected by such a breach. Many (most?) middle-class Americans have far more at risk in such investment accounts than they have in bank accounts. Bank accounts are protected by FDIC insurance, which, up to the limit, protects with the full faith and credit of the United States. Investment accounts are insured privately. These breaches pose much larger threats to middle-class America than most realize. --- Hugh B. Wellons

The FTC’s Biggest AI Enforcement Tool? Forcing Companies to Delete Their Algorithms

“Algorithm disgorgement requires companies to remove products built on data they shouldn't have used in the first place.”


Why this is important: As lawmakers around the world contemplate safety mechanisms and regulations for AI, the FTC already has a powerful enforcement tool in place. Since 2019, the FTC has utilized a tool called algorithm disgorgement, or model deletion, which forces companies to remove products developed using data they should not have accessed. The approach has been used in actions against companies like Amazon, a diet app for children, and Cambridge Analytica. The FTC hopes its actions will be a warning to companies that may be mishandling user data in a race to build new products, and FTC officials emphasize that machine learning cannot be an excuse to violate the law.


The FTC's strength in regulating AI can be credited to the broad scope and flexibility of the FTC Act, which calls for protecting consumers as new technologies emerge. Model deletion holds significant consequences for companies, motivating them to be more cautious with data collection and use. Whereas fines can seem like a slap on the wrist for big companies, algorithm disgorgement imposes significant costs on business models. However, the FTC faces limitations, and experts suggest that a comprehensive federal privacy law would strengthen its ability to protect user data and enforce model deletion more effectively. --- Alison M. Sacriponte

A Painful Reset for Biotech Investing  

“Biotech investors for early stage projects were in short supply in 2022, and overall investment fell.”  


Why this is important: Biotech investing has been a rocky road for the past few years. This short article explains how the biotech investor world has changed this year, for better and for worse. Investment interest is up for early stages, but valuations are substantially lower. Many early-stage projects do not have publicly traded stock, and larger investors often require such stock to liquidate their interests. Now, many of these investors are insisting that the early-stage project have an exit plan within the next 12-18 months that does not require registration as public stock. In other words, no IPO or Form 10 filing with the SEC if you want investment. This limits exit plans substantially. It effectively means that if your science alone is not marketable at a reasonable profit, investment still may be hard to get. The IPO window in biotech stocks, however, seems to be showing some signs of opening, so this may loosen by year-end. --- Hugh B. Wellons

FTC Questions OpenAI Over ChatGPT Data Security Practices

“The Federal Trade Commission sent to ChatGPT developer OpenAI a list of questions seeking clarity on how the company monitors, collects, and retains user personal information and ensures control over its popular artificial intelligence chatbot.”


Why this is important: The FTC has sent a list of questions to OpenAI, the developer behind ChatGPT, inquiring about the company's practices regarding personal information, privacy, and data security. The 20-page document, linked to in a report by the Washington Post, aims to determine if OpenAI engaged in any unfair or deceptive practices, potentially causing harm to consumers. The FTC seeks information such as: what data is used for training the chatbot; what measures are taken to exclude personal information during training; what policies and procedures are in place for risk assessment before updates; what steps are taken to assess the potential generation of false statements about real individuals; and what cybersecurity incidents involving users’ personal data the company has experienced. On Twitter, OpenAI's CEO, Sam Altman, expressed disappointment at the FTC’s request but pledged the company’s cooperation. --- Alison M. Sacriponte

Feel Again: Advancements in Prosthetic Limb Technology Allow Feeling, Control

“Sensors in the plastic fingers are connected to nerves in his arm to return a basic sense of touch which he can demonstrate with his eyes closed.”


Why this is important: The article discusses advancements in artificial limb technology and the restoration of the sense of touch for people with prosthetic limbs and spinal injuries. The Defense Department launched a $100 million project to improve prosthetic limbs, leading to the development of robotic limbs controlled by the user's thoughts.


One individual benefiting from these advancements, Brandon Prestwood, lost his left hand in an accident and volunteered for experimental research involving surgery at the V.A. His arm was fitted with electrodes that pick up brain signals for movement, allowing him to control the artificial hand. Additionally, sensors in the fingers are connected to nerves in his arm, giving him a basic sense of touch.


Initially, experts believed that targeting the brain's neurons with electrical stimulation was too complex, but the research proved otherwise. Neuroscientists managed to return signals for touch to the brain through implants, allowing users to experience sensations through their artificial limbs.


Another individual, Austin Beggin, bypassed his damaged spine using implants in his arm to control the limb directly through his brain impulses. These advancements have the potential to restore functionality and independence to those with severe spinal cord injuries.


The article also highlights the emotional impact of regaining the sense of touch for individuals like Brandon Prestwood and his wife, Amy. The restoration of touch can make a significant difference in the lives of those with artificial limbs, helping them feel whole again and reconnect with the world in a profound way. --- Shane P. Riley

Court Allows Axonics Neurostim Patent Case to Proceed Against Medtronic

“A federal appeals court found that a U.S. Patent Office board erred in its legal analysis of the challenge to Medtronic’s neurostimulation patents.”


Why this is important: Medtronic and Axonics are competing in the market for sacral nerve modulation to treat urinary incontinence. This involves implanting a device above the tailbone to stimulate the sacral nerve and alleviate the condition. Medtronic sued Axonics for patent infringement, and Axonics challenged the claims' validity at the Patent Trial and Appeal Board, arguing that the technology was unpatentable due to its obviousness.


The Patent Trial and Appeal Board initially ruled in favor of Medtronic, stating that Axonics failed to prove the challenged claims were unpatentable. However, the U.S. Court of Appeals for the Federal Circuit has now rejected the board's decision and sent the case back for further consideration. The court found that the board made errors in its framing of the challenge and obviousness analysis.


Medtronic claims its patents have withstood various challenges by Axonics at the Patent Office and that the Federal Circuit has not invalidated any of its patent claims against Axonics. The trial for a related lawsuit is scheduled to begin on August 15 in a California federal court, where Medtronic is suing Axonics for infringement of seven patents.


Analysts believe that despite the litigation's outcome being uncertain, Axonics is well-positioned to continue penetrating the market with its device portfolio, as it targets a large and underserved market. Both companies are expected to continue pursuing legal resolutions, and there may be progress toward resolution later this year. --- Shane P. Riley

Mosquito-Friendly Gene Drive may Lead to a Malaria-Free Future

"Previous considerations of using gene drive modifications to eradicate mosquitoes have been met with great concern for the potential unintended effects of removing a species from the environment, even one as universally despised as mosquitoes."


Why this is important: We’ve talked before about the power of the CRISPR-Cas9 process for gene manipulation. Research at the University of California – Irvine has developed a gene drive solution for altering the genetic code of mosquitoes so that, over just a few years, malaria will be spread by 50 percent fewer (and possibly up to 90 percent fewer) mosquitoes than now. This may save between 310,000 and 557,000 human lives every year, over three-quarters of them children under 5. It also might prevent long-term pain and suffering for millions over time. This is not getting a lot of press, but it will. --- Hugh B. Wellons

This is an attorney advertisement. Your receipt and/or use of this material does not constitute or create an attorney-client relationship between you and Spilman Thomas & Battle, PLLC or any attorney associated with the firm. This e-mail publication is distributed with the understanding that the author, publisher and distributor are not rendering legal or other professional advice on specific facts or matters and, accordingly, assume no liability whatsoever in connection with its use.

Responsible Attorney: Michael J. Basile, 800-967-8251