Has anyone developed a ChatGPT / AI use policy? Or do you reference existing P&Ps?
When you say de-identified, do you mean the data cannot be reverse engineered back to a specific individual?

I'm not providing legal advice here, but improper exposure of genetic information can be problematic. An article referencing an on-point class action lawsuit is linked below.

https://news.bloomberglaw.com/litigation/23andme-sued-over-hack-of-genetic-data-affecting-thousands
We created a short one as part of our acceptable use policy but have started working on a more formal one now. I think the core messages are:
1) Security: don't share sensitive or confidential data. If you're not sure, ask.
2) Verify: don't trust the output for accuracy. Always double-check.
3) Disclose: if you use AI to produce something, disclose it.
A Generative AI Acceptable Use Policy, as well as policies and procedures. Align also to NIST's AI Risk Management Framework, ISO updates, EU AI rules, and similar regulations that affect the company.

If you haven't renewed yet, your cyber liability insurer is going to ask how you are proactively handling this as a whole: research is showing 7-10% of corporate IP and sensitive data leaking into these tools, not to mention the risks and lawsuits from data coming into the company with copyright and other issues. AI insurance riders are now on the market. Don't rubber-stamp this. It's a risk that must be addressed holistically through Enterprise Risk Management, communicated, and enforced.
Yes, we've rolled out an acceptable use policy with approval structures for our teams.
We launched an internal Slack plugin and a self-hosted version of ChatGPT to maintain data privacy, encouraging people to use that instead of the public one.
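For anyone considering the same route: most self-hosted gateways expose an OpenAI-compatible chat endpoint, so the Slack side mainly needs to build the standard request and point it at the internal URL instead of api.openai.com. A minimal sketch of the request-building step (the endpoint URL and model name below are placeholders, not real infrastructure):

```python
import json

# Placeholder internal gateway URL -- substitute your own deployment.
INTERNAL_ENDPOINT = "https://llm.internal.example.com/v1/chat/completions"

def build_chat_request(user_text, model="internal-gpt"):
    """Build an OpenAI-compatible chat payload aimed at the self-hosted
    gateway. Keeping traffic on the internal endpoint (rather than the
    public API) is what provides the data-privacy benefit."""
    return {
        "url": INTERNAL_ENDPOINT,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_text}],
        }),
    }

req = build_chat_request("Summarize this ticket for me")
print(req["url"])
```

The Slack plugin would then POST `req["body"]` to `req["url"]`; since the payload shape matches the public API, switching an existing integration over is mostly a URL change.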

Does anyone know about whole-genome or even exome DNA sequencing research projects, and whether it is even possible to consider giving a privacy waiver?
Whole-genome-sequenced DNA from biospecimens is fully readable when cross-referenced against current databases. Is it allowable to think that just because some of the data is de-identified, we can forgo consent?
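The cross-referencing concern is real even before you get to the genome itself: stripping names and IDs often leaves quasi-identifiers (ZIP, birth year, sex) that can be joined against a public roster to recover identities. A toy linkage-attack sketch, with all names and records invented:

```python
# Toy linkage-attack demo: a "de-identified" genomic table still carries
# quasi-identifiers that match a public roster. All data here is invented.

deidentified = [
    {"zip": "02139", "birth_year": 1980, "sex": "F", "variant": "BRCA1 c.68_69delAG"},
    {"zip": "94105", "birth_year": 1975, "sex": "M", "variant": "APOE e4/e4"},
]

public_roster = [
    {"name": "Alice Example", "zip": "02139", "birth_year": 1980, "sex": "F"},
    {"name": "Bob Example",   "zip": "94105", "birth_year": 1975, "sex": "M"},
]

def reidentify(records, roster):
    """Join the two tables on (zip, birth_year, sex); return (name, variant)
    pairs for every 'anonymous' record that matches exactly one person."""
    key = lambda r: (r["zip"], r["birth_year"], r["sex"])
    lookup = {key(p): p["name"] for p in roster}
    return [(lookup[key(r)], r["variant"]) for r in records if key(r) in lookup]

print(reidentify(deidentified, public_roster))
```

Here both "anonymous" records resolve to named individuals, which is why de-identification alone is usually not treated as a substitute for consent with genomic data.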