Submitted by ykilcher t3_yqj1hq in MachineLearning

https://youtu.be/W5M-dvzpzSQ

So-called responsible AI licenses are stupid, counterproductive, and have a dangerous legal loophole in them.


OpenRAIL++ License here: https://www.ykilcher.com/license


OUTLINE:

0:00 - Introduction

0:40 - Responsible AI Licenses (RAIL) of BLOOM and Stable Diffusion

3:35 - Open source software's dilemma of bad usage and restrictions

8:45 - Good applications, bad applications

12:45 - A dangerous legal loophole

15:50 - OpenRAIL++ License

16:50 - This has nothing to do with copyright

26:00 - Final thoughts


Comments


EmbarrassedHelp t1_ivp5rcd wrote

> Updates and Runtime Restrictions. To the maximum extent permitted by law, Licensor reserves the right to restrict (remotely or otherwise) usage of the Model in violation of this License, update the Model through electronic means, or modify the Output of the Model based on updates. You shall undertake reasonable efforts to use the latest version of the Model.

This appears to be the poison pill he talks about. The creator can restrict usage and force updates upon users.


farmingvillein t1_ivpuvwj wrote

Pretty gnarly.

Some quick observations:

- Not clear if this is an immediate concern, since you can make a "Derivative of the Model", which arguably gets you out from under this clause.

- "Version" is poorly defined -- I could see someone trying to take an aggressive stance that a sufficiently degraded update is not an updated "version", but something else.

- "Reasonable efforts" is potentially (depending on jurisdiction) a hole large enough to drive a truck through. E.g., if you build a service that depends on their model, and they then release a degraded one that means you can no longer serve a large % of your users, you could argue that there is no "reasonable effort" available that allows you to transition to the new model (given the corresponding commercial costs).

If you want to rely on any of the above, though, you definitely should get your counsel's blessing...


TiredOldCrow t1_ivpr1wm wrote

For reference, here's a link to the BigScience RAIL License.

The license includes usage restrictions which specifically forbid illegal and harmful uses (many of which are ostensibly already illegal under the law), as well as specific other uses that the model authors are uneasy about (e.g., medical advice, law enforcement, and automated decision making).

Researchers are effectively attempting to morally (and perhaps legally) exonerate themselves from abusive usage of technology they have developed. With this, they also hope to retain the right to exceptionally approve or deny usage in sensitive areas.

Personally, I'm sympathetic to the dilemma faced by researchers, given the large potential for abuse of these models, and the relative lack of regulation of AI systems in some jurisdictions. That said, I believe that hard legislation is ideally where usage restrictions would be enforced, not software licensing. Existing laws and policies, such as Article 22 of the GDPR, or Canada's TBS Directive on Automated Decision Making, should serve as a template.


FutureIsMine t1_ivx6xol wrote

This is the real reason they've got all these provisions: so that the original maker of the model can't be held liable or sued.


duschendestroyer t1_ivokvxi wrote

They should have called it the CYAL - Cover Your Ass License.


FoundationPM t1_ivsl7nv wrote

Did whoever wrote this licence do so seriously? What's the point of reinventing such a licence? Can't the GPL, MIT, or other well-known licences cover their needs? Or is it just because the huge amount of data they collected cannot seriously be regarded as legal input for their training? It's really a waste of their talent.


Sinity t1_ivu2pa0 wrote

Hopefully people will ignore these licences. In the end, it's just text.

Some anon can (physically) download these models from HuggingFace and reupload them elsewhere, stripped of the licence. Possibly even under a different name. It's possibly also worth ensuring that the file's hash differs from the original.
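That hash comparison is trivial to do. A minimal sketch in plain Python (standard-library `hashlib` only; the file paths are hypothetical examples, not real model files):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 hex digest of a file, streaming it in chunks
    so even multi-gigabyte model checkpoints fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


def hashes_differ(original: str, reupload: str) -> bool:
    """True if the reuploaded file is no longer byte-identical to the original."""
    return sha256_of(original) != sha256_of(reupload)
```

Note that any byte-level change at all (even metadata) produces a different digest, so matching a reupload back to the original by hash alone only works when the file is byte-identical.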

Then any user can download it, and they can't possibly know that the OpenRAIL-M licence applies to this model.

Law is not morality.

Also, it might be worth making life harder for the people responsible, not necessarily through illegal means. Twitter mobs do it all the time, after all, and that's apparently fine.
