Lawmakers Ignore Black Box Algorithms While Civil Rights Erode

2020-07-22

The detrimental effects of black box algorithms have taken their toll on unprivileged groups. Meanwhile, disconnected lawmakers have shown no concern for civil rights laws as they apply to artificial intelligence.

Artificial intelligence (AI) legislation is needed to stop the relentless trampling of civil rights. Additionally, taxpayers should not have to pay for the imminent tsunami of lawsuits that will result from this legislative shirking of duty.

Placing computer code in the metaphorical “black box” is common. Typically, an organization hides the code used in a computer program such as AI software. Corporations have fought to keep this code hidden.

They claim that transparency would result in intellectual property theft. However, this code opacity has resulted in violations of civil rights laws.

ProPublica exposed one such violation in the article, “Machine Bias.” The article describes an algorithm being used by law enforcement agencies throughout the United States.

It is being used to convict individuals of crimes, predict recidivism, and assign prison sentences. AI software seems like a great way to expedite these processes, right?

One problem: it does not work. It suffers from alarming rates of false positives and false negatives, and those error rates are worse for people of color. To make matters worse, the questionnaire that feeds data into the algorithm includes questions like:

  • “Was one of your parents ever sent to jail or prison?”
  • “How many of your friends/acquaintances are taking drugs illegally?”

This is not exactly textbook AI ethics. The algorithm does one thing well — it ruins the lives of some citizens forever.

A false positive could trigger wrongful sentencing. This can result in a person going to jail for 10 years instead of 2. In the case of a false negative, a dangerous individual could be released early on their own recognizance, only to commit more crimes.
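
To make the disparity concrete, here is a minimal sketch of the kind of audit a reviewer could run over a table of risk labels and observed outcomes, computing false positive and false negative rates separately for each group. The column names, the helper function, and the tiny example table are hypothetical, introduced only for illustration; this is not the tool ProPublica analyzed.

```python
# Minimal sketch of a group-wise error-rate audit. The column names
# ("group", "predicted_high_risk", "reoffended") and the small table of
# records are hypothetical, for illustration only.
import pandas as pd

def error_rates_by_group(df: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for group, g in df.groupby("group"):
        # False positive: labeled high risk, but the person did not reoffend.
        fp = ((g["predicted_high_risk"] == 1) & (g["reoffended"] == 0)).sum()
        # False negative: labeled low risk, but the person did reoffend.
        fn = ((g["predicted_high_risk"] == 0) & (g["reoffended"] == 1)).sum()
        negatives = (g["reoffended"] == 0).sum()  # people who did not reoffend
        positives = (g["reoffended"] == 1).sum()  # people who did reoffend
        rows.append({
            "group": group,
            "false_positive_rate": fp / negatives if negatives else float("nan"),
            "false_negative_rate": fn / positives if positives else float("nan"),
        })
    return pd.DataFrame(rows)

# Made-up records; a real audit would use actual case outcomes.
records = pd.DataFrame({
    "group":               ["A", "A", "A", "A", "B", "B", "B", "B"],
    "predicted_high_risk": [0,   0,   1,   0,   1,   1,   0,   1],
    "reoffended":          [0,   1,   1,   0,   0,   0,   1,   1],
})
print(error_rates_by_group(records))
```

If the two groups show very different false positive rates, people in one group are being wrongly labeled high risk far more often, which is exactly the pattern the "Machine Bias" investigation reported.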

Numerous other problems exist with these types of algorithms. For example, the limitations of facial recognition, and the harms it enables, such as unwarranted search and seizure, have been known for some time now.

Those who favor black box algorithms cling to two vacuous arguments:

  • “These black box algorithms are required to protect intellectual property.”
  • “Protecting intellectual property and civil rights is a zero sum game and a tradeoff is required.”

Specious Arguments

Intellectual Property Theft

To some degree, the protection of black box computer code is couched in rationality. A company needs to protect its financial investment in terms of person-hours of programming.

However, this implies that a company should be able to create any code, without regard to impact on society. The belief that its “secret ingredients” must be protected at all cost to society is simply not part of mainstream morality.

The American zeitgeist not only favors but demands the creation of protective consumer agencies like the Food and Drug Administration (FDA). When a drug manufacturer creates a drug to treat an illness, laws require transparent testing and scrutiny by the FDA.

The result of this mandated scrutiny is an unveiling of the drug’s active ingredients. These laws protect society from the potential impact of a flawed drug.

Similarly, the act of creating an algorithm with the potential to infringe the rights of a protected class of individuals should be subjected to scrutiny. An algorithm is directly impacted by the implicit and explicit biases of the programmer. It is imperative that these biases are mitigated.

Biases impact AI software in several ways. One way is by the “cleaning” or pre-processing of data in such a manner that the results from predictive analysis favor one protected subclass over another.
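
A minimal sketch of how this can happen, using entirely made-up data: if records are dropped whenever a field is missing, and missingness happens to correlate with group membership, the "cleaned" training set quietly under-represents one group. The group labels, column names, and probabilities below are assumptions for illustration only.

```python
# Minimal sketch: a seemingly neutral cleaning rule (drop rows with missing
# income) can silently change group composition when missingness correlates
# with group membership. All data and column names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000
group = rng.choice(["A", "B"], size=n)
# Suppose income is missing far more often for group B, e.g. because of how
# the data was collected, not because of anything about the individuals.
missing_prob = np.where(group == "A", 0.05, 0.40)
income = np.where(rng.random(n) < missing_prob,
                  np.nan,
                  rng.normal(50_000, 10_000, n))

df = pd.DataFrame({"group": group, "income": income})
cleaned = df.dropna(subset=["income"])  # the innocuous-looking cleaning step

print(df["group"].value_counts(normalize=True))       # roughly 50 / 50
print(cleaned["group"].value_counts(normalize=True))  # group B under-represented
```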

This, in turn, may lead to the algorithm concluding a correlation between race and propensity to recidivate. The software could present this specious correlation as cause and effect. In essence, software could “determine” that, if you’re Black, you’re more likely to recidivate.
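
The correlation-versus-causation trap can also be shown on synthetic data. In the sketch below, re-arrest is driven entirely by a hypothetical policing-intensity variable, yet a naive model given only group membership still reports a strong association, which could easily be misread as cause and effect. Every variable and number here is fabricated for illustration.

```python
# Minimal sketch: correlation is not causation. On synthetic data where
# re-arrest depends only on policing intensity, a model trained on group
# membership alone still "finds" a group effect, because group and policing
# intensity are correlated. Everything here is made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000
group = rng.integers(0, 2, size=n)                  # group label: 0 or 1
# Policing intensity is correlated with group membership.
policing = np.clip(rng.normal(0.3 + 0.4 * group, 0.1), 0, 1)
# Re-arrest is driven by policing intensity alone, not by group.
rearrest = (rng.random(n) < policing).astype(int)

# A model given only "group" still reports a strong, specious association.
model = LogisticRegression().fit(group.reshape(-1, 1), rearrest)
print("coefficient on group:", model.coef_[0][0])   # strongly positive
```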

We have the FDA to protect citizens from dangerous drugs. Congress needs to create a similar agency to protect citizens from these transgressions.

And no, intellectual property theft would not be an issue. Members of such an agency could be sworn to confidentiality to prevent such theft.

The failure to act will also likely increase taxes for all citizens. Increased taxes will be required for government entities to defend against the high number of lawsuits based on unethical software use.

The proposed agency would incentivize corporations to provide self-oversight in the creation of algorithms. No corporation would want to be ordered to retool software found to infract civil rights. They would try to get it right the first time.

Congress should act now to form such an agency.

Code Protection vs. Civil Rights is a Zero Sum Game

This argument effectively conflates code protection and civil rights. It implies that a company’s right to hide code takes precedence in all use-case scenarios.

First, this is patently false. The Civil Rights Act of 1964 has established the rights of protected classes. If an algorithm is suspected to infringe those rights, it is likely that its contents will be subject to examination in court. This is similar to when a weapon is suspected of use in a crime.

Second, this argument presents code protection and civil rights as a zero sum game. This is essentially stating that if civil rights are protected, corporate rights would suffer and vice versa. This is false. Algorithms do not have to infringe civil rights to work correctly.

An additional advantage of an oversight agency is that, if an algorithm was ever suspected to infract civil rights, it could be expeditiously “removed from the market” in the same manner that a dangerous drug would be removed.

In this manner, citizens’ rights and intellectual property are both protected from theft.

Conclusion

By ignoring black box algorithms, Congress has failed to protect unprivileged classes. Through their collective inertia, lawmakers are morally complicit in the violation of basic civil rights in the interest of corporate rights.

Biases exist and will continue to surface for a long time. Education and will are required to overcome prejudice and ignorance. While those efforts continue, civil rights legislation can and must still be enforced. Both can be concurrent.

If citizens can chew gum and think at the same time, it’s not too much to ask our legislators to do the same.

Translated from: https://medium.com/swlh/lawmakers-ignore-black-box-algorithms-while-civil-rights-erode-d271ebc2ffc9
