Law in the Internet Society

For Lawyers, Is Alexa Safe for Work?

-- By AyeletBentley - 03 Dec 2019

The legal ramifications of Alexa have mostly been discussed in relation to the use of Alexa recordings as evidence in court: prosecutors have requested recordings, and although Amazon claims it will not turn them over, it ultimately has. This paper, however, addresses a different legal issue: can lawyers ethically have Alexa?

Many people are momentarily disturbed when they realize the breadth of Alexa's privacy problems, but they decide that the convenience of asking Alexa the time is worth the loss of privacy. For lawyers and law firms bound by the Rules of Professional Responsibility, however, this lax view is not an option.

Case.one and Thomson Reuters have designed Alexa apps to assist law firms with billing and case lookup, respectively. Before using them, lawyers should consider how Alexa handles information and whether that handling is compatible with the Rules of Professional Responsibility. Lawyers who choose to keep Alexa should consider ways to improve security so as not to violate these rules. This paper begins by examining how Alexa captures and uses data. It then looks at how that data use interacts with the Rules of Professional Responsibility. Finally, it offers some potential solutions.

ALEXA'S PRIVACY CONCERNS

When Alexa listens, the clips she hears are not destroyed after she responds; they are stored indefinitely, or until someone deletes them. Users can go back and listen to every conversation they have had with Alexa on their account. Doing so makes one problem clear: the recordings are not always preceded by a “wake word,” as Amazon claims. Further, Alexa automatically records the few seconds preceding the “wake word.”

Another major privacy concern is that human beings actually listen to these recordings, as Amazon itself has admitted. Amazon claims it takes privacy seriously and that only a small number of conversations from a random set of customers are reviewed. But any conversation could be listened to, and even if a particular one is not, Amazon retains the ability to listen. Any expectation of privacy is lost.

PROFESSIONAL RESPONSIBILITY

Some commentators have argued that lawyers cannot use Alexa, while others have raised concerns but ultimately concluded that they can. This paper seeks to analyze the issues more thoroughly. A lawyer's duty of confidentiality is governed by Rule 1.6, which provides that a lawyer shall not reveal information relating to the representation of a client unless (1) the client gives informed consent, (2) the disclosure is impliedly authorized in order to carry out the representation, or (3) one of a number of exceptions applies, such as preventing reasonably certain death or harm or preventing the client from committing a crime. The third category is outside the scope of this analysis because it is case specific. The following paragraphs look at the ways Alexa can breach the duty of confidentiality, as well as at the first two grounds for permissible disclosure.

Speaking within earshot of Alexa likely counts as disclosure to a third party; thus, for a lawyer to ethically keep an Alexa, that disclosure must fall within one of the exceptions in Rule 1.6. It has been established that humans are listening to Alexa. While Alexa's recording and saving of conversations without client permission raises its own legal problems, it is the human listening that places Alexa in the category of third-party disclosure. Because Alexa records and transmits snippets of conversation not preceded by a wake word, this is a particular concern: a lawyer or law firm cannot keep an Alexa and simply avoid saying her name when discussing client matters.

Thus, for lawyers to have Alexa, her use would have to be covered by the disclosure allowances of Rule 1.6. Perhaps lawyers could obtain informed consent by putting Alexa into a disclosure form that the client signs. Beyond the facial absurdity of this approach, it runs into two problems: (1) the client does not know specifically what matters would be shared with Alexa, and (2) neither the lawyer nor the client knows who will actually be listening. The first is a problem because clients generally do not grant blanket consent to disclosure; they consent to the discussion of something specific. As to the second, “Amazon employees” is far too broad a category to name in an informed-consent agreement.

An even more troubling argument would be that the disclosure is impliedly authorized in order to carry out the representation. One could argue that as long as the client sees Alexa when walking into the law office, there is implied consent. This would be a bad argument: it does not cover the Alexa at home when a lawyer works remotely, it assumes clients know that actual humans are listening, which should not be assumed, and it opens a slippery slope of further assumptions.

WHAT SHOULD LAWYERS DO?

This paper argues that Alexa violates the Rules of Professional Responsibility, so what should lawyers do about the Alexa in their law office or home office? The most obvious answer is to not have an Alexa, or to unplug it. The trade-off between violating the Rules of Professional Responsibility and not being able to track billable hours with Alexa seems like a no-brainer. Alternatively, one can opt out of having one's recordings used for development purposes, which theoretically should keep one's voice from being reviewed at Amazon; distrust of Amazon, however, counsels against relying on this. The old Amazon Tap would have helped, since it required pushing a button to activate the speaker, but Amazon no longer makes that model. Others have suggested keeping Alexa in places where confidential matters are not generally discussed, turning the device off when discussing private matters, and deleting old conversations. Another potential solution is Project Alias, which plays white noise into Alexa's microphones that stops only when it hears a wake word.

Though there are potential remedies that might allow Alexa's use by lawyers, none is sufficient to make one comfortable. The potential for violating the Rules of Professional Responsibility is real, and the benefit of Alexa is minor. Prosecutors may see Alexa as their friend when she is present at the scene of a crime, but no lawyer should see Alexa as a friend in their own office.

I'm surprised you think it takes this much analysis to prove that putting a listening device controlled by a third party inside a law office is unacceptable. If I recorded conversations with or about clients and gave the tapes to third parties no one would doubt that constituted waiver of privilege and breach of the duty of confidentiality. Why would the conclusion differ because the microphone had a name and called itself a speaker?

So it seems to me you can use the bulk of your space on two other questions:

  1. Why are so many lawyers determined not to perceive the obvious?
  2. Are Amazon, Google, and similar platforms potentially liable for not warning professionals (including but not limited to lawyers) that use of their devices in some contexts may violate professional duties and local laws?


